Sample records for existing models developed

  1. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most of the existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of visual checks, quantifies the load model's accuracy and provides model users with a confidence level for the developed load model. The analysis results can also be used to calibrate load models. The proposed framework can serve as guidance for utility engineers and researchers in systematically examining load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.

  2. Modeling of ETL-Processes and Processed Information in Clinical Data Warehousing.

    PubMed

    Tute, Erik; Steiner, Jochen

    2018-01-01

    The literature describes a large potential for reuse of clinical patient data, and a clinical data warehouse (CDWH) is a means to that end. The objective was to support management and maintenance of the processes that extract, transform and load (ETL) data into CDWHs, and to ease reuse of metadata between regular IT management, the CDWH and secondary data users, by providing a modeling approach. An expert survey and a literature review were used to identify requirements and existing modeling techniques. An ETL modeling technique was developed by extending existing modeling techniques, and it was evaluated by exemplarily modeling an existing ETL process and by a second expert survey. Nine experts participated in the first survey. The literature review yielded 15 included publications, and six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated; seven experts participated in the evaluation. The developed approach can help in the management and maintenance of ETL processes and could serve as an interface between regular IT management, the CDWH and secondary data users.

  3. Rapid SAW Sensor Development Tools

    NASA Technical Reports Server (NTRS)

    Wilson, William C.; Atkinson, Gary M.

    2007-01-01

    The lack of integrated design tools for Surface Acoustic Wave (SAW) devices has led us to develop tools for the design, modeling, analysis, and automatic layout generation of SAW devices. These tools enable rapid development of wireless SAW sensors. The tools developed have been designed to integrate into existing Electronic Design Automation (EDA) tools to take advantage of existing 3D modeling and Finite Element Analysis (FEA). This paper presents the SAW design, modeling, analysis, and automated layout generation tools.

  4. Integrated Electronic Warfare Systems Aboard the United States Navy 21st Century Warship

    DTIC Science & Technology

    2009-12-01

    A model was developed that demonstrates the complete range of automated operation using a Human-In-the-Loop that could be integrated into existing and future combat systems.

  5. Development of the information model for consumer assessment of key quality indicators by goods labelling

    NASA Astrophysics Data System (ADS)

    Koshkina, S.; Ostrinskaya, L.

    2018-04-01

    An information model for “key” quality indicators of goods has been developed. The model is based on an assessment of the existing state of standardization and of product labeling quality. In the authors’ opinion, the proposed “key” indicators are the most significant for purchasing decisions. Customers will be able to use this model through their mobile devices. The developed model allows existing processes to be decomposed into data flows and reveals the levels of possible architectural solutions. In further research, an in-depth analysis of the decomposition levels of the presented information model will allow the stages of its improvement to be determined and additional indicators of goods quality that are of interest to customers to be revealed. Examining architectural solutions for the functioning of the customer's information environment when integrating existing databases will allow the boundaries of the model's flexibility and customizability to be determined.

  6. Review of existing terrestrial bioaccumulation models and terrestrial bioaccumulation modeling needs for organic chemicals

    EPA Science Inventory

    Protocols for terrestrial bioaccumulation assessments are far less developed than those for aquatic systems. This manuscript reviews modeling approaches that can be used to assess the terrestrial bioaccumulation potential of commercial organic chemicals. Models exist for plant, inver...

  7. Development, Testing, and Validation of a Model-Based Tool to Predict Operator Responses in Unexpected Workload Transitions

    NASA Technical Reports Server (NTRS)

    Sebok, Angelia; Wickens, Christopher; Sargent, Robert

    2015-01-01

    One human factors challenge is predicting operator performance in novel situations. Approaches such as drawing on relevant previous experience, and developing computational models to predict operator performance in complex situations, offer potential methods to address this challenge. A few concerns with modeling operator performance are that models need to be realistic, and that they need to be tested empirically and validated. In addition, many existing human performance modeling tools are complex and require that an analyst gain significant experience to be able to develop models for meaningful data collection. This paper describes an effort to address these challenges by developing an easy-to-use model-based tool, using models that were developed from a review of existing human performance literature and targeted experimental studies, and performing an empirical validation of key model predictions.

  8. Modeling the Environmental Impact of Air Traffic Operations

    NASA Technical Reports Server (NTRS)

    Chen, Neil

    2011-01-01

    There is increased interest in understanding and mitigating the impacts of air traffic on the climate, since greenhouse gases, nitrogen oxides, and contrails generated by air traffic can have adverse climate effects. The models described in this presentation are useful for quantifying these impacts and for studying alternative environmentally aware operational concepts. These models have been developed by leveraging and building upon existing simulation and optimization techniques developed for the design of efficient traffic flow management strategies. Specific enhancements to the existing simulation and optimization techniques include new models that simulate aircraft fuel flow, emissions and contrails. To ensure that these new models are beneficial to the larger climate research community, the outputs of these new models are compatible with existing global climate modeling tools like the FAA's Aviation Environmental Design Tool.

  9. Development of an acute oral toxicity dataset to facilitate assessment of existing QSARs and development of new models (WC10)

    EPA Science Inventory

    Acute oral toxicity data are used to meet both regulatory and non-regulatory needs. Recently, there have been efforts to explore alternative approaches for predicting acute oral toxicity such as QSARs. Evaluating the performance and scope of existing models and investigating the ...

  10. Pre-existing periodontitis exacerbates experimental arthritis in a mouse model.

    PubMed

    Cantley, Melissa D; Haynes, David R; Marino, Victor; Bartold, P Mark

    2011-06-01

    Previous studies have shown a higher incidence of alveolar bone loss in patients with rheumatoid arthritis (RA) and that patients with periodontitis are at a greater risk of developing RA. The aim of this study was to develop an animal model to assess the relationship between pre-existing periodontitis and experimental arthritis (EA). Periodontitis was first induced in mice by oral gavage with Porphyromonas gingivalis followed by EA using the collagen antibody-induced arthritis model. These animals were compared with animals with periodontitis alone, EA alone and no disease (controls). Visual changes in paw swelling were assessed to determine clinical development of EA. Alveolar bone and joint changes were assessed using micro-CT, histological analyses and immunohistochemistry. Serum levels of C-reactive protein were used to monitor systemic inflammation. Mice with pre-existing periodontitis developed more severe arthritis, which developed at a faster rate. Mice with periodontitis only also showed evidence of loss of bone within the radiocarpal joint. There was also evidence of alveolar bone loss in mice with EA alone. The results of this study indicate that pre-existing periodontitis exacerbated experimental arthritis in a mouse model. © 2011 John Wiley & Sons A/S.

  11. Incorporation of Electrical Systems Models Into an Existing Thermodynamic Cycle Code

    NASA Technical Reports Server (NTRS)

    Freeh, Josh

    2003-01-01

    Integration of the entire system includes: Fuel cells, motors, propulsors, thermal/power management, compressors, etc. Use of existing, pre-developed NPSS capabilities includes: 1) Optimization tools; 2) Gas turbine models for hybrid systems; 3) Increased interplay between subsystems; 4) Off-design modeling capabilities; 5) Altitude effects; and 6) Existing transient modeling architecture. Other factors include: 1) Easier transfer between users and groups of users; 2) General aerospace industry acceptance and familiarity; and 3) Flexible analysis tool that can also be used for ground power applications.

  12. A toolbox and a record for scientific model development

    NASA Technical Reports Server (NTRS)

    Ellman, Thomas

    1994-01-01

    Scientific computation can benefit from software tools that facilitate construction of computational models, control the application of models, and aid in revising models to handle new situations. Existing environments for scientific programming provide only limited means of handling these tasks. This paper describes a two-pronged approach for handling these tasks: (1) designing a 'Model Development Toolbox' that includes a basic set of model constructing operations; and (2) designing a 'Model Development Record' that is automatically generated during model construction. The record is subsequently exploited by tools that control the application of scientific models and revise models to handle new situations. Our two-pronged approach is motivated by our belief that the model development toolbox and record should be highly interdependent. In particular, a suitable model development record can be constructed only when models are developed using a well defined set of operations. We expect this research to facilitate rapid development of new scientific computational models, to help ensure appropriate use of such models and to facilitate sharing of such models among working computational scientists. We are testing this approach by extending SIGMA, an existing knowledge-based scientific software design tool.

  13. Archetype modeling methodology.

    PubMed

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes are described in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also serve as a reference for CIM development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  14. A review of statistical updating methods for clinical prediction models.

    PubMed

    Su, Ting-Li; Jaki, Thomas; Hickey, Graeme L; Buchan, Iain; Sperrin, Matthew

    2018-01-01

    A clinical prediction model is a tool for predicting healthcare outcomes, usually within a specific population and context. A common approach is to develop a new clinical prediction model for each population and context; however, this wastes potentially useful historical information. A better approach is to update or incorporate existing clinical prediction models already developed for use in similar contexts or populations. In addition, clinical prediction models commonly become miscalibrated over time and need replacing or updating. In this article, we review a range of approaches for re-using and updating clinical prediction models; these fall into three main categories: simple coefficient updating, combining multiple previous clinical prediction models in a meta-model, and dynamic updating of models. We evaluated the performance (discrimination and calibration) of the different strategies using data on mortality following cardiac surgery in the United Kingdom. We found that no single strategy performed sufficiently well to be used to the exclusion of the others. In conclusion, useful tools exist for updating existing clinical prediction models to a new population or context, and these should be implemented, using a breadth of complementary statistical methods, rather than developing a new clinical prediction model from scratch.
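
    As a concrete illustration of the simplest of these strategies, coefficient updating, the sketch below recalibrates an existing model's linear predictor (re-estimating only an intercept and calibration slope) on data from a new population. It uses simulated data and the statsmodels package; the variable names and numbers are illustrative assumptions, not taken from the study.

      # Hedged sketch of "simple coefficient updating" via logistic recalibration:
      # the existing model's linear predictor is kept, and only an intercept and
      # slope are re-estimated on data from the new population.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)

      # Stand-ins for the existing model's linear predictor and the observed
      # binary outcomes in the new population (purely simulated).
      lp_old = rng.normal(size=500)
      y_new = rng.binomial(1, 1 / (1 + np.exp(-(0.4 + 0.7 * lp_old))))

      # Re-estimate intercept and calibration slope only.
      X = sm.add_constant(lp_old)
      recal = sm.GLM(y_new, X, family=sm.families.Binomial()).fit()
      intercept, slope = recal.params
      print(f"updated intercept={intercept:.2f}, calibration slope={slope:.2f}")

      # Updated predictions for new patients reuse the old linear predictor.
      p_updated = 1 / (1 + np.exp(-(intercept + slope * lp_old)))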

  15. 76 FR 253 - Airworthiness Directives; ROLLADEN-SCHNEIDER Flugzeugbau GmbH Model LS6 Gliders

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-04

    ... Airworthiness Directives; ROLLADEN-SCHNEIDER Flugzeugbau GmbH Model LS6 Gliders AGENCY: Federal Aviation... condition does not exist in the Model LS6-c gliders. FAA's Determination We are issuing this AD rescission... Model LS6 glider, and the unsafe condition described previously is not likely to exist or develop in the...

  16. Improved dual-porosity models for petrophysical analysis of vuggy reservoirs

    NASA Astrophysics Data System (ADS)

    Wang, Haitao

    2017-08-01

    A new vug interconnection, isolated vug (IVG), was investigated through resistivity modeling and the dual-porosity model for connected vug (CVG) vuggy reservoirs was tested. The vuggy models were built by pore-scale modeling, and their electrical resistivity was calculated by the finite difference method. For CVG vuggy reservoirs, the CVG reduced formation factors and increased the porosity exponents, and the existing dual-porosity model failed to match these results. Based on the existing dual-porosity model, a conceptual dual-porosity model for CVG was developed by introducing a decoupled term to reduce the resistivity of the model. For IVG vuggy reservoirs, IVG increased the formation factors and porosity exponents. The existing dual-porosity model succeeded due to accurate calculation of the formation factors of the deformed interparticle porous media caused by the insertion of the IVG. Based on the existing dual-porosity model, a new porosity model for IVG vuggy reservoirs was developed by simultaneously recalculating the formation factors of the altered interparticle pore-scale models. The formation factors and porosity exponents from the improved and extended dual-porosity models for CVG and IVG vuggy reservoirs well matched the simulated formation factors and porosity exponents. This work is helpful for understanding the influence of connected and disconnected vugs on resistivity factors—an issue of particular importance in carbonates.
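
    For reference, the formation factor and porosity exponent discussed above follow the usual Archie-type definitions. The short sketch below shows how these quantities would be extracted from simulated resistivities; it is generic petrophysics, not the paper's improved dual-porosity formulation, and the numbers are illustrative only.

      # Minimal sketch of the standard Archie-type quantities mentioned above:
      # formation factor F = R0 / Rw and porosity exponent m from F = phi**(-m).
      # This is not the paper's dual-porosity model.
      import numpy as np

      def formation_factor(rock_resistivity, brine_resistivity):
          """F = R0 / Rw for a fully brine-saturated sample."""
          return rock_resistivity / brine_resistivity

      def porosity_exponent(F, porosity):
          """Invert Archie's relation F = porosity**(-m) for m."""
          return -np.log(F) / np.log(porosity)

      # Illustrative numbers only (e.g. from a pore-scale resistivity simulation).
      F = formation_factor(rock_resistivity=25.0, brine_resistivity=0.5)   # F = 50
      m = porosity_exponent(F, porosity=0.15)                              # m ~ 2.06
      print(F, round(m, 2))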

  17. Impact of ergonomic intervention in manual orange harvester among the workers of hilly region in India.

    PubMed

    Pranav, P K; Patel, Thaneswer

    2016-04-07

    Manual orange harvesting is a very laborious, time-consuming and unsafe operation, and neither mechanical harvesting nor mechanized hand harvesting is possible in north-east India because of its hilly terrain. The awkward postures and repetitive nature of the work demand a comfortable and appropriate hand harvester for the hilly region. The purpose of this study was to develop a manual orange harvester for hilly regions considering ergonomic parameters, and to compare its performance with existing models of manual harvester. Twenty healthy, experienced orchard workers (10 male and 10 female) with no previous functional musculoskeletal disorders participated in the study. We developed a manual orange harvester that eliminates the problems associated with the existing harvesters. The developed model, along with the existing models, was evaluated extensively in the field. During the evaluations, the subjects' heart rate was measured and oxygen consumption was predicted to calculate the energy expenditure rate (EER), using relationships established in the laboratory before the field experiments. Performance parameters of the orange harvesters, i.e. plucking rate (PR), damaged quantity (DQ), plucking energy requirement (PER) and discomfort rating, were also recorded. The PR was 425, 300 and 287 pieces per hour for the developed model (DM), the first existing model (EM1) and the second existing model (EM2), respectively. The DM showed the lowest PER (2.14 kJ/piece), followed by EM2 (2.95 kJ/piece) and EM1 (4.02 kJ/piece); PER is taken as the overall performance measure because it includes energy per unit of plucking. Further, the body part discomfort scores revealed that the DM was the most comfortable in use, followed by EM2 and EM1. The performance of the DM was better than that of the other existing models in terms of plucking rate, energy requirement and body part discomfort. The shoulders and neck were the most affected body parts, where all subjects felt severe discomfort.

  18. A systematic literature review of open source software quality assessment models.

    PubMed

    Adewumi, Adewole; Misra, Sanjay; Omoregbe, Nicholas; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    Many open source software (OSS) quality assessment models have been proposed and are available in the literature. However, there is little or no adoption of these models in practice. To guide the formulation of newer models that are acceptable to practitioners, the existing models need to be clearly discriminated on the basis of their specific properties. The aim of this study is therefore to perform a systematic literature review investigating the properties of the existing OSS quality assessment models, classifying them with respect to their quality characteristics, the methodology they use for assessment, and their domain of application, so as to guide the formulation and development of newer models. Searches in IEEE Xplore, ACM, Science Direct, Springer and Google Search were performed to retrieve all relevant primary studies. Journal and conference papers published between 2003 and 2015 were considered, since the first known OSS quality model emerged in 2003. A total of 19 OSS quality assessment model papers were selected; to select these models we developed assessment criteria to evaluate the quality of the existing studies. The quality assessment models are classified into five categories based on the quality characteristics they possess, namely: single-attribute, rounded category, community-only attribute, non-community attribute, and non-quality-in-use models. Our study shows that software selection based on hierarchical structures is the most popular selection method in the existing OSS quality assessment models. Furthermore, we found that the majority (47%) of the existing models do not specify any domain of application. In conclusion, our study is a valuable contribution to the community and helps quality assessment model developers in formulating newer models, and practitioners (software evaluators) in selecting suitable OSS from among alternatives.

  19. Replicating Health Economic Models: Firm Foundations or a House of Cards?

    PubMed

    Bermejo, Inigo; Tappenden, Paul; Youn, Ji-Hee

    2017-11-01

    Health economic evaluation is a framework for the comparative analysis of the incremental health gains and costs associated with competing decision alternatives. The process of developing health economic models is usually complex, financially expensive and time-consuming. For these reasons, model development is sometimes based on previous model-based analyses; this endeavour is usually referred to as model replication. Such model replication activity may involve the comprehensive reproduction of an existing model or 'borrowing' all or part of a previously developed model structure. Generally speaking, the replication of an existing model may require substantially less effort than developing a new de novo model by bypassing, or undertaking in only a perfunctory manner, certain aspects of model development such as the development of a complete conceptual model and/or comprehensive literature searching for model parameters. A further motivation for model replication may be to draw on the credibility or prestige of previous analyses that have been published and/or used to inform decision making. The acceptability and appropriateness of replicating models depends on the decision-making context: there exists a trade-off between the 'savings' afforded by model replication and the potential 'costs' associated with reduced model credibility due to the omission of certain stages of model development. This paper provides an overview of the different levels of, and motivations for, replicating health economic models, and discusses the advantages, disadvantages and caveats associated with this type of modelling activity. Irrespective of whether replicated models should be considered appropriate or not, complete replicability is generally accepted as a desirable property of health economic models, as reflected in critical appraisal checklists and good practice guidelines. To this end, the feasibility of comprehensive model replication is explored empirically across a small number of recent case studies. Recommendations are put forward for improving reporting standards to enhance comprehensive model replicability.

  20. A Transactional Model of Bullying and Victimization

    ERIC Educational Resources Information Center

    Georgiou, Stelios N.; Fanti, Kostas A.

    2010-01-01

    The purpose of the current study was to develop and test a transactional model, based on longitudinal data, capable of describing the existing interrelation between maternal behavior and child bullying and victimization experiences over time. The results confirmed the existence of such a model for bullying, but not for victimization in terms of…

  1. Impact of Placement Type on the Development of Clinical Competency in Speech-Language Pathology Students

    ERIC Educational Resources Information Center

    Sheepway, Lyndal; Lincoln, Michelle; McAllister, Sue

    2014-01-01

    Background: Speech-language pathology students gain experience and clinical competency through clinical education placements. However, currently little empirical information exists regarding how competency develops. Existing research about the effectiveness of placement types and models in developing competency is generally descriptive and based…

  2. Developing Culturally Responsive Teaching through Professional Noticing within Teacher Educator Modelling

    ERIC Educational Resources Information Center

    Averill, Robin; Anderson, Dayle; Drake, Michael

    2015-01-01

    Much evidence exists that culturally responsive and equitable teaching practices are challenging to develop. Evidence exists that in-the-moment coaching of "rehearsals" of practice can help foster mathematics teaching strategies, but how such coaching can assist the development of culturally responsive practice is less clear. Drawn from…

  3. Development of the Transportation Revenue Estimator and Needs Determination System (TRENDS) forecasting model : MPO sub-models and maintenance.

    DOT National Transportation Integrated Search

    2011-11-01

    This report summarizes the technical work performed developing and incorporating Metropolitan Planning : Organization sub-models into the existing Texas Revenue Estimator and Needs Determination System : (TRENDS) model. Additionally, this report expl...

  4. Current State of the Art Historic Building Information Modelling

    NASA Astrophysics Data System (ADS)

    Dore, C.; Murphy, M.

    2017-08-01

    In an extensive review of existing literature a number of observations were made in relation to the current approaches for recording and modelling existing buildings and environments: Data collection and pre-processing techniques are becoming increasingly automated to allow for near real-time data capture and fast processing of this data for later modelling applications. Current BIM software is almost completely focused on new buildings and has very limited tools and pre-defined libraries for modelling existing and historic buildings. The development of reusable parametric library objects for existing and historic buildings supports modelling with high levels of detail while decreasing the modelling time. Mapping these parametric objects to survey data, however, is still a time-consuming task that requires further research. Promising developments have been made towards automatic object recognition and feature extraction from point clouds for as-built BIM. However, results are currently limited to simple and planar features. Further work is required for automatic accurate and reliable reconstruction of complex geometries from point cloud data. Procedural modelling can provide an automated solution for generating 3D geometries but lacks the detail and accuracy required for most as-built applications in AEC and heritage fields.

  5. Computer evaluation of existing and proposed fire lookouts

    Treesearch

    Romain M. Mees

    1976-01-01

    A computer simulation model has been developed for evaluating the fire detection capabilities of existing and proposed lookout stations. The model uses coordinate location of fires and lookouts, tower elevation, and topographic data to judge location of stations, and to determine where a fire can be seen. The model was tested by comparing it with manual detection on a...

  6. Traffic Flow Density Distribution Based on FEM

    NASA Astrophysics Data System (ADS)

    Ma, Jing; Cui, Jianming

    The analysis of normal traffic flow usually relies on static or dynamic models based on fluid mechanics for numerical analysis. However, this approach involves heavy modeling and data handling, and its accuracy is not high. The Finite Element Method (FEM) is a product of the combination of modern mathematics, mechanics and computer technology, and it has been widely applied in various domains such as engineering. Based on existing traffic flow theory, ITS and the development of FEM, an FEM-based simulation approach that addresses these problems in traffic flow is put forward. Based on this approach, and using existing Finite Element Analysis (FEA) software, traffic flow is simulated and analyzed with fluid mechanics and dynamics. The heavy data-processing burden of manual modeling and numerical analysis is reduced, and the fidelity of the simulation is enhanced.

  7. Effect law of Damage Characteristics of Rock Similar Material with Pre-Existing Cracks

    NASA Astrophysics Data System (ADS)

    Li, S. G.; Cheng, X. Y.; Liu, C.

    2017-11-01

    In order to further study the failure mechanism of rock similar materials, this study established a damage model based on accumulative AE events and investigated the damage characteristics of rock similar material samples with pre-existing cracks of varying width under uniaxial compression load. The equipment used in this study was the self-developed YYW-II strain-controlled unconfined compression apparatus and the PCIE-8 acoustic emission (AE) monitoring system. The influence of the width of the pre-existing cracks on the damage characteristics of the rock similar materials is analyzed. The results show that (1) the damage model can well describe the damage characteristics of rock similar materials; (2) the tested samples pass through three stages during failure: an initial damage stage, a stable damage development stage, and an accelerated damage development stage; and (3) as the width of the pre-existing cracks varies from 3 mm to 5 mm, the damage of the rock similar materials increases gradually. The outcomes of this study provide additional value to research on the failure mechanism of geotechnical similar material models.

  8. From Existence to Essence: A Conceptual Model for an Appalachian Studies Curriculum.

    ERIC Educational Resources Information Center

    Best, Billy F.

    Comprised of 4 chapters, this dissertation explores the existential premise "existence precedes essence" as applicable to development of a conceptual model for an Appalachian studies curriculum. Entitled "Personal Considerations: Pedagogy of a Hillbilly", the 1st chapter details the conflicts between the Appalachian institution…

  9. Introduction to World Peace through World Law. Revised Edition.

    ERIC Educational Resources Information Center

    Clark, Grenville; Sohn, Louis

    Two models for changing existing international organizations into effective instruments of world governance are presented. The first model revises the present Charter of the United Nations; the second calls for a new world security and development organization which would supplement the existing machinery of the United Nations for peacekeeping,…

  10. Development of a rotor wake-vortex model, volume 1

    NASA Technical Reports Server (NTRS)

    Majjigi, R. K.; Gliebe, P. R.

    1984-01-01

    Certain empirical rotor wake and turbulence relationships were developed using existing low-speed rotor wake data. A tip vortex model was developed by replacing the annulus wall with a row of image vortices. An axisymmetric turbulence spectrum model, developed in the context of rotor inflow turbulence, was adapted to predicting the turbulence spectrum of the stator gust upwash.

  11. A PHYSIOLOGICALLY-BASED PHARMACOKINETIC MODEL FOR TRICHLOROETHYLENE WITH SPECIFICITY FOR THE LONG EVANS RAT

    EPA Science Inventory

    A PBPK model for TCE with specificity for the male LE rat that accurately predicts TCE tissue time-course data has not been developed, although other PBPK models for TCE exist. Development of such a model was the present aim. The PBPK model consisted of 5 compartments: fat; slowl...

  12. Theoretical models of parental HIV disclosure: a critical review.

    PubMed

    Qiao, Shan; Li, Xiaoming; Stanton, Bonita

    2013-01-01

    This study critically examined three major theoretical models related to parental HIV disclosure (i.e., the Four-Phase Model [FPM], the Disclosure Decision Making Model [DDMM], and the Disclosure Process Model [DPM]), and the existing studies that could provide empirical support for these models or their components. For each model, we briefly reviewed its theoretical background, described its components and/or mechanisms, and discussed its strengths and limitations. The existing empirical studies supported most theoretical components in these models. However, hypotheses related to the mechanisms proposed in the models have not yet been tested due to a lack of empirical evidence. This study also synthesized alternative theoretical perspectives and new issues in disclosure research and clinical practice that may challenge the existing models. The current study underscores the importance of including components related to social and cultural contexts in theoretical frameworks, and calls for more adequately designed empirical studies in order to test and refine existing theories and to develop new ones.

  13. Developing tolled-route demand estimation capabilities for Texas : opportunities for enhancement of existing models.

    DOT National Transportation Integrated Search

    2014-08-01

    The travel demand models developed and applied by the Transportation Planning and Programming Division : (TPP) of the Texas Department of Transportation (TxDOT) are daily three-step models (i.e., trip generation, trip : distribution, and traffic assi...

  14. System and Software Reliability (C103)

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

    Within the last decade, better reliability models (hardware, software, system) than those currently used have been theorized and developed but not implemented in practice. Previous research on software reliability has shown that while some existing software reliability models are practical, they are not accurate enough. New paradigms of development (e.g. OO) have appeared and associated reliability models have been proposed but not investigated. Hardware models have been extensively investigated but not integrated into a system framework. System reliability modeling is the weakest of the three. NASA engineers need better methods and tools to demonstrate that their products meet NASA requirements for reliability measurement. For the new models of the software component from the last decade, there is a great need to bring them into a form in which they can be used on software-intensive systems. The Statistical Modeling and Estimation of Reliability Functions for Systems (SMERFS'3) tool is an existing vehicle that may be used to incorporate these new modeling advances. Adapting some existing software reliability modeling changes to accommodate major changes in software development technology may also show substantial improvement in prediction accuracy. With some additional research, the next step is to identify and investigate system reliability models, which could then be incorporated in a tool such as SMERFS'3. This tool, with better models, would greatly add value in assessing GSFC projects.

  15. Implementation of partnership management model of SMK (Vocational High School) with existing industries in mechanical engineering expertise in Central Java

    NASA Astrophysics Data System (ADS)

    Sumbodo, Wirawan; Pardjono, Samsudi, Rahadjo, Winarno Dwi

    2018-03-01

    This study aims to determine the existing conditions of the implementation of a partnership management model between SMK and industry in the mechanical engineering expertise program in Central Java. The method used is descriptive analysis. The research results show that the implementation of the existing partnership management model between SMK and industry produces work-ready graduates at a rate of 62.5%, which belongs to the low category, although the partnership program between SMK and industry is carried out well, with an average score of 3.17. As many as 37.5% of SMK graduates of the Mechanical Engineering Expertise Program choose to continue their studies or become entrepreneurs. It is expected that the partnership model between SMK and industry can be developed into a reference for government policy in developing SMK so that it is able to produce graduates who are ready to work according to the needs of partner industries.

  16. A Methodology for Cybercraft Requirement Definition and Initial System Design

    DTIC Science & Technology

    2008-06-01

    ...the software development concepts of the SDLC, requirements, use cases and domain modeling. It ... collectively as Software Development Life Cycle (SDLC) models. While there are numerous models that fit under the SDLC definition, all are based on ... developed that provided expanded understanding of the domain, it is necessary to either update an existing domain model or create another domain

  17. A method to assess the allocation suitability of recreational activities: An economic approach

    NASA Astrophysics Data System (ADS)

    Wang, Hsiao-Lin

    1996-03-01

    Most existing methods of planning focus on development of a recreational area; less consideration is placed on the allocation of recreational activities within a recreational area. Most existing research emphasizes the economic benefits of developing a recreational area; few authors assessed the allocation suitability of recreational activities from an economic point of view. The purpose of this work was to develop a model to assess the allocation suitability of recreational activities according to the application of a concept of analysis of cost and benefit under a premise of ecological concern. The model was verified with a case study of Taiwan. We suggest that the proposed model should form a critical part of recreational planning.

  18. Incorporating Learning Theory into Existing Systems Engineering Models

    DTIC Science & Technology

    2013-09-01

    3. Social Cognition ... Table 1. Classification of learning theories: Behaviorism, Cognitivism, Constructivism, Connectivism ... Introduction to design of large scale systems. New York: McGraw-Hill. Grusec, J. (1992). Social learning theory and developmental psychology: The ... Incorporating Learning Theory into Existing Systems Engineering Models, by Valentine Leo, September 2013; Thesis Advisor: Gary O. Langford; Co-Advisor

  19. Setting priorities in health research using the model proposed by the World Health Organization: development of a quantitative methodology using tuberculosis in South Africa as a worked example.

    PubMed

    Hacking, Damian; Cleary, Susan

    2016-02-09

    Setting priorities is important in health research given the limited resources available for research. Various guidelines exist to assist in the priority setting process; however, priority setting still faces significant challenges such as the clear ranking of identified priorities. The World Health Organization (WHO) proposed a Disability Adjusted Life Year (DALY)-based model to rank priorities by research area (basic, health systems and biomedical) by dividing the DALYs into 'unavertable with existing interventions', 'avertable with improved efficiency' and 'avertable with existing but non-cost-effective interventions', respectively. However, the model has conceptual flaws and no clear methodology for its construction. Therefore, the aim of this paper was to amend the model to address these flaws, and to develop a clear methodology by using tuberculosis in South Africa as a worked example. An amended model was constructed to represent total DALYs as the product of DALYs per person and absolute burden of disease. These figures were calculated for all countries from WHO datasets. The lowest figures achieved by any country were assumed to represent 'unavertable with existing interventions' if extrapolated to South Africa. The ratio of 'cost per patient treated' (adjusted for purchasing power and outcome weighted) between South Africa and the best country was used to calculate the 'avertable with improved efficiency' section. Finally, 'avertable with existing but non-cost-effective interventions' was calculated using Disease Control Priorities Project efficacy data, and the ratio between the best intervention and South Africa's current intervention, irrespective of cost. The amended model shows that South Africa has a tuberculosis burden of 1,009,837.3 DALYs; 0.009% of DALYs are unavertable with existing interventions and 96.3% of DALYs could be averted with improvements in efficiency. Of the remaining DALYs, a further 56.9% could be averted with existing but non-cost-effective interventions. The amended model was successfully constructed using limited data sources. The generalizability of the data used is the main limitation of the model. More complex formulas are required to deal with such potential confounding variables; however, the results act as a starting point for development of a more robust model.
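
    For orientation, the reported percentages can be translated into approximate absolute DALY figures with simple arithmetic. The sketch below uses only the numbers quoted in the abstract; the bucket labels are ours and the rounding follows the abstract.

      # Arithmetic sketch of the amended WHO-style decomposition, using only the
      # figures reported in the abstract (illustrative, not the authors' code).
      total_dalys = 1_009_837.3          # South African TB burden

      unavertable        = total_dalys * 0.00009        # 0.009% unavertable with existing interventions
      avertable_eff      = total_dalys * 0.963          # 96.3% avertable with improved efficiency
      remaining          = total_dalys - unavertable - avertable_eff
      avertable_non_ce   = remaining * 0.569            # 56.9% of the remainder: existing but non-cost-effective
      needs_new_research = remaining - avertable_non_ce # residual burden motivating new research

      for label, value in [("unavertable", unavertable),
                           ("avertable via efficiency", avertable_eff),
                           ("avertable, non-cost-effective", avertable_non_ce),
                           ("residual (research gap)", needs_new_research)]:
          print(f"{label:32s} {value:12,.0f} DALYs")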

  20. BEopt-CA (Ex) -- A Tool for Optimal Integration of EE/DR/ES+PV in Existing California Homes. Cooperative Research and Development Final Report, CRADA Number CRD-11-429

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, Craig

    Opportunities for combining energy efficiency, demand response, and energy storage with PV are often missed, because the required knowledge and expertise for these different technologies exist in separate organizations or individuals. Furthermore, there is a lack of quantitative tools to optimize energy efficiency, demand response and energy storage with PV, especially for existing buildings. Our goal is to develop a modeling tool, BEopt-CA (Ex), with capabilities to facilitate identification and implementation of a balanced integration of energy efficiency (EE), demand response (DR), and energy storage (ES) with photovoltaics (PV) within the residential retrofit market. To achieve this goal, we will adapt and extend an existing tool -- BEopt -- that is designed to identify optimal combinations of efficiency and PV in new home designs. In addition, we will develop multifamily residential modeling capabilities for use in California, to facilitate integration of distributed solar power into the grid in order to maximize its value to California ratepayers. The project is follow-on research that leverages previous California Solar Initiative RD&D investment in the BEopt software. BEopt facilitates finding the least cost combination of energy efficiency and renewables to support integrated DSM (iDSM) and Zero Net Energy (ZNE) in California residential buildings. However, BEopt is currently focused on modeling single-family houses and does not include satisfactory capabilities for modeling multifamily homes. The project brings BEopt's existing modeling and optimization capabilities to multifamily buildings, including duplexes, triplexes, townhouses, flats, and low-rise apartment buildings.

  1. DEVELOPMENT AND EVALUATION OF AN INTEGRATED MODEL TO FACILITATE RISK-BASED CORRECTIVE ACTION AT SUPERFUND SITES

    EPA Science Inventory

    We developed a numerical model to predict chemical concentrations in indoor environments resulting from soil vapor intrusion and volatilization from groundwater. The model, which integrates new and existing algorithms for chemical fate and transport, was originally...

  2. Modeling Aromatic Liquids:  Toluene, Phenol, and Pyridine.

    PubMed

    Baker, Christopher M; Grant, Guy H

    2007-03-01

    Aromatic groups are now acknowledged to play an important role in many systems of interest. However, existing molecular mechanics methods provide a poor representation of these groups. In a previous paper, we have shown that the molecular mechanics treatment of benzene can be improved by the incorporation of an explicit representation of the aromatic π electrons. Here, we develop this concept further, developing charge-separation models for toluene, phenol, and pyridine. Monte Carlo simulations are used to parametrize the models, via the reproduction of experimental thermodynamic data, and our models are shown to outperform an existing atom-centered model. The models are then used to make predictions about the structures of the liquids at the molecular level and are tested further through their application to the modeling of gas-phase dimers and cation-π interactions.

  3. Stratification established by peeling detrainment from gravity currents: laboratory experiments and models

    NASA Astrophysics Data System (ADS)

    Hogg, Charlie; Dalziel, Stuart; Huppert, Herbert; Imberger, Jorg; Department of Applied Mathematics and Theoretical Physics Team; Centre for Water Research Team

    2014-11-01

    Dense gravity currents feed fluid into confined basins in lakes, the oceans and many industrial applications. Existing models of the circulation and mixing in such basins are often based on the currents entraining ambient fluid. However, recent observations have suggested that uni-directional entrainment into a gravity current does not fully describe the mixing in such currents. Laboratory experiments were carried out which visualised peeling detrainment from the gravity current occurring when the ambient fluid was stratified. A theoretical model of the observed peeling detrainment was developed to predict the stratification in the basin. This new model gives a better approximation of the stratification observed in the experiments than the pre-existing entraining model. The model can now be developed such that it integrates into operational models of lakes.

  4. A nonequilibrium model for reactive contaminant transport through fractured porous media: Model development and semianalytical solution

    NASA Astrophysics Data System (ADS)

    Joshi, Nitin; Ojha, C. S. P.; Sharma, P. K.

    2012-10-01

    In this study a conceptual model that accounts for the effects of nonequilibrium contaminant transport in fractured porous media is developed. The present model accounts for both physical and sorption nonequilibrium. An analytical solution was developed using the Laplace transform technique and was then numerically inverted to obtain the solute concentration in the fracture matrix system. The semianalytical solution developed here can incorporate both semi-infinite and finite fracture matrix extent. In addition, the model can account for flexible boundary conditions and a nonzero initial condition in the fracture matrix system. The present semianalytical solution was validated against existing analytical solutions for the fracture matrix system. In order to differentiate between the various sorption/transport mechanisms, different cases of sorption and mass transfer were analyzed by comparing breakthrough curves and temporal moments. It was found that significant differences in the signatures of sorption and mass transfer exist. The applicability of the developed model was evaluated by simulating published experimental data on calcium and strontium transport in a single fracture. The present model simulated the experimental data reasonably well in comparison to a model based on the equilibrium sorption assumption in the fracture matrix system and a multirate mass transfer model.
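
    The numerical inversion step mentioned above can be performed with any standard algorithm; the sketch below uses the Gaver-Stehfest method (a common choice, not necessarily the one used by the authors) and checks it against a known transform pair.

      # Hedged sketch: Gaver-Stehfest numerical inversion of a Laplace-domain
      # solution F(s), illustrated on F(s) = 1/(s+1), whose inverse is exp(-t).
      from math import factorial, log, exp

      def stehfest_inverse(F, t, N=12):
          """Approximate f(t) from its Laplace transform F(s); N must be even."""
          half = N // 2
          total = 0.0
          for k in range(1, N + 1):
              Vk = 0.0
              for j in range((k + 1) // 2, min(k, half) + 1):
                  Vk += (j ** half * factorial(2 * j) /
                         (factorial(half - j) * factorial(j) * factorial(j - 1) *
                          factorial(k - j) * factorial(2 * j - k)))
              Vk *= (-1) ** (k + half)
              total += Vk * F(k * log(2.0) / t)
          return total * log(2.0) / t

      # Quick check against the known pair F(s) = 1/(s+1)  <->  f(t) = exp(-t).
      for t in (0.5, 1.0, 2.0):
          print(t, stehfest_inverse(lambda s: 1.0 / (s + 1.0), t), exp(-t))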

  5. Acute Kidney Injury Risk Prediction in Patients Undergoing Coronary Angiography in a National Veterans Health Administration Cohort With External Validation.

    PubMed

    Brown, Jeremiah R; MacKenzie, Todd A; Maddox, Thomas M; Fly, James; Tsai, Thomas T; Plomondon, Mary E; Nielson, Christopher D; Siew, Edward D; Resnic, Frederic S; Baker, Clifton R; Rumsfeld, John S; Matheny, Michael E

    2015-12-11

    Acute kidney injury (AKI) occurs frequently after cardiac catheterization and percutaneous coronary intervention. Although a clinical risk model exists for percutaneous coronary intervention, no models exist for both procedures, nor do existing models account for risk factors prior to the index admission. We aimed to develop such a model for use in prospective automated surveillance programs in the Veterans Health Administration. We collected data on all patients undergoing cardiac catheterization or percutaneous coronary intervention in the Veterans Health Administration from January 01, 2009 to September 30, 2013, excluding patients with chronic dialysis, end-stage renal disease, renal transplant, and missing pre- and postprocedural creatinine measurement. We used 4 AKI definitions in model development and included risk factors from up to 1 year prior to the procedure and at presentation. We developed our prediction models for postprocedural AKI using the least absolute shrinkage and selection operator (LASSO) and internally validated using bootstrapping. We developed models using 115 633 angiogram procedures and externally validated using 27 905 procedures from a New England cohort. Models had cross-validated C-statistics of 0.74 (95% CI: 0.74-0.75) for AKI, 0.83 (95% CI: 0.82-0.84) for AKIN2, 0.74 (95% CI: 0.74-0.75) for contrast-induced nephropathy, and 0.89 (95% CI: 0.87-0.90) for dialysis. We developed a robust, externally validated clinical prediction model for AKI following cardiac catheterization or percutaneous coronary intervention to automatically identify high-risk patients before and immediately after a procedure in the Veterans Health Administration. Work is ongoing to incorporate these models into routine clinical practice. © 2015 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
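
    A minimal sketch of the general recipe described above (an L1-penalised, i.e. LASSO, logistic model with a cross-validated C-statistic) is shown below using scikit-learn on synthetic data. The settings and event rate are assumptions for illustration only and do not reproduce the published VA models.

      # Hedged sketch: LASSO logistic regression for a binary AKI-style outcome,
      # with the C-statistic (ROC AUC) estimated by cross-validation.
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegressionCV
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      # Synthetic cohort with roughly a 7% event rate (illustrative only).
      X, y = make_classification(n_samples=5000, n_features=30, n_informative=8,
                                 weights=[0.93, 0.07], random_state=0)

      model = make_pipeline(
          StandardScaler(),
          LogisticRegressionCV(penalty="l1", solver="liblinear", Cs=10,
                               scoring="roc_auc", cv=5, max_iter=5000),
      )

      c_statistics = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
      print("cross-validated C-statistic: %.2f (+/- %.2f)"
            % (c_statistics.mean(), c_statistics.std()))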

  6. Long-term athletic development- part 1: a pathway for all youth.

    PubMed

    Lloyd, Rhodri S; Oliver, Jon L; Faigenbaum, Avery D; Howard, Rick; De Ste Croix, Mark B A; Williams, Craig A; Best, Thomas M; Alvar, Brent A; Micheli, Lyle J; Thomas, D Phillip; Hatfield, Disa L; Cronin, John B; Myer, Gregory D

    2015-05-01

    The concept of developing talent and athleticism in youth is the goal of many coaches and sports systems. Consequently, an increasing number of sporting organizations have adopted long-term athletic development models in an attempt to provide a structured approach to the training of youth. It is clear that maximizing sporting talent is an important goal of long-term athletic development models. However, ensuring that youth of all ages and abilities are provided with a strategic plan for the development of their health and physical fitness is also important to maximize physical activity participation rates, reduce the risk of sport- and activity-related injury, and to ensure long-term health and well-being. Critical reviews of independent models of long-term athletic development are already present within the literature; however, to the best of our knowledge, a comprehensive examination and review of the most prominent models does not exist. Additionally, considerations of modern day issues that may impact on the success of any long-term athletic development model are lacking, as are proposed solutions to address such issues. Therefore, within this 2-part commentary, Part 1 provides a critical review of existing models of practice for long-term athletic development and introduces a composite youth development model that includes the integration of talent, psychosocial and physical development across maturation. Part 2 identifies limiting factors that may restrict the success of such models and offers potential solutions.

  7. Validation of Risk Assessment Models of Venous Thromboembolism in Hospitalized Medical Patients.

    PubMed

    Greene, M Todd; Spyropoulos, Alex C; Chopra, Vineet; Grant, Paul J; Kaatz, Scott; Bernstein, Steven J; Flanders, Scott A

    2016-09-01

    Patients hospitalized for acute medical illness are at increased risk for venous thromboembolism. Although risk assessment is recommended and several at-admission risk assessment models have been developed, these have not been adequately derived or externally validated. Therefore, an optimal approach to evaluate venous thromboembolism risk in medical patients is not known. We conducted an external validation study of existing venous thromboembolism risk assessment models using data collected on 63,548 hospitalized medical patients as part of the Michigan Hospital Medicine Safety (HMS) Consortium. For each patient, cumulative venous thromboembolism risk scores and risk categories were calculated. Cox regression models were used to quantify the association between venous thromboembolism events and assigned risk categories. Model discrimination was assessed using Harrell's C-index. Venous thromboembolism incidence in hospitalized medical patients is low (1%). Although existing risk assessment models demonstrate good calibration (hazard ratios for "at-risk" range 2.97-3.59), model discrimination is generally poor for all risk assessment models (C-index range 0.58-0.64). The performance of several existing risk assessment models for predicting venous thromboembolism among acutely ill, hospitalized medical patients at admission is limited. Given the low venous thromboembolism incidence in this nonsurgical patient population, careful consideration of how best to utilize existing venous thromboembolism risk assessment models is necessary, and further development and validation of novel venous thromboembolism risk assessment models for this patient population may be warranted. Published by Elsevier Inc.
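
    The validation approach described above (a Cox model on the assigned risk category, with Harrell's C-index for discrimination) can be sketched as follows with the lifelines package on made-up data; the column names and simulated numbers are illustrative, not the HMS Consortium data.

      # Hedged sketch: Cox regression on a binary "at-risk" category plus
      # Harrell's C-index, on simulated data (not the study's dataset).
      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter
      from lifelines.utils import concordance_index

      rng = np.random.default_rng(1)
      n = 2000
      df = pd.DataFrame({"at_risk": rng.integers(0, 2, n)})   # 1 = "at-risk" category
      # Shorter times to event in the at-risk group (illustrative hazard difference).
      df["time"] = rng.exponential(scale=np.where(df["at_risk"] == 1, 50, 150))
      df["vte_event"] = rng.binomial(1, 0.3, n)               # event/censoring indicator

      cph = CoxPHFitter()
      cph.fit(df, duration_col="time", event_col="vte_event")
      print(cph.summary[["coef", "exp(coef)"]])               # exp(coef) = hazard ratio for "at-risk"

      # Harrell's C: higher predicted risk should pair with shorter times.
      c_index = concordance_index(df["time"], -cph.predict_partial_hazard(df), df["vte_event"])
      print("Harrell's C-index:", round(c_index, 2))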

  8. The Key to Employability Developing a Practical Model of Graduate Employability

    ERIC Educational Resources Information Center

    Pool, Lorraine Dacre; Sewell, Peter

    2007-01-01

    Purpose: The purpose of this paper is to introduce a straightforward, practical model of employability that will allow the concept to be explained easily and that can be used as a framework for working with students to develop their employability. Design/methodology/approach: The model was developed from existing research into employability issues…

  9. Digital Broadband Content: Digital Content Strategies and policies. OECD Digital Economy Papers, No. 119

    ERIC Educational Resources Information Center

    OECD Publishing (NJ1), 2006

    2006-01-01

    The development of digital content raises new issues as rapid technological developments challenge existing business models and government policies. This OECD study identifies and discusses six groups of business and public policy issues and illustrates these with existing and potential OECD Digital Content Strategies and Policies: (1) Innovation…

  10. Evaluation of Proteus as a Tool for the Rapid Development of Models of Hydrologic Systems

    NASA Astrophysics Data System (ADS)

    Weigand, T. M.; Farthing, M. W.; Kees, C. E.; Miller, C. T.

    2013-12-01

    Models of modern hydrologic systems can be complex and involve a variety of operators with varying character. The goal is to implement approximations of such models that are both efficient for the developer and computationally efficient, which is a set of naturally competing objectives. Proteus is a Python-based toolbox that supports prototyping of model formulations as well as a wide variety of modern numerical methods and parallel computing. We used Proteus to develop numerical approximations for three models: Richards' equation, a brine flow model derived using the Thermodynamically Constrained Averaging Theory (TCAT), and a multiphase TCAT-based tumor growth model. For Richards' equation, we investigated discontinuous Galerkin solutions with higher order time integration based on the backward difference formulas. The TCAT brine flow model was implemented using Proteus and a variety of numerical methods were compared to hand coded solutions. Finally, an existing tumor growth model was implemented in Proteus to introduce more advanced numerics and allow the code to be run in parallel. From these three example models, Proteus was found to be an attractive open-source option for rapidly developing high quality code for solving existing and evolving computational science models.

  11. Prognosis model for stand development

    Treesearch

    Albert R. Stage

    1973-01-01

    Describes a set of computer programs for developing prognoses of the development of existing stands under alternative regimes of management. Calibration techniques, modeling procedures, and a procedure for including stochastic variation are described. Implementation of the system for lodgepole pine, including assessment of losses attributed to an infestation of mountain...

  12. An Alternative Theoretical Model: Examining Psychosocial Identity Development of International Students in the United States

    ERIC Educational Resources Information Center

    Kim, Eunyoung

    2012-01-01

    Despite the plethora of college student identity development research, very little attention has been paid to the identity formation of international students. Rather than adopting existing identity theories in college student development, this exploratory qualitative study proposes a new psychosocial identity development model for international…

  13. A Critical Review for Developing Accurate and Dynamic Predictive Models Using Machine Learning Methods in Medicine and Health Care.

    PubMed

    Alanazi, Hamdan O; Abdullah, Abdul Hanan; Qureshi, Kashif Naseer

    2017-04-01

    Recently, Artificial Intelligence (AI) has been used widely in the medicine and health care sector. Within AI, classification and prediction are major fields of machine learning. Today, the study of existing predictive models based on machine learning methods is extremely active. Doctors need accurate predictions for the outcomes of their patients' diseases. In addition, for accurate predictions, timing is another significant factor that influences treatment decisions. In this paper, existing predictive models in medicine and health care are critically reviewed. Furthermore, the best-known machine learning methods are explained, and the confusion between statistical approaches and machine learning is clarified. A review of related literature reveals that the predictions of existing predictive models differ even when the same dataset is used. Therefore, existing predictive models are essential, and current methods must be improved.
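    The review's observation that different models can disagree even on the same dataset is easy to reproduce; the sketch below trains two generic scikit-learn classifiers on one public dataset and reports how often their predictions differ. The dataset and model choices are placeholders, not those of the reviewed studies.

```python
# Hedged illustration: two classifiers fit on identical training data can still
# disagree on individual test cases.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "logistic_regression": LogisticRegression(max_iter=5000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
preds = {name: m.fit(X_tr, y_tr).predict(X_te) for name, m in models.items()}

disagreement = (preds["logistic_regression"] != preds["random_forest"]).mean()
print(f"fraction of test cases where the two models disagree: {disagreement:.3f}")
```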

  14. WHAT ARE THE BEST MEANS TO ASSESS SITES AND MOVE TOWARD CLOSURE, USING APPROPRIATE SITE SPECIFIC RISK EVALUATIONS?

    EPA Science Inventory

    To facilitate evaluation of existing site characterization data, ORD has developed on-line tools and models that integrate data and models into innovative applications. Forty calculators have been developed in four groups: parameter estimators, models, scientific demos and unit ...

  15. Innovating Method of Existing Mechanical Product Based on TRIZ Theory

    NASA Astrophysics Data System (ADS)

    Zhao, Cunyou; Shi, Dongyan; Wu, Han

    Adaptive design and variant design based on existing products are the main routes of product development. In this paper, a conceptual design framework and its flow model for product innovation are put forward by combining conceptual design methods with TRIZ theory. A process system model for innovative design is constructed that includes requirement analysis, total function analysis and decomposition, engineering problem analysis, solution finding for the engineering problem, and preliminary design; this establishes the basis for the innovative design of existing products.

  16. Model representations of kerogen structures: An insight from density functional theory calculations and spectroscopic measurements

    DOE PAGES

    Weck, Philippe F.; Kim, Eunja; Wang, Yifeng; ...

    2017-08-01

    Molecular structures of kerogen control hydrocarbon production in unconventional reservoirs. Significant progress has been made in developing model representations of various kerogen structures. These models have been widely used for the prediction of gas adsorption and migration in shale matrix. However, using density functional perturbation theory (DFPT) calculations and vibrational spectroscopic measurements, we here show that a large gap may still remain between the existing model representations and actual kerogen structures, therefore calling for new model development. Using DFPT, we calculated Fourier transform infrared (FTIR) spectra for six most widely used kerogen structure models. The computed spectra were then systematically compared to the FTIR absorption spectra collected for kerogen samples isolated from Mancos, Woodford and Marcellus formations representing a wide range of kerogen origin and maturation conditions. Limited agreement between the model predictions and the measurements highlights that the existing kerogen models may still miss some key features in structural representation. A combination of DFPT calculations with spectroscopic measurements may provide a useful diagnostic tool for assessing the adequacy of a proposed structural model as well as for future model development. This approach may eventually help develop comprehensive infrared (IR)-fingerprints for tracing kerogen evolution.
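    One simple way to quantify the kind of computed-versus-measured spectral agreement discussed above is sketched below: both spectra are interpolated onto a common wavenumber grid, rescaled, and scored with cosine similarity. This is an illustrative metric applied to synthetic placeholder spectra, not the comparison procedure used by the authors.

```python
# Hedged sketch: cosine similarity between a computed IR spectrum and a
# measured FTIR spectrum after interpolation onto a shared wavenumber axis.
import numpy as np

def spectral_similarity(wn_calc, I_calc, wn_meas, I_meas, n=2000):
    lo = max(wn_calc.min(), wn_meas.min())
    hi = min(wn_calc.max(), wn_meas.max())
    grid = np.linspace(lo, hi, n)                       # common wavenumber axis
    a = np.interp(grid, wn_calc, I_calc)
    b = np.interp(grid, wn_meas, I_meas)
    a = (a - a.min()) / (a.max() - a.min() + 1e-12)     # rescale to [0, 1]
    b = (b - b.min()) / (b.max() - b.min() + 1e-12)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# synthetic placeholder spectra (two Gaussian bands each)
wn = np.linspace(400, 4000, 1800)
calc = np.exp(-((wn - 1600) / 40) ** 2) + 0.5 * np.exp(-((wn - 2900) / 60) ** 2)
meas = np.exp(-((wn - 1620) / 45) ** 2) + 0.4 * np.exp(-((wn - 2920) / 70) ** 2)
print(f"cosine similarity: {spectral_similarity(wn, calc, wn, meas):.3f}")
```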

  17. Model representations of kerogen structures: An insight from density functional theory calculations and spectroscopic measurements.

    PubMed

    Weck, Philippe F; Kim, Eunja; Wang, Yifeng; Kruichak, Jessica N; Mills, Melissa M; Matteo, Edward N; Pellenq, Roland J-M

    2017-08-01

    Molecular structures of kerogen control hydrocarbon production in unconventional reservoirs. Significant progress has been made in developing model representations of various kerogen structures. These models have been widely used for the prediction of gas adsorption and migration in shale matrix. However, using density functional perturbation theory (DFPT) calculations and vibrational spectroscopic measurements, we here show that a large gap may still remain between the existing model representations and actual kerogen structures, therefore calling for new model development. Using DFPT, we calculated Fourier transform infrared (FTIR) spectra for six most widely used kerogen structure models. The computed spectra were then systematically compared to the FTIR absorption spectra collected for kerogen samples isolated from Mancos, Woodford and Marcellus formations representing a wide range of kerogen origin and maturation conditions. Limited agreement between the model predictions and the measurements highlights that the existing kerogen models may still miss some key features in structural representation. A combination of DFPT calculations with spectroscopic measurements may provide a useful diagnostic tool for assessing the adequacy of a proposed structural model as well as for future model development. This approach may eventually help develop comprehensive infrared (IR)-fingerprints for tracing kerogen evolution.

  18. Model representations of kerogen structures: An insight from density functional theory calculations and spectroscopic measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weck, Philippe F.; Kim, Eunja; Wang, Yifeng

    Molecular structures of kerogen control hydrocarbon production in unconventional reservoirs. Significant progress has been made in developing model representations of various kerogen structures. These models have been widely used for the prediction of gas adsorption and migration in shale matrix. However, using density functional perturbation theory (DFPT) calculations and vibrational spectroscopic measurements, we here show that a large gap may still remain between the existing model representations and actual kerogen structures, therefore calling for new model development. Using DFPT, we calculated Fourier transform infrared (FTIR) spectra for six most widely used kerogen structure models. The computed spectra were then systematically compared to the FTIR absorption spectra collected for kerogen samples isolated from Mancos, Woodford and Marcellus formations representing a wide range of kerogen origin and maturation conditions. Limited agreement between the model predictions and the measurements highlights that the existing kerogen models may still miss some key features in structural representation. A combination of DFPT calculations with spectroscopic measurements may provide a useful diagnostic tool for assessing the adequacy of a proposed structural model as well as for future model development. This approach may eventually help develop comprehensive infrared (IR)-fingerprints for tracing kerogen evolution.

  19. Integrating O/S models during conceptual design, part 1

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles E.

    1994-01-01

    The University of Dayton is pleased to submit this report to the National Aeronautics and Space Administration (NASA), Langley Research Center, which integrates a set of models for determining operational capabilities and support requirements during the conceptual design of proposed space systems. This research provides for the integration of the reliability and maintainability (R&M) model, both new and existing simulation models, and existing operations and support (O&S) costing equations in arriving at a complete analysis methodology. Details concerning the R&M model and the O&S costing model may be found in previous reports accomplished under this grant (NASA Research Grant NAG1-1327). In the process of developing this comprehensive analysis approach, significant enhancements were made to the R&M model, updates to the O&S costing model were accomplished, and a new simulation model developed. This is the 1st part of a 3 part technical report.

  20. A Multidimensional Model for Child Maltreatment Prevention Readiness in Low- and Middle-Income Countries

    ERIC Educational Resources Information Center

    Mikton, Christopher; Mehra, Radhika; Butchart, Alexander; Addiss, David; Almuneef, Maha; Cardia, Nancy; Cheah, Irene; Chen, JingQi; Makoae, Mokhantso; Raleva, Marija

    2011-01-01

    The study's aim was to develop a multidimensional model for the assessment of child maltreatment prevention readiness in low- and middle-income countries. The model was developed based on a conceptual review of relevant existing models and approaches, an international expert consultation, and focus groups in six countries. The final model…

  1. Library Legislation.

    ERIC Educational Resources Information Center

    Kavass, Igor

    Examination of several library legislation models developed to meet the needs of developed and developing nations reveals that our traditional notion of the library's role in society must be abandoned if we wish to reconcile its benefits to its costs. Four models currently exist: many nations, particularly Asian, have no legislation; most nations,…

  2. Developing rural palliative care: validating a conceptual model.

    PubMed

    Kelley, Mary Lou; Williams, Allison; DeMiglio, Lily; Mettam, Hilary

    2011-01-01

    The purpose of this research was to validate a conceptual model for developing palliative care in rural communities. This model articulates how local rural healthcare providers develop palliative care services according to four sequential phases. The model has roots in concepts of community capacity development, evolves from collaborative, generalist rural practice, and utilizes existing health services infrastructure. It addresses how rural providers manage challenges, specifically those related to: lack of resources, minimal community understanding of palliative care, health professionals' resistance, the bureaucracy of the health system, and the obstacles of providing services in rural environments. Seven semi-structured focus groups were conducted with interdisciplinary health providers in 7 rural communities in two Canadian provinces. Using a constant comparative analysis approach, focus group data were analyzed by examining participants' statements in relation to the model and comparing emerging themes in the development of rural palliative care to the elements of the model. The data validated the conceptual model as the model was able to theoretically predict and explain the experiences of the 7 rural communities that participated in the study. New emerging themes from the data elaborated existing elements in the model and informed the requirement for minor revisions. The model was validated and slightly revised, as suggested by the data. The model was confirmed as being a useful theoretical tool for conceptualizing the development of rural palliative care that is applicable in diverse rural communities.

  3. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes, although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are needed for repository modeling are severely lacking. In addition, most of the existing reactive transport codes were developed for non-radioactive contaminants, and they need to be adapted to account for radionuclide decay and in-growth. The accessibility to the source codes is generally limited. Because the problems of interest for the Waste IPSC are likely to result in relatively large computational models, a compact memory-usage footprint and a fast/robust solution procedure will be needed. A robust massively parallel processing (MPP) capability will also be required to provide reasonable turnaround times on the analyses that will be performed with the code. A performance assessment (PA) calculation for a waste disposal system generally requires a large number (hundreds to thousands) of model simulations to quantify the effect of model parameter uncertainties on the predicted repository performance. A set of codes for a PA calculation must be sufficiently robust and fast in terms of code execution. A PA system as a whole must be able to provide multiple alternative models for a specific set of physical/chemical processes, so that the users can choose various levels of modeling complexity based on their modeling needs. This requires PA codes, preferably, to be highly modularized. Most of the existing codes have difficulties meeting these requirements.
    Based on the gap analysis results, we have made the following recommendations for the code selection and code development for the NEAMS waste IPSC: (1) build fully coupled high-fidelity THCMBR codes using the existing SIERRA codes (e.g., ARIA and ADAGIO) and platform, (2) use DAKOTA to build an enhanced performance assessment system (EPAS), and build a modular code architecture and key code modules for performance assessments. The key chemical calculation modules will be built by expanding the existing CANTERA capabilities as well as by extracting useful components from other existing codes.

  4. Urbanization and watershed sustainability: Collaborative simulation modeling of future development states

    NASA Astrophysics Data System (ADS)

    Randhir, Timothy O.; Raposa, Sarah

    2014-11-01

    Urbanization has a significant impact on water resources and requires a watershed-based approach to evaluate the impacts of land use and urban development on watershed processes. This study uses simulation of urban policy scenarios to develop transferable recommendations that help municipalities and cities guide urban decisions using watershed ecohydrologic principles. The watershed simulation model is used to evaluate intensive (policy in existing built regions) and extensive (policy outside existing built regions) urban development scenarios with and without implementation of Best Management Practices (BMPs). Water quantity and quality changes are simulated to assess the effectiveness of five urban development scenarios. It is observed that an optimal combination of intensive and extensive strategies can be used to sustain urban ecosystems. BMPs are found to be critical for reducing stormwater and water quality impacts of urban development. Conservation zoning and incentives for voluntary adoption of BMPs can be used to sustain urbanizing watersheds.

  5. Mathematical model for dynamic cell formation in fast fashion apparel manufacturing stage

    NASA Astrophysics Data System (ADS)

    Perera, Gayathri; Ratnayake, Vijitha

    2018-05-01

    This paper presents a mathematical programming model for dynamic cell formation that minimizes changeover-related costs (i.e., machine relocation and machine setup costs) and inter-cell material handling cost, to cope with the volatile production environments of the apparel manufacturing industry. The model is formulated from the findings of a comprehensive literature review. The developed model is validated with data collected from three factories in the apparel industry that manufacture fast fashion products. A program code is developed using the Lingo 16.0 software package to generate optimal cells for the developed model and to determine the possible cost-saving percentage when the existing layouts used in the three factories are replaced by the generated optimal cells. The optimal cells generated by the developed mathematical model result in significant cost savings compared with the existing product layouts used in the production/assembly departments of the selected factories. The developed model can therefore be considered effective in minimizing the considered cost terms in the dynamic production environment of fast fashion apparel manufacturing. The findings of this paper can be used for further research on minimizing changeover-related costs in the fast fashion apparel production stage.
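    A small, hedged mixed-integer sketch of the core cell-formation trade-off described above (assigning machines and part families to cells so as to minimize inter-cell material handling) is shown below using the PuLP modeling package. The data, cost weight, and cell-size limit are invented for illustration, and the paper's multi-period relocation and setup cost terms are omitted.

```python
# Hedged cell-formation sketch: penalize every (part, machine) visit that is
# not served inside the part's own cell ("exceptional" inter-cell handling).
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, PULP_CBC_CMD

machines = ["M1", "M2", "M3", "M4"]
parts = {"P1": ["M1", "M2"], "P2": ["M2", "M3"], "P3": ["M3", "M4"]}  # routings
cells = ["C1", "C2"]
max_machines_per_cell = 2
handling_cost = 10.0

prob = LpProblem("cell_formation", LpMinimize)
x = LpVariable.dicts("x", (machines, cells), cat=LpBinary)   # machine -> cell
y = LpVariable.dicts("y", (parts, cells), cat=LpBinary)      # part family -> cell
z = {(p, m, c): LpVariable(f"z_{p}_{m}_{c}", cat=LpBinary)   # visit served in-cell
     for p in parts for m in parts[p] for c in cells}

for m in machines:
    prob += lpSum(x[m][c] for c in cells) == 1
for p in parts:
    prob += lpSum(y[p][c] for c in cells) == 1
for c in cells:
    prob += lpSum(x[m][c] for m in machines) <= max_machines_per_cell
for p in parts:
    for m in parts[p]:
        for c in cells:
            prob += z[(p, m, c)] <= x[m][c]
            prob += z[(p, m, c)] <= y[p][c]

# objective: handling cost for every visit not served inside the part's cell
prob += lpSum(handling_cost * (1 - lpSum(z[(p, m, c)] for c in cells))
              for p in parts for m in parts[p])

prob.solve(PULP_CBC_CMD(msg=False))
for m in machines:
    print(m, "->", next(c for c in cells if x[m][c].value() > 0.5))
```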

  6. Systems development of a stall/spin research facility using remotely controlled/augmented aircraft models. Volume 1: Systems overview

    NASA Technical Reports Server (NTRS)

    Montoya, R. J.; Jai, A. R.; Parker, C. D.

    1979-01-01

    A ground-based, general-purpose, real-time, digital control system simulator (CSS) is specified, developed, and integrated with the existing instrumentation van of the testing facility. This CSS is built around a PDP-11/55, and its operational software was developed to meet the dual goals of providing the immediate capability to represent the F-18 drop model control laws and the flexibility for expansion to represent more complex control laws typical of control-configured vehicles. The two CSSs developed are reviewed, as well as the overall system after their integration with the existing facility. The latest version of the F-18 drop model control laws (REV D) is also described, and the changes needed for its incorporation in the digital and analog CSSs are discussed.

  7. Point Source X-Ray Lithography System for Sub-0.15 Micron Design Rules

    DTIC Science & Technology

    1998-05-22

    consist of a SAL-developed stepper, an SRL-developed Dense Plasma Focus (DPF) X-Ray source, and a CXrL-developed beam line. The system will be ... existing machine that used spark gap switching, SRL has developed an all-solid-state driver and improved head electrode assembly for their dense plasma ... focus X-Ray source. Likewise, SAL has used their existing Model 4 stepper installed at CXrL as a design starting point, and has developed an advanced

  8. Associative learning is necessary but not sufficient for mirror neuron development.

    PubMed

    Bonaiuto, James

    2014-04-01

    Existing computational models of the mirror system demonstrate the additional circuitry needed for mirror neurons to display the range of properties that they exhibit. Such models emphasize the need for existing connectivity to form visuomotor associations, processing to reduce the space of possible inputs, and demonstrate the role neurons with mirror properties might play in monitoring one's own actions.

  9. Cooperative inference: Features, objects, and collections.

    PubMed

    Searcy, Sophia Ray; Shafto, Patrick

    2016-10-01

    Cooperation plays a central role in theories of development, learning, cultural evolution, and education. We argue that existing models of learning from cooperative informants have fundamental limitations that prevent them from explaining how cooperation benefits learning. First, existing models are shown to be computationally intractable, suggesting that they cannot apply to realistic learning problems. Second, existing models assume a priori agreement about which concepts are favored in learning, which leads to a conundrum: Learning fails without precise agreement on bias yet there is no single rational choice. We introduce cooperative inference, a novel framework for cooperation in concept learning, which resolves these limitations. Cooperative inference generalizes the notion of cooperation used in previous models from omission of labeled objects to the omission values of features, labels for objects, and labels for collections of objects. The result is an approach that is computationally tractable, does not require a priori agreement about biases, applies to both Boolean and first-order concepts, and begins to approximate the richness of real-world concept learning problems. We conclude by discussing relations to and implications for existing theories of cognition, cognitive development, and cultural evolution. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  10. MBA in Education Leadership: A Model for Developing an Interdisciplinary Principal Preparation Program

    ERIC Educational Resources Information Center

    Smith, Rachel A.; Somers, John

    2016-01-01

    This paper presents a model for developing an interdisciplinary principal preparation program, an MBA in Education Leadership, which integrates best practices in both education and business within an educational context. The paper addresses gaps that exist in many traditional principal preparation programs and provides an alternative model, which…

  11. Habitat Suitability Index Models: Black-shouldered kite

    USGS Publications Warehouse

    Faanes, Craig A.; Howard, Rebecca J.

    1987-01-01

    A review and synthesis of existing information were used to develop a model for evaluating black-shouldered kite habitat quality. The model is scaled to produce an index between 0 (unsuitable habitat) and 1.0 (optimal habitat). Habitat suitability index models are designed for use with the Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service. Guidelines for model application are provided.
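    The sketch below illustrates, in generic form, how a habitat suitability index of this kind is typically assembled: component suitability indices on a 0-1 scale are read off piecewise-linear curves and combined, here with a geometric mean. The variables and curve breakpoints are hypothetical, not the published black-shouldered kite model.

```python
# Generic HSI sketch with hypothetical habitat variables.
import numpy as np

def suitability(value, breakpoints, scores):
    """Piecewise-linear suitability curve returning a value in [0, 1]."""
    return float(np.clip(np.interp(value, breakpoints, scores), 0.0, 1.0))

def hsi(component_indices):
    """Combine component indices (each in [0, 1]) with a geometric mean."""
    v = np.asarray(component_indices, dtype=float)
    return float(np.prod(v) ** (1.0 / len(v))) if np.all(v > 0) else 0.0

# hypothetical components: % grassland cover and perch-tree density
si_grass = suitability(60.0, [0, 20, 50, 100], [0.0, 0.3, 1.0, 1.0])
si_perch = suitability(8.0, [0, 2, 10, 30], [0.0, 0.4, 1.0, 0.8])
print(f"HSI = {hsi([si_grass, si_perch]):.2f}")
```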

  12. Center for Modeling of Turbulence and Transition: Research Briefs, 1995

    NASA Technical Reports Server (NTRS)

    1995-01-01

    This research brief contains the progress reports of the research staff of the Center for Modeling of Turbulence and Transition (CMOTT) from July 1993 to July 1995. It also constitutes a progress report to the Institute of Computational Mechanics in Propulsion located at the Ohio Aerospace Institute and the Lewis Research Center. CMOTT has been in existence for about four years. In the first three years, its main activities were to develop and validate turbulence and combustion models for propulsion systems, in an effort to remove the deficiencies of existing models. Three workshops on computational turbulence modeling were held at LeRC (1991, 1993, 1994). At present, CMOTT is integrating the CMOTT developed/improved models into CFD tools which can be used by the propulsion systems community. This activity has resulted in an increased collaboration with the Lewis CFD researchers.

  13. Center for modeling of turbulence and transition: Research briefs, 1995

    NASA Astrophysics Data System (ADS)

    1995-10-01

    This research brief contains the progress reports of the research staff of the Center for Modeling of Turbulence and Transition (CMOTT) from July 1993 to July 1995. It also constitutes a progress report to the Institute of Computational Mechanics in Propulsion located at the Ohio Aerospace Institute and the Lewis Research Center. CMOTT has been in existence for about four years. In the first three years, its main activities were to develop and validate turbulence and combustion models for propulsion systems, in an effort to remove the deficiencies of existing models. Three workshops on computational turbulence modeling were held at LeRC (1991, 1993, 1994). At present, CMOTT is integrating the CMOTT developed/improved models into CFD tools which can be used by the propulsion systems community. This activity has resulted in an increased collaboration with the Lewis CFD researchers.

  14. A systematic review of predictive models for asthma development in children.

    PubMed

    Luo, Gang; Nkoy, Flory L; Stone, Bryan L; Schmick, Darell; Johnson, Michael D

    2015-11-28

    Asthma is the most common pediatric chronic disease affecting 9.6 % of American children. Delay in asthma diagnosis is prevalent, resulting in suboptimal asthma management. To help avoid delay in asthma diagnosis and advance asthma prevention research, researchers have proposed various models to predict asthma development in children. This paper reviews these models. A systematic review was conducted through searching in PubMed, EMBASE, CINAHL, Scopus, the Cochrane Library, the ACM Digital Library, IEEE Xplore, and OpenGrey up to June 3, 2015. The literature on predictive models for asthma development in children was retrieved, with search results limited to human subjects and children (birth to 18 years). Two independent reviewers screened the literature, performed data extraction, and assessed article quality. The literature search returned 13,101 references in total. After manual review, 32 of these references were determined to be relevant and are discussed in the paper. We identify several limitations of existing predictive models for asthma development in children, and provide preliminary thoughts on how to address these limitations. Existing predictive models for asthma development in children have inadequate accuracy. Efforts to improve these models' performance are needed, but are limited by a lack of a gold standard for asthma development in children.

  15. Common data model for natural language processing based on two existing standard information models: CDA+GrAF.

    PubMed

    Meystre, Stéphane M; Lee, Sanghoon; Jung, Chai Young; Chevrier, Raphaël D

    2012-08-01

    An increasing need for collaboration and resources sharing in the Natural Language Processing (NLP) research and development community motivates efforts to create and share a common data model and a common terminology for all information annotated and extracted from clinical text. We have combined two existing standards: the HL7 Clinical Document Architecture (CDA), and the ISO Graph Annotation Format (GrAF; in development), to develop such a data model entitled "CDA+GrAF". We experimented with several methods to combine these existing standards, and eventually selected a method wrapping separate CDA and GrAF parts in a common standoff annotation (i.e., separate from the annotated text) XML document. Two use cases, clinical document sections, and the 2010 i2b2/VA NLP Challenge (i.e., problems, tests, and treatments, with their assertions and relations), were used to create examples of such standoff annotation documents, and were successfully validated with the XML schemata provided with both standards. We developed a tool to automatically translate annotation documents from the 2010 i2b2/VA NLP Challenge format to GrAF, and automatically generated 50 annotation documents using this tool, all successfully validated. Finally, we adapted the XSL stylesheet provided with HL7 CDA to allow viewing annotation XML documents in a web browser, and plan to adapt existing tools for translating annotation documents between CDA+GrAF and the UIMA and GATE frameworks. This common data model may ease directly comparing NLP tools and applications, combining their output, transforming and "translating" annotations between different NLP applications, and eventually "plug-and-play" of different modules in NLP applications. Copyright © 2011 Elsevier Inc. All rights reserved.
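    The standoff idea described above, in which annotations are kept in a separate document that points at character offsets in the clinical text, can be illustrated in a few lines of Python; the element and attribute names below are generic placeholders, not the actual CDA or GrAF schemas.

```python
# Hedged standoff-annotation illustration: annotations reference character
# offsets in the source text instead of being embedded in it.
import xml.etree.ElementTree as ET

text = "Patient denies chest pain. Started metformin 500 mg daily."
annotations = [
    {"id": "a1", "start": 15, "end": 25, "type": "problem", "assertion": "absent"},
    {"id": "a2", "start": 35, "end": 44, "type": "treatment", "assertion": "present"},
]

root = ET.Element("standoffAnnotations", {"textLength": str(len(text))})
for a in annotations:
    node = ET.SubElement(root, "annotation",
                         {"id": a["id"], "start": str(a["start"]), "end": str(a["end"]),
                          "type": a["type"], "assertion": a["assertion"]})
    ET.SubElement(node, "coveredText").text = text[a["start"]:a["end"]]

print(ET.tostring(root, encoding="unicode"))
```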

  16. Determination of Network Attributes from a High Resolution Terrain Data Base

    DTIC Science & Technology

    1987-09-01

    and existing models is in the method used to make decisions. All of the models reviewed when developing the ALARM strategy depended either on threshold ... problems with the methods currently accepted and used to model the decision process. These methods are recognized because they have their uses ... observation, detection, and lines of sight along a narrow strip of terrain relative to the overall size of the sectors of the two forces. Existing methods of

  17. Forestry sector analysis for developing countries: issues and methods.

    Treesearch

    R.W. Haynes

    1993-01-01

    A satellite meeting of the 10th Forestry World Congress focused on the methods used for forest sector analysis and their applications in both developed and developing countries. The results of that meeting are summarized, and a general approach for forest sector modeling is proposed. The approach includes models derived from the existing...

  18. The public health nutrition intervention management bi-cycle: a model for training and practice improvement.

    PubMed

    Hughes, Roger; Margetts, Barrie

    2012-11-01

    The present paper describes a model for public health nutrition practice designed to facilitate practice improvement and provide a step-wise approach to assist with workforce development. The bi-cycle model for public health nutrition practice has been developed based on existing cyclical models for intervention management but modified to integrate discrete capacity-building practices. The model is intended for education and practice settings and will have applications for educators and practitioners. Modifications to existing models have been informed by the authors' observations and experiences as practitioners and educators, and reflect a conceptual framework with applications in workforce development and practice improvement. From a workforce development and educational perspective, the model is designed to reflect adult learning principles, exposing students to experiential, problem-solving and practical learning experiences that reflect the realities of work as a public health nutritionist. In doing so, it assists the development of competency beyond knowing to knowing how, showing how and doing. This progression of learning from knowledge to performance is critical to effective competency development for effective practice. Public health nutrition practice is dynamic and varied, and models need to be adaptable and applicable to practice context to have utility. The paper serves to stimulate debate in the public health nutrition community, to encourage critical feedback about the validity, applicability and utility of this model in different practice contexts.

  19. Model Uncertainty Quantification Methods In Data Assimilation

    NASA Astrophysics Data System (ADS)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high-dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real-world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
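    A common place where model uncertainty enters a data assimilation cycle is as an assumed additive model-error covariance that inflates the forecast ensemble. The sketch below shows this in a toy stochastic ensemble Kalman filter; the linear dynamics and all noise statistics are invented for illustration and are not the author's methods.

```python
# Toy stochastic EnKF: additive model noise Q inflates the forecast spread.
import numpy as np

rng = np.random.default_rng(0)
n_state, n_ens, n_steps = 3, 50, 20
H = np.array([[1.0, 0.0, 0.0]])          # observe first state component only
R = np.array([[0.1]])                    # observation error covariance
Q = 0.05 * np.eye(n_state)               # assumed additive model error covariance
M = np.array([[0.9, 0.1, 0.0],           # toy linear forecast model
              [0.0, 0.9, 0.1],
              [0.1, 0.0, 0.9]])

truth = np.ones(n_state)
ens = rng.normal(0.0, 1.0, size=(n_state, n_ens))

for _ in range(n_steps):
    truth = M @ truth
    y = H @ truth + rng.normal(0.0, np.sqrt(R[0, 0]))           # synthetic observation
    ens = M @ ens + rng.multivariate_normal(np.zeros(n_state), Q, n_ens).T  # forecast
    Pf = np.cov(ens)                                            # ensemble covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)              # Kalman gain
    perturbed_obs = y + rng.normal(0.0, np.sqrt(R[0, 0]), n_ens)
    ens = ens + K @ (perturbed_obs[None, :] - H @ ens)          # analysis update

print("truth:", np.round(truth, 2), "analysis mean:", np.round(ens.mean(axis=1), 2))
```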

  20. Enhanced model of photovoltaic cell/panel/array considering the direct and reverse modes

    NASA Astrophysics Data System (ADS)

    Zegaoui, Abdallah; Boutoubat, Mohamed; Sawicki, Jean-Paul; Kessaissia, Fatma Zohra; Djahbar, Abdelkader; Aillerie, Michel

    2018-05-01

    This paper presents an improved generalized physical model for photovoltaic (PV) cells, panels and arrays that takes into account the behavior of these devices under both direct and reverse bias. Existing PV physical models are generally efficient at simulating the influence of irradiation changes on the short-circuit current, but they cannot capture the influence of temperature changes. The Enhanced Direct and Reverse Mode model, named the EDRM model, captures the influence of both temperature and irradiation on the short-circuit current in the reverse mode of the considered PV devices. Due to its easy implementation, the proposed model can be a useful tool for the development of new photovoltaic systems that accounts more exhaustively for environmental conditions. The developed model was tested on a marketed PV panel and gives satisfactory results compared with the parameters given in the manufacturer datasheet.
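    As an illustrative stand-in for modeling a PV cell in both direct and reverse bias, the sketch below uses the standard single-diode equation (with series resistance ignored so the current is explicit in voltage) extended by a Bishop-style breakdown factor on the shunt branch; the parameter values are assumptions, not EDRM parameters.

```python
# Hedged single-diode sketch with a breakdown term so reverse bias is covered.
import numpy as np

k, q = 1.380649e-23, 1.602176634e-19     # Boltzmann constant, elementary charge

def cell_current(V, G=1000.0, T=298.15, Iph_stc=8.0, I0=1e-9, n=1.3,
                 Rsh=300.0, Vbr=-15.0, a=0.002, m=3.0):
    Vt = k * T / q
    Iph = Iph_stc * G / 1000.0                        # photocurrent ~ irradiance
    diode = I0 * (np.exp(V / (n * Vt)) - 1.0)         # forward diode current
    breakdown = 1.0 + a * np.power(np.clip(1.0 - V / Vbr, 1e-6, None), -m)
    shunt = (V / Rsh) * breakdown                     # grows sharply as V -> Vbr
    return Iph - diode - shunt

V = np.linspace(-14.0, 0.6, 8)
print(np.round(cell_current(V), 3))
```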

  1. A Developed Meta-model for Selection of Cotton Fabrics Using Design of Experiments and TOPSIS Method

    NASA Astrophysics Data System (ADS)

    Chakraborty, Shankar; Chatterjee, Prasenjit

    2017-12-01

    Selection of cotton fabrics for providing optimal clothing comfort is often considered a multi-criteria decision making problem, consisting of an array of candidate alternatives to be evaluated based on several conflicting properties. In this paper, design of experiments and the technique for order preference by similarity to ideal solution (TOPSIS) are integrated so as to develop regression meta-models for identifying the most suitable cotton fabrics with respect to the computed TOPSIS scores. The applicability of the adopted method is demonstrated using two real-time examples. The developed models can also identify the statistically significant fabric properties and their interactions affecting the measured TOPSIS scores and the final selection decisions. There exists a good degree of congruence between the ranking patterns derived using these meta-models and the existing methods for cotton fabric ranking and subsequent selection.
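    A compact TOPSIS scorer of the kind used above to rank candidate fabrics is sketched below; the decision matrix, weights, and criterion directions are invented placeholders, and the paper's regression meta-modeling step is not reproduced.

```python
# Hedged TOPSIS sketch on an invented fabric decision matrix.
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j]=True if larger is better."""
    X = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float) / np.sum(weights)
    norm = X / np.linalg.norm(X, axis=0)              # vector normalization
    V = norm * w                                      # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_best = np.linalg.norm(V - ideal, axis=1)
    d_worst = np.linalg.norm(V - anti, axis=1)
    return d_worst / (d_best + d_worst)               # closeness coefficient

# placeholder criteria, e.g. areal density, thickness, air permeability
fabrics = [[120, 0.45, 65], [150, 0.50, 70], [135, 0.40, 60]]
scores = topsis(fabrics, weights=[0.3, 0.3, 0.4], benefit=[False, False, True])
print("TOPSIS scores:", np.round(scores, 3), "best alternative:", int(np.argmax(scores)))
```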

  2. A cost-effectiveness comparison of existing and Landsat-aided snow water content estimation systems

    NASA Technical Reports Server (NTRS)

    Sharp, J. M.; Thomas, R. W.

    1975-01-01

    This study describes how Landsat imagery can be cost-effectively employed to augment an operational hydrologic model. Attention is directed toward the estimation of snow water content, a major predictor variable in the volumetric runoff forecasting model presently used by the California Department of Water Resources. A stratified double sampling scheme is supplemented with qualitative and quantitative analyses of existing operations to develop a comparison between the existing and satellite-aided approaches to snow water content estimation. Results show a decided advantage for the Landsat-aided approach.

  3. DARPA Emerging Technologies

    DTIC Science & Technology

    2016-01-01

    development requires wind tunnels and ranges that do not currently exist. Furthermore, continued technology maturation is needed for thermal management ... designed with conceptual design engine model (at existing technology level), or existing propulsion system, or modified propulsion system (e.g. ... internal cameras reading gauges and dials and switch positions, directly tapping into current or future avionics service buses and integrating

  4. Rectenna thermal model development

    NASA Technical Reports Server (NTRS)

    Kadiramangalam, Murall; Alden, Adrian; Speyer, Daniel

    1992-01-01

    Deploying rectennas in space requires adapting existing designs developed for terrestrial applications to the space environment. One of the major issues in doing so is to understand the thermal performance of existing designs in the space environment. Toward that end, a 3D rectenna thermal model has been developed, which involves analyzing shorted rectenna elements and finite size rectenna element arrays. A shorted rectenna element is a single element whose ends are connected together by a material of negligible thermal resistance. A shorted element is a good approximation to a central element of a large array. This model has been applied to Brown's 2.45 GHz rectenna design. Results indicate that Brown's rectenna requires redesign or some means of enhancing the heat dissipation in order for the diode temperature to be maintained below 200 C above an output power density of 620 W/sq.m. The model developed in this paper is very general and can be used for the analysis and design of any type of rectenna design of any frequency.

  5. Software risk management through independent verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Zhou, Tong C.; Wood, Ralph

    1995-01-01

    Software project managers need tools to estimate and track project goals in a continuous fashion before, during, and after development of a system. In addition, they need an ability to compare the current project status with past project profiles to validate management intuition, identify problems, and then direct appropriate resources to the sources of problems. This paper describes a measurement-based approach to calculating the risk inherent in meeting project goals that leverages past project metrics and existing estimation and tracking models. We introduce the IV&V Goal/Questions/Metrics model, explain its use in the software development life cycle, and describe our attempts to validate the model through the reverse engineering of existing projects.

  6. Enhanced Vapor-Phase Diffusion in Porous Media - LDRD Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ho, C.K.; Webb, S.W.

    1999-01-01

    As part of the Laboratory-Directed Research and Development (LDRD) Program at Sandia National Laboratories, an investigation into the existence of enhanced vapor-phase diffusion (EVD) in porous media has been conducted. A thorough literature review was initially performed across multiple disciplines (soil science and engineering), and based on this review, the existence of EVD was found to be questionable. As a result, modeling and experiments were initiated to investigate the existence of EVD. In this LDRD, the first mechanistic model of EVD was developed, which demonstrated the mechanisms responsible for EVD. The first direct measurements of EVD have also been conducted at multiple scales. Measurements have been made at the pore scale, in a two-dimensional network as represented by a fracture aperture, and in a porous medium. Significant enhancement of vapor-phase transport relative to Fickian diffusion was measured in all cases. The modeling and experimental results provide additional mechanisms for EVD beyond those presented by the generally accepted model of Philip and deVries (1957), which required a thermal gradient for EVD to exist. Modeling and experimental results show significant enhancement under isothermal conditions. Application of EVD to vapor transport in the near-surface vadose zone shows a significant variation between no enhancement, the model of Philip and deVries, and the present results. Based on this information, the model of Philip and deVries may need to be modified, and additional studies are recommended.

  7. A Practical, Robust Methodology for Acquiring New Observation Data Using Computationally Expensive Groundwater Models

    NASA Astrophysics Data System (ADS)

    Siade, Adam J.; Hall, Joel; Karelse, Robert N.

    2017-11-01

    Regional groundwater flow models play an important role in decision making regarding water resources; however, the uncertainty embedded in model parameters and model assumptions can significantly hinder the reliability of model predictions. One way to reduce this uncertainty is to collect new observation data from the field. However, determining where and when to obtain such data is not straightforward. There exist a number of data-worth and experimental design strategies developed for this purpose. However, these studies often ignore issues related to real-world groundwater models such as computational expense, existing observation data, high-parameter dimension, etc. In this study, we propose a methodology, based on existing methods and software, to efficiently conduct such analyses for large-scale, complex regional groundwater flow systems for which there is a wealth of available observation data. The method utilizes the well-established d-optimality criterion, and the minimax criterion for robust sampling strategies. The so-called Null-Space Monte Carlo method is used to reduce the computational burden associated with uncertainty quantification. And, a heuristic methodology, based on the concept of the greedy algorithm, is proposed for developing robust designs with subsets of the posterior parameter samples. The proposed methodology is tested on a synthetic regional groundwater model, and subsequently applied to an existing, complex, regional groundwater system in the Perth region of Western Australia. The results indicate that robust designs can be obtained efficiently, within reasonable computational resources, for making regional decisions regarding groundwater level sampling.
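    The greedy, d-optimality-flavored selection idea can be sketched in a few lines: at each step, add the candidate observation whose sensitivity row most increases the log-determinant of the information matrix. This is an illustration of the general concept with random placeholder sensitivities, not the PEST/Null-Space Monte Carlo workflow used in the study.

```python
# Hedged greedy D-optimal selection of new observation locations.
import numpy as np

rng = np.random.default_rng(1)
n_candidates, n_params = 40, 8
J = rng.normal(size=(n_candidates, n_params))        # candidate sensitivities (placeholder)
prior = 1e-3 * np.eye(n_params)                      # regularization / prior information

def log_det_information(rows):
    Js = J[rows]
    sign, logdet = np.linalg.slogdet(Js.T @ Js + prior)
    return logdet if sign > 0 else -np.inf

selected, remaining = [], list(range(n_candidates))
for _ in range(5):                                   # pick 5 new observations
    best = max(remaining, key=lambda i: log_det_information(selected + [i]))
    selected.append(best)
    remaining.remove(best)

print("greedy D-optimal picks:", selected)
print("final log det of information matrix:", round(log_det_information(selected), 3))
```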

  8. Chapter 2: Fire and Fuels Extension: Model description

    Treesearch

    Sarah J. Beukema; Elizabeth D. Reinhardt; Julee A. Greenough; Donald C. E. Robinson; Werner A. Kurz

    2003-01-01

    The Fire and Fuels Extension to the Forest Vegetation Simulator is a model that simulates fuel dynamics and potential fire behavior over time, in the context of stand development and management. Existing models are used to represent forest stand development (the Forest Vegetation Simulator, Wykoff and others 1982), fire behavior (Rothermel 1972, Van Wagner 1977, and...

  9. An Exploratory Study of the Elements to Develop a Coaching Model

    ERIC Educational Resources Information Center

    Brown, Gwendolyn

    2010-01-01

    This exploratory study examined the elements of a coaching model based on the best practices that first focus on providing managers with the ability to develop workers and increase productivity, before using existing models that only support the process of managing workers, when it becomes apparent that the worker is not meeting expected…

  10. Detecting Responses of Loblolly Pine Stand Development to Site-Preparation Intensity: A Modeling Approach

    Treesearch

    Mingguang Xu; Timothy B. Harrington; M. Boyd Edwards

    1997-01-01

    Data from an existing site preparation experiment in the Georgia Piedmont were subjected to a modeling approach to analyze effects of site preparation intensity on stand development of loblolly pine (Pinus taeda L.) 5 to 12 years since treatment. An average stand height model that incorporated indicator variables for treatment provided an accurate...

  11. Colour Model for Outdoor Machine Vision for Tropical Regions and its Comparison with the CIE Model

    NASA Astrophysics Data System (ADS)

    Sahragard, Nasrolah; Ramli, Abdul Rahman B.; Hamiruce Marhaban, Mohammad; Mansor, Shattri B.

    2011-02-01

    Accurate modeling of daylight and surface reflectance is very useful for most outdoor machine vision applications, specifically those based on color recognition. The existing CIE daylight model has drawbacks that limit its ability to predict the color of incident light. These limitations include not considering ambient light, the effects of light reflected off the ground, and context-specific information. A previously developed color model has been tested only for a few geographical locations in North America, and its applicability to other places in the world is open to question. Besides, existing surface reflectance models are not easily applied to outdoor images. A reflectance model with combined diffuse and specular reflection in a normalized HSV color space could be used to predict color. In this paper, a new daylight color model is developed that gives the color of daylight for a broad range of sky conditions and suits the weather of tropical places such as Malaysia. A comparison of this daylight color model with the CIE daylight model is discussed. The colors of matte and specular surfaces are estimated using the developed color model and a surface reflection function. The results are shown to be highly reliable.

  12. Flow and transport model of the Savannah River Site Old Burial Grounds using Data Fusion modeling (DFM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-11-01

    The Data Fusion Modeling (DFM) approach has been used to develop a groundwater flow and transport model of the Old Burial Grounds (OBG) at the US Department of Energy's Savannah River Site (SRS). The resulting DFM model was compared to an existing model that was calibrated via the typical trial-and-error method. The OBG was chosen because a substantial amount of hydrogeologic information is available, a FACT (derivative of VAM3DCG) flow and transport model of the site exists, and the calibration and numerics were challenging with standard approaches. The DFM flow model developed here is similar to the flow model by Flach et al. This allows comparison of the two flow models and validates the utility of DFM. The contaminant of interest for this study is tritium, because it is a geochemically conservative tracer that has been monitored along the seepline near the F-Area effluent and Fourmile Branch for several years.

  13. Collaborative Research: Robust Climate Projections and Stochastic Stability of Dynamical Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilya Zaliapin

    This project focused on conceptual exploration of El Nino/Southern Oscillation (ENSO) variability and sensitivity using a delay differential equation model developed in the project. We have (i) established the existence and continuous dependence of solutions of the model, (ii) explored multiple model solutions and the distribution of solution extrema, and (iii) established and explored the phase-locking phenomenon and the existence of multiple solutions for the same values of model parameters. In addition, we have applied to our model the concept of a pullback attractor, which greatly facilitated predictive understanding of the nonlinear model's behavior.
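    A conceptual delayed-oscillator form often used in such ENSO studies, dh/dt = -tanh(kappa*h(t - tau)) + b*cos(2*pi*t), can be integrated with a simple history buffer as sketched below; the equation form and parameter values are generic illustrations and not necessarily the exact model of this project.

```python
# Hedged fixed-step Euler integration of a delayed-oscillator ENSO-type model.
import numpy as np

kappa, tau, b = 2.0, 0.5, 1.0
dt, t_end = 0.001, 20.0
n_steps = int(t_end / dt)
n_delay = int(tau / dt)

h = np.zeros(n_steps + 1)
h[0] = 0.1                                   # constant history h(t <= 0) = 0.1
for i in range(n_steps):
    delayed = h[i - n_delay] if i >= n_delay else h[0]
    t = i * dt
    h[i + 1] = h[i] + dt * (-np.tanh(kappa * delayed) + b * np.cos(2.0 * np.pi * t))

print("max |h| over the run:", round(float(np.max(np.abs(h))), 3))
```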

  14. Multitasking TORT under UNICOS: Parallel performance models and measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnett, A.; Azmy, Y.Y.

    1999-09-27

    The existing parallel algorithms in the TORT discrete ordinates code were updated to function in a UNICOS environment. A performance model for the parallel overhead was derived for the existing algorithms. The largest contributors to the parallel overhead were identified and a new algorithm was developed. A parallel overhead model was also derived for the new algorithm. The results of the comparison of parallel performance models were compared to applications of the code to two TORT standard test problems and a large production problem. The parallel performance models agree well with the measured parallel overhead.
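    A generic example of the kind of parallel overhead model referred to above is sketched below: predicted wall time as a serial term, a divided parallel term, and a per-processor overhead term, from which speedup follows. The coefficients are illustrative, not the TORT measurements.

```python
# Generic parallel performance model sketch (illustrative coefficients only).
def predicted_time(p, t_serial=10.0, t_parallel=90.0, overhead_per_proc=0.5):
    """Predicted wall time on p processors for a 100-unit single-processor job."""
    return t_serial + t_parallel / p + overhead_per_proc * p

for p in (1, 2, 4, 8, 16, 32):
    t = predicted_time(p)
    print(f"p={p:3d}  time={t:7.2f}  speedup={predicted_time(1) / t:5.2f}")
```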

  15. Multitasking TORT Under UNICOS: Parallel Performance Models and Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Azmy, Y.Y.; Barnett, D.A.

    1999-09-27

    The existing parallel algorithms in the TORT discrete ordinates code were updated to function in a UNICOS environment. A performance model for the parallel overhead was derived for the existing algorithms. The largest contributors to the parallel overhead were identified and a new algorithm was developed. A parallel overhead model was also derived for the new algorithm. The results of the comparison of parallel performance models were compared to applications of the code to two TORT standard test problems and a large production problem. The parallel performance models agree well with the measured parallel overhead.

  16. Turbulence Modeling Workshop

    NASA Technical Reports Server (NTRS)

    Rubinstein, R. (Editor); Rumsey, C. L. (Editor); Salas, M. D. (Editor); Thomas, J. L. (Editor); Bushnell, Dennis M. (Technical Monitor)

    2001-01-01

    Advances in turbulence modeling are needed in order to calculate high Reynolds number flows near the onset of separation and beyond. To this end, the participants in this workshop made the following recommendations. (1) A national/international database and standards for turbulence modeling assessment should be established. Existing experimental data sets should be reviewed and categorized. Advantage should be taken of other efforts already under-way, such as that of the European Research Community on Flow, Turbulence, and Combustion (ERCOFTAC) consortium. Carefully selected "unit" experiments will be needed, as well as advances in instrumentation, to fill the gaps in existing datasets. A high priority should be given to document existing turbulence model capabilities in a standard form, including numerical implementation issues such as grid quality and resolution. (2) NASA should support long-term research on Algebraic Stress Models and Reynolds Stress Models. The emphasis should be placed on improving the length-scale equation, since it is the least understood and is a key component of two-equation and higher models. Second priority should be given to the development of improved near-wall models. Direct Numerical Simulations (DNS) and Large Eddy Simulations (LES) would provide valuable guidance in developing and validating new Reynolds-averaged Navier-Stokes (RANS) models. Although not the focus of this workshop, DNS, LES, and hybrid methods currently represent viable approaches for analysis on a limited basis. Therefore, although computer limitations require the use of RANS methods for realistic configurations at high Reynolds number in the foreseeable future, a balanced effort in turbulence modeling development, validation, and implementation should include these approaches as well.

  17. Development of a Dynamic Visco-elastic Vehicle-Soil Interaction Model for Rut Depth, and Power Determinations

    DTIC Science & Technology

    2011-09-06

    Presentation Outline: A) Review of Soil Model governing equations; B) Development of pedo-transfer functions (terrain database to engineering properties); C ... lateral earth pressure). B) Development of pedo-transfer functions: engineering parameters needed by the soil model - compression index - rebound ... inches, RCI for fine-grained soils, CI for coarse-grained soils. Pedo-transfer function: need to transfer existing terrain database

  18. Using Uncertainty Quantification to Guide Development and Improvements of a Regional-Scale Model of the Coastal Lowlands Aquifer System Spanning Texas, Louisiana, Mississippi, Alabama and Florida

    NASA Astrophysics Data System (ADS)

    Foster, L. K.; Clark, B. R.; Duncan, L. L.; Tebo, D. T.; White, J.

    2017-12-01

    Several historical groundwater models exist within the Coastal Lowlands Aquifer System (CLAS), which spans the Gulf Coastal Plain in Texas, Louisiana, Mississippi, Alabama, and Florida. The largest of these models, called the Gulf Coast Regional Aquifer System Analysis (RASA) model, has been brought into a new framework using the Newton formulation for MODFLOW-2005 (MODFLOW-NWT) and serves as the starting point of a new investigation underway by the U.S. Geological Survey to improve understanding of the CLAS and provide predictions of future groundwater availability within an uncertainty quantification (UQ) framework. The use of an UQ framework will not only provide estimates of water-level observation worth, hydraulic parameter uncertainty, boundary-condition uncertainty, and uncertainty of future potential predictions, but it will also guide the model development process. Traditionally, model development proceeds from dataset construction to the process of deterministic history matching, followed by deterministic predictions using the model. This investigation will combine the use of UQ with existing historical models of the study area to assess in a quantitative framework the effect model package and property improvements have on the ability to represent past-system states, as well as the effect on the model's ability to make certain predictions of water levels, water budgets, and base-flow estimates. Estimates of hydraulic property information and boundary conditions from the existing models and literature, forming the prior, will be used to make initial estimates of model forecasts and their corresponding uncertainty, along with an uncalibrated groundwater model run within an unconstrained Monte Carlo analysis. First-Order Second-Moment (FOSM) analysis will also be used to investigate parameter and predictive uncertainty, and guide next steps in model development prior to rigorous history matching by using PEST++ parameter estimation code.
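    A minimal first-order second-moment (FOSM) sketch of the kind of analysis described above is given below: prior parameter covariance is propagated through linearized sensitivities to a prediction variance, and a Schur-complement update shows how much a candidate observation would reduce it. All sensitivities and covariances are placeholders, not values from the CLAS model.

```python
# Hedged FOSM sketch: linearized uncertainty propagation plus data-worth check.
import numpy as np

C_p = np.diag([1.0, 0.25, 4.0])          # prior parameter covariance (assumed)
j_pred = np.array([[0.8, -1.2, 0.3]])    # d(prediction)/d(parameters) (assumed)
J_obs = np.array([[1.0, 0.5, 0.0]])      # sensitivities of a candidate observation
C_obs = np.array([[0.1]])                # observation noise variance

prior_pred_var = (j_pred @ C_p @ j_pred.T)[0, 0]

# Schur-complement (FOSM) update of parameter covariance given the observation
S = J_obs @ C_p @ J_obs.T + C_obs
C_post = C_p - C_p @ J_obs.T @ np.linalg.inv(S) @ J_obs @ C_p
post_pred_var = (j_pred @ C_post @ j_pred.T)[0, 0]

print(f"prediction std dev: prior {prior_pred_var**0.5:.3f} -> posterior {post_pred_var**0.5:.3f}")
```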

  19. CLIMACS: a computer model of forest stand development for western Oregon and Washington.

    Treesearch

    Virginia H. Dale; Miles Hemstrom

    1984-01-01

    A simulation model for the development of timber stands in the Pacific Northwest is described. The model grows individual trees of 21 species in a 0.20-hectare (0.08-acre) forest gap. The model provides a means of assimilating existing information, indicates where knowledge is deficient, suggests where the forest system is most sensitive, and provides a first testing...

  20. Early experiences in evolving an enterprise-wide information model for laboratory and clinical observations.

    PubMed

    Chen, Elizabeth S; Zhou, Li; Kashyap, Vipul; Schaeffer, Molly; Dykes, Patricia C; Goldberg, Howard S

    2008-11-06

    As Electronic Healthcare Records become more prevalent, there is an increasing need to ensure unambiguous data capture, interpretation, and exchange within and across heterogeneous applications. To address this need, a common, uniform, and comprehensive approach for representing clinical information is essential. At Partners HealthCare System, we are investigating the development and implementation of enterprise-wide information models to specify the representation of clinical information to support semantic interoperability. This paper summarizes our early experiences in: (1) defining a process for information model development, (2) reviewing and comparing existing healthcare information models, (3) identifying requirements for representation of laboratory and clinical observations, and (4) exploring linkages to existing terminology and data standards. These initial findings provide insight to the various challenges ahead and guidance on next steps for adoption of information models at our organization.

  1. A Novel Evaluation Model for the Vehicle Navigation Device Market Using Hybrid MCDM Techniques

    NASA Astrophysics Data System (ADS)

    Lin, Chia-Li; Hsieh, Meng-Shu; Tzeng, Gwo-Hshiung

    The development strategy of the navigation device (ND) is also presented to initiate the product roadmap. Criteria for evaluation are constructed by reviewing the literature, interviewing experts, and brainstorming. The ISM (interpretive structural modeling) method was used to construct the relationships among the criteria. Existing NDs were sampled to benchmark the gap between consumers' aspired/desired utilities and the utilities of existing/developing NDs. The VIKOR method was applied to rank the sampled NDs. This paper proposes the key criteria driving the purchase of a new ND and compares the behavior of consumers with various characteristics. These conclusions can serve as a reference for ND producers for improving existing functions or planning further utilities in the next e-era ND generation.
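
    For readers unfamiliar with VIKOR, the short sketch below shows the core ranking computation (group utility S, individual regret R, and compromise index Q) on made-up device scores; the criteria, weights, and values are hypothetical and do not come from the paper.

    ```python
    import numpy as np

    def vikor_rank(X, weights, v=0.5):
        """Rank alternatives (rows) over benefit criteria (columns) with VIKOR.
        Lower Q means closer to the aspired level."""
        X = np.asarray(X, dtype=float)
        w = np.asarray(weights, dtype=float)
        f_best, f_worst = X.max(axis=0), X.min(axis=0)
        regret = w * (f_best - X) / (f_best - f_worst)   # weighted, normalised regret per criterion
        S, R = regret.sum(axis=1), regret.max(axis=1)    # group utility and individual regret
        Q = (v * (S - S.min()) / (S.max() - S.min())
             + (1 - v) * (R - R.min()) / (R.max() - R.min()))
        return np.argsort(Q), Q

    # Hypothetical scores of four navigation devices on three benefit criteria
    scores = [[7, 8, 6], [9, 6, 7], [6, 9, 8], [8, 7, 9]]
    order, Q = vikor_rank(scores, weights=[0.5, 0.3, 0.2])
    print("Ranking (best first):", order, "Q:", np.round(Q, 3))
    ```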

  2. Application of natural analog studies to exploration for ore deposits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gustafson, D.L.

    1995-09-01

    Natural analogs are viewed as similarities in nature and are routinely utilized by exploration geologists in their search for economic mineral deposits. Ore deposit modeling is undertaken by geologists to direct their exploration activities toward favorable geologic environments and, therefore, successful programs. Two types of modeling are presented: (i) empirical model development based on the study of known ore deposit characteristics, and (ii) concept model development based on theoretical considerations and field observations that suggest a new deposit type, not known to exist in nature, may exist and justifies an exploration program. Key elements that are important in empirical model development are described, and examples of successful applications of these natural analogs to exploration are presented. A classical example of successful concept model development, the discovery of the McLaughlin gold mine in California, is presented. The utilization of natural analogs is an important facet of mineral exploration. Natural analogs guide explorationists in their search for new discoveries, increase the probability of success, and may decrease overall exploration expenditure.

  3. Extending the diffuse layer model of surface acidity behavior: I. Model development

    EPA Science Inventory

    Considerable disenchantment exists within the environmental research community concerning our current ability to accurately model surface-complexation-mediated low-porewater-concentration ionic contaminant partitioning with natural surfaces. Several authors attribute this unaccep...

  4. Automotive Maintenance Data Base for Model Years 1976-1979. Part I

    DOT National Transportation Integrated Search

    1980-12-01

    An update of the existing data base was developed to include life cycle maintenance costs of representative vehicles for the model years 1976-1979. Repair costs as a function of time are also developed for a passenger car in each of the compact, subc...

  5. Comparison of Coupled Radiative Flow Solutions with Project Fire 2 Flight Data

    NASA Technical Reports Server (NTRS)

    Olynick, David R.; Henline, W. D.; Chambers, Lin Hartung; Candler, G. V.

    1995-01-01

    A nonequilibrium, axisymmetric, Navier-Stokes flow solver with coupled radiation has been developed for use in the design of thermal protection systems for vehicles where radiation effects are important. The present method has been compared with an existing flow and radiation solver and with the Project Fire 2 experimental data. Good agreement has been obtained over the entire Fire 2 trajectory with the experimentally determined values of the stagnation radiation intensity in the 0.2-6.2 eV range and with the total stagnation heating. The effects of a number of flow models are examined to determine which combination of physical models produces the best agreement with the experimental data. These models include radiation coupling, multitemperature thermal models, and finite rate chemistry. Finally, the computational efficiency of the present model is evaluated. The radiation properties model developed for this study is shown to offer significant computational savings compared to existing codes.

  6. Toward the Development and Validation of a Career Coach Competency Model

    ERIC Educational Resources Information Center

    Hatala, John-Paul; Hisey, Lee

    2011-01-01

    The career coaching profession is a dynamic field that has grown over the last decade. However, there exists a limitation to this field's development, as there is no universally accepted definition or empirically based competencies. There were three phases to the study. In the first phase, a conceptual model was developed that highlights four…

  7. Developing a Physiologically-Based Pharmacokinetic Model Knowledgebase in Support of Provisional Model Construction.

    PubMed

    Lu, Jingtao; Goldsmith, Michael-Rock; Grulke, Christopher M; Chang, Daniel T; Brooks, Raina D; Leonard, Jeremy A; Phillips, Martin B; Hypes, Ethan D; Fair, Matthew J; Tornero-Velez, Rogelio; Johnson, Jeffre; Dary, Curtis C; Tan, Yu-Mei

    2016-02-01

    Developing physiologically-based pharmacokinetic (PBPK) models for chemicals can be resource-intensive, as neither chemical-specific parameters nor in vivo pharmacokinetic data are easily available for model construction. Previously developed, well-parameterized, and thoroughly-vetted models can be a great resource for the construction of models pertaining to new chemicals. A PBPK knowledgebase was compiled and developed from existing PBPK-related articles and used to develop new models. From 2,039 PBPK-related articles published between 1977 and 2013, 307 unique chemicals were identified for use as the basis of our knowledgebase. Keywords related to species, gender, developmental stages, and organs were analyzed from the articles within the PBPK knowledgebase. A correlation matrix of the 307 chemicals in the PBPK knowledgebase was calculated based on pharmacokinetic-relevant molecular descriptors. Chemicals in the PBPK knowledgebase were ranked based on their correlation toward ethylbenzene and gefitinib. Next, multiple chemicals were selected to represent exact matches, close analogues, or non-analogues of the target case study chemicals. Parameters, equations, or experimental data relevant to existing models for these chemicals and their analogues were used to construct new models, and model predictions were compared to observed values. This compiled knowledgebase provides a chemical structure-based approach for identifying PBPK models relevant to other chemical entities. Using suitable correlation metrics, we demonstrated that models of chemical analogues in the PBPK knowledgebase can guide the construction of PBPK models for other chemicals.
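
    A minimal sketch of the analogue-ranking idea is shown below: chemicals in a knowledgebase are ranked by the correlation of their descriptor vectors with a target chemical. The chemical names, descriptors, and values are invented for illustration and are not the knowledgebase's actual data or code.

    ```python
    import numpy as np

    def rank_analogues(descriptors, target, names):
        """Rank knowledgebase chemicals by Pearson correlation of their
        pharmacokinetic-relevant descriptor vectors with a target chemical."""
        corrs = [np.corrcoef(vec, target)[0, 1] for vec in descriptors]
        order = np.argsort(corrs)[::-1]
        return [(names[i], round(float(corrs[i]), 3)) for i in order]

    # Hypothetical descriptor vectors (e.g., logP, molecular weight, pKa, ...)
    names = ["chem_A", "chem_B", "chem_C"]
    descriptors = np.array([[2.1, 106.0, 9.5],
                            [3.0, 446.0, 5.4],
                            [2.2, 110.0, 9.9]])
    target = np.array([2.3, 108.0, 9.7])   # stand-in for a case-study chemical
    print(rank_analogues(descriptors, target, names))
    ```

    The top-ranked analogues would then be the candidates whose existing model parameters, equations, or data are borrowed when constructing a provisional model for the target chemical.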

  8. Developing a Physiologically-Based Pharmacokinetic Model Knowledgebase in Support of Provisional Model Construction

    PubMed Central

    Grulke, Christopher M.; Chang, Daniel T.; Brooks, Raina D.; Leonard, Jeremy A.; Phillips, Martin B.; Hypes, Ethan D.; Fair, Matthew J.; Tornero-Velez, Rogelio; Johnson, Jeffre; Dary, Curtis C.; Tan, Yu-Mei

    2016-01-01

    Developing physiologically-based pharmacokinetic (PBPK) models for chemicals can be resource-intensive, as neither chemical-specific parameters nor in vivo pharmacokinetic data are easily available for model construction. Previously developed, well-parameterized, and thoroughly-vetted models can be a great resource for the construction of models pertaining to new chemicals. A PBPK knowledgebase was compiled and developed from existing PBPK-related articles and used to develop new models. From 2,039 PBPK-related articles published between 1977 and 2013, 307 unique chemicals were identified for use as the basis of our knowledgebase. Keywords related to species, gender, developmental stages, and organs were analyzed from the articles within the PBPK knowledgebase. A correlation matrix of the 307 chemicals in the PBPK knowledgebase was calculated based on pharmacokinetic-relevant molecular descriptors. Chemicals in the PBPK knowledgebase were ranked based on their correlation toward ethylbenzene and gefitinib. Next, multiple chemicals were selected to represent exact matches, close analogues, or non-analogues of the target case study chemicals. Parameters, equations, or experimental data relevant to existing models for these chemicals and their analogues were used to construct new models, and model predictions were compared to observed values. This compiled knowledgebase provides a chemical structure-based approach for identifying PBPK models relevant to other chemical entities. Using suitable correlation metrics, we demonstrated that models of chemical analogues in the PBPK knowledgebase can guide the construction of PBPK models for other chemicals. PMID:26871706

  9. Distributed Hydrologic Modeling Apps for Decision Support in the Cloud

    NASA Astrophysics Data System (ADS)

    Swain, N. R.; Latu, K.; Christiensen, S.; Jones, N.; Nelson, J.

    2013-12-01

    Advances in computation resources and greater availability of water resources data represent an untapped resource for addressing hydrologic uncertainties in water resources decision-making. The current practice of water authorities relies on empirical, lumped hydrologic models to estimate watershed response. These models are not capable of taking advantage of many of the spatial datasets that are now available. Physically-based, distributed hydrologic models are capable of using these data resources and providing better predictions through stochastic analysis. However, there exists a digital divide that discourages many science-minded decision makers from using distributed models. This divide can be spanned using a combination of existing web technologies. The purpose of this presentation is to present a cloud-based environment that will offer hydrologic modeling tools or 'apps' for decision support and the web technologies that have been selected to aid in its implementation. Compared to the more commonly used lumped-parameter models, distributed models, while being more intuitive, are still data intensive, computationally expensive, and difficult to modify for scenario exploration. However, web technologies such as web GIS, web services, and cloud computing have made the data more accessible, provided an inexpensive means of high-performance computing, and created an environment for developing user-friendly apps for distributed modeling. Since many water authorities are primarily interested in the scenario exploration exercises with hydrologic models, we are creating a toolkit that facilitates the development of a series of apps for manipulating existing distributed models. There are a number of hurdles that cloud-based hydrologic modeling developers face. One of these is how to work with the geospatial data inherent with this class of models in a web environment. Supporting geospatial data in a website is beyond the capabilities of standard web frameworks and it requires the use of additional software. In particular, there are at least three elements that are needed: a geospatially enabled database, a map server, and geoprocessing toolbox. We recommend a software stack for geospatial web application development comprising: MapServer, PostGIS, and 52 North with Python as the scripting language to tie them together. Another hurdle that must be cleared is managing the cloud-computing load. We are using HTCondor as a solution to this end. Finally, we are creating a scripting environment wherein developers will be able to create apps that use existing hydrologic models in our system with minimal effort. This capability will be accomplished by creating a plugin for a Python content management system called CKAN. We are currently developing cyberinfrastructure that utilizes this stack and greatly lowers the investment required to deploy cloud-based modeling apps. This material is based upon work supported by the National Science Foundation under Grant No. 1135482
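
    As a minimal stand-in for the "app" pattern described above (not the authors' CKAN/52 North/MapServer stack), the sketch below exposes a hypothetical scenario-run endpoint using only the Python standard library; the route, query parameters, and run_scenario placeholder are invented for illustration.

    ```python
    from http.server import BaseHTTPRequestHandler, HTTPServer
    import json
    import urllib.parse

    def run_scenario(model_id, rainfall_scale):
        """Placeholder for dispatching a distributed-model run (e.g., to a
        compute pool); here it simply echoes the request."""
        return {"model": model_id, "rainfall_scale": rainfall_scale, "status": "queued"}

    class ScenarioHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Hypothetical request form: /run?model=my_watershed&scale=1.2
            parsed = urllib.parse.urlparse(self.path)
            query = urllib.parse.parse_qs(parsed.query)
            result = run_scenario(query.get("model", ["demo"])[0],
                                  float(query.get("scale", ["1.0"])[0]))
            body = json.dumps(result).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), ScenarioHandler).serve_forever()
    ```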

  10. Single tree biomass modelling using airborne laser scanning

    NASA Astrophysics Data System (ADS)

    Kankare, Ville; Räty, Minna; Yu, Xiaowei; Holopainen, Markus; Vastaranta, Mikko; Kantola, Tuula; Hyyppä, Juha; Hyyppä, Hannu; Alho, Petteri; Viitala, Risto

    2013-11-01

    Accurate forest biomass mapping methods would provide the means for, e.g., assessing bioenergy potential, biofuel resources, and forest-bound carbon. The demand for practical biomass mapping methods at all forest levels is growing worldwide, and viable options are being developed. Airborne laser scanning (ALS) is a promising forest biomass mapping technique, due to its capability of measuring the three-dimensional forest vegetation structure. The objective of the study was to develop new methods for tree-level biomass estimation using metrics derived from ALS point clouds and to compare the results with field references collected using destructive sampling and with existing biomass models. The study area was located in Evo, southern Finland. ALS data were collected in 2009 with a pulse density of approximately 10 pulses/m2. Linear models were developed for the following tree biomass components: total, stem wood, living branch, and total canopy biomass. ALS-derived geometric and statistical point metrics were used as explanatory variables when creating the models. The total and stem biomass root mean square errors were 26.3% and 28.4% for Scots pine (Pinus sylvestris L.), and 36.8% and 27.6% for Norway spruce (Picea abies (L.) H. Karst.), respectively. The results showed that higher estimation accuracy for all biomass components can be achieved with models created in this study compared to existing allometric biomass models when ALS-derived height and diameter were used as input parameters. Best results were achieved when adding field-measured diameter and height as inputs to the existing biomass models. The only exceptions to this were the canopy and living branch biomass estimations for spruce. The achieved results are encouraging for the use of ALS-derived metrics in biomass mapping and for further development of the models.
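
    The sketch below illustrates the general form of such a tree-level model: a linear fit on log-transformed predictors with the relative RMSE computed against field-measured biomass. The numbers and the exact model form are invented for illustration and are not the study's data or fitted models.

    ```python
    import numpy as np

    # Hypothetical tree-level data: height (m), diameter proxy (cm), and
    # field-measured total above-ground biomass (kg) from destructive sampling
    h   = np.array([18.2, 22.5, 15.1, 27.8, 20.3, 24.9])
    d   = np.array([19.0, 25.4, 14.2, 32.1, 22.8, 28.6])
    agb = np.array([210., 405., 120., 690., 310., 520.])

    # Illustrative log-log linear model: ln(AGB) = b0 + b1*ln(d) + b2*ln(h)
    X = np.column_stack([np.ones_like(d), np.log(d), np.log(h)])
    coef, *_ = np.linalg.lstsq(X, np.log(agb), rcond=None)

    pred = np.exp(X @ coef)
    rmse_pct = 100 * np.sqrt(np.mean((agb - pred) ** 2)) / agb.mean()
    print("coefficients:", np.round(coef, 3), f"RMSE%: {rmse_pct:.1f}")
    ```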

  11. Demand modelling of passenger air travel: An analysis and extension. Volume 1: Background and summary

    NASA Technical Reports Server (NTRS)

    Jacobson, I. D.

    1978-01-01

    The framework for a model of travel demand which will be useful in predicting the total market for air travel between two cities is discussed. Variables to be used in determining the need for air transportation where none currently exists and the effect of changes in system characteristics on attracting latent demand are identified. Existing models are examined in order to provide insight into their strong points and shortcomings. Much of the existing behavioral research in travel demand is incorporated to allow the inclusion of non-economic factors, such as convenience. The model developed is characterized as a market segmentation model. This is a consequence of the strengths of disaggregation and its natural evolution to a usable aggregate formulation. The need for this approach both pedagogically and mathematically is discussed.

  12. A Research Roadmap for Computation-Based Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  13. AWARE: Adaptive Software Monitoring and Dynamic Reconfiguration for Critical Infrastructure Protection

    DTIC Science & Technology

    2015-04-29

    in which we applied these adaptation patterns to an adaptive news web server intended to tolerate extremely heavy, unexpected loads. To address...collection of existing models used as benchmarks for OO-based refactoring and an existing web-based repository called REMODD to provide users with model...invariant properties. Specifically, we developed Avida-MDE (based on the Avida digital evolution platform) to support the automatic generation of software

  14. Simulation of wheat growth and development based on organ-level photosynthesis and assimilate allocation.

    PubMed

    Evers, J B; Vos, J; Yin, X; Romero, P; van der Putten, P E L; Struik, P C

    2010-05-01

    Intimate relationships exist between form and function of plants, determining many processes governing their growth and development. However, in most crop simulation models that have been created to simulate plant growth and, for example, predict biomass production, plant structure has been neglected. In this study, a detailed simulation model of growth and development of spring wheat (Triticum aestivum) is presented, which integrates degree of tillering and canopy architecture with organ-level light interception, photosynthesis, and dry-matter partitioning. An existing spatially explicit 3D architectural model of wheat development was extended with routines for organ-level microclimate, photosynthesis, assimilate distribution within the plant structure according to organ demands, and organ growth and development. Outgrowth of tiller buds was made dependent on the ratio between assimilate supply and demand of the plants. Organ-level photosynthesis, biomass production, and bud outgrowth were simulated satisfactorily. However, to improve crop simulation results, more effort is needed to mechanistically model other major plant physiological processes such as nitrogen uptake and distribution, tiller death, and leaf senescence. Nevertheless, the work presented here is a significant step forward towards a mechanistic functional-structural plant model, which integrates plant architecture with key plant processes.
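
    A minimal sketch of demand-proportional assimilate allocation is given below; the organ demands, units, and the simple proportional rule are illustrative assumptions, not the authors' actual partitioning routines.

    ```python
    def allocate_assimilates(supply, organ_demand):
        """Distribute the current assimilate pool over organs in proportion to
        their demands; the supply/demand ratio can then gate tiller-bud outgrowth."""
        total_demand = sum(organ_demand.values())
        ratio = supply / total_demand
        allocation = {organ: demand * min(ratio, 1.0)
                      for organ, demand in organ_demand.items()}
        return allocation, ratio

    # Hypothetical daily demands for one wheat plant (arbitrary carbohydrate units)
    alloc, sd_ratio = allocate_assimilates(
        supply=1.8, organ_demand={"leaves": 0.9, "stem": 0.6, "roots": 0.5, "ear": 0.4})
    print(alloc, f"supply/demand ratio = {sd_ratio:.2f}")
    ```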

  15. A survey of Existing V&V, UQ and M&S Data and Knowledge Bases in Support of the Nuclear Energy - Knowledge base for Advanced Modeling and Simulation (NE-KAMS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hyung Lee; Rich Johnson, Ph.D.; Kimberlyn C. Moussesau

    2011-12-01

    The Nuclear Energy - Knowledge base for Advanced Modeling and Simulation (NE-KAMS) is being developed at the Idaho National Laboratory in conjunction with Bettis Laboratory, Sandia National Laboratories, Argonne National Laboratory, Oak Ridge National Laboratory, Utah State University and others. The objective of this consortium is to establish a comprehensive knowledge base to provide Verification and Validation (V&V) and Uncertainty Quantification (UQ) and other resources for advanced modeling and simulation (M&S) in nuclear reactor design and analysis. NE-KAMS will become a valuable resource for the nuclear industry, the national laboratories, the U.S. NRC and the public to help ensure the safe operation of existing and future nuclear reactors. A survey and evaluation of the state-of-the-art of existing V&V and M&S databases, including the Department of Energy and commercial databases, has been performed to ensure that the NE-KAMS effort will not be duplicating existing resources and capabilities and to assess the scope of the effort required to develop and implement NE-KAMS. The survey and evaluation have indeed highlighted the unique set of value-added functionality and services that NE-KAMS will provide to its users. Additionally, the survey has helped develop a better understanding of the architecture and functionality of these data and knowledge bases that can be used to leverage the development of NE-KAMS.

  16. Development of an Experimental Model of Diabetes Co-Existing with Metabolic Syndrome in Rats.

    PubMed

    Suman, Rajesh Kumar; Ray Mohanty, Ipseeta; Borde, Manjusha K; Maheshwari, Ujwala; Deshmukh, Y A

    2016-01-01

    Background. The incidence of metabolic syndrome co-existing with diabetes mellitus is on the rise globally. Objective. The present study was designed to develop a unique animal model that will mimic the pathological features seen in individuals with diabetes and metabolic syndrome, suitable for pharmacological screening of drugs. Materials and Methods. A combination of High-Fat Diet (HFD) and low dose of streptozotocin (STZ) at 30, 35, and 40 mg/kg was used to induce metabolic syndrome in the setting of diabetes mellitus in Wistar rats. Results. The 40 mg/kg STZ produced sustained hyperglycemia and the dose was thus selected for the study to induce diabetes mellitus. Various components of metabolic syndrome such as dyslipidemia (increased triglycerides, total cholesterol, and LDL cholesterol, and decreased HDL cholesterol), diabetes mellitus (blood glucose, HbA1c, serum insulin, and C-peptide), and hypertension (systolic blood pressure) were mimicked in the developed model of metabolic syndrome co-existing with diabetes mellitus. In addition to significant cardiac injury, an increased atherogenic index, inflammation (hs-CRP), and a decline in hepatic and renal function were observed in the HF-DC group when compared to NC group rats. The histopathological assessment confirmed the presence of edema, necrosis, and inflammation in the heart, pancreas, liver, and kidney of the HF-DC group as compared to NC. Conclusion. The present study has developed a unique rodent model of metabolic syndrome, with diabetes as an essential component.

  17. Hydrous ferric oxide: evaluation of Cd-HFO surface complexation models combining Cd(K) EXAFS data, potentiometric titration results, and surface site structures identified from mineralogical knowledge.

    PubMed

    Spadini, Lorenzo; Schindler, Paul W; Charlet, Laurent; Manceau, Alain; Vala Ragnarsdottir, K

    2003-10-01

    The surface properties of ferrihydrite were studied by combining wet chemical data, Cd(K) EXAFS data, and a surface structure and protonation model of the ferrihydrite surface. Acid-base titration experiments and Cd(II)-ferrihydrite sorption experiments were performed within 3 < -log[H(+)] < 10.5 and 0.5 < [Cd(t)] < 12 mM in 0.3 M NaClO(4) at 25 degrees C, where [Cd(t)] refers to total Cd concentration. The acid-base data were modeled with the protonation of ≡Fe-OH(-1/2) surface groups, log k(int) = -8.29, assuming the existence of a unique intrinsic microscopic constant, log k(int), and consequently the existence of a single significant type of acid-base reactive functional group. The surface structure model indicates that these groups are terminal water groups. The Cd(II) data were modeled assuming the existence of a single reactive site. The model fits the data set at low Cd(II) concentration and up to 50% surface coverage. At high coverage more Cd(II) ions than predicted are adsorbed, which is indicative of the existence of a second type of site of lower affinity. This agrees with the surface structure and protonation model developed, which indicates comparable concentrations of high- and low-affinity sites. The model further shows that for each class of low- and high-affinity sites there exists a variety of corresponding Cd surface complex structures, depending on the model crystal faces on which the complexes develop. Generally, high-affinity surface structures have surface coordinations of 3 and 4, as compared to 1 and 2 for low-affinity surface structures.

  18. Development of a sheep challenge model for Rift Valley fever

    USDA-ARS?s Scientific Manuscript database

    Rift Valley fever (RVF) is a zoonotic disease that causes severe epizootic disease in ruminants, characterized by mass abortion and high mortality rates in younger animals. The development of a reliable challenge model is an important prerequisite for evaluation of existing and novel vaccines. A stu...

  19. A new framework for modeling decentralized low impact developments using Soil and Water Assessment Tool

    USDA-ARS?s Scientific Manuscript database

    Assessing the performance of Low Impact Development (LID) practices at a catchment scale is important in managing urban watersheds. Few modeling tools exist that are capable of explicitly representing the hydrological mechanisms of LIDs while considering the diverse land uses of urban watersheds. ...

  20. The Information Technology Model Curriculum

    ERIC Educational Resources Information Center

    Ekstrom, Joseph J.; Gorka, Sandra; Kamali, Reza; Lawson, Eydie; Lunt, Barry; Miller, Jacob; Reichgelt, Han

    2006-01-01

    The last twenty years has seen the development of demand for a new type of computing professional, which has resulted in the emergence of the academic discipline of Information Technology (IT). Numerous colleges and universities across the country and abroad have responded by developing programs without the advantage of an existing model for…

  1. Automotive Maintenance Data Base for Model Years 1976-1979. Part II : Appendix E and F

    DOT National Transportation Integrated Search

    1980-12-01

    An update of the existing data base was developed to include life cycle maintenance costs of representative vehicles for the model years 1976-1979. Repair costs as a function of time are also developed for a passenger car in each of the compact, subc...

  2. Teacher Learning in the Digital Age: Online Professional Development in STEM Education

    ERIC Educational Resources Information Center

    Dede, Chris, Ed.; Eisenkraft, Arthur, Ed.; Frumin, Kim, Ed.; Hartley, Alex, Ed.

    2016-01-01

    With an emphasis on science, technology, engineering, and mathematics (STEM) training, "Teacher Learning in the Digital Age" examines exemplary models of online and blended teacher professional development, including information on the structure and design of each model, intended audience, and existing research and evaluation data. From…

  3. E-Learning Quality Assurance: A Process-Oriented Lifecycle Model

    ERIC Educational Resources Information Center

    Abdous, M'hammed

    2009-01-01

    Purpose: The purpose of this paper is to propose a process-oriented lifecycle model for ensuring quality in e-learning development and delivery. As a dynamic and iterative process, quality assurance (QA) is intertwined with the e-learning development process. Design/methodology/approach: After reviewing the existing literature, particularly…

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burge, S.W.

    Erosion has been identified as one of the significant design issues in fluid beds. A cooperative R&D venture of industry, research, and government organizations was recently formed to meet the industry need for a better understanding of erosion in fluid beds. Research focussed on bed hydrodynamics, which are considered to be the primary erosion mechanism. As part of this work, ANL developed an analytical model (FLUFIX) for bed hydrodynamics. Partial validation was performed using data from experiments sponsored by the research consortium. Development of a three-dimensional fluid bed hydrodynamic model was part of Asea-Babcock's in-kind contribution to the R&D venture. This model, FORCE2, was developed by Babcock & Wilcox's Research and Development Division and was based on an existing B&W program and on the gas-solids modeling technology developed by ANL and others. FORCE2 contains many of the features needed to model plant-size beds and, therefore, can be used along with the erosion technology to assess metal wastage in industrial equipment. As part of the development efforts, FORCE2 was partially validated using ANL's two-dimensional model, FLUFIX, and experimental data. Time constraints as well as the lack of good hydrodynamic data, particularly at the plant scale, prohibited a complete validation of FORCE2. This report describes this initial validation of FORCE2.

  5. Publishing and sharing of hydrologic models through WaterHUB

    NASA Astrophysics Data System (ADS)

    Merwade, V.; Ruddell, B. L.; Song, C.; Zhao, L.; Kim, J.; Assi, A.

    2011-12-01

    Most hydrologists use hydrologic models to simulate the hydrologic processes to understand hydrologic pathways and fluxes for research, decision making, and engineering design. Once these tasks are complete, including publication of results, the models generally are not published or made available to the public for further use and improvement. Although publication or sharing of models is not required for journal publications, sharing of models may open doors for new collaborations and avoids duplication of effort if other researchers are interested in simulating a particular watershed for which a model already exists. For researchers who are interested in sharing models, there are limited avenues for publishing their models to the wider community. Towards filling this gap, a prototype cyberinfrastructure (CI), called WaterHUB, is developed for sharing hydrologic data and modeling tools in an interactive environment. To test the utility of WaterHUB for sharing hydrologic models, a system to publish and share SWAT (Soil Water Assessment Tool) models is developed. Users can utilize WaterHUB to search and download existing SWAT models, and also upload new SWAT models. Metadata such as the name of the watershed, the name of the person or agency who developed the model, the simulation period, the time step, and the list of calibrated parameters are also published with each model.
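
    The kind of metadata record described above might look like the following sketch; the watershed, developer, and parameter list shown are hypothetical examples, not actual WaterHUB content or its publishing interface.

    ```python
    import json

    # Hypothetical metadata record accompanying a shared SWAT model upload
    swat_record = {
        "watershed": "Cedar Creek",
        "developer": "Example University Hydrology Lab",
        "simulation_period": {"start": "1995-01-01", "end": "2010-12-31"},
        "time_step": "daily",
        "calibrated_parameters": ["CN2", "ESCO", "SOL_AWC", "GW_DELAY"],
    }

    print(json.dumps(swat_record, indent=2))
    ```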

  6. Existing and Required Modeling Capabilities for Evaluating ATM Systems and Concepts

    NASA Technical Reports Server (NTRS)

    Odoni, Amedeo R.; Bowman, Jeremy; Delahaye, Daniel; Deyst, John J.; Feron, Eric; Hansman, R. John; Khan, Kashif; Kuchar, James K.; Pujet, Nicolas; Simpson, Robert W.

    1997-01-01

    ATM systems throughout the world are entering a period of major transition and change. The combination of important technological developments and of the globalization of the air transportation industry has necessitated a reexamination of some of the fundamental premises of existing Air Traffic Management (ATM) concepts. New ATM concepts have to be examined, concepts that may place more emphasis on: strategic traffic management; planning and control; partial decentralization of decision-making; and added reliance on the aircraft to carry out strategic ATM plans, with ground controllers confined primarily to a monitoring and supervisory role. 'Free Flight' is a case in point. In order to study, evaluate and validate such new concepts, the ATM community will have to rely heavily on models and computer-based tools/utilities, covering a wide range of issues and metrics related to safety, capacity and efficiency. The state of the art in such modeling support is adequate in some respects, but clearly deficient in others. It is the objective of this study to assist in: (1) assessing the strengths and weaknesses of existing fast-time models and tools for the study of ATM systems and concepts and (2) identifying and prioritizing the requirements for the development of additional modeling capabilities in the near future. A three-stage process has been followed to this purpose: 1. Through the analysis of two case studies involving future ATM system scenarios, as well as through expert assessment, modeling capabilities and supporting tools needed for testing and validating future ATM systems and concepts were identified and described. 2. Existing fast-time ATM models and support tools were reviewed and assessed with regard to the degree to which they offer the capabilities identified under Step 1. 3 . The findings of 1 and 2 were combined to draw conclusions about (1) the best capabilities currently existing, (2) the types of concept testing and validation that can be carried out reliably with such existing capabilities and (3) the currently unavailable modeling capabilities that should receive high priority for near-term research and development. It should be emphasized that the study is concerned only with the class of 'fast time' analytical and simulation models. 'Real time' models, that typically involve humans-in-the-loop, comprise another extensive class which is not addressed in this report. However, the relationship between some of the fast-time models reviewed and a few well-known real-time models is identified in several parts of this report and the potential benefits from the combined use of these two classes of models-a very important subject-are discussed in chapters 4 and 7.

  7. Retargeting of existing FORTRAN program and development of parallel compilers

    NASA Technical Reports Server (NTRS)

    Agrawal, Dharma P.

    1988-01-01

    The software models used in implementing the parallelizing compiler for the B-HIVE multiprocessor system are described. The various models and strategies used in the compiler development are: a flexible granularity model, which allows a compromise between two extreme granularity models; a communication model, which is capable of precisely describing the interprocessor communication timings and patterns; a loop type detection strategy, which identifies different types of loops; a critical path with coloring scheme, which is a versatile scheduling strategy for any multicomputer with some associated communication costs; and a loop allocation strategy, which realizes optimum overlapped operations between computation and communication of the system. Using these models, several sample routines of the AIR3D package are examined and tested. The automatically generated codes are highly parallelized to provide the maximum degree of parallelism, obtaining speedups on systems of up to 28 to 32 processors. A comparison of parallel codes for both the existing and proposed communication models is performed, and the corresponding expected speedup factors are obtained. The experimentation shows that the B-HIVE compiler produces more efficient codes than existing techniques. Work is progressing well in completing the final phase of the compiler. Numerous enhancements are needed to improve the capabilities of the parallelizing compiler.

  8. Using Student Perceptions of the Learning Environment to Evaluate the Effectiveness of a Teacher Professional Development Programme

    ERIC Educational Resources Information Center

    Soebari, Titien S.; Aldridge, Jill M.

    2015-01-01

    The focus of this article is two-fold. First, it describes a model that can be used to guide the evaluation of teacher professional development. The model combines important components of existing models and incorporates the use of students' perceptions for examining teacher change. Second, the article reports the evaluation of a teacher…

  9. Modeling reinforced concrete durability.

    DOT National Transportation Integrated Search

    2014-06-01

    This project developed a next-generation modeling approach for projecting the extent of : reinforced concrete corrosion-related damage, customized for new and existing Florida Department of : Transportation bridges and suitable for adapting to broade...

  10. Developing a business-practice model for pharmacy services in ambulatory settings.

    PubMed

    Harris, Ila M; Baker, Ed; Berry, Tricia M; Halloran, Mary Ann; Lindauer, Kathleen; Ragucci, Kelly R; McGivney, Melissa Somma; Taylor, A Thomas; Haines, Stuart T

    2008-02-01

    A business-practice model is a guide, or toolkit, to assist managers and clinical pharmacy practitioners in the exploration, proposal, development and implementation of new clinical pharmacy services and/or the enhancement of existing services. This document was developed by the American College of Clinical Pharmacy Task Force on Ambulatory Practice to assist clinical pharmacy practitioners and administrators in the development of business-practice models for new and existing clinical pharmacy services in ambulatory settings. This document provides detailed instructions, examples, and resources on conducting a market assessment and a needs assessment, types of clinical services, operations, legal and regulatory issues, marketing and promotion, service development and exit plan, evaluation of service outcomes, and financial considerations in the development of a clinical pharmacy service in the ambulatory environment. Available literature is summarized, and an appendix provides valuable citations and resources. As ambulatory care practices continue to evolve, there will be increased knowledge of how to initiate and expand the services. This document is intended to serve as an essential resource to assist in the growth and development of clinical pharmacy services in the ambulatory environment.

  11. FARSITE: Fire Area Simulator-model development and evaluation

    Treesearch

    Mark A. Finney

    1998-01-01

    A computer simulation model, FARSITE, includes existing fire behavior models for surface, crown, spotting, point-source fire acceleration, and fuel moisture. The model's components and assumptions are documented. Simulations were run for simple conditions that illustrate the effect of individual fire behavior models on two-dimensional fire growth.

  12. Habitat Suitability Index Models: Muskellunge

    USGS Publications Warehouse

    Cook, Mark F.; Solomon, R. Charles

    1987-01-01

    A review and synthesis of existing information were used to develop a Habitat Suitability Index (HSI) model for the muskellunge (Esox masquinongy Mitchell). The model consolidates habitat use information into a framework appropriate for field application, and is scaled to produce an index between 0.0 (unsuitable habitat) and 1.0 (optimum habitat). HSI models are designed to be used with Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service.
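
    HSI models of this kind typically combine several component suitability indices into a single 0.0-1.0 score. The sketch below uses a geometric-mean aggregation, a common HSI convention but an assumption here, since the muskellunge model's exact formulation is not given in this summary.

    ```python
    import math

    def habitat_suitability_index(suitability_indices):
        """Combine component suitability indices (each 0.0-1.0) into an overall HSI
        with a geometric mean, so any unsuitable component drives the HSI toward 0."""
        n = len(suitability_indices)
        return math.prod(suitability_indices) ** (1.0 / n)

    # Hypothetical component indices, e.g. water temperature, vegetation cover, depth
    print(round(habitat_suitability_index([0.8, 0.6, 0.9]), 2))   # ~0.76
    ```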

  13. Blind Biobanking of the Prostatectomy Specimen: Critical Evaluation of the Existing Techniques and Development of the New 4-Level Tissue Extraction Model With High Sampling Efficacy.

    PubMed

    Tolkach, Yuri; Eminaga, Okyaz; Wötzel, Fabian; Huss, Sebastian; Bettendorf, Olaf; Eltze, Elke; Abbas, Mahmoud; Imkamp, Florian; Semjonow, Axel

    2017-03-01

    Fresh tissue is mandatory to perform high-quality translation studies. Several models for tissue extraction from prostatectomy specimens without guidance by frozen sections have already been introduced. However, little is known about the sampling efficacy of these models, which should provide representative tissue in adequate volumes, account for multifocality and heterogeneity of tumor, not violate the routine final pathological examination, and perform quickly without frozen section-based histological control. The aim of the study was to evaluate the sampling efficacy of the existing tissue extraction models without guidance by frozen sections ("blind") and to develop an optimized model for tissue extraction. Five hundred thirty-three electronic maps of the tumor distribution in prostates from a single-center cohort of patients subjected to radical prostatectomy were used for analysis. Six available models were evaluated in silico for their sampling efficacy. Additionally, a novel model achieving the best sampling efficacy was developed. The available models showed high efficacies for sampling "any part" of the tumor (up to 100%), but were uniformly low in efficacy in sampling all tumor foci from the specimens (with the best technique sampling only 51.6% of all tumor foci). The novel 4-level extraction model achieved a sampling efficacy of 93.1% for all tumor foci. The existing "blind" tissue extraction models from prostatectomy specimens without frozen section control are suitable to target tumor tissues, but these tissues do not represent the whole tumor. The novel 4-level model provides the highest sampling efficacy and a promising potential for integration into routine. Prostate 77: 396-405, 2017. © 2016 Wiley Periodicals, Inc.

  14. Empirical models for the prediction of ground motion duration for intraplate earthquakes

    NASA Astrophysics Data System (ADS)

    Anbazhagan, P.; Neaz Sheikh, M.; Bajaj, Ketan; Mariya Dayana, P. J.; Madhura, H.; Reddy, G. R.

    2017-07-01

    Many empirical relationships for earthquake ground motion duration were developed for interplate regions, whereas only a very limited number of empirical relationships exist for intraplate regions. In addition, the existing relationships were developed based mostly on scaled records of interplate earthquakes used to represent intraplate earthquakes. To the authors' knowledge, none of the existing relationships for intraplate regions were developed using only data from intraplate regions. Therefore, an attempt is made in this study to develop empirical predictive relationships of earthquake ground motion duration (i.e., significant and bracketed) with earthquake magnitude, hypocentral distance, and site conditions (i.e., rock and soil sites) using data compiled from intraplate regions of Canada, Australia, Peninsular India, and the central and southern parts of the USA. The compiled earthquake ground motion data consist of 600 records with moment magnitudes ranging from 3.0 to 6.5 and hypocentral distances ranging from 4 to 1000 km. Non-linear mixed-effects (NLME) and logistic regression techniques (to account for zero duration) were used to fit predictive models to the duration data. The bracketed duration was found to decrease with increasing hypocentral distance and to increase with increasing earthquake magnitude. The significant duration was found to increase with increasing magnitude and hypocentral distance. Both significant and bracketed durations were predicted to be higher at rock sites than at soil sites. The predictive relationships developed herein are compared with the existing relationships for interplate and intraplate regions. The developed relationship for bracketed duration predicts lower durations for rock and soil sites. However, the developed relationship for significant duration predicts lower durations up to a certain distance and thereafter predicts higher durations compared to the existing relationships.
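
    A simplified sketch of fitting such a predictive relationship is shown below, using an ordinary log-linear least-squares fit rather than the study's non-linear mixed-effects and logistic regression treatment; the records, coefficients, and functional form are illustrative only.

    ```python
    import numpy as np

    # Hypothetical intraplate records: magnitude, hypocentral distance (km),
    # site flag (0 = rock, 1 = soil), and significant duration (s)
    M = np.array([4.1, 5.0, 5.6, 4.5, 6.2, 5.2])
    R = np.array([30., 80., 150., 60., 300., 120.])
    S = np.array([0, 1, 0, 1, 1, 0])
    D = np.array([4.2, 9.5, 14.0, 7.1, 32.0, 11.5])

    # Illustrative functional form: ln(D) = c0 + c1*M + c2*ln(R) + c3*S
    X = np.column_stack([np.ones_like(M), M, np.log(R), S])
    coef, *_ = np.linalg.lstsq(X, np.log(D), rcond=None)
    print("coefficients:", np.round(coef, 3))

    # Predicted significant duration for a M5.5 event at 100 km on rock
    print("predicted D (s):", float(np.exp(coef @ [1.0, 5.5, np.log(100.0), 0.0])))
    ```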

  15. An investigation of modelling and design for software service applications.

    PubMed

    Anjum, Maria; Budgen, David

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.

  16. Municipal water-based heat pump heating and/or cooling systems: Findings and recommendations. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bloomquist, R.G.; Wegman, S.

    1998-04-01

    The purpose of the present work was to determine if existing heat pump systems based on municipal water systems meet existing water quality standards, to analyze water that has passed through a heat pump or heat exchanger to determine if corrosion products can be detected, to determine residual chlorine levels in municipal waters on the inlet as well as the outlet side of such installations, to analyze for bacterial contaminants and/or regrowth due to the presence of a heat pump or heat exchanger, to develop and suggest criteria for system design and construction, to provide recommendations and specifications for material and fluid selection, and to develop model rules and regulations for the installation, operation, and monitoring of new and existing systems. In addition, Washington State University (WSU) evaluated the availability of computer models that would allow for water system mapping, water quality modeling, and system operation.

  17. Identification of green skills acquisition in Indonesian TVET curricula

    NASA Astrophysics Data System (ADS)

    Setiawan, Agus

    2017-09-01

    Recently, many countries have focused on green growth, which specifically aims at achieving a resilient, low-carbon, and resource-efficient economic model that leads to a higher quality of life. Environmental pollution and climate change are negatively affecting the sustainability of various economic activities across the world, including in Indonesia. To mitigate the environmental problems, the existing economy should be shifted to a greener economic model, which will create green jobs and green the existing occupations in industry. Green jobs require workers with green skills. Therefore, development of green skills in TVET institutions is urgently needed. Judged against the existing green skills framework, green skills acquisition has not been clearly integrated into the existing Indonesian TVET curriculum. However, an approach to integrating green skills into the TVET curriculum can be carried out through the development of hard skills and soft skills in the domain of knowledge, abilities, and attitudes, where green skills encompass both hard skills and soft skills.

  18. Estimating Water Storage Capacity of Existing and Potentially Restorable Wetland Depressions in a Subbasin of the Red River of the North

    USGS Publications Warehouse

    Gleason, Robert A.; Tangen, Brian A.; Laubhan, Murray K.; Kermes, Kevin E.; Euliss, Ned H.

    2007-01-01

    Executive Summary Concern over flooding along rivers in the Prairie Pothole Region has stimulated interest in developing spatially distributed hydrologic models to simulate the effects of wetland water storage on peak river flows. Such models require spatial data on the storage volume and interception area of existing and restorable wetlands in the watershed of interest. In most cases, information on these model inputs is lacking because resolution of existing topographic maps is inadequate to estimate volume and areas of existing and restorable wetlands. Consequently, most studies have relied on wetland area to volume or interception area relationships to estimate wetland basin storage characteristics by using available surface area data obtained as a product from remotely sensed data (e.g., National Wetlands Inventory). Though application of areal input data to estimate volume and interception areas is widely used, a drawback is that there is little information available to provide guidance regarding the application, limitations, and biases associated with such approaches. Another limitation of previous modeling efforts is that water stored by wetlands within a watershed is treated as a simple lump storage component that is filled prior to routing overflow to a pour point or gaging station. This approach does not account for dynamic wetland processes that influence water stored in prairie wetlands. Further, most models have not considered the influence of human-induced hydrologic changes, such as land use, that greatly influence quantity of surface water inputs and, ultimately, the rate that a wetland basin fills and spills. The goals of this study were to (1) develop and improve methodologies for estimating and spatially depicting wetland storage volumes and interceptions areas and (2) develop models and approaches for estimating/simulating the water storage capacity of potentially restorable and existing wetlands under various restoration, land use, and climatic scenarios. To address these goals, we developed models and approaches to spatially represent storage volumes and interception areas of existing and potentially restorable wetlands in the upper Mustinka subbasin within Grant County, Minn. We then developed and applied a model to simulate wetland water storage increases that would result from restoring 25 and 50 percent of the farmed and drained wetlands in the upper Mustinka subbasin. The model simulations were performed during the growing season (May-October) for relatively wet (1993; 0.79 m of precipitation) and dry (1987; 0.40 m of precipitation) years. Results from the simulations indicated that the 25 percent restoration scenario would increase water storage by 21-24 percent and that a 50 percent scenario would increase storage by 34-38 percent. Additionally, we estimated that wetlands in the subbasin have potential to store 11.57-20.98 percent of the total precipitation that fell over the entire subbasin area (52,758 ha). Our simulation results indicated that there is considerable potential to enhance water storage in the subbasin; however, evaluation and calibration of the model is necessary before simulation results can be applied to management and planning decisions. 
In this report we present guidance for the development and application of models (e.g., surface area-volume predictive models, hydrology simulation model) to simulate wetland water storage to provide a basis from which to understand and predict the effects of natural or human-induced hydrologic alterations. In developing these approaches, we tried to use simple and widely available input data to simulate wetland hydrology and predict wetland water storage for a specific precipitation event or a series of events. Further, the hydrology simulation model accounted for land use and soil type, which influence surface water inputs to wetlands. Although information presented in this report is specific to the Mustinka subbasin, the approaches

  19. Time delays, population, and economic development

    NASA Astrophysics Data System (ADS)

    Gori, Luca; Guerrini, Luca; Sodini, Mauro

    2018-05-01

    This research develops an augmented Solow model with population dynamics and time delays. The model produces either a single stationary state or multiple stationary states (able to characterise different development regimes). The existence of time delays may cause persistent fluctuations in both economic and demographic variables. In addition, the work identifies in a simple way the reasons why economics affects demographics and vice versa.
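
    For concreteness, one common way to introduce a delay into the Solow capital-accumulation equation is sketched below; this generic form is an illustrative assumption and not necessarily the authors' exact specification.

    ```latex
    % A generic delayed Solow capital-accumulation equation (illustrative only):
    \[
      \dot{k}(t) \;=\; s\,f\bigl(k(t-\tau)\bigr) \;-\; \bigl(n(t)+\delta\bigr)\,k(t),
      \qquad f(k) = k^{\alpha},
    \]
    % k: capital per worker, s: saving rate, \delta: depreciation rate,
    % n(t): population growth rate, \tau: the time delay.
    ```

    In equations of this type, a sufficiently large delay \tau can destabilize an otherwise stable stationary state and produce the persistent fluctuations referred to above.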

  20. Establishing an Educational Game Development Model: From the Experience of Teaching Search Engine Optimization

    ERIC Educational Resources Information Center

    Lui, Richard W. C.; Au, Cheuk Hang

    2018-01-01

    This article describes how different literatures have suggested the positive role of educational games in students' learning, but it can be hard to find an existing game for student learning. Some lecturers may try to develop a game for their courses, but there were not many effective models for educational board game development. The authors have…

  1. Study plan for the regional aquifer-system analysis of alluvial basins in south-central Arizona and adjacent states

    USGS Publications Warehouse

    Anderson, T.W.

    1980-01-01

    The U.S. Geological Survey has started a 4-year study of the alluvial basins in south-central Arizona and parts of California , Nevada, and New Mexico to describe the hydrologic setting, available groundwater resources, and effects of historical development on the groundwater system. To aid in the study, mathematical models of selected basins will be developed for appraising local and regional flow systems. Major components necessary to accomplish the study objectives include the accumulation of existing data on groundwater quantity and quality, entering the data into a computer file, identification of data deficiencies, and development of a program to remedy the deficiencies by collection of additional data. The approach to the study will be to develop and calibrate models of selected basins for which sufficient data exist and to develop interpretation-transfer techniques whereby general predevelopment and postdevelopment conceptual models of the hydrologic system in other basins may be synthesized. The end result of the project will be a better definition of the hydrologic parameters and a better understanding of the workings of the hydrologic system. The models can be used to study the effects of management alternatives and water-resources development on the system. (USGS)

  2. Developing a theoretical framework for complex community-based interventions.

    PubMed

    Angeles, Ricardo N; Dolovich, Lisa; Kaczorowski, Janusz; Thabane, Lehana

    2014-01-01

    Applying existing theories to research, in the form of a theoretical framework, is necessary to advance knowledge from what is already known toward the next steps to be taken. This article proposes a guide on how to develop a theoretical framework for complex community-based interventions using the Cardiovascular Health Awareness Program as an example. Developing a theoretical framework starts with identifying the intervention's essential elements. Subsequent steps include the following: (a) identifying and defining the different variables (independent, dependent, mediating/intervening, moderating, and control); (b) postulating mechanisms how the independent variables will lead to the dependent variables; (c) identifying existing theoretical models supporting the theoretical framework under development; (d) scripting the theoretical framework into a figure or sets of statements as a series of hypotheses, if/then logic statements, or a visual model; (e) content and face validation of the theoretical framework; and (f) revising the theoretical framework. In our example, we combined the "diffusion of innovation theory" and the "health belief model" to develop our framework. Using the Cardiovascular Health Awareness Program as the model, we demonstrated a stepwise process of developing a theoretical framework. The challenges encountered are described, and an overview of the strategies employed to overcome these challenges is presented.

  3. IDSE Version 1 User's Manual

    NASA Technical Reports Server (NTRS)

    Mayer, Richard

    1988-01-01

    The integrated development support environment (IDSE) is a suite of integrated software tools that provide intelligent support for information modelling. These tools assist in function, information, and process modeling. Additional tools exist to assist in gathering and analyzing information to be modeled. This is a user's guide to application of the IDSE. Sections covering the requirements and design of each of the tools are presented. There are currently three integrated computer aided manufacturing definition (IDEF) modeling methodologies: IDEF0, IDEF1, and IDEF2. Also, four appendices exist to describe hardware and software requirements, installation procedures, and basic hardware usage.

  4. River Inflows into Lakes: Basin Temperature Profiles Driven By Peeling Detrainment from Dense Underflows

    NASA Astrophysics Data System (ADS)

    Hogg, C. A. R.; Huppert, H. E.; Imberger, J.; Dalziel, S. B.

    2014-12-01

    Dense gravity currents from river inflows feed fluid into confined basins in lakes. Large inflows can influence temperature profiles in the basins. Existing parameterisations of the circulation and mixing of such inflows are often based on the entrainment of ambient fluid into the underflowing gravity currents. However, recent observations have suggested that uni-directional entrainment into a gravity current does not fully describe the transfer between such gravity currents and the ambient water. Laboratory experiments visualised peeling detrainment from the gravity current occurring when the ambient fluid was stratified. A theoretical model of the observed peeling detrainment was developed to predict the temperature profile in the basin. This new model gives a better approximation of the temperature profile observed in the experiments than the pre-existing entraining model. The model can now be developed such that it integrates into operational models of lake basins.

  5. Generalized Ordinary Differential Equation Models 1

    PubMed Central

    Miao, Hongyu; Wu, Hulin; Xue, Hongqi

    2014-01-01

    Existing estimation methods for ordinary differential equation (ODE) models are not applicable to discrete data. The generalized ODE (GODE) model is therefore proposed and investigated for the first time. We develop the likelihood-based parameter estimation and inference methods for GODE models. We propose robust computing algorithms and rigorously investigate the asymptotic properties of the proposed estimator by considering both measurement errors and numerical errors in solving ODEs. The simulation study and application of our methods to an influenza viral dynamics study suggest that the proposed methods have a superior performance in terms of accuracy over the existing ODE model estimation approach and the extended smoothing-based (ESB) method. PMID:25544787
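
    As a rough illustration of the likelihood-based estimation idea (not the authors' GODE implementation), the sketch below fits parameters of a stand-in ODE by minimizing a Gaussian negative log-likelihood over numerical ODE solutions; the dynamics, function names, and data are all assumptions made for the example.

      # Minimal sketch of likelihood-based ODE parameter estimation (illustrative only,
      # not the GODE method itself). Assumes Gaussian measurement error.
      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import minimize

      def logistic_rhs(t, x, r, K):
          # Stand-in dynamics: logistic growth dx/dt = r*x*(1 - x/K)
          return r * x * (1.0 - x / K)

      def negative_log_likelihood(theta, t_obs, y_obs):
          r, K, sigma = theta
          if sigma <= 0 or K <= 0:
              return np.inf
          # For simplicity the initial state is fixed to the first observation
          sol = solve_ivp(logistic_rhs, (t_obs[0], t_obs[-1]), [y_obs[0]],
                          t_eval=t_obs, args=(r, K))
          if not sol.success:
              return np.inf
          resid = y_obs - sol.y[0]
          # Gaussian negative log-likelihood, up to an additive constant
          return 0.5 * np.sum(resid**2) / sigma**2 + len(y_obs) * np.log(sigma)

      # Synthetic data for demonstration
      t_obs = np.linspace(0, 10, 25)
      truth = solve_ivp(logistic_rhs, (0, 10), [0.5], t_eval=t_obs, args=(0.8, 10.0)).y[0]
      y_obs = truth + np.random.normal(0, 0.3, size=t_obs.size)

      fit = minimize(negative_log_likelihood, x0=[0.5, 8.0, 0.5],
                     args=(t_obs, y_obs), method="Nelder-Mead")
      print("estimated (r, K, sigma):", fit.x)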

  6. Generalized Ordinary Differential Equation Models.

    PubMed

    Miao, Hongyu; Wu, Hulin; Xue, Hongqi

    2014-10-01

    Existing estimation methods for ordinary differential equation (ODE) models are not applicable to discrete data. The generalized ODE (GODE) model is therefore proposed and investigated for the first time. We develop the likelihood-based parameter estimation and inference methods for GODE models. We propose robust computing algorithms and rigorously investigate the asymptotic properties of the proposed estimator by considering both measurement errors and numerical errors in solving ODEs. The simulation study and application of our methods to an influenza viral dynamics study suggest that the proposed methods have a superior performance in terms of accuracy over the existing ODE model estimation approach and the extended smoothing-based (ESB) method.

  7. Opportunities for increasing utility of models for rangeland management

    USDA-ARS?s Scientific Manuscript database

    A tremendous need exists for ecosystem models to assist in rangeland management, but the utility of models developed to date has been minimal for enterprise-level decision making. Three areas in which models have had limited effectiveness for land managers are 1) addressing contemporary needs associ...

  8. Risk prediction models of breast cancer: a systematic review of model performances.

    PubMed

    Anothaisintawee, Thunyarat; Teerawattananon, Yot; Wiratkapun, Chollathip; Kasamesup, Vijj; Thakkinstian, Ammarin

    2012-05-01

    An increasing number of risk prediction models have been developed for estimating the risk of breast cancer in individual women. However, the performance of these models is questionable. We therefore conducted a study with the aim of systematically reviewing previous risk prediction models. The results from this review help to identify the most reliable model and indicate the strengths and weaknesses of each model for guiding future model development. We searched MEDLINE (PubMed) from 1949 and EMBASE (Ovid) from 1974 until October 2010. Observational studies which constructed models using regression methods were selected. Information about model development and performance was extracted. Twenty-five out of 453 studies were eligible. Of these, 18 developed prediction models and 7 validated existing prediction models. Up to 13 variables were included in the models, and sample sizes for each study ranged from 550 to 2,404,636. Internal validation was performed in four models, while five models had external validation. The Gail and the Rosner and Colditz models were the significant models that were subsequently modified by other scholars. Calibration performance of most models was fair to good (expected/observed ratio: 0.87-1.12), but discriminatory accuracy was poor to fair both in internal validation (concordance statistics: 0.53-0.66) and in external validation (concordance statistics: 0.56-0.63). Most models yielded relatively poor discrimination in both internal and external validation. This poor discriminatory accuracy of existing models might be due to a lack of knowledge about risk factors, heterogeneous subtypes of breast cancer, and different distributions of risk factors across populations. In addition, the concordance statistic itself is insensitive to improvements in discrimination. Therefore, newer measures such as the net reclassification index should be considered when evaluating the performance of a newly developed model.
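
    For reference, the two performance measures cited above (calibration as an expected/observed ratio and discrimination as a concordance statistic) can be computed as in the following minimal sketch; the risk scores and outcomes are hypothetical.

      # Illustrative computation of the expected/observed ratio and the concordance
      # (c) statistic for a risk prediction model; data are hypothetical.
      import numpy as np

      predicted_risk = np.array([0.02, 0.10, 0.05, 0.30, 0.15, 0.08])  # model outputs
      observed_event = np.array([0, 1, 0, 1, 0, 1])                    # 1 = breast cancer

      # Calibration: ratio of expected to observed events (ideal value is 1.0)
      eo_ratio = predicted_risk.sum() / observed_event.sum()

      # Discrimination: concordance statistic = P(risk_case > risk_noncase),
      # counting ties as half-concordant
      cases = predicted_risk[observed_event == 1]
      noncases = predicted_risk[observed_event == 0]
      pairs = cases[:, None] - noncases[None, :]
      c_statistic = (np.sum(pairs > 0) + 0.5 * np.sum(pairs == 0)) / pairs.size

      print(f"E/O ratio = {eo_ratio:.2f}, c-statistic = {c_statistic:.2f}")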

  9. Empirical support for global integrated assessment modeling: Productivity trends and technological change in developing countries' agriculture and electric power sectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sathaye, Jayant A.

    2000-04-01

    Integrated assessment (IA) modeling of climate policy is increasingly global in nature, with models incorporating regional disaggregation. The existing empirical basis for IA modeling, however, largely arises from research on industrialized economies. Given the growing importance of developing countries in determining long-term global energy and carbon emissions trends, filling this gap with improved statistical information on developing countries' energy and carbon-emissions characteristics is an important priority for enhancing IA modeling. Earlier research at LBNL on this topic has focused on assembling and analyzing statistical data on productivity trends and technological change in the energy-intensive manufacturing sectors of five developing countries: India, Brazil, Mexico, Indonesia, and South Korea. The proposed work will extend this analysis to the agriculture and electric power sectors in India, South Korea, and two other developing countries. They will also examine the impact of alternative model specifications on estimates of productivity growth and technological change for each of the three sectors, and estimate the contribution of various capital inputs--imported vs. indigenous, rigid vs. malleable--in contributing to productivity growth and technological change. The project has already produced a data resource on the manufacturing sector which is being shared with IA modelers. This will be extended to the agriculture and electric power sectors, which would also be made accessible to IA modeling groups seeking to enhance the empirical descriptions of developing country characteristics. The project will entail basic statistical and econometric analysis of productivity and energy trends in these developing country sectors, with parameter estimates also made available to modeling groups. The parameter estimates will be developed using alternative model specifications that could be directly utilized by the existing IAMs for the manufacturing, agriculture, and electric power sectors.

  10. From translational research to open technology innovation systems.

    PubMed

    Savory, Clive; Fortune, Joyce

    2015-01-01

    The purpose of this paper is to question whether the emphasis placed within translational research on a linear model of innovation provides the most effective model for managing health technology innovation. Several alternative perspectives are presented that have potential to enhance the existing model of translational research. A case study is presented of innovation of a clinical decision support system. The paper concludes from the case study that extending the triple helix model of technology transfer to one based on a quadruple helix presents a basis for improving the performance of translational research. A case study approach is used to help understand development of an innovative technology within a teaching hospital. The case is then used to develop and refine a model of the health technology innovation system. The paper concludes from the case study that existing models of translational research could be refined further through the development of a quadruple helix model of health technology innovation that encompasses greater emphasis on user-led and open innovation perspectives. The paper presents several implications for future research based on the need to enhance the model of health technology innovation used to guide policy and practice. The quadruple helix model of innovation that is proposed can potentially guide alterations to the existing model of translational research in the healthcare sector. Several suggestions are made for how innovation activity can be better supported at both a policy and operational level. This paper presents a synthesis of the innovation literature applied to a theoretically important case of open innovation in the UK National Health Service. It draws in perspectives from other industrial sectors and applies them specifically to the management and organisation of innovation activities around health technology and the services in which they are embedded.

  11. Gene expression models for prediction of longitudinal dispersion coefficient in streams

    NASA Astrophysics Data System (ADS)

    Sattar, Ahmed M. A.; Gharabaghi, Bahram

    2015-05-01

    Longitudinal dispersion is the key hydrologic process that governs transport of pollutants in natural streams. It is critical for spill action centers to be able to predict the pollutant travel time and break-through curves accurately following accidental spills in urban streams. This study presents a novel gene expression model for longitudinal dispersion developed using 150 published data sets of geometric and hydraulic parameters in natural streams in the United States, Canada, Europe, and New Zealand. The training and testing of the model were accomplished using randomly-selected 67% (100 data sets) and 33% (50 data sets) of the data sets, respectively. Gene expression programming (GEP) is used to develop empirical relations between the longitudinal dispersion coefficient and various control variables, including the Froude number which reflects the effect of reach slope, aspect ratio, and the bed material roughness on the dispersion coefficient. Two GEP models have been developed, and the prediction uncertainties of the developed GEP models are quantified and compared with those of existing models, showing improved prediction accuracy in favor of GEP models. Finally, a parametric analysis is performed for further verification of the developed GEP models. The main reason for the higher accuracy of the GEP models compared to the existing regression models is that exponents of the key variables (aspect ratio and bed material roughness) are not constants but a function of the Froude number. The proposed relations are both simple and accurate and can be effectively used to predict the longitudinal dispersion coefficients in natural streams.
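
    The fitted GEP equations themselves are not reproduced here; the sketch below only illustrates the general functional form described in the abstract, in which the exponents of the aspect ratio and relative roughness vary with the Froude number. All coefficients are placeholders, not the published values.

      # Illustrative form only: a dispersion-coefficient relation whose exponents depend
      # on the Froude number, as described in the abstract. Coefficients a0..a4 are
      # placeholders, NOT the fitted GEP values from the paper.
      import numpy as np

      def dispersion_coefficient(width, depth, velocity, shear_velocity, g=9.81,
                                 a0=1.0, a1=0.5, a2=1.0, a3=-0.5, a4=1.0):
          froude = velocity / np.sqrt(g * depth)
          aspect_ratio = width / depth
          roughness = shear_velocity / velocity
          # Exponents are functions of the Froude number rather than constants
          kx_dimensionless = a0 * aspect_ratio**(a1 + a2 * froude) \
                                * roughness**(a3 + a4 * froude)
          return kx_dimensionless * depth * shear_velocity  # Kx in m^2/s

      print(dispersion_coefficient(width=20.0, depth=1.2, velocity=0.6,
                                   shear_velocity=0.05))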

  12. Weight and the Future of Space Flight Hardware Cost Modeling

    NASA Technical Reports Server (NTRS)

    Prince, Frank A.

    2003-01-01

    Weight has been used as the primary input variable for cost estimating almost as long as there have been parametric cost models. While there are good reasons for using weight, serious limitations exist. These limitations have been addressed by multi-variable equations and trend analysis in models such as NAFCOM, PRICE, and SEER; however, these models have not been able to address the significant time lags that can occur between the development of similar space flight hardware systems. These time lags make the cost analyst's job difficult because insufficient data exists to perform trend analysis, and the current set of parametric models are not well suited to accommodating process improvements in space flight hardware design, development, build and test. As a result, people of good faith can have serious disagreement over the cost for new systems. To address these shortcomings, new cost modeling approaches are needed. The most promising approach is process based (sometimes called activity) costing. Developing process based models will require a detailed understanding of the functions required to produce space flight hardware combined with innovative approaches to estimating the necessary resources. Particularly challenging will be the lack of data at the process level. One method for developing a model is to combine notional algorithms with a discrete event simulation and model changes to the total cost as perturbations to the program are introduced. Despite these challenges, the potential benefits are such that efforts should be focused on developing process based cost models.
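
    A process-based (activity) costing approach of the kind proposed could be prototyped along the following lines, with notional activity costs and programmatic changes applied as perturbation multipliers; this is a hypothetical sketch, not NAFCOM, PRICE, or SEER.

      # Hypothetical sketch of process-based (activity) cost estimation: each activity
      # has a notional duration and labor rate, and programmatic perturbations are
      # applied as multipliers. Not representative of NAFCOM, PRICE, or SEER.
      activities = [
          # (name, duration in months, staff, monthly cost per person in $K)
          ("design",      18, 25, 20.0),
          ("fabrication", 24, 40, 18.0),
          ("integration", 10, 30, 22.0),
          ("test",         8, 20, 25.0),
      ]

      def total_cost(activities, perturbations=None):
          perturbations = perturbations or {}
          cost = 0.0
          for name, months, staff, rate in activities:
              multiplier = perturbations.get(name, 1.0)  # e.g. process improvement or delay
              cost += months * staff * rate * multiplier
          return cost  # $K

      baseline = total_cost(activities)
      improved = total_cost(activities, {"fabrication": 0.85, "test": 0.9})
      print(f"baseline ${baseline:,.0f}K, with process improvements ${improved:,.0f}K")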

  13. Urban Runoff: Model Ordinances for Aquatic Buffers

    EPA Pesticide Factsheets

    Aquatic Buffers serve as natural boundaries between local waterways and existing development. The model and example ordinances below provide suggested language or technical guidance designed to create the most effective stream buffer zones possible.

  14. The Development Model Electronic Commerce of Regional Agriculture

    NASA Astrophysics Data System (ADS)

    Kang, Jun; Cai, Lecai; Li, Hongchan

    With the development of agricultural information systems, the growth of agricultural electronic commerce is an inevitable trend. On the basis of existing studies of e-commerce development and application models, combined with the characteristics of agricultural information and a comparison of development models in theory and practice, a new government-based development model for electronic commerce in regional agriculture is proposed, and key issues such as application security, payment modes, sharing mechanisms, and legal protection are analyzed. The coordination mechanism among regions is also discussed, which is significant for regulating the development of agricultural e-commerce and promoting regional economic development.

  15. MHD-model for low-frequency waves in a tokamak with toroidal plasma rotation and problem of existence of global geodesic acoustic modes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lakhin, V. P.; Sorokina, E. A., E-mail: sorokina.ekaterina@gmail.com, E-mail: vilkiae@gmail.com; Ilgisonis, V. I.

    2015-12-15

    A set of reduced linear equations for the description of low-frequency perturbations in toroidally rotating plasma in an axisymmetric tokamak is derived in the framework of ideal magnetohydrodynamics. A model suitable for the study of global geodesic acoustic modes (GGAMs) is designed. An example of the use of the developed model for the derivation of the integral conditions for GGAM existence and of the corresponding dispersion relation is presented. The paper is dedicated to the memory of academician V.D. Shafranov.

  16. Spatial analysis of rural land development

    Treesearch

    Seong-Hoon Cho; David H. Newman

    2005-01-01

    This article examines patterns of rural land development and density using spatial econometric models with the application of Geographical Information System (GIS). The cluster patterns of both development and high-density development indicate that the spatially continuous expansions of development and high-density development exist in relatively remote rural areas....

  17. The Cognitive Processes Associated with Occupational/Career Indecision: A Model for Gifted Adolescents

    ERIC Educational Resources Information Center

    Jung, Jae Yup

    2013-01-01

    This study developed and tested a new model of the cognitive processes associated with occupational/career indecision for gifted adolescents. A survey instrument with rigorous psychometric properties, developed from a number of existing instruments, was administered to a sample of 687 adolescents attending three academically selective high schools…

  18. Development of a biorefinery optimized biofuel supply curve for the western United States

    Treesearch

    Nathan Parker; Peter Tittmann; Quinn Hart; Richard Nelson; Ken Skog; Anneliese Schmidt; Edward Gray; Bryan Jenkins

    2010-01-01

    A resource assessment and biorefinery siting optimization model was developed and implemented to assess potential biofuel supply across the Western United States from agricultural, forest, urban, and energy crop biomass. Spatial information including feedstock resources, existing and potential refinery locations and a transportation network model is provided to a mixed...

  19. Pedagogical Catalysts of Civic Competence: The Development of a Critical Epistemological Model for Community-Based Learning

    ERIC Educational Resources Information Center

    Stokamer, Stephanie

    2013-01-01

    Democratic problem-solving necessitates an active and informed citizenry, but existing research on service-learning has shed little light on the relationship between pedagogical practices and civic competence outcomes. This study developed and tested a model to represent that relationship and identified pedagogical catalysts of civic competence…

  20. Entrepreneurial Universities and the Development of Regional Societies: A Spatial View of the Europe of Knowledge

    ERIC Educational Resources Information Center

    Kitagawa, Fumi

    2005-01-01

    This article highlights a range of university entrepreneurship activities and regional engagement in relation to current governance and finance issues. A model for networking and developing partnership between universities and their region is presented, which reflects existing and emerging European level policy instruments. This model aims at…

  1. An Investigation of Jogging Biomechanics using the Full-Body Lumbar Spine Model: Model Development and Validation

    PubMed Central

    Raabe, Margaret E.; Chaudhari, Ajit M.W.

    2016-01-01

    The ability of a biomechanical simulation to produce results that can translate to real-life situations is largely dependent on the physiological accuracy of the musculoskeletal model. Only a limited number of freely available, full-body models exist in OpenSim, and those that do exist are very limited in terms of trunk musculature and degrees of freedom in the spine. Properly modeling the motion and musculature of the trunk is necessary to most accurately estimate lower extremity and spinal loading. The objective of this study was to develop and validate a more physiologically accurate OpenSim full-body model. By building upon three previously developed OpenSim models, the Full-Body Lumbar Spine (FBLS) model, comprised of 21 segments, 30 degrees-of-freedom, and 324 musculotendon actuators, was developed. The five lumbar vertebrae were modeled as individual bodies, and coupled constraints were implemented to describe the net motion of the spine. The eight major muscle groups of the lumbar spine were modeled (rectus abdominis, external and internal obliques, erector spinae, multifidus, quadratus lumborum, psoas major, and latissimus dorsi), and many of these muscle groups were modeled as multiple fascicles allowing the large muscles to act in multiple directions. The resulting FBLS model's trunk muscle geometry, maximal isometric joint moments, and simulated muscle activations compare well to experimental data. The FBLS model will be made freely available (https://simtk.org/home/fullbodylumbar) for others to perform additional analyses and develop simulations investigating full-body dynamics and contributions of the trunk muscles to dynamic tasks. PMID:26947033

  2. A Knowledge Discovery from POS Data using State Space Models

    NASA Astrophysics Data System (ADS)

    Sato, Tadahiko; Higuchi, Tomoyuki

    The number of competing brands changes when a new product enters the market. New product introductions are endemic among consumer packaged goods firms and are an integral component of their marketing strategy. Because a new product's entry affects markets, there is a pressing need to develop market response models that can adapt to such changes. In this paper, we develop a dynamic model that captures the underlying evolution of the buying behavior associated with the new product. This extends an application of a dynamic linear model, which is used in a number of time series analyses, by allowing the observed dimension to change at some point in time. Our model copes with two problems that dynamic environments entail: changes in parameters over time and changes in the observed dimension. We formulate the model within the framework of a state space model. We estimate the model using a modified Kalman filter/fixed-interval smoother. We find that a new product's entry (1) decreases brand differentiation for existing brands, as indicated by a decreasing difference between cross-price elasticities; (2) decreases commodity power for existing brands, as indicated by a decreasing trend; and (3) decreases the effect of discounts for existing brands, as indicated by a decrease in the magnitude of own-brand price elasticities. The proposed framework is directly applicable to other fields in which the observed dimension might change, such as economics, bioinformatics, and so forth.
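
    The filtering machinery behind such a state space model can be sketched generically as below: a linear-Gaussian Kalman filter in which the observation matrix and noise covariance may change dimension when a new brand enters. This is an illustrative sketch under simplified assumptions, not the authors' POS model.

      # Generic Kalman filter sketch for a linear-Gaussian state space model in which
      # the observation dimension may change over time (e.g. a new brand entering).
      import numpy as np

      def kalman_filter(ys, Hs, F, Q, Rs, x0, P0):
          """ys[t], Hs[t], Rs[t] may have different dimensions at each time step."""
          x, P = x0, P0
          filtered = []
          for y, H, R in zip(ys, Hs, Rs):
              # Predict
              x = F @ x
              P = F @ P @ F.T + Q
              # Update
              S = H @ P @ H.T + R
              K = P @ H.T @ np.linalg.inv(S)
              x = x + K @ (y - H @ x)
              P = (np.eye(len(x)) - K @ H) @ P
              filtered.append(x.copy())
          return filtered

      # Two-dimensional latent state; observation dimension grows from 1 to 2 at t=2
      F = np.eye(2); Q = 0.01 * np.eye(2)
      x0 = np.zeros(2); P0 = np.eye(2)
      ys = [np.array([1.0]), np.array([1.1]), np.array([1.2, 0.4]), np.array([1.3, 0.5])]
      Hs = [np.array([[1.0, 0.0]])] * 2 + [np.eye(2)] * 2
      Rs = [0.1 * np.eye(1)] * 2 + [0.1 * np.eye(2)] * 2
      print(kalman_filter(ys, Hs, F, Q, Rs, x0, P0))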

  3. Development of a numerical model to predict physiological strain of firefighter in fire hazard.

    PubMed

    Su, Yun; Yang, Jie; Song, Guowen; Li, Rui; Xiang, Chunhui; Li, Jun

    2018-02-26

    This paper aims to develop a numerical model to predict the heat stress of firefighters under low-level thermal radiation. The model integrates a modified multi-layer clothing model with a human thermoregulation model, accounting for the coupled radiative and conductive heat transfer in the clothing, the size-dependent heat transfer in the air gaps, and both the active (controlling) and passive (controlled) thermal regulation of the human body. The predicted core temperature and mean skin temperature from the model showed good agreement with the experimental results. A parametric study was conducted, and the results demonstrated that the radiative intensity had a significant influence on physiological heat strain. The presence of an air gap had a positive effect on physiological heat strain when the air gap size was small. However, when the size of the air gap exceeded 6 mm, a different trend was observed due to the occurrence of natural convection. Additionally, physiological heat strain persisted for longer than skin burn under the various heat exposures. The findings obtained in this study provide a better understanding of the physiological strain of firefighters and shed light on textile material engineering for achieving higher protective performance.
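
    The size-dependent air-gap heat transfer mentioned above can be illustrated with a simplified steady-state estimate combining conduction through still air with radiation between the clothing and skin surfaces; this is a generic sketch under stated assumptions, not the paper's transient numerical model.

      # Simplified steady-state estimate of heat flux across a clothing-skin air gap:
      # conduction in still air plus radiation between parallel gray surfaces.
      # Illustrative only; the paper's model also handles transient effects and natural
      # convection for gaps larger than about 6 mm.
      import numpy as np

      SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
      K_AIR = 0.028     # thermal conductivity of still air, W/(m K)

      def air_gap_heat_flux(gap_m, t_cloth_c, t_skin_c, eps_cloth=0.9, eps_skin=0.95):
          t_cloth, t_skin = t_cloth_c + 273.15, t_skin_c + 273.15
          q_cond = (K_AIR / gap_m) * (t_cloth - t_skin)
          # Radiation exchange between two parallel gray surfaces
          q_rad = SIGMA * (t_cloth**4 - t_skin**4) / (1/eps_cloth + 1/eps_skin - 1)
          return q_cond + q_rad   # W/m^2

      for gap_mm in (2, 4, 6):
          print(gap_mm, "mm:", round(air_gap_heat_flux(gap_mm / 1000, 80.0, 35.0), 1), "W/m^2")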

  4. Habitat Suitability Index Models: Hairy woodpecker

    USGS Publications Warehouse

    Sousa, Patrick J.

    1987-01-01

    A review and synthesis of existing information were used to develop a Habitat Suitability Index (HSI) model for the hairy woodpecker (Picoides villosus). The model consolidates habitat use information into a framework appropriate for field application, and is scaled to produce an index between 0.0 (unsuitable habitat) and 1.0 (optimum habitat). HSI models are designed to be used with Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service.
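
    HSI models of this kind typically aggregate individual suitability indices (each on a 0.0-1.0 scale) into an overall index, often via a geometric mean; the sketch below illustrates that general pattern with hypothetical variables and optima, not the published hairy woodpecker equations.

      # Generic illustration of aggregating habitat variables into an HSI on [0, 1].
      # The variables, optima, and aggregation rule are hypothetical.
      import numpy as np

      def suitability_basal_area(basal_area_m2_per_ha, optimum=25.0):
          return float(np.clip(basal_area_m2_per_ha / optimum, 0.0, 1.0))

      def suitability_snag_density(snags_per_ha, optimum=10.0):
          return float(np.clip(snags_per_ha / optimum, 0.0, 1.0))

      def hsi(*suitability_indices):
          # Geometric mean keeps the index in [0, 1] and lets any unsuitable
          # component (0.0) drive the overall index to 0.0
          return float(np.prod(suitability_indices) ** (1.0 / len(suitability_indices)))

      si1 = suitability_basal_area(18.0)
      si2 = suitability_snag_density(6.0)
      print(f"SI1={si1:.2f}, SI2={si2:.2f}, HSI={hsi(si1, si2):.2f}")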

  5. Habitat Suitability Index Models: Diamondback terrapin (nesting) - Atlantic coast

    USGS Publications Warehouse

    Palmer, William M.; Cordes, Carroll L.

    1988-01-01

    A review and synthesis of existing information were used to develop a Habitat Suitability Index (HSI) model for the diamondback terrapin (Malaclemys terrapin). The model consolidates habitat use information into a framework appropriate for field application, and is scaled to produce an index between 0.0 (unsuitable habitat) and 1.0 (optimum habitat). HSI models are designed to be used with Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service.

  6. Habitat Suitability Index Models: Snapping turtle

    USGS Publications Warehouse

    Graves, Brent M.; Anderson, Stanley H.

    1987-01-01

    A review and synthesis of existing information were used to develop a Habitat Suitability Index (HSI) model for the snapping turtle (Chelydra serpentina). The model consolidates habitat use information into a framework appropriate for field application, and is scaled to produce an index between 0.0 (unsuitable habitat) and 1.0 (optimum habitat). HSI models are designed to be used with Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service.

  7. Habitat Suitability Index Models: Red-winged blackbird

    USGS Publications Warehouse

    Short, Henry L.

    1985-01-01

    A review and synthesis of existing information were used to develop a Habitat Suitability Index (HSI) model for the red-winged blackbird (Agelaius phoeniceus L.). The model consolidates habitat use information into a framework appropriate for field application, and is scaled to produce an index between 0.0 (unsuitable habitat) and 1.0 (optimum habitat). HSI models are designed to be used with Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service.

  8. A Simulation Model for Studying Effects of Pollution and Freshwater Inflow on Secondary Productivity in an Ecosystem. Ph.D. Thesis - North Carolina State Univ.

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1974-01-01

    A mathematical model of an ecosystem is developed. Secondary productivity is evaluated in terms of man-related and controllable factors. Information from an existing physical parameters model is used, as well as pertinent biological measurements. Predictive information of value to estuarine management is presented. Biological, chemical, and physical parameters measured in order to develop models of ecosystems are identified.

  9. Habitat Suitability Index Models: Spotted owl

    USGS Publications Warehouse

    Laymon, Stephen A.; Salwasser, Hal; Barrett, Reginald H.

    1985-01-01

    A review and synthesis of existing information were used to develop a Habitat Suitability Index (HSI) model for the spotted owl (Strix occidentalis). The model consolidates habitat use information into a framework appropriate for field application, and is scaled to produce an index between 0.0 (unsuitable habitat) and 1.0 (optimum habitat). HSI models are designed to be used with Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service.

  10. An Eye Model for Computational Dosimetry Using A Multi-Scale Voxel Phantom

    NASA Astrophysics Data System (ADS)

    Caracappa, Peter F.; Rhodes, Ashley; Fiedler, Derek

    2014-06-01

    The lens of the eye is a radiosensitive tissue with cataract formation being the major concern. Recently reduced recommended dose limits to the lens of the eye have made understanding the dose to this tissue of increased importance. Due to memory limitations, the voxel resolution of computational phantoms used for radiation dose calculations is too large to accurately represent the dimensions of the eye. A revised eye model is constructed using physiological data for the dimensions of radiosensitive tissues, and is then transformed into a high-resolution voxel model. This eye model is combined with an existing set of whole body models to form a multi-scale voxel phantom, which is used with the MCNPX code to calculate radiation dose from various exposure types. This phantom provides an accurate representation of the radiation transport through the structures of the eye. Two alternate methods of including a high-resolution eye model within an existing whole body model are developed. The accuracy and performance of each method is compared against existing computational phantoms.

  11. Women's Self-definition in Adulthood: From a Different Model?

    ERIC Educational Resources Information Center

    Peck, Teresa A.

    1986-01-01

    Examines criticisms of existing models of adult development from both feminist and developmental psychologists. A model of women's adult self-definition is presented, based upon current research on women's adult experience. The model combines a dialectical approach, which considers the effects of social/historical factors, with a feminist…

  12. Conceptual Models, Choices, and Benchmarks for Building Quality Work Cultures.

    ERIC Educational Resources Information Center

    Acker-Hocevar, Michele

    1996-01-01

    The two models in Florida's Educational Quality Benchmark System represent a new way of thinking about developing schools' work culture. The Quality Performance System Model identifies nine dimensions of work within a quality system. The Change Process Model provides a theoretical framework for changing existing beliefs, attitudes, and behaviors…

  13. MODELING THE DYNAMICS OF THREE FUNCTIONAL GROUPS OF MACROALGAE IN TROPICAL SEAGRASS HABITATS. (R828677C004)

    EPA Science Inventory

    A model of three functional groups of macroalgae, drift algae, rhizophytic calcareous algae, and seagrass epiphytes, was developed to complement an existing seagrass production model for tropical habitats dominated by Thalassia testudinum (Turtle-grass). The current modeling e...

  14. Model-Based Reasoning in the Physics Laboratory: Framework and Initial Results

    ERIC Educational Resources Information Center

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-01-01

    We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable…

  15. Community models for wildlife impact assessment: a review of concepts and approaches

    USGS Publications Warehouse

    Schroeder, Richard L.

    1987-01-01

    The first two sections of this paper are concerned with defining and bounding communities, and describing those attributes of the community that are quantifiable and suitable for wildlife impact assessment purposes. Prior to the development or use of a community model, it is important to have a clear understanding of the concept of a community and a knowledge of the types of community attributes that can serve as outputs for the development of models. Clearly defined, unambiguous model outputs are essential for three reasons: (1) to ensure that the measured community attributes relate to the wildlife resource objectives of the study; (2) to allow testing of the outputs in experimental studies, to determine accuracy, and to allow for improvements based on such testing; and (3) to enable others to clearly understand the community attribute that has been measured. The third section of this paper describes input variables that may be used to predict various community attributes. These input variables do not include direct measures of wildlife populations. Most impact assessments involve projects that result in drastic changes in habitat, such as changes in land use, vegetation, or available area. Therefore, the model input variables described in this section deal primarily with habitat related features. Several existing community models are described in the fourth section of this paper. A general description of each model is provided, including the nature of the input variables and the model output. The logic and assumptions of each model are discussed, along with data requirements needed to use the model. The fifth section provides guidance on the selection and development of community models. Identification of the community attribute that is of concern will determine the type of model most suitable for a particular application. This section provides guidelines on selecting an existing model, as well as a discussion of the major steps to be followed in modifying an existing model or developing a new model. Considerations associated with the use of community models with the Habitat Evaluation Procedures are also discussed. The final section of the paper summarizes major findings of interest to field biologists and provides recommendations concerning the implementation of selected concepts in wildlife community analyses.

  16. Methods for cost estimation in software project management

    NASA Astrophysics Data System (ADS)

    Briciu, C. V.; Filip, I.; Indries, I. I.

    2016-02-01

    The speed at which the processes used in the software development field have changed makes it very difficult to forecast the overall costs for a software project. Many researchers have considered this task unachievable, but there is a group of scientists for whom this task can be solved using already known mathematical methods (e.g., multiple linear regression) and newer techniques such as genetic programming and neural networks. The paper presents a solution for building cost estimation models in software project management using genetic algorithms, starting from the PROMISE datasets related to the COCOMO 81 model. In the first part of the paper, a summary of the major achievements in the research area of finding a model for estimating overall project costs is presented, together with a description of the existing software development process models. In the last part, a basic proposal for a mathematical model based on genetic programming is presented, including a description of the chosen fitness function and chromosome representation. The perspective of the described model is linked with the current reality of software development, taking the software product life cycle and the current challenges and innovations in the software development area as a basis. Based on the authors' experience and the analysis of existing models and product life cycles, it was concluded that estimation models should be adapted to new technologies and emerging systems, and that they depend largely on the chosen software development method.
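
    For orientation, the classical baseline that such data-driven approaches are compared against is the basic COCOMO 81 effort equation; the sketch below shows the standard organic-mode form together with a typical fitness measure (MMRE) on hypothetical project data. The paper's actual fitness function and chromosome representation are not reproduced here.

      # Basic COCOMO 81 effort equation (organic mode), the classical baseline that
      # data-driven approaches such as genetic programming are compared against.
      def cocomo81_basic_effort(kloc, a=2.4, b=1.05):
          """Effort in person-months = a * (KLOC)^b, with organic-mode coefficients."""
          return a * kloc ** b

      def mean_magnitude_relative_error(actual, predicted):
          # A typical fitness measure for evolved cost models (MMRE)
          return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

      projects_kloc = [10, 32, 78]
      actual_effort = [24, 82, 230]   # person-months (hypothetical)
      predicted = [cocomo81_basic_effort(k) for k in projects_kloc]
      print(predicted, mean_magnitude_relative_error(actual_effort, predicted))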

  17. Transportation Impact Evaluation System

    DOT National Transportation Integrated Search

    1979-11-01

    This report specifies a framework for spatial analysis and the general modelling steps required. It also suggests available urban and regional data sources, along with some typical existing urban and regional models. The goal is to develop a computer...

  18. Malaysian Private Education Quality: Application of SERVQUAL Model

    ERIC Educational Resources Information Center

    Vaz, Anthony; Mansori, Shaheen

    2013-01-01

    Intense competition among existing private education providers and the Malaysian government's relaxation of regulations for allowing international universities to open off shore campuses in Malaysia, have forced companies in the education industry to develop strategies which can help them to make their existing students satisfied and keep them…

  19. 40 CFR 91.501 - Applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... information used as a basis for the FEL (e.g., previous emission tests, development tests), the specific...), (2) and (3) of this section. (1) The provisions of this subpart are waived for existing technology OB... provisions of this subpart for existing technology OB/PWC for a specific engine family through model year...

  20. A Proposal for a K-12 Sequence of Environmental Education Competencies.

    ERIC Educational Resources Information Center

    Culbert, Jack; And Others

    Presented is an overview and model of the proposed curriculum development process in environmental education in Connecticut. Concepts and competencies are identified at each grade level and are designed to facilitate the infusion of environmental education activities within the existing curricula using existing learning resources such as…

  1. EXPOSURE RELATED DOSE ESTIMATING MODEL ( ERDEM ) A PHYSIOLOGICALLY-BASED PHARMACOKINETIC AND PHARMACODYNAMIC ( PBPK/PD ) MODEL FOR ASSESSING HUMAN EXPOSURE AND RISK

    EPA Science Inventory

    The Exposure Related Dose Estimating Model (ERDEM) is a PBPK/PD modeling system that was developed by EPA's National Exposure Research Laboratory (NERL). The ERDEM framework provides the flexibility either to use existing models or to build new PBPK and PBPK/PD models to address...

  2. A review of methods for predicting air pollution dispersion

    NASA Technical Reports Server (NTRS)

    Mathis, J. J., Jr.; Grose, W. L.

    1973-01-01

    Air pollution modeling and problem areas in air pollution dispersion modeling were surveyed. Emission source inventory, meteorological data, and turbulent diffusion are discussed in terms of developing a dispersion model. Existing mathematical models of urban air pollution, and highway and airport models, are discussed along with their limitations. Recommendations for improving modeling capabilities are included.

  3. An investigation of modelling and design for software service applications

    PubMed Central

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the ‘design model’. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model. PMID:28489905

  4. Habitat Suitability Index Models: Laughing gull

    USGS Publications Warehouse

    Zale, Alexander V.; Mulholland, Rosemarie

    1985-01-01

    A review and synthesis of existing information were used to develop a habitat model for laughing gull (Larus atricilla). The model is scaled to produce an index of habitat suitability between 0 (unsuitable habitat) and 1.0 (optimally suitable habitat) for areas along the Gulf of Mexico coast. Habitat suitability indices are designed for use with the Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service. Guidelines for application of the model and techniques for measuring model variables are described.

  5. A survey of Applied Psychological Services' models of the human operator

    NASA Technical Reports Server (NTRS)

    Siegel, A. I.; Wolf, J. J.

    1979-01-01

    A historical perspective is presented in terms of the major features and status of two families of computer simulation models in which the human operator plays the primary role. Both task-oriented and message-oriented models are included. Two other recent efforts are summarized which deal with visual information processing. These involve not the development of whole models but a family of subroutines customized to add the human aspects to existing models. A global diagram of the generalized model development/validation process is presented and related to 15 criteria for model evaluation.

  6. Principle of Spacetime and Black Hole Equivalence

    NASA Astrophysics Data System (ADS)

    Zhang, Tianxi

    2016-06-01

    Modelling the universe without relying on a set of hypothetical entities (HEs) to explain observations and overcome problems and difficulties is essential to developing a physical cosmology. The well-known big bang cosmology, widely accepted as the standard model, stands on two fundamentals, which are Einstein's general relativity (GR) that describes the effect of matter on spacetime and the cosmological principle (CP) of spacetime isotropy and homogeneity. The field equation of GR along with the Friedmann-Lemaitre-Robertson-Walker (FLRW) metric of spacetime derived from CP generates the Friedmann equation (FE) that governs the development and dynamics of the universe. The big bang theory has achieved impressive successes in explaining the universe, but it still has problems, and the solutions to them rely on an increasing number of HEs such as inflation, dark matter, dark energy, and so on. Recently, the author has developed a new cosmological model called the black hole universe, which, instead of making many of those hypotheses, includes only a single new postulate (or a new principle) in the cosmology - the Principle of Spacetime and Black Hole Equivalence (SBHEP) - to explain all the existing observations of the universe and overcome all the existing problems in conventional cosmologies. This study thoroughly demonstrates how this newly developed black hole universe model, which therefore stands on three fundamentals (GR, CP, and SBHEP), can fully explain the universe and easily overcome the difficulties using well-developed physics, thus neither needing any other hypotheses nor leaving any unsolved difficulties. This work was supported by NSF/REU (Grant #: PHY-1263253) at Alabama A & M University.
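
    For reference, the Friedmann equation obtained by inserting the FLRW metric into the GR field equation has the standard form

      \left(\frac{\dot a}{a}\right)^{2} = \frac{8\pi G}{3}\,\rho - \frac{k c^{2}}{a^{2}} + \frac{\Lambda c^{2}}{3},

    where a(t) is the scale factor, \rho the energy density, k the spatial curvature index, and \Lambda the cosmological constant.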

  7. RAPID ASSESSMENT OF URBAN WETLANDS: FUNCTIONAL ASSESSMENT MODEL DEVELOPMENT AND EVALUATION

    EPA Science Inventory

    The objective of this study was to test the ability of existing hydrogeomorphic (HGM) functional assessment models and our own proposed models to predict rates of nitrate production and removal, functions critical to water quality protection, in forested riparian wetlands in nort...

  8. A microcomputer model for simulating pressurized flow in a storm sewer system : interim report.

    DOT National Transportation Integrated Search

    1988-01-01

    A study is being conducted on the development of a microcomputer model for simulating storm sewer flow under surcharged or pressurized conditions. Several existing models, including the EPA Storm Water Management Model (SWMM) and the Illinois Urban D...

  9. Determination of wind tunnel constraint effects by a unified pressure signature method. Part 2: Application to jet-in-crossflow

    NASA Technical Reports Server (NTRS)

    Hackett, J. E.; Sampath, S.; Phillips, C. G.

    1981-01-01

    The development of an improved jet-in-crossflow model for estimating wind tunnel blockage and angle-of-attack interference is described. Experiments showed that the simpler existing models fall seriously short of representing far-field flows properly. A new, vortex-source-doublet (VSD) model was therefore developed which employs curved trajectories and experimentally-based singularity strengths. The new model is consistent with existing and new experimental data and it predicts tunnel wall (i.e. far-field) pressures properly. It is implemented as a preprocessor to the wall-pressure-signature-based tunnel interference predictor. The supporting experiments and theoretical studies revealed some new results. Comparative flow field measurements with 1-inch "free-air" and 3-inch impinging jets showed that vortex penetration into the flow, in diameters, was almost unaltered until 'hard' impingement occurred. In modeling impinging cases, a 'plume redirection' term was introduced which is apparently absent in previous models. The effects of this term were found to be very significant.

  10. Cloud Computing Value Chains: Understanding Businesses and Value Creation in the Cloud

    NASA Astrophysics Data System (ADS)

    Mohammed, Ashraf Bany; Altmann, Jörn; Hwang, Junseok

    Based on the promising developments in Cloud Computing technologies in recent years, commercial computing resource services (e.g. Amazon EC2) or software-as-a-service offerings (e.g. Salesforce.com) came into existence. However, the relatively weak business exploitation, participation, and adoption of other Cloud Computing services remain the main challenges. The vague value structures seem to be hindering business adoption and the creation of sustainable business models around its technology. Using an extensive analysis of existing Cloud business models, Cloud services, stakeholder relations, market configurations and value structures, this Chapter develops a reference model for value chains in the Cloud. Although this model is theoretically based on Porter's value chain theory, the proposed Cloud value chain model is upgraded to fit the diversity of business service scenarios in the Cloud computing markets. Using this model, different service scenarios are explained. Our findings suggest new services, business opportunities, and policy practices for realizing more adoption and value creation paths in the Cloud.

  11. Balancing the Roles of a Family Medicine Residency Faculty: A Grounded Theory Study.

    PubMed

    Reitz, Randall; Sudano, Laura; Siler, Anne; Trimble, Kristopher

    2016-05-01

    Great variety exists in the roles that family medicine residency faculty fill in the lives of their residents. A family medicine-specific model has never been created to describe and promote effective training relationships. This research aims to create a consensus model for faculty development, ethics education, and policy creation. Using a modified grounded theory method, researchers conducted phone interviews with 22 key informants from US family medicine residencies. Data were analyzed to delineate faculty roles, common role conflicts, and ethical principles for avoiding and managing role conflicts. Key informants were asked to apply their experience and preferences to adapt an existing model to fit with family medicine residency settings. The primary result of this research is the creation of a family medicine-specific model that describes faculty roles and provides insight into how to manage role conflicts with residents. Primary faculty roles include Role Model, Advisor, Teacher, Supervisor, and Evaluator. Secondary faculty roles include Friendly Colleague, Wellness Supporter, and Helping Hand. The secondary roles exist on a continuum from disengaged to enmeshed. When not balanced, the secondary roles can detract from the primary roles. Differences were found between role expectations of physician versus behavioral science faculty and larger/university/urban residencies versus smaller/community/rural residencies. Diversity of opinion exists related to the types of roles that are appropriate for family medicine faculty to maintain with residents. This new model is a first attempt to build consensus in the field and has application to faculty development, ethics education, and policy creation.

  12. Assessing the feasibility, cost, and utility of developing models of human performance in aviation

    NASA Technical Reports Server (NTRS)

    Stillwell, William

    1990-01-01

    The purpose of the effort outlined in this briefing was to determine whether models exist or can be developed that can be used to address aviation automation issues. A multidisciplinary team has been assembled to undertake this effort, including experts in human performance, team/crew, and aviation system modeling, and aviation data used as input to such models. The project consists of two phases, a requirements assessment phase that is designed to determine the feasibility and utility of alternative modeling efforts, and a model development and evaluation phase that will seek to implement the plan (if a feasible cost effective development effort is found) that results from the first phase. Viewgraphs are given.

  13. Using existing case-mix methods to fund trauma cases.

    PubMed

    Monakova, Julia; Blais, Irene; Botz, Charles; Chechulin, Yuriy; Picciano, Gino; Basinski, Antoni

    2010-01-01

    Policymakers frequently face the need to increase funding in isolated and frequently heterogeneous (clinically and in terms of resource consumption) patient subpopulations. This article presents a methodologic solution for testing the appropriateness of using existing grouping and weighting methodologies for funding subsets of patients in the scenario where a case-mix approach is preferable to a flat-rate based payment system. Using as an example the subpopulation of trauma cases of Ontario lead trauma hospitals, the statistical techniques of linear and nonlinear regression models, regression trees, and spline models were applied to examine the fit of the existing case-mix groups and reference weights for the trauma cases. The analyses demonstrated that for funding Ontario trauma cases, the existing case-mix systems can form the basis for rational and equitable hospital funding, decreasing the need to develop a different grouper for this subset of patients. This study confirmed that Injury Severity Score is a poor predictor of costs for trauma patients. Although our analysis used the Canadian case-mix classification system and cost weights, the demonstrated concept of using existing case-mix systems to develop funding rates for specific subsets of patient populations may be applicable internationally.
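
    The kind of comparison described (whether existing case-mix weights explain trauma-case costs better than a flat rate) can be illustrated with a simple least-squares sketch; the weights, costs, and variable names are hypothetical.

      # Illustrative check of whether existing case-mix cost weights explain trauma
      # costs better than a flat rate. Data and variable names are hypothetical.
      import numpy as np

      case_weight = np.array([0.8, 1.2, 2.5, 4.0, 1.0, 3.2])    # existing reference weights
      actual_cost = np.array([6.1, 9.8, 21.0, 35.5, 8.2, 27.4])  # $K per case

      # Flat-rate model: every case funded at the mean cost
      ss_total = np.sum((actual_cost - actual_cost.mean())**2)

      # Weight-based model: cost = rate * case_weight (least-squares rate through origin)
      rate = np.sum(case_weight * actual_cost) / np.sum(case_weight**2)
      ss_resid = np.sum((actual_cost - rate * case_weight)**2)

      r_squared = 1.0 - ss_resid / ss_total
      print(f"per-weight rate = {rate:.2f} $K, R^2 vs flat rate = {r_squared:.2f}")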

  14. Models and theories of prescribing decisions: A review and suggested a new model.

    PubMed

    Murshid, Mohsen Ali; Mohaidin, Zurina

    2017-01-01

    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via an exploratory rather than a theoretical approach. Therefore, this review is an attempt to suggest a valuable conceptual model that explains the theoretical linkages existing between marketing efforts, the patient, the pharmacist, and the physician's decision to prescribe drugs. The paper follows an inclusive review approach and applies previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, the report identifies and uses several valuable perspectives, such as the 'persuasion theory - elaboration likelihood model', the 'stimuli-response marketing model', the 'agency theory', the 'theory of planned behaviour', and 'social power theory', in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This unique model has the potential for use in further research.

  15. A Simplified Micromechanical Modeling Approach to Predict the Tensile Flow Curve Behavior of Dual-Phase Steels

    NASA Astrophysics Data System (ADS)

    Nanda, Tarun; Kumar, B. Ravi; Singh, Vishal

    2017-11-01

    Micromechanical modeling is used to predict a material's tensile flow curve behavior based on microstructural characteristics. This research develops a simplified micromechanical modeling approach for predicting the flow curve behavior of dual-phase steels. The existing literature reports two broad approaches for determining the tensile flow curve of these steels. The modeling approach developed in this work attempts to overcome specific limitations of the two existing approaches. This approach combines a dislocation-based strain-hardening method with the rule of mixtures. In the first step of modeling, the 'dislocation-based strain-hardening method' was employed to predict the tensile behavior of the individual ferrite and martensite phases. In the second step, the individual flow curves were combined using the 'rule of mixtures' to obtain the composite dual-phase flow behavior. To check the accuracy of the proposed model, four distinct dual-phase microstructures comprising different ferrite grain sizes, martensite fractions, and carbon contents in martensite were processed by annealing experiments. The true stress-strain curves for the various microstructures were predicted with the newly developed micromechanical model. The results of the micromechanical model matched closely with those of actual tensile tests. Thus, this micromechanical modeling approach can be used to predict and optimize the tensile flow behavior of dual-phase steels.
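
    The two-step approach described (phase-level flow curves combined by a rule of mixtures) can be sketched as follows; the saturating hardening law and all parameter values are generic placeholders, not the calibrated dislocation-based model of the paper.

      # Sketch of the two-step approach: saturating (dislocation-hardening style) flow
      # curves for ferrite and martensite, combined by a rule of mixtures. Parameters
      # are generic placeholders, not calibrated values from the paper.
      import numpy as np

      def phase_flow_stress(strain, sigma0, delta_sigma, k):
          # Generic saturating law: sigma = sigma0 + delta_sigma * (1 - exp(-k*strain))
          return sigma0 + delta_sigma * (1.0 - np.exp(-k * strain))

      def dual_phase_flow_curve(strain, martensite_fraction):
          sigma_ferrite = phase_flow_stress(strain, sigma0=300.0, delta_sigma=350.0, k=12.0)
          sigma_martensite = phase_flow_stress(strain, sigma0=1200.0, delta_sigma=400.0, k=25.0)
          # Rule of mixtures (iso-strain assumption), stresses in MPa
          return (1.0 - martensite_fraction) * sigma_ferrite \
                 + martensite_fraction * sigma_martensite

      strain = np.linspace(0.0, 0.15, 7)
      print(np.round(dual_phase_flow_curve(strain, martensite_fraction=0.3), 1))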

  16. Anticipating Forest and Range Land Development in Central Oregon (USA) for Landscape Analysis, with an Example Application Involving Mule Deer

    NASA Astrophysics Data System (ADS)

    Kline, Jeffrey D.; Moses, Alissa; Burcsu, Theresa

    2010-05-01

    Forest policymakers, public lands managers, and scientists in the Pacific Northwest (USA) seek ways to evaluate the landscape-level effects of policies and management through the multidisciplinary development and application of spatially explicit methods and models. The Interagency Mapping and Analysis Project (IMAP) is an ongoing effort to generate landscape-wide vegetation data and models to evaluate the integrated effects of disturbances and management activities on natural resource conditions in Oregon and Washington (USA). In this initial analysis, we characterized the spatial distribution of forest and range land development in a four-county pilot study region in central Oregon. The empirical model describes the spatial distribution of buildings and new building construction as a function of population growth, existing development, topography, land-use zoning, and other factors. We used the model to create geographic information system maps of likely future development based on human population projections to inform complementary landscape analyses underway involving vegetation, habitat, and wildfire interactions. In an example application, we use the model and resulting maps to show the potential impacts of future forest and range land development on mule deer ( Odocoileus hemionus) winter range. Results indicate significant development encroachment and habitat loss already in 2000 with development located along key migration routes and increasing through the projection period to 2040. The example application illustrates a simple way for policymakers and public lands managers to combine existing data and preliminary model outputs to begin to consider the potential effects of development on future landscape conditions.

  17. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.

  18. A Review of Mathematical Models for Leukemia and Lymphoma

    PubMed Central

    Clapp, Geoffrey; Levy, Doron

    2014-01-01

    Recently, there has been significant activity in the mathematical community, aimed at developing quantitative tools for studying leukemia and lymphoma. Mathematical models have been applied to evaluate existing therapies and to suggest novel therapies. This article reviews the recent contributions of mathematical modeling to leukemia and lymphoma research. These developments suggest that mathematical modeling has great potential in this field. Collaboration between mathematicians, clinicians, and experimentalists can significantly improve leukemia and lymphoma therapy. PMID:26744598

  19. Requirements for Medical Modeling Languages

    PubMed Central

    van der Maas, Arnoud A.F.; Ter Hofstede, Arthur H.M.; Ten Hoopen, A. Johannes

    2001-01-01

    Objective: The development of tailor-made domain-specific modeling languages is sometimes desirable in medical informatics. Naturally, the development of such languages should be guided. The purpose of this article is to introduce a set of requirements for such languages and show their application in analyzing and comparing existing modeling languages. Design: The requirements arise from the practical experience of the authors and others in the development of modeling languages in both general informatics and medical informatics. The requirements initially emerged from the analysis of information modeling techniques. The requirements are designed to be orthogonal, i.e., one requirement can be violated without violation of the others. Results: The proposed requirements for any modeling language are that it be “formal” with regard to syntax and semantics, “conceptual,” “expressive,” “comprehensible,” “suitable,” and “executable.” The requirements are illustrated using both the medical logic modules of the Arden Syntax as a running example and selected examples from other modeling languages. Conclusion: Activity diagrams of the Unified Modeling Language, task structures for work flows, and Petri nets are discussed with regard to the list of requirements, and various tradeoffs are thus made explicit. It is concluded that this set of requirements has the potential to play a vital role in both the evaluation of existing domain-specific languages and the development of new ones. PMID:11230383

  20. Information Technology Implementation and Sustainment Model: Data Collection Instrument

    DTIC Science & Technology

    2005-03-01

    users (Wing and Bettinger, 2003). A GIS is a computerized system for spatial (geographically-referenced) data management (Davis and Schultz, 1990:3...AFIT/GEM/ENV/05M-15 Abstract The goal of this research was to develop a data collection instrument for an existing information technology...implementation and sustainment model. In 2003, a unique system dynamics model was developed at the Air Force Institute of Technology to predict the

  1. Investigation and Development of Data-Driven D-Region Model for HF Systems Impacts

    NASA Technical Reports Server (NTRS)

    Eccles, J. V.; Rice, D.; Sojka, J. J.; Hunsucker, R. D.

    2002-01-01

    Space Environment Corporation (SEC) and RP Consultants (RPC) are to develop and validate a weather-capable D-region model for making High Frequency (HF) absorption predictions in support of the HF communications and radar communities. The weather-capable model will assimilate solar and earth space observations from NASA satellites. The model will account for solar-induced impacts on HF absorption, including X-rays, Solar Proton Events (SPE's), and auroral precipitation. The work plan includes: 1. Optimize the D-region model to quickly obtain ion and electron densities for proper HF absorption calculations. 2. Develop indices-driven modules for D-region ionization sources for low, mid, & high latitudes including X-rays, cosmic rays, auroral precipitation, & solar protons. (Note: solar spectrum & auroral modules already exist). 3. Set up low-cost monitors of existing HF beacons and add one single-frequency beacon. 4. Use the PENEX HF-link database with HF monitor data to validate the D-region/HF absorption model using climatological ionization drivers. 5. Develop algorithms to assimilate NASA satellite data of solar, interplanetary, and auroral observations into ionization source modules. 6. Use PENEX HF-link & HF-beacon data for skill score comparison of the assimilation versus climatological D-region/HF absorption model. Only some satellites are available for the PENEX time period; thus, HF-beacon data are necessary. 7. Use HF beacon monitors to develop HF-link data assimilation algorithms for regional improvement to the D-region/HF absorption model.

  2. Steering operational synergies in terrestrial observation networks: opportunity for advancing Earth system dynamics modelling

    NASA Astrophysics Data System (ADS)

    Baatz, Roland; Sullivan, Pamela L.; Li, Li; Weintraub, Samantha R.; Loescher, Henry W.; Mirtl, Michael; Groffman, Peter M.; Wall, Diana H.; Young, Michael; White, Tim; Wen, Hang; Zacharias, Steffen; Kühn, Ingolf; Tang, Jianwu; Gaillardet, Jérôme; Braud, Isabelle; Flores, Alejandro N.; Kumar, Praveen; Lin, Henry; Ghezzehei, Teamrat; Jones, Julia; Gholz, Henry L.; Vereecken, Harry; Van Looy, Kris

    2018-05-01

    Advancing our understanding of Earth system dynamics (ESD) depends on the development of models and other analytical tools that apply physical, biological, and chemical data. This ambition to increase understanding and develop models of ESD based on site observations was the stimulus for creating the networks of Long-Term Ecological Research (LTER), Critical Zone Observatories (CZOs), and others. We organized a survey, the results of which identified pressing gaps in data availability from these networks, in particular for the future development and evaluation of models that represent ESD processes, and provided insights for improvement in both data collection and model integration. From this survey overview of data applications in the context of LTER and CZO research, we identified three challenges: (1) widen application of terrestrial observation network data in Earth system modelling, (2) develop integrated Earth system models that incorporate process representation and data of multiple disciplines, and (3) identify complementarity in measured variables and spatial extent, and promote synergies among the existing observational networks. These challenges lead to perspectives and recommendations for an improved dialogue between the observation networks and the ESD modelling community, including co-location of sites in the existing networks and further formalization of these recommendations among these communities. Developing these synergies will enable cross-site and cross-network comparison and synthesis studies, which will help produce insights around organizing principles, classifications, and general rules of coupling processes with environmental conditions.

  3. Risk adjustment models for short-term outcomes after surgical resection for oesophagogastric cancer.

    PubMed

    Fischer, C; Lingsma, H; Hardwick, R; Cromwell, D A; Steyerberg, E; Groene, O

    2016-01-01

    Outcomes for oesophagogastric cancer surgery are compared with the aim of benchmarking quality of care. Adjusting for patient characteristics is crucial to avoid biased comparisons between providers. The study objective was to develop a case-mix adjustment model for comparing 30- and 90-day mortality and anastomotic leakage rates after oesophagogastric cancer resections. The study reviewed existing models, considered expert opinion and examined audit data in order to select predictors that were subsequently used to develop a case-mix adjustment model for the National Oesophago-Gastric Cancer Audit, covering England and Wales. Models were developed on patients undergoing surgical resection between April 2011 and March 2013 using logistic regression. Model calibration and discrimination were quantified using a bootstrap procedure. Most existing risk models for oesophagogastric resections were methodologically weak, outdated or based on detailed laboratory data that are not generally available. In 4882 patients with oesophagogastric cancer used for model development, 30- and 90-day mortality rates were 2·3 and 4·4 per cent respectively, and 6·2 per cent of patients developed an anastomotic leak. The internally validated models, based on predictors selected from the literature, showed moderate discrimination (area under the receiver operating characteristic (ROC) curve 0·646 for 30-day mortality, 0·664 for 90-day mortality and 0·587 for anastomotic leakage) and good calibration. Based on available data, three case-mix adjustment models for postoperative outcomes in patients undergoing curative surgery for oesophagogastric cancer were developed. These models should be used for risk adjustment when assessing hospital performance in the National Health Service, and tested in other large health systems. © 2015 BJS Society Ltd Published by John Wiley & Sons Ltd.
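
    The internal validation step described above can be illustrated with a bootstrap estimate of optimism-corrected discrimination for a logistic case-mix model. The predictors and outcome below are synthetic stand-ins, not audit data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def bootstrap_optimism(X, y, n_boot=200, seed=1):
    """Optimism-corrected AUC for a logistic case-mix model via bootstrap resampling."""
    rng = np.random.default_rng(seed)
    apparent_auc = roc_auc_score(
        y, LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1])

    optimism, n = [], len(y)
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)                  # resample with replacement
        if len(np.unique(y[idx])) < 2:
            continue                                 # skip degenerate resamples
        m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        auc_boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
        auc_orig = roc_auc_score(y, m.predict_proba(X)[:, 1])
        optimism.append(auc_boot - auc_orig)
    return apparent_auc - np.mean(optimism)

# Synthetic case-mix data: three generic predictors -> rare binary outcome
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 3))
y = (rng.random(2000) < 1 / (1 + np.exp(-(-3.5 + X @ [0.6, 0.5, 0.4])))).astype(int)
print(f"Optimism-corrected AUC: {bootstrap_optimism(X, y):.3f}")
```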

  4. Software reliability models for critical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, H.; Pham, M.

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  5. Software reliability models for critical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, H.; Pham, M.

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  6. Federal Highway Administration (FHWA) work zone driver model software

    DOT National Transportation Integrated Search

    2016-11-01

    FHWA and the U.S. Department of Transportation (USDOT) Volpe Center are developing a work zone car-following model and simulation software that interfaces with existing microsimulation tools, enabling more accurate simulation of car-following through...

  7. Beyond the Model: Building an Effective and Dynamic IT Curriculum

    ERIC Educational Resources Information Center

    Brewer, Jeffrey; Harriger, Alka; Mendonca, John

    2006-01-01

    A model curriculum, such as that developed by the ACM/SIGITE Curriculum Committee (2005), has two important functions. First, it provides a base structure for newly developing programs that can use it as a platform for articulating a curriculum. Second, it offers an existing curriculum framework that can be used for validation by existing…

  8. Beyond the Gender Binary: A Case Study of Two Transgender Students at a Midwestern Research University

    ERIC Educational Resources Information Center

    Bilodeau, Brent

    2005-01-01

    Few non-pathologizing models of transgender identity development currently exist. This study uses an adaptation of the D'Augelli (1994) lifespan model of sexual orientation identity development to consider the lives of transgender college students. Interviews with two transgender-identified students find that they have developmental experiences in…

  9. 20170921 - Development of an acute oral toxicity dataset to facilitate assessment of existing QSARs and development of new models (ASCCT)

    EPA Science Inventory

    Assessment of the acute toxic potential of a substance is necessary to determine the adverse effects that might occur following accidental or deliberate short-term exposure. There are no accepted in vitro approaches available and few in silico models. Until recently, there had be...

  10. Study of solid state photomultiplier

    NASA Technical Reports Server (NTRS)

    Hays, K. M.; Laviolette, R. A.

    1987-01-01

    Available solid state photomultiplier (SSPM) detectors were tested under low-background, low temperature conditions to determine the conditions producing optimal sensitivity in a space-based astronomy system such as a liquid cooled helium telescope in orbit. Detector temperatures varied between 6 and 9 K, with background flux ranging from 10 to the 13th power to less than 10 to the 6th power photons/square cm-s. Measured parameters included quantum efficiency, noise, dark current, and spectral response. Experimental data were reduced, analyzed, and combined with existing data to build the SSPM data base included herein. The results were compared to analytical models of SSPM performance where appropriate models existed. Analytical models presented here were developed to be as consistent with the data base as practicable. Significant differences between the theory and data are described. Some models were developed or updated as a result of this study.

  11. Existence and characterization of optimal control in mathematics model of diabetics population

    NASA Astrophysics Data System (ADS)

    Permatasari, A. H.; Tjahjana, R. H.; Udjiani, T.

    2018-03-01

    Diabetes is a chronic disease that places a huge burden on individuals and on society as a whole. In this paper, we construct an optimal control mathematical model by applying a strategy to control the growth of the diabetic population. The model also considers the dynamics of people disabled by diabetes. An optimal control approach is proposed in order to reduce the burden of pre-diabetes. The control is implemented by preventing pre-diabetes from developing into diabetes with or without complications. The existence and characterization of the optimal control are discussed in this paper, with the optimal control characterized by applying the Pontryagin minimum principle. The results indicate that an optimal control exists for the optimization problem in this mathematical model of the diabetic population. The effect of the optimal control variable (prevention) is strongly influenced by the number of healthy people.
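
    A minimal sketch of the controlled compartmental dynamics that such an analysis acts on is shown below; solving the full Pontryagin optimality system (state plus adjoint equations) is beyond a short example. The compartments, rates, and the form of the prevention control u are illustrative assumptions, not the equations of the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

def diabetes_dynamics(t, y, u, params):
    """Toy compartments: pre-diabetic (P), diabetic without complications (D),
    diabetic with complications (C). The control u in [0, 1] is the prevention
    effort that slows progression from P to D; all rates are illustrative."""
    P, D, C = y
    inflow, prog, compl, mu = params   # recruitment, progression, complication, exit rates
    dP = inflow - (1.0 - u) * prog * P - mu * P
    dD = (1.0 - u) * prog * P - compl * D - mu * D
    dC = compl * D - mu * C
    return [dP, dD, dC]

params = (60.0, 0.08, 0.05, 0.02)
for u in (0.0, 0.5):
    sol = solve_ivp(diabetes_dynamics, (0, 50), [1000.0, 400.0, 100.0], args=(u, params))
    P, D, C = sol.y[:, -1]
    print(f"u={u:.1f}: diabetics after 50 years = {D + C:.0f}")
```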

  12. Building occupancy simulation and data assimilation using a graph-based agent-oriented model

    NASA Astrophysics Data System (ADS)

    Rai, Sanish; Hu, Xiaolin

    2018-07-01

    Building occupancy simulation and estimation simulates the dynamics of occupants and estimates their real-time spatial distribution in a building. It requires a simulation model and an algorithm for data assimilation that assimilates real-time sensor data into the simulation model. Existing building occupancy simulation models include agent-based models and graph-based models. The agent-based models suffer high computation cost for simulating large numbers of occupants, and graph-based models overlook the heterogeneity and detailed behaviors of individuals. Recognizing the limitations of existing models, this paper presents a new graph-based agent-oriented model which can efficiently simulate large numbers of occupants in various kinds of building structures. To support real-time occupancy dynamics estimation, a data assimilation framework based on Sequential Monte Carlo Methods is also developed and applied to the graph-based agent-oriented model to assimilate real-time sensor data. Experimental results show the effectiveness of the developed model and the data assimilation framework. The major contributions of this work are to provide an efficient model for building occupancy simulation that can accommodate large numbers of occupants and an effective data assimilation framework that can provide real-time estimations of building occupancy from sensor data.
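
    The data assimilation step can be illustrated with a small Sequential Monte Carlo (particle filter) sketch on a toy room graph: particles carry candidate occupancy vectors, are propagated by a simple movement model, weighted against a noisy count sensor, and resampled. The graph, movement probabilities, and sensor model are assumptions for illustration, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy building graph: adjacency between 4 rooms; occupants move along edges
adjacency = np.array([[0, 1, 0, 0],
                      [1, 0, 1, 1],
                      [0, 1, 0, 1],
                      [0, 1, 1, 0]])

def step_occupancy(counts):
    """Move each occupant to a random adjacent room with probability 0.3 (model dynamics)."""
    new = counts.copy()
    for room, n in enumerate(counts):
        movers = rng.binomial(n, 0.3)
        neighbours = np.flatnonzero(adjacency[room])
        for _ in range(movers):
            dest = rng.choice(neighbours)
            new[room] -= 1
            new[dest] += 1
    return new

# Sequential Monte Carlo (particle filter) assimilating a noisy count sensor in room 0
n_particles = 500
particles = np.tile(np.array([10, 5, 5, 0]), (n_particles, 1))
for observed in [9, 7, 6]:                                            # hypothetical readings
    particles = np.array([step_occupancy(p) for p in particles])      # predict
    weights = np.exp(-0.5 * (particles[:, 0] - observed) ** 2 / 2.0)  # Gaussian likelihood
    weights /= weights.sum()
    idx = rng.choice(n_particles, n_particles, p=weights)             # resample
    particles = particles[idx]
    print("Estimated occupancy per room:", particles.mean(axis=0).round(1))
```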

  13. An investigation of jogging biomechanics using the full-body lumbar spine model: Model development and validation.

    PubMed

    Raabe, Margaret E; Chaudhari, Ajit M W

    2016-05-03

    The ability of a biomechanical simulation to produce results that can translate to real-life situations is largely dependent on the physiological accuracy of the musculoskeletal model. There are a limited number of freely-available, full-body models that exist in OpenSim, and those that do exist are very limited in terms of trunk musculature and degrees of freedom in the spine. Properly modeling the motion and musculature of the trunk is necessary to most accurately estimate lower extremity and spinal loading. The objective of this study was to develop and validate a more physiologically accurate OpenSim full-body model. By building upon three previously developed OpenSim models, the full-body lumbar spine (FBLS) model, comprised of 21 segments, 30 degrees-of-freedom, and 324 musculotendon actuators, was developed. The five lumbar vertebrae were modeled as individual bodies, and coupled constraints were implemented to describe the net motion of the spine. The eight major muscle groups of the lumbar spine were modeled (rectus abdominis, external and internal obliques, erector spinae, multifidus, quadratus lumborum, psoas major, and latissimus dorsi), and many of these muscle groups were modeled as multiple fascicles allowing the large muscles to act in multiple directions. The resulting FBLS model's trunk muscle geometry, maximal isometric joint moments, and simulated muscle activations compare well to experimental data. The FBLS model will be made freely available (https://simtk.org/home/fullbodylumbar) for others to perform additional analyses and develop simulations investigating full-body dynamics and contributions of the trunk muscles to dynamic tasks. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Comparison of three GIS-based models for predicting rockfall runout zones at a regional scale

    NASA Astrophysics Data System (ADS)

    Dorren, Luuk K. A.; Seijmonsbergen, Arie C.

    2003-11-01

    Site-specific information about the level of protection that mountain forests provide is often not available for large regions. Information regarding rockfalls is especially scarce. The most efficient way to obtain information about rockfall activity and the efficacy of protection forests at a regional scale is to use a simulation model. At present, it is still unknown which forest parameters could be incorporated best in such models. Therefore, the purpose of this study was to test and evaluate a model for rockfall assessment at a regional scale in which simple forest stand parameters, such as the number of trees per hectare and the diameter at breast height, are incorporated. To this end, a newly developed Geographical Information System (GIS)-based distributed model is compared with two existing rockfall models. The developed model is the only model that calculates the rockfall velocity on the basis of energy loss due to collisions with trees and on the soil surface. The two existing models calculate energy loss over the distance between two cell centres, while the newly developed model is able to calculate multiple bounces within a pixel. The patterns of rockfall runout zones produced by the three models are compared with patterns of rockfall deposits derived from geomorphological field maps. Furthermore, the rockfall velocities modelled by the three models are compared. It is found that the models produced rockfall runout zone maps with rather similar accuracies. However, the developed model performs best on forested hillslopes, and it also produces velocities that match best with field estimates on both forested and nonforested hillslopes irrespective of the slope gradient.
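
    The energy-based idea, though not the actual equations of the model, can be caricatured in a few lines: kinetic energy gained from the drop height is reduced by a fractional loss at the soil surface and a fixed loss per tree impact. All numbers below are illustrative.

```python
import math

def rockfall_velocity(mass, slope_length, slope_angle_deg, soil_loss=0.25,
                      tree_impacts=0, energy_per_tree=15e3):
    """Very simplified energy-line estimate of rockfall velocity at the slope base.

    Kinetic energy from the drop height is reduced by a fractional loss at the soil
    surface and a fixed energy loss per tree impact; this echoes the idea, not the
    equations, of the model described in the abstract. All values are illustrative.
    """
    drop = slope_length * math.sin(math.radians(slope_angle_deg))
    energy = mass * 9.81 * drop * (1.0 - soil_loss)          # J, after soil damping
    energy = max(energy - tree_impacts * energy_per_tree, 0.0)
    return math.sqrt(2.0 * energy / mass)                    # m/s

rock = 500.0  # kg
print(f"Non-forested slope: {rockfall_velocity(rock, 200, 35):.1f} m/s")
print(f"Forested slope (8 tree impacts): {rockfall_velocity(rock, 200, 35, tree_impacts=8):.1f} m/s")
```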

  15. A Theoretical Approach to the Long-Hazleton Process of Public Relations Model.

    ERIC Educational Resources Information Center

    Myers, Scott A.

    One way to implement theory into existing public relations classes is to utilize the Process of Public Relations model developed by L. W. Long and V. Hazleton. The use of the model in the classroom is important because the model stresses the interdependence between the public relations practitioner and the organization. The model begins by…

  16. Fault management for data systems

    NASA Technical Reports Server (NTRS)

    Boyd, Mark A.; Iverson, David L.; Patterson-Hine, F. Ann

    1993-01-01

    Issues related to automating the process of fault management (fault diagnosis and response) for data management systems are considered. Substantial benefits are to be gained by successful automation of this process, particularly for large, complex systems. The use of graph-based models to develop a computer assisted fault management system is advocated. The general problem is described and the motivation behind choosing graph-based models over other approaches for developing fault diagnosis computer programs is outlined. Some existing work in the area of graph-based fault diagnosis is reviewed, and a new fault management method which was developed from existing methods is offered. Our method is applied to an automatic telescope system intended as a prototype for future lunar telescope programs. Finally, an application of our method to general data management systems is described.

  17. The Abdominal Aortic Aneurysm Statistically Corrected Operative Risk Evaluation (AAA SCORE) for predicting mortality after open and endovascular interventions.

    PubMed

    Ambler, Graeme K; Gohel, Manjit S; Mitchell, David C; Loftus, Ian M; Boyle, Jonathan R

    2015-01-01

    Accurate adjustment of surgical outcome data for risk is vital in an era of surgeon-level reporting. Current risk prediction models for abdominal aortic aneurysm (AAA) repair are suboptimal. We aimed to develop a reliable risk model for in-hospital mortality after intervention for AAA, using rigorous contemporary statistical techniques to handle missing data. Using data collected during a 15-month period in the United Kingdom National Vascular Database, we applied multiple imputation methodology together with stepwise model selection to generate preoperative and perioperative models of in-hospital mortality after AAA repair, using two thirds of the available data. Model performance was then assessed on the remaining third of the data by receiver operating characteristic curve analysis and compared with existing risk prediction models. Model calibration was assessed by Hosmer-Lemeshow analysis. A total of 8088 AAA repair operations were recorded in the National Vascular Database during the study period, of which 5870 (72.6%) were elective procedures. Both preoperative and perioperative models showed excellent discrimination, with areas under the receiver operating characteristic curve of .89 and .92, respectively. This was significantly better than any of the existing models (area under the receiver operating characteristic curve for best comparator model, .84 and .88; P < .001 and P = .001, respectively). Discrimination remained excellent when only elective procedures were considered. There was no evidence of miscalibration by Hosmer-Lemeshow analysis. We have developed accurate models to assess risk of in-hospital mortality after AAA repair. These models were carefully developed with rigorous statistical methodology and significantly outperform existing methods for both elective cases and overall AAA mortality. These models will be invaluable for both preoperative patient counseling and accurate risk adjustment of published outcome data. Copyright © 2015 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
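
    The calibration check reported above (Hosmer-Lemeshow) can be sketched as follows on synthetic predictions; in the study it would be applied to the held-out third of the audit data.

```python
import numpy as np
from scipy.stats import chi2

def hosmer_lemeshow(y_true, p_pred, n_groups=10):
    """Hosmer-Lemeshow goodness-of-fit statistic over deciles of predicted risk."""
    order = np.argsort(p_pred)
    y, p = np.asarray(y_true)[order], np.asarray(p_pred)[order]
    groups = np.array_split(np.arange(len(y)), n_groups)
    stat = 0.0
    for g in groups:
        obs, exp, n = y[g].sum(), p[g].sum(), len(g)
        stat += (obs - exp) ** 2 / (exp * (1 - exp / n) + 1e-12)
    p_value = chi2.sf(stat, df=n_groups - 2)
    return stat, p_value

# Synthetic example: well-calibrated predictions should give a non-significant p-value
rng = np.random.default_rng(0)
p_pred = rng.uniform(0.01, 0.2, 3000)
y_true = rng.random(3000) < p_pred
print("HL statistic, p-value:", hosmer_lemeshow(y_true, p_pred))
```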

  18. Integrated Workforce Modeling System

    NASA Technical Reports Server (NTRS)

    Moynihan, Gary P.

    2000-01-01

    There are several computer-based systems, currently in various phases of development at KSC, which encompass some component, aspect, or function of workforce modeling. These systems may offer redundant capabilities and/or incompatible interfaces. A systems approach to workforce modeling is necessary in order to identify and better address user requirements. This research has consisted of two primary tasks. Task 1 provided an assessment of existing and proposed KSC workforce modeling systems for their functionality and applicability to the workforce planning function. Task 2 resulted in the development of a proof-of-concept design for a systems approach to workforce modeling. The model incorporates critical aspects of workforce planning, including hires, attrition, and employee development.

  19. The AQMEII Two-Continent Regional Air Quality Model Evaluation Study: Fueling Ideas with Unprecedented Data

    EPA Science Inventory

    Although strong collaborations in the air pollution field have existed among the North American (NA) and European (EU) countries over the past five decades, regional-scale air quality model developments and model performance evaluations have been carried out independently unlike ...

  20. Developing the Practising Model in Physical Education: An Expository Outline Focusing on Movement Capability

    ERIC Educational Resources Information Center

    Barker, D. M.; Aggerholm, K.; Standal, O.; Larsson, H.

    2018-01-01

    Background: Physical educators currently have a number of pedagogical (or curricular) models at their disposal. While existing models have been well-received in educational contexts, these models seek to extend students' capacities within a limited number of "human activities" (Arendt, 1958). The activity of "human practising,"…

  1. Developing the next generation of forest ecosystem models

    Treesearch

    Christopher R. Schwalm; Alan R. Ek

    2002-01-01

    Forest ecology and management are model-rich areas for research. Models are often cast as either empirical or mechanistic. With evolving climate change, hybrid models gain new relevance because of their ability to integrate existing mechanistic knowledge with empiricism based on causal thinking. The utility of hybrid platforms results in the combination of...

  2. Pharmacometrics Markup Language (PharmML): Opening New Perspectives for Model Exchange in Drug Development.

    PubMed

    Swat, M J; Moodie, S; Wimalaratne, S M; Kristensen, N R; Lavielle, M; Mari, A; Magni, P; Smith, M K; Bizzotto, R; Pasotti, L; Mezzalana, E; Comets, E; Sarr, C; Terranova, N; Blaudez, E; Chan, P; Chard, J; Chatel, K; Chenel, M; Edwards, D; Franklin, C; Giorgino, T; Glont, M; Girard, P; Grenon, P; Harling, K; Hooker, A C; Kaye, R; Keizer, R; Kloft, C; Kok, J N; Kokash, N; Laibe, C; Laveille, C; Lestini, G; Mentré, F; Munafo, A; Nordgren, R; Nyberg, H B; Parra-Guillen, Z P; Plan, E; Ribba, B; Smith, G; Trocóniz, I F; Yvon, F; Milligan, P A; Harnisch, L; Karlsson, M; Hermjakob, H; Le Novère, N

    2015-06-01

    The lack of a common exchange format for mathematical models in pharmacometrics has been a long-standing problem. Such a format has the potential to increase productivity and analysis quality, simplify the handling of complex workflows, ensure reproducibility of research, and facilitate the reuse of existing model resources. Pharmacometrics Markup Language (PharmML), currently under development by the Drug Disease Model Resources (DDMoRe) consortium, is intended to become an exchange standard in pharmacometrics by providing means to encode models, trial designs, and modeling steps.

  3. Pharmacometrics Markup Language (PharmML): Opening New Perspectives for Model Exchange in Drug Development

    PubMed Central

    Swat, MJ; Moodie, S; Wimalaratne, SM; Kristensen, NR; Lavielle, M; Mari, A; Magni, P; Smith, MK; Bizzotto, R; Pasotti, L; Mezzalana, E; Comets, E; Sarr, C; Terranova, N; Blaudez, E; Chan, P; Chard, J; Chatel, K; Chenel, M; Edwards, D; Franklin, C; Giorgino, T; Glont, M; Girard, P; Grenon, P; Harling, K; Hooker, AC; Kaye, R; Keizer, R; Kloft, C; Kok, JN; Kokash, N; Laibe, C; Laveille, C; Lestini, G; Mentré, F; Munafo, A; Nordgren, R; Nyberg, HB; Parra-Guillen, ZP; Plan, E; Ribba, B; Smith, G; Trocóniz, IF; Yvon, F; Milligan, PA; Harnisch, L; Karlsson, M; Hermjakob, H; Le Novère, N

    2015-01-01

    The lack of a common exchange format for mathematical models in pharmacometrics has been a long-standing problem. Such a format has the potential to increase productivity and analysis quality, simplify the handling of complex workflows, ensure reproducibility of research, and facilitate the reuse of existing model resources. Pharmacometrics Markup Language (PharmML), currently under development by the Drug Disease Model Resources (DDMoRe) consortium, is intended to become an exchange standard in pharmacometrics by providing means to encode models, trial designs, and modeling steps. PMID:26225259

  4. Energy efficiency in nonprofit agencies: Creating effective program models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, M.A.; Prindle, B.; Scherr, M.I.

    Nonprofit agencies are a critical component of the health and human services system in the US. It has been clearly demonstrated by programs that offer energy efficiency services to nonprofits that, with minimal investment, they can reduce their energy consumption by ten to thirty percent. This energy conservation potential motivated the Department of Energy and Oak Ridge National Laboratory to conceive a project to help states develop energy efficiency programs for nonprofits. The purpose of the project was two-fold: (1) to analyze existing programs to determine which design and delivery mechanisms are particularly effective, and (2) to create model programs for states to follow in tailoring their own plans for helping nonprofits with energy efficiency programs. Twelve existing programs were reviewed, and three model programs were devised and put into operation. The model programs provide various forms of financial assistance to nonprofits and serve as a source of information on energy efficiency as well. After examining the results from the model programs (which are still on-going) and from the existing programs, several "replicability factors" were developed for use in the implementation of programs by other states. These factors -- some concrete and practical, others more generalized -- serve as guidelines for states devising programs based on their own particular needs and resources.

  5. Development and Validation of a Polarimetric-MCScene 3D Atmospheric Radiation Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berk, Alexander; Hawes, Frederick; Fox, Marsha

    2016-03-15

    Polarimetric measurements can substantially enhance the ability of both spectrally resolved and single band imagery to detect the proliferation of weapons of mass destruction, providing data for locating and identifying facilities, materials, and processes of undeclared and proliferant nuclear weapons programs worldwide. Unfortunately, models do not exist that efficiently and accurately predict spectral polarized signatures for the materials of interest embedded in complex 3D environments. Having such a model would enable one to test hypotheses and optimize both the enhancement of scene contrast and the signal processing for spectral signature extraction. Phase I set the groundwork for the development of fully validated polarimetric spectral signature and scene simulation models. This has been accomplished 1. by (a) identifying and downloading state-of-the-art surface and atmospheric polarimetric data sources, (b) implementing tools for generating custom polarimetric data, and (c) identifying and requesting US Government funded field measurement data for use in validation; 2. by formulating an approach for upgrading the radiometric spectral signature model MODTRAN to generate polarimetric intensities through (a) ingestion of the polarimetric data, (b) polarimetric vectorization of existing MODTRAN modules, and (c) integration of a newly developed algorithm for computing polarimetric multiple scattering contributions; 3. by generating an initial polarimetric model that demonstrates calculation of polarimetric solar and lunar single scatter intensities arising from the interaction of incoming irradiances with molecules and aerosols; 4. by developing a design and implementation plan to (a) automate polarimetric scene construction and (b) efficiently sample polarimetric scattering and reflection events, for use in a to-be-developed polarimetric version of the existing first-principles synthetic scene simulation model, MCScene; and 5. by planning a validation field measurement program in collaboration with the Remote Sensing and Exploitation group at Sandia National Laboratories (SNL) in which data from their ongoing polarimetric field and laboratory measurement program will be shared and, to the extent allowed, tailored for model validation in exchange for model predictions under conditions and for geometries outside of their measurement domain.

  6. A Vision for Incorporating Environmental Effects into Nitrogen Management Decision Support Tools for U.S. Maize Production

    PubMed Central

    Banger, Kamaljit; Yuan, Mingwei; Wang, Junming; Nafziger, Emerson D.; Pittelkow, Cameron M.

    2017-01-01

    Meeting crop nitrogen (N) demand while minimizing N losses to the environment has proven difficult despite significant field research and modeling efforts. To improve N management, several real-time N management tools have been developed with a primary focus on enhancing crop production. However, no coordinated effort exists to simultaneously address sustainability concerns related to N losses at field- and regional-scales. In this perspective, we highlight the opportunity for incorporating environmental effects into N management decision support tools for United States maize production systems by integrating publicly available crop models with grower-entered management information and gridded soil and climate data in a geospatial framework specifically designed to quantify environmental and crop production tradeoffs. To facilitate advances in this area, we assess the capability of existing crop models to provide in-season N recommendations while estimating N leaching and nitrous oxide emissions, discuss several considerations for initial framework development, and highlight important challenges related to improving the accuracy of crop model predictions. Such a framework would benefit the development of regional sustainable intensification strategies by enabling the identification of N loss hotspots which could be used to implement spatially explicit mitigation efforts in relation to current environmental quality goals and real-time weather conditions. Nevertheless, we argue that this long-term vision can only be realized by leveraging a variety of existing research efforts to overcome challenges related to improving model structure, accessing field data to enhance model performance, and addressing the numerous social difficulties in delivery and adoption of such tool by stakeholders. PMID:28804490

  7. A Vision for Incorporating Environmental Effects into Nitrogen Management Decision Support Tools for U.S. Maize Production.

    PubMed

    Banger, Kamaljit; Yuan, Mingwei; Wang, Junming; Nafziger, Emerson D; Pittelkow, Cameron M

    2017-01-01

    Meeting crop nitrogen (N) demand while minimizing N losses to the environment has proven difficult despite significant field research and modeling efforts. To improve N management, several real-time N management tools have been developed with a primary focus on enhancing crop production. However, no coordinated effort exists to simultaneously address sustainability concerns related to N losses at field- and regional-scales. In this perspective, we highlight the opportunity for incorporating environmental effects into N management decision support tools for United States maize production systems by integrating publicly available crop models with grower-entered management information and gridded soil and climate data in a geospatial framework specifically designed to quantify environmental and crop production tradeoffs. To facilitate advances in this area, we assess the capability of existing crop models to provide in-season N recommendations while estimating N leaching and nitrous oxide emissions, discuss several considerations for initial framework development, and highlight important challenges related to improving the accuracy of crop model predictions. Such a framework would benefit the development of regional sustainable intensification strategies by enabling the identification of N loss hotspots which could be used to implement spatially explicit mitigation efforts in relation to current environmental quality goals and real-time weather conditions. Nevertheless, we argue that this long-term vision can only be realized by leveraging a variety of existing research efforts to overcome challenges related to improving model structure, accessing field data to enhance model performance, and addressing the numerous social difficulties in delivery and adoption of such tool by stakeholders.

  8. Combining existing numerical models with data assimilation using weighted least-squares finite element methods.

    PubMed

    Rajaraman, Prathish K; Manteuffel, T A; Belohlavek, M; Heys, Jeffrey J

    2017-01-01

    A new approach has been developed for combining and enhancing the results from an existing computational fluid dynamics model with experimental data using the weighted least-squares finite element method (WLSFEM). Development of the approach was motivated by the existence of both limited experimental blood velocity data in the left ventricle and inexact numerical models of the same flow. Limitations of the experimental data include measurement noise and having data only along a two-dimensional plane. Most numerical modeling approaches do not provide the flexibility to assimilate noisy experimental data. We previously developed an approach that could assimilate experimental data into the process of numerically solving the Navier-Stokes equations, but the approach was limited because it required the use of specific finite element methods for solving all model equations and did not support alternative numerical approximation methods. The new approach presented here allows virtually any numerical method to be used for approximately solving the Navier-Stokes equations, and then the WLSFEM is used to combine the experimental data with the numerical solution of the model equations in a final step. The approach dynamically adjusts the influence of the experimental data on the numerical solution so that more accurate data are more closely matched by the final solution and less accurate data are not closely matched. The new approach is demonstrated on different test problems and provides significantly reduced computational costs compared with many previous methods for data assimilation. Copyright © 2016 John Wiley & Sons, Ltd.
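
    The weighting idea can be caricatured pointwise: at each measured location the fused value minimizes a sum of squared deviations from the model and from the data, weighted by inverse variances, so noisier measurements pull the solution less. The full WLSFEM formulation enforces this through a finite element functional over the whole domain; the sketch below is only the pointwise analogue on synthetic data.

```python
import numpy as np

def fuse_model_and_data(model_field, data, data_idx, data_sigma, model_sigma=0.05):
    """Weighted least-squares blend of a numerical solution with sparse, noisy measurements.

    At each measured location the fused value minimizes
        w_m * (u - model)^2 + w_d * (u - data)^2,
    with weights taken as inverse variances, so noisier data pull the solution less.
    """
    fused = model_field.copy()
    w_model = 1.0 / model_sigma ** 2
    for idx, d, sigma in zip(data_idx, data, data_sigma):
        w_data = 1.0 / sigma ** 2
        fused[idx] = (w_model * model_field[idx] + w_data * d) / (w_model + w_data)
    return fused

# 1D synthetic "velocity profile" from a model, plus three measurements of varying quality
x = np.linspace(0, 1, 11)
model_u = 1.0 - (2 * x - 1) ** 2        # parabolic profile (e.g. channel flow)
meas_idx = [2, 5, 8]
meas_val = [0.70, 1.10, 0.58]
meas_sig = [0.02, 0.02, 0.30]           # the last measurement is very noisy
print(fuse_model_and_data(model_u, meas_val, meas_idx, meas_sig).round(3))
```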

  9. A model of transverse fuel injection applied to the computation of supersonic combustor flow

    NASA Technical Reports Server (NTRS)

    Rogers, R. C.

    1979-01-01

    A two-dimensional, nonreacting flow model of the aerodynamic interaction of a transverse hydrogen jet within a supersonic mainstream has been developed. The model assumes profile shapes of mass flux, pressure, flow angle, and hydrogen concentration and produces downstream profiles of the other flow parameters under the constraints of the integrated conservation equations. These profiles are used as starting conditions for an existing finite difference parabolic computer code for the turbulent supersonic combustion of hydrogen. Integrated mixing and flow profile results obtained from the computer code compare favorably with existing data for the supersonic combustion of hydrogen.

  10. A comparison of operational and LANDSAT-aided snow water content estimation systems. [Feather River Basin, California

    NASA Technical Reports Server (NTRS)

    Sharp, J. M.; Thomas, R. W.

    1975-01-01

    How LANDSAT imagery can be cost effectively employed to augment an operational hydrologic model is described. Attention is directed toward the estimation of snow water content, a major predictor variable in the volumetric runoff forecasting model. A stratified double sampling scheme is supplemented with qualitative and quantitative analyses of existing operations to develop a comparison between the existing and satellite-aided approaches to snow water content estimation. Results show a decided advantage for the LANDSAT-aided approach.

  11. Simple algorithms for remote determination of mineral abundances and particle sizes from reflectance spectra

    NASA Technical Reports Server (NTRS)

    Johnson, Paul E.; Smith, Milton O.; Adams, John B.

    1992-01-01

    Algorithms were developed, based on Hapke's (1981) equations, for remote determinations of mineral abundances and particle sizes from reflectance spectra. In this method, spectra are modeled as a function of end-member abundances and illumination/viewing geometry. The method was tested on a laboratory data set. It is emphasized that, although there exist more sophisticated models, the present algorithms are particularly suited for remotely sensed data, where little opportunity exists to independently measure reflectance versus particle size and phase function.
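
    A simplified, linear-mixing analogue of the abundance retrieval (ignoring the Hapke bidirectional reflectance treatment and viewing geometry) can be written as a non-negative least-squares unmixing; the end-member spectra below are invented.

```python
import numpy as np
from scipy.optimize import nnls

def unmix(spectrum, endmembers):
    """Estimate end-member abundances by non-negative least squares.

    A linear-mixing approximation rather than the full Hapke treatment;
    abundances are normalized to sum to one.
    """
    coeffs, _ = nnls(endmembers.T, spectrum)
    return coeffs / coeffs.sum()

# Synthetic three-band spectra for two hypothetical end-member minerals
endmembers = np.array([[0.30, 0.45, 0.60],    # mineral A
                       [0.70, 0.50, 0.20]])   # mineral B
mixed = 0.6 * endmembers[0] + 0.4 * endmembers[1] + np.random.default_rng(0).normal(0, 0.005, 3)
print("Estimated abundances:", unmix(mixed, endmembers).round(2))
```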

  12. Reading, Social Development, and the Child.

    ERIC Educational Resources Information Center

    Ediger, Marlow

    Social development stresses the importance of working together with others in life. The home setting can emphasize social development and its objectives of instruction. How should parents assist the child in quality social development in which good human relations exist? First and foremost, parents should serve as models to children for good human…

  13. Product Development and Cost Analysis of Fabricating the Prototype of Roller Clamp in Intravenous (I.V) Tubing Medical Devices using Fused Deposition Modeling (FDM) Technology

    NASA Astrophysics Data System (ADS)

    Way, Yusoff

    2018-01-01

    The main aim of this research is to develop a new prototype and to conduct a cost analysis of the existing roller clamp, one of the parts attached to the intravenous (I.V.) tubing used in intravenous therapy medical devices. Before proceeding to manufacture the final product using Fused Deposition Modeling (FDM) technology, the data collected from a survey were analyzed using a Product Design Specifications approach. The selected concept proved to have better quality, functions, and criteria than the existing roller clamp, and the cost of fabricating the roller clamp prototype was calculated.

  14. Green modes of transportation for Connecticut's mixed used developments.

    DOT National Transportation Integrated Search

    2011-02-01

    Our multi-disciplinary team will not be single-minded. We will create a dynamic business and transportation model for the delivery of goods for the existing and proposed commercial establishments located in Downtown Storrs. These models will demons...

  15. MODELING FINE SEDIMENT TRANSPORT IN ESTUARIES

    EPA Science Inventory

    A sediment transport model (SEDIMENT IIIA) was developed to assist in predicting the fate of chemical pollutants sorbed to cohesive sediments in rivers and estuaries. Laboratory experiments were conducted to upgrade an existing two-dimensional, depth-averaged, finite element, coh...

  16. Green Infrastructure Models and Tools

    EPA Science Inventory

    The objective of this project is to modify and refine existing models and develop new tools to support decision making for the complete green infrastructure (GI) project lifecycle, including the planning and implementation of stormwater control in urban and agricultural settings,...

  17. Developing and implementing the use of predictive models for estimating water quality at Great Lakes beaches

    USGS Publications Warehouse

    Francy, Donna S.; Brady, Amie M.G.; Carvin, Rebecca B.; Corsi, Steven R.; Fuller, Lori M.; Harrison, John H.; Hayhurst, Brett A.; Lant, Jeremiah; Nevers, Meredith B.; Terrio, Paul J.; Zimmerman, Tammy M.

    2013-01-01

    Predictive models have been used at beaches to improve the timeliness and accuracy of recreational water-quality assessments over the most common current approach to water-quality monitoring, which relies on culturing fecal-indicator bacteria such as Escherichia coli (E. coli). Beach-specific predictive models use environmental and water-quality variables that are easily and quickly measured as surrogates to estimate concentrations of fecal-indicator bacteria or to provide the probability that a State recreational water-quality standard will be exceeded. When predictive models are used for beach closure or advisory decisions, they are referred to as “nowcasts.” During the recreational seasons of 2010-12, the U.S. Geological Survey (USGS), in cooperation with 23 local and State agencies, worked to improve existing nowcasts at 4 beaches, validate predictive models at another 38 beaches, and collect data for predictive-model development at 7 beaches throughout the Great Lakes. This report summarizes efforts to collect data and develop predictive models by multiple agencies and to compile existing information on the beaches and beach-monitoring programs into one comprehensive report. Local agencies measured E. coli concentrations and variables expected to affect E. coli concentrations such as wave height, turbidity, water temperature, and numbers of birds at the time of sampling. In addition to these field measurements, equipment was installed by the USGS or local agencies at or near several beaches to collect water-quality and meteorological measurements in near real time, including nearshore buoys, weather stations, and tributary staff gages and monitors. The USGS worked with local agencies to retrieve data from existing sources either manually or by use of tools designed specifically to compile and process data for predictive-model development. Predictive models were developed by use of linear regression and (or) partial least squares techniques for 42 beaches that had at least 2 years of data (2010-11 and sometimes earlier) and for 1 beach that had 1 year of data. For most models, software designed for model development by the U.S. Environmental Protection Agency (Virtual Beach) was used. The selected model for each beach was based on a combination of explanatory variables including, most commonly, turbidity, day of the year, change in lake level over 24 hours, wave height, wind direction and speed, and antecedent rainfall for various time periods. Forty-two predictive models were validated against data collected during an independent year (2012) and compared to the current method for assessing recreational water quality, which uses the previous day’s E. coli concentration (persistence model). Goals for good predictive-model performance were responses that were at least 5 percent greater than the persistence model and overall correct responses greater than or equal to 80 percent, sensitivities (percentage of exceedances of the bathing-water standard that were correctly predicted by the model) greater than or equal to 50 percent, and specificities (percentage of nonexceedances correctly predicted by the model) greater than or equal to 85 percent. Out of 42 predictive models, 24 models yielded overall correct responses that were at least 5 percent greater than the persistence model.
Predictive-model responses met the performance goals more often than the persistence-model responses in terms of overall correctness (28 versus 17 models, respectively), sensitivity (17 versus 4 models), and specificity (34 versus 25 models). Gaining knowledge of each beach and the factors that affect E. coli concentrations is important for developing good predictive models. Collection of additional years of data with a wide range of environmental conditions may also help to improve future model performance. The USGS will continue to work with local agencies in 2013 and beyond to develop and validate predictive models at beaches and improve existing nowcasts, restructuring monitoring activities to accommodate future uncertainties in funding and resources.
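
    The performance measures used above (overall correctness, sensitivity, specificity of exceedance predictions) can be computed as follows; the season of predictions and observations below is synthetic.

```python
import numpy as np

def nowcast_skill(predicted_exceed, observed_exceed):
    """Overall correctness, sensitivity, and specificity of beach nowcast decisions."""
    pred = np.asarray(predicted_exceed, bool)
    obs = np.asarray(observed_exceed, bool)
    correct = np.mean(pred == obs)
    sensitivity = np.mean(pred[obs]) if obs.any() else np.nan       # exceedances caught
    specificity = np.mean(~pred[~obs]) if (~obs).any() else np.nan  # non-exceedances caught
    return correct, sensitivity, specificity

# Hypothetical season: 100 beach days, standard exceeded on roughly 12 of them
rng = np.random.default_rng(0)
observed = rng.random(100) < 0.12
predicted = observed.copy()
flip = rng.random(100) < 0.10            # model gets ~10% of days wrong
predicted[flip] = ~predicted[flip]
c, se, sp = nowcast_skill(predicted, observed)
print(f"correct={c:.0%}, sensitivity={se:.0%}, specificity={sp:.0%}")
```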

  18. 3D Numerical Modeling of the Propagation of Hydraulic Fracture at Its Intersection with Natural (Pre-existing) Fracture

    NASA Astrophysics Data System (ADS)

    Dehghan, Ali Naghi; Goshtasbi, Kamran; Ahangari, Kaveh; Jin, Yan; Bahmani, Aram

    2017-02-01

    A variety of 3D numerical models were developed based on hydraulic fracture experiments to simulate the propagation of hydraulic fracture at its intersection with natural (pre-existing) fracture. Since the interaction between hydraulic and pre-existing fractures is a key condition that causes complex fracture patterns, the extended finite element method was employed in ABAQUS software to simulate the problem. The propagation of hydraulic fracture in a fractured medium was modeled under two horizontal differential stresses (Δσ) of 5 and 10 MPa, considering different strike and dip angles of the pre-existing fracture. The rate of energy release was calculated in the directions of the hydraulic and pre-existing fractures (G_frac/G_rock) at their intersection point to determine the fracture behavior. Opening and crossing were the two dominant fracture behaviors during the hydraulic and pre-existing fracture interaction at low and high differential stress conditions, respectively. The results of the numerical studies were compared with those of experimental models, showing a good agreement between the two to validate the accuracy of the models. Besides the horizontal differential stress and the strike and dip angles of the natural (pre-existing) fracture, the key finding of this research was the significant effect of the energy release rate on the propagation behavior of the hydraulic fracture. This effect was more prominent under the influence of strike and dip angles, as well as differential stress. The obtained results can be used to predict and interpret the generation of complex hydraulic fracture patterns in field conditions.

  19. Electro-Optic Identification Research Program

    DTIC Science & Technology

    2002-04-01

    Electro-optic identification (EOID) sensors provide photographic quality images that can be used to identify mine-like contacts provided by long...tasks such as validating existing electro-optic models, development of performance metrics, and development of computer aided identification and

  20. Sensory Impairments and Autism: A Re-Examination of Causal Modelling

    ERIC Educational Resources Information Center

    Gerrard, Sue; Rugg, Gordon

    2009-01-01

    Sensory impairments are widely reported in autism, but remain largely unexplained by existing models. This article examines Kanner's causal reasoning and identifies unsupported assumptions implicit in later empirical work. Our analysis supports a heterogeneous causal model for autistic characteristics. We propose that the development of a…

  1. Practical extension of a Lake States tree height model

    Treesearch

    Don C. Bragg

    2008-01-01

    By adapting data from national and state champion lists and the predictions of an existing height model, an exponential function was developed to improve tree height estimation. As a case study, comparisons between the original and redesigned model were made with eastern white pine (Pinus strobus L.). For example, the heights...
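
    An asymptotic height-diameter curve of the general kind described in the note might look like the sketch below; the Chapman-Richards form and all coefficients are invented for illustration and are not Bragg's fitted model.

```python
import numpy as np

def tree_height(dbh_cm, h_max=55.0, rate=0.045, shape=1.3):
    """Exponential height-diameter curve:
    h = 1.37 + (h_max - 1.37) * (1 - exp(-rate * dbh))**shape.

    A Chapman-Richards style form with invented coefficients; the asymptote
    h_max could be anchored to champion-tree heights as the note suggests.
    """
    return 1.37 + (h_max - 1.37) * (1.0 - np.exp(-rate * dbh_cm)) ** shape

for dbh in (20, 50, 100):   # cm
    print(f"dbh {dbh:3d} cm -> predicted height {tree_height(dbh):.1f} m")
```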

  2. Career Success: Constructing a Multidimensional Model

    ERIC Educational Resources Information Center

    Dries, Nicky; Pepermans, Roland; Carlier, Olivier

    2008-01-01

    A multidimensional model of career success was developed aiming to be more inclusive than existing models. In a first study, 22 managers were asked to tell the story of their careers. At the end of each interview, idiosyncratic career success "construct ladders" were constructed for each interviewee through an interactive process with the…

  3. Industrial Sector Energy Efficiency Modeling (ISEEM) Framework Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karali, Nihan; Xu, Tengfang; Sathaye, Jayant

    2012-12-12

    The goal of this study is to develop a new bottom-up industry-sector energy-modeling framework with an agenda of addressing least-cost regional and global carbon reduction strategies, improving on the capabilities and limitations of existing models by allowing trading across regions and countries as an alternative.

  4. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the shortcomings of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority weaknesses within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.
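
    One standard ingredient of constellation cost estimating that a tool like the one described above would need is a learning-curve adjustment for recurring units. The sketch below uses the Wright cumulative-average form with placeholder values; it is not the authors' tool or its coefficients.

```python
import math

def constellation_recurring_cost(first_unit_cost, n_sats, learning=0.85):
    """Cumulative recurring cost of n identical spacecraft under a learning curve.

    Wright cumulative-average form: total = T1 * N**(1 + log2(S)); the 85% slope
    and the first-unit cost below are placeholders, not values from any model.
    """
    b = 1.0 + math.log2(learning)
    return first_unit_cost * n_sats ** b

t1 = 20.0  # first-unit cost in $M, hypothetical
for n in (1, 4, 16, 64):
    total = constellation_recurring_cost(t1, n)
    print(f"{n:3d} satellites: total ${total:6.1f}M, average ${total / n:5.2f}M per sat")
```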

  5. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the limitations of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority shortcomings within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.

  6. CELDA – an ontology for the comprehensive representation of cells in complex systems

    PubMed Central

    2013-01-01

    Background The need for detailed description and modeling of cells drives the continuous generation of large and diverse datasets. Unfortunately, there exists no systematic and comprehensive way to organize these datasets and their information. CELDA (Cell: Expression, Localization, Development, Anatomy) is a novel ontology for the association of primary experimental data and derived knowledge to various types of cells of organisms. Results CELDA is a structure that can help to categorize cell types based on species, anatomical localization, subcellular structures, developmental stages and origin. It targets cells in vitro as well as in vivo. Instead of developing a novel ontology from scratch, we carefully designed CELDA in such a way that existing ontologies were integrated as much as possible, and only minimal extensions were performed to cover those classes and areas not present in any existing model. Currently, ten existing ontologies and models are linked to CELDA through the top-level ontology BioTop. Together with 15,439 newly created classes, CELDA contains more than 196,000 classes and 233,670 relationship axioms. CELDA is primarily used as a representational framework for modeling, analyzing and comparing cells within and across species in CellFinder, a web based data repository on cells (http://cellfinder.org). Conclusions CELDA can semantically link diverse types of information about cell types. It has been integrated within the research platform CellFinder, where it exemplarily relates cell types from liver and kidney during development on the one hand and anatomical locations in humans on the other, integrating information on all spatial and temporal stages. CELDA is available from the CellFinder website: http://cellfinder.org/about/ontology. PMID:23865855

  7. NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-01-01

    This technical highlight describes NREL research to develop Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market. Researchers at the National Renewable Energy Laboratory (NREL) have developed a new test procedure to increase the quality and accuracy of energy analysis tools for the building retrofit market. The Building Energy Simulation Test for Existing Homes (BESTEST-EX) is a test procedure that enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available for model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. In the utility bill calibration test cases, participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participant energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.
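
    To make the comparison step concrete, the sketch below checks whether a participant tool's calibrated savings prediction falls inside the range spanned by the reference simulations and reports its offset from the reference mean. The names and numbers are hypothetical, not BESTEST-EX reference data or acceptance criteria.

```python
# Hedged sketch of the comparison step only; reference values are invented,
# not BESTEST-EX reference results.

def within_reference_range(predicted_savings_kwh, reference_savings_kwh):
    """Return (inside reference range?, relative offset from the reference mean)."""
    lo, hi = min(reference_savings_kwh), max(reference_savings_kwh)
    mean = sum(reference_savings_kwh) / len(reference_savings_kwh)
    return lo <= predicted_savings_kwh <= hi, (predicted_savings_kwh - mean) / mean

# Hypothetical retrofit-savings results from three state-of-the-art programs
reference = [2450.0, 2610.0, 2550.0]
print(within_reference_range(2380.0, reference))   # -> (False, about -0.06)
```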

  8. Modeling water quality, temperature, and flow in Link River, south-central Oregon

    USGS Publications Warehouse

    Sullivan, Annett B.; Rounds, Stewart A.

    2016-09-09

    The 2.1-km (1.3-mi) Link River connects Upper Klamath Lake to the Klamath River in south-central Oregon. A CE-QUAL-W2 flow and water-quality model of Link River was developed to provide a connection between an existing model of the upper Klamath River and any existing or future models of Upper Klamath Lake. Water-quality sampling at six locations in Link River was done during 2013–15 to support model development and to provide a better understanding of instream biogeochemical processes. The short reach and high velocities in Link River resulted in fast travel times and limited water-quality transformations, except for dissolved oxygen. Reaeration through the reach, especially at the falls in Link River, was particularly important in moderating dissolved oxygen concentrations that at times entered the reach at Link River Dam with marked supersaturation or subsaturation. This reaeration resulted in concentrations closer to saturation downstream at the mouth of Link River.

  9. An investigation into the vertical axis control power requirements for landing VTOL type aircraft onboard nonaviation ships in various sea states

    NASA Technical Reports Server (NTRS)

    Stevens, M. E.; Roskam, J.

    1985-01-01

    The problem of determining the vertical axis control requirements for landing a VTOL aircraft on a moving ship deck in various sea states is examined. Both a fixed-base piloted simulation and a nonpiloted simulation were used to determine the landing performance as influenced by thrust-to-weight ratio, vertical damping, and engine lags. The piloted simulation was run using a fixed-base simulator at Ames Research Center. Simplified versions of an existing AV-8A Harrier model and an existing head-up display format were used. The ship model used was that of a DD963 class destroyer. Simplified linear models of the pilot, aircraft, ship motion, and ship air-wake turbulence were developed for the nonpiloted simulation. A unique aspect of the nonpiloted simulation was the development of a model of the piloting strategy used for shipboard landing. This model was refined during the piloted simulation until it provided a reasonably good representation of observed pilot behavior.

  10. Simulating human behavior for national security human interactions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernard, Michael Lewis; Hart, Dereck H.; Verzi, Stephen J.

    2007-01-01

    This 3-year research and development effort focused on what we believe is a significant technical gap in existing modeling and simulation capabilities: the representation of plausible human cognition and behaviors within a dynamic, simulated environment. Specifically, the intent of the ''Simulating Human Behavior for National Security Human Interactions'' project was to demonstrate initial simulated human modeling capability that realistically represents intra- and inter-group interaction behaviors between simulated humans and human-controlled avatars as they respond to their environment. Significant progress was made towards simulating human behaviors through the development of a framework that produces realistic characteristics and movement. The simulated humans were created from models designed to be psychologically plausible by being based on robust psychological research and theory. Progress was also made towards enhancing Sandia National Laboratories' existing cognitive models to support culturally plausible behaviors that are important in representing group interactions. These models were implemented in the modular, interoperable, and commercially supported Umbra® simulation framework.

  11. Metadata mapping and reuse in caBIG.

    PubMed

    Kunz, Isaac; Lin, Ming-Chin; Frey, Lewis

    2009-02-05

    This paper proposes that interoperability across biomedical databases can be improved by utilizing a repository of Common Data Elements (CDEs), UML model class-attributes and simple lexical algorithms to facilitate the building of domain models. This is examined in the context of an existing system, the National Cancer Institute (NCI)'s cancer Biomedical Informatics Grid (caBIG). The goal is to demonstrate the deployment of open source tools that can be used to effectively map models and enable the reuse of existing information objects and CDEs in the development of new models for translational research applications. This effort is intended to help developers reuse appropriate CDEs to enable interoperability of their systems when developing within the caBIG framework or other frameworks that use metadata repositories. The Dice (di-grams) and Dynamic algorithms are compared and both algorithms have similar performance in matching UML model class-attributes to CDE class object-property pairs. With the algorithms used, the baselines for automatically finding the matches are reasonable for the data models examined. This suggests that automatic mapping of UML models and CDEs is feasible within the caBIG framework and potentially any framework that uses a metadata repository. This work opens up the possibility of using mapping algorithms to reduce the cost and time required to map local data models to a reference data model such as those used within caBIG. This effort contributes to facilitating the development of interoperable systems within caBIG as well as other metadata frameworks. Such efforts are critical to address the need to develop systems to handle enormous amounts of diverse data that can be leveraged from new biomedical methodologies.
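
    As an illustration of the kind of lexical matching described above, the sketch below scores a UML class-attribute name against candidate CDE class/property pairs with a Dice coefficient over character bigrams. The names, pre-processing, and candidate list are hypothetical; this is not the caBIG tooling itself.

```python
# Illustrative bigram Dice matching of a UML attribute to CDE pairs.
# All names below are hypothetical examples, not caBIG metadata.

def bigrams(text: str) -> set:
    """Return the set of adjacent character pairs in a lower-cased, compacted string."""
    t = text.lower().replace("_", "").replace(" ", "")
    return {t[i:i + 2] for i in range(len(t) - 1)}

def dice(a: str, b: str) -> float:
    """Dice similarity: 2|A n B| / (|A| + |B|) over bigram sets."""
    ba, bb = bigrams(a), bigrams(b)
    if not ba or not bb:
        return 0.0
    return 2 * len(ba & bb) / (len(ba) + len(bb))

def best_cde_match(uml_attribute: str, cde_pairs: list) -> tuple:
    """Return the CDE (class, property) pair with the highest Dice score."""
    return max(cde_pairs, key=lambda pair: dice(uml_attribute, f"{pair[0]}{pair[1]}"))

# Hypothetical usage
cdes = [("Patient", "birthDate"), ("Specimen", "collectionDate")]
print(best_cde_match("patientDateOfBirth", cdes))
```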

  12. Clinical modeling--a critical analysis.

    PubMed

    Blobel, Bernd; Goossen, William; Brochhausen, Mathias

    2014-01-01

    Modeling clinical processes (and their informational representation) is a prerequisite for optimally enabling and supporting high quality and safe care through information and communication technology and meaningful use of gathered information. The paper investigates existing approaches to clinical modeling, thereby systematically analyzing the underlying principles, their consistency with and opportunities for integration with other existing or emerging projects, as well as the correctness of representing the reality of health and health services. The analysis is performed using an architectural framework for modeling real-world systems. In addition, fundamental work on the representation of facts, relations, and processes in the clinical domain by ontologies is applied, thereby including the integration of advanced methodologies such as translational and system medicine. The paper demonstrates fundamental weaknesses and different maturity as well as evolutionary potential in the approaches considered. It offers a development process starting with the business domain and its ontologies, continuing with the Reference Model-Open Distributed Processing (RM-ODP) related conceptual models in the ICT ontology space, the information and the computational view, and concluding with the implementation details represented as engineering and technology view, respectively. The existing approaches reflect the clinical domain at different levels, focus on different phases of the development process instead of first establishing a representation of the real business process, and therefore enable domain experts' involvement to quite different and partly limited degrees. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  13. Final Report on the Development of a Model Education and Training System for Inmates in Federal Correctional Institutions, to Federal Prison Industries, Incorporated, U.S. Department of Justice.

    ERIC Educational Resources Information Center

    Hitt, William D.; Agostino, Norman R.

    This study to develop an education and training (E&T) system for inmates in Federal correctional institutions described and evaluated existing E&T systems and needs at Milan, Michigan, and Terre Haute, Indiana; formulated an E&T model; and made specific recommendations for implementation of each point in the model. A systems analysis approach was…

  14. The Modular Modeling System (MMS): User's Manual

    USGS Publications Warehouse

    Leavesley, G.H.; Restrepo, Pedro J.; Markstrom, S.L.; Dixon, M.; Stannard, L.G.

    1996-01-01

    The Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide the research and operational framework needed to support development, testing, and evaluation of physical-process algorithms and to facilitate integration of user-selected sets of algorithms into operational physical-process models. MMS uses a module library that contains modules for simulating a variety of water, energy, and biogeochemical processes. A model is created by selectively coupling the most appropriate modules from the library to create a 'suitable' model for the desired application. Where existing modules do not provide appropriate process algorithms, new modules can be developed. The MMS user's manual provides installation instructions and a detailed discussion of system concepts, module development, and model development and application using the MMS graphical user interface.

  15. A methodology for modeling barrier island storm-impact scenarios

    USGS Publications Warehouse

    Mickey, Rangley C.; Long, Joseph W.; Plant, Nathaniel G.; Thompson, David M.; Dalyander, P. Soupy

    2017-02-16

    A methodology for developing a representative set of storm scenarios based on historical wave buoy and tide gauge data for a region at the Chandeleur Islands, Louisiana, was developed by the U.S. Geological Survey. The total water level was calculated for a 10-year period and analyzed against existing topographic data to identify when storm-induced wave action would affect island morphology. These events were categorized on the basis of the threshold of total water level and duration to create a set of storm scenarios that were simulated, using a high-fidelity, process-based, morphologic evolution model, on an idealized digital elevation model of the Chandeleur Islands. The simulated morphological changes resulting from these scenarios provide a range of impacts that can help coastal managers determine resiliency of proposed or existing coastal structures and identify vulnerable areas within those structures.
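
    A minimal sketch of the event-identification step is given below, assuming hourly tide and wave-height records: a total water level series is formed, compared against a representative island elevation, and contiguous exceedances longer than a minimum duration are grouped into candidate storm events. The runup proxy (0.5 times the significant wave height) and all thresholds are placeholders, not the parameterization used by the USGS study.

```python
import numpy as np

# Minimal sketch of storm-event identification (not the USGS implementation):
# total water level = tide + a placeholder wave-runup proxy, thresholded against
# a representative dune/berm elevation, with exceedances grouped into events.

def storm_events(tide_m, hs_m, dune_elev_m=1.5, min_hours=6, dt_hours=1.0):
    twl = np.asarray(tide_m) + 0.5 * np.asarray(hs_m)   # placeholder runup proxy
    exceed = twl > dune_elev_m
    events, start = [], None
    for i, flag in enumerate(exceed):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if (i - start) * dt_hours >= min_hours:
                events.append((start, i, float(twl[start:i].max())))
            start = None
    if start is not None and (len(exceed) - start) * dt_hours >= min_hours:
        events.append((start, len(exceed), float(twl[start:].max())))
    return events  # (start index, end index, peak TWL) per event

# Hypothetical hourly records
rng = np.random.default_rng(0)
tide = 0.6 * np.sin(np.linspace(0, 20 * np.pi, 240))
hs = np.clip(rng.normal(1.0, 0.6, 240), 0, None)
print(storm_events(tide, hs))
```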

  16. Development and assessment of 30-meter pine density maps for landscape-level modeling of mountain pine beetle dynamics

    Treesearch

    Benjamin A. Crabb; James A. Powell; Barbara J. Bentz

    2012-01-01

    Forecasting spatial patterns of mountain pine beetle (MPB) population success requires spatially explicit information on host pine distribution. We developed a means of producing spatially explicit datasets of pine density at 30-m resolution using existing geospatial datasets of vegetation composition and structure. Because our ultimate goal is to model MPB population...

  17. Assessing crown fire potential by linking models of surface and crown fire behavior

    Treesearch

    Joe H. Scott; Elizabeth D. Reinhardt

    2001-01-01

    Fire managers are increasingly concerned about the threat of crown fires, yet only now are quantitative methods for assessing crown fire hazard being developed. Links among existing mathematical models of fire behavior are used to develop two indices of crown fire hazard-the Torching Index and Crowning Index. These indices can be used to ordinate different forest...

  18. Biomechanically determined hand force limits protecting the low back during occupational pushing and pulling tasks.

    PubMed

    Weston, Eric B; Aurand, Alexander; Dufour, Jonathan S; Knapik, Gregory G; Marras, William S

    2018-06-01

    Though biomechanically determined guidelines exist for lifting, existing recommendations for pushing and pulling were developed using a psychophysical approach. The current study aimed to establish objective hand force limits based on the results of a biomechanical assessment of the forces on the lumbar spine during occupational pushing and pulling activities. Sixty-two subjects performed pushing and pulling tasks in a laboratory setting. An electromyography-assisted biomechanical model estimated spinal loads, while hand force and turning torque were measured via hand transducers. Mixed modelling techniques correlated spinal load with hand force or torque throughout a wide range of exposures in order to develop biomechanically determined hand force and torque limits. Exertion type, exertion direction, handle height and their interactions significantly influenced dependent measures of spinal load, hand force and turning torque. The biomechanically determined guidelines presented herein are up to 30% lower than comparable psychophysically derived limits and particularly more protective for straight pushing. Practitioner Summary: This study utilises a biomechanical model to develop objective biomechanically determined push/pull risk limits assessed via hand forces and turning torque. These limits can be up to 30% lower than existing psychophysically determined pushing and pulling recommendations. Practitioners should consider implementing these guidelines in both risk assessment and workplace design moving forward.

  19. Language and Cognition Interaction Neural Mechanisms

    PubMed Central

    Perlovsky, Leonid

    2011-01-01

    How do language and cognition interact in thinking? Is language just used for communication of completed thoughts, or is it fundamental for thinking? Existing approaches have not led to a computational theory. We develop a hypothesis that language and cognition are two separate but closely interacting mechanisms. Language accumulates cultural wisdom; cognition develops mental representations modeling the surrounding world and adapts cultural knowledge to concrete circumstances of life. Language is acquired from surrounding language “ready-made” and therefore can be acquired early in life. This early acquisition of language in childhood encompasses the entire hierarchy from sounds to words, to phrases, and to highest concepts existing in culture. Cognition is developed from experience. Yet cognition cannot be acquired from experience alone; language is a necessary intermediary, a “teacher.” A mathematical model is developed; it overcomes previous difficulties and leads to a computational theory. This model is consistent with Arbib's “language prewired brain” built on top of the mirror neuron system. It models recent neuroimaging data about cognition, remaining unnoticed by other theories. A number of properties of language and cognition are explained, which previously seemed mysterious, including influence of language grammar on cultural evolution, which may explain specifics of English and Arabic cultures. PMID:21876687

  20. Crop Characteristics Research: Growth and Reflectance Analysis

    NASA Technical Reports Server (NTRS)

    Badhwar, G. D. (Principal Investigator)

    1985-01-01

    Much of the early research in remote sensing followed the approach of developing spectral signatures of cover types. It was found, however, that a signature from an unknown cover class could not always be matched to a catalog value of a known cover class. This approach was abandoned and supervised classification schemes followed. These were not efficient and required extensive training. It was obvious that data acquired at a single time could not separate cover types. A large portion of the proposed research has concentrated on modeling the temporal behavior of agricultural crops and on removing the need for any training data in remote sensing surveys; the key to which is the solution of the so-called 'signature extension' problem. A clear need to develop spectral estimators of crop ontogenic stages and yield has existed even though various correlations have been developed. Considerable effort in developing techniques to estimate these variables was devoted to this work. The need to accurately evaluate existing canopy reflectance model(s), improve these models, use them to understand the crop signatures, and estimate leaf area index was the third objective of the proposed work. A synopsis of this research effort is discussed.

  1. Workshop on Computational Turbulence Modeling

    NASA Technical Reports Server (NTRS)

    Shabbir, A. (Compiler); Shih, T.-H. (Compiler); Povinelli, L. A. (Compiler)

    1994-01-01

    The purpose of this meeting was to discuss the current status and future development of turbulence modeling in computational fluid dynamics for aerospace propulsion systems. Various turbulence models have been developed and applied to different turbulent flows over the past several decades and it is becoming more and more urgent to assess their performance in various complex situations. In order to help users in selecting and implementing appropriate models in their engineering calculations, it is important to identify the capabilities as well as the deficiencies of these models. This also benefits turbulence modelers by permitting them to further improve upon the existing models. This workshop was designed for exchanging ideas and enhancing collaboration between different groups in the Lewis community who are using turbulence models in propulsion related CFD. In this respect this workshop will help the Lewis goal of excelling in propulsion related research. This meeting had seven sessions for presentations and one panel discussion over a period of two days. Each presentation session was assigned to one or two branches (or groups) to present their turbulence related research work. Each group was asked to address at least the following points: current status of turbulence model applications and developments in the research; progress and existing problems; and requests about turbulence modeling. The panel discussion session was designed for organizing committee members to answer management and technical questions from the audience and to make concluding remarks.

  2. Multi-resolution voxel phantom modeling: a high-resolution eye model for computational dosimetry

    NASA Astrophysics Data System (ADS)

    Caracappa, Peter F.; Rhodes, Ashley; Fiedler, Derek

    2014-09-01

    Voxel models of the human body are commonly used for simulating radiation dose with a Monte Carlo radiation transport code. Due to memory limitations, the voxel resolution of these computational phantoms is typically too large to accurately represent the dimensions of small features such as the eye. The recently reduced recommended dose limits to the lens of the eye, which is a radiosensitive tissue with a significant concern for cataract formation, have lent increased importance to understanding the dose to this tissue. A high-resolution eye model is constructed using physiological data for the dimensions of radiosensitive tissues, and combined with an existing set of whole-body models to form a multi-resolution voxel phantom, which is used with the MCNPX code to calculate radiation dose from various exposure types. This phantom provides an accurate representation of the radiation transport through the structures of the eye. Two alternate methods of including a high-resolution eye model within an existing whole-body model are developed. The accuracy and performance of each method are compared against existing computational phantoms.

  3. Review: Modelling chemical kinetics and convective heating in giant planet entries

    NASA Astrophysics Data System (ADS)

    Reynier, Philippe; D'Ammando, Giuliano; Bruno, Domenico

    2018-01-01

    A review of the existing chemical kinetics models for H2 / He mixtures and related transport and thermodynamic properties is presented as a pre-requisite towards the development of innovative models based on the state-to-state approach. A survey of the available results obtained during the mission preparation and post-flight analyses of the Galileo mission has been undertaken and a computational matrix has been derived. Different chemical kinetics schemes for hydrogen/helium mixtures have been applied to numerical simulations of the selected points along the entry trajectory. First, a reacting scheme, based on literature data, has been set up for computing the flow-field around the probe at high altitude and comparisons with existing numerical predictions are performed. Then, a macroscopic model derived from a state-to-state model has been constructed and incorporated into a CFD code. Comparisons with existing numerical results from the literature have been performed as well as cross-check comparisons between the predictions provided by the different models in order to evaluate the potential of innovative chemical kinetics models based on the state-to-state approach.

  4. Development of a Random Field Model for Gas Plume Detection in Multiple LWIR Images.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heasler, Patrick G.

    This report develops a random field model that describes gas plumes in LWIR remote sensing images. The random field model serves as a prior distribution that can be combined with LWIR data to produce a posterior that determines the probability that a gas plume exists in the scene and also maps the most probable location of any plume. The random field model is intended to work with a single pixel regression estimator--a regression model that estimates gas concentration on an individual pixel basis.
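
    The sketch below is a simplified stand-in for the report's random-field formulation: each pixel carries a log-likelihood ratio from a single-pixel regression estimator, and the spatial prior is approximated by letting a pixel's prior odds rise with the average posterior of its neighbours. A proper Markov random field posterior would be computed jointly; this local iteration is only meant to show how a spatial prior and per-pixel evidence combine.

```python
import numpy as np

# Simplified illustration only: iterative local update in place of a joint
# random-field posterior. Coupling strength and base prior are hypothetical.

def plume_posterior(log_lr, base_prior=0.01, coupling=3.0, n_iter=10):
    post = np.full(log_lr.shape, base_prior)
    for _ in range(n_iter):
        # 4-neighbour average of the current posterior (edge-padded)
        padded = np.pad(post, 1, mode="edge")
        neigh = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
        prior = base_prior + (1 - base_prior) * np.tanh(coupling * neigh)
        prior_odds = prior / (1 - prior)
        post_odds = prior_odds * np.exp(log_lr)
        post = post_odds / (1 + post_odds)
    return post  # per-pixel probability that a plume is present

# Hypothetical 50x50 scene with an embedded blob of elevated plume evidence
llr = np.random.default_rng(1).normal(-1.0, 0.5, (50, 50))
llr[20:30, 15:35] += 3.0
print(plume_posterior(llr).max())
```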

  5. The role of simulation in neurosurgery.

    PubMed

    Rehder, Roberta; Abd-El-Barr, Muhammad; Hooten, Kristopher; Weinstock, Peter; Madsen, Joseph R; Cohen, Alan R

    2016-01-01

    In an era of residency duty-hour restrictions, there has been a recent effort to implement simulation-based training methods in neurosurgery teaching institutions. Several surgical simulators have been developed, ranging from physical models to sophisticated virtual reality systems. To date, there is a paucity of information describing the clinical benefits of existing simulators and the assessment strategies to help implement them into neurosurgical curricula. Here, we present a systematic review of the current models of simulation and discuss the state-of-the-art and future directions for simulation in neurosurgery. Retrospective literature review. Multiple simulators have been developed for neurosurgical training, including those for minimally invasive procedures, vascular, skull base, pediatric, tumor resection, functional neurosurgery, and spine surgery. The pros and cons of existing systems are reviewed. Advances in imaging and computer technology have led to the development of different simulation models to complement traditional surgical training. Sophisticated virtual reality (VR) simulators with haptic feedback and impressive imaging technology have provided novel options for training in neurosurgery. Breakthrough training simulation using 3D printing technology holds promise for future simulation practice, proving high-fidelity patient-specific models to complement residency surgical learning.

  6. Performance measurement for people with multiple chronic conditions: conceptual model.

    PubMed

    Giovannetti, Erin R; Dy, Sydney; Leff, Bruce; Weston, Christine; Adams, Karen; Valuck, Tom B; Pittman, Aisha T; Blaum, Caroline S; McCann, Barbara A; Boyd, Cynthia M

    2013-10-01

    Improving quality of care for people with multiple chronic conditions (MCCs) requires performance measures reflecting the heterogeneity and scope of their care. Since most existing measures are disease specific, performance measures must be refined and new measures must be developed to address the complexity of care for those with MCCs. To describe development of the Performance Measurement for People with Multiple Chronic Conditions (PM-MCC) conceptual model. Framework development and a national stakeholder panel. We used reviews of existing conceptual frameworks of performance measurement, review of the literature on MCCs, input from experts in the multistakeholder Steering Committee, and public comment. The resulting model centers on the patient and family goals and preferences for care in the context of multiple care sites and providers, the type of care they are receiving, and the national priority domains for healthcare quality measurement. This model organizes measures into a comprehensive framework and identifies areas where measures are lacking. In this context, performance measures can be prioritized and implemented at different levels, in the context of patients' overall healthcare needs.

  7. Modeling Group Interactions via Open Data Sources

    DTIC Science & Technology

    2011-08-30

    data. The state-of-the-art search engines are designed to help general query-specific search and are not suitable for finding disconnected online groups. The...groups, (2) developing innovative mathematical and statistical models and efficient algorithms that leverage existing search engines and employ

  8. Combustion of Nitramine Propellants

    DTIC Science & Technology

    1983-03-01

    through development of a comprehensive analytical model. The ultimate goals are to enable prediction of deflagration rate over a wide pressure range...superior in burn rate prediction , both simple models fail in correlating existing temperature- sensitivity data. (2) In the second part, a...auxiliary condition to enable independent burn rate prediction ; improved melt phase model including decomposition-gas bubbles; model for far-field

  9. Turbine Engine Research Center (TERC) Data System Enhancement and Test Article Evaluation. Delivery Order 0002: TERC Aeromechanical Characterization

    DTIC Science & Technology

    2005-06-01

    test, the entire turbulence model was changed from standard k-epsilon to Spalart-Allmaras. Using these different tools of turbulence models, a few...this research, leaving only pre-existing finite element models to be used. At some point a NASTRAN model was developed for vibrations analysis but

  10. Comparison of three models predicting developmental milestones given environmental and individual variation

    Treesearch

    Estella Gilbert; James A. Powell; Jesse A. Logan; Barbara J. Bentz

    2004-01-01

    In all organisms, phenotypic variability is an evolutionary stipulation. Because the development of poikilothermic organisms depends directly on the temperature of their habitat, environmental variability is also an integral factor in models of their phenology. In this paper we present two existing phenology models, the distributed delay model and the Sharpe and...

  11. Simulation of Healing Threshold in Strain-Induced Inflammation Through a Discrete Informatics Model.

    PubMed

    Ibrahim, Israr Bin M; Sarma O V, Sanjay; Pidaparti, Ramana M

    2018-05-01

    Respiratory diseases such as asthma and acute respiratory distress syndrome as well as acute lung injury involve inflammation at the cellular level. The inflammation process is very complex and is characterized by the emergence of cytokines along with other changes in cellular processes. Due to the complexity of the various constituents that make up the inflammation dynamics, it is necessary to develop models that can complement experiments to fully understand inflammatory diseases. In this study, we developed a discrete informatics model based on a cellular automata (CA) approach to investigate the influence of elastic field (stretch/strain) on the dynamics of inflammation and account for probabilistic adaptation based on statistical interpretation of existing experimental data. Our simulation model investigated the effects of low, medium, and high strain conditions on inflammation dynamics. Results suggest that the model is able to indicate the threshold of innate healing of tissue as a response to strain experienced by the tissue. When strain is under the threshold, the tissue is still capable of adapting its structure to heal the damaged part. However, there exists a strain threshold where healing capability breaks down. The results obtained demonstrate that the developed discrete informatics-based CA model is capable of modeling and giving insights into inflammation dynamics parameters under various mechanical strain/stretch environments.
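
    The strain-threshold behaviour can be illustrated with a toy cellular automaton, shown below. The states, probabilities, and update rules are hypothetical, not those of the published model: damaged cells heal with a probability that drops as strain rises and also inflame random neighbours, so above some strain the lesion grows rather than closing.

```python
import numpy as np

# Toy CA sketch of a strain-dependent healing threshold (hypothetical parameters).

HEALTHY, DAMAGED = 0, 1
NEIGHBOURS = [(-1, 0), (1, 0), (0, -1), (0, 1)]

def step(grid, strain, rng, p_heal0=0.6, p_spread=0.25):
    new = grid.copy()
    p_heal = max(p_heal0 - strain, 0.0)            # healing suppressed by strain
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] == DAMAGED:
                if rng.random() < p_heal:
                    new[r, c] = HEALTHY
                elif rng.random() < p_spread:      # inflame one random neighbour
                    dr, dc = NEIGHBOURS[rng.integers(4)]
                    new[(r + dr) % rows, (c + dc) % cols] = DAMAGED
    return new

rng = np.random.default_rng(2)
seed_grid = np.zeros((40, 40), dtype=int)
seed_grid[18:22, 18:22] = DAMAGED
for strain in (0.1, 0.5):                          # below vs. above the threshold
    g = seed_grid.copy()
    for _ in range(50):
        g = step(g, strain, rng)
    print(f"strain={strain}: damaged cells after 50 steps = {int(g.sum())}")
```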

  12. A meta-model for computer executable dynamic clinical safety checklists.

    PubMed

    Nan, Shan; Van Gorp, Pieter; Lu, Xudong; Kaymak, Uzay; Korsten, Hendrikus; Vdovjak, Richard; Duan, Huilong

    2017-12-12

    A safety checklist is a type of cognitive tool that supports the short-term memory of medical workers, with the purpose of reducing medical errors caused by oversights and omissions. To facilitate the daily use of safety checklists, computerized systems embedded in the clinical workflow and adapted to patient-context are increasingly developed. However, the current hard-coded approach of implementing checklists in these systems increases the cognitive efforts of clinical experts and coding efforts for informaticists. This is due to the lack of a formal representation format that is both understandable by clinical experts and executable by computer programs. We developed a dynamic checklist meta-model with a three-step approach. Dynamic checklist modeling requirements were extracted by performing a domain analysis. Then, existing modeling approaches and tools were investigated with the purpose of reusing these languages. Finally, the meta-model was developed by eliciting domain concepts and their hierarchies. The feasibility of using the meta-model was validated by two case studies. The meta-model was mapped to specific modeling languages according to the requirements of hospitals. Using the proposed meta-model, a comprehensive coronary artery bypass graft peri-operative checklist set and a percutaneous coronary intervention peri-operative checklist set have been developed in a Dutch hospital and a Chinese hospital, respectively. The result shows that it is feasible to use the meta-model to facilitate the modeling and execution of dynamic checklists. We proposed a novel meta-model for the dynamic checklist with the purpose of facilitating the creation of dynamic checklists. The meta-model is a framework for reusing existing modeling languages and tools to model dynamic checklists. The feasibility of using the meta-model is validated by implementing a use case in the system.

  13. Hiproofs

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Power, John

    2003-01-01

    We introduce a hierarchical notion of formal proof, useful in the implementation of theorem provers, which we call hiproofs. Two alternative definitions are given, motivated by existing notations used in theorem proving research. We define transformations between these two forms of hiproof, develop notions of underlying proof, and give a suitable definition of refinement in order to model incremental proof development. We show that our transformations preserve both underlying proofs and refinement. The relationship of our theory to existing theorem proving systems is discussed, as is its future extension.

  14. Low Reynolds number two-equation modeling of turbulent flows

    NASA Technical Reports Server (NTRS)

    Michelassi, V.; Shih, T.-H.

    1991-01-01

    A k-epsilon model that accounts for viscous and wall effects is presented. The proposed formulation does not contain the local wall distance, which greatly simplifies its application to complex geometries. The formulation is based on an existing k-epsilon model that has been shown to fit the results of direct numerical simulation very well. The new form is compared with nine different two-equation models and with direct numerical simulation for a fully developed channel flow at Re = 3300. The simple flow configuration allows a comparison free from numerical inaccuracies. The computed results show that only a few of the considered forms exhibit satisfactory agreement with the channel flow data. The model shows an improvement with respect to the existing formulations.
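
    For orientation, the generic low-Reynolds-number form of such a model is sketched below in LaTeX; the specific damping functions of the formulation described above, which avoid the wall distance, are not reproduced here.

```latex
% Generic low-Reynolds-number k--epsilon equations with damping functions
% f_mu, f_1, f_2 (shown for orientation only; not the specific model above).
\begin{align}
\frac{Dk}{Dt} &= \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_k}\right)\frac{\partial k}{\partial x_j}\right] + P_k - \varepsilon, \\
\frac{D\varepsilon}{Dt} &= \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_\varepsilon}\right)\frac{\partial \varepsilon}{\partial x_j}\right]
  + C_{\varepsilon 1} f_1 \frac{\varepsilon}{k} P_k - C_{\varepsilon 2} f_2 \frac{\varepsilon^2}{k}, \qquad
\nu_t = C_\mu f_\mu \frac{k^2}{\varepsilon}.
\end{align}
```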

  15. Development of an improved system for contract time determination : phase III.

    DOT National Transportation Integrated Search

    2010-09-30

    This study developed Daily Work Report (DWR) based prediction models to determine reasonable : production rates of controlling activities of highway projects. The study used available resources such as : DWR, soil data, AADT and other existing projec...

  16. The Role of Sister Cities' Staff Exchanges in Developing "Learning Cities": Exploring Necessary and Sufficient Conditions in Social Capital Development Utilizing Proportional Odds Modeling.

    PubMed

    Buckley, Patrick Henry; Takahashi, Akio; Anderson, Amy

    2015-06-24

    In the last half century former international adversaries have become cooperators through networking and knowledge sharing for decision making aimed at improving quality of life and sustainability; nowhere has this been more striking than at the urban level where such activity is seen as a key component in building "learning cities" through the development of social capital. Although mega-cities have been leaders in such efforts, mid-sized cities with lesser resource endowments have striven to follow by focusing on more frugal sister city type exchanges. The underlying thesis of our research is that great value can be derived from city-to-city exchanges through social capital development. However, such a study must differentiate between necessary and sufficient conditions. Past studies assumed necessary conditions were met and immediately jumped to demonstrating the existence of structural relationships by measuring networking while further assuming that the existence of such demonstrated a parallel development of cognitive social capital. Our research addresses this lacuna by stepping back and critically examining these assumptions. To accomplish this goal we use a Proportional Odds Modeling with a Cumulative Logit Link approach to demonstrate the existence of a common latent structure, hence asserting that necessary conditions are met.
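
    For readers unfamiliar with the method, the cumulative logit (proportional odds) form used in such analyses is written below for an ordinal response Y with categories 1, ..., J and covariate vector x.

```latex
\begin{equation}
\operatorname{logit} P(Y \le j \mid x)
  \;=\; \ln\frac{P(Y \le j \mid x)}{P(Y > j \mid x)}
  \;=\; \theta_j - \beta^{\top} x ,
  \qquad j = 1, \dots, J-1 ,
\end{equation}
% with ordered intercepts theta_1 < ... < theta_{J-1} and a single slope vector
% beta shared across the J-1 cumulative splits (the proportional-odds assumption).
```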

  17. The Role of Sister Cities’ Staff Exchanges in Developing “Learning Cities”: Exploring Necessary and Sufficient Conditions in Social Capital Development Utilizing Proportional Odds Modeling

    PubMed Central

    Buckley, Patrick Henry; Takahashi, Akio; Anderson, Amy

    2015-01-01

    In the last half century former international adversaries have become cooperators through networking and knowledge sharing for decision making aimed at improving quality of life and sustainability; nowhere has this been more striking than at the urban level where such activity is seen as a key component in building “learning cities” through the development of social capital. Although mega-cities have been leaders in such efforts, mid-sized cities with lesser resource endowments have striven to follow by focusing on more frugal sister city type exchanges. The underlying thesis of our research is that great value can be derived from city-to-city exchanges through social capital development. However, such a study must differentiate between necessary and sufficient conditions. Past studies assumed necessary conditions were met and immediately jumped to demonstrating the existence of structural relationships by measuring networking while further assuming that the existence of such demonstrated a parallel development of cognitive social capital. Our research addresses this lacuna by stepping back and critically examining these assumptions. To accomplish this goal we use a Proportional Odds Modeling with a Cumulative Logit Link approach to demonstrate the existence of a common latent structure, hence asserting that necessary conditions are met. PMID:26114245

  18. Model-based testing with UML applied to a roaming algorithm for bluetooth devices.

    PubMed

    Dai, Zhen Ru; Grabowski, Jens; Neukirchen, Helmut; Pals, Holger

    2004-11-01

    In late 2001, the Object Management Group issued a Request for Proposal to develop a testing profile for UML 2.0. In June 2003, the work on the UML 2.0 Testing Profile was finally adopted by the OMG. Since March 2004, it has become an official standard of the OMG. The UML 2.0 Testing Profile provides support for UML based model-driven testing. This paper introduces a methodology on how to use the testing profile in order to modify and extend an existing UML design model for test issues. The application of the methodology will be explained by applying it to an existing UML Model for a Bluetooth device.

  19. Vehicle Modeling for use in the CAFE model: Process description and modeling assumptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moawad, Ayman; Kim, Namdoo; Rousseau, Aymeric

    2016-06-01

    The objective of this project is to develop and demonstrate a process that, at a minimum, provides more robust information that can be used to calibrate inputs applicable under the CAFE model’s existing structure. The project will be more fully successful if a process can be developed that minimizes the need for decision trees and replaces the synergy factors by inputs provided directly from a vehicle simulation tool. The report provides a description of the process that was developed by Argonne National Laboratory and implemented in Autonomie.

  20. The interactive role of subsynoptic scale jet streak and planetary boundary layer adjustments in organizing an apparently isolated convective complex

    NASA Technical Reports Server (NTRS)

    Kaplan, M. L.; Zack, J. W.; Wong, V. C.; Tuccillo, J. J.; Coats, G. D.

    1982-01-01

    A mesoscale atmospheric simulation system is described that is being developed in order to improve the simulation of subsynoptic and mesoscale adjustments associated with cyclogenesis, severe storm development, and significant atmospheric transport processes. Present emphasis in model development is in the parameterization of physical processes, time-dependent boundary conditions, sophisticated initialization and analysis procedures, nested grid solutions, and applications software development. Basic characteristics of the system as of March 1982 are listed. In a case study, the Grand Island tornado outbreak of 3 June 1980 is considered in substantial detail. Results of simulations with a mesoscale atmospheric simulation system indicate that over the high plains subtle interactions between existing jet streaks and deep well mixed boundary layers can lead to well organized patterns of mesoscale divergence and pressure falls. The amplitude and positioning of these mesoscale features is a function of the subtle nonlinear interaction between the pre-existing jet-streak and deep well mixed boundary layers. Model results for the case study indicate that the model has the potential for forecasting the precursor mesoscale convective environment.

  1. Wind tunnel measurements for dispersion modelling of vehicle wakes

    NASA Astrophysics Data System (ADS)

    Carpentieri, Matteo; Kumar, Prashant; Robins, Alan

    2012-12-01

    Wind tunnel measurements downwind of reduced scale car models have been made to study the wake regions in detail, test the usefulness of existing vehicle wake models, and draw key information needed for dispersion modelling in vehicle wakes. The experiments simulated a car moving in still air. This is achieved by (i) the experimental characterisation of the flow, turbulence and concentration fields in both the near and far wake regions, (ii) the preliminary assessment of existing wake models using the experimental database, and (iii) the comparison of previous field measurements in the wake of a real diesel car with the wind tunnel measurements. The experiments highlighted very large gradients of velocities and concentrations existing, in particular, in the near-wake. Of course, the measured fields are strongly dependent on the geometry of the modelled vehicle and a generalisation for other vehicles may prove to be difficult. The methodology applied in the present study, although improvable, could constitute a first step towards the development of mathematical parameterisations. Experimental results were also compared with the estimates from two wake models. It was found that they can adequately describe the far-wake of a vehicle in terms of velocities, but a better characterisation in terms of turbulence and pollutant dispersion is needed. Parameterised models able to predict velocity and concentrations with fine enough details at the near-wake scale do not exist.

  2. The challenge and promise of anti-epileptic therapy development in animal models

    PubMed Central

    Simonato, Michele; Brooks-Kayal, Amy R; Engel, Jerome; Galanopoulou, Aristea S; Jensen, Frances E; Moshé, Solomon L; O’Brien, Terence J; Pitkanen, Asla; Wilcox, Karen S; French, Jacqueline A

    2016-01-01

    Translation of successful target and compound validation studies into clinically effective therapies is a major challenge, with potential for costly clinical trial failures. This situation holds true for the epilepsies—complex diseases with different causes and symptoms. Although the availability of predictive animal models has led to the development of effective antiseizure therapies that are routinely used in clinical practice, showing that translation can be successful, several important unmet therapeutic needs still exist. Available treatments do not fully control seizures in a third of patients with epilepsy, and produce substantial side-effects. No treatment can prevent the development of epilepsy in at-risk patients or cure patients with epilepsy. And no specific treatment for epilepsy-associated comorbidities exists. To meet these demands, a redesign of translational approaches is urgently needed. PMID:25127174

  3. Estimating kinetic mechanisms with prior knowledge I: Linear parameter constraints.

    PubMed

    Salari, Autoosa; Navarro, Marco A; Milescu, Mirela; Milescu, Lorin S

    2018-02-05

    To understand how ion channels and other proteins function at the molecular and cellular levels, one must decrypt their kinetic mechanisms. Sophisticated algorithms have been developed that can be used to extract kinetic parameters from a variety of experimental data types. However, formulating models that not only explain new data, but are also consistent with existing knowledge, remains a challenge. Here, we present a two-part study describing a mathematical and computational formalism that can be used to enforce prior knowledge into the model using constraints. In this first part, we focus on constraints that enforce explicit linear relationships involving rate constants or other model parameters. We develop a simple, linear algebra-based transformation that can be applied to enforce many types of model properties and assumptions, such as microscopic reversibility, allosteric gating, and equality and inequality parameter relationships. This transformation converts the set of linearly interdependent model parameters into a reduced set of independent parameters, which can be passed to an automated search engine for model optimization. In the companion article, we introduce a complementary method that can be used to enforce arbitrary parameter relationships and any constraints that quantify the behavior of the model under certain conditions. The procedures described in this study can, in principle, be coupled to any of the existing methods for solving molecular kinetics for ion channels or other proteins. These concepts can be used not only to enforce existing knowledge but also to formulate and test new hypotheses. © 2018 Salari et al.
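
    The reparameterization that removes linear equality constraints can be sketched in a few lines, independently of the authors' software; the constraint matrix and the example constraints below are invented. Constraints A p = b are satisfied identically by writing p = p0 + N z, where p0 is any particular solution and the columns of N span the null space of A, leaving the reduced vector z free for a generic optimizer.

```python
import numpy as np
from scipy.linalg import null_space, lstsq

# Sketch of null-space reparameterization for linear equality constraints A p = b
# (e.g. equality of two rate constants, or a loop relation on log rates).

def reduce_parameters(A, b):
    A, b = np.atleast_2d(A), np.atleast_1d(b)
    p0 = lstsq(A, b)[0]          # particular solution of A p = b
    N = null_space(A)            # basis for the unconstrained directions
    to_full = lambda z: p0 + N @ z
    return p0, N, to_full

# Hypothetical 4-parameter model with constraints p[0] = p[1] and p[2] + p[3] = 1
A = np.array([[1.0, -1.0, 0.0, 0.0],
              [0.0,  0.0, 1.0, 1.0]])
b = np.array([0.0, 1.0])
p0, N, to_full = reduce_parameters(A, b)
z = np.array([0.3, -0.2])        # free coordinates chosen by an optimizer
p = to_full(z)
print(p, A @ p - b)              # the constraints hold for any choice of z
```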

  4. Human Resource Development to Facilitate Experiential Learning: The Case of Yahoo Japan

    ERIC Educational Resources Information Center

    Matsuo, Makoto

    2015-01-01

    Although work experiences are recognized as important mechanisms for developing leaders in organizations, existing research has focused primarily on work assignments rather than on human resource development (HRD) systems that promote experiential learning of managers. The primary goal of this study was to develop an HRD model for facilitating…

  5. Diagnostic utility of appetite loss in addition to existing prediction models for community-acquired pneumonia in the elderly: a prospective diagnostic study in acute care hospitals in Japan

    PubMed Central

    Yamamoto, Yosuke; Terada, Kazuhiko; Ohta, Mitsuyasu; Mikami, Wakako; Yokota, Hajime; Hayashi, Michio; Miyashita, Jun; Azuma, Teruhisa; Fukuma, Shingo; Fukuhara, Shunichi

    2017-01-01

    Objective Diagnosis of community-acquired pneumonia (CAP) in the elderly is often delayed because of atypical presentation and non-specific symptoms, such as appetite loss, falls and disturbance in consciousness. The aim of this study was to investigate the external validity of existing prediction models and the added value of the non-specific symptoms for the diagnosis of CAP in elderly patients. Design Prospective cohort study. Setting General medicine departments of three teaching hospitals in Japan. Participants A total of 109 elderly patients who consulted for upper respiratory symptoms between 1 October 2014 and 30 September 2016. Main outcome measures The reference standard for CAP was chest radiograph evaluated by two certified radiologists. The existing models were externally validated for diagnostic performance by calibration plot and discrimination. To evaluate the additional value of the non-specific symptoms to the existing prediction models, we developed an extended logistic regression model. Calibration, discrimination, category-free net reclassification improvement (NRI) and decision curve analysis (DCA) were investigated in the extended model. Results Among the existing models, the model by van Vugt demonstrated the best performance, with an area under the curve of 0.75(95% CI 0.63 to 0.88); calibration plot showed good fit despite a significant Hosmer-Lemeshow test (p=0.017). Among the non-specific symptoms, appetite loss had positive likelihood ratio of 3.2 (2.0–5.3), negative likelihood ratio of 0.4 (0.2–0.7) and OR of 7.7 (3.0–19.7). Addition of appetite loss to the model by van Vugt led to improved calibration at p=0.48, NRI of 0.53 (p=0.019) and higher net benefit by DCA. Conclusions Information on appetite loss improved the performance of an existing model for the diagnosis of CAP in the elderly. PMID:29122806
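
    The diagnostic value of a symptom such as appetite loss is summarized by its likelihood ratios, which convert pre-test to post-test odds in the usual way (standard definitions, not specific to this study). For example, a pre-test probability of 0.2 (odds 0.25) combined with the reported LR+ of 3.2 gives post-test odds of 0.8, that is, a probability of about 0.44.

```latex
\begin{align}
\mathrm{LR}^{+} &= \frac{\text{sensitivity}}{1 - \text{specificity}}, \qquad
\mathrm{LR}^{-} = \frac{1 - \text{sensitivity}}{\text{specificity}}, \\
\text{post-test odds} &= \text{pre-test odds} \times \mathrm{LR}.
\end{align}
```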

  6. Aeroservoelastic Modeling of Body Freedom Flutter for Control System Design

    NASA Technical Reports Server (NTRS)

    Ouellette, Jeffrey

    2017-01-01

    One of the most severe forms of coupling between aeroelasticity and flight dynamics is an instability called body freedom flutter. The existing tools often assume relatively weak coupling, and are therefore unable to accurately model body freedom flutter. Because the existing tools were developed from traditional flutter analysis models, the resulting models contain inconsistencies that make them incompatible with control system design tools. To resolve these issues, a number of small, but significant changes have been made to the existing approaches. A frequency domain transformation is used with the unsteady aerodynamics to ensure a more physically consistent stability axis rational function approximation of the unsteady aerodynamic model. The aerodynamic model is augmented with additional terms to account for limitations of the baseline unsteady aerodynamic model and to account for the gravity forces. An assumed modes method is used for the structural model to ensure a consistent definition of the aircraft states across the flight envelope. The X-56A stiff wing flight-test data were used to validate the current modeling approach. The flight-test data do not show body freedom flutter, but do show coupling between the flight dynamics and the aeroelastic dynamics and the effects of the fuel weight.
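
    The rational function approximation step mentioned above is commonly written in the Roger form shown below, given for orientation only; the stability-axis frequency-domain transformation developed in the paper is not reproduced here.

```latex
\begin{equation}
Q(s) \;\approx\; A_0 + A_1 s + A_2 s^2
  + \sum_{j=1}^{n_\ell} A_{2+j}\,\frac{s}{s + b_j},
  \qquad s = \mathrm{i}k,
\end{equation}
% where k is the reduced frequency, the A_i are real matrices fitted to tabulated
% unsteady aerodynamic data, and the b_j > 0 are aerodynamic lag roots that become
% additional states in the state-space model.
```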

  7. Signal Partitioning Algorithm for Highly Efficient Gaussian Mixture Modeling in Mass Spectrometry

    PubMed Central

    Polanski, Andrzej; Marczyk, Michal; Pietrowska, Monika; Widlak, Piotr; Polanska, Joanna

    2015-01-01

    Mixture modeling of mass spectra is an approach with many potential applications including peak detection and quantification, smoothing, de-noising, feature extraction and spectral signal compression. However, existing algorithms do not allow for automated analyses of whole spectra. Therefore, despite highlighting potential advantages of mixture modeling of mass spectra of peptide/protein mixtures and some preliminary results presented in several papers, the mixture modeling approach has so far not been developed to the stage of enabling systematic comparisons with existing software packages for proteomic mass spectra analyses. In this paper we present an efficient algorithm for Gaussian mixture modeling of proteomic mass spectra of different types (e.g., MALDI-ToF profiling, MALDI-IMS). The main idea is automated partitioning of the protein mass spectral signal into fragments. The obtained fragments are separately decomposed into Gaussian mixture models. The parameters of the mixture models of fragments are then aggregated to form the mixture model of the whole spectrum. We compare the elaborated algorithm to existing algorithms for peak detection and we demonstrate improvements in peak detection efficiency obtained by using Gaussian mixture modeling. We also show applications of the elaborated algorithm to real proteomic datasets of low and high resolution. PMID:26230717
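
    The fragment-wise idea can be illustrated with off-the-shelf tools. The sketch below is not the published algorithm: the spectrum is split into equal chunks rather than by the paper's automated signal partitioning, and component counts are fixed by hand rather than selected automatically.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Illustrative fragment-wise Gaussian mixture decomposition of a synthetic spectrum.

def fit_fragment(mz, intensity, n_components, n_draws=5000, seed=0):
    rng = np.random.default_rng(seed)
    w = intensity / intensity.sum()
    sample = rng.choice(mz, size=n_draws, p=w)          # intensity-weighted resample
    gmm = GaussianMixture(n_components=n_components, random_state=seed)
    gmm.fit(sample.reshape(-1, 1))
    return gmm.means_.ravel(), np.sqrt(gmm.covariances_).ravel(), gmm.weights_

def decompose_spectrum(mz, intensity, n_fragments=4, peaks_per_fragment=2):
    components = []
    for frag_mz, frag_int in zip(np.array_split(mz, n_fragments),
                                 np.array_split(intensity, n_fragments)):
        if frag_int.sum() > 0:
            components.append(fit_fragment(frag_mz, frag_int, peaks_per_fragment))
    return components  # list of (means, sigmas, weights) per fragment

# Hypothetical spectrum: a few Gaussian peaks on the m/z axis
mz = np.linspace(1000, 1400, 2000)
intensity = sum(a * np.exp(-0.5 * ((mz - m) / s) ** 2)
                for a, m, s in [(5, 1050, 2), (3, 1160, 3), (4, 1290, 2), (2, 1355, 4)])
print(decompose_spectrum(mz, intensity))
```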

  8. Breaking barriers to novel analgesic drug development.

    PubMed

    Yekkirala, Ajay S; Roberson, David P; Bean, Bruce P; Woolf, Clifford J

    2017-08-01

    Acute and chronic pain complaints, although common, are generally poorly served by existing therapies. This unmet clinical need reflects a failure to develop novel classes of analgesics with superior efficacy, diminished adverse effects and a lower abuse liability than those currently available. Reasons for this include the heterogeneity of clinical pain conditions, the complexity and diversity of underlying pathophysiological mechanisms, and the unreliability of some preclinical pain models. However, recent advances in our understanding of the neurobiology of pain are beginning to offer opportunities for developing novel therapeutic strategies and revisiting existing targets, including modulating ion channels, enzymes and G-protein-coupled receptors.

  9. Breaking barriers to novel analgesic drug development

    PubMed Central

    Yekkirala, Ajay S; Roberson, David P; Bean, Bruce P.; Woolf, Clifford J.

    2017-01-01

    Acute and chronic pain complaints, while very common, are generally poorly served by existing therapies. The unmet clinical need reflects the failure in developing novel classes of analgesics with superior efficacy, diminished adverse effects and a lower abuse liability than those currently available. Reasons for this include the heterogeneity of clinical pain conditions, the complexity and diversity of underlying pathophysiological mechanisms coupled with the unreliability of some preclinical pain models. However, recent advances in our understanding of the neurobiology of pain are beginning to offer opportunities to develop new therapeutic strategies and revisit existing targets, including modulating ion channels, enzymes and GPCRs. PMID:28596533

  10. Development of user-centered interfaces to search the knowledge resources of the Virginia Henderson International Nursing Library.

    PubMed

    Jones, Josette; Harris, Marcelline; Bagley-Thompson, Cheryl; Root, Jane

    2003-01-01

    This poster describes the development of user-centered interfaces in order to extend the functionality of the Virginia Henderson International Nursing Library (VHINL) from library to web based portal to nursing knowledge resources. The existing knowledge structure and computational models are revised and made complementary. Nurses' search behavior is captured and analyzed, and the resulting search models are mapped to the revised knowledge structure and computational model.

  11. The Development of a Modelling Solution to Address Manpower and Personnel Issues Using the IPME

    DTIC Science & Technology

    2010-11-01

    training for a military system. It deals with the number of personnel spaces and available people. One of the main concerns in this domain is to...are often addressed by examining existing solutions for similar systems and/or a trial-and-error method based on human-in-the-loop tests. Such an...significant effort and resources on the development of human performance modelling software, the Integrated Performance Modelling Environment (IPME

  12. Habitat Suitability Index Models: Northern Gulf of Mexico brown shrimp and white shrimp

    USGS Publications Warehouse

    Turner, Robert Eugene; Brody, Michael S.

    1983-01-01

    A review and synthesis of existing information were used to develop estuarine habitat models for brown shrimp (Penaeus aztecus) and white shrimp (Penaeus setiferus). The models are scaled to produce an index of habitat suitability between 0 (unsuitable habitat) and 1 (optimally suitable habitat) for estuarine areas of the northern Gulf of Mexico. Habitat suitability indexes are designed for use with the habitat evaluation procedures developed by the U.S. Fish and Wildlife Service.
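
    The abstract does not spell out how component indexes are combined. The sketch below shows one common way an HSI of this kind can be computed; the piecewise-linear suitability curves, the variables chosen and the geometric-mean aggregation are illustrative assumptions, not the published shrimp models.

```python
# Illustrative Habitat Suitability Index sketch (not the published shrimp models).
import numpy as np

def suitability(value, optimum_range, tolerance_range):
    """Piecewise-linear suitability: 1.0 inside the optimum range, tapering to 0.0."""
    lo_tol, hi_tol = tolerance_range
    lo_opt, hi_opt = optimum_range
    if lo_opt <= value <= hi_opt:
        return 1.0
    if value <= lo_tol or value >= hi_tol:
        return 0.0
    if value < lo_opt:
        return (value - lo_tol) / (lo_opt - lo_tol)
    return (hi_tol - value) / (hi_tol - hi_opt)

# Hypothetical habitat variables for one estuarine site (values and curves assumed).
si_salinity = suitability(12.0, optimum_range=(10, 20), tolerance_range=(2, 35))      # ppt
si_vegetation = suitability(40.0, optimum_range=(50, 80), tolerance_range=(0, 100))   # % marsh edge

hsi = float(np.sqrt(si_salinity * si_vegetation))   # geometric mean stays within [0, 1]
print(f"HSI = {hsi:.2f}")
```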

  13. A hydrological model for interprovincial water resource planning and management: A case study in the Long Xuyen Quadrangle, Mekong Delta, Vietnam

    NASA Astrophysics Data System (ADS)

    Hanington, Peter; To, Quang Toan; Van, Pham Dang Tri; Doan, Ngoc Anh Vu; Kiem, Anthony S.

    2017-04-01

    In this paper we present the results of the development and calibration of a fine-scaled quasi-2D hydrodynamic model (IWRM-LXQ) for the Long Xuyen Quadrangle - an important interprovincial agricultural region in the Vietnamese Mekong Delta. We use the Long Xuyen Quadrangle as a case study to highlight the need for further investment in hydrodynamic modelling at scales relevant to the decisions facing water resource managers and planners in the Vietnamese Mekong Delta. The IWRM-LXQ was calibrated using existing data from a low flood year (2010) and a high flood year (2011), including dry season and wet season flows. The model performed well in simulating low flood and high flood events in both dry and wet seasons where good spatial and temporal data exist. However, our study shows that there are data quality issues and key data gaps that need to be addressed before the model can be further refined, validated and then used for decision making. The development of the IWRM-LXQ is timely, as significant investments in land and water resource development and infrastructure are being planned for the Vietnamese Mekong Delta. In order to define the scope of such investments and their feasibility, models such as the IWRM-LXQ are an essential tool to provide objective assessment of investment options and build stakeholder consensus around potentially contentious development decisions.

  14. Crystal study and econometric model

    NASA Technical Reports Server (NTRS)

    1975-01-01

    An econometric model was developed that can be used to predict demand and supply figures for crystals over a time horizon roughly concurrent with that of NASA's Space Shuttle Program - that is, 1975 through 1990. The model includes an equation to predict the impact on investment in the crystal-growing industry. Two models are actually presented: the first is a theoretical model which follows rather strictly the standard economic concepts involved in supply and demand analysis; a modified version of this model was also developed which, though not quite as theoretically sound, is testable using existing data sources.

  15. Models Required to Mitigate Impacts of Space Weather on Space Systems

    NASA Technical Reports Server (NTRS)

    Barth, Janet L.

    2003-01-01

    This viewgraph presentation attempts to develop a model of factors which need to be considered in the design and construction of spacecraft to lessen the effects of space weather on these vehicles. Topics considered include: space environments and effects, radiation environments and effects, space weather drivers, space weather models, climate models, solar proton activity and mission design for the GOES mission. The authors conclude that space environment models need to address issues from mission planning through operations, and that a program to develop and validate authoritative space environment models for application to spacecraft design does not currently exist.

  16. Numerical Modeling of River Ice Processes on the Lower Nelson River

    NASA Astrophysics Data System (ADS)

    Malenchak, Jarrod Joseph

    Water resource infrastructure in cold regions of the world can be significantly impacted by the existence of river ice. Major engineering concerns related to river ice include ice jam flooding, the design and operation of hydropower facilities and other hydraulic structures, water supplies, as well as ecological, environmental, and morphological effects. The use of numerical simulation models has been identified as one of the most efficient means by which river ice processes can be studied and the effects of river ice be evaluated. The continued advancement of these simulation models will help to develop new theories and evaluate potential mitigation alternatives for these ice issues. In this thesis, a literature review of existing river ice numerical models, of anchor ice formation and modeling studies, and of aufeis formation and modeling studies is conducted. A high-level summary of the two-dimensional CRISSP numerical model is presented as well as the developed freeze-up model, with a focus specifically on the anchor ice and aufeis growth processes. This model includes developments in the detailed heat transfer calculations, an improved surface ice mass exchange model which includes the rapids entrainment process, and an improved dry bed treatment model along with the expanded anchor ice and aufeis growth model. The developed sub-models are tested in an ideal channel setting as somewhat of a model confirmation. A case study of significant anchor ice and aufeis growth on the Nelson River in northern Manitoba, Canada, will be the primary field test case for the anchor ice and aufeis model. A second case study on the same river will be used to evaluate the surface ice components of the model in a field setting. The results from these case studies will be used to highlight the capabilities and deficiencies in the numerical model and to identify areas of further research and model development.

  17. Scales and erosion

    USDA-ARS?s Scientific Manuscript database

    There is a need to develop scale explicit understanding of erosion to overcome existing conceptual and methodological flaws in our modelling methods currently applied to understand the process of erosion, transport and deposition at the catchment scale. These models need to be based on a sound under...

  18. MODELING MERCURY CONTROL WITH POWDERED ACTIVATED CARBON

    EPA Science Inventory

    The paper presents a mathematical model of total mercury removed from the flue gas at coal-fired plants equipped with powdered activated carbon (PAC) injection for mercury control. The developed algorithms account for mercury removal by both existing equipment and an added PAC in...

  19. A CONCEPTUAL MODEL FOR MULTI-SCALAR ASSESSMENTS OF ESTUARINE ECOLOGICAL INTEGRITY

    EPA Science Inventory

    A conceptual model was developed that relates an estuarine system's anthropogenic inputs to its ecological integrity. Ecological integrity is operationally defined as an emergent property of an ecosystem that exists when the structural components are complete and the functional ...

  20. Changing Models for Researching Pedagogy with Information and Communications Technologies

    ERIC Educational Resources Information Center

    Webb, M.

    2013-01-01

    This paper examines changing models of pedagogy by drawing on recent research with teachers and their students as well as theoretical developments. In relation to a participatory view of learning, the paper reviews existing pedagogical models that take little account of the use of information and communications technologies as well as those that…

  1. A predictive model for floating leaf vegetation in the St. Louis River Estuary

    EPA Science Inventory

    In July 2014, USEPA staff was asked by MPCA to develop a predictive model for floating leaf vegetation (FLV) in the St. Louis River Estuary (SLRE). The existing model (Host et al. 2012) greatly overpredicts FLV in St. Louis Bay probably because it was based on a limited number of...

  2. Modeling carbon and nitrogen biogeochemistry in forest ecosystems

    Treesearch

    Changsheng Li; Carl Trettin; Ge Sun; Steve McNulty; Klaus Butterbach-Bahl

    2005-01-01

    A forest biogeochemical model, Forest-DNDC, was developed to quantify carbon sequestration in and trace gas emissions from forest ecosystems. Forest-DNDC was constructed by integrating two existing models, PnET and DNDC, with several new features including nitrification, a forest litter layer, and soil freezing and thawing. PnET is a forest physiological model predicting...

  3. NREL Deploys Wave and Tidal Measurement Buoys | News | NREL

    Science.gov Websites

    various wave and tidal models, and in turn, reduce risks for developers. These buoys allow researchers to "better understand the limitations and errors in existing global wave models," says Kilcher. Using NREL, laboratory, and U.S. Department of Energy published models, the team identified likely

  4. Partnering Principal and Teacher Candidates: Exploring a Virtual Coaching Model in Teacher Education

    ERIC Educational Resources Information Center

    Stapleton, Joy; Tschida, Christina; Cuthrell, Kristen

    2017-01-01

    Colleges of education are constantly searching for innovations to develop stronger graduates. This paper describes and shares findings from a study of a virtual coaching partnership model that links a principal candidate with a teacher candidate. Through the use of existing virtual coaching software, this model provides teacher candidates with…

  5. Applying the Theory of Work Adjustment to Latino Immigrant Workers: An Exploratory Study

    ERIC Educational Resources Information Center

    Eggerth, Donald E.; Flynn, Michael A.

    2012-01-01

    Blustein mapped career decision making onto Maslow's model of motivation and personality and concluded that most models of career development assume opportunities and decision-making latitude that do not exist for many individuals from low income or otherwise disadvantaged backgrounds. Consequently, Blustein argued that these models may be of…

  6. Models of Sexual and Relational Orientation: A Critical Review and Synthesis

    ERIC Educational Resources Information Center

    Moe, Jeffry L.; Reicherzer, Stacee; Dupuy, Paula J.

    2011-01-01

    Many frameworks exist to explain and describe the phenomenon of same-sex sexuality as it applies to human development. This conceptual article provides a critical overview and synthesis of previous models to serve as a theoretical bridge for the suggested multiple continua model of sexual and relational orientations. Recommendations for how…

  7. Designing an Educational Game with Ten Steps to Complex Learning

    ERIC Educational Resources Information Center

    Enfield, Jacob

    2012-01-01

    Few instructional design (ID) models exist which are specific for developing educational games. Moreover, those extant ID models have not been rigorously evaluated. No ID models were found which focus on educational games with complex learning objectives. "Ten Steps to Complex Learning" (TSCL) is based on the four component instructional…

  8. Detecting Lung and Colorectal Cancer Recurrence Using Structured Clinical/Administrative Data to Enable Outcomes Research and Population Health Management.

    PubMed

    Hassett, Michael J; Uno, Hajime; Cronin, Angel M; Carroll, Nikki M; Hornbrook, Mark C; Ritzwoller, Debra

    2017-12-01

    Recurrent cancer is common, costly, and lethal, yet we know little about it in community-based populations. Electronic health records and tumor registries contain vast amounts of data regarding community-based patients, but usually lack recurrence status. Existing algorithms that use structured data to detect recurrence have limitations. We developed algorithms to detect the presence and timing of recurrence after definitive therapy for stages I-III lung and colorectal cancer using 2 data sources that contain a widely available type of structured data (claims or electronic health record encounters) linked to gold-standard recurrence status: Medicare claims linked to the Cancer Care Outcomes Research and Surveillance study, and the Cancer Research Network Virtual Data Warehouse linked to registry data. Twelve potential indicators of recurrence were used to develop separate models for each cancer in each data source. Detection models maximized area under the ROC curve (AUC); timing models minimized average absolute error. Algorithms were compared by cancer type/data source, and contrasted with an existing binary detection rule. Detection model AUCs (>0.92) exceeded existing prediction rules. Timing models yielded absolute prediction errors that were small relative to follow-up time (<15%). Similar covariates were included in all detection and timing algorithms, though differences by cancer type and dataset challenged efforts to create 1 common algorithm for all scenarios. Valid and reliable detection of recurrence using big data is feasible. These tools will enable extensive, novel research on quality, effectiveness, and outcomes for lung and colorectal cancer patients and those who develop recurrence.
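
    The published algorithms are not reproduced here, but the general pattern described above, a detection model scored by AUC plus a timing model scored by absolute error, can be sketched with synthetic indicator data; the logistic and linear model choices, the scikit-learn usage and all data below are assumptions for illustration.

```python
# Sketch only: synthetic claims-like indicators, a detection model scored by AUC and
# a timing model scored by mean absolute error. Not the paper's fitted algorithms.
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression
from sklearn.metrics import roc_auc_score, mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n, k = 2000, 12                                   # patients x candidate indicators
X = rng.poisson(1.0, size=(n, k))                 # e.g., counts of imaging or chemotherapy codes
recurred = (X[:, :3].sum(axis=1) + rng.normal(0, 1, n) > 4).astype(int)
months_to_recurrence = 6 + 2 * X[:, 0] + rng.normal(0, 2, n)

X_tr, X_te, y_tr, y_te, t_tr, t_te = train_test_split(
    X, recurred, months_to_recurrence, test_size=0.3, random_state=0)

# Detection model: probability of recurrence, evaluated by area under the ROC curve.
detector = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, detector.predict_proba(X_te)[:, 1]))

# Timing model: months to recurrence among recurrent cases, evaluated by absolute error.
timer = LinearRegression().fit(X_tr[y_tr == 1], t_tr[y_tr == 1])
mask = y_te == 1
print("MAE (months):", mean_absolute_error(t_te[mask], timer.predict(X_te[mask])))
```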

  9. Integrating landslide and liquefaction hazard and loss estimates with existing USGS real-time earthquake information products

    USGS Publications Warehouse

    Allstadt, Kate E.; Thompson, Eric M.; Hearne, Mike; Nowicki Jessee, M. Anna; Zhu, J.; Wald, David J.; Tanyas, Hakan

    2017-01-01

    The U.S. Geological Survey (USGS) has made significant progress toward the rapid estimation of shaking and shaking-related losses through their Did You Feel It? (DYFI), ShakeMap, ShakeCast, and PAGER products. However, quantitative estimates of the extent and severity of secondary hazards (e.g., landsliding, liquefaction) are not currently included in scenarios and real-time post-earthquake products despite their significant contributions to hazard and losses for many events worldwide. We are currently running parallel global statistical models for landslides and liquefaction developed with our collaborators in testing mode, but much work remains in order to operationalize these systems. We are expanding our efforts in this area by not only improving the existing statistical models, but also by (1) exploring more sophisticated, physics-based models where feasible; (2) incorporating uncertainties; and (3) identifying and undertaking research and product development to provide useful landslide and liquefaction estimates and their uncertainties. Although our existing models use standard predictor variables that are accessible globally or regionally, including peak ground motions, topographic slope, and distance to water bodies, we continue to explore readily available proxies for rock and soil strength as well as other susceptibility terms. This work is based on the foundation of an expanding, openly available, case-history database we are compiling along with historical ShakeMaps for each event. The expected outcome of our efforts is a robust set of real-time secondary hazards products that meet the needs of a wide variety of earthquake information users. We describe the available datasets and models, developments currently underway, and anticipated products.

  10. Challenges and opportunities for integrating lake ecosystem modelling approaches

    USGS Publications Warehouse

    Mooij, Wolf M.; Trolle, Dennis; Jeppesen, Erik; Arhonditsis, George; Belolipetsky, Pavel V.; Chitamwebwa, Deonatus B.R.; Degermendzhy, Andrey G.; DeAngelis, Donald L.; Domis, Lisette N. De Senerpont; Downing, Andrea S.; Elliott, J. Alex; Ruberto, Carlos Ruberto; Gaedke, Ursula; Genova, Svetlana N.; Gulati, Ramesh D.; Hakanson, Lars; Hamilton, David P.; Hipsey, Matthew R.; Hoen, Jochem 't; Hulsmann, Stephan; Los, F. Hans; Makler-Pick, Vardit; Petzoldt, Thomas; Prokopkin, Igor G.; Rinke, Karsten; Schep, Sebastiaan A.; Tominaga, Koji; Van Dam, Anne A.; Van Nes, Egbert H.; Wells, Scott A.; Janse, Jan H.

    2010-01-01

    A large number and wide variety of lake ecosystem models have been developed and published during the past four decades. We identify two challenges for making further progress in this field. One such challenge is to avoid developing more models largely following the concept of others ('reinventing the wheel'). The other challenge is to avoid focusing on only one type of model, while ignoring new and diverse approaches that have become available ('having tunnel vision'). In this paper, we aim at improving the awareness of existing models and knowledge of concurrent approaches in lake ecosystem modelling, without covering all possible model tools and avenues. First, we present a broad variety of modelling approaches. To illustrate these approaches, we give brief descriptions of rather arbitrarily selected sets of specific models. We deal with static models (steady state and regression models), complex dynamic models (CAEDYM, CE-QUAL-W2, Delft 3D-ECO, LakeMab, LakeWeb, MyLake, PCLake, PROTECH, SALMO), structurally dynamic models and minimal dynamic models. We also discuss a group of approaches that could all be classified as individual based: super-individual models (Piscator, Charisma), physiologically structured models, stage-structured models and trait-based models. We briefly mention genetic algorithms, neural networks, Kalman filters and fuzzy logic. Thereafter, we zoom in, as an in-depth example, on the multi-decadal development and application of the lake ecosystem model PCLake and related models (PCLake Metamodel, Lake Shira Model, IPH-TRIM3D-PCLake). In the discussion, we argue that while the historical development of each approach and model is understandable given its 'leading principle', there are many opportunities for combining approaches. We take the point of view that a single 'right' approach does not exist and should not be strived for. Instead, multiple modelling approaches, applied concurrently to a given problem, can help develop an integrative view on the functioning of lake ecosystems. We end with a set of specific recommendations that may be of help in the further development of lake ecosystem models.

  11. Alternatives to the fish early life-stage test: Developing a conceptual model for early fish development

    EPA Science Inventory

    Chronic fish toxicity is a key parameter for hazard classification and environmental risk assessment of chemicals, and the OECD 210 fish early life-stage (FELS) test is the primary guideline test used for various international regulatory programs. There exists a need to develop ...

  12. Women, Human Development, and Learning. ERIC Digest.

    ERIC Educational Resources Information Center

    Kerka, Sandra

    A growing body of literature is questioning whether existing models of human development apply equally to men and women. Prevailing theories of human development have been criticized for being based on research with primarily male subjects of similar ethnic, racial, or class backgrounds. Some research supports the viewpoint that women have…

  13. An Industry-Sponsored, School-Focused Model for Continuing Professional Development of Technology Teachers

    ERIC Educational Resources Information Center

    Engelbrecht, Werner; Ankiewicz, Piet; de Swardt, Estelle

    2007-01-01

    Traditionally a divide has existed between faculties of education at higher education institutions (HEIs) and trade and industry, but the business sector is increasingly buying into community development with corporate social investment, especially regarding technology education. We report on a continuing professional teacher development (CPTD)…

  14. Rethinking biology instruction: The application of DNR-based instruction to the learning and teaching of biology

    NASA Astrophysics Data System (ADS)

    Maskiewicz, April Lee

    Educational studies report that secondary and college level students have developed only limited understandings of the most basic biological processes and their interrelationships from typical classroom experiences. Furthermore, students have developed undesirable reasoning schemes and beliefs that directly affect how they make sense of and account for biological phenomena. For these reasons, there exists a need to rethink instructional practices in biology. This dissertation discusses how the principles of Harel's (1998, 2001) DNR-based instruction in mathematics could be applied to the teaching and learning of biology. DNR is an acronym for the three foundational principles of the system: Duality, Necessity, and Repeated-reasoning. This study examines the application of these three principles to ecology instruction. Through clinical and teaching interviews, I developed models of students' existing ways of understanding in ecology and inferred their ways of thinking. From these models a hypothetical learning trajectory was developed for 16 college level freshmen enrolled in a 10-week ecology teaching experiment. Through cyclical, interpretive analysis I documented and analyzed the evolution of the participants' progress. The results provide empirical evidence to support the claim that the DNR principles are applicable to ecology instruction. With respect to the Duality Principle, helping students develop specific ways of understanding led to the development of model-based reasoning---a way of thinking and the cognitive objective guiding instruction. Through carefully structured problem solving tasks, the students developed a biological understanding of the relationship between matter cycling, energy flow, and cellular processes such as photosynthesis and respiration, and used this understanding to account for observable phenomena in nature. In the case of intellectual necessity, the results illuminate how problem situations can be developed for biology learners that create cognitive disequilibrium-equilibrium phases and thus lead to modification or refinement of existing schemes. Elements that contributed to creating intellectual need include (a) problem tasks that built on students' existing knowledge; (b) problem tasks that challenged students; (c) a routine in which students presented their group's solution to the class; and (d) the didactical contract (Brousseau, 1997) established in the classroom.

  15. Development of Interspecies Correlation Models for Petroleum Hydrocarbons

    EPA Science Inventory

    Estimating the consequences of petroleum products to water column organisms has commonly been hampered by limited acute toxicity data, which exists only for a relatively small number of test species. In this study, we developed petroleum-specific Interspecies Correlation Estimati...

  16. Educational Diagnostic Assessment.

    ERIC Educational Resources Information Center

    Bejar, Isaac I.

    1984-01-01

    Approaches proposed for educational diagnostic assessment are reviewed and identified as deficit assessment and error analysis. The development of diagnostic instruments may require a reexamination of existing psychometric models and development of alternative ones. The psychometric and content demands of diagnostic assessment all but require test…

  17. Multiscale Materials Modeling in an Industrial Environment.

    PubMed

    Weiß, Horst; Deglmann, Peter; In 't Veld, Pieter J; Cetinkaya, Murat; Schreiner, Eduard

    2016-06-07

    In this review, we sketch the materials modeling process in industry. We show that predictive and fast modeling is a prerequisite for successful participation in research and development processes in the chemical industry. Stable and highly automated workflows suitable for handling complex systems are a must. In particular, we review approaches to build and parameterize soft matter systems. By satisfying these prerequisites, efficiency for the development of new materials can be significantly improved, as exemplified here for formulation polymer development. This is in fact in line with recent Materials Genome Initiative efforts sponsored by the US government. Valuable contributions to product development are possible today by combining existing modeling techniques in an intelligent fashion, provided modeling and experiment work hand in hand.

  18. A methodology to select a wire insulation for use in habitable spacecraft.

    PubMed

    Paulos, T; Apostolakis, G

    1998-08-01

    This paper investigates electrical overheating events aboard a habitable spacecraft. The wire insulation involved in these failures plays a major role in the entire event scenario from threat development to detection and damage assessment. Ideally, if models of wire overheating events in microgravity existed, the various wire insulations under consideration could be quantitatively compared. However, these models do not exist. In this paper, a methodology is developed that can be used to select a wire insulation that is best suited for use in a habitable spacecraft. The results of this study show that, based upon the Analytic Hierarchy Process and simplifying assumptions, the criteria selected, and data used in the analysis, Tefzel is better than Teflon for use in a habitable spacecraft.
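
    As a hedged illustration of the Analytic Hierarchy Process step mentioned above, the snippet below derives criterion weights from the principal eigenvector of a pairwise-comparison matrix and scores two candidate insulations; the criteria, judgments and per-criterion scores are invented for demonstration and are not the paper's data.

```python
# Minimal Analytic Hierarchy Process sketch with assumed criteria and judgments.
import numpy as np

# Hypothetical pairwise comparisons among three criteria:
# flammability, arc-tracking resistance, toxicity of combustion products.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Criterion weights = normalized principal eigenvector of the comparison matrix.
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = w / w.sum()

# Hypothetical per-criterion scores (0-1) for two insulations.
scores = {"Tefzel": np.array([0.7, 0.8, 0.6]),
          "Teflon": np.array([0.6, 0.7, 0.7])}
ranking = {name: float(weights @ s) for name, s in scores.items()}
print(ranking)   # higher composite score indicates the preferred insulation
```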

  19. EUROPLANET-RI modelling service for the planetary science community: European Modelling and Data Analysis Facility (EMDAF)

    NASA Astrophysics Data System (ADS)

    Khodachenko, Maxim; Miller, Steven; Stoeckler, Robert; Topf, Florian

    2010-05-01

    Computational modeling and observational data analysis are two major aspects of modern scientific research, and both are nowadays undergoing extensive development and application. Many of the scientific goals of planetary space missions require robust models of planetary objects and environments as well as efficient data analysis algorithms, to predict conditions for mission planning and to interpret the experimental data. Europe has great strength in these areas, but it is insufficiently coordinated; individual groups, models, techniques and algorithms need to be coupled and integrated. The existing level of scientific cooperation and the technical capabilities for operative communication allow considerable progress in the development of a distributed international Research Infrastructure (RI) based on the existing European computational modelling and data analysis centers, providing the scientific community with dedicated services in the fields of their computational and data analysis expertise. These services will appear as a product of the collaborative communication and joint research efforts of the numerical and data analysis experts together with planetary scientists. The major goal of the EUROPLANET-RI / EMDAF is to make computational models and data analysis algorithms associated with particular national RIs and teams, as well as their outputs, more readily available to their potential user community and more tailored to scientific user requirements, without compromising front-line specialized research on model and data analysis algorithm development and software implementation. This objective will be met through four key subdivisions/tasks of EMDAF: 1) an Interactive Catalogue of Planetary Models; 2) a Distributed Planetary Modelling Laboratory; 3) a Distributed Data Analysis Laboratory; and 4) enabling Models and Routines for High Performance Computing Grids. Using the advantages of coordinated operation and efficient communication between the involved computational modelling, research and data analysis expert teams and their related research infrastructures, EMDAF will provide a 1) flexible, 2) scientific-user-oriented, and 3) continuously developing and rapidly upgrading computational and data analysis service to support and intensify European planetary scientific research. At the beginning EMDAF will create a set of demonstrators and operational tests of this service in key areas of European planetary science. This work will aim at the following objectives: (a) Development and implementation of tools for distant interactive communication between the planetary scientists and computing experts (including related RIs); (b) Development of standard routine packages and user-friendly interfaces for operation of the existing numerical codes and data analysis algorithms by the specialized planetary scientists; (c) Development of a prototype of numerical modelling services "on demand" for space missions and planetary researchers; (d) Development of a prototype of data analysis services "on demand" for space missions and planetary researchers; (e) Development of a prototype of coordinated interconnected simulations of planetary phenomena and objects (global multi-model simulators); (f) Providing demonstrators of the coordinated use of high performance computing facilities (super-computer networks), in cooperation with the European HPC Grid DEISA.

  20. Existing and Emerging Payment and Delivery Reforms in Cardiology

    PubMed Central

    Farmer, Steven A.; Darling, Margaret L.; George, Meaghan; Casale, Paul N.; Hagan, Eileen; McClellan, Mark B.

    2017-01-01

    IMPORTANCE Recent health care reforms aim to increase patient access, reduce costs, and improve health care quality as payers turn to payment reform for greater value. Cardiologists need to understand emerging payment models to succeed in the evolving payment landscape. We review existing payment and delivery reforms that affect cardiologists, present 4 emerging examples, and consider their implications for clinical practice. OBSERVATIONS Public and commercial payers have recently implemented payment reforms and new models are evolving. Most cardiology models are modified fee-for-service or address procedural or episodic care, but population models are also emerging. Although there is widespread agreement that payment reform is needed, existing programs have significant limitations and the adoption of new programs has been slow. New payment reforms address some of these problems, but many details remain undefined. CONCLUSIONS AND RELEVANCE Early payment reforms were voluntary and cardiologists’ participation is variable. However, conventional fee-for-service will become less viable, and enrollment in new payment models will be unavoidable. Early participation in new payment models will allow clinicians to develop expertise in new care pathways during a period of relatively lower risk. PMID:27851858

  1. Interactive Reliability Model for Whisker-toughened Ceramics

    NASA Technical Reports Server (NTRS)

    Palko, Joseph L.

    1993-01-01

    Wider use of ceramic matrix composites (CMC) will require the development of advanced structural analysis technologies. The use of an interactive model to predict the time-independent reliability of a component subjected to multiaxial loads is discussed. The deterministic, three-parameter Willam-Warnke failure criterion serves as the theoretical basis for the reliability model. The strength parameters defining the model are assumed to be random variables, thereby transforming the deterministic failure criterion into a probabilistic criterion. The ability of the model to account for multiaxial stress states with the same unified theory is an improvement over existing models. The new model was coupled with a public-domain finite element program through an integrated design program. This allows a design engineer to predict the probability of failure of a component. A simple structural problem is analyzed using the new model, and the results are compared to existing models.
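
    The probabilistic step described above can be illustrated in miniature: a deterministic "stress exceeds strength" check becomes a probability of failure once the strength parameter is treated as a random variable. The placeholder uniaxial criterion and Weibull parameters below are assumptions for demonstration and are not the three-parameter Willam-Warnke criterion used in the paper.

```python
# Minimal Monte Carlo reliability sketch; criterion and parameters are placeholders,
# NOT the Willam-Warnke model described in the record.
import numpy as np

rng = np.random.default_rng(42)
n_samples = 100_000

applied_stress = 180.0                                    # MPa, e.g., from a finite-element solution
tensile_strength = 250.0 * rng.weibull(10.0, n_samples)   # assumed Weibull-distributed strength

# Deterministic check "stress > strength" averaged over the strength distribution.
probability_of_failure = float(np.mean(applied_stress > tensile_strength))
print(f"Estimated probability of failure: {probability_of_failure:.3f}")
```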

  2. An individual-based model of zebrafish population dynamics accounting for energy dynamics.

    PubMed

    Beaudouin, Rémy; Goussen, Benoit; Piccini, Benjamin; Augustine, Starrlight; Devillers, James; Brion, François; Péry, Alexandre R R

    2015-01-01

    Developing population dynamics models for zebrafish is crucial in order to extrapolate from toxicity data measured at the organism level to biological levels relevant to support and enhance ecological risk assessment. To achieve this, a dynamic energy budget for individual zebrafish (DEB model) was coupled to an individual based model of zebrafish population dynamics (IBM model). Next, we fitted the DEB model to new experimental data on zebrafish growth and reproduction thus improving existing models. We further analysed the DEB-model and DEB-IBM using a sensitivity analysis. Finally, the predictions of the DEB-IBM were compared to existing observations on natural zebrafish populations and the predicted population dynamics are realistic. While our zebrafish DEB-IBM model can still be improved by acquiring new experimental data on the most uncertain processes (e.g. survival or feeding), it can already serve to predict the impact of compounds at the population level.

  3. An Individual-Based Model of Zebrafish Population Dynamics Accounting for Energy Dynamics

    PubMed Central

    Beaudouin, Rémy; Goussen, Benoit; Piccini, Benjamin; Augustine, Starrlight; Devillers, James; Brion, François; Péry, Alexandre R. R.

    2015-01-01

    Developing population dynamics models for zebrafish is crucial in order to extrapolate from toxicity data measured at the organism level to biological levels relevant to support and enhance ecological risk assessment. To achieve this, a dynamic energy budget for individual zebrafish (DEB model) was coupled to an individual based model of zebrafish population dynamics (IBM model). Next, we fitted the DEB model to new experimental data on zebrafish growth and reproduction thus improving existing models. We further analysed the DEB-model and DEB-IBM using a sensitivity analysis. Finally, the predictions of the DEB-IBM were compared to existing observations on natural zebrafish populations and the predicted population dynamics are realistic. While our zebrafish DEB-IBM model can still be improved by acquiring new experimental data on the most uncertain processes (e.g. survival or feeding), it can already serve to predict the impact of compounds at the population level. PMID:25938409

  4. Cogeneration computer model assessment: Advanced cogeneration research study

    NASA Technical Reports Server (NTRS)

    Rosenberg, L.

    1983-01-01

    Cogeneration computer simulation models were assessed in order to recommend the most desirable models or model components for use by the Southern California Edison Company (SCE) in evaluating potential cogeneration projects. Existing cogeneration modeling capabilities are described, preferred models are identified, and an approach to the development of a code which will best satisfy SCE requirements is recommended. Five models (CELCAP, COGEN 2, CPA, DEUS, and OASIS) are recommended for further consideration.

  5. Orthogonal model and experimental data for analyzing wood-fiber-based tri-axial ribbed structural panels in bending

    Treesearch

    Jinghao Li; John F. Hunt; Shaoqin Gong; Zhiyong Cai

    2017-01-01

    This paper presents an analysis of 3-dimensional engineered structural panels (3DESP) made from wood-fiber-based laminated paper composites. Since the existing models for calculating the mechanical behavior of core configurations within sandwich panels are very complex, a new simplified orthogonal model (SOM) using an equivalent element has been developed. This model...

  6. Atmospheric prediction model survey

    NASA Technical Reports Server (NTRS)

    Wellck, R. E.

    1976-01-01

    As part of the SEASAT Satellite program of NASA, a survey of representative primitive equation atmospheric prediction models that exist in the world today was written for the Jet Propulsion Laboratory. Seventeen models developed by eleven different operational and research centers throughout the world are included in the survey. The surveys are tutorial in nature describing the features of the various models in a systematic manner.

  7. A University-Industry Cooperation Model for Small and Medium Enterprises: The Case of Chengdu KEDA Optoelectronic Technology Ltd.

    ERIC Educational Resources Information Center

    Peng, Shanzhong; Ferreira, Fernando A. F.; Zheng, He

    2017-01-01

    In this study, we develop a firm-dominated incremental cooperation model. Following the critical review of current literature and various cooperation models, we identified a number of strengths and shortcomings that form the basis for our framework. The objective of our theoretical model is to contribute to overcome the existing gap within…

  8. Toward an Integrated Gender-Linked Model of Aggression Subtypes in Early and Middle Childhood

    ERIC Educational Resources Information Center

    Ostrov, Jamie M.; Godleski, Stephanie A.

    2010-01-01

    An integrative model is proposed for understanding the development of physical and relational aggression in early and middle childhood. The central goal was to posit a new theoretical framework that expands on existing social-cognitive and gender schema models (i.e., Social Information-Processing Model of Children's Adjustment [N. R. Crick & K. A.…

  9. An integrated production-inventory model for food products adopting a general raw material procurement policy

    NASA Astrophysics Data System (ADS)

    Fauza, G.; Prasetyo, H.; Amanto, B. S.

    2018-05-01

    Studies on integrated production-inventory models for deteriorating items have been carried out extensively. Most of these studies define deterioration as physical depletion of some inventory over time. This definition may not represent the deterioration characteristics of food products, whose quality decreases over time while the quantity remains the same. Further, in the existing models, the raw material is replenished several times (or at least once) within one production cycle. In the food industry, however, a company will sometimes benefit more, for several reasons (e.g., seasonal raw materials, discounted prices), if it orders raw materials in a large quantity. Considering these facts, this research aims at developing a more representative inventory model by (i) considering quality losses in food and (ii) adopting a general raw material procurement policy. A mathematical model is established to represent the proposed policy, with the total profit of the system as the objective function. To evaluate the performance of the model, a numerical test was conducted. The test indicates that the developed model performs better, i.e., the total profit is 2.3% higher compared with the existing model.

  10. Evaluating Anthropogenic Carbon Emissions in the Urban Salt Lake Valley through Inverse Modeling: Combining Long-term CO2 Observations and an Emission Inventory using a Multiple-box Atmospheric Model

    NASA Astrophysics Data System (ADS)

    Catharine, D.; Strong, C.; Lin, J. C.; Cherkaev, E.; Mitchell, L.; Stephens, B. B.; Ehleringer, J. R.

    2016-12-01

    The rising level of atmospheric carbon dioxide (CO2), driven by anthropogenic emissions, is the leading cause of enhanced radiative forcing. Increasing societal interest in reducing anthropogenic greenhouse gas emissions calls for a computationally efficient method of evaluating anthropogenic CO2 source emissions, particularly if future mitigation actions are to be developed. A multiple-box atmospheric transport model was constructed in conjunction with a pre-existing fossil fuel CO2 emission inventory to estimate near-surface CO2 mole fractions and the associated anthropogenic CO2 emissions in the Salt Lake Valley (SLV) of northern Utah, a metropolitan area with a population of 1 million. A 15-year multi-site dataset of observed CO2 mole fractions is used in conjunction with the multiple-box model to develop an efficient method to constrain anthropogenic emissions through inverse modeling. Preliminary results of the multiple-box model CO2 inversion indicate that the pre-existing anthropogenic emission inventory may over-estimate CO2 emissions in the SLV. In addition, inversion results displaying a complex spatial and temporal distribution of urban emissions, including the effects of residential development and vehicular traffic, will be discussed.
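
    The inversion concept can be sketched very simply: a box model maps an emission inventory to concentration enhancements, and a least-squares scale factor reconciles modeled and observed values. The single-box formulation, parameter values and synthetic observations below are assumptions for illustration, not the study's multiple-box model.

```python
# Highly simplified inversion sketch; all numbers and the single-box form are invented.
import numpy as np

rng = np.random.default_rng(7)
hours = 24
prior_emissions = 50.0 + 30.0 * np.sin(np.linspace(0, 2 * np.pi, hours))   # arbitrary units
mixing_depth = 500.0 + 400.0 * np.sin(np.linspace(0, np.pi, hours))        # m, diurnal cycle
ventilation_rate = 2.0e-4                                                  # 1/s, assumed

def box_model(emissions):
    """Steady-state enhancement each hour: source strength divided by dilution."""
    return emissions / (mixing_depth * ventilation_rate)

true_scale = 0.8   # pretend the prior inventory is 25% too high
observed = box_model(true_scale * prior_emissions) + rng.normal(0.0, 2.0, hours)

# One unknown scale factor s minimizing ||s*modeled - observed||^2  =>  s = (m.y)/(m.m)
modeled = box_model(prior_emissions)
scale = float(modeled @ observed / (modeled @ modeled))
print(f"Posterior scaling of the prior inventory: {scale:.2f}")   # close to 0.8
```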

  11. Application of Fracture Distribution Prediction Model in Xihu Depression of East China Sea

    NASA Astrophysics Data System (ADS)

    Yan, Weifeng; Duan, Feifei; Zhang, Le; Li, Ming

    2018-02-01

    Logging curves respond differently to changes in formation characteristics, and the presence of fractures produces characteristic outliers. For this reason, the development of fractures in a formation can be characterized by fine analysis of logging curves. Conventional well logs such as resistivity, sonic transit time, density, neutron porosity and gamma ray are the most sensitive to formation fractures. Because the traditional fracture prediction model, which calculates a comprehensive fracture index as a simple weighted average of different logging data, is susceptible to subjective factors and can exhibit large deviations, a statistical method is introduced. Combining the responses of conventional logging data to the development of formation fractures, a prediction model based on membership functions is established; its essence is to analyse logging data with fuzzy mathematics theory. The fracture predictions for a well in the NX block of the Xihu depression obtained with the two models are compared with imaging logging, which shows that the accuracy of the membership-function model is better than that of the traditional model. Furthermore, its predictions are highly consistent with the imaging logs and better reflect the development of fractures. It can provide a reference for engineering practice.
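
    A minimal sketch of the membership-function idea follows: each conventional log response is mapped to a [0, 1] membership value and the values are combined into a composite fracture index. The linear membership form, endpoint values and equal-weight aggregation are illustrative assumptions, not the model calibrated in the paper.

```python
# Illustrative membership-function fracture index; endpoints and weights are assumed.
import numpy as np

def membership(x, fracture_value, background_value):
    """Linear membership: 1 near the fracture-typical response, 0 near background."""
    m = (x - background_value) / (fracture_value - background_value)
    return float(np.clip(m, 0.0, 1.0))

# Hypothetical responses at one depth; all endpoint values are illustrative.
mu_resistivity = membership(30.0, fracture_value=10.0, background_value=100.0)  # ohm*m
mu_sonic = membership(85.0, fracture_value=100.0, background_value=60.0)        # us/ft
mu_density = membership(2.45, fracture_value=2.30, background_value=2.65)       # g/cm3

# Equal-weight aggregation of the memberships into a composite fracture index.
fracture_index = float(np.mean([mu_resistivity, mu_sonic, mu_density]))
print("Composite fracture index:", round(fracture_index, 2))
```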

  12. Development and validation of a stochastic model for potential growth of Listeria monocytogenes in naturally contaminated lightly preserved seafood.

    PubMed

    Mejlholm, Ole; Bøknæs, Niels; Dalgaard, Paw

    2015-02-01

    A new stochastic model for the simultaneous growth of Listeria monocytogenes and lactic acid bacteria (LAB) was developed and validated on data from naturally contaminated samples of cold-smoked Greenland halibut (CSGH) and cold-smoked salmon (CSS). During industrial processing these samples were added acetic and/or lactic acids. The stochastic model was developed from an existing deterministic model including the effect of 12 environmental parameters and microbial interaction (O. Mejlholm and P. Dalgaard, Food Microbiology, submitted for publication). Observed maximum population density (MPD) values of L. monocytogenes in naturally contaminated samples of CSGH and CSS were accurately predicted by the stochastic model based on measured variability in product characteristics and storage conditions. Results comparable to those from the stochastic model were obtained, when product characteristics of the least and most preserved sample of CSGH and CSS were used as input for the existing deterministic model. For both modelling approaches, it was shown that lag time and the effect of microbial interaction needs to be included to accurately predict MPD values of L. monocytogenes. Addition of organic acids to CSGH and CSS was confirmed as a suitable mitigation strategy against the risk of growth by L. monocytogenes as both types of products were in compliance with the EU regulation on ready-to-eat foods. Copyright © 2014 Elsevier Ltd. All rights reserved.
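
    A stochastic prediction of this kind can be sketched as a Monte Carlo over product characteristics feeding a secondary growth-rate model. The simplified gamma-style rate model, parameter values and fixed interaction cap below are assumptions for illustration and do not reproduce the authors' validated model.

```python
# Monte Carlo sketch of stochastic growth prediction; rate model and parameters assumed.
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
temp_c = rng.normal(5.0, 1.0, n)             # storage temperature, deg C (assumed variability)
ph = rng.normal(6.1, 0.1, n)                 # product pH
water_phase_salt = rng.normal(4.0, 0.5, n)   # % water-phase salt

def mu_max(temp, ph_value, wps):
    """Simplified gamma-concept specific growth rate (1/h); each term clipped to [0, 1]."""
    gamma_t = np.clip(((temp + 1.0) / 26.0) ** 2, 0, 1)       # T_min = -1 C (assumed)
    gamma_ph = np.clip((ph_value - 4.9) / (7.0 - 4.9), 0, 1)  # pH_min = 4.9 (assumed)
    gamma_wps = np.clip(1.0 - wps / 12.0, 0, 1)               # WPS_max = 12% (assumed)
    return 0.5 * gamma_t * gamma_ph * gamma_wps

storage_hours = 21 * 24
log10_increase = mu_max(temp_c, ph, water_phase_salt) * storage_hours / np.log(10)
log10_increase = np.minimum(log10_increase, 3.0)   # crude cap mimicking the LAB interaction
print("Median log10 increase:", round(float(np.median(log10_increase)), 1))
print("P(increase > 2 log10):", round(float(np.mean(log10_increase > 2.0)), 2))
```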

  13. Aggregate modeling of fast-acting demand response and control under real-time pricing

    DOE PAGES

    Chassin, David P.; Rondeau, Daniel

    2016-08-24

    This paper develops and assesses the performance of a short-term demand response (DR) model for utility load control with applications to resource planning and control design. Long term response models tend to underestimate short-term demand response when induced by prices. This has two important consequences. First, planning studies tend to undervalue DR and often overlook its benefits in utility demand management program development. Second, when DR is not overlooked, the open-loop DR control gain estimate may be too low. This can result in overuse of load resources, control instability and excessive price volatility. Our objective is therefore to develop a more accurate and better performing short-term demand response model. We construct the model from first principles about the nature of thermostatic load control and show that the resulting formulation corresponds exactly to the Random Utility Model employed in economics to study consumer choice. The model is tested against empirical data collected from field demonstration projects and is shown to perform better than alternative models commonly used to forecast demand in normal operating conditions. Finally, the results suggest that (1) existing utility tariffs appear to be inadequate to incentivize demand response, particularly in the presence of high renewables, and (2) existing load control systems run the risk of becoming unstable if utilities close the loop on real-time prices.
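
    The Random Utility (binary logit) form that the abstract relates the model to can be sketched as a price-dependent curtailment probability; the linear utility specification and all parameter values below are assumptions for illustration, not the calibrated model.

```python
# Logit (Random Utility) curtailment sketch with assumed parameters.
import numpy as np

def curtailment_share(price, reference_price=0.10, price_sensitivity=8.0):
    """Logit probability of curtailing versus consuming at the given price ($/kWh)."""
    utility_difference = price_sensitivity * (price - reference_price)  # curtail minus consume
    return 1.0 / (1.0 + np.exp(-utility_difference))

prices = np.array([0.05, 0.10, 0.20, 0.50])        # $/kWh
connected_load_mw = 100.0
expected_response_mw = connected_load_mw * curtailment_share(prices)
print(np.round(expected_response_mw, 1))           # short-term response grows with price
```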

  14. Aggregate modeling of fast-acting demand response and control under real-time pricing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chassin, David P.; Rondeau, Daniel

    This paper develops and assesses the performance of a short-term demand response (DR) model for utility load control with applications to resource planning and control design. Long term response models tend to underestimate short-term demand response when induced by prices. This has two important consequences. First, planning studies tend to undervalue DR and often overlook its benefits in utility demand management program development. Second, when DR is not overlooked, the open-loop DR control gain estimate may be too low. This can result in overuse of load resources, control instability and excessive price volatility. Our objective is therefore to develop a more accurate and better performing short-term demand response model. We construct the model from first principles about the nature of thermostatic load control and show that the resulting formulation corresponds exactly to the Random Utility Model employed in economics to study consumer choice. The model is tested against empirical data collected from field demonstration projects and is shown to perform better than alternative models commonly used to forecast demand in normal operating conditions. Finally, the results suggest that (1) existing utility tariffs appear to be inadequate to incentivize demand response, particularly in the presence of high renewables, and (2) existing load control systems run the risk of becoming unstable if utilities close the loop on real-time prices.

  15. Aggregate modeling of fast-acting demand response and control under real-time pricing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chassin, David P.; Rondeau, Daniel

    This paper develops and assesses the performance of a short-term demand response (DR) model for utility load control with applications to resource planning and control design. Long term response models tend to underestimate short-term demand response when induced by prices. This has two important consequences. First, planning studies tend to undervalue DR and often overlook its benefits in utility demand management program development. Second, when DR is not overlooked, the open-loop DR control gain estimate may be too low. This can result in overuse of load resources, control instability and excessive price volatility. Our objective is therefore to develop a more accurate and better performing short-term demand response model. We construct the model from first principles about the nature of thermostatic load control and show that the resulting formulation corresponds exactly to the Random Utility Model employed in economics to study consumer choice. The model is tested against empirical data collected from field demonstration projects and is shown to perform better than alternative models commonly used to forecast demand in normal operating conditions. The results suggest that (1) existing utility tariffs appear to be inadequate to incentivize demand response, particularly in the presence of high renewables, and (2) existing load control systems run the risk of becoming unstable if utilities close the loop on real-time prices.

  16. BIM authoring for an image-based bridge maintenance system of existing cable-supported bridges

    NASA Astrophysics Data System (ADS)

    Dang, N. S.; Shim, C. S.

    2018-04-01

    Infrastructure is increasingly becoming the main backbone of metropolitan development. Along with the rise of new facilities, the demand for maintenance of existing bridges is indispensable. Recently, the term "preventive maintenance" has become familiar to engineers; in practice it means the use of a bridge maintenance system (BMS) based on a BIM-oriented model. In this paper, the process of generating a BMS based on a BIM model is introduced in detail. Data management for this BMS is separated into two modules: a site inspection system and an information management system. The noteworthy aspect of this model lies in the closed and automatic real-time process of "capture image, generate the technical damage report, and upload/feed back to the BMS". A pilot BMS for a cable-supported bridge is presented, which showed good performance and potential for further development of preventive maintenance.

  17. Numerical studies of nonlinear ultrasonic guided waves in uniform waveguides with arbitrary cross sections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zuo, Peng; Fan, Zheng, E-mail: ZFAN@ntu.edu.sg; Zhou, Yu

    2016-07-15

    Nonlinear guided waves have been investigated widely in simple geometries, such as plates, pipes and shells, where analytical solutions have been developed. This paper extends the application of nonlinear guided waves to waveguides with arbitrary cross sections. The criteria for the existence of nonlinear guided waves were summarized based on the finite deformation theory and nonlinear material properties. Numerical models were developed for the analysis of nonlinear guided waves in complex geometries, including a nonlinear Semi-Analytical Finite Element (SAFE) method to identify internal resonant modes in complex waveguides, and Finite Element (FE) models to simulate the nonlinear wave propagation at resonant frequencies. Two examples, an aluminum plate and a steel rectangular bar, were studied using the proposed numerical model, demonstrating the existence of nonlinear guided waves in such structures and the energy transfer from primary to secondary modes.

  18. Metadata mapping and reuse in caBIG™

    PubMed Central

    Kunz, Isaac; Lin, Ming-Chin; Frey, Lewis

    2009-01-01

    Background This paper proposes that interoperability across biomedical databases can be improved by utilizing a repository of Common Data Elements (CDEs), UML model class-attributes and simple lexical algorithms to facilitate the building of domain models. This is examined in the context of an existing system, the National Cancer Institute (NCI)'s cancer Biomedical Informatics Grid (caBIG™). The goal is to demonstrate the deployment of open source tools that can be used to effectively map models and enable the reuse of existing information objects and CDEs in the development of new models for translational research applications. This effort is intended to help developers reuse appropriate CDEs to enable interoperability of their systems when developing within the caBIG™ framework or other frameworks that use metadata repositories. Results The Dice (di-grams) and Dynamic algorithms are compared, and both algorithms have similar performance matching UML model class-attributes to CDE class object-property pairs. With the algorithms used, the baselines for automatically finding matches are reasonable for the data models examined. This suggests that automatic mapping of UML models and CDEs is feasible within the caBIG™ framework and potentially any framework that uses a metadata repository. Conclusion This work opens up the possibility of using mapping algorithms to reduce the cost and time required to map local data models to a reference data model such as those used within caBIG™. This effort contributes to facilitating the development of interoperable systems within caBIG™ as well as other metadata frameworks. Such efforts are critical to address the need to develop systems to handle enormous amounts of diverse data that can be leveraged from new biomedical methodologies. PMID:19208192
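
    The Dice (bigram) matching mentioned above can be sketched in a few lines: names are reduced to character bigrams and candidate CDEs are ranked by Dice similarity. The example names and the simple normalization are illustrative assumptions; this is not the caBIG tooling itself.

```python
# Bigram-based Dice similarity for matching a UML class-attribute name to CDE names.
def bigrams(text: str) -> set:
    text = text.lower().replace("_", " ").replace(".", " ")
    return {text[i:i + 2] for i in range(len(text) - 1)}

def dice(a: str, b: str) -> float:
    ba, bb = bigrams(a), bigrams(b)
    if not ba or not bb:
        return 0.0
    return 2.0 * len(ba & bb) / (len(ba) + len(bb))

# Hypothetical attribute and candidate CDE names, for illustration only.
uml_attribute = "Patient.dateOfBirth"
candidate_cdes = ["Person Birth Date", "Patient Gender Category", "Specimen Collection Date"]

for cde in sorted(candidate_cdes, key=lambda c: dice(uml_attribute, c), reverse=True):
    print(f"{dice(uml_attribute, cde):.2f}  {cde}")
```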

  19. Alternative alignments development and evaluation for the US 220 project in Maryland.

    DOT National Transportation Integrated Search

    2011-06-01

    This project aims to find the preferred alternative alignments for the Maryland section of existing US 220, using the highway : alignment optimization (HAO) model. The model was used to explore alternative alignments within a 4,000 foot-wide buffer o...

  20. Development of a Human Physiologically Based Pharmacokinetics (PBPK) Model For Dermal Permeability for Lindane

    EPA Science Inventory

    Lindane is a neurotoxicant used for the treatment of lice and scabies present on human skin. Due to its pharmaceutical application, an extensive pharmacokinetic database exists in humans. Mathematical diffusion models allow for calculation of lindane skin permeability coefficient...

  1. DEVELOPING SEASONAL AMMONIA EMISSION ESTIMATES WITH AN INVERSE MODELING TECHNIQUE

    EPA Science Inventory

    Significant uncertainty exists in magnitude and variability of ammonia (NH3) emissions, which are needed for air quality modeling of aerosols and deposition of nitrogen compounds. Approximately 85% of NH3 emissions are estimated to come from agricultural non-point sources. We sus...

  2. Yield Model Development (YMD) implementation plan for fiscal years 1981 and 1982

    NASA Technical Reports Server (NTRS)

    Ambroziak, R. A. (Principal Investigator)

    1981-01-01

    A plan is described for supporting USDA crop production forecasting and estimation by (1) testing, evaluating, and selecting crop yield models for application testing; (2) identifying areas of feasible research for improvement of models; and (3) conducting research to modify existing models and to develop new crop yield assessment methods. Tasks to be performed for each of these efforts are described as well as for project management and support. The responsibilities of USDA, USDC, USDI, and NASA are delineated as well as problem areas to be addressed.

  3. Habitat Suitability Index Models: Red king crab

    USGS Publications Warehouse

    Jewett, Stephen C.; Onuf, Christopher P.

    1988-01-01

    A review and synthesis of existing information were used to develop a Habitat Suitability Index (HSI) model for evaluating habitat of different life stages of red king crab (Paralithodes camtschatica). A model consolidates habitat use information into a framework appropriate for field application, and is scaled to produce an index between 0.0 (unsuitable habitat) and 1.0 (optimum habitat) in Alaskan coastal waters, especially in the Gulf of Alaska and the southeastern Bering Sea. HSI models are designed to be used with Habitat Evaluation Procedures previously developed by the U.S. Fish and Wildlife Service.

  4. Straddling Interdisciplinary Seams: Working Safely in the Field, Living Dangerously With a Model

    NASA Astrophysics Data System (ADS)

    Light, B.; Roberts, A.

    2016-12-01

    Many excellent proposals for observational work have included language detailing how the proposers will appropriately archive their data and publish their results in peer-reviewed literature so that they may be readily available to the modeling community for parameterization development. While such division of labor may be both practical and inevitable, the assimilation of observational results and the development of observationally-based parameterizations of physical processes require care and feeding. Key questions include: (1) Is an existing parameterization accurate, consistent, and general? If not, it may be ripe for additional physics. (2) Do there exist functional working relationships between human modeler and human observationalist? If not, one or more may need to be initiated and cultivated. (3) If empirical observation and model development are a chicken/egg problem, how, given our lack of prescience and foreknowledge, can we better design observational science plans to meet the eventual demands of model parameterization? (4) Will the addition of new physics "break" the model? If so, then the addition may be imperative. In the context of these questions, we will make retrospective and forward-looking assessments of a now-decade-old numerical parameterization to treat the partitioning of solar energy at the Earth's surface where sea ice is present. While this so called "Delta-Eddington Albedo Parameterization" is currently employed in the widely-used Los Alamos Sea Ice Model (CICE) and appears to be standing the tests of accuracy, consistency, and generality, we will highlight some ideas for its ongoing development and improvement.

  5. Model Guided Design and Development Process for an Electronic Health Record Training Program

    PubMed Central

    He, Ze; Marquard, Jenna; Henneman, Elizabeth

    2016-01-01

    Effective user training is important to ensure electronic health record (EHR) implementation success. Though many previous studies report best practice principles and success and failure stories, current EHR training is largely empirically-based and often lacks theoretical guidance. In addition, the process of training development is underemphasized and underreported. A white paper by the American Medical Informatics Association called for models of user training for clinical information system implementation; existing instructional development models from learning theory provide a basis to meet this call. We describe in this paper our experiences and lessons learned as we adapted several instructional development models to guide our development of EHR user training. Specifically, we focus on two key aspects of this training development: training content and training process. PMID:28269940

  6. Use of Animal Models to Develop Antiaddiction Medications

    PubMed Central

    Gardner, Eliot L.

    2008-01-01

    Although addiction is a uniquely human phenomenon, some of its pathognomonic features can be modeled at the animal level. Such features include the euphoric “high” produced by acute administration of addictive drugs; the dysphoric “crash” produced by acute withdrawal; drug-seeking and drug-taking behaviors; and relapse to drug-seeking behavior after achieving successful abstinence. Animal models exist for each of these features. In this review, I focus on various animal models of addiction and how they can be used to search for clinically effective antiaddiction medications. I conclude by noting some of the new and novel medications that have been developed preclinically using such models and the hope for further developments along such lines. PMID:18803910

  7. Challenges in Developing Models Describing Complex Soil Systems

    NASA Astrophysics Data System (ADS)

    Simunek, J.; Jacques, D.

    2014-12-01

    Quantitative mechanistic models that consider basic physical, mechanical, chemical, and biological processes have the potential to be powerful tools to integrate our understanding of complex soil systems, and the soil science community has often called for models that would include a large number of these diverse processes. However, once attempts have been made to develop such models, the response from the community has not always been overwhelming, especially after it discovered that these models are consequently highly complex. They require not only a large number of parameters, not all of which can be easily (or at all) measured and/or identified, and which are often associated with large uncertainties, but also deep knowledge from their users of all or most of the implemented physical, mechanical, chemical, and biological processes. The real, or perceived, complexity of these models then discourages users from applying them even for relatively simple applications, for which they would be perfectly adequate. Due to the nonlinear nature and chemical/biological complexity of soil systems, it is also virtually impossible to verify these types of models analytically, raising doubts about their applicability. Code inter-comparison, which is then likely the most suitable method to assess code capabilities and model performance, requires the existence of multiple models of similar or overlapping capabilities, which may not always exist. It is thus a challenge not only to develop models describing complex soil systems, but also to persuade the soil science community to use them. As a result, complex quantitative mechanistic models remain an underutilized tool in soil science research. We will demonstrate some of the challenges discussed above using our own efforts in developing quantitative mechanistic models (such as HP1/2) for complex soil systems.

  8. Teachers' Attitudes toward Reporting Child Sexual Abuse: Problems with Existing Research Leading to New Scale Development

    ERIC Educational Resources Information Center

    Walsh, Kerryann; Rassafiani, Mehdi; Mathews, Ben; Farrell, Ann; Butler, Des

    2010-01-01

    This paper details a systematic literature review identifying problems in extant research relating to teachers' attitudes toward reporting child sexual abuse and offers a model for new attitude scale development and testing. Scale development comprised a five-phase process grounded in contemporary attitude theories, including (a) developing the…

  9. The stock-flow model of spatial data infrastructure development refined by fuzzy logic.

    PubMed

    Abdolmajidi, Ehsan; Harrie, Lars; Mansourian, Ali

    2016-01-01

    The system dynamics technique has been demonstrated to be a proper method by which to model and simulate the development of spatial data infrastructures (SDI). An SDI is a collaborative effort to manage and share spatial data at different political and administrative levels. It is comprised of various dynamically interacting quantitative and qualitative (linguistic) variables. To incorporate linguistic variables and their joint effects in an SDI-development model more effectively, we suggest employing fuzzy logic. Not all fuzzy models are able to model the dynamic behavior of SDIs properly. Therefore, this paper aims to investigate different fuzzy models and their suitability for modeling SDIs. To that end, two inference and two defuzzification methods were used for the fuzzification of the joint effect of two variables in an existing SDI model. The results show that the Average-Average inference and Center of Area defuzzification can better model the dynamics of SDI development.
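
    To make the defuzzification step concrete, the sketch below shows a Center of Area (centroid) calculation over a discretized output universe, the method the authors found to work best together with Average-Average inference. The triangular membership function and the reading of the output as a normalized joint-effect strength are illustrative assumptions, not taken from the paper.

      import numpy as np

      def center_of_area(universe, membership):
          """Centroid (Center of Area) defuzzification of an aggregated fuzzy set."""
          membership = np.asarray(membership, dtype=float)
          if membership.sum() == 0.0:
              raise ValueError("empty fuzzy set: all membership degrees are zero")
          return float(np.sum(universe * membership) / np.sum(membership))

      # Illustrative aggregated output set for the joint effect of two linguistic variables
      x = np.linspace(0.0, 1.0, 101)                      # normalized effect strength
      mu = np.maximum(0.0, 1.0 - np.abs(x - 0.6) / 0.2)   # triangular set centred at 0.6
      print(center_of_area(x, mu))                        # crisp joint effect, 0.6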

  10. Paying for quality not quantity: a Wisconsin health maintenance organization proposes an incentive model for reimbursement of chiropractic services.

    PubMed

    Pursel, Kevin J; Jacobson, Martin; Stephenson, Kathy

    2012-07-01

    The purpose of this study is to describe a reimbursement model that was developed by one Health Maintenance Organization (HMO) to transition from fee-for-service to a combination of pay-for-performance and pay-for-reporting reimbursement for chiropractic care. The previous incentive program used by the HMO provided best-practice education and additional reimbursement incentives for achieving the National Committee for Quality Assurance Back Pain Recognition Program (NCQA-BPRP) recognition status. However, this model had not leveled costs between doctors of chiropractic (DCs). Therefore, the HMO management aimed to develop a reimbursement model to incentivize providers to embrace existing best-practice models and report existing quality metrics. The development goals included the following: the model should (1) be as financially predictable as the previous system, (2) cost no more on a per-member basis, (3) meet the coverage needs of its members, and (4) be able to be operationalized. The model should also reward DCs who embraced best practices with compensation not simply tied to providing more procedures. The new program needed to (1) cause little or no disruption in current billing, (2) be grounded in achievable and defined expectations for improvement in quality, and (3) be voluntary, without being unduly punitive, should the DC choose not to participate in the program. The generated model was named the Comprehensive Chiropractic Quality Reimbursement Methodology (CCQRM; pronounced "Quorum"). In this hybrid model, additional reimbursement beyond pay-for-procedures will be based on unique payment interpretations for reporting selected, existing Physician Quality Reporting System (PQRS) codes, meaningful use of electronic health records, and achieving NCQA-BPRP recognition. This model aims to compensate providers using pay-for-performance, pay-for-quality-reporting, and pay-for-procedure methods. The CCQRM reimbursement model was developed to address the current needs of one HMO that aims to transition from fee-for-service to pay-for-performance and quality reporting as the basis for chiropractic reimbursement. This model is theoretically based on the combination of a fee-for-service payment, pay for participation (NCQA Back Pain Recognition Program payment), meaningful use of electronic health record payment, and pay for reporting (PQRS-BPMG payment). Evaluation of this model is needed to determine whether it will achieve its intended goals. Copyright © 2012 National University of Health Sciences. Published by Mosby, Inc. All rights reserved.

  11. A systems biology approach to investigate the antimicrobial activity of oleuropein.

    PubMed

    Li, Xianhua; Liu, Yanhong; Jia, Qian; LaMacchia, Virginia; O'Donoghue, Kathryn; Huang, Zuyi

    2016-12-01

    Oleuropein and its hydrolysis products are olive phenolic compounds that have antimicrobial effects on a variety of pathogens, with the potential to be utilized in food and pharmaceutical products. While the existing research is mainly focused on individual genes or enzymes that are regulated by oleuropein for antimicrobial activities, little work has been done to integrate intracellular genes, enzymes and metabolic reactions for a systematic investigation of antimicrobial mechanism of oleuropein. In this study, the first genome-scale modeling method was developed to predict the system-level changes of intracellular metabolism triggered by oleuropein in Staphylococcus aureus, a common food-borne pathogen. To simulate the antimicrobial effect, an existing S. aureus genome-scale metabolic model was extended by adding the missing nitric oxide reactions, and exchange rates of potassium, phosphate and glutamate were adjusted in the model as suggested by previous research to mimic the stress imposed by oleuropein on S. aureus. The developed modeling approach was able to match S. aureus growth rates with experimental data for five oleuropein concentrations. The reactions with large flux change were identified and the enzymes of fifteen of these reactions were validated by existing research for their important roles in oleuropein metabolism. When compared with experimental data, the up/down gene regulations of 80% of these enzymes were correctly predicted by our modeling approach. This study indicates that the genome-scale modeling approach provides a promising avenue for revealing the intracellular metabolism of oleuropein antimicrobial properties.
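
    The abstract does not spell out how the extended S. aureus model was exercised, but genome-scale metabolic models are typically evaluated with flux balance analysis: maximize a biomass objective subject to steady-state mass balance S·v = 0 and flux bounds. The sketch below shows that computation on a deliberately tiny, made-up stoichiometric matrix; tightening the exchange bounds is the kind of adjustment the authors describe for mimicking oleuropein stress.

      import numpy as np
      from scipy.optimize import linprog

      # Toy stoichiometric matrix S (metabolites x reactions); columns are
      # v0 = uptake, v1 = internal conversion, v2 = biomass (objective) flux.
      S = np.array([[ 1, -1,  0],
                    [ 0,  1, -1]])
      bounds = [(0, 10), (0, 10), (0, 10)]   # flux bounds; tighten to mimic stress

      # Flux balance analysis: maximize v2 subject to S v = 0 and the bounds.
      res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(S.shape[0]),
                    bounds=bounds, method="highs")
      print("optimal biomass flux:", -res.fun, "flux vector:", res.x)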

  12. Improved indexes for targeting placement of buffers of Hortonian runoff

    Treesearch

    M.G. Dosskey; Z. Qiu; M.J. Helmers; D.E. Eisenhauer

    2011-01-01

    Targeting specific locations within agricultural watersheds for installing vegetative buffers has been advocated as a way to enhance the impact of buffers and buffer programs on stream water quality. Existing models for targeting buffers of Hortonian, or infiltration-excess, runoff are not well developed. The objective was to improve on an existing soil survey–based...

  13. Rapid Automated Aircraft Simulation Model Updating from Flight Data

    NASA Technical Reports Server (NTRS)

    Brian, Geoff; Morelli, Eugene A.

    2011-01-01

    Techniques to identify aircraft aerodynamic characteristics from flight measurements and compute corrections to an existing simulation model of a research aircraft were investigated. The purpose of the research was to develop a process enabling rapid automated updating of aircraft simulation models using flight data and apply this capability to all flight regimes, including flight envelope extremes. The process presented has the potential to improve the efficiency of envelope expansion flight testing, revision of control system properties, and the development of high-fidelity simulators for pilot training.

  14. Enhancing Access to Patient Education Information: A Pilot Usability Study

    PubMed Central

    Beaudoin, Denise E.; Rocha, Roberto A.; Tse, Tony

    2005-01-01

    Health care organizations are developing Web-based portals to provide patient access to personal health information and enhance patient-provider communication. This pilot study investigates two navigation models (“serial” and “menu-driven”) for improving access to education materials available through a portal. There was a trend toward greater user satisfaction with the menu-driven model. Model preference was influenced by frequency of Web use. Results should aid in the improvement of existing portals and in the development of new ones. PMID:16779179

  15. Extending enterprise architecture modelling with business goals and requirements

    NASA Astrophysics Data System (ADS)

    Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten

    2011-02-01

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.

  16. Numerical Simulation of Liquid Jet Atomization Including Turbulence Effects

    NASA Technical Reports Server (NTRS)

    Trinh, Huu P.; Chen, C. P.; Balasubramanyam, M. S.

    2005-01-01

    This paper describes the numerical implementation of a newly developed hybrid model, T-blob/T-TAB, into an existing computational fluid dynamics (CFD) program for primary and secondary breakup simulation of liquid jet atomization. This model extends two widely used models, the Kelvin-Helmholtz (KH) instability of Reitz (blob model) and the Taylor-Analogy-Breakup (TAB) secondary droplet breakup model of O'Rourke and Amsden, to include turbulence effects. In the primary breakup model, the level of the turbulence effect on the liquid breakup depends on the characteristic scales and the initial flow conditions. For the secondary breakup, an additional turbulence force acting on parent drops is modeled and integrated into the TAB governing equation. Several assessment studies are presented, and the results indicate that the existing KH and TAB models tend to under-predict the product drop size and spray angle, while the current model provides superior results when compared with the measured data.

  17. Using micro-simulation to investigate the safety impacts of transit design alternatives at signalized intersections.

    PubMed

    Li, Lu; Persaud, Bhagwant; Shalaby, Amer

    2017-03-01

    This study investigates the use of crash prediction models and micro-simulation to develop an effective surrogate safety assessment measure at the intersection level. With the use of these tools, hypothetical scenarios can be developed and explored to evaluate the safety impacts of design alternatives in a controlled environment, in which factors not directly associated with the design alternatives can be fixed. Micro-simulation models are developed, calibrated, and validated. Traffic conflicts in the micro-simulation models are estimated and linked with observed crash frequency, which greatly alleviates the lengthy time needed to collect sufficient crash data for evaluating alternatives, due to the rare and infrequent nature of crash events. A set of generalized linear models with negative binomial error structure is developed to correlate the simulated conflicts with the observed crash frequency in Toronto, Ontario, Canada. Crash prediction models are also developed for crashes of different impact types and for transit-involved crashes. The resulting statistical significance and the goodness-of-fit of the models suggest adequate predictive ability. Based on the established correlation between simulated conflicts and observed crashes, scenarios are developed in the micro-simulation models to investigate the safety effects of individual transit line elements by making hypothetical modifications to such elements and estimating changes in crash frequency from the resulting changes in conflicts. The findings imply that the existing transit signal priority schemes can have a negative effect on safety performance, and that the existing near-side stop positioning and streetcar transit type can be safer at their current state than if they were to be replaced by their respective counterparts. Copyright © 2017 Elsevier Ltd. All rights reserved.
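
    The core statistical link the study relies on, a negative binomial GLM relating simulated conflicts to observed crash frequency, can be sketched as below. The intersection data, the single log(conflicts) predictor, and the fixed dispersion parameter are all illustrative; the paper's models also distinguish impact types and transit involvement.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      # Illustrative intersection-level data: simulated conflicts and observed crashes
      df = pd.DataFrame({
          "conflicts": [12, 30, 45, 8, 60, 22, 75, 15, 40, 55],
          "crashes":   [ 1,  3,  4, 0,  6,  2,  8,  1,  4,  5],
      })

      # Negative binomial GLM: E[crashes] = exp(b0 + b1 * ln(conflicts))
      X = sm.add_constant(np.log(df["conflicts"]))
      model = sm.GLM(df["crashes"], X, family=sm.families.NegativeBinomial(alpha=1.0))
      fit = model.fit()
      print(fit.summary())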

  18. Investigation of Stimulation-Response Relationships for Complex Fracture Systems in Enhanced Geothermal Reservoirs

    DOE Data Explorer

    Fu, Pengcheng; Johnson, Scott M.; Carrigan, Charles R.

    2011-01-01

    Hydraulic fracturing is currently the primary method for stimulating low-permeability geothermal reservoirs and creating Enhanced (or Engineered) Geothermal Systems (EGS) with improved permeability and heat production efficiency. Complex natural fracture systems usually exist in the formations to be stimulated and it is therefore critical to understand the interactions between existing fractures and newly created fractures before optimal stimulation strategies can be developed. Our study aims to improve the understanding of EGS stimulation-response relationships by developing and applying computer-based models that can effectively reflect the key mechanisms governing interactions between complex existing fracture networks and newly created hydraulic fractures. In this paper, we first briefly describe the key modules of our methodology, namely a geomechanics solver, a discrete fracture flow solver, a rock joint response model, an adaptive remeshing module, and most importantly their effective coupling. After verifying the numerical model against classical closed-form solutions, we investigate responses of reservoirs with different preexisting natural fractures to a variety of stimulation strategies. The factors investigated include: the in situ stress states (orientation of the principal stresses and the degree of stress anisotropy), pumping pressure, and stimulation sequences of multiple wells.

  19. A system level model for preliminary design of a space propulsion solid rocket motor

    NASA Astrophysics Data System (ADS)

    Schumacher, Daniel M.

    Preliminary design of space propulsion solid rocket motors entails a combination of components and subsystems. Expert design tools exist to find near-optimal performance of subsystems and components. Conversely, there is no system-level preliminary design process for space propulsion solid rocket motors that is capable of synthesizing customer requirements into a high-utility design for the customer. The preliminary design process for space propulsion solid rocket motors typically builds on existing designs and pursues a feasible rather than the most favorable design. Classical optimization is an extremely challenging method when dealing with the complex behavior of an integrated system. The complexity and combinations of system configurations make the number of design parameters to be traded off unreasonable when manual techniques are used. Existing multi-disciplinary optimization approaches generally address estimating ratios and correlations rather than utilizing mathematical models. The developed system-level model utilizes the Genetic Algorithm to perform the necessary population searches to efficiently replace the human iterations required during a typical solid rocket motor preliminary design. This research augments, automates, and increases the fidelity of the existing preliminary design process for space propulsion solid rocket motors. The system-level aspect of this preliminary design process, and the ability to synthesize space propulsion solid rocket motor requirements into a near-optimal design, is achievable. The process of developing the motor performance estimate and the system-level model of a space propulsion solid rocket motor is described in detail. The results of this research indicate that the model is valid for use and able to manage a very large number of variable inputs and constraints in pursuit of the best possible design.

  20. Biomechanics of Early Cardiac Development

    PubMed Central

    Goenezen, Sevan; Rennie, Monique Y.

    2012-01-01

    Biomechanics affect early cardiac development, from looping to the development of chambers and valves. Hemodynamic forces are essential for proper cardiac development, and their disruption leads to congenital heart defects. A wealth of information already exists on early cardiac adaptations to hemodynamic loading, and new technologies, including high resolution imaging modalities and computational modeling, are enabling a more thorough understanding of relationships between hemodynamics and cardiac development. Imaging and modeling approaches, used in combination with biological data on cell behavior and adaptation, are paving the road for new discoveries on links between biomechanics and biology and their effect on cardiac development and fetal programming. PMID:22760547

  1. Natural products for pest control: an analysis of their role, value and future.

    PubMed

    Gerwick, B Clifford; Sparks, Thomas C

    2014-08-01

    Natural products (NPs) have long been used as pesticides and have broadly served as a source of inspiration for a great many commercial synthetic organic fungicides, herbicides and insecticides that are in the market today. In light of the continuing need for new tools to address an ever-changing array of fungal, weed and insect pests, NPs continue to be a source of models and templates for the development of new pest control agents. Interestingly, an examination of the literature suggests that NP models exist for many of the pest control agents that were discovered by other means, suggesting that, had circumstances been different, these NPs could have served as inspiration for the discovery of a great many more of today's pest control agents. Here, an attempt is made to answer questions regarding the existence of an NP model for existing classes of pesticides and what is needed for the discovery of new NPs and NP models for pest control agents. © 2014 Society of Chemical Industry.

  2. Development of a human adaptive immune system in cord blood cell-transplanted mice.

    PubMed

    Traggiai, Elisabetta; Chicha, Laurie; Mazzucchelli, Luca; Bronz, Lucio; Piffaretti, Jean-Claude; Lanzavecchia, Antonio; Manz, Markus G

    2004-04-02

    Because ethical restrictions limit in vivo studies of the human hemato-lymphoid system, substitute human to small animal xenotransplantation models have been employed. Existing models, however, sustain only limited development and maintenance of human lymphoid cells and rarely produce immune responses. Here we show that intrahepatic injection of CD34+ human cord blood cells into conditioned newborn Rag2-/-gammac-/- mice leads to de novo development of B, T, and dendritic cells; formation of structured primary and secondary lymphoid organs; and production of functional immune responses. This provides a valuable model to study development and function of the human adaptive immune system in vivo.

  3. Returners and explorers dichotomy in human mobility

    PubMed Central

    Pappalardo, Luca; Simini, Filippo; Rinzivillo, Salvatore; Pedreschi, Dino; Giannotti, Fosca; Barabási, Albert-László

    2015-01-01

    The availability of massive digital traces of human whereabouts has offered a series of novel insights on the quantitative patterns characterizing human mobility. In particular, numerous recent studies have led to an unexpected consensus: the considerable variability in the characteristic travelled distance of individuals coexists with a high degree of predictability of their future locations. Here we shed light on this surprising coexistence by systematically investigating the impact of recurrent mobility on the characteristic distance travelled by individuals. Using both mobile phone and GPS data, we discover the existence of two distinct classes of individuals: returners and explorers. As existing models of human mobility cannot explain the existence of these two classes, we develop more realistic models able to capture the empirical findings. Finally, we show that returners and explorers play a distinct quantifiable role in spreading phenomena and that a correlation exists between their mobility patterns and social interactions. PMID:26349016

  4. Ferroelectric Material Application: Modeling Ferroelectric Field Effect Transistor Characteristics from Micro to Nano

    NASA Technical Reports Server (NTRS)

    MacLeod, Todd, C.; Ho, Fat Duen

    2006-01-01

    All present ferroelectric transistors have been made on the micrometer scale. Existing models of these devices do not take into account effects of nanoscale ferroelectric transistors. Understanding the characteristics of these nanoscale devices is important in developing a strategy for building and using future devices. This paper takes an existing microscale ferroelectric field effect transistor (FFET) model and adds effects that become important at the nanoscale level, including electron velocity saturation and direct tunneling. The new model analyzed FFETs ranging in length from 40,000 nanometers to 4 nanometers and in ferroelectric thickness from 200 nanometers to 1 nanometer. The results show that FFETs can operate on the nanoscale but have some undesirable characteristics at very small dimensions.
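
    One of the nanoscale effects the authors add, electron velocity saturation, can be illustrated with the standard empirical mobility-degradation form shown below; the silicon constants are approximate, and the ferroelectric polarization and direct-tunneling parts of the FFET model are not reproduced here.

      # Carrier drift velocity with saturation (a standard empirical form; the paper's
      # FFET model combines this kind of effect with ferroelectric behavior not shown).
      MU = 0.14      # low-field electron mobility in Si, m^2/(V*s) (approximate)
      V_SAT = 1.0e5  # saturation velocity, m/s (approximate)

      def drift_velocity(e_field):
          """Drift velocity vs. lateral field E: v = mu*E / (1 + mu*E/v_sat)."""
          return MU * e_field / (1.0 + MU * e_field / V_SAT)

      for length_nm in (40000, 400, 40, 4):
          e = 1.0 / (length_nm * 1e-9)   # field for 1 V across the channel, V/m
          print(length_nm, "nm channel ->", drift_velocity(e), "m/s")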

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haney, Thomas Jay

    This report documents the Data Quality Objectives (DQOs) developed for the Idaho National Laboratory (INL) Site ambient air surveillance program. The development of the DQOs was based on the seven-step process recommended “for systematic planning to generate performance and acceptance criteria for collecting environmental data” (EPA 2006). The process helped to determine the type, quantity, and quality of data needed to meet current regulatory requirements and to follow U.S. Department of Energy guidance for environmental surveillance air monitoring design. It also considered the current air monitoring program that has existed at the INL Site since the 1950s. The development of the DQOs involved the application of the atmospheric dispersion model CALPUFF to identify likely contamination dispersion patterns at and around the INL Site using site-specific meteorological data. Model simulations were used to quantitatively assess the probable frequency of detection of airborne radionuclides released by INL Site facilities using existing and proposed air monitors.

  6. A development framework for artificial intelligence based distributed operations support systems

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.; Cottman, Bruce H.

    1990-01-01

    Advanced automation is required to reduce costly human operations support requirements for complex space-based and ground control systems. Existing knowledge-based technologies have been used successfully to automate individual operations tasks. Considerably less progress has been made in integrating and coordinating multiple operations applications for unified intelligent support systems. To fill this gap, SOCIAL, a tool set for developing Distributed Artificial Intelligence (DAI) systems, is being constructed. SOCIAL consists of three primary language-based components defining: models of interprocess communication across heterogeneous platforms; models of interprocess coordination, concurrency control, and fault management; and models for accessing heterogeneous information resources. DAI application subsystems, either new or existing, will access these distributed services non-intrusively, via high-level message-based protocols. SOCIAL will reduce the complexity of distributed communications, control, and integration, enabling developers to concentrate on the design and functionality of the target DAI system itself.

  7. Longitudinal driver model and collision warning and avoidance algorithms based on human driving databases

    NASA Astrophysics Data System (ADS)

    Lee, Kangwon

    Intelligent vehicle systems, such as Adaptive Cruise Control (ACC) or Collision Warning/Collision Avoidance (CW/CA), are currently under development, and several companies have already offered ACC on selected models. Control or decision-making algorithms of these systems are commonly evaluated under extensive computer simulations and well-defined scenarios on test tracks. However, they have rarely been validated with large quantities of naturalistic human driving data. This dissertation utilized two University of Michigan Transportation Research Institute databases (Intelligent Cruise Control Field Operational Test and System for Assessment of Vehicle Motion Environment) in the development and evaluation of longitudinal driver models and CW/CA algorithms. First, to examine how drivers normally follow other vehicles, the vehicle motion data from the databases were processed using a Kalman smoother. The processed data was then used to fit and evaluate existing longitudinal driver models (e.g., the linear follow-the-leader model, the Newell's special model, the nonlinear follow-the-leader model, the linear optimal control model, the Gipps model and the optimal velocity model). A modified version of the Gipps model was proposed and found to be accurate in both microscopic (vehicle) and macroscopic (traffic) senses. Second, to examine emergency braking behavior and to evaluate CW/CA algorithms, the concepts of signal detection theory and a performance index suitable for unbalanced situations (few threatening data points vs. many safe data points) are introduced. Selected existing CW/CA algorithms were found to have a performance index (geometric mean of true-positive rate and precision) not exceeding 20%. To optimize the parameters of the CW/CA algorithms, a new numerical optimization scheme was developed to replace the original data points with their representative statistics. A new CW/CA algorithm was proposed, which was found to score higher than 55% in the performance index. This dissertation provides a model of how drivers follow lead-vehicles that is much more accurate than other models in the literature. Furthermore, the data-based approach was used to confirm that a CW/CA algorithm utilizing lead-vehicle braking was substantially more effective than existing algorithms, leading to collision warning systems that are much more likely to contribute to driver safety.
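
    The performance index used to score the CW/CA algorithms, described in the abstract as the geometric mean of true-positive rate and precision, is simple to compute; the sketch below is a minimal implementation with made-up detection counts.

      import math

      def warning_performance_index(tp, fp, fn):
          """Geometric mean of true-positive rate (recall) and precision,
          suited to unbalanced data (few threats vs. many safe samples)."""
          recall = tp / (tp + fn) if (tp + fn) else 0.0
          precision = tp / (tp + fp) if (tp + fp) else 0.0
          return math.sqrt(recall * precision)

      # Example: 40 correctly flagged threats, 60 false alarms, 10 missed threats
      print(warning_performance_index(tp=40, fp=60, fn=10))   # ~0.57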

  8. Physically based estimation of soil water retention from textural data: General framework, new models, and streamlined existing models

    USGS Publications Warehouse

    Nimmo, J.R.; Herkelrath, W.N.; Laguna, Luna A.M.

    2007-01-01

    Numerous models are in widespread use for the estimation of soil water retention from more easily measured textural data. Improved models are needed for better prediction and wider applicability. We developed a basic framework from which new and existing models can be derived to facilitate improvements. Starting from the assumption that every particle has a characteristic dimension R associated uniquely with a matric pressure ψ and that the form of the ψ-R relation is the defining characteristic of each model, this framework leads to particular models by specification of geometric relationships between pores and particles. Typical assumptions are that particles are spheres, pores are cylinders with volume equal to the associated particle volume times the void ratio, and that the capillary inverse proportionality between radius and matric pressure is valid. Examples include fixed-pore-shape and fixed-pore-length models. We also developed alternative versions of the model of Arya and Paris that eliminate its interval-size dependence and other problems. The alternative models are calculable by direct application of algebraic formulas rather than manipulation of data tables and intermediate results, and they easily combine with other models (e.g., incorporating structural effects) that are formulated on a continuous basis. Additionally, we developed a family of models based on the same pore geometry as the widely used unsaturated hydraulic conductivity model of Mualem. Predictions of measurements for different suitable media show that some of the models provide consistently good results and can be chosen based on ease of calculations and other factors. © Soil Science Society of America. All rights reserved.
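
    A minimal sketch of the capillary assumption named in the abstract, the inverse proportionality between radius and matric pressure, is given below using the Young-Laplace relation for a water-filled cylindrical pore with zero contact angle; the mapping from particle dimension R to an effective pore radius is model-specific and not shown.

      # Capillary inverse proportionality between pore radius and matric pressure
      # (Young-Laplace, zero contact angle); constants are approximate.
      SIGMA = 0.072   # surface tension of water against air, N/m (~25 C)

      def matric_pressure(pore_radius_m):
          """Matric pressure magnitude (Pa) at which a cylindrical pore drains."""
          return 2.0 * SIGMA / pore_radius_m

      # A 10-micron effective pore empties at roughly 14.4 kPa of suction
      print(matric_pressure(10e-6))   # ~14400 Pa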

  9. High Reynolds number turbulence model of rotating shear flows

    NASA Astrophysics Data System (ADS)

    Masuda, S.; Ariga, I.; Koyama, H. S.

    1983-09-01

    A Reynolds stress closure model for rotating turbulent shear flows is developed. Special attention is paid to keeping the model constants independent of rotation. First, general forms of the model of a Reynolds stress equation and a dissipation rate equation are derived, the only restrictions of which are high Reynolds number and incompressibility. The model equations are then applied to two-dimensional equilibrium boundary layers and the effects of Coriolis acceleration on turbulence structures are discussed. Comparisons with the experimental data and with previous results in other external force fields show that there exists a very close analogy between centrifugal, buoyancy and Coriolis force fields. Finally, the model is applied to predict the two-dimensional boundary layers on rotating plane walls. Comparisons with existing data confirmed its capability of predicting mean and turbulent quantities without employing any empirical relations in rotating fields.

  10. A Multi-layer Dynamic Model for Coordination Based Group Decision Making in Water Resource Allocation and Scheduling

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Zhang, Xingnan; Li, Chenming; Wang, Jianying

    Management of group decision-making is an important issue in water resource management and development. In order to overcome the lack of effective communication and cooperation in existing decision-making models, this paper proposes a multi-layer dynamic model for coordination-based group decision making in water resource allocation and scheduling. By introducing a scheme-recognized cooperative satisfaction index and a scheme-adjusted rationality index, the proposed model can solve the problem of poor convergence of the multi-round decision-making process in water resource allocation and scheduling. Furthermore, the problem of coordinating the group decision-making process under limited resources can be solved based on the effectiveness of distance-based group conflict resolution. The simulation results show that the proposed model has better convergence than the existing models.

  11. An Odds Ratio Approach for Detecting DDF under the Nested Logit Modeling Framework

    ERIC Educational Resources Information Center

    Terzi, Ragip; Suh, Youngsuk

    2015-01-01

    An odds ratio approach (ORA) under the framework of a nested logit model was proposed for evaluating differential distractor functioning (DDF) in multiple-choice items and was compared with an existing ORA developed under the nominal response model. The performances of the two ORAs for detecting DDF were investigated through an extensive…

  12. Aspen succession in the Intermountain West: A deterministic model

    Treesearch

    Dale L. Bartos; Frederick R. Ward; George S. Innis

    1983-01-01

    A deterministic model of succession in aspen forests was developed using existing data and intuition. The degree of uncertainty, which was determined by allowing the parameter values to vary at random within limits, was larger than desired. This report presents results of an analysis of model sensitivity to changes in parameter values. These results have indicated...

  13. Logistic regression models of factors influencing the location of bioenergy and biofuels plants

    Treesearch

    T.M. Young; R.L. Zaretzki; J.H. Perdue; F.M. Guess; X. Liu

    2011-01-01

    Logistic regression models were developed to identify significant factors that influence the location of existing wood-using bioenergy/biofuels plants and traditional wood-using facilities. Logistic models provided quantitative insight for variables influencing the location of woody biomass-using facilities. Availability of "thinnings to a basal area of 31.7m2/ha...
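
    A hedged sketch of the kind of model the study describes, a logistic regression for the presence of a wood-using facility as a function of county-level factors, is shown below; the predictors and data are invented for illustration and do not reflect the paper's variables or estimates.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Illustrative county-level predictors: [woody biomass supply (1000 t/yr),
      # distance to nearest highway (km), population density (people/km^2)]
      X = np.array([[120, 5, 40], [15, 30, 10], [200, 2, 80], [60, 12, 25],
                    [10, 45,  5], [180, 3, 60], [25, 25, 15], [140, 8, 35]])
      y = np.array([1, 0, 1, 0, 0, 1, 0, 1])   # 1 = existing plant located in county

      model = LogisticRegression().fit(X, y)
      print("coefficients:", model.coef_, "intercept:", model.intercept_)
      print("P(plant) for a new county:", model.predict_proba([[100, 10, 30]])[0, 1])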

  14. Dynamics of a stochastic tuberculosis model with constant recruitment and varying total population size

    NASA Astrophysics Data System (ADS)

    Liu, Qun; Jiang, Daqing; Shi, Ningzhong; Hayat, Tasawar; Alsaedi, Ahmed

    2017-03-01

    In this paper, we develop a mathematical model of tuberculosis with constant recruitment and varying total population size by incorporating stochastic perturbations. By constructing suitable stochastic Lyapunov functions, we establish sufficient conditions for the existence of an ergodic stationary distribution as well as for extinction of the disease in the stochastic system.
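
    The paper's analysis is analytical (stochastic Lyapunov functions), but the kind of stochastically perturbed compartment model it studies can be explored numerically with an Euler-Maruyama scheme. The sketch below uses an illustrative two-compartment susceptible-infected system with constant recruitment and multiplicative noise; it is not the paper's exact model or parameterization.

      import numpy as np

      # Euler-Maruyama simulation of a stochastically perturbed S-I model with
      # constant recruitment; parameters and noise structure are illustrative only.
      rng = np.random.default_rng(0)
      Lam, beta, mu, gamma = 2.0, 0.4, 0.02, 0.1   # recruitment, transmission, death, recovery
      sig1, sig2 = 0.05, 0.05                      # noise intensities on S and I
      dt, steps = 0.01, 50_000
      S, I = 50.0, 1.0

      for _ in range(steps):
          dW1, dW2 = rng.normal(0.0, np.sqrt(dt), 2)
          dS = (Lam - beta * S * I / (S + I) - mu * S + gamma * I) * dt + sig1 * S * dW1
          dI = (beta * S * I / (S + I) - (mu + gamma) * I) * dt + sig2 * I * dW2
          S, I = max(S + dS, 0.0), max(I + dI, 0.0)

      print(f"approximate long-run state: S={S:.1f}, I={I:.1f}")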

  15. [Municipalities as a Model for New Careers and Redirection of Vocational-Technical Education Programs.] Final Report.

    ERIC Educational Resources Information Center

    Institute for Local Self Government, Berkeley, CA.

    To meet the manpower needs of local governments, the model developed for this project redirects national and technical education toward new careers programs. Designed by task forces of professional personnel, the model utilizes existing local government resources, including funds for new career activities. Accomplishments of the project include:…

  16. A New Theory-to-Practice Model for Student Affairs: Integrating Scholarship, Context, and Reflection

    ERIC Educational Resources Information Center

    Reason, Robert D.; Kimball, Ezekiel W.

    2012-01-01

    In this article, we synthesize existing theory-to-practice approaches within the student affairs literature to arrive at a new model that incorporates formal and informal theory, institutional context, and reflective practice. The new model arrives at a balance between the rigor necessary for scholarly theory development and the adaptability…

  17. The Development of a Multi-Level Model for Crisis Preparedness and Intervention in the Greek Educational System

    ERIC Educational Resources Information Center

    Hatzichristiou, Chryse; Issari, Philia; Lykitsakou, Konstantina; Lampropoulou, Aikaterini; Dimitropoulou, Panayiota

    2011-01-01

    This article proposes a multi-level model for crisis preparedness and intervention in the Greek educational system. It presents: a) a brief overview of leading models of school crisis preparedness and intervention as well as cultural considerations for contextually relevant crisis response; b) a description of existing crisis intervention…

  18. A Handbook of Teacher-Developed Career Education Infusion Lessons for the Senior High School.

    ERIC Educational Resources Information Center

    Livonia Public Schools, MI.

    This handbook contains 200 teacher-developed lessons which infuse the four career development components (self-awareness and -assessment, career awareness and exploration, career decision making, career planning and placement) of the Michigan Model of Career Education into the existing course content, emphasizing one or more of the career life…

  19. Development and Modification of a Response Class via Positive and Negative Reinforcement: A Translational Approach

    ERIC Educational Resources Information Center

    Mendres, Amber E.; Borrero, John C.

    2010-01-01

    When responses function to produce the same reinforcer, a response class exists. Researchers have examined response classes in applied settings; however, the challenges associated with conducting applied research on response class development have recently necessitated the development of an analogue response class model. To date, little research…

  20. Strategy for the management of substance use disorders in the State of Punjab: Developing a structural model of state-level de-addiction services in the health sector (the “Punjab model”)

    PubMed Central

    Basu, Debasish; Avasthi, Ajit

    2015-01-01

    Background: Substance use disorders are believed to have become rampant in the State of Punjab, causing substantive loss to the person, the family, the society, and the state. The situation is likely to worsen further if a structured, government-level, state-wide de-addiction service is not put into place. Aims: The aim was to describe a comprehensive structural model of de-addiction service in the State of Punjab (the “Pyramid model” or “Punjab model”), which is primarily concerned with demand reduction, particularly that part which is concerned with identification, treatment, and aftercare of substance users. Materials and Methods: At the behest of the Punjab Government, this model was developed by the authors after a detailed study of the current scenario, critical and exhaustive look at the existing guidelines, policies, books, web resources, government documents, and the like in this area, a check of the ground reality in terms of existing infrastructural and manpower resources, and keeping pragmatism and practicability in mind. Several rounds of meetings with the government officials and other important stakeholders helped to refine the model further. Results: Our model envisages structural innovation and renovations within the existing state healthcare infrastructure. We formulated a “Pyramid model,” later renamed as “Punjab model,” where there is a broad community base for early identification and outpatient level treatment at the primary care level, both outpatient and inpatient care at the secondary care level, and comprehensive management for more difficult cases at the tertiary care level. A separate de-addiction system for the prisons was also developed. Each of these structural elements was described and refined in details, with the aim of uniform, standardized, and easily accessible care across the state. Conclusions: If the “Punjab model” succeeds, it can provide useful models for other states or even at the national level. PMID:25657452

  1. Developing a Degree-Day Model to Predict Billbug (Coleoptera: Curculionidae) Seasonal Activity in Utah and Idaho Turfgrass.

    PubMed

    Dupuy, Madeleine M; Powell, James A; Ramirez, Ricardo A

    2017-10-01

    Billbugs are native pests of turfgrass throughout North America, primarily managed with preventive, calendar-based insecticide applications. An existing degree-day model (lower development threshold of 10°C, biofix 1 March) developed in the eastern United States for bluegrass billbug, Sphenophorus parvulus (Gyllenhal; Coleoptera: Curculionidae), may not accurately predict adult billbug activity in the western United States, where billbugs occur as a species complex. The objectives of this study were 1) to track billbug phenology and species composition in managed Utah and Idaho turfgrass and 2) to evaluate model parameters that best predict billbug activity, including those of the existing bluegrass billbug model. Tracking billbugs with linear pitfall traps at two sites each in Utah and Idaho, we confirmed a complex of three univoltine species damaging turfgrass consisting of (in descending order of abundance) bluegrass billbug, hunting billbug (Sphenophorus venatus vestitus Chittenden; Coleoptera: Curculionidae), and Rocky Mountain billbug (Sphenophorus cicatristriatus Fabraeus; Coleoptera: Curculionidae). This complex was active from February through mid-October, with peak activity in mid-June. Based on linear regression analysis, we found that the existing bluegrass billbug model was not robust in predicting billbug activity in Utah and Idaho. Instead, the model that best predicts adult activity of the billbug complex accumulates degree-days above 3°C after 13 January. This model predicts adult activity levels important for management within 11 d of observed activity at 77% of sites. In conjunction with outreach and cooperative networking, this predictive degree-day model may assist end users to better time monitoring efforts and insecticide applications against billbug pests in Utah and Idaho by predicting adult activity. © The Author 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
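
    The fitted model accumulates degree-days above a 3°C base starting 13 January; a minimal sketch of that bookkeeping with the simple min/max averaging method is given below (the averaging method and the example temperatures are assumptions for illustration, and the biofix year is arbitrary).

      from datetime import date

      def accumulated_degree_days(daily_min_max, base_c=3.0, biofix=date(2017, 1, 13)):
          """Accumulate degree-days above a base temperature from a biofix date,
          using the simple min/max averaging method; days before the biofix are skipped."""
          total = 0.0
          for day, (tmin, tmax) in sorted(daily_min_max.items()):
              if day < biofix:
                  continue
              total += max((tmin + tmax) / 2.0 - base_c, 0.0)
          return total

      # Illustrative daily minima/maxima (deg C)
      temps = {date(2017, 1, 12): (-5, 2), date(2017, 1, 14): (0, 8),
               date(2017, 1, 15): (2, 12)}
      print(accumulated_degree_days(temps))   # 1.0 + 4.0 = 5.0 degree-days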

  2. Evaluation and Development of Pavement Scores, Performance Models and Needs Estimates for the TXDOT Pavement Management Information System : Final Report

    DOT National Transportation Integrated Search

    2012-10-01

    This project conducted a thorough review of the existing Pavement Management Information System (PMIS) database, : performance models, needs estimates, utility curves, and scores calculations, as well as a review of District practices : concerning th...

  3. A TRAINING MODEL FOR THE JOBLESS ADULT.

    ERIC Educational Resources Information Center

    ULRICH, BERNARD

    The training systems design, an interdisciplinary approach utilizing knowledge of behavioral sciences, new instructional technology, and systems design, has been applied to develop a model for re-educating and training the aging unemployed. Research into existing MDTA demonstration programs by the cooperative efforts of McGraw-Hill and the…

  4. Diversity's Impact on the Executive Coaching Process

    ERIC Educational Resources Information Center

    Maltbia, Terrence E.; Power, Anne

    2005-01-01

    This paper presents a conceptual model intended to expand existing executive coaching processes used in organizations by building the strategic learning capabilities needed to integrate a diversity perspective into this emerging field of HRD practice. This model represents the early development of results from a Diversity Practitioner Study…

  5. The Use of Chemical Probes for the Characterization of the Predominant Abiotic Reductants in Anaerobic Sediments

    EPA Science Inventory

    Identifying the predominant chemical reductants and pathways for electron transfer in anaerobic systems is paramount to the development of environmental fate models that incorporate pathways for abiotic reductive transformations. Currently, such models do not exist. In this chapt...

  6. Multiaxial and Thermomechanical Fatigue of Materials: A Historical Perspective and Some Future Challenges

    NASA Technical Reports Server (NTRS)

    Kalluri, Sreeramesh

    2013-01-01

    Structural materials used in engineering applications are routinely subjected to repetitive mechanical loads in multiple directions under non-isothermal conditions. Over the past few decades, several multiaxial fatigue life estimation models (stress- and strain-based) have been developed for isothermal conditions. Historically, numerous fatigue life prediction models have also been developed for thermomechanical fatigue (TMF) life prediction, predominantly for uniaxial mechanical loading conditions. Realistic structural components encounter multiaxial loads and non-isothermal loading conditions, which increase the potential for interaction of damage modes. A need exists for mechanical testing and for the development and verification of life prediction models under such conditions.

  7. High Resolution Visualization Applied to Future Heavy Airlift Concept Development and Evaluation

    NASA Technical Reports Server (NTRS)

    FordCook, A. B.; King, T.

    2012-01-01

    This paper explores the use of high resolution 3D visualization tools for exploring the feasibility and advantages of future military cargo airlift concepts and evaluating compatibility with existing and future payload requirements. Realistic 3D graphic representations of future airlifters are immersed in rich, supporting environments to demonstrate concepts of operations to key personnel for evaluation, feedback, and development of critical joint support. Accurate concept visualizations are reviewed by commanders, platform developers, loadmasters, soldiers, scientists, engineers, and key principal decision makers at various stages of development. The insight gained through the review of these physically and operationally realistic visualizations is essential to refining design concepts to meet competing requirements in a fiscally conservative defense finance environment. In addition, highly accurate 3D geometric models of existing and evolving large military vehicles are loaded into existing and proposed aircraft cargo bays. In this virtual aircraft test-loading environment, materiel developers, engineers, managers, and soldiers can realistically evaluate the compatibility of current and next-generation airlifters with proposed cargo.

  8. Allometric Equations for Aboveground and Belowground Biomass Estimations in an Evergreen Forest in Vietnam.

    PubMed

    Nam, Vu Thanh; van Kuijk, Marijke; Anten, Niels P R

    2016-01-01

    Allometric regression models are widely used to estimate tropical forest biomass, but balancing model accuracy with efficiency of implementation remains a major challenge. In addition, while numerous models exist for aboveground mass, very few exist for roots. We developed allometric equations for aboveground biomass (AGB) and root biomass (RB) based on 300 (of 45 species) and 40 (of 25 species) sample trees respectively, in an evergreen forest in Vietnam. The biomass estimations from these local models were compared to regional and pan-tropical models. For AGB we also compared local models that distinguish functional types to an aggregated model, to assess the degree of specificity needed in local models. Besides diameter at breast height (DBH) and tree height (H), wood density (WD) was found to be an important parameter in AGB models. Existing pan-tropical models resulted in up to 27% higher estimates of AGB, and overestimated RB by nearly 150%, indicating the greater accuracy of local models at the plot level. Our functional group aggregated local model which combined data for all species, was as accurate in estimating AGB as functional type specific models, indicating that a local aggregated model is the best choice for predicting plot level AGB in tropical forests. Finally our study presents the first allometric biomass models for aboveground and root biomass in forests in Vietnam.
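
    A common way to fit such equations, and a plausible reading of the variables named in the abstract (DBH, H, WD), is ordinary least squares on the log-log form ln(AGB) = b0 + b1 ln(DBH) + b2 ln(H) + b3 ln(WD). The sketch below uses invented sample trees; the paper's actual functional forms and coefficients are not reproduced.

      import numpy as np

      # Illustrative destructive-sample data: DBH (cm), H (m), WD (g/cm^3), AGB (kg)
      dbh = np.array([10, 15, 22, 30, 41, 55])
      h   = np.array([ 9, 13, 17, 21, 26, 30])
      wd  = np.array([0.55, 0.60, 0.62, 0.58, 0.65, 0.70])
      agb = np.array([ 35, 110, 320, 760, 1900, 4200])

      # Log-log allometric form: ln(AGB) = b0 + b1 ln(DBH) + b2 ln(H) + b3 ln(WD)
      X = np.column_stack([np.ones_like(dbh, dtype=float),
                           np.log(dbh), np.log(h), np.log(wd)])
      coef, *_ = np.linalg.lstsq(X, np.log(agb), rcond=None)
      print("fitted coefficients b0..b3:", coef)

      # Predict AGB for a new tree (back-transformed; a bias correction is often added)
      new = np.array([1.0, np.log(25), np.log(18), np.log(0.6)])
      print("predicted AGB (kg):", np.exp(new @ coef))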

  10. Use of animals in the development and control of viral vaccines.

    PubMed

    Minor, P D

    1996-01-01

    Animal models were central to the development of poliovaccines and remain essential in some form in the routine quality control of both live and killed vaccines. The necessity of an animal model is illustrated by the examples of mumps and measles vaccines where the existing materials, while satisfactory, have a number of drawbacks and where changes in current practice raise concerns for safety and efficacy.

  11. Development of Accommodation Models for Soldiers in Vehicles: Squad

    DTIC Science & Technology

    2014-09-01

    Data from a previous study of Soldier posture and position were analyzed to develop statistical...range of seat height and seat back angle. All of the models include the effects of body armor and body borne gear.

  12. Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Jet Propulsion Laboratory (JPL) Deep Space Network (DSN) Data System implementation tasks is described. The resource estimation model modifies and combines a number of existing models. The model calibrates the task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software life-cycle statistics.

  13. Modeling and prediction of ionospheric scintillation

    NASA Technical Reports Server (NTRS)

    Fremouw, E. J.

    1974-01-01

    Scintillation modeling performed thus far is based on the theory of diffraction by a weakly modulating phase screen developed by Briggs and Parkin (1963). Shortcomings of the existing empirical model for the scintillation index are discussed together with questions of channel modeling, giving attention to the needs of the communication engineers. It is pointed out that much improved scintillation index models may be available in a matter of a year or so.

  14. A simple, analytic 3-dimensional downburst model based on boundary layer stagnation flow

    NASA Technical Reports Server (NTRS)

    Oseguera, Rosa M.; Bowles, Roland L.

    1988-01-01

    A simple downburst model is developed for use in batch and real-time piloted simulation studies of guidance strategies for terminal area transport aircraft operations in wind shear conditions. The model represents an axisymmetric stagnation point flow, based on velocity profiles from the Terminal Area Simulation System (TASS) model developed by Proctor and satisfies the mass continuity equation in cylindrical coordinates. Altitude dependence, including boundary layer effects near the ground, closely matches real-world measurements, as do the increase, peak, and decay of outflow and downflow with increasing distance from the downburst center. Equations for horizontal and vertical winds were derived, and found to be infinitely differentiable, with no singular points existent in the flow field. In addition, a simple relationship exists among the ratio of maximum horizontal to vertical velocities, the downdraft radius, depth of outflow, and altitude of maximum outflow. In use, a microburst can be modeled by specifying four characteristic parameters, velocity components in the x, y and z directions, and the corresponding nine partial derivatives are obtained easily from the velocity equations.
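
    The full model adds boundary-layer shaping and radial decay, but its backbone, an axisymmetric stagnation-point flow that satisfies mass continuity in cylindrical coordinates, can be sketched with the simplest such field, u_r = λr and w = -2λz, together with a numerical check of the divergence; λ and the sample point are illustrative.

      import numpy as np

      # Simplest axisymmetric stagnation-point flow (illustrative only, without the
      # boundary-layer shaping of the full downburst model): u_r = lam*r, w = -2*lam*z
      lam = 0.05   # 1/s, controls outflow/downdraft strength

      def u_r(r, z): return lam * r
      def w(r, z):   return -2.0 * lam * z

      # Check incompressible continuity in cylindrical coordinates:
      # (1/r) d(r*u_r)/dr + dw/dz = 0, evaluated with central differences
      r, z, h = 400.0, 300.0, 1e-3
      div = ((r + h) * u_r(r + h, z) - (r - h) * u_r(r - h, z)) / (2 * h * r) \
            + (w(r, z + h) - w(r, z - h)) / (2 * h)
      print("divergence (should be ~0):", div)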

  15. Absorption and Clearance of Pharmaceutical Aerosols in the Human Nose: Development of a CFD Model.

    PubMed

    Rygg, Alex; Longest, P Worth

    2016-10-01

    The objective of this study was to develop a computational fluid dynamics (CFD) model to predict the deposition, dissolution, clearance, and absorption of pharmaceutical particles in the human nasal cavity. A three-dimensional nasal cavity geometry was converted to a surface-based model, providing an anatomically-accurate domain for the simulations. Particle deposition data from a commercial nasal spray product was mapped onto the surface model, and a mucus velocity field was calculated and validated with in vivo nasal clearance rates. A submodel for the dissolution of deposited particles was developed and validated based on comparisons to existing in vitro data for multiple pharmaceutical products. A parametric study was then performed to assess sensitivity of epithelial drug uptake to model conditions and assumptions. The particle displacement distance (depth) in the mucus layer had a modest effect on overall drug absorption, while the mucociliary clearance rate was found to be primarily responsible for drug uptake over the timescale of nasal clearance for the corticosteroid mometasone furoate (MF). The model revealed that drug deposition in the nasal vestibule (NV) could slowly be transported into the main passage (MP) and then absorbed through connection of the liquid layer in the NV and MP regions. As a result, high intersubject variability in cumulative uptake was predicted, depending on the length of time the NV dose was left undisturbed without blowing or wiping the nose. This study has developed, for the first time, a complete CFD model of nasal aerosol delivery from the point of spray formation through absorption at the respiratory epithelial surface. For the development and assessment of nasal aerosol products, this CFD-based in silico model provides a new option to complement existing in vitro nasal cast studies of deposition and in vivo imaging experiments of clearance.
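
    The interplay of dissolution, mucociliary clearance, and epithelial absorption that the CFD model resolves spatially can be caricatured with a zero-dimensional compartment sketch, shown below; the three first-order rate constants are invented, and the vestibule-to-main-passage transport the paper emphasizes is deliberately left out.

      # Lumped (0-D) caricature of deposited-dose fate: solid drug dissolves into
      # mucus, which is either cleared by mucociliary transport or absorbed.
      K_DISS, K_MCC, K_ABS = 0.010, 0.008, 0.004   # 1/min, illustrative rates
      DT, T_END = 0.1, 360.0                        # minutes

      solid, mucus, absorbed, cleared, t = 1.0, 0.0, 0.0, 0.0, 0.0
      while t < T_END:
          d_solid = -K_DISS * solid * DT
          d_mucus = (K_DISS * solid - (K_MCC + K_ABS) * mucus) * DT
          absorbed += K_ABS * mucus * DT
          cleared  += K_MCC * mucus * DT
          solid, mucus, t = solid + d_solid, mucus + d_mucus, t + DT

      print(f"fraction absorbed: {absorbed:.2f}, cleared: {cleared:.2f}, "
            f"remaining in solid/mucus: {solid + mucus:.2f}")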

  16. Models and theories of prescribing decisions: A review and suggested a new model

    PubMed Central

    Mohaidin, Zurina

    2017-01-01

    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. In fact, drug prescribing by doctors is a complex phenomenon influenced by various factors. Most of the existing studies in the area of drug prescription explain the process of decision-making by physicians via an exploratory rather than a theoretical approach. Therefore, this review attempts to suggest a conceptual model that explains the theoretical linkages existing between marketing efforts, the patient, the pharmacist and the physician's decision to prescribe drugs. The paper follows an inclusive review approach and applies previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, the review identifies and uses several valuable perspectives, such as the ‘persuasion theory - elaboration likelihood model’, the ‘stimuli–response marketing model’, the ‘agency theory’, the ‘theory of planned behaviour’ and ‘social power theory’, in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, this paper suggests a new conceptual model of the physician decision-making process. This unique model has the potential for use in further research. PMID:28690701

  17. Boundary cooled rocket engines for space storable propellants

    NASA Technical Reports Server (NTRS)

    Kesselring, R. C.; Mcfarland, B. L.; Knight, R. M.; Gurnitz, R. N.

    1972-01-01

    An evaluation of an existing analytical heat transfer model was made in order to extend the technology of boundary film/conduction-cooled rocket thrust chambers to the space-storable propellant combination oxygen difluoride/diborane. Critical design parameters were identified and their importance determined. Test reduction methods were developed to enable data obtained from short-duration hot firings with a thin-walled (calorimeter) chamber to be used to quantitatively evaluate the heat-absorbing capability of the vapor film. The modification of the existing like-doublet injector was based on the results obtained from the calorimeter firings.

  18. Technical Review of the CENWP Computational Fluid Dynamics Model of the John Day Dam Forebay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rakowski, Cynthia L.; Serkowski, John A.; Richmond, Marshall C.

    The US Army Corps of Engineers Portland District (CENWP) has developed a computational fluid dynamics (CFD) model of the John Day forebay on the Columbia River to aid in the development and design of alternatives to improve juvenile salmon passage at the John Day Project. At the request of CENWP, the Pacific Northwest National Laboratory (PNNL) Hydrology Group has conducted a technical review of CENWP's CFD model run in the CFD solver software STAR-CD. PNNL has extensive experience developing and applying 3D CFD models run in STAR-CD for Columbia River hydroelectric projects. The John Day forebay model developed by CENWP is adequately configured and validated. The model is ready for use simulating forebay hydraulics for structural and operational alternatives. The approach and method are sound; however, CENWP has identified some improvements that need to be made for future models and for modifications to this existing model.

  19. Development and validation of a Markov microsimulation model for the economic evaluation of treatments in osteoporosis.

    PubMed

    Hiligsmann, Mickaël; Ethgen, Olivier; Bruyère, Olivier; Richy, Florent; Gathon, Henry-Jean; Reginster, Jean-Yves

    2009-01-01

    Markov models are increasingly used in economic evaluations of treatments for osteoporosis. Most of the existing evaluations are cohort-based Markov models lacking comprehensive memory management and versatility. In this article, we describe and validate an original Markov microsimulation model to accurately assess the cost-effectiveness of prevention and treatment of osteoporosis. We developed a Markov microsimulation model with a lifetime horizon and a direct health-care cost perspective. The patient history was recorded and was used in calculations of transition probabilities, utilities, and costs. To test the internal consistency of the model, we carried out an example calculation for alendronate therapy. Then, external consistency was investigated by comparing absolute lifetime risk of fracture estimates with epidemiologic data. For women at age 70 years, with a twofold increase in the fracture risk of the average population, the costs per quality-adjusted life-year gained for alendronate therapy versus no treatment were estimated at €9105 and €15,325, respectively, under full and realistic adherence assumptions. All the sensitivity analyses in terms of model parameters and modeling assumptions were coherent with expected conclusions, and absolute lifetime risk of fracture estimates were within the range of previous estimates, which confirmed both internal and external consistency of the model. Microsimulation models present some major advantages over cohort-based models, increasing the reliability of the results and remaining largely compatible with the existing state-of-the-art, evidence-based literature. The developed model appears to be a valid model for use in economic evaluations in osteoporosis.
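
    To illustrate what distinguishes a microsimulation from a cohort-based Markov model, the sketch below (Python) follows individual patients whose recorded fracture history modifies their subsequent transition probabilities. All probabilities, costs and utilities are hypothetical placeholders, not the parameters of the published osteoporosis model.

      import random

      # Toy Markov microsimulation: each simulated patient keeps her own history, and a
      # prior fracture raises the subsequent annual fracture probability.
      def simulate_patient(age=70, p_fx=0.02, p_death=0.03, rr_after_fx=1.8,
                           cost_fx=10000.0, u_well=0.80, u_post_fx=0.65, rng=random):
          cost, qaly, had_fx = 0.0, 0.0, False
          while age < 100:
              if rng.random() < p_death * (1.10 ** (age - 70) if age > 70 else 1.0):
                  break                                      # death ends the trajectory
              p = p_fx * (rr_after_fx if had_fx else 1.0)    # history-dependent risk
              if rng.random() < p:
                  had_fx = True
                  cost += cost_fx
              qaly += u_post_fx if had_fx else u_well
              age += 1
          return cost, qaly

      random.seed(1)
      results = [simulate_patient() for _ in range(10000)]
      mean_cost = sum(c for c, _ in results) / len(results)
      mean_qaly = sum(q for _, q in results) / len(results)
      print(f"mean lifetime cost {mean_cost:.0f}, mean QALYs {mean_qaly:.1f}")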

  20. Digital imaging and remote sensing image generator (DIRSIG) as applied to NVESD sensor performance modeling

    NASA Astrophysics Data System (ADS)

    Kolb, Kimberly E.; Choi, Hee-sue S.; Kaur, Balvinder; Olson, Jeffrey T.; Hill, Clayton F.; Hutchinson, James A.

    2016-05-01

    The US Army's Communications Electronics Research, Development and Engineering Center (CERDEC) Night Vision and Electronic Sensors Directorate (referred to as NVESD) is developing a virtual detection, recognition, and identification (DRI) testing methodology using simulated imagery as a means of augmenting the field testing component of sensor performance evaluation, which is expensive, resource intensive, time consuming, and limited to the available target(s) and existing atmospheric visibility and environmental conditions at the time of testing. Existing simulation capabilities such as the Digital Imaging Remote Sensing Image Generator (DIRSIG) and NVESD's Integrated Performance Model Image Generator (NVIPM-IG) can be combined with existing detection algorithms to reduce cost/time, minimize testing risk, and allow virtual/simulated testing using full spectral and thermal object signatures, as well as those collected in the field. NVESD has developed an end-to-end capability to demonstrate the feasibility of this approach. Simple detection algorithms have been used on the degraded images generated by NVIPM-IG to determine the relative performance of the algorithms on both DIRSIG-simulated and collected images. Evaluating the degree to which the algorithm performance agrees between simulated versus field collected imagery is the first step in validating the simulated imagery procedure.

  1. A Comparative Test of Work-Family Conflict Models and Critical Examination of Work-Family Linkages

    ERIC Educational Resources Information Center

    Michel, Jesse S.; Mitchelson, Jacqueline K.; Kotrba, Lindsey M.; LeBreton, James M.; Baltes, Boris B.

    2009-01-01

    This paper is a comprehensive meta-analysis of over 20 years of work-family conflict research. A series of path analyses were conducted to compare and contrast existing work-family conflict models, as well as a new model we developed which integrates and synthesizes current work-family theory and research. This new model accounted for 40% of the…

  2. Catchment-scale modeling of nitrogen dynamics in a temperate forested watershed, Oregon. An interdisciplinary communication strategy.

    Treesearch

    Kellie Vache; Lutz Breuer; Julia Jones; Phil Sollins

    2015-01-01

    We present a systems modeling approach to the development of a place-based ecohydrological model. The conceptual model is calibrated to a variety of existing observations, taken in watershed 10 (WS10) at the HJ Andrews Experimental Forest (HJA) in Oregon, USA, a long term ecological research (LTER) site with a long history of catchment-...

  3. The development, evaluation, and application of O3 flux and flux-response models for additional agricultural crops

    Treesearch

    L. D. Emberson; W. J. Massman; P. Buker; G. Soja; I. Van De Sand; G. Mills; C. Jacobs

    2006-01-01

    Currently, stomatal O3 flux and flux-response models exist only for wheat and potato (LRTAP Convention, 2004); as such, there is a need to extend these models to include additional crop types. The possibility of establishing robust stomatal flux models for five agricultural crops (tomato, grapevine, sugar beet, maize and sunflower) was investigated. These crops were...

  4. A Diffusion Model for Two-sided Service Systems

    NASA Astrophysics Data System (ADS)

    Homma, Koichi; Yano, Koujin; Funabashi, Motohisa

    A diffusion model is proposed for two-sided service systems. ‘Two-sided’ refers to the existence of an economic network effect between two different and interrelated groups, e.g., card holders and merchants in an electronic money service. The service benefit for a member of one side depends on the number and quality of the members on the other side. A mathematical model by J. H. Rohlfs explains the network (or bandwagon) effect of communications services. In Rohlfs' model, only the users' group exists and the model is one-sided. This paper extends Rohlfs' model to a two-sided model. We propose, first, a micro model that explains individual behavior in regard to service subscription of both sides and a computational method that drives the proposed model. Second, we develop macro models with two diffusion-rate variables by simplifying the micro model. As a case study, we apply the models to an electronic money service and discuss the simulation results and actual statistics.
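
    A minimal sketch of the macro-level idea, assuming simple logistic dynamics: each side's adoption rate increases with the installed base of the other side (the cross-side network effect). The parameters a, b_u and b_m are hypothetical, and the code illustrates the modeling concept rather than reproducing the paper's micro or macro models.

      # Two-sided diffusion sketch: users adopt faster when more merchants participate,
      # and vice versa. Adoption levels are fractions of each side's potential market.
      def simulate(T=120, dt=1.0, a=0.002, b_u=0.08, b_m=0.05):
          users, merchants = 0.01, 0.01                 # initial adoption fractions
          path = []
          for _ in range(T):
              du = (a + b_u * merchants) * users * (1.0 - users)      # users respond to merchants
              dm = (a + b_m * users) * merchants * (1.0 - merchants)  # merchants respond to users
              users = min(1.0, users + du * dt)
              merchants = min(1.0, merchants + dm * dt)
              path.append((users, merchants))
          return path

      final_u, final_m = simulate()[-1]
      print(f"final adoption: users={final_u:.2f}, merchants={final_m:.2f}")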

  5. Proof-of-Concept Demonstrations for Computation-Based Human Reliability Analysis. Modeling Operator Performance During Flooding Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joe, Jeffrey Clark; Boring, Ronald Laurids; Herberger, Sarah Elizabeth Marie

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program has the overall objective to help sustain the existing commercial nuclear power plants (NPPs). To accomplish this program objective, there are multiple LWRS “pathways,” or research and development (R&D) focus areas. One LWRS focus area is called the Risk-Informed Safety Margin and Characterization (RISMC) pathway. Initial efforts under this pathway to combine probabilistic and plant multi-physics models to quantify safety margins and support business decisions also included HRA, but in a somewhat simplified manner. HRA experts at Idaho National Laboratory (INL) have been collaborating with other experts to develop a computational HRA approach, called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER), for inclusion into the RISMC framework. The basic premise of this research is to leverage applicable computational techniques, namely simulation and modeling, to develop and then, using RAVEN as a controller, seamlessly integrate virtual operator models (HUNTER) with 1) the dynamic computational MOOSE runtime environment that includes a full-scope plant model, and 2) the RISMC framework PRA models already in use. The HUNTER computational HRA approach is a hybrid approach that leverages past work from cognitive psychology, human performance modeling, and HRA, but it is also a significant departure from existing static and even dynamic HRA methods. This report is divided into five chapters that cover the development of an external flooding event test case and associated statistical modeling considerations.

  6. Benefit-cost estimation for alternative drinking water maximum contaminant levels

    NASA Astrophysics Data System (ADS)

    Gurian, Patrick L.; Small, Mitchell J.; Lockwood, John R.; Schervish, Mark J.

    2001-08-01

    A simulation model for estimating compliance behavior and resulting costs at U.S. Community Water Suppliers is developed and applied to the evaluation of a more stringent maximum contaminant level (MCL) for arsenic. Probability distributions of source water arsenic concentrations are simulated using a statistical model conditioned on system location (state) and source water type (surface water or groundwater). This model is fit to two recent national surveys of source waters, then applied with the model explanatory variables for the population of U.S. Community Water Suppliers. Existing treatment types and arsenic removal efficiencies are also simulated. Utilities with finished water arsenic concentrations above the proposed MCL are assumed to select the least cost option compatible with their existing treatment from among 21 available compliance strategies and processes for meeting the standard. Estimated costs and arsenic exposure reductions at individual suppliers are aggregated to estimate the national compliance cost, arsenic exposure reduction, and resulting bladder cancer risk reduction. Uncertainties in the estimates are characterized based on uncertainties in the occurrence model parameters, existing treatment types, treatment removal efficiencies, costs, and the bladder cancer dose-response function for arsenic.
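
    The sketch below (Python) illustrates the general simulation logic described above, under assumed inputs: source-water arsenic is drawn from a lognormal occurrence model, existing removal is applied, and systems above the MCL select the least-cost compliance option. The distributions, removal efficiencies and costs are hypothetical placeholders, not the study's estimates.

      import numpy as np

      rng = np.random.default_rng(0)

      def national_cost(n_systems=50000, mcl_ugL=10.0):
          # Sample raw-water arsenic (ug/L) and the removal of any existing treatment.
          raw = rng.lognormal(mean=np.log(3.0), sigma=1.0, size=n_systems)
          existing_removal = rng.choice([0.0, 0.2, 0.5], size=n_systems, p=[0.7, 0.2, 0.1])
          finished = raw * (1.0 - existing_removal)
          # Hypothetical compliance options: (additional removal fraction, annualized $/system).
          options = [(0.5, 20000.0), (0.8, 60000.0), (0.95, 150000.0)]
          total = 0.0
          for c in finished[finished > mcl_ugL]:
              feasible = [cost for rem, cost in options if c * (1.0 - rem) <= mcl_ugL]
              total += min(feasible) if feasible else options[-1][1]
          return total

      print(f"estimated national annualized compliance cost: ${national_cost():,.0f}")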

  7. 3D Modeling of Lacus Mortis Pit Crater with Presumed Interior Tube Structure

    NASA Astrophysics Data System (ADS)

    Hong, Ik-Seon; Yi, Yu; Yu, Jaehyung; Haruyama, Junichi

    2015-06-01

    When humans explore the Moon, lunar caves will be an ideal base, providing shelter from the hazards of radiation, meteorite impacts, and extreme diurnal temperature differences. In order to ascertain the existence of caves on the Moon, it is best to visit the Moon in person. The Google Lunar X Prize (GLXP) competition started recently to attempt lunar exploration missions. One of the competing groups plans to land on a pit of Lacus Mortis and determine the existence of a cave inside this pit. In this pit, there is a ramp from the entrance down to the inside of the pit, which enables a rover to approach the inner region of the pit. In this study, under the assumption of the existence of a cave in this pit, a 3D model was developed based on the optical image data. Since this model simulates the actual terrain, the rendering of the model agrees well with the image data. Furthermore, the 3D printing of this model will enable more rigorous investigations and could also be used to publicize lunar exploration missions with ease.

  8. Regulatory Technology Development Plan - Sodium Fast Reactor. Mechanistic Source Term - Trial Calculation. Work Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grabaskas, David; Bucknor, Matthew; Jerden, James

    2016-02-01

    The overall objective of the SFR Regulatory Technology Development Plan (RTDP) effort is to identify and address potential impediments to the SFR regulatory licensing process. In FY14, an analysis by Argonne identified the development of an SFR-specific MST methodology as an existing licensing gap with high regulatory importance and a potentially long lead-time to closure. This work was followed by an initial examination of the current state-of-knowledge regarding SFR source term development (ANLART-3), which reported several potential gaps. Among these were the potential inadequacies of current computational tools to properly model and assess the transport and retention of radionuclides during a metal fuel pool-type SFR core damage incident. The objective of the current work is to determine the adequacy of existing computational tools, and the associated knowledge database, for the calculation of an SFR MST. To accomplish this task, a trial MST calculation will be performed using available computational tools to establish their limitations with regard to relevant radionuclide release/retention/transport phenomena. The application of existing modeling tools will provide a definitive test to assess their suitability for an SFR MST calculation, while also identifying potential gaps in the current knowledge base and providing insight into open issues regarding regulatory criteria/requirements. The findings of this analysis will assist in determining future research and development needs.

  9. MCore: A High-Order Finite-Volume Dynamical Core for Atmospheric General Circulation Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P.; Jablonowski, C.

    2011-12-01

    The desire for increasingly accurate predictions of the atmosphere has driven numerical models to finer and finer resolutions, while simultaneously driving up the cost of existing numerical models exponentially. Even with the modern rapid advancement of computational performance, it is estimated that it will take more than twenty years before existing models approach the scales needed to resolve atmospheric convection. However, smarter numerical methods may allow us to glimpse the types of results we would expect from these fine-scale simulations while only requiring a fraction of the computational cost. The next generation of atmospheric models will likely need to rely on both high-order accuracy and adaptive mesh refinement in order to properly capture features of interest. We present our ongoing research on developing a set of "smart" numerical methods for simulating the global non-hydrostatic fluid equations which govern atmospheric motions. We have harnessed a high-order finite-volume based approach in developing an atmospheric dynamical core on the cubed-sphere. This type of method is desirable for applications involving adaptive grids, since it has been shown that spuriously reflected wave modes are intrinsically damped out under this approach. The model further makes use of an implicit-explicit Runge-Kutta-Rosenbrock (IMEX-RKR) time integrator for accurate and efficient coupling of the horizontal and vertical model components. We survey the algorithmic development of the model and present results from idealized dynamical core test cases, as well as give a glimpse at future work with our model.

  10. Modeling small cell lung cancer (SCLC) biology through deterministic and stochastic mathematical models.

    PubMed

    Salgia, Ravi; Mambetsariev, Isa; Hewelt, Blake; Achuthan, Srisairam; Li, Haiqing; Poroyko, Valeriy; Wang, Yingyu; Sattler, Martin

    2018-05-25

    Mathematical cancer models are immensely powerful tools that are based in part on the fractal nature of biological structures, such as the geometry of the lung. Cancers of the lung provide an opportune model to develop and apply algorithms that capture changes and disease phenotypes. We reviewed mathematical models that have been developed for biological sciences and applied them in the context of small cell lung cancer (SCLC) growth, mutational heterogeneity, and mechanisms of metastasis. The ultimate goal is to characterize the stochastic and deterministic nature of this disease, to link this comprehensive set of tools back to its fractal character, and to provide a platform for accurate biomarker development. These techniques may be particularly useful in the context of drug development research, for example in combination with existing omics approaches. The integration of these tools will be important to further understand the biology of SCLC and ultimately develop novel therapeutics.

  11. Energy Auditor and Quality Control Inspector Competency Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Head, Heather R.; Kurnik, Charles W.; Schroeder, Derek

    The Energy Auditor (EA) and Quality Control Inspector (QCI) Competency Model was developed to identify the soft skills and foundational competencies and to define the levels of Knowledge, Skills, and Abilities (KSAs) required to successfully perform the tasks defined in the EA and QCI Job Task Analyses (JTAs). The U.S. Department of Energy (DOE) used the U.S. Department of Labor's (DOL) Competency Model Clearinghouse resources to develop the QCI and EA Competency Model. To keep the QCI and EA Competency Model consistent with other construction and energy management competency models, DOE and the National Renewable Energy Laboratory used the existing 'Residential Construction Competency Model' and the 'Advanced Commercial Building Competency Model' where appropriate.

  12. Development of a Symptom-Based Patient-Reported Outcome Instrument for Functional Dyspepsia: A Preliminary Conceptual Model and an Evaluation of the Adequacy of Existing Instruments.

    PubMed

    Taylor, Fiona; Reasner, David S; Carson, Robyn T; Deal, Linda S; Foley, Catherine; Iovin, Ramon; Lundy, J Jason; Pompilus, Farrah; Shields, Alan L; Silberg, Debra G

    2016-10-01

    The aim was to document, from the perspective of the empirical literature, the primary symptoms of functional dyspepsia (FD), evaluate the extent to which existing questionnaires target those symptoms, and, finally, identify any missing evidence that would impact the questionnaires' use in regulated clinical trials to assess treatment efficacy claims intended for product labeling. A literature review was conducted to identify the primary symptoms of FD and existing symptom-based FD patient-reported outcome (PRO) instruments. Following a database search, abstracts were screened and articles were retrieved for review. The primary symptoms of FD were organized into a conceptual model and the PRO instruments were evaluated for conceptual coverage as well as compared against evidentiary requirements presented in the FDA's PRO Guidance for Industry. Fifty-six articles and 16 instruments assessing FD symptoms were reviewed. Concepts listed in the Rome III criteria for FD (n = 7), those assessed by existing FD instruments (n = 34), and symptoms reported by patients in published qualitative research (n = 6) were summarized in the FD conceptual model. Except for vomiting, all of the identified symptoms from the published qualitative research reports were also specified in the Rome III criteria. Only three of the 16 instruments, the Dyspepsia Symptom Severity Index (DSSI), Nepean Dyspepsia Index (NDI), and Short-Form Nepean Dyspepsia Index (SF-NDI), measure all seven FD symptoms defined by the Rome III criteria. Among these three, each utilizes a 2-week recall period and 5-point Likert-type scale, and had evidence of patient involvement in development. Despite their coverage, when these instruments were evaluated in light of regulatory expectations, several issues jeopardized their potential qualification for substantiation of a labeling claim. No existing PRO instruments that measured all seven symptoms adhered to the regulatory principles necessary to support product labeling. As such, the development of a new FD symptom PRO instrument is supported.

  13. Marine Planning for Potential Wave Energy Facility Placement Amongst a Crowded Sea of Existing Resource Uses

    NASA Astrophysics Data System (ADS)

    Feist, B. E.; Fuller, E.; Plummer, M. L.

    2016-12-01

    Conversion to renewable energy sources is a logical response to increasing pressure to reduce greenhouse gas emissions. Ocean wave energy is the least developed renewable energy source, despite having the highest energy per unit area. While many hurdles remain in developing wave energy, assessing potential conflicts and evaluating tradeoffs with existing uses is essential. Marine planning encompasses a broad array of activities that take place in and affect large marine ecosystems, making it an ideal tool for evaluating wave energy resource use conflicts. In this study, we focus on the potential conflicts between wave energy conversion (WEC) facilities and existing marine uses in the context of marine planning, within the California Current Large Marine Ecosystem. First, we evaluated wave energy facility development using the Wave Energy Model (WEM) of the Integrated Valuation of Ecosystem Services and Trade-offs (InVEST) toolkit. Second, we ran spatial analyses on model output to identify conflicts with existing marine uses including AIS based vessel traffic, VMS and observer based measures of commercial fishing effort, and marine conservation areas. We found that regions with the highest wave energy potential were distant from major cities and that infrastructure limitations (cable landing sites) restrict integration with existing power grids. We identified multiple spatial conflicts with existing marine uses; especially shipping vessels and various commercial fishing fleets, and overlap with marine conservation areas varied by conservation designation. While wave energy generation facilities may be economically viable in the California Current, this viability must be considered within the context of the costs associated with conflicts that arise with existing marine uses. Our analyses can be used to better inform placement of WEC devices (as well as other types of renewable energy facilities) in the context of marine planning by accounting for economic tradeoffs and providing spatially explicit site prioritization.

  14. Negotiating Northern Resource Development Frontiers: People, Energy, and Decision-Making in Yamal

    NASA Astrophysics Data System (ADS)

    Osipov, Igor A.

    This dissertation examines contemporary models of co-existence and partnerships negotiated between local communities, government, and resource corporations in the Russian District of Purovsky (Arctic Yamal), with a particular focus on the relations of these partnerships to Russia's wider socio-cultural and political contexts and, more broadly, the circumpolar world. Yamal has Eurasia's richest oil and gas reserves, and is an important crossroads region where various geopolitical and financial interests intersect. With the opening up of new gas and oil fields, and construction of roads and pipelines, Yamal is experiencing rapid changes; and is being challenged to reshape its many 'frontiers' in which people, energy, and decisions are closely linked to one another. Since the late 1970s, resource development projects have had significant impacts on the lives of the local people in the Purovsky tundra. Along with experiencing negative consequences, such as water and soil contamination, impacts on land, wildlife, and local communities have also nurtured creative ways of adaptation, decision-making, and self-organization. Since 1998, a number of unique models of co-existence and participatory dialogue, involving public project reviews, and sound participation of local indigenous activist groups have been developed and implemented in Yamal. Furthermore, during the past decade the Purovsky District has served as a unique decision-making polygon for the Northeastern Urals. Several joint community-industry-government political and economic cooperation models have been tested and their elements have subsequently been implemented in other Arctic Russian localities. From 2006-2008 this project was focused on documenting these important developments by investigating and explicating the on-the-ground models of agreement-making in the context that these models have been developing since the 1970s. This project, as such, strives to benefit the areas of anthropology, political science, rural economy, as well as Northern studies in indigenous-state-industry relations spectrums. More specifically, this research contributes to a better understanding of the forms of public participation, negotiation, local activism; and their interconnections to the broader sociopolitical context, rural economic capacity building, power relations, and decision-making environments that local communities, governments, and corporations create effective co-existence/partnership models.

  15. Efficient Translation of LTL Formulae into Buchi Automata

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Lerda, Flavio

    2001-01-01

    Model checking is a fully automated technique for checking that a system satisfies a set of required properties. With explicit-state model checkers, properties are typically defined in linear-time temporal logic (LTL), and are translated into Büchi automata in order to be checked. This report presents how we have combined and improved existing techniques to obtain an efficient LTL to Büchi automata translator. In particular, we optimize the core of existing tableau-based approaches to generate significantly smaller automata. Our approach has been implemented and is being released as part of the Java PathFinder software (JPF), an explicit state model checker under development at the NASA Ames Research Center.
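
    For readers unfamiliar with the target formalism, the sketch below (Python) shows the kind of object such a translator produces: the two-state Büchi automaton for the LTL formula F p ("eventually p"), with acceptance checking on a lasso-shaped word. It illustrates only the target representation, not the tableau-based construction implemented in the translator.

      # Two-state Buchi automaton for "F p": q0 waits for p, q1 is accepting and absorbing.
      ACCEPTING = {"q1"}

      def step(state, letter):
          """letter is the set of atomic propositions true at this position."""
          if state == "q0":
              return "q1" if "p" in letter else "q0"
          return "q1"                      # q1 is absorbing

      def accepts_lasso(prefix, loop):
          """Because the only accepting state q1 is absorbing, the word prefix.loop^w is
          accepted iff the run reaches q1 within the prefix plus one pass of the loop."""
          state = "q0"
          for letter in prefix + loop:
              state = step(state, letter)
          return state in ACCEPTING

      print(accepts_lasso([set(), set()], [{"p"}, set()]))   # True: p eventually holds
      print(accepts_lasso([set()], [set()]))                 # False: p never holds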

  16. Systems Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graham, R.L.

    1998-03-17

    The Systems Studies Activity had two objectives: (1) to investigate nontechnical barriers to the deployment of biomass production and supply systems and (2) to enhance and extend existing systems models of bioenergy supply and use. For the first objective, the Activity focused on existing bioenergy markets. Four projects were undertaken: a comparative analysis of bioenergy in Sweden and Austria; a one-day workshop on nontechnical barriers jointly supported by the Production Systems Activity; the development and testing of a framework for analyzing barriers and drivers to bioenergy markets; and surveys of wood pellet users in Sweden, Austria and the US. For the second objective, two projects were undertaken. First, the Activity worked with the Integrated BioEnergy Systems (IBS) Activity of IEA Bioenergy Task XIII to enhance the BioEnergy Assessment Model (BEAM). This model is documented in the final report of the IBS Activity. The Systems Studies Activity contributed to enhancing the feedstock portion of the model by developing a coherent set of willow, poplar, and switchgrass production modules relevant to both the US and the UK. The Activity also developed a pretreatment module for switchgrass. Second, the Activity sponsored a three-day workshop on modeling bioenergy systems with the objectives of providing an overview of the types of models used to evaluate bioenergy and promoting communication among bioenergy modelers. There were nine guest speakers addressing different types of models used to evaluate different aspects of bioenergy, ranging from technoeconomic models based on the ASPEN software to linear programming models to develop feedstock supply curves for the US. The papers from this workshop have been submitted to Biomass and Bioenergy and are under editorial review.

  17. Development of a database for chemical mechanism assignments for volatile organic emissions.

    PubMed

    Carter, William P L

    2015-10-01

    The development of a database for making model species assignments when preparing total organic gas (TOG) emissions input for atmospheric models is described. This database currently has assignments of model species for 12 different gas-phase chemical mechanisms for over 1700 chemical compounds and covers over 3000 chemical categories used in five different anthropogenic TOG profile databases or output by two different biogenic emissions models. This involved developing a unified chemical classification system, assigning compounds to mixtures, assigning model species for the mechanisms to the compounds, and making assignments for unknown, unassigned, and nonvolatile mass. The comprehensiveness of the assignments, the contributions of various types of speciation categories to current profile and total emissions data, inconsistencies with existing undocumented model species assignments, and remaining speciation issues and areas of needed work are also discussed. The use of the system to prepare input for SMOKE, the Speciation Tool, and biogenic models is described in the supplementary materials. The database, associated programs and files, and a users manual are available online at http://www.cert.ucr.edu/~carter/emitdb . Assigning air quality model species to the hundreds of emitted chemicals is a necessary link between emissions data and modeling the effects of emissions on air quality. This is not easy, and it makes implementing new and more chemically detailed mechanisms in models difficult. If done incorrectly, the effect is similar to that of errors in the emissions speciation or in the chemical mechanism itself. Nevertheless, making such assignments is often an afterthought in chemical mechanism development and emissions processing, and existing assignments are usually undocumented and have errors and inconsistencies. This work is designed to address some of these problems.

  18. Algorithms and analytical solutions for rapidly approximating long-term dispersion from line and area sources

    NASA Astrophysics Data System (ADS)

    Barrett, Steven R. H.; Britter, Rex E.

    Predicting long-term mean pollutant concentrations in the vicinity of airports, roads and other industrial sources is frequently of concern in regulatory and public health contexts. Many emissions are represented geometrically as ground-level line or area sources. Well-developed modelling tools such as AERMOD and ADMS are able to model dispersion from finite (i.e. non-point) sources with considerable accuracy, drawing upon an up-to-date understanding of boundary layer behaviour. Due to mathematical difficulties associated with line and area sources, computationally expensive numerical integration schemes have been developed. For example, some models decompose area sources into a large number of line sources orthogonal to the mean wind direction, for which an analytical (Gaussian) solution exists. Models also employ a time-series approach, which involves computing mean pollutant concentrations for every hour over one or more years of meteorological data. This can give rise to computer runtimes of several days for assessment of a site. While this may be acceptable for assessment of a single industrial complex, airport, etc., this level of computational cost precludes national or international policy assessments at the level of detail available with dispersion modelling. In this paper, we extend previous work [S.R.H. Barrett, R.E. Britter, 2008. Development of algorithms and approximations for rapid operational air quality modelling. Atmospheric Environment 42 (2008) 8105-8111] to line and area sources. We introduce approximations which allow for the development of new analytical solutions for long-term mean dispersion from line and area sources, based on hypergeometric functions. We describe how these solutions can be parameterized from a single point source run from an existing advanced dispersion model, thereby accounting for all processes modelled in the more costly algorithms. The parameterization method combined with the analytical solutions for long-term mean dispersion is shown to produce results several orders of magnitude more efficiently, with a loss of accuracy small compared to the absolute accuracy of advanced dispersion models near sources. The method can be readily incorporated into existing dispersion models, and may allow for additional computation time to be expended on modelling dispersion processes more accurately in future, rather than on accounting for source geometry.
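
    To make the baseline concrete, the sketch below (Python) approximates a finite ground-level crosswind line source by numerically summing Gaussian point-source kernels along its length, which is the kind of costly integration the analytical solutions replace. The sigma_y/sigma_z power laws are simple placeholders, not the boundary-layer parameterizations used by AERMOD or ADMS.

      import numpy as np

      def gaussian_point(q, x, y, u=5.0):
          """Ground-level concentration from a ground-level point source of strength q (g/s),
          receptor at downwind distance x and crosswind offset y (m), wind speed u (m/s)."""
          if x <= 0.0:
              return 0.0
          sig_y, sig_z = 0.08 * x**0.9, 0.06 * x**0.85        # hypothetical dispersion curves
          return (q / (np.pi * u * sig_y * sig_z)) * np.exp(-0.5 * (y / sig_y) ** 2)

      def line_source(q_per_m, length, x_r, y_r, n=400):
          """Crosswind line source spanning y=-length/2..length/2 at x=0, summed numerically."""
          ys = np.linspace(-length / 2.0, length / 2.0, n)
          dy = ys[1] - ys[0]
          return sum(gaussian_point(q_per_m * dy, x_r, y_r - y0) for y0 in ys)

      print(line_source(q_per_m=1.0, length=1000.0, x_r=500.0, y_r=0.0))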

  19. Education of Blind Persons in Ethiopia.

    ERIC Educational Resources Information Center

    Maru, A. A.; Cook, M. J.

    1990-01-01

    The paper reviews the historical and cultural attitudes of Ethiopians toward blind children, the education of blind children, the special situation of orphaned blind children, limitations of existing educational models, and development of a new model that relies on elements of community-based rehabilitation and the employment of blind high school…

  20. From Conceptual Frameworks to Mental Models for Astronomy: Students' Perceptions

    ERIC Educational Resources Information Center

    Pundak, David; Liberman, Ido; Shacham, Miri

    2017-01-01

    Considerable debate exists among discipline-based astronomy education researchers about how students change their perceptions in science and astronomy. The study questioned the development of astronomical models among students in institutions of higher education by examining how college students change their initial conceptual frameworks and…

  1. Math and Science Model Programs Manual.

    ERIC Educational Resources Information Center

    Sawyer, Donna, Comp.; And Others

    This implementation manual has been developed to describe four model mathematics and science programs designed to increase African-American students' interest in mathematics and science. The manual will help affiliates of the Urban League to mobilize existing community resources to achieve the goals of the national education initiative. The four…

  2. Spatial Dynamics and Determinants of County-Level Education Expenditure in China

    ERIC Educational Resources Information Center

    Gu, Jiafeng

    2012-01-01

    In this paper, a multivariate spatial autoregressive model of local public education expenditure determination with autoregressive disturbance is developed and estimated. The existence of spatial interdependence is tested using Moran's I statistic and Lagrange multiplier test statistics for both the spatial error and spatial lag models. The full…

  3. Dystrophin insufficiency causes a Becker muscular dystrophy-like phenotype in swine

    USDA-ARS?s Scientific Manuscript database

    Duchenne muscular dystrophy (DMD) is caused by a dystrophin deficiency while Becker MD is caused by a dystrophin insufficiency or expression of a partially functional dystrophin protein. Deficiencies in existing mouse and dog models necessitate the development of a novel large animal model. Our pu...

  4. Determining the Supply of Material Resources for High-Rise Construction: Scenario Approach

    NASA Astrophysics Data System (ADS)

    Minnullina, Anna; Vasiliev, Vladimir

    2018-03-01

    This article presents a multi-criteria approach to determining the supply of material resources for high-rise construction under certain and uncertain conditions, which enables integrating a number of existing models into a fairly compact generalised economic and mathematical model developed for two extreme scenarios.

  5. Models and Resources for Advancing Sustainable Institutional and Societal Progress

    ERIC Educational Resources Information Center

    Litten, Larry H.; Terkla, Dawn Geronimo

    2007-01-01

    Institutional researchers can take advantage of a variety of resources for understanding sustainability issues and keeping abreast of developments on this front. Models exist both within and outside of higher education for analyzing and presenting data. Sustainability Indicators in Comprehensive Sustainability Reports and Fact Books are appended.…

  6. Examining, Documenting, and Modeling the Problem Space of a Variable Domain

    DTIC Science & Technology

    2002-06-14

    Existing methods of domain engineering considered in the development of this proposed process include Feature-Oriented Domain Analysis (FODA) [3,4], Organization Domain Modeling (ODM) [2,5,6], and family-oriented approaches that capture configuration knowledge using generators [2]. FODA is a domain...

  7. Information-System Structure by Communication-Technology Concepts: A Cybernetic Model Approach.

    ERIC Educational Resources Information Center

    Reisig, Gerhard H. R.

    1978-01-01

    Presents the "Evidence-of-Existence" information system in which the structure is developed, with application of cybernetic concepts, as an isomorphic model in analogy to the system structure of communication technology. Three criteria of structuring are postulated: (1) source-channel-sink, with input-output characteristics, (2) filter-type…

  8. Predicting the regeneration of Appalachian hardwoods: adapting the REGEN model for the Appalachian Plateau

    Treesearch

    Lance A. Vickers; Thomas R. Fox; David L. Loftis; David A. Boucugnani

    2013-01-01

    The difficulty of achieving reliable oak (Quercus spp.) regeneration is well documented. Application of silvicultural techniques to facilitate oak regeneration largely depends on current regeneration potential. A computer model to assess regeneration potential based on existing advanced reproduction in Appalachian hardwoods was developed by David...

  9. A SCREENING MODEL FOR SIMULATING DNAPL FLOW AND TRANSPORT IN POROUS MEDIA: THEORETICAL DEVELOPMENT

    EPA Science Inventory

    There exists a need for a simple tool that will allow us to analyze a DNAPL contamination scenario from free-product release to transport of soluble constituents to downgradient receptor wells. The objective of this manuscript is to present the conceptual model and formulate the ...

  10. Creative computing with Landlab: an open-source toolkit for building, coupling, and exploring two-dimensional numerical models of Earth-surface dynamics

    NASA Astrophysics Data System (ADS)

    Hobley, Daniel E. J.; Adams, Jordan M.; Nudurupati, Sai Siddhartha; Hutton, Eric W. H.; Gasparini, Nicole M.; Istanbulluoglu, Erkan; Tucker, Gregory E.

    2017-01-01

    The ability to model surface processes and to couple them to both subsurface and atmospheric regimes has proven invaluable to research in the Earth and planetary sciences. However, creating a new model typically demands a very large investment of time, and modifying an existing model to address a new problem typically means the new work is constrained to its detriment by model adaptations for a different problem. Landlab is an open-source software framework explicitly designed to accelerate the development of new process models by providing (1) a set of tools and existing grid structures - including both regular and irregular grids - to make it faster and easier to develop new process components, or numerical implementations of physical processes; (2) a suite of stable, modular, and interoperable process components that can be combined to create an integrated model; and (3) a set of tools for data input, output, manipulation, and visualization. A set of example models built with these components is also provided. Landlab's structure makes it ideal not only for fully developed modelling applications but also for model prototyping and classroom use. Because of its modular nature, it can also act as a platform for model intercomparison and epistemic uncertainty and sensitivity analyses. Landlab exposes a standardized model interoperability interface, and is able to couple to third-party models and software. Landlab also offers tools to allow the creation of cellular automata, and allows native coupling of such models to more traditional continuous differential equation-based modules. We illustrate the principles of component coupling in Landlab using a model of landform evolution, a cellular ecohydrologic model, and a flood-wave routing model.
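
    A minimal sketch of component coupling in Landlab is shown below, assuming its documented Python API (RasterModelGrid, add_zeros, LinearDiffuser, run_one_step); exact argument names may differ between Landlab versions.

      import numpy as np
      from landlab import RasterModelGrid
      from landlab.components import LinearDiffuser

      grid = RasterModelGrid((50, 80), xy_spacing=10.0)          # 50 x 80 nodes, 10 m spacing
      z = grid.add_zeros("topographic__elevation", at="node")    # shared model field
      z += np.random.rand(z.size)                                # small initial roughness

      diffuser = LinearDiffuser(grid, linear_diffusivity=0.01)   # hillslope diffusion [m^2/yr]

      dt = 100.0                                                 # years per step
      for _ in range(200):                                       # 20 kyr of landscape evolution
          diffuser.run_one_step(dt)                              # each component advances the
                                                                 # shared grid fields in turn
      print(float(z.mean()), float(z.max()))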

  11. Toward a New Model of Fertility: The Effects of the World Economic System and the Status of Women on Fertility Behavior.

    ERIC Educational Resources Information Center

    Ward, Kathryn B.

    A relationship exists between high birth rates and the lowered status of women in developing nations, resulting from their country's economic development. Research was based on data from various sources on 34 developed nations and 92 developing nations throughout the world. Variables included income inequality, foreign trade structure and…

  12. Trajectories of Future Land Use for Earth System Modeling of the Northeast United States

    NASA Astrophysics Data System (ADS)

    Rosenzweig, B.; Vorosmarty, C. J.; Lu, X.; Kicklighter, D. W.

    2015-12-01

    The U.S. Northeast includes some of the nation's most populated cities and their supporting hinterlands, with an urban corridor spanning from Maine to Virginia. The megaregion's centuries-long history of landscape transformations has had enduring impact on the region's hydrology, ecosystems and socioeconomy. Driven by policy decisions made in the next decade, future landscape changes will also interplay with climate change, with multi-decadal effects that are currently poorly understood. While existing national and global land cover trajectories will play an important role in understanding these future impacts, they do not allow for investigation of many issues of interest to regional stakeholders, such as local zoning and suburban sprawl, the development of a regional food system, or varying rates of natural lands protection. Existing land cover trajectories also do not usually provide the detail needed as input drivers for earth system models, such as disaggregated vegetation types or harmonized time series of infrastructure management. We discuss the development of a simple land use/land cover allocation scheme to develop such needed trajectories, their implementation for 4 regional socioeconomic pathways developed collaboratively with regional stakeholders, and their preliminary use in regional ecosystem modeling.

  13. A Hybrid Fuzzy Model for Lean Product Development Performance Measurement

    NASA Astrophysics Data System (ADS)

    Osezua Aikhuele, Daniel; Mohd Turan, Faiz

    2016-02-01

    In the effort to meet emerging consumer demands for mass-customized products, many manufacturing companies are turning to the application of lean in their product development process, and this is gradually moving from being a competitive advantage to a necessity. However, due to a lack of clear understanding of lean performance measurements, many of these companies are unable to implement and fully integrate the lean principle into their product development process. The extensive literature shows that only a few studies have focused systematically on lean product development performance (LPDP) evaluation. In order to fill this gap, the study therefore proposes a novel hybrid model based on the Fuzzy Reasoning Approach (FRA) and the extension of the Fuzzy-AHP and Fuzzy-TOPSIS methods for the assessment of LPDP. Unlike the existing methods, the model considers the importance weight of each of the decision makers (experts), since the performance criteria/attributes are required to be rated and these experts have different levels of expertise. The rating is done using a new fuzzy Likert rating scale (membership-scale) which is designed such that it can address problems resulting from information loss/distortion due to closed-form scaling and the ordinal nature of the existing Likert scale.
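
    As a compact illustration of one building block of such a hybrid, the sketch below (Python) computes fuzzy TOPSIS closeness coefficients from triangular fuzzy ratings and weights using the common vertex-distance formulation. It is not the paper's full FRA / Fuzzy-AHP / Fuzzy-TOPSIS model, and the ratings and weights shown are hypothetical.

      import numpy as np

      def dist(a, b):
          """Vertex distance between two triangular fuzzy numbers (l, m, u)."""
          return np.sqrt(np.mean((np.array(a) - np.array(b)) ** 2))

      # alternatives x criteria; each entry is a normalized triangular fuzzy rating in [0, 1]
      ratings = [
          [(0.5, 0.7, 0.9), (0.3, 0.5, 0.7)],   # alternative A
          [(0.7, 0.9, 1.0), (0.1, 0.3, 0.5)],   # alternative B
      ]
      weights = [(0.6, 0.8, 1.0), (0.2, 0.4, 0.6)]   # fuzzy criterion weights

      def closeness(ratings, weights):
          fpis, fnis = (1.0, 1.0, 1.0), (0.0, 0.0, 0.0)     # fuzzy positive/negative ideals
          scores = []
          for row in ratings:
              weighted = [tuple(r * w for r, w in zip(rat, wt)) for rat, wt in zip(row, weights)]
              d_plus = sum(dist(v, fpis) for v in weighted)
              d_minus = sum(dist(v, fnis) for v in weighted)
              scores.append(d_minus / (d_plus + d_minus))
          return scores

      print(closeness(ratings, weights))   # higher closeness coefficient = better alternative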

  14. SAS macro programs for geographically weighted generalized linear modeling with spatial point data: applications to health research.

    PubMed

    Chen, Vivian Yi-Ju; Yang, Tse-Chuan

    2012-08-01

    An increasing interest in exploring spatial non-stationarity has generated several specialized analytic software programs; however, few of these programs can be integrated natively into a well-developed statistical environment such as SAS. We not only developed a set of SAS macro programs to fill this gap, but also expanded the geographically weighted generalized linear modeling (GWGLM) by integrating the strengths of SAS into the GWGLM framework. Three features distinguish our work. First, the macro programs of this study provide more kernel weighting functions than the existing programs. Second, with our codes the users are able to better specify the bandwidth selection process compared to the capabilities of existing programs. Third, the development of the macro programs is fully embedded in the SAS environment, providing great potential for future exploration of complicated spatially varying coefficient models in other disciplines. We provided three empirical examples to illustrate the use of the SAS macro programs and demonstrated the advantages explained above. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
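
    The core geographically weighted calculation can be sketched outside SAS as well. The Python snippet below, which stands in for the idea rather than porting the macros, weights observations by a Gaussian kernel of distance to a focal location and fits weighted least squares there (the Gaussian, identity-link special case of the GWGLM); the data are simulated.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 300
      coords = rng.uniform(0, 100, size=(n, 2))               # point locations
      x = rng.normal(size=n)
      beta_true = 0.5 + coords[:, 0] / 100.0                  # spatially varying coefficient
      y = 1.0 + beta_true * x + rng.normal(scale=0.3, size=n)

      def local_fit(focal, bandwidth=20.0):
          d = np.linalg.norm(coords - focal, axis=1)
          w = np.exp(-0.5 * (d / bandwidth) ** 2)             # Gaussian kernel weights
          X = np.column_stack([np.ones(n), x])
          W = np.diag(w)
          beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)    # weighted least squares
          return beta                                         # [local intercept, local slope]

      print(local_fit(np.array([10.0, 50.0])))   # local slope should be near 0.6
      print(local_fit(np.array([90.0, 50.0])))   # local slope should be near 1.4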

  15. OpenFLUID: an open-source software environment for modelling fluxes in landscapes

    NASA Astrophysics Data System (ADS)

    Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc

    2013-04-01

    Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems where many processes interact in time and space. In agro-ecosystems, these processes are mainly physical processes, including hydrological-processes, biological processes and human activities. Modelling such systems requires an interdisciplinary approach, coupling models coming from different disciplines, developed by different teams. In order to support collaborative works, involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture allowing to i) couple models developed de novo or from existing source code, and which are dynamically plugged to the platform, ii) represent landscapes as hierarchical graphs, taking into account multi-scale, spatial heterogeneities and landscape objects connectivity, iii) run and explore simulations in many ways : using the OpenFLUID software interfaces for users (command line interface, graphical user interface), or using external applications such as GNU R through the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies on open source libraries only (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modelers and developers, OpenFLUID provides a dedicated environment for model development, which is based on an open source toolchain, including the Eclipse editor, the GCC compiler and the CMake build system. OpenFLUID is distributed under the GPLv3 open source license, with a special exception allowing to plug existing models licensed under any license. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modelers. OpenFLUID has been involved in many research applications, such as modelling of hydrological network transfer, diagnosis and prediction of water quality taking into account human activities, study of the effect of spatial organization on hydrological fluxes, modelling of surface-subsurface water exchanges, … At LISAH research unit, OpenFLUID is the supporting development platform of the MHYDAS model, which is a distributed model for agrosystems (Moussa et al., 2002, Hydrological Processes, 16, 393-412). OpenFLUID web site : http://www.openfluid-project.org

  16. Adapting a scenario tree model for freedom from disease as surveillance progresses: the Canadian notifiable avian influenza model.

    PubMed

    Christensen, Jette; El Allaki, Farouk; Vallières, André

    2014-05-01

    Scenario tree models with temporal discounting have been applied in four continents to support claims of freedom from animal disease. Recently, a second (new) model was developed for the same population and disease. This is a natural development because surveillance is a dynamic process that needs to adapt to changing circumstances - the difficulty is the justification for, documentation of, presentation of and the acceptance of the changes. Our objective was to propose a systematic approach to present changes to an existing scenario tree model for freedom from disease. We used the example of how we adapted the deterministic Canadian Notifiable Avian Influenza scenario tree model published in 2011 to a stochastic scenario tree model where the definition of sub-populations and the estimation of probability of introduction of the pathogen were modified. We found that the standardized approach by Vanderstichel et al. (2013) with modifications provided a systematic approach to make and present changes to an existing scenario tree model. We believe that the new 2013 CanNAISS scenario tree model is a better model than the 2011 model because the 2013 model included more surveillance data. In particular, the new data on Notifiable Avian Influenza in Canada from the last 5 years were used to improve input parameters and model structure. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
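
    A minimal sketch of the generic scenario-tree arithmetic behind such models is given below (Python): unit-level detection probabilities are combined into a component sensitivity, and the probability of freedom is updated period by period with temporal discounting for the probability of introduction. All numbers are hypothetical placeholders, not CanNAISS parameters.

      def component_sensitivity(n_units, p_infected_unit, se_unit):
          """P(component detects disease | country infected at the design prevalence)."""
          p_detect_per_unit = p_infected_unit * se_unit
          return 1.0 - (1.0 - p_detect_per_unit) ** n_units

      def update_p_free(prior_p_free, sse, p_intro):
          """Bayes update after a period of negative surveillance, then a simple
          temporal discount for the probability of new introduction."""
          posterior = prior_p_free / (prior_p_free + (1.0 - prior_p_free) * (1.0 - sse))
          return posterior * (1.0 - p_intro)

      sse = component_sensitivity(n_units=500, p_infected_unit=0.01, se_unit=0.9)
      p_free = 0.5                                    # uninformed starting prior
      for year in range(1, 6):
          p_free = update_p_free(p_free, sse, p_intro=0.02)
          print(year, round(sse, 3), round(p_free, 3))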

  17. Tools for visually exploring biological networks.

    PubMed

    Suderman, Matthew; Hallett, Michael

    2007-10-15

    Many tools exist for visually exploring biological networks including well-known examples such as Cytoscape, VisANT, Pathway Studio and Patika. These systems play a key role in the development of integrative biology, systems biology and integrative bioinformatics. The trend in the development of these tools is to go beyond 'static' representations of cellular state, towards a more dynamic model of cellular processes through the incorporation of gene expression data, subcellular localization information and time-dependent behavior. We provide a comprehensive review of the relative advantages and disadvantages of existing systems with two goals in mind: to aid researchers in efficiently identifying the appropriate existing tools for data visualization; to describe the necessary and realistic goals for the next generation of visualization tools. In view of the first goal, we provide in the Supplementary Material a systematic comparison of more than 35 existing tools in terms of over 25 different features. Supplementary data are available at Bioinformatics online.

  18. Bioethics: A Rationale and a Model

    ERIC Educational Resources Information Center

    Barman, Charles R.; Rusch, John J.

    1978-01-01

    Discusses the rationale for and development of an undergraduate bioethics course. Based on experiences with the course, general suggestions are offered to instructors planning to add bioethics to existing curricula. (MA)

  19. Secret Wisdom: Spiritual Intelligence in Adolescents

    ERIC Educational Resources Information Center

    Kilcup, Charmayne

    2016-01-01

    Current models of spiritual development suggest that adolescents have limited capacity for spirituality and spiritual experiences. Adolescents are seen to have immature moral and ethical judgment and be incapable of deep spiritual experience due to lack of cognitive development. This mixed-methods study explored the existence of spiritual…

  20. Active Tensor Magnetic Gradiometer System

    DTIC Science & Technology

    2007-11-01

    Tasks include modifying the forward computer models and the TMGS simulator. The active magnetic gradient measurement system is based upon the existing tensor magnetic gradiometer system (TMGS) developed under project MM-1328, "Magnetic Gradiometer System (TMGS) for UXO Detection, Imaging, and Discrimination." The TMGS developed under MM-1328 was successfully tested at the...

  1. The application of feature selection to the development of Gaussian process models for percutaneous absorption.

    PubMed

    Lam, Lun Tak; Sun, Yi; Davey, Neil; Adams, Rod; Prapopoulou, Maria; Brown, Marc B; Moss, Gary P

    2010-06-01

    The aim was to employ Gaussian processes to assess mathematically the nature of a skin permeability dataset and to employ these methods, particularly feature selection, to determine the key physicochemical descriptors which exert the most significant influence on percutaneous absorption, and to compare such models with established existing models. Gaussian processes, including automatic relevance determination (GPRARD) methods, were employed to develop models of percutaneous absorption that identified key physicochemical descriptors of percutaneous absorption. Using MatLab software, the statistical performance of these models was compared with single linear networks (SLN) and quantitative structure-permeability relationships (QSPRs). Feature selection methods were used to examine in more detail the physicochemical parameters used in this study. A range of statistical measures to determine model quality was used. The inherently nonlinear nature of the skin data set was confirmed. The Gaussian process regression (GPR) methods yielded predictive models that offered statistically significant improvements over SLN and QSPR models with regard to predictivity (where the rank order was: GPR > SLN > QSPR). Feature selection analysis determined that the best GPR models were those that contained log P, melting point and the number of hydrogen bond donor groups as significant descriptors. Further statistical analysis also found that great synergy existed between certain parameters. It suggested that a number of the descriptors employed were effectively interchangeable, thus questioning the use of models where discrete variables are output, usually in the form of an equation. The use of a nonlinear GPR method produced models with significantly improved predictivity, compared with SLN or QSPR models. Feature selection methods were able to provide important mechanistic information. However, it was also shown that significant synergy existed between certain parameters, and as such it was possible to interchange certain descriptors (i.e. molecular weight and melting point) without incurring a loss of model quality. Such synergy suggested that a model constructed from discrete terms in an equation may not be the most appropriate way of representing mechanistic understandings of skin absorption.
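
    A minimal sketch of ARD-style Gaussian process regression is shown below, assuming scikit-learn's GaussianProcessRegressor with an anisotropic RBF kernel (one length-scale per descriptor), which plays the role of automatic relevance determination; large fitted length-scales indicate low relevance. The data are simulated stand-ins, not the skin permeability dataset, and scikit-learn replaces the MatLab implementation used in the study.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      # Simulated descriptors: the third column is deliberately irrelevant to the target,
      # so its fitted length-scale should grow large (the ARD idea).
      rng = np.random.default_rng(0)
      X = rng.normal(size=(120, 3))                       # 3 standardized descriptors
      y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(scale=0.1, size=120)

      kernel = RBF(length_scale=[1.0, 1.0, 1.0]) + WhiteKernel(noise_level=0.1)
      gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

      print(gpr.kernel_.k1.length_scale)                  # per-descriptor relevance (inverse)
      print(gpr.score(X, y))                              # in-sample R^2 of the fit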

  2. Evaluating the 100 year floodplain as an indicator of flood risk in low-lying coastal watersheds

    NASA Astrophysics Data System (ADS)

    Sebastian, A.; Brody, S.; Bedient, P. B.

    2013-12-01

    The Gulf of Mexico is the fastest growing region in the United States. Since 1960, the number of housing units built in the low-lying coastal counties has increased by 246%. The region experiences some of the most intense rainfall events in the country, and coastal watersheds are prone to severe flooding characterized by wide floodplains and ponding. This flooding is further exacerbated as urban development encroaches on existing streams and waterways. While the 100 year floodplain should play an important role in our ability to develop disaster resilient communities, recent research has indicated that existing floodplain delineations are a poor indicator of actual flood losses in low-lying coastal regions. Between 2001 and 2005, more than 30% of insurance claims made to FEMA in the Gulf Coast region were outside of the 100 year floodplain, and residential losses amounted to more than $19.3 billion. As population density and investments in this region continue to increase, addressing flood risk in coastal communities should become a priority for engineers, urban planners, and decision makers. This study compares the effectiveness of 1-D and 2-D modeling approaches in spatially capturing flood claims from historical events. Initial results indicate that 2-D models perform much better in coastal environments and may be better suited for floodplain modeling, helping to prevent unintended losses. The results of this study encourage a shift towards better engineering practices using existing 2-D models in order to protect resources and provide guidance for urban development in low-lying coastal regions.

  3. Testing Software Development Project Productivity Model

    NASA Astrophysics Data System (ADS)

    Lipkin, Ilya

    Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in a quest for software development, with existing estimation models often underestimating software development efforts by as much as 500 to 600 percent. To address this issue, existing models usually are calibrated using local data with a small sample size, with the resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis grounded in Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for the customers and suppliers. Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains, such as IT, Command and Control, and Simulation. This research validates findings from previous work concerning software project productivity and leverages those results in this study. The hypothesized project productivity model provides statistical support and validation of expert opinions used by practitioners in the field of software project estimation.

  4. Plans for Aeroelastic Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer; Ballmann, Josef; Bhatia, Kumar; Blades, Eric; Boucke, Alexander; Chwalowski, Pawel; Dietz, Guido; Dowell, Earl; Florance, Jennifer P.; Hansen, Thorsten; hide

    2011-01-01

    This paper summarizes the plans for the first Aeroelastic Prediction Workshop. The workshop is designed to assess the state of the art of computational methods for predicting unsteady flow fields and aeroelastic response. The goals are to provide an impartial forum to evaluate the effectiveness of existing computer codes and modeling techniques, and to identify computational and experimental areas needing additional research and development. Three subject configurations have been chosen from existing wind tunnel data sets where there is pertinent experimental data available for comparison. For each case chosen, the wind tunnel testing was conducted using forced oscillation of the model at specified frequencies

  5. Model-Driven Theme/UML

    NASA Astrophysics Data System (ADS)

    Carton, Andrew; Driver, Cormac; Jackson, Andrew; Clarke, Siobhán

    Theme/UML is an existing approach to aspect-oriented modelling that supports the modularisation and composition of concerns, including crosscutting ones, in design. To date, its lack of integration with model-driven engineering (MDE) techniques has limited its benefits across the development lifecycle. Here, we describe our work on facilitating the use of Theme/UML as part of an MDE process. We have developed a transformation tool that adopts model-driven architecture (MDA) standards. It defines a concern composition mechanism, implemented as a model transformation, to support the enhanced modularisation features of Theme/UML. We evaluate our approach by applying it to the development of mobile, context-aware applications, an application area characterised by many non-functional requirements that manifest themselves as crosscutting concerns.

  6. Diagnostic utility of appetite loss in addition to existing prediction models for community-acquired pneumonia in the elderly: a prospective diagnostic study in acute care hospitals in Japan.

    PubMed

    Takada, Toshihiko; Yamamoto, Yosuke; Terada, Kazuhiko; Ohta, Mitsuyasu; Mikami, Wakako; Yokota, Hajime; Hayashi, Michio; Miyashita, Jun; Azuma, Teruhisa; Fukuma, Shingo; Fukuhara, Shunichi

    2017-11-08

    Diagnosis of community-acquired pneumonia (CAP) in the elderly is often delayed because of atypical presentation and non-specific symptoms, such as appetite loss, falls and disturbance in consciousness. The aim of this study was to investigate the external validity of existing prediction models and the added value of the non-specific symptoms for the diagnosis of CAP in elderly patients. Prospective cohort study. General medicine departments of three teaching hospitals in Japan. A total of 109 elderly patients who consulted for upper respiratory symptoms between 1 October 2014 and 30 September 2016. The reference standard for CAP was chest radiograph evaluated by two certified radiologists. The existing models were externally validated for diagnostic performance by calibration plot and discrimination. To evaluate the additional value of the non-specific symptoms to the existing prediction models, we developed an extended logistic regression model. Calibration, discrimination, category-free net reclassification improvement (NRI) and decision curve analysis (DCA) were investigated in the extended model. Among the existing models, the model by van Vugt demonstrated the best performance, with an area under the curve of 0.75 (95% CI 0.63 to 0.88); the calibration plot showed good fit despite a significant Hosmer-Lemeshow test (p=0.017). Among the non-specific symptoms, appetite loss had a positive likelihood ratio of 3.2 (2.0-5.3), a negative likelihood ratio of 0.4 (0.2-0.7) and an OR of 7.7 (3.0-19.7). Addition of appetite loss to the model by van Vugt led to improved calibration (p=0.48), an NRI of 0.53 (p=0.019) and higher net benefit by DCA. Information on appetite loss improved the performance of an existing model for the diagnosis of CAP in the elderly. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
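
    A minimal sketch of the kind of model extension evaluated here: add the candidate symptom to an existing model's linear predictor and compare discrimination. The column names, file name and use of scikit-learn are assumptions for illustration; the study additionally assessed calibration, NRI and decision curves, which are omitted below.

    ```python
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    # Hypothetical dataset: the van Vugt model's linear predictor, a binary
    # appetite-loss flag, and the radiograph-confirmed CAP outcome.
    df = pd.read_csv("cap_cohort.csv")                     # placeholder file name
    y = df["cap"].values
    base = df[["van_vugt_lp"]].values                      # existing model only
    ext = df[["van_vugt_lp", "appetite_loss"]].values      # existing model + symptom

    auc_base = roc_auc_score(y, LogisticRegression().fit(base, y).predict_proba(base)[:, 1])
    auc_ext = roc_auc_score(y, LogisticRegression().fit(ext, y).predict_proba(ext)[:, 1])
    print(f"AUC existing model: {auc_base:.2f}  extended model: {auc_ext:.2f}")
    ```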

  7. Conceptualising paediatric health disparities: a metanarrative systematic review and unified conceptual framework.

    PubMed

    Ridgeway, Jennifer L; Wang, Zhen; Finney Rutten, Lila J; van Ryn, Michelle; Griffin, Joan M; Murad, M Hassan; Asiedu, Gladys B; Egginton, Jason S; Beebe, Timothy J

    2017-08-04

    There exists a paucity of work in the development and testing of theoretical models specific to childhood health disparities even though they have been linked to the prevalence of adult health disparities including high rates of chronic disease. We conducted a systematic review and thematic analysis of existing models of health disparities specific to children to inform development of a unified conceptual framework. We systematically reviewed articles reporting theoretical or explanatory models of disparities on a range of outcomes related to child health. We searched Ovid Medline In-Process & Other Non-Indexed Citations, Ovid MEDLINE, Ovid Embase, Ovid Cochrane Central Register of Controlled Trials, Ovid Cochrane Database of Systematic Reviews, and Scopus (database inception to 9 July 2015). A metanarrative approach guided the analysis process. A total of 48 studies presenting 48 models were included. This systematic review found multiple models but no consensus on one approach. However, we did discover a fair amount of overlap, such that the 48 models reviewed converged into the unified conceptual framework. The majority of models included factors in three domains: individual characteristics and behaviours (88%), healthcare providers and systems (63%), and environment/community (56%). Only 38% of models included factors in the health and public policies domain. A disease-agnostic unified conceptual framework may inform integration of existing knowledge of child health disparities and guide future research. This multilevel framework can focus attention among clinical, basic and social science research on the relationships between policy, social factors, health systems and the physical environment that impact children's health outcomes. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  8. Combining inferences from models of capture efficiency, detectability, and suitable habitat to classify landscapes for conservation of threatened bull trout

    Treesearch

    James T. Peterson; Jason Dunham

    2003-01-01

    Effective conservation efforts for at-risk species require knowledge of the locations of existing populations. Species presence can be estimated directly by conducting field-sampling surveys or alternatively by developing predictive models. Direct surveys can be expensive and inefficient, particularly for rare and difficult- to-sample species, and models of species...

  9. Comparison of the sensitivity of landscape-fire-succession models to variation in terrain, fuel pattern, climate and weather

    Treesearch

    Geoffrey J. Cary; Robert E. Keane; Robert H. Gardner; Sandra Lavorel; Mike D. Flannigan; Ian D. Davies; Chao Li; James M. Lenihan; T. Scott Rupp; Florent Mouillot

    2006-01-01

    The relative importance of variables in determining area burned is an important management consideration although gaining insights from existing empirical data has proven difficult. The purpose of this study was to compare the sensitivity of modeled area burned to environmental factors across a range of independently-developed landscape-fire-succession models. The...

  10. A Linked Model for Simulating Stand Development and Growth Processes of Loblolly Pine

    Treesearch

    V. Clark Baldwin; Phillip M. Dougherty; Harold E. Burkhart

    1998-01-01

    Linking models of different scales (e.g., process, tree-stand-ecosystem) is essential for furthering our understanding of stand, climatic, and edaphic effects on tree growth and forest productivity. Moreover, linking existing models that differ in scale and levels of resolution quickly identifies knowledge gaps in information required to scale from one level to another...

  11. Decentralizing the "Future Planning" of Public Education. Project SIMU School: Santa Clara County Component.

    ERIC Educational Resources Information Center

    Candoli, I. C.; Leu, Donald J.

    This analysis draws on a variety of experiences with and models of centralized and decentralized school systems now in existence. The decentralized model or profile posed for consideration is intended as a basis for the development of a process by which indigenous models can be established for any locale as unique local variables are identified…

  12. Modeling water yield response to forest cover changes in northern Minnesota

    Treesearch

    S.C. Bernath; E.S. Verry; K.N. Brooks; P.F. Ffolliott

    1982-01-01

    A water yield model (TIMWAT) has been developed to predict changes in water yield following changes in forest cover in northern Minnesota. Two versions of the model exist; one predicts changes in water yield as a function of gross precipitation and time after clearcutting. The second version predicts changes in water yield due to changes in above-ground biomass...

  13. Physical characteristics of shrub and conifer fuels for fire behavior models

    Treesearch

    Jonathan R. Gallacher; Thomas H. Fletcher; Victoria Lansinger; Sydney Hansen; Taylor Ellsworth; David R. Weise

    2017-01-01

    The physical properties and dimensions of foliage are necessary inputs for some fire spread models. Currently, almost no data exist on these plant characteristics to fill this need. In this report, we measured the physical properties and dimensions of the foliage from 10 live shrub and conifer fuels throughout a 1-year period. We developed models to predict relative...

  14. Open-source approaches for the repurposing of existing or failed candidate drugs: learning from and applying the lessons across diseases

    PubMed Central

    Allarakhia, Minna

    2013-01-01

    Repurposing has the objective of targeting existing drugs and failed, abandoned, or yet-to-be-pursued clinical candidates to new disease areas. The open-source model permits the sharing of data, resources, compounds, clinical molecules, small libraries, and screening platforms to cost-effectively advance old drugs and/or candidates into clinical re-development. Clearly, at the core of drug-repurposing activities is collaboration, in many cases progressing beyond the open sharing of resources, technology, and intellectual property, to the sharing of facilities and joint program development to foster drug-repurposing human-capacity development. A variety of initiatives under way for drug repurposing, including those targeting rare and neglected diseases, are discussed in this review and provide insight into the stakeholders engaged in drug-repurposing discovery, the models of collaboration used, the intellectual property-management policies crafted, and the human capacity developed. In the case of neglected tropical diseases, it is suggested that the development of human capital be a central aspect of drug-repurposing programs. Open-source models can support human-capital development through collaborative data generation, open compound access, open and collaborative screening, preclinical and possibly clinical studies. Given the urgency of drug development for neglected tropical diseases, the review suggests elements from current repurposing programs be extended to the neglected tropical diseases arena. PMID:23966771

  15. Open-source approaches for the repurposing of existing or failed candidate drugs: learning from and applying the lessons across diseases.

    PubMed

    Allarakhia, Minna

    2013-01-01

    Repurposing has the objective of targeting existing drugs and failed, abandoned, or yet-to-be-pursued clinical candidates to new disease areas. The open-source model permits the sharing of data, resources, compounds, clinical molecules, small libraries, and screening platforms to cost-effectively advance old drugs and/or candidates into clinical re-development. Clearly, at the core of drug-repurposing activities is collaboration, in many cases progressing beyond the open sharing of resources, technology, and intellectual property, to the sharing of facilities and joint program development to foster drug-repurposing human-capacity development. A variety of initiatives under way for drug repurposing, including those targeting rare and neglected diseases, are discussed in this review and provide insight into the stakeholders engaged in drug-repurposing discovery, the models of collaboration used, the intellectual property-management policies crafted, and the human capacity developed. In the case of neglected tropical diseases, it is suggested that the development of human capital be a central aspect of drug-repurposing programs. Open-source models can support human-capital development through collaborative data generation, open compound access, open and collaborative screening, preclinical and possibly clinical studies. Given the urgency of drug development for neglected tropical diseases, the review suggests elements from current repurposing programs be extended to the neglected tropical diseases arena.

  16. Modeling the dynamic crush of impact mitigating materials

    NASA Astrophysics Data System (ADS)

    Logan, R. W.; McMichael, L. D.

    1995-05-01

    Crushable materials are commonly utilized in the design of structural components to absorb energy and mitigate shock during the dynamic impact of a complex structure, such as an automobile chassis or drum-type shipping container. The development and application of several finite-element material models, created at various times at LLNL for DYNA3D, are discussed. Together, these models are able to account for several of the predominant mechanisms which typically influence the dynamic mechanical behavior of crushable materials. One issue we addressed was that no single existing model would account for the entire gamut of constitutive features which are important for crushable materials. Thus, we describe the implementation and use of an additional material model which attempts to provide a more comprehensive model of the mechanics of crushable material behavior. This model combines features of the pre-existing DYNA models and incorporates some new features as well in an invariant large-strain formulation. In addition to examining the behavior of a unit cell in uniaxial compression, two cases were chosen to evaluate the capabilities and accuracy of the various material models in DYNA. In the first case, a model for foam-filled box beams was developed and compared to test data from a four-point bend test. The model was subsequently used to study its effectiveness in energy absorption in an aluminum extrusion, spaceframe, vehicle chassis. The second case examined the response of the AT-400A shipping container and the performance of the overpack material during accident environments selected from 10CFR71 and IAEA regulations.

  17. Hysteretic Models Considering Axial-Shear-Flexure Interaction

    NASA Astrophysics Data System (ADS)

    Ceresa, Paola; Negrisoli, Giorgio

    2017-10-01

    Most of the existing numerical models implemented in finite element (FE) software, at the current state of the art, are not capable of describing with sufficient reliability the interaction between axial, shear and flexural actions under cyclic loading (e.g. seismic actions), neglecting effects that are crucial for predicting the nature of the collapse of reinforced concrete (RC) structural elements. Only a few existing 3D volume models or fibre beam models can provide a reasonably accurate response, but they are still computationally inefficient for typical applications in earthquake engineering and are characterized by very complex formulations. Thus, discrete models with lumped plasticity hinges may be the preferred choice for modelling the hysteretic behaviour due to cyclic loading conditions, in particular with reference to implementation in a commercial software package. These considerations motivated this research work, which focuses on the development of a model for RC beam-column elements able to consider degradation effects and interaction between the actions under cyclic loading conditions. In order to develop a model for a general 3D discrete hinge element able to take into account the axial-shear-flexural interaction, it is necessary to provide an implementation which involves a predictor-corrector iterative scheme. Furthermore, a reliable constitutive model based on damage plasticity theory is formulated and implemented for its numerical validation. The aim of this research work is to provide the formulation of a numerical model which will allow implementation within an FE software package for nonlinear cyclic analysis of RC structural members. The developed model accounts for stiffness degradation effects and stiffness recovery under loading reversal.

  18. Short-term responses of leaf growth rate to water deficit scale up to whole-plant and crop levels: an integrated modelling approach in maize.

    PubMed

    Chenu, Karine; Chapman, Scott C; Hammer, Graeme L; McLean, Greg; Salah, Halim Ben Haj; Tardieu, François

    2008-03-01

    Physiological and genetic studies of leaf growth often focus on short-term responses, leaving a gap to whole-plant models that predict biomass accumulation, transpiration and yield at crop scale. To bridge this gap, we developed a model that combines an existing model of leaf 6 expansion in response to short-term environmental variations with a model coordinating the development of all leaves of a plant. The latter was based on: (1) rates of leaf initiation, appearance and end of elongation measured in field experiments; and (2) the hypothesis of an independence of the growth between leaves. The resulting whole-plant leaf model was integrated into the generic crop model APSIM which provided dynamic feedback of environmental conditions to the leaf model and allowed simulation of crop growth at canopy level. The model was tested in 12 field situations with contrasting temperature, evaporative demand and soil water status. In observed and simulated data, high evaporative demand reduced leaf area at the whole-plant level, and short water deficits affected only leaves developing during the stress, either visible or still hidden in the whorl. The model adequately simulated whole-plant profiles of leaf area with a single set of parameters that applied to the same hybrid in all experiments. It was also suitable to predict biomass accumulation and yield of a similar hybrid grown in different conditions. This model extends to field conditions existing knowledge of the environmental controls of leaf elongation, and can be used to simulate how their genetic controls flow through to yield.

  19. On the Bayesian Treed Multivariate Gaussian Process with Linear Model of Coregionalization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Konomi, Bledar A.; Karagiannis, Georgios; Lin, Guang

    2015-02-01

    The Bayesian treed Gaussian process (BTGP) has gained popularity in recent years because it provides a straightforward mechanism for modeling non-stationary data and can alleviate computational demands by fitting models to less data. The extension of BTGP to the multivariate setting requires us to model the cross-covariance and to propose efficient algorithms that can deal with trans-dimensional MCMC moves. In this paper we extend the cross-covariance of the Bayesian treed multivariate Gaussian process (BTMGP) to that of linear model of Coregionalization (LMC) cross-covariances. Different strategies have been developed to improve the MCMC mixing and invert smaller matrices in the Bayesian inference. Moreover, we compare the proposed BTMGP with existing multiple BTGP and BTMGP in test cases and multiphase flow computer experiment in a full scale regenerator of a carbon capture unit. The use of the BTMGP with LMC cross-covariance helped to predict the computer experiments relatively better than existing competitors. The proposed model has a wide variety of applications, such as computer experiments and environmental data. In the case of computer experiments we also develop an adaptive sampling strategy for the BTMGP with LMC cross-covariance function.
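
    For reference, the standard form of the LMC cross-covariance that serves as the building block here (the paper's notation may differ):

    $$
    \operatorname{cov}\!\big(y_i(\mathbf{x}),\, y_j(\mathbf{x}')\big)
      = \sum_{q=1}^{Q} \big[B_q\big]_{ij}\, k_q(\mathbf{x}, \mathbf{x}'),
      \qquad B_q = A_q A_q^{\mathsf{T}} \succeq 0,
    $$

    where the $k_q$ are valid scalar covariance functions and the positive semi-definite coregionalization matrices $B_q$ encode the correlations among the outputs; in the treed construction, such a covariance is fitted within each leaf partition of the input space.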

  20. Development and Testing of Building Energy Model Using Non-Linear Auto Regression Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Arida, Maya Ahmad

    The concept of sustainable development emerged in 1972 and has since become one of the most important approaches to conserving natural resources and energy. With rising energy costs and increasing awareness of the effects of global warming, the development of building energy saving methods and models has become increasingly necessary for a sustainable future. According to the U.S. Energy Information Administration (EIA), buildings in the U.S. today consume 72 percent of the electricity produced and use 55 percent of U.S. natural gas. Buildings account for about 40 percent of the energy consumed in the United States, more than industry or transportation; of this energy, heating and cooling systems use about 55 percent. If energy-use trends continue, buildings will become the largest consumer of global energy by 2025. This thesis proposes procedures and analysis techniques for building energy systems and optimization methods using time series auto regression artificial neural networks. The model predicts whole-building energy consumption as a function of four input variables: dry-bulb and wet-bulb outdoor air temperatures, hour of day, and type of day. The proposed model and the optimization process are tested using data collected from an existing building located in Greensboro, NC. The testing results show that the model captures the system performance very well. An optimization method was also developed to automate the process of finding the model structure that produces the most accurate predictions against the actual data. The results show that the developed model can provide results sufficiently accurate for use in various energy efficiency and savings estimation applications.
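
    A rough sketch of the non-linear autoregressive idea described above, using lagged consumption plus the four exogenous inputs to train a small neural network. The column names, file name, lag depth and the use of scikit-learn (rather than the thesis's own tooling) are assumptions for illustration, and this open-loop formulation is only a stand-in for a full NARX network.

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.neural_network import MLPRegressor

    # Hypothetical hourly records: whole-building energy use plus the four
    # inputs named above (dry-bulb, wet-bulb, hour of day, day type).
    df = pd.read_csv("building_hourly.csv")      # placeholder file name
    lags = 24                                     # feed back the previous 24 hours

    energy = df["energy"].values
    X_exog = df[["t_dry", "t_wet", "hour", "day_type"]].values[lags:]
    X_lag = np.column_stack([energy[lags - k:-k] for k in range(1, lags + 1)])
    X = np.hstack([X_exog, X_lag])                # exogenous inputs + lagged outputs
    y = energy[lags:]

    model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(X, y)
    print("training R^2:", model.score(X, y))
    ```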

  1. Refinements in a viscoplastic model

    NASA Technical Reports Server (NTRS)

    Freed, A. D.; Walker, K. P.

    1989-01-01

    A thermodynamically admissible theory of viscoplasticity with two internal variables (a back stress and a drag strength) is presented. Six material functions characterize a specific viscoplastic model. In the pursuit of compromise between accuracy and simplicity, a model is developed that is a hybrid of two existing viscoplastic models. A limited number of applications of the model to Al, Cu, and Ni are presented. A novel implicit integration method is also discussed. Applications are made to obtain solutions using this viscoplastic model.

  2. Sperm economy between female mating frequency and male ejaculate allocation.

    PubMed

    Abe, Jun; Kamimura, Yoshitaka

    2015-03-01

    Why females of many species mate multiply is a major question in evolutionary biology. Furthermore, if females accept matings more than once, ejaculates from different males compete for fertilization (sperm competition), which confronts males with the decision of how to allocate their reproductive resources to each mating event. Although most existing models have examined either female mating frequency or male ejaculate allocation while assuming fixed levels of the opposite sex's strategies, these strategies are likely to coevolve. To investigate how the interaction of the two sexes' strategies is influenced by the level of sperm limitation in the population, we developed models in which females adjust their number of allowable matings and males allocate their ejaculate in each mating. Our model predicts that females mate only once or less than once at an even sex ratio or in an extremely female-biased condition, because of female resistance and sperm limitation in the population, respectively. However, in a moderately female-biased condition, males favor partitioning their reproductive budgets across many females, whereas females favor multiple matings to obtain sufficient sperm, which contradicts the predictions of most existing models. We discuss our model's predictions and relationships with the existing models and demonstrate applications for empirical findings.

  3. TumorML: Concept and requirements of an in silico cancer modelling markup language.

    PubMed

    Johnson, David; Cooper, Jonathan; McKeever, Steve

    2011-01-01

    This paper describes the initial groundwork carried out as part of the European Commission funded Transatlantic Tumor Model Repositories project, to develop a new markup language for computational cancer modelling, TumorML. In this paper we describe the motivations for such a language, arguing that current state-of-the-art biomodelling languages are not suited to the cancer modelling domain. We go on to describe the work that needs to be done to develop TumorML, the conceptual design, and a description of what existing markup languages will be used to compose the language specification.

  4. Python scripting in the nengo simulator.

    PubMed

    Stewart, Terrence C; Tripp, Bryan; Eliasmith, Chris

    2009-01-01

    Nengo (http://nengo.ca) is an open-source neural simulator that has been greatly enhanced by the recent addition of a Python script interface. Nengo provides a wide range of features that are useful for physiological simulations, including unique features that facilitate development of population-coding models using the neural engineering framework (NEF). This framework uses information theory, signal processing, and control theory to formalize the development of large-scale neural circuit models. Notably, it can also be used to determine the synaptic weights that underlie observed network dynamics and transformations of represented variables. Nengo provides rich NEF support, and includes customizable models of spike generation, muscle dynamics, synaptic plasticity, and synaptic integration, as well as an intuitive graphical user interface. All aspects of Nengo models are accessible via the Python interface, allowing for programmatic creation of models, inspection and modification of neural parameters, and automation of model evaluation. Since Nengo combines Python and Java, it can also be integrated with any existing Java or 100% Python code libraries. Current work includes connecting neural models in Nengo with existing symbolic cognitive models, creating hybrid systems that combine detailed neural models of specific brain regions with higher-level models of remaining brain areas. Such hybrid models can provide (1) more realistic boundary conditions for the neural components, and (2) more realistic sub-components for the larger cognitive models.
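
    For orientation, a minimal script against the current pure-Python nengo package; the record above describes the earlier Java-based Nengo, whose Python (Jython) interface uses different calls, so this is only an assumed illustration of the NEF-style scripting workflow rather than that release's API.

    ```python
    import numpy as np
    import nengo

    with nengo.Network() as model:
        stim = nengo.Node(lambda t: np.sin(2 * np.pi * t))   # scripted input signal
        a = nengo.Ensemble(n_neurons=100, dimensions=1)      # population encoding the signal
        b = nengo.Ensemble(n_neurons=100, dimensions=1)
        nengo.Connection(stim, a)
        nengo.Connection(a, b, function=lambda x: x ** 2)    # decoders solved per the NEF
        probe = nengo.Probe(b, synapse=0.01)

    with nengo.Simulator(model) as sim:                      # programmatic model evaluation
        sim.run(1.0)
    print(sim.data[probe][-5:])
    ```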

  5. Python Scripting in the Nengo Simulator

    PubMed Central

    Stewart, Terrence C.; Tripp, Bryan; Eliasmith, Chris

    2008-01-01

    Nengo (http://nengo.ca) is an open-source neural simulator that has been greatly enhanced by the recent addition of a Python script interface. Nengo provides a wide range of features that are useful for physiological simulations, including unique features that facilitate development of population-coding models using the neural engineering framework (NEF). This framework uses information theory, signal processing, and control theory to formalize the development of large-scale neural circuit models. Notably, it can also be used to determine the synaptic weights that underlie observed network dynamics and transformations of represented variables. Nengo provides rich NEF support, and includes customizable models of spike generation, muscle dynamics, synaptic plasticity, and synaptic integration, as well as an intuitive graphical user interface. All aspects of Nengo models are accessible via the Python interface, allowing for programmatic creation of models, inspection and modification of neural parameters, and automation of model evaluation. Since Nengo combines Python and Java, it can also be integrated with any existing Java or 100% Python code libraries. Current work includes connecting neural models in Nengo with existing symbolic cognitive models, creating hybrid systems that combine detailed neural models of specific brain regions with higher-level models of remaining brain areas. Such hybrid models can provide (1) more realistic boundary conditions for the neural components, and (2) more realistic sub-components for the larger cognitive models. PMID:19352442

  6. A model for predicting thermal properties of asphalt mixtures from their constituents

    NASA Astrophysics Data System (ADS)

    Keller, Merlin; Roche, Alexis; Lavielle, Marc

    Numerous theoretical and experimental approaches have been developed to predict the effective thermal conductivity of composite materials such as polymers, foams, epoxies, soils and concrete. None of these models has been applied to asphalt concrete. This study attempts to develop a model to predict the thermal conductivity of asphalt concrete from its constituents, which would benefit the asphalt industry by reducing costs and saving time on laboratory testing. Laboratory testing would no longer be required if a pavement mix with the desired thermal properties could be created at the design stage by selecting the correct constituents. This thesis investigated six existing predictive models for applicability to asphalt mixtures, and four standard mathematical techniques were used to develop a regression model to predict the effective thermal conductivity. The effective thermal conductivities of 81 asphalt specimens were used as the response variables, and the thermal conductivities and volume fractions of their constituents were used as the predictors. The statistical analyses showed that the measured values of thermal conductivities of the mixtures are affected by the bitumen and aggregate content, but not by the air content. In contrast, the predictions of some of the investigated models are highly sensitive to air voids, but not to bitumen and/or aggregate content. Additionally, the comparison of the experimental with the analytical data showed that none of the existing models gave satisfactory results; on the other hand, two regression models (Exponential 1* and Linear 3*) are promising for asphalt concrete.

  7. A novel client service quality measuring model and an eHealthcare mitigating approach.

    PubMed

    Cheng, L M; Choi, Wai Ping Choi; Wong, Anita Yiu Ming

    2016-07-01

    Facing population ageing in Hong Kong, the demand for long-term elderly health care services is increasing. The challenge is to support a good quality of service under the constraints imposed by the recent shortage of nursing and care services professionals, without redesigning the workflow operated in the existing elderly health care industries. The total QoS measure based on a finite capacity queuing model is a reliable and effective measurement of quality of service. The value is suitable for assessing the staffing level and offers a measure of efficiency enhancement when incorporating new technologies such as ICT. The implemented system has improved the quality of service by more than 14%, and the released manpower resource will allow clinical care providers to offer further value-added services without actually increasing head count. We have developed a novel quality of service measurement for clinical care services based on a multi-queue, finite capacity queue model M/M/c/K/n, and the measurement is useful for estimating the shortage of staff resources in a caring institution. It is essential for future integration with the existing widely used assessment model to develop reliable measuring limits which allow an effective measurement of public funds used in health care industries. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
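
    To illustrate how a queueing-based QoS figure of this kind is computed, the sketch below evaluates the simpler M/M/c/K queue (finite capacity, infinite caller population) rather than the paper's finite-population M/M/c/K/n model; the parameter values are invented.

    ```python
    from math import factorial

    def mmck_metrics(lam, mu, c, K):
        """Steady-state probabilities and simple QoS measures for an M/M/c/K
        queue: Poisson arrivals at rate lam, service rate mu per server,
        c servers, capacity K (including customers in service)."""
        a = lam / mu
        def unnorm(n):
            if n < c:
                return a**n / factorial(n)
            return a**n / (factorial(c) * c**(n - c))
        p = [unnorm(n) for n in range(K + 1)]
        total = sum(p)
        p = [x / total for x in p]
        L = sum(n * pn for n, pn in enumerate(p))   # mean number in system
        lam_eff = lam * (1 - p[K])                  # arrivals not blocked
        W = L / lam_eff                             # mean time in system (Little's law)
        return p[K], L, W

    blocking, L, W = mmck_metrics(lam=8.0, mu=1.0, c=10, K=15)
    print(f"blocking={blocking:.3f}, mean in system={L:.2f}, mean time={W:.2f}")
    ```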

  8. Effects of different Fe supplies on mineral partitioning and remobilization during the reproductive development of rice (Oryza sativa L.)

    USDA-ARS?s Scientific Manuscript database

    Minimal information exists on whole-plant dynamics of mineral flow through rice plants or on the source tissues responsible for mineral export to developing seeds. Understanding these phenomena in a model plant could help in the development of nutritionally enhanced crop cultivars. A whole-plant acc...

  9. Developing Library GIS Services for Humanities and Social Science: An Action Research Approach

    ERIC Educational Resources Information Center

    Kong, Ningning; Fosmire, Michael; Branch, Benjamin Dewayne

    2017-01-01

    In the academic libraries' efforts to support digital humanities and social science, GIS service plays an important role. However, no general service model exists for how libraries can develop GIS services to best engage with digital humanities and social science. In this study, we adopted the action research method to develop and…

  10. A Point System for Predicting 10-Year Risk of Developing Type 2 Diabetes Mellitus in Japanese Men: Aichi Workers' Cohort Study.

    PubMed

    Yatsuya, Hiroshi; Li, Yuanying; Hirakawa, Yoshihisa; Ota, Atsuhiko; Matsunaga, Masaaki; Haregot, Hilawe Esayas; Chiang, Chifa; Zhang, Yan; Tamakoshi, Koji; Toyoshima, Hideaki; Aoyama, Atsuko

    2018-03-17

    Relatively little evidence exists for type 2 diabetes mellitus (T2DM) prediction models from long-term follow-up studies in East Asians. This study aims to develop a point-based prediction model for 10-year risk of developing T2DM in middle-aged Japanese men. We followed 3,540 male participants of the Aichi Workers' Cohort Study, who were aged 35-64 years and were free of diabetes in 2002, until March 31, 2015. Baseline age, body mass index (BMI), smoking status, alcohol consumption, regular exercise, medication for dyslipidemia, diabetes family history, and blood levels of triglycerides (TG), high density lipoprotein cholesterol (HDLC) and fasting blood glucose (FBG) were examined using a Cox proportional hazards model. Variables significantly associated with T2DM in univariable models were simultaneously entered into a multivariable model, and the final model was determined using backward variable selection. The performance of an existing T2DM model applied to the current dataset was compared with that of the present study's model. During the median follow-up of 12.2 years, 342 incident T2DM cases were documented. The prediction system using points assigned to age, BMI, smoking status, diabetes family history, and TG and FBG showed reasonable discrimination (c-index: 0.77) and goodness-of-fit (Hosmer-Lemeshow test, P = 0.22). The present model outperformed the previous one in the present subjects. The point system, once validated in other populations, could be applied to middle-aged Japanese male workers to identify those at high risk of developing T2DM. In addition, further investigation is also required to examine whether the use of this system will reduce incidence.
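
    A condensed sketch of the workflow described above, fitting a Cox model with the lifelines package and converting coefficients to integer points in the spirit of Framingham-style point systems. The file name, column names and the point-scaling reference increment are assumptions and do not reproduce the cohort's actual coding.

    ```python
    import pandas as pd
    from lifelines import CoxPHFitter

    # Hypothetical cohort file: one row per participant, follow-up years,
    # incident-T2DM indicator, and baseline predictors.
    df = pd.read_csv("aichi_cohort.csv")          # placeholder file name
    cols = ["age", "bmi", "smoking", "family_history", "tg", "fbg",
            "followup_years", "t2dm"]

    cph = CoxPHFitter()
    cph.fit(df[cols], duration_col="followup_years", event_col="t2dm")
    cph.print_summary()                           # hazard ratios and concordance

    # Convert log-hazard coefficients to points, using 5 years of age as the
    # reference increment (the usual point-system construction).
    base_unit = cph.params_["age"] * 5
    points = (cph.params_ / base_unit).round().astype(int)
    print(points)
    ```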

  11. Modeling highly transient flow, mass, and heat transport in the Chattahoochee River near Atlanta, Georgia

    USGS Publications Warehouse

    Jobson, Harvey E.; Keefer, Thomas N.

    1979-01-01

    A coupled flow-temperature model has been developed and verified for a 27.9-km reach of the Chattahoochee River between Buford Dam and Norcross, Ga. Flow in this reach of the Chattahoochee is continuous but highly regulated by Buford Dam, a flood-control and hydroelectric facility located near Buford, Ga. Calibration and verification utilized two sets of data collected under highly unsteady discharge conditions. Existing solution techniques, with certain minor improvements, were applied to verify the existing technology of flow and transport modeling. A linear, implicit finite-difference flow model was coupled with implicit, finite-difference transport and temperature models. Both the conservative and nonconservative forms of the transport equation were solved, and the difference in the predicted concentrations of dye were found to be insignificant. The temperature model, therefore, was based on the simpler nonconservative form of the transport equation. (Woodard-USGS)
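
    As a toy illustration of the implicit finite-difference transport component described above (the actual USGS model also couples flow, dispersion and heat exchange), the sketch below advances a dye pulse with a fully implicit upwind advection step; the reach discretization, velocity and time step are invented.

    ```python
    import numpy as np

    def implicit_advection_step(c, u, dx, dt, c_upstream):
        """One fully implicit upwind advection step; the lower-bidiagonal
        system is solved by a single forward sweep down the reach."""
        r = u * dt / dx
        c_new = np.empty_like(c)
        c_new[0] = c_upstream                      # upstream boundary condition
        for i in range(1, len(c)):
            c_new[i] = (c[i] + r * c_new[i - 1]) / (1.0 + r)
        return c_new

    x = np.linspace(0.0, 27900.0, 280)             # 27.9 km reach, ~100 m cells
    c = np.exp(-((x - 3000.0) / 500.0) ** 2)       # initial dye pulse
    for _ in range(600):                           # ten hours at a 60 s time step
        c = implicit_advection_step(c, u=0.8, dx=x[1] - x[0], dt=60.0, c_upstream=0.0)
    ```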

  12. Mammographic density, breast cancer risk and risk prediction

    PubMed Central

    Vachon, Celine M; van Gils, Carla H; Sellers, Thomas A; Ghosh, Karthik; Pruthi, Sandhya; Brandt, Kathleen R; Pankratz, V Shane

    2007-01-01

    In this review, we examine the evidence for mammographic density as an independent risk factor for breast cancer, describe the risk prediction models that have incorporated density, and discuss the current and future implications of using mammographic density in clinical practice. Mammographic density is a consistent and strong risk factor for breast cancer in several populations and across age at mammogram. Recently, this risk factor has been added to existing breast cancer risk prediction models, increasing the discriminatory accuracy with its inclusion, albeit slightly. With validation, these models may replace the existing Gail model for clinical risk assessment. However, absolute risk estimates resulting from these improved models are still limited in their ability to characterize an individual's probability of developing cancer. Promising new measures of mammographic density, including volumetric density, which can be standardized using full-field digital mammography, will likely result in a stronger risk factor and improve accuracy of risk prediction models. PMID:18190724

  13. A passive and active microwave-vector radiative transfer (PAM-VRT) model

    NASA Astrophysics Data System (ADS)

    Yang, Jun; Min, Qilong

    2015-11-01

    A passive and active microwave vector radiative transfer (PAM-VRT) package has been developed. This fast and accurate forward microwave model, with flexible and versatile input and output components, self-consistently and realistically simulates measurements/radiation of passive and active microwave sensors. The core PAM-VRT, microwave radiative transfer model, consists of five modules: gas absorption (two line-by-line databases and four fast models); hydrometeor property of water droplets and ice (spherical and nonspherical) particles; surface emissivity (from Community Radiative Transfer Model (CRTM)); vector radiative transfer of successive order of scattering (VSOS); and passive and active microwave simulation. The PAM-VRT package has been validated against other existing models, demonstrating good accuracy. The PAM-VRT not only can be used to simulate or assimilate measurements of existing microwave sensors, but also can be used to simulate observation results at some new microwave sensors.

  14. Humanitarian response: improving logistics to save lives.

    PubMed

    McCoy, Jessica

    2008-01-01

    Each year, millions of people worldwide are affected by disasters, underscoring the importance of effective relief efforts. Many highly visible disaster responses have been inefficient and ineffective. Humanitarian agencies typically play a key role in disaster response (eg, procuring and distributing relief items to an affected population, assisting with evacuation, providing healthcare, assisting in the development of long-term shelter), and thus their efficiency is critical for a successful disaster response. The field of disaster and emergency response modeling is well established, but the application of such techniques to humanitarian logistics is relatively recent. This article surveys models of humanitarian response logistics and identifies promising opportunities for future work. Existing models analyze a variety of preparation and response decisions (eg, warehouse location and the distribution of relief supplies), consider both natural and manmade disasters, and typically seek to minimize cost or unmet demand. Opportunities to enhance the logistics of humanitarian response include the adaptation of models developed for general disaster response; the use of existing models, techniques, and insights from the literature on commercial supply chain management; the development of working partnerships between humanitarian aid organizations and private companies with expertise in logistics; and the consideration of behavioral factors relevant to a response. Implementable, realistic models that support the logistics of humanitarian relief can improve the preparation for and the response to disasters, which in turn can save lives.

  15. System of experts for intelligent data management (SEIDAM)

    NASA Technical Reports Server (NTRS)

    Goodenough, David G.; Iisaka, Joji; Fung, KO

    1993-01-01

    A proposal to conduct research and development on a system of expert systems for intelligent data management (SEIDAM) is being developed. CCRS has much expertise in developing systems for integrating geographic information with space and aircraft remote sensing data and in managing large archives of remotely sensed data. SEIDAM will be composed of expert systems grouped in three levels. At the lowest level, the expert systems will manage and integrate data from diverse sources, taking account of symbolic representation differences and varying accuracies. Existing software can be controlled by these expert systems, without rewriting existing software into an Artificial Intelligence (AI) language. At the second level, SEIDAM will take the interpreted data (symbolic and numerical) and combine these with data models. At the top level, SEIDAM will respond to user goals for predictive outcomes given existing data. The SEIDAM Project will address the research areas of expert systems, data management, storage and retrieval, and user access and interfaces.

  16. Usability evaluation of mobile applications; where do we stand?

    NASA Astrophysics Data System (ADS)

    Zahra, Fatima; Hussain, Azham; Mohd, Haslina

    2017-10-01

    The range and availability of mobile applications is expanding rapidly. With the increased processing power available on portable devices, developers are increasing the range of services by embracing smartphones in their extensive and diverse practices. However, usability testing and evaluation of mobile applications have not yet reached the accuracy level of those for web-based applications. The existing usability models do not adequately capture the complexities of interacting with applications on a mobile platform. Therefore, this study aims to present a review of existing usability models for mobile applications. These models are in their infancy, but with time and more research they may eventually be adopted. Moreover, different categories of mobile apps (medical, entertainment, education) possess different functional and non-functional requirements; thus customized models are required for diverse mobile applications.

  17. Prospects for rebuilding primary care using the patient-centered medical home.

    PubMed

    Landon, Bruce E; Gill, James M; Antonelli, Richard C; Rich, Eugene C

    2010-05-01

    Existing research suggests that models of enhanced primary care lead to health care systems with better performance. What the research does not show is whether such an approach is feasible or likely to be effective within the U.S. health care system. Many commentators have adopted the model of the patient-centered medical home as policy shorthand to address the reinvention of primary care in the United States. We analyze potential barriers to implementing the medical home model for policy makers and practitioners. Among others, these include developing new payment models, as well as the need for up-front funding to assemble the personnel and infrastructure required by an enhanced non-visit-based primary care practice and methods to facilitate transformation of existing practices to functioning medical homes.

  18. Stem cell-derived organoids to model gastrointestinal facets of cystic fibrosis

    PubMed Central

    Hohwieler, Meike; Perkhofer, Lukas; Liebau, Stefan; Seufferlein, Thomas; Müller, Martin

    2016-01-01

    Cystic fibrosis (CF) is one of the most frequently occurring inherited human diseases caused by mutations in the cystic fibrosis transmembrane conductance regulator (CFTR) which lead to ample defects in anion transport and epithelial fluid secretion. Existing models lack both access to early stages of CF development and a coeval focus on the gastrointestinal CF phenotypes, which become increasingly important due to the increased life span of the affected individuals. Here, we provide a comprehensive overview of gastrointestinal facets of CF and the opportunity to model these in various systems in an attempt to understand and treat CF. A particular focus is given on forward-leading organoid cultures, which may circumvent current limitations of existing models and thereby provide a platform for drug testing and understanding of disease pathophysiology in gastrointestinal organs. PMID:28815024

  19. Extending Maxwell's equations for dielectric materials using analytical principles from viscoelasticity based on the fractional calculus

    NASA Astrophysics Data System (ADS)

    Wharmby, Andrew William

    Existing fractional calculus models having a non-empirical basis used to describe constitutive relationships between stress and strain in viscoelastic materials are modified to employ all orders of fractional derivatives between zero and one. Parallels between viscoelastic and dielectric theory are drawn so that these modified fractional calculus based models for viscoelastic materials may be used to describe relationships between electric flux density and electric field intensity in dielectric materials. The resulting fractional calculus based dielectric relaxation model is tested using existing complex permittivity data in the radio-frequency bandwidth of a wide variety of homogeneous materials. The consequences that the application of this newly developed fractional calculus based dielectric relaxation model has on Maxwell's equations are also examined through the effects of dielectric dissipation and dispersion.
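
    For context, the canonical fractional-order dielectric relaxation form that such viscoelastic-dielectric analogies lead to is the Cole-Cole permittivity function (a standard result quoted here for orientation, not necessarily the exact model developed in this work):

    $$
    \varepsilon^{*}(\omega) \;=\; \varepsilon_{\infty} \;+\; \frac{\varepsilon_{s}-\varepsilon_{\infty}}{1+(\mathrm{i}\omega\tau)^{\alpha}},
    \qquad 0<\alpha\le 1,
    $$

    which reduces to classical Debye relaxation when $\alpha = 1$, while fractional orders $0<\alpha<1$ produce the broadened, dispersive loss behaviour observed in many materials.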

  20. Options for developing modernized geodetic datum for Nepal following the April 25, 2015 Mw7.8 Gorkha earthquake

    NASA Astrophysics Data System (ADS)

    Pearson, Chris; Manandhar, Niraj; Denys, Paul

    2017-09-01

    Along with the damage to buildings and infrastructure, the April 25, 2015 Mw7.8 Gorkha earthquake caused significant deformation over a large area of eastern Nepal with displacements of over 2 m recorded in the vicinity of Kathmandu. Nepal currently uses a classical datum developed in 1984 by the Royal (UK) Engineers in collaboration with the Nepal Survey Department. It has served Nepal well; however, the recent earthquakes have provided an impetus for developing a semi-dynamic datum that will be based on ITRF2014 and have the capacity to correct for tectonic deformation. In the scenario we present here, the datum would be based on ITRF2014 with a reference epoch set some time after the end of the current sequence of earthquakes. The deformation model contains a grid of the secular velocity field combined with models of the Gorkha earthquake and the May 12 Mw7.3 aftershock. We have developed a preliminary velocity field by collating GPS derived crustal velocities from four previous studies for Nepal and adjacent parts of China and India and aligning them to the ITRF. Patches for the co-seismic part of the deformation for the Gorkha earthquake and the May 12, 2015 Mw 7.3 aftershock are based on published dislocation models. High order control would be a CORS network based around the existing Nepal GPS Array. Coordinates for existing lower order control would be determined by readjusting existing survey measurements and these would be combined with a series of new control stations spread throughout Nepal.
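
    A minimal sketch of how a semi-dynamic datum correction of this kind is applied: interpolate the secular velocity (and, where applicable, the coseismic patch) at a point and remove the accumulated deformation to bring an observed coordinate back to the reference epoch. All grids, values and the reference/earthquake epochs below are placeholders, not the datum's actual deformation model.

    ```python
    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    # Placeholder grids: a secular velocity field (m/yr) and a coseismic
    # displacement patch (m) on a lat/lon grid. Real grids would come from the
    # collated GPS velocities and the published Gorkha dislocation models.
    lats = np.linspace(26.0, 31.0, 51)
    lons = np.linspace(80.0, 89.0, 91)
    vel_e = RegularGridInterpolator((lats, lons), np.full((51, 91), 0.035))  # east, m/yr
    vel_n = RegularGridInterpolator((lats, lons), np.full((51, 91), 0.030))  # north, m/yr
    cos_e = RegularGridInterpolator((lats, lons), np.zeros((51, 91)))        # coseismic east, m
    cos_n = RegularGridInterpolator((lats, lons), np.zeros((51, 91)))        # coseismic north, m

    def to_reference_epoch(lat, lon, e_obs, n_obs, t_obs, t_ref, eq_epoch=2015.31):
        """Propagate observed east/north coordinates back to the datum reference
        epoch: subtract secular motion, and subtract the coseismic jump only if
        the earthquake falls between the reference epoch and the observation."""
        p = np.array([[lat, lon]])
        dt = t_obs - t_ref
        de = vel_e(p)[0] * dt
        dn = vel_n(p)[0] * dt
        if t_ref < eq_epoch <= t_obs:
            de += cos_e(p)[0]
            dn += cos_n(p)[0]
        return e_obs - de, n_obs - dn

    print(to_reference_epoch(27.7, 85.3, 1000.0, 2000.0, t_obs=2020.5, t_ref=2018.0))
    ```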

  1. A stirling engine computer model for performance calculations

    NASA Technical Reports Server (NTRS)

    Tew, R.; Jefferies, K.; Miao, D.

    1978-01-01

    To support the development of the Stirling engine as a possible alternative to the automobile spark-ignition engine, the thermodynamic characteristics of the Stirling engine were analyzed and modeled on a computer. The modeling techniques used are presented. The performance of an existing rhombic-drive Stirling engine was simulated by use of this computer program, and some typical results are presented. Engine tests are planned in order to evaluate this model.

  2. Recent Developments in Smart Adaptive Structures for Solar Sailcraft

    NASA Technical Reports Server (NTRS)

    Whorton, M. S.; Kim, Y. K.; Oakley, J.; Adetona, O.; Keel, L. H.

    2007-01-01

    The "Smart Adaptive Structures for Solar Sailcraft" development activity at MSFC has investigated issues associated with understanding how to model and scale the subsystem and multi-body system dynamics of a gossamer solar sailcraft with the objective of designing sailcraft attitude control systems. This research and development activity addressed three key tasks that leveraged existing facilities and core competencies of MSFC to investigate dynamics and control issues of solar sails. Key aspects of this effort included modeling and testing of a 30 m deployable boom; modeling of the multi-body system dynamics of a gossamer sailcraft; investigation of control-structures interaction for gossamer sailcraft; and development and experimental demonstration of adaptive control technologies to mitigate control-structures interaction.

  3. Exploring international clinical education in US-based programs: identifying common practices and modifying an existing conceptual model of international service-learning.

    PubMed

    Pechak, Celia M; Black, Jill D

    2014-02-01

    Increasingly physical therapist students complete part of their clinical training outside of their home country. This trend is understudied. The purposes of this study were to: (1) explore, in depth, various international clinical education (ICE) programs; and (2) determine whether the Conceptual Model of Optimal International Service-Learning (ISL) could be applied or adapted to represent ICE. Qualitative content analysis was used to analyze ICE programs and consider modification of an existing ISL conceptual model for ICE. Fifteen faculty in the United States currently involved in ICE were interviewed. The interview transcriptions were systematically analyzed by two researchers. Three models of ICE practices emerged: (1) a traditional clinical education model where local clinical instructors (CIs) focus on the development of clinical skills; (2) a global health model where US-based CIs provide the supervision in the international setting, and learning outcomes emphasized global health and cultural competency; and (3) an ICE/ISL hybrid where US-based CIs supervise the students, and the foci includes community service. Additionally the data supported revising the ISL model's essential core conditions, components and consequence for ICE. The ICE conceptual model may provide a useful framework for future ICE program development and research.

  4. Towards a New Generation of Agricultural System Data, Models and Knowledge Products: Design and Improvement

    NASA Technical Reports Server (NTRS)

    Antle, John M.; Basso, Bruno; Conant, Richard T.; Godfray, H. Charles J.; Jones, James W.; Herrero, Mario; Howitt, Richard E.; Keating, Brian A.; Munoz-Carpena, Rafael; Rosenzweig, Cynthia

    2016-01-01

    This paper presents ideas for a new generation of agricultural system models that could meet the needs of a growing community of end-users exemplified by a set of Use Cases. We envision new data, models and knowledge products that could accelerate the innovation process that is needed to achieve the goal of achieving sustainable local, regional and global food security. We identify desirable features for models, and describe some of the potential advances that we envisage for model components and their integration. We propose an implementation strategy that would link a "pre-competitive" space for model development to a "competitive space" for knowledge product development and through private-public partnerships for new data infrastructure. Specific model improvements would be based on further testing and evaluation of existing models, the development and testing of modular model components and integration, and linkages of model integration platforms to new data management and visualization tools.

  5. Towards a new generation of agricultural system data, models and knowledge products: Design and improvement.

    PubMed

    Antle, John M; Basso, Bruno; Conant, Richard T; Godfray, H Charles J; Jones, James W; Herrero, Mario; Howitt, Richard E; Keating, Brian A; Munoz-Carpena, Rafael; Rosenzweig, Cynthia; Tittonell, Pablo; Wheeler, Tim R

    2017-07-01

    This paper presents ideas for a new generation of agricultural system models that could meet the needs of a growing community of end-users exemplified by a set of Use Cases. We envision new data, models and knowledge products that could accelerate the innovation process that is needed to achieve sustainable local, regional and global food security. We identify desirable features for models, and describe some of the potential advances that we envisage for model components and their integration. We propose an implementation strategy that would link a "pre-competitive" space for model development to a "competitive" space for knowledge product development and, through private-public partnerships, to new data infrastructure. Specific model improvements would be based on further testing and evaluation of existing models, the development and testing of modular model components and their integration, and linkages of model integration platforms to new data management and visualization tools.

  6. Radiation Detection Computational Benchmark Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.

    2013-09-24

    Modeling forms an important component of radiation detection development, allowing for testing of new detector designs, evaluation of existing equipment against a wide variety of potential threat sources, and assessment of the operational performance of radiation detection systems. This can, however, result in large and complex scenarios that are time-consuming to model. A variety of approaches to radiation transport modeling exist with complementary strengths and weaknesses for different problems. This variety of approaches, and the development of promising new tools (such as ORNL's ADVANTG) which combine the benefits of multiple approaches, illustrates the need for a means of evaluating or comparing different techniques for radiation detection problems. This report presents a set of 9 benchmark problems for comparing different types of radiation transport calculations, identifying appropriate tools for classes of problems, and testing and guiding the development of new methods. The benchmarks were drawn primarily from existing or previous calculations, with a preference for scenarios that include experimental data or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22. From a technical perspective, the benchmarks were chosen to span a range of difficulty, to include gamma transport, neutron transport, or both, and to represent different important physical processes and a range of sensitivity to angular or energy fidelity. Following benchmark identification, existing information about geometry, measurements, and previous calculations was assembled. Monte Carlo results (MCNP decks) were reviewed or created and re-run in order to attain accurate computational times and to verify agreement with experimental data, when present. Benchmark information was then conveyed to ORNL in order to guide testing and development of hybrid calculations. The results of those ADVANTG calculations were then sent to PNNL for compilation. This report describes the details of the selected benchmarks and the results from the various transport codes.
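
    As a toy illustration of the benchmarking idea above, i.e., checking a stochastic transport calculation against a trusted reference answer, the sketch below estimates the uncollided gamma-ray transmission through a slab by Monte Carlo sampling and compares it with the analytic value. The cross section and thickness are arbitrary assumptions, and the example is far simpler than the benchmark scenarios described in the report.

    ```python
    # Hedged sketch: uncollided gamma transmission through a slab, Monte Carlo vs analytic.
    # Sigma_t and the slab thickness are arbitrary illustrative values.
    import numpy as np

    rng = np.random.default_rng(42)
    sigma_t = 0.5      # total macroscopic cross section [1/cm] (assumed)
    thickness = 5.0    # slab thickness [cm] (assumed)
    n = 1_000_000      # number of source photons

    # Sample the distance to first collision for normally incident photons.
    free_paths = rng.exponential(scale=1.0 / sigma_t, size=n)
    mc_transmission = np.mean(free_paths > thickness)
    analytic = np.exp(-sigma_t * thickness)

    print(f"Monte Carlo : {mc_transmission:.5f} "
          f"+/- {np.sqrt(mc_transmission*(1-mc_transmission)/n):.5f}")
    print(f"Analytic    : {analytic:.5f}")
    ```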

  7. Tools for Evaluating Fault Detection and Diagnostic Methods for HVAC Secondary Systems

    NASA Astrophysics Data System (ADS)

    Pourarian, Shokouh

    Although modern buildings are using increasingly sophisticated energy management and control systems that have tremendous control and monitoring capabilities, building systems routinely fail to perform as designed. More advanced building control, operation, and automated fault detection and diagnosis (AFDD) technologies are needed to achieve the goal of net-zero energy commercial buildings. Much effort has been devoted to developing such technologies for primary heating, ventilating, and air conditioning (HVAC) systems, and for some secondary systems. However, secondary systems such as fan coil units and dual duct systems, although widely used in commercial, industrial, and multifamily residential buildings, have received very little attention. This research study aims to develop tools that provide simulation capabilities for developing and evaluating advanced control, operation, and AFDD technologies for these less studied secondary systems. In this study, HVACSIM+ is selected as the simulation environment. Besides developing dynamic models for the above-mentioned secondary systems, two other issues related to the HVACSIM+ environment are also investigated. One issue is the nonlinear equation solver used in HVACSIM+ (Powell's Hybrid method in subroutine SNSQ). Several previous research projects (ASHRAE RP-825 and RP-1312) found that SNSQ is especially unstable at the beginning of a simulation and is sometimes unable to converge to a solution. Another issue is related to the zone model in the HVACSIM+ library of components. Dynamic simulation of secondary HVAC systems unavoidably requires a zone model that interacts dynamically with the building's surroundings. Therefore, the accuracy and reliability of the building zone model affect the operational data that the developed dynamic tool generates to predict secondary HVAC system behavior. The available model does not simulate the impact of direct solar radiation entering a zone through glazing, so the existing zone model is modified to address this limitation. In this research project, the following tasks are completed and summarized in this report: 1. Develop dynamic simulation models in the HVACSIM+ environment for common fan coil unit and dual duct system configurations; the developed simulation models are able to produce both fault-free and faulty operational data under a wide variety of faults and severity levels for advanced control, operation, and AFDD technology development and evaluation purposes. 2. Develop a model structure, including the grouping of blocks and superblocks, the treatment of state variables, initial and boundary conditions, and the selection of the equation solver, that can simulate a dual duct system efficiently with satisfactory stability. 3. Design and conduct a comprehensive and systematic validation procedure using collected experimental data to validate the developed simulation models under both fault-free and faulty operational conditions. 4. Conduct a numerical study to compare two solution techniques, Powell's Hybrid (PH) and Levenberg-Marquardt (LM), in terms of their robustness and accuracy. 5. Modify the thermal state calculation of the existing building zone model in the HVACSIM+ component library.
This component is revised to treat the heat transmitted through glazing as a heat source for transient building zone load prediction. In this report, the literature, including existing HVAC dynamic modeling environments and models, HVAC model validation methodologies, and fault modeling and validation methodologies, is reviewed. The overall methodologies used for fault-free and fault model development and validation are introduced. Detailed model development and validation results for the two secondary systems, i.e., the fan coil unit and the dual duct system, are summarized. Experimental data, mostly from the Iowa Energy Center Energy Resource Station, are used to validate the models developed in this project. Satisfactory model performance in both fault-free and fault simulation studies is observed for all studied systems.
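
    The solver comparison in task 4 can be illustrated with off-the-shelf implementations of the same two methods: scipy.optimize.root exposes the MINPACK-style Powell hybrid method as method='hybr' and Levenberg-Marquardt as method='lm'. The toy two-equation system below is a hypothetical stand-in for component mass and energy balance residuals; it is not code from the project.

    ```python
    # Hedged sketch: comparing Powell's hybrid ('hybr') and Levenberg-Marquardt ('lm')
    # on a toy nonlinear system; the residual functions are illustrative only.
    import numpy as np
    from scipy.optimize import root

    def residuals(x):
        # Hypothetical stand-in for component mass/energy balance residuals.
        w, t = x
        r1 = w**2 + 0.5*w - 2.0 + 0.1*t      # e.g., a flow balance
        r2 = np.exp(0.05*t) - 1.5 - 0.2*w    # e.g., an energy balance
        return [r1, r2]

    x0 = [0.1, 1.0]  # deliberately poor initial guess
    for method in ("hybr", "lm"):
        sol = root(residuals, x0, method=method)
        print(f"{method:5s} converged={sol.success} x={np.round(sol.x, 4)} "
              f"nfev={sol.nfev} residual_norm={np.linalg.norm(sol.fun):.2e}")
    ```

    Running both methods from several starting points and recording success rates and function evaluations gives a simple robustness-and-accuracy comparison in the spirit of task 4.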

  8. Humanization of pediatric care in the world: focus and review of existing models and measurement tools.

    PubMed

    Tripodi, Marina; Siano, Maria Anna; Mandato, Claudia; De Anseris, Anna Giulia Elena; Quitadamo, Paolo; Guercio Nuzio, Salvatore; Viggiano, Claudia; Fasolino, Francesco; Bellopede, Annalisa; Annunziata, Maria; Massa, Grazia; Pepe, Francesco Maria; De Chiara, Maria; Siani, Paolo; Vajro, Pietro

    2017-08-30

    The term "humanization" indicates the process by which people try to make something more human and civilized, more in line with what is believed to be the human nature. The humanization of care is an important and not yet a well-defined issue which includes a wide range of aspects related to the approach to the patient and care modalities. In pediatrics, the humanization concept is even vaguer due to the dual involvement of both the child and his/her family and by the existence of multiple proposed models. The present study aims to analyze the main existing humanization models regarding pediatric care, and the tools for assessing its grade. The main Humanization care programs have been elaborated and developed both in America (Brazil, USA) and Europe. The North American and European models specifically concern pediatric care, while the model developed in Brazil is part of a broader program aimed at all age groups. The first emphasis is on the importance of the family in child care, the second emphasis is on the child's right to be a leader, to be heard and to be able to express its opinion on the program's own care. Several tools have been created and used to evaluate humanization of care programs and related aspects. None, however, had been mutually compared. The major models of humanization care and the related assessment tools here reviewed highlight the urgent need for a more unifying approach, which may help in realizing health care programs closer to the young patient's and his/her family needs.

  9. Direct use of linear time-domain aerodynamics in aeroservoelastic analysis: Aerodynamic model

    NASA Technical Reports Server (NTRS)

    Woods, J. A.; Gilbert, Michael G.

    1990-01-01

    The work presented here is the first part of a continuing effort to expand existing capabilities in aeroelasticity by developing the methodology which is necessary to utilize unsteady time-domain aerodynamics directly in aeroservoelastic design and analysis. The ultimate objective is to define a fully integrated state-space model of an aeroelastic vehicle's aerodynamics, structure and controls which may be used to efficiently determine the vehicle's aeroservoelastic stability. Here, the current status of developing a state-space model for linear or near-linear time-domain indicial aerodynamic forces is presented.
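
    As a hedged sketch of the end goal described above, an integrated state-space model whose eigenvalues determine aeroservoelastic stability, the example below couples one structural mode to a single first-order aerodynamic lag state standing in for a time-domain indicial force model, closes a rate-feedback control loop, and inspects the eigenvalues. All coefficients are hypothetical, and the formulation is not the one developed in the paper.

    ```python
    # Hedged sketch: eigenvalue stability check of a toy aeroelastic state-space model
    # with one structural mode, one aerodynamic lag state, and rate feedback.
    import numpy as np

    # Hypothetical coefficients (not from the referenced work).
    omega_s, zeta_s = 12.0, 0.02   # structural mode frequency [rad/s] and damping ratio
    a_lag = 8.0                    # aerodynamic lag pole [1/s]
    k_rate = 0.4                   # control-law rate-feedback gain

    def closed_loop_A(q_gain, k_rate):
        # States: [eta, eta_dot, x_aero]; control u = -k_rate * eta_dot acts on the mode,
        # q_gain is a dynamic-pressure-like coupling from the aerodynamic state.
        return np.array([
            [0.0,                1.0,                    0.0   ],
            [-omega_s**2, -2*zeta_s*omega_s - k_rate,    q_gain],
            [0.0,                1.0,                   -a_lag ],
        ])

    for q in (5.0, 20.0, 60.0):
        eigs = np.linalg.eigvals(closed_loop_A(q, k_rate))
        stable = np.all(eigs.real < 0)
        print(f"q_gain={q:5.1f}  stable={stable}  max Re(lambda)={eigs.real.max():+.3f}")
    ```

    Replacing the single lag state with a vector of indicial-response states and appending actuator and sensor dynamics leads toward the fully integrated aerodynamics-structure-controls model that the effort works toward.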

  10. Warming Up to STS. Activities to Encourage Environmental Awareness.

    ERIC Educational Resources Information Center

    Rosenthal, Dorothy B.

    1990-01-01

    Developed is an interdisciplinary unit that deals with global warming and the greenhouse effect. Included are 10 lessons that can be used to supplement existing plans or used as a basis for developing a new unit. Included are modeling, laboratory, graphing, role-playing, and discussion activities. (KR)

  11. SOCIAL: An Integrative Framework for the Development of Social Skills

    ERIC Educational Resources Information Center

    Beauchamp, Miriam H.; Anderson, Vicki

    2010-01-01

    Despite significant advances in the field of social neuroscience, much remains to be understood regarding the development and maintenance of social skills across the life span. Few comprehensive models exist that integrate multidisciplinary perspectives and explain the multitude of factors that influence the emergence and expression of social…

  12. Improvement of High-Resolution Tropical Cyclone Structure and Intensity Forecasts using COAMPS-TC

    DTIC Science & Technology

    2013-09-30

    scientific community including the recent T-PARC/TCS08, ITOP, and HS3 field campaigns to build upon the existing modeling capabilities. We will...heating and cooling rates in developing and non-developing tropical disturbances during TCS-08: radar-equivalent retrievals from mesoscale numerical

  13. Curriculum Development in History Using Systems Approach

    ERIC Educational Resources Information Center

    Acun, Ramazan

    2011-01-01

    This work provides a conceptual framework for developing coherent history curricula at university level. It can also be used for evaluating existing curricula in terms of coherence. For this purpose, two models that are closely inter-connected called History Education System (Tarih Egitim Sistemi or TES) and History Research System (Tarih…

  14. Massification and Diversification as Complementary Strategies for Economic Growth in Developed and Developing Countries

    ERIC Educational Resources Information Center

    Tyndorf, Darryl; Glass, Chris R.

    2016-01-01

    Numerous microeconomic studies demonstrate the significant individual returns to tertiary education; however, little empirical evidence exists regarding the effects of higher education massification and diversification agendas on long-term macroeconomic growth. The researchers used the Uzawa-Lucas endogenous growth model to tertiary education…
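
    For reference, a standard textbook statement of the Uzawa-Lucas two-sector model is sketched below; the abstract does not give the study's specific variant or calibration, so this is only the canonical form without the human-capital externality.

    ```latex
    % Canonical Uzawa-Lucas model (textbook form; the study's variant may differ).
    \begin{align}
      Y(t)       &= A\,K(t)^{\alpha}\bigl(u(t)\,h(t)\,L(t)\bigr)^{1-\alpha}
        && \text{goods production}\\
      \dot{K}(t) &= Y(t) - C(t)
        && \text{physical capital (no depreciation)}\\
      \dot{h}(t) &= \delta\,\bigl(1-u(t)\bigr)\,h(t)
        && \text{human-capital accumulation}\\
      \max_{C,\,u} &\;\int_0^{\infty} e^{-\rho t}\,\frac{C(t)^{1-\sigma}-1}{1-\sigma}\,dt
        && \text{preferences}
    \end{align}
    % Balanced-growth rate without the externality: g = (\delta - \rho)/\sigma.
    ```

    In this framework, long-run growth depends on the productivity of human-capital accumulation and the share of time devoted to it, which is the kind of mechanism an endogenous-growth analysis of tertiary education expansion relies on.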

  15. Human systems dynamics: Toward a computational model

    NASA Astrophysics Data System (ADS)

    Eoyang, Glenda H.

    2012-09-01

    A robust and reliable computational model of complex human systems dynamics could support advancements in theory and practice for social systems at all levels, from intrapersonal experience to global politics and economics. Models of human interactions have evolved from traditional, Newtonian systems assumptions, which served a variety of practical and theoretical needs of the past. Another class of models has been inspired and informed by models and methods from nonlinear dynamics, chaos, and complexity science. None of the existing models, however, is able to represent the open, high-dimensional, and nonlinear self-organizing dynamics of social systems. An effective model will represent interactions at multiple levels to generate emergent patterns of the social and political life of individuals and groups. Existing models and modeling methods are considered and assessed against characteristic pattern-forming processes in observed and experienced phenomena of human systems. A conceptual model, the CDE Model, based on the conditions for self-organizing in human systems, is explored as an alternative to existing models and methods. While the new model overcomes the limitations of previous models, it also provides an explanatory base and a foundation for prospective analysis to inform real-time meaning making and action taking in response to complex conditions in the real world. An invitation is extended to readers to engage in developing a computational model that incorporates the assumptions, meta-variables, and relationships of this open, high-dimensional, and nonlinear conceptual model of the complex dynamics of human systems.
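
    In the spirit of the invitation above, the sketch below implements one well-known minimal model of self-organizing social interaction, Hegselmann-Krause bounded-confidence opinion dynamics, purely as a possible starting point for experimentation; it is not the CDE Model and does not attempt to capture its meta-variables.

    ```python
    # Hedged sketch: Hegselmann-Krause bounded-confidence opinion dynamics, a minimal
    # example of emergent clustering from local interactions. Not the CDE Model.
    import numpy as np

    rng = np.random.default_rng(0)
    n_agents, epsilon, n_steps = 100, 0.15, 60  # population, confidence bound, iterations (assumed)

    opinions = rng.uniform(0.0, 1.0, n_agents)
    for _ in range(n_steps):
        new = np.empty_like(opinions)
        for i, x in enumerate(opinions):
            neighbours = opinions[np.abs(opinions - x) <= epsilon]  # agents within the confidence bound
            new[i] = neighbours.mean()                              # move to the local average
        opinions = new

    # Emergent pattern: a few opinion clusters instead of the initial uniform spread.
    clusters = np.unique(np.round(opinions, 3))
    print(f"{len(clusters)} clusters at: {clusters}")
    ```

    Even this toy model illustrates the qualitative point: macro-level pattern (opinion clusters) emerges from micro-level interaction rules, while a model adequate to open, high-dimensional social systems would need far richer state, multiple levels, and time-varying conditions.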

  16. Caring for people with dementia in residential aged care: successes with a composite person-centered care model featuring Montessori-based activities.

    PubMed

    Roberts, Gail; Morley, Catherine; Walters, Wendy; Malta, Sue; Doyle, Colleen

    2015-01-01

    Person-centered models of dementia care commonly merge aspects of existing models with additional influences from published and unpublished evidence and existing government policy. This study reports on the development and evaluation of one such composite model of person-centered dementia care, the ABLE model. The model was based on building the capacity and ability of residents living with dementia, using environmental changes, staff education, and organizational and community engagement. Montessori principles were also used. The evaluation of the model employed mixed methods. Significant behavior changes were evident among residents of the dementia care unit after the model was introduced, as were reductions in anti-psychotic and sedative medication. Staff reported increased knowledge about meeting the needs of people with dementia, and experienced organizational culture change that supported the ABLE model of care. Families were very satisfied with the changes. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Improving the process of process modelling by the use of domain process patterns

    NASA Astrophysics Data System (ADS)

    Koschmider, Agnes; Reijers, Hajo A.

    2015-01-01

    The use of business process models has become prevalent in a wide range of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process models would be to reuse relevant parts of existing models. At this point, however, limited support exists to guide process modellers towards the usage of appropriate model content. In this paper, a set of content-oriented patterns is presented, which is extracted from a large set of process models from the order management and manufacturing production domains. The patterns are derived using a newly proposed set of algorithms, which are discussed in this paper. The authors demonstrate how such Domain Process Patterns, in combination with information on their historic usage, can support process modellers in generating new models. To support the wider dissemination and development of Domain Process Patterns within and beyond the studied domains, an accompanying website has been set up.
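
    The pattern-extraction idea can be illustrated with a much simpler stand-in than the authors' algorithms: counting activity bigrams across a small set of invented order-management traces and keeping those that recur in several of them. The traces, n-gram length, and support threshold below are purely illustrative.

    ```python
    # Hedged sketch: naive frequent activity-sequence mining as a stand-in for
    # Domain Process Pattern extraction; traces and threshold are illustrative.
    from collections import Counter

    traces = [
        ["receive order", "check stock", "reserve items", "create invoice", "ship goods"],
        ["receive order", "check stock", "back-order", "reserve items", "ship goods"],
        ["receive order", "check credit", "check stock", "reserve items", "create invoice"],
    ]

    def ngrams(seq, n):
        return [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]

    min_support, n = 2, 2  # keep bigrams occurring in at least 2 traces (assumed)
    counts = Counter()
    for trace in traces:
        counts.update(set(ngrams(trace, n)))  # count each pattern at most once per trace

    patterns = [(p, c) for p, c in counts.items() if c >= min_support]
    for pattern, support in sorted(patterns, key=lambda x: -x[1]):
        print(f"support={support}: {' -> '.join(pattern)}")
    ```

    Real Domain Process Patterns additionally need control-flow context (splits and joins) and the usage history mentioned in the abstract, which is what the paper's algorithms and the accompanying website provide.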

  18. Recent Progresses in Incorporating Human Land-Water Management into Global Land Surface Models Toward Their Integration into Earth System Models

    NASA Technical Reports Server (NTRS)

    Pokhrel, Yadu N.; Hanasaki, Naota; Wada, Yoshihide; Kim, Hyungjun

    2016-01-01

    The global water cycle has been profoundly affected by human land-water management. As the changes in the water cycle on land can affect the functioning of a wide range of biophysical and biogeochemical processes of the Earth system, it is essential to represent human land-water management in Earth system models (ESMs). During the recent past, noteworthy progress has been made in large-scale modeling of human impacts on the water cycle, but sufficient advancements have not yet been made in integrating the newly developed schemes into ESMs. This study reviews the progress made in incorporating human factors in large-scale hydrological models and their integration into ESMs. The study focuses primarily on the recent advancements and existing challenges in incorporating human impacts in global land surface models (LSMs) as a way forward to the development of ESMs with humans as integral components, but a brief review of global hydrological models (GHMs) is also provided. The study begins with a general overview of human impacts on the water cycle. Then, the algorithms currently employed to represent irrigation, reservoir operation, and groundwater pumping are discussed. Next, methodological deficiencies in current modeling approaches and existing challenges are identified. Furthermore, light is shed on the sources of uncertainties associated with model parameterizations, grid resolution, and the datasets used for forcing and validation. Finally, representing human land-water management in LSMs is highlighted as an important research direction toward developing integrated models using ESM frameworks for the holistic study of human-water interactions within the Earth system.
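
    As one concrete example of the algorithms mentioned above, the sketch below implements a deliberately simplified storage- and demand-based reservoir release rule, loosely in the spirit of widely cited global schemes; it is not any specific published parameterization, and the capacity, inflow, and demand values are invented.

    ```python
    # Hedged sketch: a simplified storage- and demand-based reservoir release rule,
    # loosely in the spirit of rules used in global models; numbers are illustrative.
    import numpy as np

    capacity = 500.0             # reservoir capacity [10^6 m^3] (assumed)
    storage = 0.6 * capacity     # initial storage (assumed)
    mean_inflow = 20.0           # long-term mean inflow per month (assumed)
    inflow = np.array([35, 40, 30, 18, 10, 6, 4, 5, 9, 15, 22, 28], dtype=float)
    demand = np.array([5, 5, 8, 14, 20, 26, 30, 28, 18, 10, 6, 5], dtype=float)

    releases, storages = [], []
    for q_in, d in zip(inflow, demand):
        kr = storage / (0.85 * capacity)                   # storage index scales the target release
        target = 0.5 * mean_inflow + 0.5 * min(d, mean_inflow)
        release = kr * target
        storage = np.clip(storage + q_in - release, 0.0, capacity)  # spill above capacity ignored here
        releases.append(release)
        storages.append(storage)

    print("monthly release:", np.round(releases, 1))
    print("end-of-month storage:", np.round(storages, 1))
    ```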

  19. Potential effects of existing and proposed groundwater withdrawals on water levels and natural groundwater discharge in Snake Valley and surrounding areas, Utah and Nevada

    USGS Publications Warehouse

    Masbruch, Melissa D.; Brooks, Lynette E.

    2017-04-14

    Several U.S. Department of the Interior (DOI) agencies are concerned about the cumulative effects of groundwater development on groundwater resources managed by, and other groundwater resources of interest to, these agencies in Snake Valley and surrounding areas. The new water uses that potentially concern the DOI agencies include 12 water-right applications filed in 2005, totaling approximately 8,864 acre-feet per year. To date, only one of these applications has been approved and partially developed. In addition, the DOI agencies are interested in the potential effects of three new water-right applications (UT 18-756, UT 18-758, and UT 18-759) and one water-right change application (UT a40687), which were the subject of a water-right hearing on April 19, 2016. This report presents a hydrogeologic analysis of areas in and around Snake Valley to assess potential effects of existing and future groundwater development on groundwater resources, specifically groundwater discharge sites, of interest to the DOI agencies. A previously developed steady-state numerical groundwater-flow model was modified to transient conditions with respect to well withdrawals and used to quantify drawdown and capture (withdrawals that result in depletion) of natural discharge from existing and proposed groundwater withdrawals. The original steady-state model simulates and was calibrated to 2009 conditions. To investigate the potential effects of existing and proposed groundwater withdrawals on the groundwater resources of interest to the DOI agencies, 10 withdrawal scenarios were simulated. All scenarios were simulated for periods of 5, 10, 15, 30, 55, and 105 years from the start of 2010; additionally, all scenarios were simulated to a new steady state to determine the ultimate long-term effects of the withdrawals. Capture maps were also constructed as part of this analysis. The simulations used to develop the capture maps test the response of the system, specifically the reduction of natural discharge, to future stresses at a point in the area represented by the model. In this way, these maps can be used as a tool to determine the source of water to, and potential effects at specific areas from, future well withdrawals. Downward trends in water levels measured in wells indicate that existing groundwater withdrawals in Snake Valley are affecting water levels. The numerical model simulates similar downward trends in water levels; simulated drawdowns in the model, however, are generally less than observed water-level declines. At the groundwater discharge sites of interest to the DOI agencies, simulated drawdowns from existing well withdrawals (projected into the future) range from 0 to about 50 feet. Following the addition of the proposed withdrawals, simulated drawdowns at some sites increase by 25 feet. Simulated drawdown resulting from the proposed withdrawals began in as few as 5 years after 2014 at several of the sites. At the groundwater discharge sites of interest to the DOI agencies, simulated capture of natural discharge resulting from the existing withdrawals ranged from 0 to 87 percent. Following the addition of the proposed withdrawals, simulated capture at several of the sites reached 100 percent, indicating that groundwater discharge at those sites would cease. Simulated capture following the addition of the proposed withdrawals increased in as few as 5 years after 2014 at several of the sites.
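
    The two quantities reported above, drawdown and capture of natural discharge, are differences between paired simulations (baseline versus with-withdrawal). The sketch below shows the bookkeeping with invented numbers; it does not use the actual groundwater-flow model or its outputs.

    ```python
    # Hedged sketch: computing drawdown and capture fraction from paired simulations
    # (baseline vs. with-withdrawal); all numbers are invented for illustration.

    # Simulated head [ft] and natural discharge [acre-ft/yr] at a discharge site of interest.
    head_baseline, head_pumped = 5012.4, 4998.7
    discharge_baseline, discharge_pumped = 1200.0, 950.0
    withdrawal_rate = 800.0  # total simulated well withdrawals [acre-ft/yr]

    drawdown = head_baseline - head_pumped
    capture = discharge_baseline - discharge_pumped           # depletion of natural discharge
    capture_fraction_of_site = capture / discharge_baseline   # share of the site's discharge lost
    capture_fraction_of_pumping = capture / withdrawal_rate   # share of pumping supplied by capture

    print(f"drawdown: {drawdown:.1f} ft")
    print(f"capture: {capture:.0f} acre-ft/yr "
          f"({100*capture_fraction_of_site:.0f}% of the site's natural discharge, "
          f"{100*capture_fraction_of_pumping:.0f}% of the withdrawals)")
    ```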

  20. SWATShare- A Platform for Collaborative Hydrology Research and Education with Cyber-enabled Sharing, Running and Visualization of SWAT Models

    NASA Astrophysics Data System (ADS)

    Rajib, M. A.; Merwade, V.; Song, C.; Zhao, L.; Kim, I. L.; Zhe, S.

    2014-12-01

    Setting up any hydrologic model requires a large amount of effort, including compiling all the data, creating input files, and performing calibration and validation. Given the effort involved, models for a watershed may be created multiple times by multiple groups or organizations to accomplish different research, educational or policy goals. To reduce this duplication of effort and to enable collaboration among different groups or organizations around an already existing hydrology model, a platform is needed where anyone can search for existing models, perform simple scenario analysis and visualize model results. The creator and users of a model on such a platform can then collaborate to accomplish new research or educational objectives. From this perspective, a prototype cyber-infrastructure (CI), called SWATShare, is developed for sharing, running and visualizing Soil and Water Assessment Tool (SWAT) models in an interactive GIS-enabled web environment. Users can utilize SWATShare to publish or upload their own models, search and download existing SWAT models developed by others, and run simulations, including calibration, using high-performance computing resources provided by XSEDE and the cloud. Besides running and sharing, SWATShare hosts a novel spatio-temporal visualization system for SWAT model outputs. At the temporal scale, the system creates time-series plots for all the hydrology and water quality variables available along the reach as well as at the watershed level. At the spatial scale, the system can dynamically generate sub-basin-level thematic maps for any variable at any user-defined date or date range, thereby allowing users to run animations or download the data for subsequent analyses. In addition to research, SWATShare can also be used within a classroom setting as an educational tool for modeling and comparing hydrologic processes under different geographic and climatic settings. SWATShare is publicly available at https://www.water-hub.org/swatshare.
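
    A minimal sketch of the reach-level time-series visualization described above is given below. It assumes the SWAT reach output has already been exported to a CSV file, hypothetically named reach_output.csv with columns RCH, DATE and FLOW_OUT_cms, rather than parsing SWAT's native fixed-width output files, and it is not SWATShare code.

    ```python
    # Hedged sketch: plotting a reach-level streamflow time series from SWAT output
    # that has been exported to CSV; file name and column names are assumptions.
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("reach_output.csv", parse_dates=["DATE"])  # hypothetical export
    reach_id = 12                                               # hypothetical reach of interest

    series = (df[df["RCH"] == reach_id]
              .set_index("DATE")["FLOW_OUT_cms"]
              .sort_index())

    ax = series.plot(figsize=(8, 3), title=f"Simulated streamflow, reach {reach_id}")
    ax.set_ylabel("flow out (m$^3$/s)")
    ax.set_xlabel("")
    plt.tight_layout()
    plt.savefig("reach_flow_timeseries.png", dpi=150)
    ```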
