Generic domain models in software engineering
NASA Technical Reports Server (NTRS)
Maiden, Neil
1992-01-01
This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.
Development and application of air quality models at the US ...
Overview of the development and application of air quality models at the U.S. EPA, particularly focused on the development and application of the Community Multiscale Air Quality (CMAQ) model developed within the Computational Exposure Division (CED) of the National Exposure Research Laboratory (NERL). This presentation will provide a simple overview of air quality model development and application geared toward a non-technical student audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
Frątczak-Łagiewska, Katarzyna; Matuszewski, Szymon
2018-05-01
Differences in size between males and females, known as sexual size dimorphism, are common in insects. These differences may be accompanied by differences in the duration of development. Accordingly, it is believed that insect sex may be used to increase the accuracy of insect age estimates in forensic entomology. Here, sex-specific differences in the development of Creophilus maxillosus were studied at seven constant temperatures. We also created separate developmental models for males and females of C. maxillosus and tested them in a validation study to determine whether sex-specific developmental models improve the accuracy of insect age estimates. Results demonstrate that males of C. maxillosus took significantly longer to develop than females. The sex-specific and general models for total immature development had the same optimal temperature range and similar developmental thresholds, but different thermal constants K, largest for the male-specific model and smallest for the female-specific model. Despite these differences, the validation study revealed only minimal and statistically insignificant differences in the accuracy of age estimates between the sex-specific and general thermal summation models. This finding indicates that, despite statistically significant differences in the duration of immature development between females and males of C. maxillosus, there is no increase in the accuracy of insect age estimates when using sex-specific thermal summation models instead of the general model. Accordingly, this study does not support the use of sex-specific developmental data for the estimation of insect age in forensic entomology.
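The thermal summation model referenced above has a simple closed form: development completes when the degree-days accumulated above the developmental threshold T0 reach the thermal constant K. The sketch below illustrates that arithmetic; the values of K and T0 are placeholders, not the constants fitted for C. maxillosus in this study.

```python
# Illustrative thermal summation model: development finishes when the
# accumulated degree-days above the threshold t0 reach the thermal
# constant k. Parameter values are invented for illustration.

def predicted_duration(temp_c, k=389.0, t0=9.0):
    """Predicted development time (days) at a constant temperature (deg C)."""
    if temp_c <= t0:
        raise ValueError("no development at or below the threshold")
    return k / (temp_c - t0)

def accumulated_degree_days(daily_mean_temps_c, t0=9.0):
    """Degree-days accumulated over a series of mean daily temperatures."""
    return sum(max(t - t0, 0.0) for t in daily_mean_temps_c)
```

At a constant 25 deg C these placeholder constants predict 389 / 16, about 24.3 days of immature development; a sex-specific model would simply substitute male- or female-specific K and T0.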
A PBPK model for TCE with specificity for the male LE rat that accurately predicts TCE tissue time-course data has not been developed, although other PBPK models for TCE exist. Development of such a model was the present aim. The PBPK model consisted of 5 compartments: fat; slowl...
Knowledge-based approach for generating target system specifications from a domain model
NASA Technical Reports Server (NTRS)
Gomaa, Hassan; Kerschberg, Larry; Sugumaran, Vijayan
1992-01-01
Several institutions in industry and academia are pursuing research efforts in domain modeling to address unresolved issues in software reuse. To demonstrate the concepts of domain modeling and software reuse, a prototype software engineering environment is being developed at George Mason University to support the creation of domain models and the generation of target system specifications. This prototype environment, which is application domain independent, consists of an integrated set of commercial off-the-shelf software tools and custom-developed software tools. This paper describes the knowledge-based tool that was developed as part of the environment to generate target system specifications from a domain model.
Developing Formal Object-oriented Requirements Specifications: A Model, Tool and Technique.
ERIC Educational Resources Information Center
Jackson, Robert B.; And Others
1995-01-01
Presents a formal object-oriented specification model (OSS) for computer software system development that is supported by a tool that automatically generates a prototype from an object-oriented analysis model (OSA) instance, lets the user examine the prototype, and permits the user to refine the OSA model instance to generate a requirements…
Marini, Giacomo; Studer, Harald; Huber, Gerd; Püschel, Klaus; Ferguson, Stephen J
2016-06-01
Patient-specific modelling of the spine is a powerful tool to explore the prevention and treatment of injuries and pathologies. Although several methods have been proposed for the discretization of the bony structures, the efficient representation of the intervertebral disc anisotropy remains a challenge, especially with complex geometries. Furthermore, the swelling of the disc's nucleus pulposus is normally added to the model after geometry definition, at the cost of changes to the material properties and an unrealistic description of the prestressed state. The aim of this study was to develop techniques which preserve the patient-specific geometry of the disc and allow the representation of the system anisotropy and residual stresses, independent of the system discretization. Depending on the modelling features, the developed approaches produced patient-specific model responses in good agreement with the physiological responses observed in corresponding experiments. The proposed methods represent a first step towards the development of patient-specific disc models that respect both the geometry and the mechanical properties of the specific disc.
Disentangling the Role of Domain-Specific Knowledge in Student Modeling
NASA Astrophysics Data System (ADS)
Ruppert, John; Duncan, Ravit Golan; Chinn, Clark A.
2017-08-01
This study explores the role of domain-specific knowledge in students' modeling practice and how this knowledge interacts with two domain-general modeling strategies: use of evidence and developing a causal mechanism. We analyzed models made by middle school students who had a year of intensive model-based instruction. These models were made to explain a familiar but unstudied biological phenomenon: late onset muscle pain. Students were provided with three pieces of evidence related to this phenomenon and asked to construct a model to account for this evidence. Findings indicate that domain-specific resources play a significant role in the extent to which the models accounted for provided evidence. On the other hand, familiarity with the situation appeared to contribute to the mechanistic character of models. Our results indicate that modeling strategies alone are insufficient for the development of a mechanistic model that accounts for provided evidence and that, while learners can develop a tentative model with a basic familiarity of the situation, scaffolding certain domain-specific knowledge is necessary to assist students with incorporating evidence in modeling tasks.
ERIC Educational Resources Information Center
Varjas, Kris; Meyers, Joel; Henrich, Christopher C.; Graybill, Emily C.; Dew, Brian J.; Marshall, Megan L.; Williamson, Zachary; Skoczylas, Rebecca B.; Avant, Marty
2006-01-01
The purpose of the Peer Victimization Intervention (PVI) was to develop and implement a culture-specific pilot intervention to address the effects of bullying on middle school students who are victims utilizing the Participatory Culture-Specific Intervention Model (PCSIM; Nastasi, Moore, & Varjas, 2004). The involvement of participants who serve…
Jiang, Guoqian; Evans, Julie; Endle, Cory M; Solbrig, Harold R; Chute, Christopher G
2016-01-01
The Biomedical Research Integrated Domain Group (BRIDG) model is a formal domain analysis model for protocol-driven biomedical research, and serves as a semantic foundation for application and message development in the standards developing organizations (SDOs). The increasing sophistication and complexity of the BRIDG model requires new approaches to the management and utilization of the underlying semantics to harmonize domain-specific standards. The objective of this study is to develop and evaluate a Semantic Web-based approach that integrates the BRIDG model with ISO 21090 data types to generate domain-specific templates to support clinical study metadata standards development. We developed a template generation and visualization system based on an open source Resource Description Framework (RDF) store backend, a SmartGWT-based web user interface, and a "mind map" based tool for the visualization of generated domain-specific templates. We also developed a RESTful Web Service informed by the Clinical Information Modeling Initiative (CIMI) reference model for access to the generated domain-specific templates. A preliminary usability study was performed, and all reviewers (n = 3) responded very positively to the evaluation questions on usability and on the system's capability to meet the requirements (average score 4.6). Semantic Web technologies provide a scalable infrastructure and have great potential to enable computable semantic interoperability of models in the intersection of health care and clinical research.
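To make the RDF-backed template idea concrete, the toy sketch below stores a handful of triples as Python tuples and matches them with wildcards, a from-scratch stand-in for the RDF store backend; the prefixes and term names (bridg:, cimi:, iso21090:) are invented for illustration and are not taken from the actual BRIDG or CIMI vocabularies.

```python
# Toy stand-in for an RDF triple store: (subject, predicate, object)
# tuples with simple pattern matching, where None acts as a wildcard.
# Term names below are invented, not real BRIDG/CIMI/ISO 21090 IRIs.

triples = {
    ("bridg:StudySubject", "rdf:type", "cimi:EntryPoint"),
    ("bridg:StudySubject", "bridg:hasAttribute", "bridg:birthDate"),
    ("bridg:birthDate", "iso21090:datatype", "iso21090:TS"),
}

def match(pattern, store):
    """Return all triples matching a (s, p, o) pattern; None = any."""
    s, p, o = pattern
    return [t for t in store
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]
```

A template generator would walk such patterns (e.g. every attribute of a class, then each attribute's data type) to assemble a domain-specific template; a real implementation would issue SPARQL queries against the store instead.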
The development of advanced manufacturing systems
NASA Astrophysics Data System (ADS)
Doumeingts, Guy; Vallespir, Bruno; Darricau, Didier; Roboam, Michel
Various methods for the design of advanced manufacturing systems (AMSs) are reviewed. The specifications for AMSs and problems inherent in their development are first discussed. Three models, the Computer Aided Manufacturing-International model, the National Bureau of Standards model, and the GRAI model, are considered in detail. Hierarchical modeling tools such as structured analysis and design techniques, Petri nets, and the ICAM definition (IDEF) method are used in the development of integrated manufacturing models. Finally, the GRAI method is demonstrated in the design of specifications for the production management system of the Snecma AMS.
Dantigny, Philippe; Guilmart, Audrey; Bensoussan, Maurice
2005-04-15
For over 20 years, predictive microbiology has focused on food-pathogenic bacteria. Few studies have concerned the modelling of fungal development. On the one hand, most food mycologists are not familiar with modelling techniques; on the other, people involved in modelling are developing tools dedicated to bacteria. There is therefore a tendency to extend the use of models developed for bacteria to moulds. However, some mould specificities should be taken into account. The use of specific models for predicting germination and growth of fungi was advocated previously. This paper provides a short review of fungal modelling studies.
End-to-end observatory software modeling using domain specific languages
NASA Astrophysics Data System (ADS)
Filgueira, José M.; Bec, Matthieu; Liu, Ning; Peng, Chien; Soto, José
2014-07-01
The Giant Magellan Telescope (GMT) is a 25-meter extremely large telescope that is being built by an international consortium of universities and research institutions. Its software and control system is being developed using a set of Domain Specific Languages (DSLs) that support a model-driven development methodology integrated with an Agile management process. This approach promotes the use of standardized models that capture the component architecture of the system, facilitate the construction of technical specifications in a uniform way, ease communication between developers and domain experts, and provide a framework to ensure the successful integration of the software subsystems developed by the GMT partner institutions.
A simulation study to quantify the impacts of exposure ...
A simulation study to quantify the impacts of exposure measurement error on air pollution health risk estimates in copollutant time-series models.
NASA Technical Reports Server (NTRS)
Mcknight, R. L.
1985-01-01
Accomplishments are described for the second-year effort of a 3-year program to develop methodology for component-specific modeling of aircraft engine hot section components (turbine blades, turbine vanes, and burner liners). These accomplishments include: (1) engine thermodynamic and mission models; (2) geometry model generators; (3) remeshing; (4) specialty 3-D inelastic structural analysis; (5) computationally efficient solvers; (6) adaptive solution strategies; (7) engine performance parameters/component response variables decomposition and synthesis; (8) integrated software architecture and development; and (9) validation cases for software developed.
Component-specific modeling. [jet engine hot section components
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Maffeo, R. J.; Tipton, M. T.; Weber, G.
1992-01-01
Accomplishments are described for a 3-year program to develop methodology for component-specific modeling of aircraft hot section components (turbine blades, turbine vanes, and burner liners). These accomplishments include: (1) engine thermodynamic and mission models; (2) geometry model generators; (3) remeshing; (4) specialty three-dimensional inelastic structural analysis; (5) computationally efficient solvers; (6) adaptive solution strategies; (7) engine performance parameters/component response variables decomposition and synthesis; (8) integrated software architecture and development; and (9) validation cases for software developed.
Patient-specific finite element modeling of bones.
Poelert, Sander; Valstar, Edward; Weinans, Harrie; Zadpoor, Amir A
2013-04-01
Finite element modeling is an engineering tool for structural analysis that has been used for many years to assess the relationship between load transfer and bone morphology and to optimize the design and fixation of orthopedic implants. Due to recent developments in finite element model generation, for example, improved computed tomography imaging quality, improved segmentation algorithms, and faster computers, the accuracy of finite element modeling has increased vastly, and finite element models simulating the anatomy and properties of an individual patient can be constructed. Such so-called patient-specific finite element models are potentially valuable tools for orthopedic surgeons in fracture risk assessment or pre- and intraoperative planning of implant placement. The aim of this article is to provide a critical overview of current themes in patient-specific finite element modeling of bones. In addition, the state-of-the-art in patient-specific modeling of bones is compared with the requirements for a clinically applicable patient-specific finite element method, and judgment is passed on the feasibility of applying patient-specific finite element modeling as part of routine clinical orthopedic practice. It is concluded that further development of certain aspects of patient-specific finite element modeling is needed before finite element modeling can be used as a routine clinical tool.
Integrated performance and reliability specification for digital avionics systems
NASA Technical Reports Server (NTRS)
Brehm, Eric W.; Goettge, Robert T.
1995-01-01
This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process, and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on development of a language for specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS that will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.
Parameterized Linear Longitudinal Airship Model
NASA Technical Reports Server (NTRS)
Kulczycki, Eric; Elfes, Alberto; Bayard, David; Quadrelli, Marco; Johnson, Joseph
2010-01-01
A parameterized linear mathematical model of the longitudinal dynamics of an airship is undergoing development. This model is intended to be used in designing control systems for future airships that would operate in the atmospheres of Earth and remote planets. Heretofore, the development of linearized models of the longitudinal dynamics of airships has been costly in that it has been necessary to perform extensive flight testing and to use system-identification techniques to construct models that fit the flight-test data. The present model is a generic one that can be relatively easily specialized to approximate the dynamics of specific airships at specific operating points, without need for further system identification, and with significantly less flight testing. The approach taken in the present development is to merge the linearized dynamical equations of an airship with techniques for estimation of aircraft stability derivatives, and to thereby make it possible to construct a linearized dynamical model of the longitudinal dynamics of a specific airship from geometric and aerodynamic data pertaining to that airship. (It is also planned to develop a model of the lateral dynamics by use of the same methods.) All of the aerodynamic data needed to construct the model of a specific airship can be obtained from wind-tunnel testing and computational fluid dynamics.
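A linearized longitudinal model of the kind described is conventionally written in state-space form, x_dot = A x + B u, with state [u, w, q, theta] (axial velocity, vertical velocity, pitch rate, pitch angle). The sketch below assembles a bare-bones A matrix from a few stability derivatives; the derivative names and numbers are placeholders, not values for any real airship, and a full model carries many more coupling terms.

```python
import numpy as np

def longitudinal_A(Xu, Zw, Mw, Mq, g=9.81):
    """Toy system matrix for x = [u, w, q, theta]; entries that a full
    model would populate from further stability derivatives are zero."""
    return np.array([
        [Xu,  0.0, 0.0, -g],    # axial force equation
        [0.0, Zw,  0.0,  0.0],  # vertical force equation
        [0.0, Mw,  Mq,   0.0],  # pitch moment equation
        [0.0, 0.0, 1.0,  0.0],  # kinematics: theta_dot = q
    ])

# Placeholder stability derivatives (per second), not flight-test data.
A = longitudinal_A(Xu=-0.02, Zw=-0.4, Mw=-0.01, Mq=-0.5)
```

Estimating derivatives such as Xu, Zw, Mw, and Mq from geometric and aerodynamic data, rather than from system identification on flight-test data, is exactly the cost saving the approach targets.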
Evaluating Variability and Uncertainty of Geological Strength Index at a Specific Site
NASA Astrophysics Data System (ADS)
Wang, Yu; Aladejare, Adeyemi Emman
2016-09-01
Geological Strength Index (GSI) is an important parameter for estimating rock mass properties. GSI can be estimated from the quantitative GSI chart, as an alternative to the direct observational method, which requires vast geological experience with rock. The GSI chart was developed from past observations and engineering experience, with either empiricism or some theoretical simplifications. The GSI chart thereby contains model uncertainty arising from its development. The presence of such model uncertainty affects the GSI estimated from the GSI chart at a specific site; it is, therefore, imperative to quantify and incorporate the model uncertainty during GSI estimation from the GSI chart. A major challenge in quantifying the GSI chart model uncertainty is the lack of the original datasets used to develop the GSI chart, since the chart was developed from past experience without reference to specific datasets. This paper intends to tackle this problem by developing a Bayesian approach for quantifying the model uncertainty in the GSI chart when using it to estimate GSI at a specific site. The model uncertainty in the GSI chart and the inherent spatial variability in GSI are modeled explicitly in the Bayesian approach. The Bayesian approach generates equivalent samples of GSI from the integrated knowledge of the GSI chart, prior knowledge and observation data available from site investigation. Equations are derived for the Bayesian approach, and the proposed approach is illustrated using data from a drill and blast tunnel project. The proposed approach effectively tackles, in a transparent manner, the problem of quantifying the model uncertainty that arises from using the GSI chart to characterize site-specific GSI.
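The flavor of such a Bayesian update can be shown with a deliberately simplified grid approximation: a discrete prior over candidate GSI values is multiplied by a Gaussian likelihood for each chart-based observation, with the likelihood spread standing in for the chart model uncertainty. All numbers are invented; the actual approach also models spatial variability explicitly and uses the paper's derived equations rather than a grid.

```python
import math

# Grid-based Bayesian updating (illustrative only): a discrete prior
# over candidate GSI values is updated with chart-based observations,
# treating chart model uncertainty as Gaussian noise (sigma_model).

def posterior(grid, prior, observations, sigma_model):
    """Normalized posterior over grid after conditioning on observations."""
    post = list(prior)
    for obs in observations:
        post = [p * math.exp(-0.5 * ((obs - g) / sigma_model) ** 2)
                for g, p in zip(grid, post)]
    total = sum(post)
    return [p / total for p in post]

grid = list(range(20, 81))             # candidate GSI values
prior = [1.0 / len(grid)] * len(grid)  # non-informative prior
post = posterior(grid, prior, observations=[55, 60, 58], sigma_model=5.0)
```

With a flat prior the posterior peaks near the mean of the chart observations; an informative prior from site investigation would pull the peak toward the prior knowledge, which is the integration the paper formalizes.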
JEDI Natural Gas Model | Jobs and Economic Development Impact Models | NREL
The Jobs and Economic Development Impacts (JEDI) Natural Gas model allows users to estimate economic development impacts from natural gas power generation. Project-specific data should be used to obtain the best estimate of economic development impacts.
ERIC Educational Resources Information Center
Graybill, Emily C.; Varjas, Kris; Meyers, Joel; Greenberg, Daphne; Roach, Andrew T.
2013-01-01
The Participatory Culture-Specific Model of Course Development (PCSMCD), adapted from the Participatory Culture-Specific Intervention Model, is a proposed framework to address challenges to social justice education by addressing the following four course variables: instructor characteristics, instructor experiences, student characteristics, and…
NASA Astrophysics Data System (ADS)
Prakash, Punit; Diederich, Chris J.
2010-03-01
Interstitial and transurethral catheter-based ultrasound devices are under development for treatment of prostate cancer and BPH, uterine fibroids, liver tumors and other soft tissue disease. Accurate 3D thermal modeling is essential for designing site-specific applicators, exploring treatment delivery strategies, and integration of patient-specific treatment planning of thermal ablations. We are developing a comprehensive 3D modeling and treatment planning platform for ultrasound ablation of tissue using catheter-based applicators. We explored the applicability of assessing thermal effects in tissue using critical temperature, thermal dose and Arrhenius thermal damage thresholds and performed a comparative analysis of dynamic tissue properties critical to accurate modeling. We used the model to assess the feasibility of automatic feedback control with MR thermometry, and demonstrated the utility of the modeling platform for 3D patient-specific treatment planning. We have identified critical temperature, thermal dose and thermal damage thresholds for assessing treatment endpoint. Dynamic changes in tissue attenuation/absorption and perfusion must be included for accurate prediction of temperature profiles and extents of the ablation zone. Lastly, we demonstrated use of the modeling platform for patient-specific treatment planning.
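Of the three endpoint metrics compared above, thermal dose is commonly computed as cumulative equivalent minutes at 43 deg C (the Sapareto-Dean formulation). The sketch below implements only that textbook formula; the specific threshold values the authors identified are not reproduced here.

```python
def cem43(temps_c, dt_min):
    """Cumulative equivalent minutes at 43 deg C (Sapareto-Dean thermal
    dose): each interval at temperature T counts as R**(43 - T) minutes,
    with R = 0.5 at or above 43 deg C and R = 0.25 below."""
    dose = 0.0
    for t in temps_c:
        r = 0.5 if t >= 43.0 else 0.25
        dose += (r ** (43.0 - t)) * dt_min
    return dose
```

One minute spent at 44 deg C counts as two equivalent minutes at 43 deg C, so a 1 deg C error in predicted temperature doubles the predicted dose; this sensitivity is one reason dynamic changes in tissue attenuation and perfusion matter for accurate planning.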
Development of a patient-specific model for calculation of pulmonary function
NASA Astrophysics Data System (ADS)
Zhong, Hualiang; Ding, Mingyue; Movsas, Benjamin; Chetty, Indrin J.
2011-06-01
The purpose of this paper is to develop a patient-specific finite element model (FEM) to calculate the pulmonary function of lung cancer patients for evaluation of radiation treatment. The lung model was created with an in-house developed FEM software with region-specific parameters derived from a four-dimensional CT (4DCT) image. The model was used first to calculate changes in air volume and elastic stress in the lung, and then to calculate regional compliance defined as the change in air volume corrected by its associated stress. The results have shown that the resultant compliance images can reveal the regional elastic property of lung tissue, and could be useful for radiation treatment planning and assessment.
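The regional compliance defined in the abstract reduces to a voxel-wise ratio of air-volume change to elastic stress. The sketch below shows that arithmetic on plain lists; the variable names and the NaN convention for unstressed voxels are illustrative choices, not details of the authors' FEM software.

```python
def compliance_map(delta_air_volume, elastic_stress):
    """Voxel-wise regional compliance: change in air volume divided by
    the associated elastic stress; NaN where the stress is zero."""
    return [dv / s if s != 0 else float("nan")
            for dv, s in zip(delta_air_volume, elastic_stress)]

# Two voxels with equal compliance despite different volume changes.
c = compliance_map([2.0, 3.0], [1.0, 1.5])
```

Correcting volume change by stress is what distinguishes this compliance image from a plain ventilation map: a region can show a small volume change either because its tissue is stiff or because it bears little stress, and only the ratio separates the two.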
Determination of photovoltaic concentrator optical design specifications using performance modeling
NASA Astrophysics Data System (ADS)
Kerschen, Kevin A.; Levy, Sheldon L.
The strategy used to develop an optical design specification for a 500X concentration photovoltaic module to be used with a 28-percent-efficient concentrator photovoltaic cell is reported. The computer modeling code (PVOPTICS) developed for this purpose, a Fresnel lens design strategy, and optical component specification procedures are described. Comparisons are made between the predicted performance and the measured performance of components fabricated to those specifications. An acrylic lens and a reflective secondary optical element have been tested, showing efficiencies exceeding 88 percent.
Developing physiologically-based pharmacokinetic (PBPK) models for chemicals can be resource-intensive, as neither chemical-specific parameters nor in vivo pharmacokinetic data are easily available for model construction. Previously developed, well-parameterized, and thoroughly-v...
Component Models for Semantic Web Languages
NASA Astrophysics Data System (ADS)
Henriksson, Jakob; Aßmann, Uwe
Intelligent applications and agents on the Semantic Web typically need to be specified with, or interact with specifications written in, many different kinds of formal languages. Such languages include ontology languages, data and metadata query languages, as well as transformation languages. As learnt from years of experience in development of complex software systems, languages need to support some form of component-based development. Components enable higher software quality, better understanding and reusability of already developed artifacts. Any component approach contains an underlying component model, a description detailing what valid components are and how components can interact. With the multitude of languages developed for the Semantic Web, what are their underlying component models? Do we need to develop one for each language, or is a more general and reusable approach achievable? We present a language-driven component model specification approach. This means that a component model can be (automatically) generated from a given base language (actually, its specification, e.g. its grammar). As a consequence, we can provide components for different languages and simplify the development of software artifacts used on the Semantic Web.
iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations
NASA Astrophysics Data System (ADS)
Vanfretti, L.; Rabuzin, T.; Baudette, M.; Murad, M.
The iTesla Power Systems Library (iPSL) is a Modelica package providing a set of power system components for phasor time-domain modeling and simulation. The Modelica language provides a systematic approach to develop models using a formal mathematical description that uniquely specifies the physical behavior of a component or the entire system. Furthermore, the standardized specification of the Modelica language (Modelica Association [1]) enables unambiguous model exchange by allowing any Modelica-compliant tool to utilize the models for simulation and analysis without the need for a specific model transformation tool. As the Modelica language is being developed with open specifications, any tool that implements these requirements can be utilized. This gives users the freedom of choosing an Integrated Development Environment (IDE) of their choice. Furthermore, any integration solver can be implemented within a Modelica tool to simulate Modelica models. Additionally, Modelica is an object-oriented language, enabling code factorization and model re-use to improve the readability of a library by structuring it with an object-oriented hierarchy. The developed library is released under an open source license to enable a wider distribution and let users customize it to their specific needs. This paper describes the iPSL and provides illustrative application examples.
A Machine Learning Approach to Predict Gene Regulatory Networks in Seed Development in Arabidopsis
Ni, Ying; Aghamirzaie, Delasa; Elmarakeby, Haitham; Collakova, Eva; Li, Song; Grene, Ruth; Heath, Lenwood S.
2016-01-01
Gene regulatory networks (GRNs) provide a representation of relationships between regulators and their target genes. Several methods for GRN inference, both unsupervised and supervised, have been developed to date. Because regulatory relationships consistently reprogram in diverse tissues or under different conditions, GRNs inferred without specific biological contexts are of limited applicability. In this report, a machine learning approach is presented to predict GRNs specific to developing Arabidopsis thaliana embryos. We developed the Beacon GRN inference tool to predict GRNs occurring during seed development in Arabidopsis based on a support vector machine (SVM) model. We developed both global and local inference models and compared their performance, demonstrating that local models are generally superior for our application. Using both the expression levels of the genes expressed in developing embryos and prior known regulatory relationships, GRNs were predicted for specific embryonic developmental stages. The targets that are strongly positively correlated with their regulators are mostly expressed at the beginning of seed development. Potential direct targets were identified based on a match between the promoter regions of these inferred targets and the cis elements recognized by specific regulators. Our analysis also provides evidence for previously unknown inhibitory effects of three positive regulators of gene expression. The Beacon GRN inference tool provides a valuable model system for context-specific GRN inference and is freely available at https://github.com/BeaconProjectAtVirginiaTech/beacon_network_inference.git. PMID:28066488
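The supervised core of such a GRN inference pipeline can be caricatured in a few lines: label known (regulator, target) pairs as positive examples, featurize each pair from expression data, and train a maximum-margin classifier. The sketch below trains a linear SVM by hinge-loss subgradient descent on tiny hand-made features; the actual Beacon tool's features, kernel, and global-versus-local model structure are not reproduced here.

```python
# From-scratch linear SVM (hinge loss + L2 regularization, trained by
# subgradient descent) as a stand-in for the SVM used in GRN inference.
# Each row of X is a feature vector for one (regulator, target) pair;
# y is +1 for a known regulatory pair, -1 otherwise. Data are toy.

def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.1):
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:  # hinge active: step toward correct side
                w = [wj - lr * (lam * wj - yi * xj)
                     for wj, xj in zip(w, xi)]
                b += lr * yi
            else:           # hinge inactive: regularization shrink only
                w = [wj - lr * lam * wj for wj in w]
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Separable toy data: positive pairs have large correlation-like features.
X = [[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]]
y = [1, 1, -1, -1]
w, b = train_linear_svm(X, y)
```

Scoring every candidate (regulator, target) pair with such a classifier yields a predicted network; the local models described in the report amount to training a separate classifier per regulator rather than one global one.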
Hunt, R.J.; Anderson, M.P.; Kelson, V.A.
1998-01-01
This paper demonstrates that analytic element models have potential as powerful screening tools that can facilitate or improve calibration of more complicated finite-difference and finite-element models. We demonstrate how a two-dimensional analytic element model was used to identify errors in a complex three-dimensional finite-difference model caused by incorrect specification of boundary conditions. An improved finite-difference model was developed using boundary conditions developed from a far-field analytic element model. Calibration of a revised finite-difference model was achieved using fewer zones of hydraulic conductivity and lake bed conductance than the original finite-difference model. Calibration statistics were also improved in that simulated base-flows were much closer to measured values. The improved calibration is due mainly to improved specification of the boundary conditions made possible by first solving the far-field problem with an analytic element model.
Hannah, Iain; Montefiori, Erica; Modenese, Luca; Prinold, Joe; Viceconti, Marco; Mazzà, Claudia
2017-01-01
Subject-specific musculoskeletal modelling is especially useful in the study of juvenile and pathological subjects. However, such methodologies typically require a human operator to identify key landmarks from medical imaging data and are thus affected by unavoidable variability in the parameters defined and subsequent model predictions. The aim of this study was to thus quantify the inter- and intra-operator repeatability of a subject-specific modelling methodology developed for the analysis of subjects with juvenile idiopathic arthritis. Three operators each created subject-specific musculoskeletal foot and ankle models via palpation of bony landmarks, adjustment of geometrical muscle points and definition of joint coordinate systems. These models were then fused to a generic Arnold lower limb model for each of three modelled patients. The repeatability of each modelling operation was found to be comparable to those previously reported for the modelling of healthy, adult subjects. However, the inter-operator repeatability of muscle point definition was significantly greater than intra-operator repeatability (p < 0.05) and predicted ankle joint contact forces ranged by up to 24% and 10% of the peak force for the inter- and intra-operator analyses, respectively. Similarly, the maximum inter- and intra-operator variations in muscle force output were 64% and 23% of peak force, respectively. Our results suggest that subject-specific modelling is operator dependent at the foot and ankle, with the definition of muscle geometry the most significant source of output uncertainty. The development of automated procedures to prevent the misplacement of crucial muscle points should therefore be considered a particular priority for those developing subject-specific models. PMID:28427313
The Air Quality Model Evaluation International Initiative ...
This presentation provides an overview of the Air Quality Model Evaluation International Initiative (AQMEII). It contains a synopsis of the three phases of AQMEII, including objectives, logistics, and timelines. It also provides a number of examples of analyses conducted through AQMEII with a particular focus on past and future analyses of deposition. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
Schaepe, Nathaniel J.; Soenksen, Philip J.; Rus, David L.
2014-01-01
The lower Platte River, Nebraska, provides drinking water, irrigation water, and in-stream flows for recreation, wildlife habitat, and vital habitats for several threatened and endangered species. The U.S. Geological Survey (USGS), in cooperation with the Lower Platte River Corridor Alliance (LPRCA) developed site-specific regression models for water-quality constituents at four sites (Shell Creek near Columbus, Nebraska [USGS site 06795500]; Elkhorn River at Waterloo, Nebr. [USGS site 06800500]; Salt Creek near Ashland, Nebr. [USGS site 06805000]; and Platte River at Louisville, Nebr. [USGS site 06805500]) in the lower Platte River corridor. The models were developed by relating continuously monitored water-quality properties (surrogate measurements) to discrete water-quality samples. These models enable existing web-based software to provide near-real-time estimates of stream-specific constituent concentrations to support natural resources management decisions. Since 2007, USGS, in cooperation with the LPRCA, has continuously monitored four water-quality properties seasonally within the lower Platte River corridor: specific conductance, water temperature, dissolved oxygen, and turbidity. During 2007 through 2011, the USGS and the Nebraska Department of Environmental Quality collected and analyzed discrete water-quality samples for nutrients, major ions, pesticides, suspended sediment, and bacteria. These datasets were used to develop the regression models. This report documents the collection of these various water-quality datasets and the development of the site-specific regression models. Regression models were developed for all four monitored sites. Constituent models for Shell Creek included nitrate plus nitrite, total phosphorus, orthophosphate, atrazine, acetochlor, suspended sediment, and Escherichia coli (E. coli) bacteria. 
Regression models that were developed for the Elkhorn River included nitrate plus nitrite, total Kjeldahl nitrogen, total phosphorus, orthophosphate, chloride, atrazine, acetochlor, suspended sediment, and E. coli. Models developed for Salt Creek included nitrate plus nitrite, total Kjeldahl nitrogen, suspended sediment, and E. coli. Lastly, models developed for the Platte River site included total Kjeldahl nitrogen, total phosphorus, sodium, metolachlor, atrazine, acetochlor, suspended sediment, and E. coli.
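The surrogate-regression approach described above lends itself to a short sketch: discrete suspended-sediment concentration (SSC) samples are regressed against continuously monitored turbidity, and the fitted equation then converts the continuous turbidity record into near-real-time concentration estimates. All data and coefficients below are invented for illustration and are not USGS values.

```python
import numpy as np

# Hypothetical paired observations: turbidity (FNU) and sampled SSC (mg/L).
turbidity = np.array([5.0, 12.0, 30.0, 75.0, 150.0, 400.0])
ssc = np.array([8.0, 18.0, 42.0, 95.0, 180.0, 430.0])

# Fit log10(SSC) = b0 + b1 * log10(turbidity), a common form for such models.
b1, b0 = np.polyfit(np.log10(turbidity), np.log10(ssc), 1)

def estimate_ssc(turb):
    """Convert a continuous turbidity reading into an estimated SSC (mg/L)."""
    return 10 ** (b0 + b1 * np.log10(turb))

# A continuous turbidity reading (here 50 FNU) maps to a near-real-time
# concentration estimate without waiting for a laboratory sample.
print(estimate_ssc(50.0))
```

In practice such models are fit in log space with a bias-correction factor applied when back-transforming; that refinement is omitted here for brevity.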
Cavuşoğlu, M Cenk; Göktekin, Tolga G; Tendick, Frank
2006-04-01
This paper presents the architectural details of an evolving open source/open architecture software framework for developing organ-level surgical simulations. Our goal is to facilitate shared development of reusable models, to accommodate heterogeneous models of computation, and to provide a framework for interfacing multiple heterogeneous models. The framework provides an application programming interface for interfacing dynamic models defined over spatial domains. It is specifically designed to be independent of the specifics of the modeling methods used, and therefore facilitates seamless integration of heterogeneous models and processes. Furthermore, each model has separate geometries for visualization, simulation, and interfacing, allowing the model developer to choose the most natural geometric representation for each case. Input/output interfaces for visualization and haptics for real-time interactive applications have also been provided.
Evaluation and application of regional turbidity-sediment regression models in Virginia
Hyer, Kenneth; Jastram, John D.; Moyer, Douglas; Webber, James S.; Chanat, Jeffrey G.
2015-01-01
Conventional thinking has long held that turbidity-sediment surrogate-regression equations are site specific and that regression equations developed at a single monitoring station should not be applied to another station; however, few studies have evaluated this issue in a rigorous manner. If robust regional turbidity-sediment models can be developed successfully, their applications could greatly expand the usage of these methods. Suspended sediment load estimation could occur as soon as flow and turbidity monitoring commence at a site, suspended sediment sampling frequencies for various projects potentially could be reduced, and special-project applications (sediment monitoring following dam removal, for example) could be significantly enhanced. The objective of this effort was to investigate the turbidity-suspended sediment concentration (SSC) relations at all available USGS monitoring sites within Virginia to determine whether meaningful turbidity-sediment regression models can be developed by combining the data from multiple monitoring stations into a single model, known as a “regional” model. Following the development of the regional model, additional objectives included a comparison of predicted SSCs between the regional model and commonly used site-specific models, as well as an evaluation of why specific monitoring stations did not fit the regional model.
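As a hedged sketch of the regional-model idea (not the paper's actual procedure or data), the snippet below pools invented turbidity-SSC pairs from three hypothetical stations into a single log-space regression and uses per-site residuals to flag stations that do not fit the regional relation:

```python
import numpy as np

# Hypothetical (turbidity, SSC) pairs from three stations; site_C deliberately
# departs from the relation shared by the other two.
sites = {
    "site_A": (np.array([10.0, 50.0, 200.0]), np.array([14.0, 60.0, 230.0])),
    "site_B": (np.array([8.0, 40.0, 160.0]), np.array([11.0, 48.0, 185.0])),
    "site_C": (np.array([12.0, 60.0, 240.0]), np.array([40.0, 190.0, 760.0])),
}

# Pool all sites and fit one "regional" model in log space.
x = np.log10(np.concatenate([t for t, _ in sites.values()]))
y = np.log10(np.concatenate([s for _, s in sites.values()]))
b1, b0 = np.polyfit(x, y, 1)

# Mean absolute log residual per site: a large value suggests the station does
# not fit the regional relation and may still need a site-specific model.
resids = {}
for name, (t, s) in sites.items():
    resids[name] = float(np.mean(np.abs(np.log10(s) - (b0 + b1 * np.log10(t)))))
print(resids)
```

Here site_C shows the largest residuals, mirroring the paper's secondary question of why specific monitoring stations do not fit the regional model.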
Development and calibration of the statewide land use-transport model
DOT National Transportation Integrated Search
1999-02-12
The TRANUS package has been used to develop an integrated land use-transport model at the statewide level. It is based upon a specific modeling approach described by de la Barra (1989,1995). For the prototype statewide model developed during this pro...
A Neuroconstructivist Model of Past Tense Development and Processing
ERIC Educational Resources Information Center
Westermann, Gert; Ruh, Nicolas
2012-01-01
We present a neural network model of learning and processing the English past tense that is based on the notion that experience-dependent cortical development is a core aspect of cognitive development. During learning the model adds and removes units and connections to develop a task-specific final architecture. The model provides an integrated…
1984-12-01
model provides a method for communicating a specific training equipment design to the procurement office after an ISD analysis has established a...maintenance trainer has been identified. The model provides a method by which a training equipment design can be communicated to the System Project Office...ensure ease of development of procurement specifications and consistency between different documented designs. A completed application of this model
DEVS Unified Process for Web-Centric Development and Testing of System of Systems
2008-05-20
gathering from the user. Further, methodologies have been developed to generate DEVS models from BPMN/BPEL-based and message-based requirement specifications...27] 3. BPMN/BPEL-based system specifications: Business Process Modeling Notation (BPMN) [bpm] or Business Process Execution Language (BPEL) provide a...information is stored in .wsdl and .bpel files for BPEL but in proprietary format for BPMN. 4. DoDAF-based requirement specifications: Department of
NASA Technical Reports Server (NTRS)
Sterritt, Roy (Inventor); Hinchey, Michael G. (Inventor); Penn, Joaquin (Inventor)
2011-01-01
Systems, methods and apparatus are provided through which, in some embodiments, an agent-oriented specification modeled with MaCMAS is analyzed, flaws in the agent-oriented specification modeled with MaCMAS are corrected, and an implementation is derived from the corrected agent-oriented specification. Described herein are systems, methods and apparatus that produce fully (mathematically) tractable development of agent-oriented specifications modeled with the methodology fragment for analyzing complex multiagent systems (MaCMAS) and policies for autonomic systems, from requirements through to code generation. The systems, methods and apparatus described herein are illustrated through an example showing how user-formulated policies can be translated into a formal model which can then be converted to code. The requirements-based programming systems, methods and apparatus described herein may provide faster, higher quality development and maintenance of autonomic systems based on user formulation of policies.
ERIC Educational Resources Information Center
Tumthong, Suwut; Piriyasurawong, Pullop; Jeerangsuwan, Namon
2016-01-01
This research proposes a functional competency development model for academic personnel based on international professional qualification standards in computing field and examines the appropriateness of the model. Specifically, the model consists of three key components which are: 1) functional competency development model, 2) blended training…
Generation of improved humanized mouse models for human infectious diseases
Brehm, Michael A.; Wiles, Michael V.; Greiner, Dale L.; Shultz, Leonard D.
2014-01-01
The study of human-specific infectious agents has been hindered by the lack of optimal small animal models. More recently, development of novel strains of immunodeficient mice has begun to provide the opportunity to utilize small animal models for the study of many human-specific infectious agents. The introduction of a targeted mutation in the IL2 receptor common gamma chain gene (IL2rgnull) in mice already deficient in T and B cells led to a breakthrough in the ability to engraft hematopoietic stem cells, as well as functional human lymphoid cells and tissues, effectively creating human immune systems in immunodeficient mice. These humanized mice are becoming increasingly important as pre-clinical models for the study of human immunodeficiency virus-1 (HIV-1) and other human-specific infectious agents. However, there remain a number of opportunities to further improve humanized mouse models for the study of human-specific infectious agents. This is being done by the implementation of innovative technologies, which collectively will accelerate the development of new models of genetically modified mice, including: i) modifications of the host to reduce innate immunity, which impedes human cell engraftment; ii) genetic modification to provide human-specific growth factors and cytokines required for optimal human cell growth and function; and iii) new cell and tissue engraftment protocols. The development of “next generation” humanized mouse models continues to provide exciting opportunities for the establishment of robust small animal models to study the pathogenesis of human-specific infectious agents, as well as for testing the efficacy of therapeutic agents and experimental vaccines. PMID:24607601
Impacts of Lateral Boundary Conditions on US Ozone ...
Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, we perform annual simulations over North America with chemical boundary conditions prepared from two global models (GEOS-CHEM and Hemispheric CMAQ). Results indicate that the impacts of different boundary conditions on ozone can be significant throughout the year.
ERIC Educational Resources Information Center
Faust, Stephen M.
1980-01-01
Presents a 3-phase model (content research, specification, delivery) for instructional development-operations research and describes its application in developing courses in zoology, geology, and paleontology. (MER)
NASA Technical Reports Server (NTRS)
Ensey, Tyler S.
2013-01-01
During my internship at NASA, I was a model developer for Ground Support Equipment (GSE). The purpose of a model developer is to develop and unit test model component libraries (fluid, electrical, gas, etc.). The models are designed to simulate software for GSE (Ground Special Power, Crew Access Arm, Cryo, Fire and Leak Detection System, Environmental Control System (ECS), etc.) before they are implemented into hardware. These models support verifying local control and remote software for End-Item Software Under Test (SUT). The model simulates the physical behavior (function, state, limits, and I/O) of each end-item and its dependencies as defined in the Subsystem Interface Table, Software Requirements & Design Specification (SRDS), Ground Integrated Schematic (GIS), and System Mechanical Schematic (SMS). The software of each specific model component is simulated through MATLAB's Simulink program. The model development life cycle is as follows: identify source documents; identify model scope; update schedule; preliminary design review; develop model requirements; update model scope; update schedule; detailed design review; create/modify library components; implement library component references; implement subsystem components; develop a test script; run the test script; develop a user's guide; send the model out for peer review; the model is sent out for verification/validation; if there is empirical data, a validation data package is generated; if there is not empirical data, a verification package is generated; the test results are then reviewed; and finally, the user requests accreditation, and a statement of accreditation is prepared. Once each component model is reviewed and approved, they are intertwined together into one integrated model. This integrated model is then tested itself, through a test script and autotest, so that it can be concluded that all models work conjointly, for a single purpose.
The component I was assigned, specifically, was a fluid component, a discrete pressure switch. The switch takes a fluid pressure input, and if the pressure is greater than a designated cutoff pressure, the switch would stop fluid flow.
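The pressure-switch behavior described above is simple enough to sketch directly; the class name and the cutoff value here are illustrative, not taken from the NASA model library:

```python
# Minimal sketch of a discrete pressure switch: the switch reads a fluid
# pressure input and, once the pressure exceeds a designated cutoff, it
# stops fluid flow.
class DiscretePressureSwitch:
    def __init__(self, cutoff_pressure):
        self.cutoff_pressure = cutoff_pressure
        self.flow_enabled = True

    def update(self, pressure):
        """Simulate one step: disable flow when pressure exceeds the cutoff."""
        self.flow_enabled = pressure <= self.cutoff_pressure
        return self.flow_enabled

switch = DiscretePressureSwitch(cutoff_pressure=150.0)  # psi, hypothetical
print(switch.update(100.0))  # below cutoff: True (flow allowed)
print(switch.update(180.0))  # above cutoff: False (flow stopped)
```

In the actual GSE models this logic would live in a Simulink block rather than Python; the sketch only captures the stated cutoff behavior.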
An Automated Method for Landmark Identification and Finite-Element Modeling of the Lumbar Spine.
Campbell, Julius Quinn; Petrella, Anthony J
2015-11-01
The purpose of this study was to develop a method for the automated creation of finite-element models of the lumbar spine. Custom scripts were written to extract bone landmarks of lumbar vertebrae and assemble L1-L5 finite-element models. End-plate borders, ligament attachment points, and facet surfaces were identified. Landmarks were identified to maintain mesh correspondence between meshes for later use in statistical shape modeling. 90 lumbar vertebrae were processed creating 18 subject-specific finite-element models. Finite-element model surfaces and ligament attachment points were reproduced within 1e-5 mm of the bone surface, including the critical contact surfaces of the facets. Element quality exceeded specifications in 97% of elements for the 18 models created. The current method is capable of producing subject-specific finite-element models of the lumbar spine with good accuracy, quality, and robustness. The automated methods developed represent advancement in the state of the art of subject-specific lumbar spine modeling to a scale not possible with prior manual and semiautomated methods.
Formal Specification of Information Systems Requirements.
ERIC Educational Resources Information Center
Kampfner, Roberto R.
1985-01-01
Presents a formal model for specification of logical requirements of computer-based information systems that incorporates structural and dynamic aspects based on two separate models: the Logical Information Processing Structure and the Logical Information Processing Network. The model's role in systems development is discussed. (MBR)
ERIC Educational Resources Information Center
Tapps, Tyler; Passmore, Tim; Lindenmeier, Donna; Kensinger, Weston
2014-01-01
The experiential learning model for students working with community groups was developed for specific experiential learning experiences involving 40 hours of actual experience for high school physical education students working with groups in the community. This article discusses the development and specific segments of the model, as well as how…
Career Planning: Towards a More Inclusive Model for Women and Diverse Individuals
ERIC Educational Resources Information Center
Banks, Claretha H.
2006-01-01
Since the 1953 introduction of Super's model of career development, many publications regarding career development and career planning have been developed. However, career planning models for women and diverse individuals are not prevalent. This paper contains a literature review of various well-known models that have few specific applications for…
Ku-Band rendezvous radar performance computer simulation model
NASA Technical Reports Server (NTRS)
Magnusson, H. G.; Goff, M. F.
1984-01-01
All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescopes.
[Case study on health risk assessment based on site-specific conceptual model].
Zhong, Mao-Sheng; Jiang, Lin; Yao, Jue-Jun; Xia, Tian-Xiang; Zhu, Xiao-Ying; Han, Dan; Zhang, Li-Na
2013-02-01
A site investigation was carried out in an area to be redeveloped as a subway station, which lies directly downstream, in terms of groundwater flow, of a former chemical plant. The results indicate that the subsurface soil and groundwater in the area are both heavily polluted by 1,2-dichloroethane originating from the upstream chemical plant; the highest concentration was 104.08 mg·kg⁻¹ for a soil sample at 8.6 m below ground and 18,500 µg·L⁻¹ for groundwater. Further, a site-specific contamination conceptual model, giving consideration to the specific structural configuration of the station, was developed, and the corresponding risk calculation equation was derived. The carcinogenic risks calculated with models based on the generic site conceptual model and with those derived herein from the site-specific conceptual model were compared. Both models indicate that the carcinogenic risk is significantly higher than the acceptable level of 1 × 10⁻⁶. The comparison reveals that the risks calculated with the former models for soil and groundwater are higher than those calculated with the latter models by 2 times and 1.5 times, respectively. The findings in this paper indicate that the generic risk assessment model may overestimate the risk if specific site conditions and structure configuration are not considered.
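For readers unfamiliar with the screening step behind such comparisons, a generic carcinogenic-risk calculation multiplies a chronic daily intake by a slope factor and compares the result against the 1 × 10⁻⁶ acceptable level. The sketch below is a textbook drinking-water-ingestion form, not the site-specific equation derived in the paper; every parameter default, including the slope factor, is an illustrative assumption.

```python
# Generic screening-level calculation: chronic daily intake (CDI) times an
# oral slope factor (SF) gives lifetime excess cancer risk. All defaults
# below are illustrative assumptions, not values from the paper.
def carcinogenic_risk(conc_mg_per_l, intake_l_per_day=2.0, ef_days_per_yr=350,
                      ed_years=30, bw_kg=70.0, at_days=70 * 365,
                      slope_factor=0.091):
    cdi = (conc_mg_per_l * intake_l_per_day * ef_days_per_yr * ed_years) / (bw_kg * at_days)
    return cdi * slope_factor

# The abstract's peak groundwater concentration, 18,500 ug/L = 18.5 mg/L,
# screens far above the 1e-6 acceptable level under these assumptions.
print(carcinogenic_risk(18.5) > 1e-6)  # True
```

The paper's contribution is precisely that this generic form ignores the station's structural configuration, which the site-specific equation accounts for.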
Models, Measurements, and Local Decisions: Assessing and ...
This presentation includes a combination of modeling and measurement results to characterize near-source air quality in Newark, New Jersey, with consideration of how this information could be used to inform decision making to reduce the risk of health impacts. Decisions could include either exposure or emissions reductions and could involve a host of stakeholders, including residents, academics, NGOs, and local and federal agencies. This presentation includes results from the C-PORT modeling system and from a citizen science project in the local area.
WRF/CMAQ AQMEII3 Simulations of US Regional-Scale ...
Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, performed during the third phase of the Air Quality Model Evaluation International Initiative (AQMEII3), we perform annual simulations over North America with chemical boundary conditions prepared from four different global models. Results indicate that the impacts of different boundary conditions are significant for ozone throughout the year and most pronounced outside the summer season.
A Five- Year CMAQ Model Performance for Wildfires and ...
Biomass burning has been identified as an important contributor to the degradation of air quality because of its impact on ozone and particulate matter. Two components of the biomass burning inventory, wildfires and prescribed fires, are routinely estimated in the national emissions inventory. However, there is a large amount of uncertainty in the development of these emission inventory sectors. We have completed a 5-year set of CMAQ model simulations (2008-2012) in which we have simulated regional air quality with and without the wildfire and prescribed fire inventory. We will examine CMAQ model performance over regions with significant PM2.5 and ozone contributions from prescribed fires and wildfires.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, Steven Karl; Determan, John C.
Dynamic System Simulation (DSS) models of fissile solution systems have been developed and verified against a variety of historical configurations. DSS techniques have been applied specifically to subcritical accelerator-driven systems using fissile solution fuels of uranium. Initial DSS models were developed in DESIRE, a specialized simulation scripting language. In order to tailor the DSS models to specifically meet the needs of system designers, they were converted to a Visual Studio implementation, and one of these was subsequently converted to National Instruments' LabVIEW for human factors engineering and operator training. Specific operational characteristics of subcritical accelerator-driven systems have been examined using a DSS model tailored to this particular class of fissile-solution-fueled system.
A discrete-element model for viscoelastic deformation and fracture of glacial ice
NASA Astrophysics Data System (ADS)
Riikilä, T. I.; Tallinen, T.; Åström, J.; Timonen, J.
2015-10-01
A discrete-element model was developed to study the behavior of viscoelastic materials that are allowed to fracture. Although applicable to many materials, the main objective of this analysis was to develop a model specifically for ice dynamics. A realistic model of glacial ice must include elasticity, brittle fracture, and slow viscous deformations. Here the model is described in detail and tested with several benchmark simulations. The model was used to simulate various ice-specific applications, with resulting flow rates that were compatible with Glen's law, and it produced under fragmentation fragment-size distributions that agreed with known analytical and experimental results.
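Glen's law, the constitutive relation the simulated flow rates were checked against, states that strain rate varies with the shear stress raised to a power n ≈ 3. A minimal sketch follows, using a rate factor typical of temperate ice from the glaciology literature (an assumption, not a value from this paper):

```python
# Glen's flow law for ice: strain rate = A * tau**n, with n ~ 3.
def glen_strain_rate(tau_pa, rate_factor=2.4e-24, n=3):
    """Shear strain rate (1/s) for shear stress tau_pa (Pa) under Glen's law."""
    return rate_factor * tau_pa ** n

# The cubic exponent means doubling the stress multiplies the strain rate
# by 2**3 = 8, which is the signature a model comparison would check for.
r1 = glen_strain_rate(1.0e5)  # 100 kPa, a typical basal shear stress
r2 = glen_strain_rate(2.0e5)
print(r2 / r1)  # 8.0
```

A discrete-element simulation is judged "compatible with Glen's law" when its bulk flow rates reproduce this power-law dependence on stress.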
Pieces of the Puzzle: Tracking the Chemical Component of the ...
This presentation provides an overview of the risk assessment conducted at the U.S. EPA, as well as some research examples related to the exposome concept. This presentation also recommends using two organizational and predictive frameworks for tracking chemical components in the exposome.
SUMMA and Model Mimicry: Understanding Differences Among Land Models
NASA Astrophysics Data System (ADS)
Nijssen, B.; Nearing, G. S.; Ou, G.; Clark, M. P.
2016-12-01
Model inter-comparison and model ensemble experiments suffer from an inability to explain the mechanisms behind differences in model outcomes. We can clearly demonstrate that the models are different, but we cannot necessarily identify the reasons why, because most models exhibit myriad differences in process representations, model parameterizations, model parameters and numerical solution methods. This inability to identify the reasons for differences in model performance hampers our understanding and limits model improvement, because we cannot easily identify the most promising paths forward. We have developed the Structure for Unifying Multiple Modeling Alternatives (SUMMA) to allow for controlled experimentation with model construction, numerical techniques, and parameter values and therefore isolate differences in model outcomes to specific choices during the model development process. In developing SUMMA, we recognized that hydrologic models can be thought of as individual instantiations of a master modeling template that is based on a common set of conservation equations for energy and water. Given this perspective, SUMMA provides a unified approach to hydrologic modeling that integrates different modeling methods into a consistent structure with the ability to instantiate alternative hydrologic models at runtime. Here we employ SUMMA to revisit a previous multi-model experiment and demonstrate its use for understanding differences in model performance. Specifically, we implement SUMMA to mimic the spread of behaviors exhibited by the land models that participated in the Protocol for the Analysis of Land Surface Models (PALS) Land Surface Model Benchmarking Evaluation Project (PLUMBER) and draw conclusions about the relative performance of specific model parameterizations for water and energy fluxes through the soil-vegetation continuum. 
SUMMA's ability to mimic the spread of model ensembles and the behavior of individual models can be an important tool in focusing model development and improvement efforts.
Miles, Brad; Kolos, Elizabeth; Walter, William L; Appleyard, Richard; Shi, Angela; Li, Qing; Ruys, Andrew J
2015-06-01
Subject-specific finite element (FE) modeling methodology could predict peri-prosthetic femoral fracture (PFF) for cementless hip arthroplasty in the early postoperative period. This study develops a methodology for subject-specific finite element modeling by using the element deactivation technique to simulate bone failure, validates it against experimental testing, and thereby predicts peri-prosthetic femoral fracture in the early postoperative period. Material assignments for biphasic and triphasic models were undertaken. Failure modeling with the element deactivation feature available in ABAQUS 6.9 was used to simulate crack initiation and propagation in the bony tissue based upon a threshold of fracture strain. The crack mode for the biphasic models was very similar to the experimental crack mode, with a similar shape and path of the crack. The fracture load is sensitive to the friction coefficient at the implant-bone interface. The development of a novel technique to simulate bone failure by element deactivation of subject-specific finite element models could aid prediction of fracture load in addition to fracture risk characterization for PFF. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
Bijleveld, Yuma A; de Haan, Timo R; van der Lee, Johanna H; Groenendaal, Floris; Dijk, Peter H; van Heijst, Arno; de Jonge, Rogier C J; Dijkman, Koen P; van Straaten, Henrica L M; Rijken, Monique; Zonnenberg, Inge A; Cools, Filip; Zecic, Alexandra; Nuytemans, Debbie H G M; van Kaam, Anton H; Mathôt, Ron A A
2018-04-01
The pharmacokinetic (PK) properties of intravenous (i.v.) benzylpenicillin in term neonates undergoing moderate hypothermia after perinatal asphyxia were evaluated, as they have been unknown until now. A system-specific modeling approach was applied, in which our recently developed covariate model describing developmental and temperature-induced changes in amoxicillin clearance (CL) in the same patient study population was incorporated into a population PK model of benzylpenicillin with a priori birthweight (BW)-based allometric scaling. Pediatric population covariate models describing the developmental changes in drug elimination may constitute system-specific information and may therefore be incorporated into PK models of drugs cleared through the same pathway. The performance of this system-specific model was compared to that of a reference model. Furthermore, Monte-Carlo simulations were performed to evaluate the optimal dose. The system-specific model performed as well as the reference model. Significant correlations were found between CL and postnatal age (PNA), gestational age (GA), body temperature (TEMP), urine output (UO; system-specific model), and multiorgan failure (reference model). For a typical patient with a GA of 40 weeks, BW of 3,000 g, PNA of 2 days (TEMP, 33.5°C), and normal UO (2 ml/kg/h), benzylpenicillin CL was 0.48 liter/h (interindividual variability [IIV] of 49%) and the volume of distribution of the central compartment was 0.62 liter/kg (IIV of 53%) in the system-specific model. Based on simulations, we advise a benzylpenicillin i.v. dose regimen of 75,000 IU/kg/day every 8 h (q8h), 150,000 IU/kg/day q8h, and 200,000 IU/kg/day q6h for patients with GAs of 36 to 37 weeks, 38 to 41 weeks, and ≥42 weeks, respectively. The system-specific model may be used for other drugs cleared through the same pathway accelerating model development. Copyright © 2018 American Society for Microbiology.
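The a priori birthweight-based allometric scaling of clearance described above can be sketched as follows. The 0.75 exponent is the conventional allometric choice, not a value reported in this study, and the reference clearance simply reuses the typical value quoted in the abstract (0.48 liter/h at a birthweight of 3,000 g); the sketch is illustrative only.

```python
def scaled_clearance(bw_g: float,
                     cl_ref: float = 0.48,      # typical CL (L/h) quoted in the abstract
                     bw_ref_g: float = 3000.0,  # reference birthweight (g)
                     exponent: float = 0.75) -> float:
    """Allometric scaling of clearance by birthweight:
    CL = CL_ref * (BW / BW_ref) ** exponent."""
    return cl_ref * (bw_g / bw_ref_g) ** exponent
```

For example, a 1,500-g neonate would be assigned roughly 0.59 times (0.5 to the power 0.75) the reference clearance rather than half of it, which is the point of allometric rather than linear scaling.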
Model Educational Specifications for Technology in Schools.
ERIC Educational Resources Information Center
Maryland State Dept. of Education, College Park. Office of Administration and Finance.
This description of the Model Edspec, which can be used by itself or in conjunction with the "Format Guide of Educational Specifications," serves as a comprehensive planning tool for the selection and application of technology. The model is designed to assist schools in implementing the facilities development process, thereby making…
NASA Technical Reports Server (NTRS)
Lee, T.; Boland, D. F., Jr.
1980-01-01
This document presents the results of an extensive survey and comparative evaluation of current atmosphere and wind models for inclusion in the Langley Atmospheric Information Retrieval System (LAIRS). It includes recommended models for use in LAIRS, estimated accuracies for the recommended models, and functional specifications for the development of LAIRS.
Archetype Model-Driven Development Framework for EHR Web System.
Kobayashi, Shinji; Kimura, Eizen; Ishihara, Ken
2013-12-01
This article describes the Web application framework for Electronic Health Records (EHRs) we have developed to reduce the construction costs of EHR systems. The openEHR project has developed a clinical model-driven architecture for future-proof interoperable EHR systems. The project provides the specifications to standardize clinical domain model implementations, upon which the ISO/CEN 13606 standards are based. The reference implementation has been formally described in Eiffel, and C# and Java implementations have been developed as references. Although scripting languages have become more popular in recent years because of their higher efficiency and faster development, they had not been used in openEHR implementations. Since 2007, we have used the Ruby language and Ruby on Rails (RoR) as an agile development platform to implement EHR systems in conformity with the openEHR specifications. We implemented almost all of the specifications, an Archetype Definition Language parser, and a RoR scaffold generator from archetypes. Although some problems emerged, most of them have been resolved. We have provided an agile EHR Web framework that can build up Web systems from archetype models using RoR. The feasibility of the archetype model for providing semantic interoperability of EHRs has been demonstrated, and we have verified that it is suitable for the construction of EHR systems.
Developing Formal Correctness Properties from Natural Language Requirements
NASA Technical Reports Server (NTRS)
Nikora, Allen P.
2006-01-01
This viewgraph presentation reviews the rationale of the program to transform natural language specifications into formal notation, specifically to automate the generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective than manual inspection at detecting certain types of specification design errors (e.g., race conditions, deadlock); (3) many requirements are still written in natural language, and the high learning curve for specification languages and associated tools, together with schedule and budget pressure on projects, reduces training opportunities for engineers; and (4) formulating correctness properties for system models can be a difficult problem. This is relevant to NASA in that it would simplify the development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments and/or technology transfer potential, and next steps.
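A canonical temporal-specification pattern of the kind discussed here is "every request is eventually followed by a response", written in LTL as G(request -> F response). The following minimal checker evaluates that pattern over a finite trace; the finite-trace semantics and the proposition names are illustrative assumptions, not material from the presentation.

```python
def always_eventually(trace, trigger, response):
    """Check G(trigger -> F response) over a finite trace (a list of sets of
    atomic propositions): every state satisfying `trigger` must be followed,
    at that state or later, by a state satisfying `response`."""
    for i, state in enumerate(trace):
        if trigger in state and not any(response in later for later in trace[i:]):
            return False
    return True
```

For instance, `always_eventually([{"request"}, set(), {"response"}], "request", "response")` holds, while a trace that ends without any response after a request violates the property.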
Specific model for the estimation of methane emission from municipal solid waste landfills in India.
Kumar, Sunil; Nimchuk, Nick; Kumar, Rakesh; Zietsman, Josias; Ramani, Tara; Spiegelman, Clifford; Kenney, Megan
2016-09-01
The landfill gas (LFG) model is a tool for estimating methane (CH4) generation rates and total CH4 emissions from a particular landfill. These models also have various applications, including sizing the LFG collection system, evaluating the benefits of gas recovery projects, and measuring and controlling gaseous emissions. This research paper describes the development of a landfill model designed specifically for Indian climatic conditions and waste characteristics. CH4, carbon dioxide (CO2), oxygen (O2), and temperature were considered the prime factors in the development of this model. The developed model was validated for three landfill sites in India: Shillong, Kolkata, and Jaipur. The autocorrelation coefficient for the model was 0.915, while the R2 value was 0.429. Copyright © 2016 Elsevier Ltd. All rights reserved.
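The abstract does not give the model's equations, but landfill-gas models of this kind are typically built on first-order decay of degradable waste (as in the US EPA's LandGEM). The sketch below shows only that general structure; k, L0, and the waste series are hypothetical values, not the parameters fitted for the Indian sites.

```python
import math

def annual_ch4(waste_tonnes, k=0.05, l0=100.0):
    """First-order-decay CH4 generation sketch: waste accepted in year i
    contributes k * L0 * M_i * exp(-k * age) cubic metres in each later year,
    where k is the decay rate (1/yr) and L0 the CH4 yield (m^3/tonne).
    Returns one generation total per year of the input series."""
    totals = []
    for t in range(len(waste_tonnes)):
        totals.append(sum(k * l0 * m * math.exp(-k * (t - i))
                          for i, m in enumerate(waste_tonnes[: t + 1])))
    return totals
```

The generation from any single year's waste peaks immediately and then decays exponentially, which is why site-specific k and L0 (the quantities climate and waste composition affect) dominate the model's behaviour.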
Building new physiologically based pharmacokinetic (PBPK) models requires a lot of data, such as chemical-specific parameters and in vivo pharmacokinetic data. Previously developed, well-parameterized, and thoroughly vetted models can be a great resource for supporting the constr...
Compositional Specification of Software Architecture
NASA Technical Reports Server (NTRS)
Penix, John; Lau, Sonie (Technical Monitor)
1998-01-01
This paper describes our experience using parameterized algebraic specifications to model properties of software architectures. The goal is to model the decomposition of requirements independent of the style used to implement the architecture. We begin by providing an overview of the role of architecture specification in software development. We then describe how architecture specifications are built up from component and connector specifications, and give an overview of insights gained from a case study used to validate the method.
The South Carolina Comprehensive Career Development Program for Grades K-12.
ERIC Educational Resources Information Center
South Carolina State Dept. of Education, Columbia.
This document presents a model Comprehensive Career Development Program for grades K-12 developed for the state of South Carolina. The model provides the framework for local school districts to evolve a program that will meet the specific career development needs for their district's students. The model is planned to organize, expand, and extend…
Modeling Clinical Information Needs in the Context of a Specific Patient
Price, Susan L.
2000-01-01
Investigators have tried various approaches to link clinical information directly to information sources that may contain answers to clinical questions. Developing a model of the clinical information needs that may arise while viewing information about a specific patient is a preliminary step toward an efficient, useful solution to this information retrieval problem. This poster illustrates a method of modeling clinical information needs in the context of a specific patient that is adapted from the entity-relationship models used in database design.
DICOM static and dynamic representation through unified modeling language
NASA Astrophysics Data System (ADS)
Martinez-Martinez, Alfonso; Jimenez-Alaniz, Juan R.; Gonzalez-Marquez, A.; Chavez-Avelar, N.
2004-04-01
The DICOM standard, like all standards, specifies in a generic way the management of digital medical images and their related information in network and storage media environments. However, understanding the specifications for a particular implementation is not trivial. This work is therefore about understanding and modelling parts of the DICOM standard using object-oriented methodologies as part of software development processes. This has offered different static and dynamic views in accordance with the standard specifications, and the resulting models have been represented in the Unified Modelling Language (UML). The modelled parts relate to the network conformance claim: Network Communication Support for Message Exchange, Message Exchange, Information Object Definitions, Service Class Specifications, Data Structures and Encoding, and the Data Dictionary. The resulting models have given a better understanding of the DICOM parts and have opened the possibility of creating a software library for developing DICOM-conformant PACS applications.
Modeling languages for biochemical network simulation: reaction vs equation based approaches.
Wiechert, Wolfgang; Noack, Stephan; Elsheikh, Atya
2010-01-01
Biochemical network modeling and simulation is an essential task in any systems biology project. The Systems Biology Markup Language (SBML) was established as a standardized model exchange language for mechanistic models. A specific strength of SBML is that numerous tools for formulating, processing, simulating, and analyzing models are freely available. Interestingly, in the field of multidisciplinary simulation, the problem of model exchange between different simulation tools arose much earlier. Several general modeling languages, like Modelica, were developed in the 1990s. Modelica enables an equation-based, modular specification of arbitrary hierarchical differential algebraic equation models. Moreover, libraries for special application domains can be rapidly developed. This contribution compares the reaction-based approach of SBML with the equation-based approach of Modelica and explains the specific strengths of both tools. Several biological examples illustrating essential SBML and Modelica concepts are given. The chosen criteria for tool comparison are flexibility of constraint specification, different modeling flavors, and hierarchical, modular, and multidisciplinary modeling. Additionally, support for spatially distributed systems, event handling, and network analysis features is discussed. As a major result, it is shown that the choice of modeling tool has a strong impact on the expressivity of the specified models but also depends strongly on the requirements of the application context.
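The reaction-based versus equation-based distinction can be made concrete with the simplest possible network, A -> B with mass-action kinetics. In SBML one would declare the reaction and its rate law; in Modelica one would write the induced differential equations. The Python sketch below (an illustration, not SBML or Modelica code) shows the two views collapsing to the same computation, integrated with a naive explicit Euler scheme.

```python
def simulate_a_to_b(a0=1.0, k=0.5, dt=0.001, t_end=10.0):
    """Reaction A -> B with mass-action rate v = k*[A]; the equivalent
    equation-based form is dA/dt = -v, dB/dt = +v (explicit Euler)."""
    a, b = a0, 0.0
    for _ in range(int(t_end / dt)):
        v = k * a      # reaction-based view: one rate law per reaction
        a -= v * dt    # equation-based view: the ODEs the reaction induces
        b += v * dt
    return a, b
```

Mass conservation (a + b stays at a0) holds in either formulation; the difference the article discusses is in how the model is specified and exchanged, not in the underlying mathematics.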
Brand, Matthias; Young, Kimberly S; Laier, Christian; Wölfling, Klaus; Potenza, Marc N
2016-12-01
Within the last two decades, many studies have addressed the clinical phenomenon of Internet-use disorders, with a particular focus on Internet-gaming disorder. Based on previous theoretical considerations and empirical findings, we suggest an Interaction of Person-Affect-Cognition-Execution (I-PACE) model of specific Internet-use disorders. The I-PACE model is a theoretical framework for the processes underlying the development and maintenance of an addictive use of certain Internet applications or sites promoting gaming, gambling, pornography viewing, shopping, or communication. The model is composed as a process model. Specific Internet-use disorders are considered to be the consequence of interactions between predisposing factors, such as neurobiological and psychological constitutions, moderators, such as coping styles and Internet-related cognitive biases, and mediators, such as affective and cognitive responses to situational triggers in combination with reduced executive functioning. Conditioning processes may strengthen these associations within an addiction process. Although the hypotheses regarding the mechanisms underlying the development and maintenance of specific Internet-use disorders, summarized in the I-PACE model, must be further tested empirically, implications for treatment interventions are suggested. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Projected 2050 Model Simulations for the Chesapeake Bay ...
The Chesapeake Bay Program has been tasked with assessing how changes in climate systems are expected to alter key variables and processes within the watershed, in concurrence with land use changes. EPA's Office of Research and Development will be conducting historical and future (2050) Weather Research and Forecasting (WRF) meteorological and Community Multiscale Air Quality (CMAQ) chemical transport model simulations to provide meteorological and nutrient deposition estimates for inclusion in the Chesapeake Bay Program's assessment of how climate and land use change may impact water quality and ecosystem health. This presentation will present the timeline and research updates. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
Jones, Marian Moser; Roy, Kevin
2017-10-01
Purpose: This article offers constructive commentary on the Life Course Health and Development Model (LCHD) as an organizing framework for MCH research. Description: The LCHD has recently been proposed as an organizing framework for MCH research. This model integrates biomedical, biopsychosocial, and life course frameworks to explain how "individual health trajectories" develop over time. In this article, we propose that the LCHD can improve its relevance to MCH policy and practice by: (1) placing individual health trajectories within the context of family health trajectories, which unfold within communities and societies, over historical and generational time; and (2) placing greater weight on the social determinants that shape the health development trajectories of individuals and families to produce greater or lesser health equity. Assessment: We argue that emphasizing these nested, historically specific social contexts in life course models will enrich study design and data analysis for future developmental science research, will make the LCHD model more relevant in shaping MCH policy and interventions, and will guard against its application as a deterministic framework. Specific ways to measure these contexts, and examples of how they can be integrated into the LCHD model, are articulated. Conclusion: Research applying the LCHD should incorporate the specific family and socio-historical contexts in which development occurs in order to serve as a useful basis for policy and interventions. Future longitudinal studies of maternal and child health should include collection of time-dependent data related to family environment and other social determinants of health, and should analyze the impact of historical events and trends on specific cohorts.
Tirado-Ramos, Alfredo; Hu, Jingkun; Lee, K.P.
2002-01-01
Supplement 23 to DICOM (Digital Imaging and Communications for Medicine), Structured Reporting, is a specification that supports a semantically rich representation of image and waveform content, enabling experts to share image and related patient information. DICOM SR supports the representation of textual and coded data linked to images and waveforms. Nevertheless, the medical information technology community needs models that work as bridges between the DICOM relational model and open object-oriented technologies. The authors assert that representations of the DICOM Structured Reporting standard, using object-oriented modeling languages such as the Unified Modeling Language, can provide a high-level reference view of the semantically rich framework of DICOM and its complex structures. They have produced an object-oriented model to represent the DICOM SR standard and have derived XML-exchangeable representations of this model using World Wide Web Consortium specifications. They expect the model to benefit developers and system architects who are interested in developing applications that are compliant with the DICOM SR specification. PMID:11751804
Uher, Jana
2015-12-01
As science seeks to make generalisations, a science of individual peculiarities encounters intricate challenges. This article explores these challenges by applying the Transdisciplinary Philosophy-of-Science Paradigm for Research on Individuals (TPS-Paradigm) and by exploring taxonomic "personality" research as an example. Analyses of researchers' interpretations of the taxonomic "personality" models, constructs and data that have been generated in the field reveal widespread erroneous assumptions about the abilities of previous methodologies to appropriately represent individual-specificity in the targeted phenomena. These assumptions, rooted in everyday thinking, fail to consider that individual-specificity and others' minds cannot be directly perceived, that abstract descriptions cannot serve as causal explanations, that between-individual structures cannot be isomorphic to within-individual structures, and that knowledge of compositional structures cannot explain the process structures of their functioning and development. These erroneous assumptions and serious methodological deficiencies in widely used standardised questionnaires have effectively prevented psychologists from establishing taxonomies that can comprehensively model individual-specificity in most of the kinds of phenomena explored as "personality", especially in experiencing and behaviour and in individuals' functioning and development. Contrary to previous assumptions, it is not universal models but rather different kinds of taxonomic models that are required for each of the different kinds of phenomena, variations and structures that are commonly conceived of as "personality". Consequently, to comprehensively explore individual-specificity, researchers have to apply a portfolio of complementary methodologies and develop different kinds of taxonomies, most of which have yet to be developed. Closing, the article derives some meta-desiderata for future research on individuals' "personality".
D'Elia, Jesse; Haig, Susan M.; Johnson, Matthew J.; Marcot, Bruce G.; Young, Richard
2015-01-01
Ecological niche models can be a useful tool to identify candidate reintroduction sites for endangered species but have been infrequently used for this purpose. In this paper, we (1) develop activity-specific ecological niche models (nesting, roosting, and feeding) for the critically endangered California condor (Gymnogyps californianus) to aid in reintroduction planning in California, Oregon, and Washington, USA, (2) test the accuracy of these models using empirical data withheld from model development, and (3) integrate model results with information on condor movement ecology and biology to produce predictive maps of reintroduction site suitability. Our approach, which disentangles niche models into activity-specific components, has applications for other species where it is routinely assumed (often incorrectly) that individuals fulfill all requirements for life within a single environmental space. Ecological niche models conformed to our understanding of California condor ecology, had good predictive performance when tested with data withheld from model development, and aided in the identification of several candidate reintroduction areas outside of the current distribution of the species. Our results suggest there are large unoccupied regions of the California condor’s historical range that have retained ecological features similar to currently occupied habitats, and thus could be considered for future reintroduction efforts. Combining our activity-specific ENMs with ground reconnaissance and information on other threat factors that could not be directly incorporated into empirical ENMs will ultimately improve our ability to select successful reintroduction sites for the California condor.
Dendrite and Axon Specific Geometrical Transformation in Neurite Development
Mironov, Vasily I.; Semyanov, Alexey V.; Kazantsev, Victor B.
2016-01-01
We propose a model of neurite growth to explain the differences in dendrite and axon specific neurite development. The model implements basic molecular kinetics, e.g., building protein synthesis and transport to the growth cone, and includes explicit dependence of the building kinetics on the geometry of the neurite. The basic assumption was that the radius of the neurite decreases with length. We found that the neurite dynamics crucially depended on the relationship between the rate of active transport and the rate of morphological changes. If these rates were in the balance, then the neurite displayed axon specific development with a constant elongation speed. For dendrite specific growth, the maximal length was rapidly saturated by degradation of building protein structures or limited by proximal part expansion reaching the characteristic cell size. PMID:26858635
Tomás, Inmaculada; Regueira-Iglesias, Alba; López, Maria; Arias-Bujanda, Nora; Novoa, Lourdes; Balsa-Castro, Carlos; Tomás, Maria
2017-01-01
Currently, there is little evidence available on the development of predictive models for the diagnosis or prognosis of chronic periodontitis based on the qPCR quantification of subgingival pathobionts. Our objectives were to: (1) analyze and internally validate pathobiont-based models that could be used to distinguish different periodontal conditions at the site-specific level within the same patient with chronic periodontitis; (2) develop nomograms derived from the predictive models. Subgingival plaque samples were obtained from control and periodontal sites (probing pocket depth and clinical attachment loss <4 mm and >4 mm, respectively) from 40 patients with moderate-severe generalized chronic periodontitis. The samples were analyzed by qPCR using TaqMan probes and specific primers to determine the concentrations of Actinobacillus actinomycetemcomitans (Aa), Fusobacterium nucleatum (Fn), Parvimonas micra (Pm), Porphyromonas gingivalis (Pg), Prevotella intermedia (Pi), Tannerella forsythia (Tf), and Treponema denticola (Td). The pathobiont-based models were obtained using multivariate binary logistic regression. The best models were selected according to specified criteria. Discrimination was assessed using receiver operating characteristic curves, and numerous classification measures were thus obtained. The nomograms were built from the best predictive models. Eight bacterial cluster-based models showed an area under the curve (AUC) ≥0.760 and a sensitivity and specificity ≥75.0%. The PiTfFn cluster showed an AUC of 0.773 (sensitivity and specificity = 75.0%). When Pm and AaPm were incorporated in the TdPiTfFn cluster, we detected the two best predictive models, with AUCs of 0.788 and 0.789, respectively (sensitivity and specificity = 77.5%). The TdPiTfAa cluster had an AUC of 0.785 (sensitivity and specificity = 75.0%).
When Pm was incorporated in this cluster, a new predictive model appeared with better AUC and specificity values (0.787 and 80.0%, respectively). Distinct clusters formed by species with different etiopathogenic roles (belonging to different Socransky complexes) had good predictive accuracy for distinguishing a site with periodontal destruction in a periodontal patient. The predictive clusters with the lowest number of bacteria were PiTfFn and TdPiTfAa, while TdPiTfAaFnPm had the highest number. In all the developed nomograms, high concentrations of these clusters were associated with an increased probability of having a periodontal site in a patient with chronic periodontitis. PMID:28848499
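The AUC values reported above have a simple rank-based interpretation (the Mann-Whitney formulation): the probability that a randomly chosen diseased site receives a higher model score than a randomly chosen control site. A minimal sketch, using hypothetical labels and scores rather than the study's data:

```python
def roc_auc(labels, scores):
    """AUC as P(score of a random positive > score of a random negative),
    counting ties as 0.5 (Mann-Whitney U formulation)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 0.5 corresponds to a model no better than chance, which is why thresholds like the ≥0.760 criterion above are used to screen candidate clusters.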
Retrofitting Non-Cognitive-Diagnostic Reading Assessment under the Generalized DINA Model Framework
ERIC Educational Resources Information Center
Chen, Huilin; Chen, Jinsong
2016-01-01
Cognitive diagnosis models (CDMs) are psychometric models developed mainly to assess examinees' specific strengths and weaknesses in a set of skills or attributes within a domain. By adopting the Generalized-DINA model framework, the recently developed general modeling framework, we attempted to retrofit the PISA reading assessments, a…
Performance specifications and six sigma theory: Clinical chemistry and industry compared.
Oosterhuis, W P; Severens, M J M J
2018-04-11
Analytical performance specifications are crucial in test development and quality control. Although consensus has been reached on the use of biological variation to derive these specifications, no consensus has been reached on which model should be preferred. The Six Sigma concept is widely applied in industry for product quality specifications and invites comparison with Six Sigma models in clinical chemistry. However, the models for measurement specifications differ considerably between the two fields: where clinical chemistry uses the sigma metric, industry instead uses the Number of Distinct Categories. In this study the models in both fields are compared and discussed. Copyright © 2018. Published by Elsevier Inc.
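The two quantities compared in the abstract have simple standard definitions, sketched below. The formulas are the conventional ones (sigma metric from allowable total error, bias, and imprecision; Number of Distinct Categories from the AIAG gauge R&R convention); the numerical inputs are purely illustrative, not values from the paper.

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Clinical-chemistry sigma metric: allowable total error minus bias,
    expressed in units of analytical imprecision (all inputs in %)."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def number_of_distinct_categories(sd_part, sd_gauge):
    """Industrial gauge R&R counterpart: how many distinct groups of parts
    the measurement system can reliably distinguish (AIAG convention)."""
    return 1.41 * sd_part / sd_gauge

# Illustrative (hypothetical) assay: TEa 10%, bias 2%, CV 2%
sigma = sigma_metric(10.0, 2.0, 2.0)
print(sigma)  # 4.0 -> a "four sigma" assay

# Illustrative gauge study: part-to-part SD 3x the measurement SD
ndc = number_of_distinct_categories(3.0, 1.0)
print(round(ndc, 2))  # 4.23
```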
Development of a Prototype Decision Support System to Manage the Air Force Alternative Care Program
1990-09-01
development model was selected to structure the development process. Since it is necessary to ensure...uncertainty. Furthermore, the SDLC model provides a specific framework "by which an application is conceived, developed, and implemented" (Davis and Olson...associated with the automation of the manual ACP procedures. The SDLC Model has three stages: (1) definition, (2) development, and (3) installation
Framework for risk analysis in Multimedia Environmental Systems (FRAMES)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whelan, G.; Buck, J.W.; Castleton, K.J.
The objectives of this workshop are to (1) provide the NRC staff and the public with an overview of currently available federally sponsored dose models appropriate for decommissioning assessments and (2) discuss NRC staff-developed questions related to model selection criteria under the final rule on "Radiological Criteria for License Termination" (62 FR 39058). For over 40 years, medium-specific models have been and will continue to be developed in an effort to understand and predict environmental phenomena, including fluid-flow patterns, contaminant migration and fate, human or wildlife exposures, impacts from specific toxicants to specific species and their organs, cost-benefit analyses, impacts from remediation alternatives, etc. For nearly 40 years, medium-specific models have been combined for either sequential or concurrent assessments. The evolution of multiple-media assessment tools has followed a logical progression. To allow a suite of users the flexibility and versatility to construct, combine, and couple attributes that meet their specific needs without unnecessarily burdening the user with extraneous capabilities, the development of a computer-based methodology to implement the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) was begun in 1994. FRAMES represents a platform which links elements together and yet does not represent the models that are linked to or within it; therefore, changes to elements that are linked to or within FRAMES do not change the framework.
Jesse D’Elia; Susan M. Haig; Matthew Johnson; Richard Young; Bruce G. Marcot
2015-01-01
Ecological niche models can be a useful tool to identify candidate reintroduction sites for endangered species but have been infrequently used for this purpose. In this paper, we (1) develop activity-specific ecological niche models (nesting, roosting, and feeding) for the critically endangered California condor (Gymnogyps californianus) to aid in...
NASA Astrophysics Data System (ADS)
Loges, André; Herberger, Sabrina; Seegert, Philipp; Wetzel, Thomas
2016-12-01
Thermal models of Li-ion cells on various geometrical scales and with various levels of complexity have been developed in the past to account for the temperature-dependent behaviour of Li-ion cells. These models require accurate data on thermal material properties to offer reliable validation and interpretation of the results. In this context, a thorough study of the specific heat capacities of Li-ion cells was conducted, starting from raw materials and electrode coatings up to representative unit cells of jelly rolls/electrode stacks with lumped values. The specific heat capacity is reported as a function of temperature and state of charge (SOC). Seven Li-ion cells from different manufacturers, with different cell chemistry, application and design, were considered, and generally applicable correlations were developed. A 2D thermal model of an automotive Li-ion cell for plug-in hybrid electric vehicle (PHEV) application illustrates the influence of specific heat capacity on the effectiveness of cooling concepts and the temperature development of Li-ion cells.
Rapid Prototyping of Hydrologic Model Interfaces with IPython
NASA Astrophysics Data System (ADS)
Farthing, M. W.; Winters, K. D.; Ahmadia, A. J.; Hesser, T.; Howington, S. E.; Johnson, B. D.; Tate, J.; Kees, C. E.
2014-12-01
A significant gulf still exists between the state of practice and state of the art in hydrologic modeling. Part of this gulf is due to the lack of adequate pre- and post-processing tools for newly developed computational models. The development of user interfaces has traditionally lagged several years behind the development of a particular computational model or suite of models. As a result, models with mature interfaces often lack key advancements in model formulation, solution methods, and/or software design and technology. Part of the problem has been a focus on developing monolithic tools to provide comprehensive interfaces for the entire suite of model capabilities. Such efforts require expertise in software libraries and frameworks for creating user interfaces (e.g., Tcl/Tk, Qt, and MFC). These tools are complex and require significant investment of project resources (time and/or money) to use. Moreover, providing the required features for the entire range of possible applications and analyses creates a cumbersome interface. For a particular site or application, the modeling requirements may be simplified or at least narrowed, which can greatly reduce the number and complexity of options that need to be accessible to the user. However, monolithic tools usually are not adept at dynamically exposing specific workflows. Our approach is to deliver highly tailored interfaces to users. These interfaces may be site- and/or process-specific. As a result, we end up with many customized interfaces rather than a single, general-use tool. For this approach to be successful, it must be efficient to create these tailored interfaces. We need technology for creating quality user interfaces that is accessible and has a low barrier to integration into model development efforts. Here, we present efforts to leverage IPython notebooks as tools for rapid prototyping of site- and application-specific user interfaces. 
We provide specific examples from applications in near-shore environments as well as levee analysis. We discuss our design decisions and methodology for developing customized interfaces, strategies for delivery of the interfaces to users in various computing environments, as well as implications for the design/implementation of simulation models.
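To make the rapid-prototyping idea concrete: a notebook interface typically wraps a plain model function with ipywidgets controls. The sketch below uses a hypothetical one-line hydraulic model (Manning's normal-flow velocity) as the stand-in computational model; the ipywidgets call is shown in a comment so the function itself stays runnable outside a notebook. Neither the function nor the parameter ranges come from the abstract's applications.

```python
def flow_velocity(n_manning=0.03, slope=0.001, depth=1.0):
    """Stand-in hydrologic model: Manning's velocity v = (1/n) R^(2/3) S^(1/2),
    approximating the hydraulic radius R by the depth of a wide channel."""
    return (1.0 / n_manning) * depth ** (2.0 / 3.0) * slope ** 0.5

# In an IPython notebook, one line turns this into a tailored interface:
#   from ipywidgets import interact
#   interact(flow_velocity, n_manning=(0.01, 0.1, 0.005),
#            slope=(0.0001, 0.01, 0.0001), depth=(0.1, 5.0, 0.1))

print(f"{flow_velocity():.3f} m/s")  # 1.054 m/s
```

The appeal is exactly the one argued above: the site-specific interface is a few lines of glue around the model, not a monolithic GUI project.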
Automatic Review of Abstract State Machines by Meta Property Verification
NASA Technical Reports Server (NTRS)
Arcaini, Paolo; Gargantini, Angelo; Riccobene, Elvinia
2010-01-01
A model review is a validation technique aimed at determining whether a model is of sufficient quality; it allows defects to be identified early in system development, reducing the cost of fixing them. In this paper we propose a technique to perform automatic review of Abstract State Machine (ASM) formal specifications. We first identify a family of typical vulnerabilities and defects that a developer can introduce during modeling with ASMs, and we express such faults as violations of meta-properties that guarantee certain quality attributes of the specification. These meta-properties are then mapped to temporal logic formulas and model checked for their violation. As a proof of concept, we also report the results of applying this ASM review process to several specifications.
ERIC Educational Resources Information Center
Zhumasheva, Anara; Zhumabaeva, Zaida; Sakenov, Janat; Vedilina, Yelena; Zhaxylykova, Nuriya; Sekenova, Balkumis
2016-01-01
The current study focuses on the research topic of creating a theoretical model of development of information competence among students enrolled in elective courses. In order to examine specific features of the theoretical model of development of information competence among students enrolled in elective courses, we performed an analysis of…
Proposed best practice for projects that involve modelling and simulation.
O'Kelly, Michael; Anisimov, Vladimir; Campbell, Chris; Hamilton, Sinéad
2017-03-01
Modelling and simulation has been used in many ways when developing new treatments. To be useful and credible, it is generally agreed that modelling and simulation should be undertaken according to some kind of best practice. A number of authors have suggested elements required for best practice in modelling and simulation, including the pre-specification of goals, assumptions, methods, and outputs. However, a project that involves modelling and simulation could be simple or complex and of relatively low or high importance. It has been argued that the level of detail and the strictness of pre-specification should be allowed to vary, depending on the complexity and importance of the project. This best practice document does not prescribe how to develop a statistical model. Rather, it describes the elements required for the specification of a project and requires that the practitioner justify in the specification the omission of any of the elements and, in addition, justify the level of detail provided about each element. This document is an initiative of the Special Interest Group for modelling and simulation, a body open to members of Statisticians in the Pharmaceutical Industry and the European Federation of Statisticians in the Pharmaceutical Industry. Examples of a very detailed specification and a less detailed specification are included as appendices. Copyright © 2016 John Wiley & Sons, Ltd.
A Conceptual Model To Assist Educational Leaders Manage Change.
ERIC Educational Resources Information Center
Cochren, John R.
This paper presents a conceptual model to help school leaders manage change effectively. The model was developed from a literature review of theory development and model construction. Specifically, the paper identifies the major components that inhibit organizational change, and synthesizes the most salient features of these components through a…
Advanced local area network concepts
NASA Technical Reports Server (NTRS)
Grant, Terry
1985-01-01
Development of a good model of the data traffic requirements for Local Area Networks (LANs) onboard the Space Station is the driving problem in this work. A parameterized workload model is under development. An analysis contract has been started specifically to capture the distributed processing requirements for the Space Station and then to develop a top level model to simulate how various processing scenarios can handle the workload and what data communication patterns result. A summary of the Local Area Network Extendsible Simulator 2 Requirements Specification and excerpts from a grant report on the topological design of fiber optic local area networks with application to Expressnet are given.
NASA Astrophysics Data System (ADS)
Tang, L.; Titov, V. V.; Chamberlin, C. D.
2009-12-01
The study describes the development, testing and application of site-specific tsunami inundation models (forecast models) for use in NOAA's tsunami forecast and warning system. The model development process includes sensitivity studies of tsunami wave characteristics in the nearshore and inundation, for a range of model grid setups, resolutions and parameters. To demonstrate the process, four forecast models in Hawaii, at Hilo, Kahului, Honolulu, and Nawiliwili, are described. The models were validated with fourteen historical tsunamis and compared with numerical results from reference inundation models of higher resolution. The accuracy of the modeled maximum wave height is greater than 80% when the observation is greater than 0.5 m; when the observation is below 0.5 m, the error is less than 0.3 m. The error of the modeled arrival time of the first peak is within 3% of the travel time. The developed forecast models were further applied to hazard assessment from simulated magnitude 7.5, 8.2, 8.7 and 9.3 tsunamis based on subduction zone earthquakes in the Pacific. The hazard assessment study indicates that use of a seismic magnitude alone for tsunami source assessment is inadequate to achieve such accuracy for tsunami amplitude forecasts. The forecast models apply local bathymetric and topographic information, and utilize dynamic boundary conditions from the tsunami source function database, to provide site- and event-specific coastal predictions. Only by combining a Deep-ocean Assessment and Reporting of Tsunamis (DART)-constrained tsunami magnitude with site-specific high-resolution models can the forecasts completely cover the evolution of earthquake-generated tsunami waves: generation, deep-ocean propagation, and coastal inundation. Wavelet analysis of the tsunami waves suggests that the coastal tsunami frequency responses at different sites are dominated by the local bathymetry, yet they can be partially related to the locations of the tsunami sources. 
The study also demonstrates the nonlinearity between offshore and nearshore maximum wave amplitudes.
Development of machine learning models for diagnosis of glaucoma.
Kim, Seong Jae; Cho, Kyong Jin; Oh, Sejong
2017-01-01
The study aimed to develop machine learning models with strong predictive power and interpretability for the diagnosis of glaucoma based on retinal nerve fiber layer (RNFL) thickness and visual field (VF). We collected various candidate features from examinations of RNFL thickness and VF, developed synthesized features from the original features, and then selected the features best suited for classification (diagnosis) through feature evaluation. We used 100 cases of data as a test dataset and 399 cases as a training and validation dataset. To develop the glaucoma prediction model, we considered four machine learning algorithms: C5.0, random forest (RF), support vector machine (SVM), and k-nearest neighbor (KNN). We repeatedly composed a learning model using the training dataset and evaluated it using the validation dataset, finally selecting the learning model that produced the highest validation accuracy. We analyzed the quality of the models using several measures. The random forest model shows the best performance, and the C5.0, SVM, and KNN models show similar accuracy. In the random forest model, the classification accuracy is 0.98, sensitivity is 0.983, specificity is 0.975, and AUC is 0.979. The developed prediction models show high accuracy, sensitivity, specificity, and AUC in classifying between glaucomatous and healthy eyes, and can be used to predict glaucoma from unseen examination records. Clinicians may reference the prediction results to make better decisions. Multiple learning models may be combined to increase prediction accuracy. The C5.0 model includes decision rules for prediction, which can be used to explain the reasons for specific predictions.
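The evaluation loop described above, train a random forest on 399 cases, hold out 100 cases, and report accuracy, sensitivity, specificity, and AUC, can be sketched as follows. The features here are synthetic stand-ins for RNFL-thickness and VF indices; none of the study's data or tuning choices are reproduced.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(1)
n = 499  # 399 training/validation + 100 test cases, as in the study design
y = rng.integers(0, 2, n)  # 0 = healthy, 1 = glaucoma (synthetic labels)
# Hypothetical features standing in for RNFL-thickness and VF indices
X = rng.normal(0.0, 1.0, (n, 8)) + 1.2 * y[:, None]
X_tr, y_tr, X_te, y_te = X[:399], y[:399], X[399:], y[399:]

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
accuracy = (tp + tn) / len(y_te)
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"acc={accuracy:.3f} se={sensitivity:.3f} sp={specificity:.3f} auc={auc:.3f}")
```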
Setting development goals using stochastic dynamical system models
Ranganathan, Shyam; Nicolis, Stamatios C.; Bali Swain, Ranjula; Sumpter, David J. T.
2017-01-01
The Millennium Development Goals (MDG) programme was an ambitious attempt to encourage a globalised solution to important but often-overlooked development problems. The programme led to wide-ranging development but it has also been criticised for unrealistic and arbitrary targets. In this paper, we show how country-specific development targets can be set using stochastic, dynamical system models built from historical data. In particular, we show that the MDG target of two-thirds reduction of child mortality from 1990 levels was infeasible for most countries, especially in sub-Saharan Africa. At the same time, the MDG targets were not ambitious enough for fast-developing countries such as Brazil and China. We suggest that model-based setting of country-specific targets is essential for the success of global development programmes such as the Sustainable Development Goals (SDG). This approach should provide clear, quantifiable targets for policymakers. PMID:28241057
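The model-based target setting described above can be sketched with a toy stochastic dynamical system: fit (or assume) a historical decline rate, simulate forward with noise, and read a feasibility benchmark off the simulated distribution. The decline rate, noise level, and country figures below are hypothetical, not the paper's fitted values; the point is only that a historical-trend model yields a feasible target to compare with a fixed programmatic target.

```python
import numpy as np

rng = np.random.default_rng(2)

def feasible_reduction(x0, r, sigma, years, sims=10_000, q=0.5):
    """Euler-Maruyama simulation of dX = -r X dt + sigma X dW (annual steps),
    returning the q-quantile of the projected level after `years`."""
    x = np.full(sims, float(x0))
    for _ in range(years):
        x *= 1.0 - r + sigma * rng.standard_normal(sims)
        x = np.maximum(x, 0.0)  # mortality rates cannot go negative
    return np.quantile(x, q)

# Hypothetical country: 100 deaths per 1000 in 1990, 2%/yr historical decline
target = feasible_reduction(100.0, r=0.02, sigma=0.01, years=25)
mdg_target = 100.0 / 3  # MDG-style two-thirds reduction
print(f"model-based median: {target:.1f}, MDG target: {mdg_target:.1f}")
```

With a 2%/yr trend the median projection stays near 60, far above the two-thirds-reduction target of about 33, illustrating the abstract's point that a uniform target can be infeasible for slowly improving countries.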
Development of a Multidisciplinary Middle School Mathematics Infusion Model
ERIC Educational Resources Information Center
Russo, Maria; Hecht, Deborah; Burghardt, M. David; Hacker, Michael; Saxman, Laura
2011-01-01
The National Science Foundation (NSF) funded project "Mathematics, Science, and Technology Partnership" (MSTP) developed a multidisciplinary instructional model for connecting mathematics to science, technology and engineering content areas at the middle school level. Specifically, the model infused mathematics into middle school curriculum…
DEVELOPMENT OF CAPE-OPEN COMPLIANT PROCESS MODELING COMPONENTS IN MICROSOFT .NET
The CAPE-OPEN middleware standards were created to allow process modeling components (PMCs) developed by third parties to be used in any process modeling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compone...
Refining metabolic models and accounting for regulatory effects.
Kim, Joonhoon; Reed, Jennifer L
2014-10-01
Advances in genome-scale metabolic modeling allow us to investigate and engineer metabolism at a systems level. Metabolic network reconstructions have been made for many organisms, and computational approaches have been developed to convert these reconstructions into predictive models. However, due to incomplete knowledge, these reconstructions often have missing or extraneous components and interactions, which can be identified by reconciling model predictions with experimental data. Recent studies have provided methods to further improve metabolic model predictions by incorporating transcriptional regulatory interactions and high-throughput omics data to yield context-specific metabolic models. Here we discuss recent approaches for resolving model-data discrepancies and building context-specific metabolic models. Once developed, highly accurate metabolic models can be used in a variety of biotechnology applications. Copyright © 2014 Elsevier Ltd. All rights reserved.
Enhanced semantic interoperability by profiling health informatics standards.
López, Diego M; Blobel, Bernd
2009-01-01
Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method, and suggest the necessary tooling, for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism for specializing reference meta-models in such a way that those meta-models can be adapted to specific platforms or domains; a health information model can be considered such a meta-model. The first step of the introduced method identifies the standard health information models, and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIM specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages, such as tooling support, graphical notation, exchangeability, extensibility, and semi-automatic code generation. The approach presented is also applicable for harmonizing different standard specifications.
Age- and sex-specific thorax finite element model development and simulation.
Schoell, Samantha L; Weaver, Ashley A; Vavalle, Nicholas A; Stitzel, Joel D
2015-01-01
The shape, size, bone density, and cortical thickness of the thoracic skeleton vary significantly with age and sex, which can affect the injury tolerance, especially in at-risk populations such as the elderly. Computational modeling has emerged as a powerful and versatile tool to assess injury risk. However, current computational models only represent certain ages and sexes in the population. The purpose of this study was to morph an existing finite element (FE) model of the thorax to depict thorax morphology for males and females of ages 30 and 70 years old (YO) and to investigate the effect on injury risk. Age- and sex-specific FE models were developed using thin-plate spline interpolation. In order to execute the thin-plate spline interpolation, homologous landmarks on the reference, target, and FE model are required. An image segmentation and registration algorithm was used to collect homologous rib and sternum landmark data from males and females aged 0-100 years. The Generalized Procrustes Analysis was applied to the homologous landmark data to quantify age- and sex-specific isolated shape changes in the thorax. The Global Human Body Models Consortium (GHBMC) 50th percentile male occupant model was morphed to create age- and sex-specific thoracic shape change models (scaled to a 50th percentile male size). To evaluate the thoracic response, 2 loading cases (frontal hub impact and lateral impact) were simulated to assess the importance of geometric and material property changes with age and sex. Due to the geometric and material property changes with age and sex, there were observed differences in the response of the thorax in both the frontal and lateral impacts. Material property changes alone had little to no effect on the maximum thoracic force or the maximum percent compression. With age, the thorax becomes stiffer due to superior rotation of the ribs, which can result in increased bone strain that can increase the risk of fracture. 
For the 70-YO models, the simulations predicted a higher number of rib fractures in comparison to the 30-YO models. The male models experienced more superior rotation of the ribs in comparison to the female models, which resulted in a higher number of rib fractures for the males. In this study, age- and sex-specific thoracic models were developed and the biomechanical response was studied using frontal and lateral impact simulations. The development of these age- and sex-specific FE models of the thorax will lead to an improved understanding of the complex relationship between thoracic geometry, age, sex, and injury risk.
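The thin-plate spline interpolation step used for the morphing can be sketched with SciPy's radial basis function interpolator: fit a warp from reference landmarks to target landmarks, then apply it to all FE-mesh node coordinates. The 2-D landmarks and the uniform 10% widening below are invented stand-ins for the study's 3-D homologous rib and sternum landmarks.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(3)
# Hypothetical homologous landmarks: reference geometry and a target that is
# uniformly 10% wider in x (a toy stand-in for age/sex shape change)
ref_landmarks = rng.uniform(-1.0, 1.0, (30, 2))
target_landmarks = ref_landmarks * np.array([1.10, 1.00])

# Thin-plate spline warp: reference space -> target space
warp = RBFInterpolator(ref_landmarks, target_landmarks,
                       kernel="thin_plate_spline")

# Morph arbitrary FE-mesh node coordinates with the fitted warp
mesh_nodes = rng.uniform(-0.9, 0.9, (200, 2))
morphed = warp(mesh_nodes)
print(np.abs(morphed - mesh_nodes * np.array([1.10, 1.00])).max())
```

Because the thin-plate spline includes an affine term, a purely affine target deformation is recovered essentially exactly; real landmark sets additionally bend the space where the homologous points demand it.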
Archetype Model-Driven Development Framework for EHR Web System
Kimura, Eizen; Ishihara, Ken
2013-01-01
Objectives This article describes the Web application framework for Electronic Health Records (EHRs) we have developed to reduce the construction costs of EHR systems. Methods The openEHR project has developed a clinical model driven architecture for future-proof interoperable EHR systems. This project provides the specifications to standardize clinical domain model implementations, upon which the ISO/CEN 13606 standards are based. The reference implementation has been formally described in Eiffel, and C# and Java implementations have been developed as references. Although scripting languages have become more popular in recent years because of their higher efficiency and faster development, they had not been involved in the openEHR implementations. Since 2007, we have used the Ruby language and Ruby on Rails (RoR) as an agile development platform to implement EHR systems in conformity with the openEHR specifications. Results We implemented almost all of the specifications, the Archetype Definition Language parser, and a RoR scaffold generator from archetypes. Although some problems emerged, most of them have been resolved. Conclusions We have provided an agile EHR Web framework that can build up Web systems from archetype models using RoR. The feasibility of the archetype model to provide semantic interoperability of EHRs has been demonstrated, and we have verified that it is suitable for the construction of EHR systems. PMID:24523991
Sato, Tatsuhiko; Furusawa, Yoshiya
2012-10-01
Estimation of the survival fractions of cells irradiated with various particles over a wide linear energy transfer (LET) range is of great importance in the treatment planning of charged-particle therapy. Two computational models were developed for estimating survival fractions based on the concept of the microdosimetric kinetic model. They were designated the double-stochastic microdosimetric kinetic model and the stochastic microdosimetric kinetic model. The former takes into account the stochastic natures of both domain and cell-nucleus specific energies, whereas the latter represents the stochastic nature of domain specific energy by its approximated mean value and variance to reduce the computational time. The probability densities of the domain and cell-nucleus specific energies are the fundamental quantities for expressing survival fractions in these models. These densities are calculated using the microdosimetric and LET-estimator functions implemented in the Particle and Heavy Ion Transport code System (PHITS) in combination with the convolution or database method. Both models can reproduce the measured survival fractions for high-LET and high-dose irradiations, whereas a previously proposed microdosimetric kinetic model predicts lower values for these fractions, mainly because it ignores the stochastic nature of cell-nucleus specific energies in the calculation. The models we developed should contribute to a better understanding of the mechanism of cell inactivation, as well as improve the accuracy of treatment planning for charged-particle therapy.
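For orientation, the baseline microdosimetric kinetic model that both extensions generalize expresses the surviving fraction through a linear-quadratic form. This is the standard textbook formulation in the usual MKM notation, not necessarily the paper's exact parameterization:

```latex
-\ln S \;=\; \left(\alpha_0 + \beta\,\bar{z}_{1D}\right) D \;+\; \beta D^{2}
```

where $S$ is the surviving fraction, $D$ the absorbed dose, $\bar{z}_{1D}$ the dose-mean specific energy deposited per event in a domain, and $\alpha_0$, $\beta$ cell-line-specific parameters. The two models developed here replace fixed specific energies with their full probability densities (or with mean-and-variance approximations) computed by PHITS.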
A site specific model and analysis of the neutral somatic mutation rate in whole-genome cancer data.
Bertl, Johanna; Guo, Qianyun; Juul, Malene; Besenbacher, Søren; Nielsen, Morten Muhlig; Hornshøj, Henrik; Pedersen, Jakob Skou; Hobolth, Asger
2018-04-19
Detailed modelling of the neutral mutational process in cancer cells is crucial for identifying driver mutations and understanding the mutational mechanisms that act during cancer development. The neutral mutational process is very complex: whole-genome analyses have revealed that the mutation rate differs between cancer types, between patients and along the genome depending on the genetic and epigenetic context. Therefore, methods that predict the number of different types of mutations in regions or specific genomic elements must consider local genomic explanatory variables. A major drawback of most methods is the need to average the explanatory variables across the entire region or genomic element. This procedure is particularly problematic if the explanatory variable varies dramatically in the element under consideration. To take into account the fine scale of the explanatory variables, we model the probabilities of different types of mutations for each position in the genome by multinomial logistic regression. We analyse 505 cancer genomes from 14 different cancer types and compare the performance in predicting mutation rate for both regional based models and site-specific models. We show that for 1000 randomly selected genomic positions, the site-specific model predicts the mutation rate much better than regional based models. We use a forward selection procedure to identify the most important explanatory variables. The procedure identifies site-specific conservation (phyloP), replication timing, and expression level as the best predictors for the mutation rate. Finally, our model confirms and quantifies certain well-known mutational signatures. We find that our site-specific multinomial regression model outperforms the regional based models. The possibility of including genomic variables on different scales and patient specific variables makes it a versatile framework for studying different mutational mechanisms. 
Our model can serve as the neutral null model for the mutational process; regions that deviate from the null model are candidates for elements that drive cancer development.
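The site-specific multinomial regression at the heart of the method can be sketched as follows: each genomic position gets its own covariate values, and a multinomial logistic model predicts the probability of each mutation class at that position. The covariates are named after the predictors the study selected (phyloP conservation, replication timing, expression), but the data and coefficients below are entirely synthetic.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n = 5000
# Hypothetical per-site explanatory variables: phyloP conservation,
# replication timing, expression level (standardized)
X = rng.normal(0.0, 1.0, (n, 3))

# Synthetic ground truth: class 0 = no mutation, classes 1-3 = mutation types,
# with probabilities tilted by the covariates via a softmax
logits = np.stack([np.zeros(n),
                   X @ [0.5, 0.2, 0.1] - 3.0,
                   X @ [-0.3, 0.4, 0.2] - 3.5,
                   X @ [0.1, -0.2, 0.6] - 4.0], axis=1)
p = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
y = np.array([rng.choice(4, p=pi) for pi in p])

# Fit the site-specific multinomial model and predict per-site probabilities
model = LogisticRegression(max_iter=1000).fit(X, y)
site_probs = model.predict_proba(X[:5])
print(site_probs.shape)  # (5, 4): one probability per mutation class per site
```

Driver detection then amounts to flagging elements whose observed mutation counts exceed what these per-site null probabilities predict.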
Stephen N. Matthews; Louis R. Iverson; Anantha M. Prasad; Matthew P. Peters; Paul G. Rodewald
2011-01-01
Species distribution models (SDMs) to evaluate trees' potential responses to climate change are essential for developing appropriate forest management strategies. However, there is a great need to better understand these models' limitations and evaluate their uncertainties. We have previously developed statistical models of suitable habitat, based on both...
Park, Jungkap; Saitou, Kazuhiro
2014-09-18
Multibody potentials accounting for cooperative effects of molecular interactions have shown better accuracy than typical pairwise potentials. The main challenge in the development of such potentials is to find relevant structural features that characterize the tightly folded proteins. Also, the side-chains of residues adopt several specific, staggered conformations, known as rotamers within protein structures. Different molecular conformations result in different dipole moments and induce charge reorientations. However, until now modeling of the rotameric state of residues had not been incorporated into the development of multibody potentials for modeling non-bonded interactions in protein structures. In this study, we develop a new multibody statistical potential which can account for the influence of rotameric states on the specificity of atomic interactions. In this potential, named "rotamer-dependent atomic statistical potential" (ROTAS), the interaction between two atoms is specified by not only the distance and relative orientation but also by two state parameters concerning the rotameric state of the residues to which the interacting atoms belong. It was clearly found that the rotameric state is correlated to the specificity of atomic interactions. Such rotamer-dependencies are not limited to specific type or certain range of interactions. The performance of ROTAS was tested using 13 sets of decoys and was compared to those of existing atomic-level statistical potentials which incorporate orientation-dependent energy terms. The results show that ROTAS performs better than other competing potentials not only in native structure recognition, but also in best model selection and correlation coefficients between energy and model quality. A new multibody statistical potential, ROTAS accounting for the influence of rotameric states on the specificity of atomic interactions was developed and tested on decoy sets. 
The results show that ROTAS has an improved ability to recognize native structures among decoy models compared to other potentials. The effectiveness of ROTAS may provide insight for the development of applications that require accurate side-chain modeling, such as protein design, mutation analysis, and docking simulation.
Sonic Boom Modeling Technical Challenge
NASA Technical Reports Server (NTRS)
Sullivan, Brenda M.
2007-01-01
This viewgraph presentation reviews the technical challenges in modeling sonic booms. The goal of this program is to develop the knowledge, capabilities, and technologies needed to enable overland supersonic flight. The specific objectives of the modeling are to: (1) develop and validate a sonic boom propagation model through realistic atmospheres, including the effects of turbulence; (2) develop methods enabling prediction of the response of, and acoustic transmission into, structures impacted by sonic booms; and (3) develop and validate a psychoacoustic model of human response to sonic booms under both indoor and outdoor listening conditions, using simulators.
Guidelines for applying the Composite Specification Model (CSM)
NASA Technical Reports Server (NTRS)
Agresti, William
1987-01-01
The Composite Specification Model (CSM) is an approach to representing software requirements. Guidelines are provided for applying CSM and developing each of the three descriptive views of the software: the contextual view, using entities and relationships; the dynamic view, using states and transitions; and the function view, using data flows and processes. Using CSM results in a software specification document, which is outlined.
The Multilingual Lexicon: Modelling Selection and Control
ERIC Educational Resources Information Center
de Bot, Kees
2004-01-01
In this paper an overview of research on the multilingual lexicon is presented as the basis for a model for processing multiple languages. With respect to specific issues relating to the processing of more than two languages, it is suggested that there is no need to develop a specific model for such multilingual processing, but at the same time we…
4D Subject-Specific Inverse Modeling of the Chick Embryonic Heart Outflow Tract Hemodynamics
Goenezen, Sevan; Chivukula, Venkat Keshav; Midgett, Madeline; Phan, Ly; Rugonyi, Sandra
2015-01-01
Blood flow plays a critical role in regulating embryonic cardiac growth and development, with altered flow leading to congenital heart disease. Progress in the field, however, is hindered by a lack of quantification of hemodynamic conditions in the developing heart. In this study, we present a methodology to quantify blood flow dynamics in the embryonic heart using subject-specific computational fluid dynamics (CFD) models. While the methodology is general, we focused on a model of the chick embryonic heart outflow tract (OFT), which distally connects the heart to the arterial system, and is the region of origin of many congenital cardiac defects. Using structural and Doppler velocity data collected from optical coherence tomography (OCT), we generated 4D (3D + time) embryo-specific CFD models of the heart OFT. To replicate the blood flow dynamics over time during the cardiac cycle, we developed an iterative inverse-method optimization algorithm, which determines the CFD model boundary conditions such that differences between computed velocities and measured velocities at one point within the OFT lumen are minimized. Results from our developed CFD model agree with previously measured hemodynamics in the OFT. Further, computed velocities and measured velocities differ by less than 15% at locations that were not used in the optimization, validating the model. The presented methodology can be used to quantify embryonic cardiac hemodynamics under normal and altered blood flow conditions, enabling an in-depth quantitative study of how blood flow influences cardiac development. PMID:26361767
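The inverse boundary-condition idea described in this abstract can be illustrated with a drastically simplified scalar stand-in: a hypothetical one-parameter forward model replaces the CFD solve, and a bisection search replaces the paper's iterative optimization. None of the functions or numbers below come from the study.

```python
# Minimal sketch of the inverse boundary-condition idea (hypothetical
# forward model, not the paper's CFD): find the inlet amplitude whose
# predicted velocity at a monitoring point matches the measurement.

def forward_model(inlet_amplitude):
    # Stand-in for a CFD solve: maps an inlet boundary condition to the
    # velocity at one monitored point inside the lumen.
    return 0.6 * inlet_amplitude + 0.05

def fit_inlet(measured, lo=0.0, hi=10.0, iters=60):
    # Bisection on the mismatch; works because this toy forward model
    # is monotonic in the boundary-condition parameter.
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if forward_model(mid) < measured:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

amp = fit_inlet(measured=1.25)
print(round(forward_model(amp), 4))  # matches the measured 1.25
```

In the actual study the forward model is a full 4D CFD simulation, so each evaluation is expensive and the optimization is correspondingly more sophisticated, but the fitting logic is the same.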
Multiple imputation for estimating the risk of developing dementia and its impact on survival.
Yu, Binbing; Saczynski, Jane S; Launer, Lenore
2010-10-01
Dementia, Alzheimer's disease in particular, is one of the major causes of disability and decreased quality of life among the elderly and a leading obstacle to successful aging. Given the profound impact on public health, much research has focused on the age-specific risk of developing dementia and the impact on survival. Early work has discussed various methods of estimating age-specific incidence of dementia, among which the illness-death model is popular for modeling disease progression. In this article we use multiple imputation to fit multi-state models for survival data with interval censoring and left truncation. This approach allows semi-Markov models in which survival after dementia depends on onset age. Such models can be used to estimate the cumulative risk of developing dementia in the presence of the competing risk of dementia-free death. Simulations are carried out to examine the performance of the proposed method. Data from the Honolulu Asia Aging Study are analyzed to estimate the age-specific and cumulative risks of dementia and to examine the effect of major risk factors on dementia onset and death.
Modeling of Selenium for the San Diego Creek Watershed and Newport Bay, California
Presser, Theresa S.; Luoma, Samuel N.
2009-01-01
The San Diego Creek watershed and Newport Bay in southern California are contaminated with selenium (Se) as a result of groundwater associated with urban development overlying a historical wetland, the Swamp of the Frogs. The primary Se source is drainage from surrounding seleniferous marine sedimentary formations. An ecosystem-scale model was employed as a tool to assist the development of a site-specific Se objective for the region. The model visualizes outcomes of different exposure scenarios in terms of bioaccumulation in predators, using partitioning coefficients, trophic transfer factors, and site-specific data for food-web inhabitants and particulate phases. Predicted Se concentrations agreed well with field observations, validating the use of the model as a realistic tool for testing exposure scenarios. Using the fish tissue and bird egg guidelines suggested by regulatory agencies, allowable water concentrations were determined for different conditions and locations in the watershed and the bay. The model thus facilitated development of a site-specific Se objective that was locally relevant and provided a basis for step-by-step implementation of source control.
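The partitioning-and-trophic-transfer chain that ecosystem-scale Se models of this kind use can be sketched as a product of coefficients: dissolved Se partitions onto particulates (Kd), then moves up the food web via trophic transfer factors (TTFs). All coefficient values below are placeholders for illustration, not site-specific data from the study.

```python
# Sketch of the bioaccumulation chain used by ecosystem-scale Se models:
# dissolved Se -> particulates (Kd) -> invertebrate prey (TTF) ->
# predator tissue (TTF). All coefficient values here are placeholders.

def predator_se(c_water_ug_per_L, kd_L_per_kg, ttf_invert, ttf_fish):
    c_particulate = kd_L_per_kg * c_water_ug_per_L / 1000.0  # ug/g dry wt
    c_invert = ttf_invert * c_particulate                    # ug/g dry wt
    return ttf_fish * c_invert                               # ug/g dry wt

# Forward direction: predicted fish-tissue Se for a water concentration.
print(predator_se(2.0, kd_L_per_kg=1000.0, ttf_invert=3.0, ttf_fish=1.1))

# Inverse direction: allowable water concentration for a tissue
# guideline, exploiting the linearity of the chain (placeholder value).
guideline = 8.0  # ug/g dry weight, hypothetical tissue guideline
per_unit = predator_se(1.0, 1000.0, 3.0, 1.1)
print(round(guideline / per_unit, 2))  # allowable ug/L
```

The inverse calculation is the step that lets the model translate fish-tissue or bird-egg guidelines back into allowable water concentrations, as the abstract describes.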
Patient-specific system for prognosis of surgical treatment outcomes of human cardiovascular system
NASA Astrophysics Data System (ADS)
Golyadkina, Anastasiya A.; Kalinin, Aleksey A.; Kirillova, Irina V.; Kossovich, Elena L.; Kossovich, Leonid Y.; Menishova, Liyana R.; Polienko, Asel V.
2015-03-01
Object of study: Improving the quality of life of patients at high risk of stroke is the main goal in developing a system for patient-specific modeling of the cardiovascular system. This work is aimed at improving the safety of outcomes of surgical treatment of brain blood supply alterations. The objects of study are the common carotid artery, the internal and external carotid arteries, and the carotid bulb. Methods: We estimated the mechanical properties of carotid artery tissues and of the patching materials used in angioplasty. We studied the angioarchitectural features of the arteries. We developed and clinically adapted computer biomechanical models characterized by geometrical, physical, and mechanical similarity to the carotid artery, both normal and with pathology (atherosclerosis, pathological tortuosity, and their combination). Results: Collaboration between practicing cardiovascular surgeons and specialists in mathematics and mechanics allowed us to successfully conduct finite-element modeling of surgical treatment, taking into account various features of operative technique and patching materials for a specific patient. Numerical experiments revealed factors leading to decreased brain blood supply and to the development of atherosclerosis. Modeling of carotid artery reconstruction surgery for a specific patient on the basis of the constructed biomechanical model demonstrated the possibility of its application in clinical practice when the numerical experiment is brought close to real conditions.
Jobs and Economic Development Impacts from Small Wind: JEDI Model in the Works (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tegen, S.
2012-06-01
This presentation covers the National Renewable Energy Laboratory's role in economic impact analysis for wind power Jobs and Economic Development Impacts (JEDI) models, JEDI results, small wind JEDI specifics, and a request for information to complete the model.
The State and Future of the Primary Care Behavioral Health Model of Service Delivery Workforce.
Serrano, Neftali; Cordes, Colleen; Cubic, Barbara; Daub, Suzanne
2018-06-01
The growth of the Primary Care Behavioral Health (PCBH) model nationally has highlighted and created a workforce development challenge, given that most mental health professionals are not trained for primary care specialization. This work reviews current efforts to retrain mental health professionals to fulfill roles as Behavioral Health Consultants (BHCs), including certificate programs, technical assistance programs, literature, and on-the-job training, and details the future needs of the workforce if the model is to proliferate sustainably. Eight recommendations are offered, including: (1) the development of an interprofessional certification body for PCBH training criteria, (2) integration of PCBH model-specific curricula in graduate studies, (3) integration of program development skill building in curricula, (4) efforts to develop faculty for PCBH model awareness, (5) intentional efforts to draw students to graduate programs for PCBH model training, (6) a national employment clearinghouse, (7) efforts to coalesce current knowledge around the provision of technical assistance to sites, and (8) workforce-specific research efforts.
Yamamoto, Yumi; Välitalo, Pyry A.; Huntjens, Dymphy R.; Proost, Johannes H.; Vermeulen, An; Krauwinkel, Walter; Beukers, Margot W.; van den Berg, Dirk‐Jan; Hartman, Robin; Wong, Yin Cheong; Danhof, Meindert; van Hasselt, John G. C.
2017-01-01
Drug development targeting the central nervous system (CNS) is challenging due to poor predictability of drug concentrations in various CNS compartments. We developed a generic physiologically based pharmacokinetic (PBPK) model for prediction of drug concentrations in physiologically relevant CNS compartments. System‐specific and drug‐specific model parameters were derived from literature and in silico predictions. The model was validated using detailed concentration‐time profiles from 10 drugs in rat plasma, brain extracellular fluid, 2 cerebrospinal fluid sites, and total brain tissue. These drugs, all small molecules, were selected to cover a wide range of physicochemical properties. The concentration‐time profiles for these drugs were adequately predicted across the CNS compartments (symmetric mean absolute percentage error for the model prediction was <91%). In conclusion, the developed PBPK model can be used to predict temporal concentration profiles of drugs in multiple relevant CNS compartments, which we consider valuable information for efficient CNS drug development. PMID:28891201
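The error statistic cited here, symmetric mean absolute percentage error (SMAPE), can be computed as below. This is one common formulation (denominator |observed| + |predicted|, scaled to percent); the paper's exact variant may differ in detail.

```python
# Symmetric mean absolute percentage error (SMAPE), the fit metric the
# abstract cites for the CNS PBPK model predictions. One common variant;
# the paper may use a slightly different formulation.

def smape(observed, predicted):
    terms = [
        200.0 * abs(o - p) / (abs(o) + abs(p))
        for o, p in zip(observed, predicted)
        if (abs(o) + abs(p)) > 0  # skip undefined 0/0 pairs
    ]
    return sum(terms) / len(terms)

# Toy concentration-time points (arbitrary units, not the paper's data).
obs = [1.0, 2.0, 4.0]
pred = [1.1, 1.8, 4.4]
print(round(smape(obs, pred), 2))  # percent error across the profile
```

A SMAPE below 100% is bounded and symmetric in over- and under-prediction, which is why it is a convenient summary when observed concentrations span several orders of magnitude across CNS compartments.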
Resources and Commitment as Critical Factors in the Development of "Gifted" Athletes
ERIC Educational Resources Information Center
Baker, Joseph; Cote, Jean
2003-01-01
Several sport-specific talent detection models have been developed over the last 30 years (Durand-Bush & Salmela, 2001). However, these models have failed in at least one important standard of judgment--accurately predicting who will develop into an elite level athlete. The authors believe that the WICS model presented by Robert Sternberg also…
ERIC Educational Resources Information Center
Sammis, Theodore W.; Shukla, Manoj K.; Mexal, John G.; Wang, Junming; Miller, David R.
2013-01-01
Universities develop strategic planning documents, and as part of that planning process, logic models are developed for specific programs within the university. This article examines the long-standing pecan program at New Mexico State University and the deficiencies and successes in the evolution of its logic model. The university's agricultural…
The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1995-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
SimVascular: An Open Source Pipeline for Cardiovascular Simulation.
Updegrove, Adam; Wilson, Nathan M; Merkow, Jameson; Lan, Hongzhi; Marsden, Alison L; Shadden, Shawn C
2017-03-01
Patient-specific cardiovascular simulation has become a paradigm in cardiovascular research and is emerging as a powerful tool in basic, translational and clinical research. In this paper we discuss the recent development of a fully open-source SimVascular software package, which provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation and analysis. This package serves as a research tool for cardiovascular modeling and simulation, and has contributed to numerous advances in personalized medicine, surgical planning and medical device design. The SimVascular software has recently been refactored and expanded to enhance functionality, usability, efficiency and accuracy of image-based patient-specific modeling tools. Moreover, SimVascular previously required several licensed components that hindered new user adoption and code management and our recent developments have replaced these commercial components to create a fully open source pipeline. These developments foster advances in cardiovascular modeling research, increased collaboration, standardization of methods, and a growing developer community.
Grain-Boundary Resistance in Copper Interconnects: From an Atomistic Model to a Neural Network
NASA Astrophysics Data System (ADS)
Valencia, Daniel; Wilson, Evan; Jiang, Zhengping; Valencia-Zapata, Gustavo A.; Wang, Kuang-Chung; Klimeck, Gerhard; Povolotskyi, Michael
2018-04-01
Orientation effects on the specific resistance of copper grain boundaries are studied systematically with two different atomistic tight-binding methods. A methodology is developed to model the specific resistance of grain boundaries in the ballistic limit using the embedded atom model, tight-binding methods, and nonequilibrium Green's functions. The methodology is validated against first-principles calculations for thin films with a single coincident grain boundary, with 6.4% deviation in the specific resistance. A statistical ensemble of 600 large, random structures with grains is studied. For structures with three grains, it is found that the distribution of specific resistances is close to normal. Finally, a compact model for grain-boundary-specific resistance is constructed based on a neural network.
C/sup 3/ and combat simulation - a survey
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson, S.A. Jr.
1983-01-04
This article looks at the overlap between C/sup 3/ and combat simulation from the point of view of the developer of combat simulations and models. In this context, there are two different questions. The first is: how, and to what extent, should specific models of the C/sup 3/ processes be incorporated in simulations of combat? Here the key point is the assessment of impact. In which types or levels of combat does C/sup 3/ play a role sufficiently intricate and closely coupled with combat performance that it would significantly affect combat results? Conversely, when is C/sup 3/ a known factor or modifier that can be simply accommodated without a specific detailed model being made for it? The second question is the inverse one. In the development of future C/sup 3/ systems, what role should combat simulation play? Obviously, simulation of the operation of the hardware, software, and other parts of the C/sup 3/ system would be useful in its design and specification, but this is not combat simulation. When is it necessary to encase the C/sup 3/ simulation model in a combat model with enough detail to be considered a simulation itself? How should this outer combat model be scoped as to the components needed? To build a background for answering these questions, a two-pronged approach is taken. First, a framework for C/sup 3/ modeling is developed, organizing the various types of modeling that can be done to include or encase C/sup 3/ in a combat model. This framework should be useful in describing the particular assumptions made in specific models in terms of what could be done in a more general way. Then a few specific models are described, concentrating on the C/sup 3/ portion of the simulations, or what can be interpreted as the C/sup 3/ assumptions.
Empirical modeling for intelligent, real-time manufacture control
NASA Technical Reports Server (NTRS)
Xu, Xiaoshu
1994-01-01
Artificial neural systems (ANS), also known as neural networks, are an attempt to develop computer systems that emulate the neural reasoning behavior of biological neural systems (e.g. the human brain). As such, they are loosely based on biological neural networks. The ANS consists of a series of nodes (neurons) and weighted connections (axons) that, when presented with a specific input pattern, can associate specific output patterns. It is essentially a highly complex, nonlinear, mathematical relationship or transform. These constructs have two significant properties that have proven useful to the authors in signal processing and process modeling: noise tolerance and complex pattern recognition. Specifically, the authors have developed a new network learning algorithm that has resulted in the successful application of ANS's to high speed signal processing and to developing models of highly complex processes. Two of the applications, the Weld Bead Geometry Control System and the Welding Penetration Monitoring System, are discussed in the body of this paper.
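The node-and-weighted-connection structure described above amounts to a feedforward pass: each node applies a nonlinearity to a weighted sum of its inputs. A minimal sketch with illustrative weights follows; it is not the authors' trained network or their learning algorithm.

```python
import math

# Minimal feedforward pass for the kind of artificial neural system the
# abstract describes: weighted connections plus a nonlinear node
# function map an input pattern to an output pattern. All weights are
# illustrative, not learned.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    # One node per weight row: nonlinearity applied to weighted sum.
    return [
        sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
        for row, b in zip(weights, biases)
    ]

# 2 inputs -> 2 hidden nodes -> 1 output node.
hidden = layer([0.5, -1.0], [[1.0, 0.5], [-0.5, 1.0]], [0.0, 0.1])
output = layer(hidden, [[1.5, -1.0]], [0.2])
print(round(output[0], 3))
```

Training (the part the authors' new learning algorithm addresses) consists of adjusting the weight and bias values so that input patterns from the process, such as weld sensor signals, produce the desired output patterns.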
ISO Technical Specification for the Ionosphere -IRI Recent Activities
NASA Astrophysics Data System (ADS)
Bilitza, Dieter; Reinisch, Bodo; Tamara, Gulyaeva
ISO Technical Specification TS 16457 recommends the International Reference Ionosphere (IRI) for the specification of ionospheric densities and temperatures. We review the latest developments toward improving the IRI model and the newest version of the model, IRI-2010. IRI-2010 includes several important improvements and additions. This presentation introduces these changes and discusses their benefits. The changes affect primarily the density profiles in the bottomside ionosphere and the density and height of the F2 peak, the point of highest density in the ionosphere. An important new addition to the model is the inclusion of auroral boundaries and their movement with magnetic activity. We will also discuss the status of other ongoing IRI activities and some of the recent applications of the IRI model. The homepage for the IRI project is at http://IRI.gsfc.nasa.gov/.
NASA Astrophysics Data System (ADS)
Wang, S.; Peters-Lidard, C. D.; Mocko, D. M.; Kumar, S.; Nearing, G. S.; Arsenault, K. R.; Geiger, J. V.
2014-12-01
Model integration bridges the data flow between modeling frameworks and models. However, models usually do not fit directly into a particular modeling environment if not designed for it. An example includes implementing different types of models into the NASA Land Information System (LIS), a software framework for land-surface modeling and data assimilation. Model implementation requires scientific knowledge and software expertise and may take a developer months to learn LIS and the model's software structure. Debugging and testing of the model implementation are also time-consuming due to not fully understanding LIS or the model. This time spent is costly for research and operational projects. To address this issue, an approach has been developed to automate model integration into LIS. A general model interface was designed to retrieve the forcing inputs, parameters, and state variables needed by the model and to provide state variables and outputs back to LIS. Every model can be wrapped to comply with the interface, usually with a FORTRAN 90 subroutine. Development efforts need only knowledge of the model and basic programming skills. With such wrappers, the logic is the same for implementing all models. Code templates defined for this general model interface can be re-used with any specific model. Therefore, the model implementation can be done automatically. An automated model implementation toolkit was developed with Microsoft Excel and its built-in VBA language. It accepts model specifications in three worksheets and contains FORTRAN 90 code templates in VBA programs. According to the model specification, the toolkit generates data structures and procedures within FORTRAN modules and subroutines, which transfer data between LIS and the model wrapper. Model implementation is standardized, and about 80-90% of the development load is reduced.
In this presentation, the automated model implementation approach is described along with the LIS programming interfaces, the general model interface, and five case studies, including a regression model, Noah-MP, FASST, SAC-HTET/SNOW-17, and FLake. These models vary in complexity and software structure. We will also describe how these complexities were overcome using this approach, as well as results of model benchmarks within LIS.
Modeling ready biodegradability of fragrance materials.
Ceriani, Lidia; Papa, Ester; Kovarich, Simona; Boethling, Robert; Gramatica, Paola
2015-06-01
In the present study, quantitative structure-activity relationships were developed for predicting the ready biodegradability of approximately 200 heterogeneous fragrance materials. Two classification methods, classification and regression tree (CART) and k-nearest neighbors (kNN), were applied to perform the modeling. The models were validated with multiple external prediction sets, and the structural applicability domain was verified by the leverage approach. The best models had good sensitivity (internal ≥80%; external ≥68%), specificity (internal ≥80%; external 73%), and overall accuracy (≥75%). Results from the comparison with BIOWIN global models, based on the group contribution method, show that the specific models developed in the present study perform better in prediction than BIOWIN6, in particular for the correct classification of not readily biodegradable fragrance materials. © 2015 SETAC.
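The sensitivity, specificity, and accuracy figures reported here follow directly from confusion-matrix counts; the counts in the example below are invented for illustration, not taken from the paper.

```python
# Classification metrics of the kind the biodegradability abstract
# reports, computed from confusion-matrix counts. The example counts
# are made up, not the paper's data.

def classification_metrics(tp, tn, fp, fn):
    sensitivity = tp / (tp + fn)   # positives correctly identified
    specificity = tn / (tn + fp)   # negatives correctly rejected
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    return sensitivity, specificity, accuracy

sens, spec, acc = classification_metrics(tp=40, tn=44, fp=11, fn=10)
print(round(sens, 2), round(spec, 2), round(acc, 2))  # 0.8 0.8 0.8
```

For a ready-biodegradability screen, specificity is the metric that matters most for the abstract's headline result, since it measures correct classification of the not readily biodegradable materials.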
Development of estrogen receptor beta binding prediction model using large sets of chemicals.
Sakkiah, Sugunadevi; Selvaraj, Chandrabose; Gong, Ping; Zhang, Chaoyang; Tong, Weida; Hong, Huixiao
2017-11-03
We developed an ER β binding prediction model to facilitate identification of chemicals that specifically bind ER β or ER α, together with our previously developed ER α binding model. Decision Forest was used to train the ER β binding prediction model on a large set of compounds obtained from EADB. Model performance was estimated through 1000 iterations of 5-fold cross-validation. Prediction confidence was analyzed using predictions from the cross-validations. Informative chemical features for ER β binding were identified through analysis of the frequency of the chemical descriptors used in the models in the 5-fold cross-validations. 1000 permutations were conducted to assess chance correlation. The average accuracy of the 5-fold cross-validations was 93.14%, with a standard deviation of 0.64%. Prediction confidence analysis indicated that the higher the prediction confidence, the more accurate the predictions. Permutation testing revealed that the prediction model is unlikely to have been generated by chance. Eighteen informative descriptors were identified as important to ER β binding prediction. Application of the prediction model to data from the ToxCast project yielded a very high sensitivity of 90-92%. Our results demonstrate that ER β binding of chemicals can be accurately predicted using the developed model. Coupled with our previously developed ER α prediction model, this model can be expected to facilitate drug development through identification of chemicals that specifically bind ER β or ER α.
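The evaluation protocol described (repeated 5-fold cross-validation with accuracy aggregated across iterations) can be sketched as follows. The threshold "classifier" and the toy data are stand-ins for the paper's Decision Forest and EADB compounds.

```python
import random
import statistics

# Sketch of repeated k-fold cross-validation as in the abstract's
# evaluation protocol. The "classifier" is a trivial threshold rule on
# a single feature, a stand-in for the paper's Decision Forest.

def threshold_classifier(train):
    cutoff = statistics.mean(x for x, _ in train)
    return lambda x: 1 if x > cutoff else 0

def cross_validate(data, k=5, iterations=100, seed=0):
    rng = random.Random(seed)
    accs = []
    for _ in range(iterations):
        shuffled = data[:]
        rng.shuffle(shuffled)
        folds = [shuffled[i::k] for i in range(k)]
        for i in range(k):
            test = folds[i]
            train = [s for j in range(k) if j != i for s in folds[j]]
            predict = threshold_classifier(train)
            accs.append(sum(predict(x) == y for x, y in test) / len(test))
    return statistics.mean(accs), statistics.stdev(accs)

# Separable toy data: feature < 1 -> class 0, feature > 2 -> class 1.
data = [(i * 0.1, 0) for i in range(10)] + \
       [(2 + i * 0.1, 1) for i in range(10)]
mean_acc, sd_acc = cross_validate(data)
print(round(mean_acc, 3), round(sd_acc, 3))
```

Reporting the mean and standard deviation over many shuffled repetitions, as the paper does with 1000 iterations, separates genuine model performance from the luck of any single train/test split.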
An Object-Based Requirements Modeling Method.
ERIC Educational Resources Information Center
Cordes, David W.; Carver, Doris L.
1992-01-01
Discusses system modeling and specification as it relates to object-based information systems development and software development. An automated system model based on the objects in the initial requirements document is described, the requirements document translator is explained, and a sample application of the technique is provided. (12…
Optimal Charging of Nickel-Hydrogen Batteries for Life Extension
NASA Technical Reports Server (NTRS)
Hartley, Tom T.; Lorenzo, Carl F.
2002-01-01
We are exploring the possibility of extending the cycle life of battery systems by using a charging profile that minimizes cell damage. Only nickel-hydrogen cells are discussed at this time, but applications to lithium-ion cells are being considered. The process first requires the development of a fractional calculus based nonlinear dynamic model of the specific cells being used. The parameters of this model are determined from the cell transient responses. To extend cell cycle life, an instantaneous damage rate model is developed. The model is based on cycle life data and is highly dependent on cell voltage. Once both the cell dynamic model and the instantaneous damage rate model have been determined, the charging profile for a specific cell is determined by numerical optimization. Results concerning the percentage life extension for different charging strategies are presented. The overall procedure is readily adaptable to real-time implementations where the charging profile can maintain its minimum damage nature as the specific cell ages.
Evaluation of the Community Multi-scale Air Quality (CMAQ) ...
The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Protection Agency develops the CMAQ model and periodically releases new versions of the model that include bug fixes and various other improvements to the modeling system. In the fall of 2015, CMAQ version 5.1 was released. This new version of CMAQ will contain important bug fixes to several issues that were identified in CMAQv5.0.2 and additionally include updates to other portions of the code. Several annual, and numerous episodic, CMAQv5.1 simulations were performed to assess the impact of these improvements on the model results. These results will be presented, along with a base evaluation of the performance of the CMAQv5.1 modeling system against available surface and upper-air measurements during the time period simulated. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
NASA Technical Reports Server (NTRS)
Colborn, B. L.; Armstrong, T. W.
1993-01-01
A three-dimensional geometry and mass model of the Long Duration Exposure Facility (LDEF) spacecraft and experiment trays was developed for use in predictions and data interpretation related to ionizing radiation measurements. The modeling approach, level of detail incorporated, example models for specific experiments and radiation dosimeters, and example applications of the model are described.
NASA Astrophysics Data System (ADS)
Manninen, L. M.
1993-12-01
The document describes TKKMOD, a simulation model developed at Helsinki University of Technology for a specific wind-diesel system layout, with special emphasis on the battery submodel and its use in simulation. The model has been included into the European wind-diesel modeling software package WDLTOOLS under the CEC JOULE project 'Engineering Design Tools for Wind-Diesel Systems' (JOUR-0078). WDLTOOLS serves as the user interface and processes the input and output data of different logistic simulation models developed by the project participants. TKKMOD cannot be run without this shell. The report only describes the simulation principles and model specific parameters of TKKMOD and gives model specific user instructions. The input and output data processing performed outside this model is described in the documentation of the shell. The simulation model is utilized for calculation of long-term performance of the reference system configuration for given wind and load conditions. The main results are energy flows, losses in the system components, diesel fuel consumption, and the number of diesel engine starts.
Tirado-Ramos, Alfredo; Hu, Jingkun; Lee, K P
2002-01-01
Supplement 23 to DICOM (Digital Imaging and Communications in Medicine), Structured Reporting, is a specification that supports a semantically rich representation of image and waveform content, enabling experts to share image and related patient information. DICOM SR supports the representation of textual and coded data linked to images and waveforms. Nevertheless, the medical information technology community needs models that work as bridges between the DICOM relational model and open object-oriented technologies. The authors assert that representations of the DICOM Structured Reporting standard, using object-oriented modeling languages such as the Unified Modeling Language, can provide a high-level reference view of the semantically rich framework of DICOM and its complex structures. They have produced an object-oriented model to represent the DICOM SR standard and have derived XML-exchangeable representations of this model using World Wide Web Consortium specifications. They expect the model to benefit developers and system architects who are interested in developing applications that are compliant with the DICOM SR specification.
Improving the use of health data for health system strengthening.
Nutley, Tara; Reynolds, Heidi W
2013-02-13
Good quality and timely data from health information systems are the foundation of all health systems. However, too often data sit in reports, on shelves, or in databases and are not sufficiently utilised in policy and program development, improvement, strategic planning, and advocacy. Without specific interventions aimed at improving the use of data produced by information systems, health systems will never fully be able to meet the needs of the populations they serve. The objective is to employ a logic model to describe a pathway of how specific activities and interventions can strengthen the use of health data in decision making and ultimately strengthen the health system. A logic model was developed to provide a practical strategy for developing, monitoring, and evaluating interventions to strengthen the use of data in decision making. The model draws on the collective strengths and similarities of previous work and adds to those works by making specific recommendations about the interventions and activities that are most proximate to affecting the use of data in decision making. The model provides an organizing framework for how interventions and activities work to strengthen the systematic demand, synthesis, review, and use of data. The logic model and guidance are presented to facilitate widespread use and to enable improved data-informed decision making in program review and planning, advocacy, and policy development. Real-world examples from the literature support the feasible application of the activities outlined in the model. The logic model provides specific and comprehensive guidance to improve data demand and use. It can be used to design, monitor, and evaluate interventions, and to improve demand for, and use of, data in decision making. As more interventions are implemented to improve the use of health data, those efforts need to be evaluated.
NASA Technical Reports Server (NTRS)
1972-01-01
A unified approach to computer vision and manipulation is developed which is called choreographic vision. In the model, objects to be viewed by a projected robot in the Viking missions to Mars are seen as objects to be manipulated within choreographic contexts controlled by a multimoded remote supervisory control system on Earth. A new theory of context relations is introduced as a basis for choreographic programming languages. A topological vision model is developed for recognizing objects by shape and contour. This model is integrated with a projected vision system consisting of a multiaperture image dissector TV camera and a ranging laser system. System program specifications integrate eye-hand coordination and topological vision functions, and an aerospace multiprocessor implementation is described.
Model Design for Military Advisors
2013-05-02
needs of their counterpart. This paper explores one area that would significantly improve advising outcomes: using advising models to match the...more specific. This paper develops three dominant models for advisors: the Stoic Acquaintance, the General Manager, and the Entertainer, which can...then outcomes related to the individual counterpart's developmental needs will be more predictable and specific. This paper will focus only on
NASA Astrophysics Data System (ADS)
Hanafiah, Hazlenah; Jemain, Abdul Aziz
2013-11-01
In recent years, the study of fertility has received considerable attention among researchers abroad, following fears of fertility deterioration driven by rapid economic development. Hence, this study examines the feasibility of developing fertility forecasts based on age structure. The Lee-Carter model (1992) is applied in this study, as it is an established and widely used model for analysing demographic aspects. A singular value decomposition approach is incorporated with an ARIMA model to estimate age-specific fertility rates in Peninsular Malaysia over the period 1958-2007. Residual plots are used to measure the goodness of fit of the model. A fertility index forecast using a random walk with drift is then utilised to predict future age-specific fertility. Results indicate that the proposed model provides a relatively good and reasonable data fit. In addition, there is an apparent and continuous decline in age-specific fertility curves over the next 10 years, particularly among mothers in their early 20s and 40s. The study of fertility is vital in order to maintain a balance between population growth and the provision of related facilities and resources.
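The Lee-Carter decomposition described above can be sketched in a few lines. The rates below are illustrative, not the Malaysian data, and a rank-1 power iteration stands in for a full singular value decomposition; the random-walk-with-drift forecast of the fertility index follows the standard Lee-Carter recipe, ln f(x,t) = a_x + b_x * k_t.

```python
import math

# Hypothetical age-specific fertility rates F[age][year] (illustrative numbers):
# rows = age groups, columns = consecutive years.
F = [
    [0.10, 0.09, 0.08, 0.07],
    [0.20, 0.18, 0.16, 0.15],
    [0.15, 0.14, 0.12, 0.11],
]

# Step 1: a_x is the time-average of ln f(x,t) for each age group.
logF = [[math.log(v) for v in row] for row in F]
a = [sum(row) / len(row) for row in logF]

# Step 2: rank-1 approximation of the centred matrix via power iteration,
# standing in for the SVD used by Lee-Carter (b_x = u, k_t = sigma * v).
R = [[logF[x][t] - a[x] for t in range(len(F[0]))] for x in range(len(F))]

def rank1(M, iters=200):
    m, n = len(M), len(M[0])
    v = [1.0] * n
    sigma = 0.0
    for _ in range(iters):
        u = [sum(M[i][j] * v[j] for j in range(n)) for i in range(m)]
        nu = math.sqrt(sum(x * x for x in u)) or 1.0
        u = [x / nu for x in u]
        v = [sum(M[i][j] * u[i] for i in range(m)) for j in range(n)]
        sigma = math.sqrt(sum(x * x for x in v)) or 1.0
        v = [x / sigma for x in v]
    return u, sigma, v

b, sigma, v = rank1(R)
k = [sigma * x for x in v]

# Step 3: forecast the fertility index k_t as a random walk with drift,
# then map back to age-specific rates for the next year.
drift = (k[-1] - k[0]) / (len(k) - 1)
k_next = k[-1] + drift
forecast = [math.exp(a[x] + b[x] * k_next) for x in range(len(F))]
```

Because the model operates on log rates, forecast rates are positive by construction, and a declining index carries the decline forward for every age group at once.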
Systems, methods and apparatus for pattern matching in procedure development and verification
NASA Technical Reports Server (NTRS)
Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor)
2011-01-01
Systems, methods and apparatus are provided through which, in some embodiments, a formal specification is pattern-matched from scenarios, the formal specification is analyzed, and flaws in the formal specification are corrected. The systems, methods and apparatus may include pattern-matching an equivalent formal model from an informal specification. Such a model can be analyzed for contradictions, conflicts, use of resources before the resources are available, competition for resources, and so forth. From such a formal model, an implementation can be automatically generated in a variety of notations. The approach can improve the resulting implementation, which, in some embodiments, is provably equivalent to the procedures described at the outset, which in turn can improve confidence that the system reflects the requirements, and in turn reduces system development time and reduces the amount of testing required of a new system. Moreover, in some embodiments, two or more implementations can be "reversed" to appropriate formal models, the models can be combined, and the resulting combination checked for conflicts. Then, the combined, error-free model can be used to generate a new (single) implementation that combines the functionality of the original separate implementations, and may be more likely to be correct.
Synthesis of geophysical data with space-acquired imagery: a review
Hastings, David A.
1983-01-01
Statistical correlation has been used to determine the applicability of specific data sets to the development of geologic or exploration models. Various arithmetic functions have proven useful in developing models from such data sets.
Program management model study
NASA Technical Reports Server (NTRS)
Connelly, J. J.; Russell, J. E.; Seline, J. R.; Sumner, N. R., Jr.
1972-01-01
Two models, a system performance model and a program assessment model, have been developed to assist NASA management in the evaluation of development alternatives for the Earth Observations Program. Two computer models were developed and demonstrated on the Goddard Space Flight Center Computer Facility. Procedures have been outlined to guide the user of the models through specific evaluation processes, and the preparation of inputs describing earth observation needs and earth observation technology. These models are intended to assist NASA in increasing the effectiveness of the overall Earth Observation Program by providing a broader view of system and program development alternatives.
Jennifer A. Holm; H.H. Shugart; Skip J. Van Bloem; G.R. Larocque
2012-01-01
Because of human pressures, the need to understand and predict the long-term dynamics and development of subtropical dry forests is urgent. Through modifications to the ZELIG simulation model, including the development of species- and site-specific parameters and internal modifications, the capability to model and predict forest change within the 4500-ha Guanica State...
JEDI Methodology | Jobs and Economic Development Impact Models | NREL
The intent of the Jobs and Economic Development Impact (JEDI) models is to demonstrate the employment and economic impacts that will likely result from specific scenarios, providing an estimate of overall economic impacts. Please see Limitations of JEDI Models for more information.
Generative Topic Modeling in Image Data Mining and Bioinformatics Studies
ERIC Educational Resources Information Center
Chen, Xin
2012-01-01
Probabilistic topic models have been developed for applications in various domains, such as text mining, information retrieval, computer vision, and bioinformatics. In this thesis, we focus on developing novel probabilistic topic models for image mining and bioinformatics studies. Specifically, a probabilistic topic-connection (PTC) model…
To facilitate evaluation of existing site characterization data, ORD has developed on-line tools and models that integrate data and models into innovative applications. Forty calculators have been developed in four groups: parameter estimators, models, scientific demos and unit ...
Development and Validation of Osteoporosis Risk-Assessment Model for Korean Men
Oh, Sun Min; Song, Bo Mi; Nam, Byung-Ho; Rhee, Yumie; Moon, Seong-Hwan; Kim, Deog Young; Kang, Dae Ryong
2016-01-01
Purpose: The aim of the present study was to develop an osteoporosis risk-assessment model to identify high-risk individuals among Korean men. Materials and Methods: The study used data from 1340 and 1110 men ≥50 years of age who participated in the 2009 and 2010 Korean National Health and Nutrition Examination Survey, respectively, for development and validation of an osteoporosis risk-assessment model. Osteoporosis was defined as a T-score ≤ -2.5 at either the femoral neck or lumbar spine. Performance of the candidate models and the Osteoporosis Self-assessment Tool for Asians (OSTA) was compared with sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). A net reclassification improvement was further calculated to compare the developed Korean Osteoporosis Risk-Assessment Model for Men (KORAM-M) with OSTA. Results: In the development dataset, the prevalence of osteoporosis was 8.1%. KORAM-M, consisting of age and body weight, had a sensitivity of 90.8%, a specificity of 42.4%, and an AUC of 0.666 with a cut-off score of -9. In the validation dataset, similar results were shown: sensitivity 87.9%, specificity 39.7%, and AUC 0.638. Additionally, risk categorization with KORAM-M showed improved reclassification over that of OSTA of up to 22.8%. Conclusion: KORAM-M can simply be used as a pre-screening tool to identify candidates for dual-energy X-ray absorptiometry tests. PMID:26632400
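As a side note on the reported performance figures, sensitivity and specificity at a fixed risk-score cut-off are computed as below. The scores and labels are toy values, not the KNHANES data, and we assume subjects are flagged as high risk when their score falls at or below the cut-off.

```python
# Toy risk scores and DXA-confirmed disease status (illustrative only).
scores = [-12, -10, -9, -8, -5, -3]
has_osteoporosis = [True, True, False, True, False, False]
cutoff = -9  # assumed flagging rule: high risk when score <= cutoff

flagged = [s <= cutoff for s in scores]
tp = sum(f and d for f, d in zip(flagged, has_osteoporosis))          # flagged cases
fn = sum((not f) and d for f, d in zip(flagged, has_osteoporosis))    # missed cases
tn = sum((not f) and (not d) for f, d in zip(flagged, has_osteoporosis))
fp = sum(f and (not d) for f, d in zip(flagged, has_osteoporosis))

sensitivity = tp / (tp + fn)  # fraction of true cases referred for DXA
specificity = tn / (tn + fp)  # fraction of non-cases correctly not flagged
```

Lowering the cut-off trades sensitivity for specificity, which is the curve the AUC in the abstract summarizes.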
Regulatory agencies must develop fish consumption advisories for many lakes and rivers with limited resources. Process-based mathematical models are potentially valuable tools for developing regional fish advisories. The Regional Mercury Cycling model (R-MCM) was specifically d...
Design and Development of a Microscopic Model for Polarization
ERIC Educational Resources Information Center
Petridou, E.; Psillos, D.; Hatzikraniotis, E.; Viiri, J.
2009-01-01
As research shows that the knowledge and use of models and modelling by teachers is limited, particularly for predicting phenomena, we developed and applied a sequence of three representations of a simulated model focusing on polarization and specifically showing the behaviour of an atom, and forces exerted on a dipole and an insulator, when a…
Sex-specific gonadal and gene expression changes throughout development in fathead minnow
Although fathead minnows (Pimephales promelas) are commonly used as a model fish in endocrine disruption studies, none have characterized sex-specific baseline expression of genes involved in sex differentiation during development in this species. Using a sex-linked DNA marker t...
Investigation using data from ERTS to develop and implement utilization of living marine resources
NASA Technical Reports Server (NTRS)
Stevenson, W. H. (Principal Investigator); Pastula, E. J., Jr.
1973-01-01
The author has identified the following significant results. The feasibility of utilizing ERTS-1 data in conjunction with aerial remote sensing and sea truth information to predict the distribution of menhaden in the Mississippi Sound during a specific time frame has been demonstrated by employing a number of uniquely designed empirical regression models. The construction of these models was made possible through innovative statistical routines specifically developed to meet the stated objectives.
Zhai, Zhiqiang; Song, Guohua; Lu, Hongyu; He, Weinan; Yu, Lei
2017-09-01
Vehicle-specific power (VSP) has been found to be highly correlated with vehicle emissions. It is used in many studies on emission modeling, such as the MOVES (Motor Vehicle Emission Simulator) model. The existing studies develop specific VSP distributions (or OpMode distributions in MOVES) for different road types and various average speeds to represent vehicle operating modes on the road. However, it is still not clear whether the facility- and speed-specific VSP distributions are consistent temporally and spatially. For instance, is it necessary to update the database of VSP distributions in the emission model periodically? Are the VSP distributions developed in the city's central business district (CBD) area applicable to its suburbs? In this context, this study examined the temporal and spatial consistency of the facility- and speed-specific VSP distributions in Beijing. The VSP distributions in different years and in different areas were developed based on real-world vehicle activity data. The root mean square error (RMSE) is employed to quantify the difference between VSP distributions. The maximum differences of the VSP distributions between different years and between different areas are approximately 20% of those between different road types. The analysis of the carbon dioxide (CO2) emission factor indicates that the temporal and spatial differences of the VSP distributions have no significant impact on vehicle emission estimation, with a relative error of less than 3%. The temporal and spatial differences thus have no significant impact on the development of the facility- and speed-specific VSP distributions for vehicle emission estimation. The database of specific VSP distributions in VSP-based emission models can be maintained over time: it is unnecessary to update the database regularly, and it is reliable to use historical vehicle activity data to forecast future emissions.
Within a city, areas with less data can still develop accurate VSP distributions based on richer data from other areas.
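The RMSE comparison of binned VSP distributions can be sketched as follows; the two normalized bin distributions are illustrative, not the Beijing data.

```python
import math

# Hypothetical normalized VSP-bin distributions for the same road type and
# speed range in two different years (fractions of operating time per bin).
dist_2012 = [0.05, 0.15, 0.30, 0.25, 0.15, 0.10]
dist_2016 = [0.06, 0.14, 0.28, 0.26, 0.16, 0.10]

def rmse(p, q):
    """Root mean square error between two binned distributions."""
    assert len(p) == len(q)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)) / len(p))

year_diff = rmse(dist_2012, dist_2016)
```

The same function applied to distributions from different areas or road types gives the spatial and facility comparisons; in the study, the year-to-year and area-to-area differences were roughly a fifth of the road-type differences.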
Modeling of transitional flows
NASA Technical Reports Server (NTRS)
Lund, Thomas S.
1988-01-01
An effort directed at developing improved transitional models was initiated. This work concentrated on the critical assessment of a popular existing transitional model developed by McDonald and Fish in 1972. The objective was to identify the shortcomings of the McDonald-Fish model and to use the insights gained to suggest modifications or alterations of the basic model. In order to evaluate the transitional model, a compressible boundary layer code was required. Accordingly, a two-dimensional compressible boundary layer code was developed. The program was based on a three-point fully implicit finite difference algorithm in which the equations were solved in an uncoupled manner, with second-order extrapolation used to evaluate the non-linear coefficients. Iteration was offered as an option if the extrapolation error could not be tolerated. The differencing scheme was arranged to be second order in both spatial directions on an arbitrarily stretched mesh. A variety of boundary condition options were implemented, including specification of an external pressure gradient, specification of a wall temperature distribution, and specification of an external temperature distribution. Overall, the results of the initial phase of this work indicate that the McDonald-Fish model does a poor job of predicting the details of the turbulent flow structure during the transition region.
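A minimal illustration of a fully implicit, second-order finite difference scheme of the kind described is sketched below, applied to the 1-D diffusion equation rather than the authors' compressible boundary layer equations. Each implicit step produces a tridiagonal system, solved here with the standard Thomas algorithm.

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Backward Euler (fully implicit) steps for u_t = u_xx, second order in space,
# with u = 0 held at both walls (Dirichlet boundary conditions).
nx, dx, dt = 11, 0.1, 0.01
r = dt / dx**2
u = [0.0] * nx
u[nx // 2] = 1.0                      # initial spike in the middle
for _ in range(10):
    n = nx - 2                        # interior unknowns
    a = [-r] * n                      # (1 + 2r) u_i - r u_{i-1} - r u_{i+1} = u_i^old
    b = [1 + 2 * r] * n
    c = [-r] * n
    d = u[1:-1][:]                    # zero boundary values add nothing to the RHS
    u[1:-1] = thomas(a, b, c, d)
```

Unlike an explicit scheme, this implicit step is stable for any ratio r = dt/dx², which is why fully implicit algorithms are favored for stiff boundary layer marching.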
ERIC Educational Resources Information Center
Fischer, Gerhard H.
1987-01-01
A natural parameterization and formalization of the problem of measuring change in dichotomous data is developed. Mathematically-exact definitions of specific objectivity are presented, and the basic structures of the linear logistic test model and the linear logistic model with relaxed assumptions are clarified. (SLD)
A Product Development Decision Model for Cockpit Weather Information System
NASA Technical Reports Server (NTRS)
Sireli, Yesim; Kauffmann, Paul; Gupta, Surabhi; Kachroo, Pushkin; Johnson, Edward J., Jr. (Technical Monitor)
2003-01-01
There is a significant market demand for advanced cockpit weather information products. However, it is unclear how to identify the most promising technological options that provide the desired mix of consumer requirements by employing feasible technical systems at a price that achieves market success. This study develops a unique product development decision model that employs Quality Function Deployment (QFD) and Kano's model of consumer choice. This model is specifically designed for exploration and resolution of this and similar information technology related product development problems.
Characterization of Used Nuclear Fuel with Multivariate Analysis for Process Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dayman, Kenneth J.; Coble, Jamie B.; Orton, Christopher R.
2014-01-01
The Multi-Isotope Process (MIP) Monitor combines gamma spectroscopy and multivariate analysis to detect anomalies in various process streams in a nuclear fuel reprocessing system. Measured spectra are compared to models of nominal behavior at each measurement location to detect unexpected changes in system behavior. In order to improve the accuracy and specificity of process monitoring, fuel characterization may be used to more accurately train subsequent models in a full analysis scheme. This paper presents initial development of a reactor-type classifier that is used to select a reactor-specific partial least squares model to predict fuel burnup. Nuclide activities for prototypic used fuel samples were generated in ORIGEN-ARP and used to investigate techniques to characterize used nuclear fuel in terms of reactor type (pressurized or boiling water reactor) and burnup. A variety of reactor type classification algorithms, including k-nearest neighbors, linear and quadratic discriminant analyses, and support vector machines, were evaluated to differentiate used fuel from pressurized and boiling water reactors. Then, reactor type-specific partial least squares models were developed to predict the burnup of the fuel. Using these reactor type-specific models instead of a model trained for all light water reactors improved the accuracy of burnup predictions. The developed classification and prediction models were combined and applied to a large dataset that included eight fuel assembly designs, two of which were not used in training the models, and spanned the range of the initial 235U enrichment, cooling time, and burnup values expected of future commercial used fuel for reprocessing. Error rates were consistent across the range of considered enrichment, cooling time, and burnup values. Average absolute relative errors in burnup predictions for validation data both within and outside the training space were 0.0574% and 0.0597%, respectively.
The errors seen in this work are artificially low, because the models were trained, optimized, and tested on simulated, noise-free data. However, these results indicate that the developed models may generalize well to new data and that the proposed approach constitutes a viable first step in developing a fuel characterization algorithm based on gamma spectra.
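The two-stage scheme, classify reactor type first and then apply a type-specific regression for burnup, can be sketched as follows. Nearest-centroid classification and ordinary least squares stand in for the paper's discriminant classifiers and partial least squares models, and all numbers are illustrative rather than ORIGEN-ARP output.

```python
# Hypothetical training data: (activity-ratio features, reactor type, burnup).
train = [
    ([1.0, 0.20], "PWR", 30.0),
    ([1.2, 0.22], "PWR", 36.0),
    ([1.4, 0.24], "PWR", 42.0),
    ([0.5, 0.50], "BWR", 20.0),
    ([0.7, 0.55], "BWR", 28.0),
    ([0.9, 0.60], "BWR", 36.0),
]

def centroid(samples):
    n = len(samples)
    return [sum(x[i] for x in samples) / n for i in range(len(samples[0]))]

centroids = {
    rt: centroid([x for x, t, _ in train if t == rt]) for rt in ("PWR", "BWR")
}

def fit_line(pts):
    """Least-squares burnup = slope * feature0 + intercept for one reactor type."""
    xs, ys = [p[0][0] for p in pts], [p[2] for p in pts]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum(
        (x - xbar) ** 2 for x in xs
    )
    return slope, ybar - slope * xbar

models = {rt: fit_line([p for p in train if p[1] == rt]) for rt in ("PWR", "BWR")}

def predict(features):
    # Stage 1: reactor type by nearest centroid (squared Euclidean distance).
    rt = min(
        centroids,
        key=lambda t: sum((a - b) ** 2 for a, b in zip(features, centroids[t])),
    )
    # Stage 2: burnup from the regression fitted only on that reactor type.
    slope, intercept = models[rt]
    return rt, slope * features[0] + intercept

reactor, burnup = predict([1.1, 0.21])
```

Training a separate regression per class mirrors the paper's finding: a model fitted only on one reactor type can track its burnup-to-signature relationship more tightly than a single model pooled over all light water reactors.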
AgRISTARS: Yield model development/soil moisture. Interface control document
NASA Technical Reports Server (NTRS)
1980-01-01
The interactions and support functions required between the crop Yield Model Development (YMD) Project and the Soil Moisture (SM) Project are defined. The requirements for YMD support of SM and vice versa are outlined. Specific tasks in support of these interfaces are defined for the development of support functions.
Job Aid Manuals for Phase III--DEVELOP of the Instructional Systems Development Model.
ERIC Educational Resources Information Center
Schulz, Russel E.; Farrell, Jean R.
Designed to supplement the descriptive authoring flowcharts presented in a companion volume, this manual includes specific guidance, examples, and other information referred to in the flowcharts for the implementation of the third phase of the Instructional Systems Development Model (ISD). The introductory section includes definitions;…
Le, Victoria P.; Yamashiro, Yoshito; Yanagisawa, Hiromi; Wagenseil, Jessica E.
2014-01-01
Mice with a smooth muscle cell (SMC)-specific deletion of fibulin-4 (SMKO) show decreased expression of SMC contractile genes, decreased circumferential compliance, and develop aneurysms in the ascending aorta. Neonatal administration of drugs that inhibit the angiotensin II pathway encourages the expression of contractile genes and prevents aneurysm development, but does not increase compliance in the SMKO aorta. We hypothesized that multidimensional mechanical changes in the aorta and/or other elastic arteries may contribute to aneurysm pathophysiology. We found that the SMKO ascending aorta and carotid artery showed mechanical changes in the axial direction. These changes were not reversed by angiotensin II inhibitors; hence, reversing the axial changes is not required for aneurysm prevention. Mechanical changes in the circumferential direction were specific to the ascending aorta; therefore, mechanical changes in the carotid do not contribute to aortic aneurysm development. We also hypothesized that a published model of postnatal aortic growth and remodeling could be used to investigate mechanisms behind the changes in the SMKO aorta and aneurysm development over time. Dimensions and mechanical behavior of the adult SMKO aorta were reproduced by the model after modifying the initial component material constants and the aortic dilation with each postnatal time step. The model links biological observations to specific mechanical responses in aneurysm development and treatment. PMID:24526456
Chang, Chawnshang; Lee, Soo Ok; Wang, Ruey-Sheng; Yeh, Shuyuan; Chang, Ta-Min
2013-01-01
Androgens/androgen receptor (AR) signaling is involved primarily in the development of male-specific phenotypes during embryogenesis, spermatogenesis, sexual behavior, and fertility during adult life. However, this signaling has also been shown to play an important role in development of female reproductive organs and their functions, such as ovarian folliculogenesis, embryonic implantation, and uterine and breast development. The establishment of the testicular feminization (Tfm) mouse model exploiting the X-linked Tfm mutation in mice has been a good in vivo tool for studying the human complete androgen insensitivity syndrome, but this mouse may not be the perfect in vivo model. Mouse models with various cell-specific AR knockout (ARKO) might allow us to study AR roles in individual types of cells in these male and female reproductive systems, although discrepancies are found in results between labs, probably due to using various Cre mice and/or knocking out AR in different AR domains. Nevertheless, no doubt exists that the continuous development of these ARKO mouse models and careful studies will provide information useful for understanding AR roles in reproductive systems of humans and may help us to develop more effective and more specific therapeutic approaches for reproductive system-related diseases. PMID:23782840
Development of a Predictive Corrosion Model Using Locality-Specific Corrosion Indices
2017-09-12
3.2.1 Statistical data analysis methods ... 3.2.2 Algorithm development method ... components, and method) were compiled into an executable program that uses mathematical models of materials degradation and statistical calculations
Melanoma Risk Prediction Models
Developing statistical models that estimate the probability of developing melanoma over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling on behavioral changes to decrease risk.
Niu, Ran; Skliar, Mikhail
2012-07-01
In this paper, we develop and validate a method to identify computationally efficient site- and patient-specific models of ultrasound thermal therapies from MR thermal images. The models of the specific absorption rate of the transduced energy and the temperature response of the therapy target are identified in the reduced basis of proper orthogonal decomposition of thermal images, acquired in response to a mild thermal test excitation. The method permits dynamic reidentification of the treatment models during the therapy by recursively utilizing newly acquired images. Such adaptation is particularly important during high-temperature therapies, which are known to substantially and rapidly change tissue properties and blood perfusion. The developed theory was validated for the case of focused ultrasound heating of a tissue phantom. The experimental and computational results indicate that the developed approach produces accurate low-dimensional treatment models despite temporal and spatial noises in MR images and slow image acquisition rate.
Salo, Zoryana; Beek, Maarten; Wright, David; Whyne, Cari Marisa
2015-04-13
Current methods for the development of pelvic finite element (FE) models generally are based upon specimen specific computed tomography (CT) data. This approach has traditionally required segmentation of CT data sets, which is time consuming and necessitates high levels of user intervention due to the complex pelvic anatomy. The purpose of this research was to develop and assess CT landmark-based semi-automated mesh morphing and mapping techniques to aid the generation and mechanical analysis of specimen-specific FE models of the pelvis without the need for segmentation. A specimen-specific pelvic FE model (source) was created using traditional segmentation methods and morphed onto a CT scan of a different (target) pelvis using a landmark-based method. The morphed model was then refined through mesh mapping by moving the nodes to the bone boundary. A second target model was created using traditional segmentation techniques. CT intensity based material properties were assigned to the morphed/mapped model and to the traditionally segmented target models. Models were analyzed to evaluate their geometric concurrency and strain patterns. Strains generated in a double-leg stance configuration were compared to experimental strain gauge data generated from the same target cadaver pelvis. CT landmark-based morphing and mapping techniques were efficiently applied to create a geometrically multifaceted specimen-specific pelvic FE model, which was similar to the traditionally segmented target model and better replicated the experimental strain results (R2 = 0.873). This study has shown that mesh morphing and mapping represents an efficient validated approach for pelvic FE model generation without the need for segmentation.
Specifications of insilicoML 1.0: a multilevel biophysical model description language.
Asai, Yoshiyuki; Suzuki, Yasuyuki; Kido, Yoshiyuki; Oka, Hideki; Heien, Eric; Nakanishi, Masao; Urai, Takahito; Hagihara, Kenichi; Kurachi, Yoshihisa; Nomura, Taishin
2008-12-01
An extensible markup language format, insilicoML (ISML), version 0.1, describing multi-level biophysical models has been developed and is available in the public domain. ISML is fully compatible with CellML 1.0, a model description standard developed by the IUPS Physiome Project, for enhancing knowledge integration and model sharing. This article illustrates the new specifications of ISML 1.0, which largely extend the capability of ISML 0.1. ISML 1.0 can describe various types of mathematical models, including ordinary/partial differential/difference equations representing the dynamics of physiological functions and the geometry of living organisms underlying the functions. ISML 1.0 describes a model using a set of functional elements (modules), each of which can specify mathematical expressions of the functions. Structural and logical relationships between any two modules are specified by edges, which allow modular, hierarchical, and/or network representations of the model. The role of edge relationships is enriched by keywords for use in constructing a physiological ontology. The ontology is further improved by the traceability of the history of a model's development and by linking different ISML models stored in the model database using meta-information. ISML 1.0 is designed to operate with a model database and integrated environments for model development and simulation, for knowledge integration and discovery.
Tong, Xiuli; McBride, Catherine
2017-07-01
Following a review of contemporary models of word-level processing for reading and their limitations, we propose a new hypothetical model of Chinese character reading, namely, the graded lexical space mapping model that characterizes how sublexical radicals and lexical information are involved in Chinese character reading development. The underlying assumption of this model is that Chinese character recognition is a process of competitive mappings of phonology, semantics, and orthography in both lexical and sublexical systems, operating as functions of statistical properties of print input based on the individual's specific level of reading. This model leads to several testable predictions concerning how the quasiregularity and continuity of Chinese-specific radicals are organized in memory for both child and adult readers at different developmental stages of reading.
NASA Technical Reports Server (NTRS)
Daly, J. K.; Torian, J. G.
1979-01-01
Software design specifications for developing environmental control and life support system (ECLSS) and electrical power system (EPS) programs into interactive computer programs are presented. Specifications for the ECLSS program are at the detail design level with respect to modification of an existing batch mode program. The FORTRAN environmental analysis routines (FEAR) are the subject batch mode program. The characteristics of the FEAR program are included for use in modifying batch mode programs to form interactive programs. The EPS program specifications are at the preliminary design level. Emphasis is on top-down structuring in the development of an interactive program.
Blat, Dan; Zigmond, Ehud; Alteber, Zoya; Waks, Tova; Eshhar, Zelig
2014-01-01
The adoptive transfer of regulatory T cells (Tregs) offers a promising strategy to combat pathologies that are characterized by aberrant immune activation, including graft rejection and autoinflammatory diseases. Expression of a chimeric antigen receptor (CAR) gene in Tregs redirects them to the site of autoimmune activity, thereby increasing their suppressive efficiency while avoiding systemic immunosuppression. Since carcinoembryonic antigen (CEA) has been shown to be overexpressed in both human colitis and colorectal cancer, we treated CEA-transgenic mice that were induced to develop colitis with CEA-specific CAR Tregs. Two disease models were employed: T-cell-transfer colitis as well as the azoxymethane–dextran sodium sulfate model for colitis-associated colorectal cancer. Systemically administered CEA-specific (but not control) CAR Tregs accumulated in the colons of diseased mice. In both model systems, CEA-specific CAR Tregs suppressed the severity of colitis compared to control Tregs. Moreover, in the azoxymethane–dextran sodium sulfate model, CEA-specific CAR Tregs significantly decreased the subsequent colorectal tumor burden. Our data demonstrate that CEA-specific CAR Tregs exhibit a promising potential in ameliorating ulcerative colitis and in hindering colorectal cancer development. Collectively, this study provides a proof of concept for the therapeutic potential of CAR Tregs in colitis patients as well as in other autoimmune inflammatory disorders. PMID:24686242
ERIC Educational Resources Information Center
Karimova, A. E.; Amanova, A. S.; Sadykova, A. M.; Kuzembaev, N. E.; Makisheva, A. T.; Kurmangazina, G. Zh.; Sakenov, Janat
2016-01-01
The article explores the significant problem of developing a theoretical model of professional competence development in dual-specialty students (on the example of the "History, Religious studies" specialty). In order to validate the specifics of the professional competence development in dual-specialty students (on the example of the…
Teachers Helping Teachers: A Professional Development Model That Promotes Teacher Leadership
ERIC Educational Resources Information Center
Ghamrawi, Norma
2013-01-01
This mixed methods study reports on the outcomes of a professional development model (PDM) developed by a K-12 private school in Beirut, Lebanon, after 3 years of its employment. Specifically, an evaluation of this PDM is provided with special emphasis on its potential of developing teacher leaders at school. The PDM embraces a constructivist…
Liu, Zitao; Hauskrecht, Milos
2017-11-01
Building an accurate predictive model of clinical time series for a patient is critical for understanding the patient's condition, its dynamics, and optimal patient management. Unfortunately, this process is not straightforward. First, patient-specific variations are typically large, and population-based models derived or learned from many different patients are often unable to support accurate predictions for each individual patient. Moreover, time series observed for one patient at any point in time may be too short and insufficient to learn a high-quality patient-specific model from the patient's own data alone. To address these problems we propose, develop, and experiment with a new adaptive forecasting framework for building multivariate clinical time series models for a patient and for supporting patient-specific predictions. The framework relies on an adaptive model switching approach that at any point in time selects the most promising time series model out of a pool of many possible models and, consequently, combines the advantages of population, patient-specific, and short-term individualized predictive models. We demonstrate that the adaptive model switching framework is a very promising approach to supporting personalized time series prediction, and that it is able to outperform predictions based on pure population and patient-specific models, as well as other patient-specific model adaptation strategies.
The Continuous Quality Improvement Book Club: Developing a Book Club to Promote Praxis
ERIC Educational Resources Information Center
Lyons, Becky; Ray, Chris
2014-01-01
This article poses a model for developing a book club to promote praxis. This model is built upon a basic four step framework for developing book clubs and includes specific recommendations to focus the book club on reflection of theory and how to incorporate it into practice. This model will be used to start a book club examining Continuous…
Automatic Dynamic Aircraft Modeler (ADAM) for the Computer Program NASTRAN
NASA Technical Reports Server (NTRS)
Griffis, H.
1985-01-01
Large general purpose finite element programs require users to develop large quantities of input data. General purpose pre-processors are used to decrease the effort required to develop structural models. Further reduction of effort can be achieved by application-specific pre-processors. Automatic Dynamic Aircraft Modeler (ADAM) is one such application-specific pre-processor. General purpose pre-processors use points, lines, and surfaces to describe geometric shapes. Because ADAM is used only for aircraft structures, generic structural sections, wing boxes and bodies, can be pre-defined. Hence, with only gross dimensions, thicknesses, material properties, and pre-defined boundary conditions, a complete model of an aircraft can be created.
A Data-Driven Framework for Incorporating New Tools for ...
This talk was given during the “Exposure-Based Toxicity Testing” session at the annual meeting of the International Society for Exposure Science. It provided an update on the state of the science and tools that may be employed in risk-based prioritization efforts. It outlined knowledge gained from the data provided using these high-throughput tools to assess chemical bioactivity and to predict chemical exposures and also identified future needs. It provided an opportunity to showcase ongoing research efforts within the National Exposure Research Laboratory and the National Center for Computational Toxicology within the Office of Research and Development to an international audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
NASA Astrophysics Data System (ADS)
Kuznetsova, Maria
The Community Coordinated Modeling Center (CCMC, http://ccmc.gsfc.nasa.gov) was established at the dawn of the new millennium as a long-term, flexible solution to the problem of transitioning progress in space environment modeling into operational space weather forecasting. CCMC hosts an expanding collection of state-of-the-art space weather models developed by the international space science community. Over the years the CCMC has acquired unique experience in preparing complex models and model chains for operational environments and in developing and maintaining custom displays and powerful web-based systems and tools ready to be used by researchers, space weather service providers, and decision makers. In support of the space weather needs of NASA users, CCMC is developing highly tailored applications and services that target specific orbits or locations in space, and is partnering with NASA mission specialists on linking CCMC space environment modeling with impacts on biological and technological systems in space. Confidence assessment of model predictions is an essential element of space environment modeling. CCMC facilitates interaction between model owners and users in defining physical parameters and metrics formats relevant to specific applications, and leads community efforts to quantify the ability of models to simulate and predict space environment events. Interactive on-line model validation systems developed at CCMC make validation a seamless part of the model development cycle. The talk will showcase innovative solutions for space weather research, validation, anomaly analysis, and forecasting, and review ongoing community-wide model validation initiatives enabled by CCMC applications.
Chylek, Lily A.; Harris, Leonard A.; Tung, Chang-Shung; Faeder, James R.; Lopez, Carlos F.
2013-01-01
Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and post-translational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). PMID:24123887
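The coarse graining described above, where every concrete reaction implied by a rule inherits that rule's rate law, can be sketched in a few lines. The molecule, its sites, and the rate constant below are hypothetical illustrations, not taken from any specific rule-based modeling tool.

```python
from itertools import product

# Hypothetical sketch: one binding rule implies many concrete reactions.
# A ligand L binds receptor R at a free binding site, regardless of R's two
# phosphorylation sites (p1, p2), so a single rule implies 4 reactions,
# all inheriting the same rate constant (the coarse graining noted above).

RATE = 1.0e6  # assumed illustrative rate constant, per M per s

def expand_binding_rule(rate):
    """Enumerate concrete reactions implied by the rule L + R(free) -> L.R."""
    reactions = []
    for p1, p2 in product([0, 1], repeat=2):  # all phosphorylation states
        reactant = f"R(site=free,p1={p1},p2={p2})"
        product_ = f"R(site=L,p1={p1},p2={p2})"
        reactions.append((("L", reactant), (product_,), rate))
    return reactions

rxns = expand_binding_rule(RATE)
print(len(rxns))  # 4 concrete reactions generated from one rule
```

With s independent sites the same rule would imply 2^s reactions, which is exactly the combinatorial growth that makes explicit network specification impractical for traditional chemical kinetics approaches.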
Fuel-efficient cruise performance model for general aviation piston engine airplanes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parkinson, R.C.H.
1982-01-01
The uses and limitations of typical Pilot Operating Handbook cruise performance data, for constructing cruise performance models suitable for maximizing specific range, are first examined. These data are found to be inadequate for constructing such models. A new model of General Aviation piston-prop airplane cruise performance is then developed. This model consists of two subsystem models: the airframe-propeller-atmosphere subsystem model; and the engine-atmosphere subsystem model. The new model facilitates maximizing specific range; and by virtue of its simplicity and low volume data storage requirements, appears suitable for airborne microprocessor implementation.
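As a rough sketch of what maximizing specific range involves (not the report's model), one can combine an assumed parabolic drag polar with a constant propeller efficiency and brake specific fuel consumption; every numerical value below is an invented round number.

```python
import numpy as np

# Illustrative sketch of maximizing specific range (distance per unit fuel)
# for a piston-prop airplane. All parameter values are assumed round
# numbers, not data from the report.

RHO = 1.0      # air density at cruise altitude, kg/m^3 (assumed)
S = 16.0       # wing area, m^2 (assumed)
CD0 = 0.025    # parasite drag coefficient (assumed)
K = 0.055      # induced drag factor (assumed)
W = 10_000.0   # airplane weight, N (assumed)
BSFC = 8.0e-8  # brake specific fuel consumption, kg/J (assumed)
ETA_P = 0.8    # propeller efficiency, assumed constant

def specific_range(v):
    """Distance flown per kg of fuel at true airspeed v (m/s)."""
    q = 0.5 * RHO * v**2
    cl = W / (q * S)                 # lift coefficient for level flight
    cd = CD0 + K * cl**2             # parabolic drag polar
    drag = q * S * cd
    power = drag * v / ETA_P         # shaft power required, W
    fuel_flow = BSFC * power         # kg/s
    return v / fuel_flow             # m per kg of fuel

v = np.linspace(30, 90, 601)
sr = specific_range(v)
v_best = v[np.argmax(sr)]
print(f"best-range speed ~ {v_best:.1f} m/s")
```

Under these constant-efficiency assumptions the optimum coincides with the minimum-drag speed; the value of a fuller two-subsystem model like the one described above is that it lets propeller efficiency and fuel consumption vary with the operating condition.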
A stochastic model for tumor geometry evolution during radiation therapy in cervical cancer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Yifang; Lee, Chi-Guhn; Chan, Timothy C. Y., E-mail: tcychan@mie.utoronto.ca
2014-02-15
Purpose: To develop mathematical models to predict the evolution of tumor geometry in cervical cancer undergoing radiation therapy. Methods: The authors develop two mathematical models to estimate tumor geometry change: a Markov model and an isomorphic shrinkage model. The Markov model describes tumor evolution by investigating the change in state (either tumor or nontumor) of voxels on the tumor surface. It assumes that the evolution follows a Markov process. Transition probabilities are obtained using maximum likelihood estimation and depend on the states of neighboring voxels. The isomorphic shrinkage model describes tumor shrinkage or growth in terms of layers of voxels on the tumor surface, instead of modeling individual voxels. The two proposed models were applied to data from 29 cervical cancer patients treated at Princess Margaret Cancer Centre and then compared to a constant volume approach. Model performance was measured using sensitivity and specificity. Results: The Markov model outperformed both the isomorphic shrinkage and constant volume models in terms of the trade-off between sensitivity (target coverage) and specificity (normal tissue sparing). Generally, the Markov model achieved a few percentage points in improvement in either sensitivity or specificity compared to the other models. The isomorphic shrinkage model was comparable to the Markov approach under certain parameter settings. Convex tumor shapes were easier to predict. Conclusions: By modeling tumor geometry change at the voxel level using a probabilistic model, improvements in target coverage and normal tissue sparing are possible. Our Markov model is flexible and has tunable parameters to adjust model performance to meet a range of criteria. Such a model may support the development of an adaptive paradigm for radiation therapy of cervical cancer.
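A minimal one-dimensional sketch of the voxel-level Markov idea: each surface voxel is tumor or non-tumor, and its chance of clearing at the next treatment fraction grows with the number of already-cleared neighbors. The transition probabilities below are assumed illustrative values, not the fitted ones from the study.

```python
import random

# 1-D toy version of the voxel-state Markov model: a ring of 50 surface
# voxels, each tumor (1) or non-tumor (0). Per step, a tumor voxel clears
# with probability p_base plus a bonus per already-cleared neighbor.
# All probabilities here are invented for illustration.

def step(state, rng, p_base=0.05, p_neighbor=0.15):
    n = len(state)
    nxt = list(state)
    for i, s in enumerate(state):
        if s == 1:
            cleared = (1 - state[(i - 1) % n]) + (1 - state[(i + 1) % n])
            p = p_base + p_neighbor * cleared  # neighbor-dependent transition
            if rng.random() < p:
                nxt[i] = 0
    return nxt

state = [1] * 50                 # start with a fully tumorous surface ring
rng = random.Random(42)
for _ in range(20):              # simulate 20 treatment fractions
    state = step(state, rng)
print(sum(state), "tumor voxels remain of 50")
```

Because transitions here only go from tumor to non-tumor, this toy captures shrinkage only; the actual model estimates neighbor-dependent transition probabilities in both directions by maximum likelihood.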
Proceedings of the Workshop on Government Oil Spill Modeling
NASA Technical Reports Server (NTRS)
Bishop, J. M. (Compiler)
1980-01-01
Oil spill model users and modelers were brought together for the purpose of fostering joint communication and increasing understanding of mutual problems. The workshop concentrated on defining user needs, presentations on ongoing modeling programs, and discussions of supporting research for these modeling efforts. Specific user recommendations include the development of an oil spill model user library which identifies and describes available models. The development of models for the long-term fate and effect of spilled oil was examined.
A Practical Skills Model for Effectively Engaging Clients in Multicultural Settings
ERIC Educational Resources Information Center
Alberta, Anthony J.; Wood, Anita H.
2009-01-01
The Practical Skills Model of Multicultural Engagement represents an attempt to create a means for moving beyond the development of knowledge and awareness into the development of skills that will assist practitioners to practice in a culturally competent manner. The model builds on basic counseling skills, combining them with specific approaches…
ERIC Educational Resources Information Center
Zangori, Laura; Forbes, Cory T.
2016-01-01
To develop scientific literacy, elementary students should engage in knowledge building of core concepts through scientific practice (Duschl, Schweingruber, & Shouse, 2007). A core scientific practice is engagement in scientific modeling to build conceptual understanding about discipline-specific concepts. Yet scientific modeling remains…
A new physically-based windblown dust emission ...
Dust has significant impacts on weather and climate, air quality and visibility, and human health; therefore, it is important to include a windblown dust emission module in atmospheric and air quality models. In this presentation, we summarize our efforts in the development of a physics-based windblown dust emission scheme and its implementation in the CMAQ modeling system. The new model incorporates the effects of surface wind speed, soil texture, soil moisture, and surface roughness in a physically sound manner. Specifically, a newly developed dynamic relation for the surface roughness length in this model is believed to adequately represent the physics of the surface processes involved in dust generation. Furthermore, careful attention is paid to integrating the new windblown dust module within CMAQ to ensure that the required input parameters are correctly configured. The new model is evaluated for case studies covering the continental United States and the Northern hemisphere, and is shown to be able to capture the occurrence of dust outbreaks and the level of soil concentration. We discuss the uncertainties and limitations of the model and briefly describe our path forward for further improvements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Frank; Dennis, John; MacCready, Parker
This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation. The main computational objectives were: 1. To develop computationally efficient, but physically based, parameterizations of estuary and continental shelf mixing processes for use in an Earth System Model (CESM). 2. To develop a two-way nested regional modeling framework in order to dynamically downscale the climate response of particular coastal ocean regions and to upscale the impact of the regional coastal processes to the global climate in an Earth System Model (CESM). 3. To develop computational infrastructure to enhance the efficiency of data transfer between specific sources and destinations, i.e., a point-to-point communication capability, (used in objective 1) within POP, the ocean component of CESM.
Prostate Cancer Risk Prediction Models
Developing statistical models that estimate the probability of developing prostate cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.
Bladder Cancer Risk Prediction Models
Developing statistical models that estimate the probability of developing bladder cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.
Ovarian Cancer Risk Prediction Models
Developing statistical models that estimate the probability of developing ovarian cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.
Pancreatic Cancer Risk Prediction Models
Developing statistical models that estimate the probability of developing pancreatic cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.
Breast Cancer Risk Prediction Models
Developing statistical models that estimate the probability of developing breast cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.
Esophageal Cancer Risk Prediction Models
Developing statistical models that estimate the probability of developing esophageal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.
Cervical Cancer Risk Prediction Models
Developing statistical models that estimate the probability of developing cervical cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.
Liver Cancer Risk Prediction Models
Developing statistical models that estimate the probability of developing liver cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.
DEVELOPING SITE-SPECIFIC MODELS FOR FORECASTING BACTERIA LEVELS AT COASTAL BEACHES
The U.S.Beaches Environmental Assessment and Coastal Health Act of 2000 authorizes studies of pathogen indicators in coastal recreation waters that develop appropriate, accurate, expeditious, and cost-effective methods (including predictive models) for quantifying pathogens in co...
Lung Cancer Risk Prediction Models
Developing statistical models that estimate the probability of developing lung cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.
DIAGNOSTIC TOOL DEVELOPMENT AND APPLICATION THROUGH REGIONAL CASE STUDIES
Case studies are a useful vehicle for developing and testing conceptual models, classification systems, diagnostic tools and models, and stressor-response relationships. Furthermore, case studies focused on specific places or issues of interest to the Agency provide an excellent ...
Colorectal Cancer Risk Prediction Models
Developing statistical models that estimate the probability of developing colorectal cancer over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.
Development and fabrication of the Virginia skid-resistance measurement vehicle (model 2).
DOT National Transportation Integrated Search
1970-01-01
The inefficiency of the Virginia Highway Research Council, Model 1, skid measurement trailer, and the increasing effort expended by the American Society for Testing and Materials toward the development of more stringent specifications for pavement sk...
Martini, Alberto; Gupta, Akriti; Lewis, Sara C; Cumarasamy, Shivaram; Haines, Kenneth G; Briganti, Alberto; Montorsi, Francesco; Tewari, Ashutosh K
2018-04-19
To develop a nomogram for predicting side-specific extracapsular extension (ECE) for planning nerve-sparing radical prostatectomy. We retrospectively analysed data from 561 patients who underwent robot-assisted radical prostatectomy between February 2014 and October 2015. To develop a side-specific predictive model, we considered the prostatic lobes separately. Four variables were included: prostate-specific antigen; highest ipsilateral biopsy Gleason grade; highest ipsilateral percentage core involvement; and ECE on multiparametric magnetic resonance imaging (mpMRI). A multivariable logistic regression analysis was fitted to predict side-specific ECE. A nomogram was built based on the coefficients of the logit function. Internal validation was performed using 'leave-one-out' cross-validation. Calibration was graphically investigated. The decision curve analysis was used to evaluate the net clinical benefit. The study population consisted of 829 side-specific cases, after excluding negative biopsy observations (n = 293). ECE was reported on mpMRI and final pathology in 115 (14%) and 142 (17.1%) cases, respectively. Among these, mpMRI was able to predict ECE correctly in 57 (40.1%) cases. All variables in the model except highest percentage core involvement were predictors of ECE (all P ≤ 0.006). All variables were considered for inclusion in the nomogram. After internal validation, the area under the curve was 82.11%. The model demonstrated excellent calibration and improved clinical risk prediction, especially when compared with relying on mpMRI prediction of ECE alone. When the nomogram-derived probability was retrospectively applied with a 20% threshold for performing nerve-sparing, nine out of 14 positive surgical margins (PSMs) at the site of ECE fell above the threshold.
We developed an easy-to-use model for the prediction of side-specific ECE, and hope it serves as a tool for planning nerve-sparing radical prostatectomy and in the reduction of PSM in future series. © 2018 The Authors BJU International © 2018 BJU International Published by John Wiley & Sons Ltd.
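A nomogram of this kind reduces to a logistic model over the four predictors. The sketch below illustrates only the mechanics; the coefficients and the patient values are invented for illustration and are not the published model.

```python
import math

# Hypothetical sketch of how a side-specific nomogram turns the four
# predictors into a probability via a logit. The coefficients below are
# invented illustrative values, NOT the fitted coefficients of the study.

COEF = {"intercept": -4.0, "psa": 0.08, "gleason_grade": 0.55,
        "pct_core": 0.02, "mri_ece": 1.6}

def ece_probability(psa, gleason_grade, pct_core, mri_ece):
    """Probability of side-specific ECE from a logistic model."""
    z = (COEF["intercept"]
         + COEF["psa"] * psa
         + COEF["gleason_grade"] * gleason_grade
         + COEF["pct_core"] * pct_core
         + COEF["mri_ece"] * (1 if mri_ece else 0))
    return 1.0 / (1.0 + math.exp(-z))

# hypothetical patient: PSA 8 ng/mL, grade group 3, 40% core involvement,
# ECE suspected on mpMRI
p = ece_probability(psa=8.0, gleason_grade=3, pct_core=40, mri_ece=True)
print(f"predicted side-specific ECE risk: {p:.2f}")
```

In use, such a probability would be compared against a decision threshold, for example, performing nerve-sparing on that side only if the predicted risk stays below 20%, as in the threshold examined above.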
The Verification-based Analysis of Reliable Multicast Protocol
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1996-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
Technology Development Risk Assessment for Space Transportation Systems
NASA Technical Reports Server (NTRS)
Mathias, Donovan L.; Godsell, Aga M.; Go, Susie
2006-01-01
A new approach for assessing development risk associated with technology development projects is presented. The method represents technology evolution in terms of sector-specific discrete development stages. A Monte Carlo simulation is used to generate development probability distributions based on statistical models of the discrete transitions. Development risk is derived from the resulting probability distributions and specific program requirements. Two sample cases are discussed to illustrate the approach, a single rocket engine development and a three-technology space transportation portfolio.
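The Monte Carlo idea above can be sketched simply: a technology advances through discrete development stages, each with its own per-year advance probability, and development risk is the probability of missing a program deadline. The stage probabilities and the 10-year requirement below are assumed illustrative inputs, not values from the paper.

```python
import random

# Sketch of Monte Carlo development-risk estimation. A technology passes
# through 4 discrete stages; each year it advances out of its current
# stage with a stage-specific probability. All numbers are assumed.

ADVANCE_P = [0.6, 0.5, 0.4, 0.3]  # per-year advance probability per stage
REQUIREMENT_YEARS = 10            # assumed program requirement

def years_to_mature(rng):
    """Sample the total years needed to clear all development stages."""
    years = 0
    for p in ADVANCE_P:
        while True:               # geometric waiting time in each stage
            years += 1
            if rng.random() < p:
                break
    return years

rng = random.Random(1)
trials = [years_to_mature(rng) for _ in range(20_000)]
risk = sum(t > REQUIREMENT_YEARS for t in trials) / len(trials)
print(f"P(not ready within {REQUIREMENT_YEARS} yr) ~ {risk:.2f}")
```

Development risk then follows from comparing the simulated distribution against the program requirement; a portfolio version would simulate several such technologies jointly.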
ERIC Educational Resources Information Center
Miller, Robin E.
2011-01-01
Communities of practice offer reference librarians a conceptual model through which to develop and maintain general and subject specific knowledge. Reference librarians acquire general and subject-specific knowledge in many ways, sometimes independently and sometimes collaboratively. Applying the concept of the "community of practice" to reference…
Collective (Team) Learning Process Models: A Conceptual Review
ERIC Educational Resources Information Center
Knapp, Randall
2010-01-01
Teams have become a key resource for learning and accomplishing work in organizations. The development of collective learning in specific contexts is not well understood, yet has become critical to organizational success. The purpose of this conceptual review is to inform human resource development (HRD) practice about specific team behaviors and…
A modeling approach to compare ΣPCB concentrations between congener-specific analyses
Gibson, Polly P.; Mills, Marc A.; Kraus, Johanna M.; Walters, David M.
2017-01-01
Changes in analytical methods over time pose problems for assessing long-term trends in environmental contamination by polychlorinated biphenyls (PCBs). Congener-specific analyses vary widely in the number and identity of the 209 distinct PCB chemical configurations (congeners) that are quantified, leading to inconsistencies among summed PCB concentrations (ΣPCB) reported by different studies. Here we present a modeling approach using linear regression to compare ΣPCB concentrations derived from different congener-specific analyses measuring different co-eluting groups. The approach can be used to develop a specific conversion model between any two sets of congener-specific analytical data from similar samples (similar matrix and geographic origin). We demonstrate the method by developing a conversion model for an example data set that includes data from two different analytical methods, a low resolution method quantifying 119 congeners and a high resolution method quantifying all 209 congeners. We used the model to show that the 119-congener set captured most (93%) of the total PCB concentration (i.e., Σ209PCB) in sediment and biological samples. ΣPCB concentrations estimated using the model closely matched measured values (mean relative percent difference = 9.6). General applications of the modeling approach include (a) generating comparable ΣPCB concentrations for samples that were analyzed for different congener sets; and (b) estimating the proportional contribution of different congener sets to ΣPCB. This approach may be especially valuable for enabling comparison of long-term remediation monitoring results even as analytical methods change over time.
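The core of such a conversion model is a regression between paired sums from the two analytical methods. The sketch below generates synthetic paired data, assuming, as the example above found, that the partial congener set captures about 93% of the total, and fits the conversion on a log scale; the data are invented, not from the study.

```python
import numpy as np

# Sketch of the conversion-model idea: regress full-method totals on
# partial-method sums from paired samples, then convert new measurements.
# The synthetic data below assume the 119-congener set captures ~93% of
# the 209-congener total, with 5% measurement noise (both assumed).

rng = np.random.default_rng(0)
sigma209 = rng.lognormal(mean=3.0, sigma=1.0, size=60)   # full totals
sigma119 = 0.93 * sigma209 * rng.normal(1.0, 0.05, 60)   # noisy partial sums

# fit on the log scale, since concentrations span orders of magnitude
slope, intercept = np.polyfit(np.log(sigma119), np.log(sigma209), 1)

def convert(partial_sum):
    """Estimate the full 209-congener total from a 119-congener sum."""
    return float(np.exp(intercept + slope * np.log(partial_sum)))

est = convert(sigma119[0])
rel_err = abs(est - sigma209[0]) / sigma209[0] * 100
print(f"relative error on one sample: {rel_err:.1f}%")
```

The same recipe applies between any two congener sets, provided the paired samples share a similar matrix and geographic origin, as the approach above requires.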
Multiscale Mathematics for Biomass Conversion to Renewable Hydrogen
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plechac, Petr; Vlachos, Dionisios; Katsoulakis, Markos
2013-09-05
The overall objective of this project is to develop multiscale models for understanding and eventually designing complex processes for renewables. To the best of our knowledge, our work is the first attempt at modeling complex reacting systems, whose performance relies on underlying multiscale mathematics. Our specific application lies at the heart of biofuels initiatives of DOE and entails modeling of catalytic systems, to enable economic, environmentally benign, and efficient conversion of biomass into either hydrogen or valuable chemicals. Specific goals include: (i) Development of rigorous spatio-temporal coarse-grained kinetic Monte Carlo (KMC) mathematics and simulation for microscopic processes encountered in biomass transformation. (ii) Development of hybrid multiscale simulation that links stochastic simulation to a deterministic partial differential equation (PDE) model for an entire reactor. (iii) Development of hybrid multiscale simulation that links KMC simulation with quantum density functional theory (DFT) calculations. (iv) Development of parallelization of models of (i)-(iii) to take advantage of Petaflop computing and enable real world applications of complex, multiscale models. In this NCE period, we continued addressing these objectives and completed the proposed work. Main initiatives, key results, and activities are outlined.
USDA-ARS?s Scientific Manuscript database
A monoclonal antibody (MAb) against 4-(diethoxyphosphorothioyloxy)benzoic acid (hapten 1) was raised and used to develop a broad-specificity competitive indirect enzyme-linked immunosorbent assay (ciELISA) for 14 O,O-diethyl organophosphorus pesticides (OPs). Computer-assisted molecular modeling was...
A Rotational Blended Learning Model: Enhancement and Quality Assurance
ERIC Educational Resources Information Center
Ghoul, Said
2013-01-01
Research on blended learning theory and practice is growing nowadays with a focus on the development, evaluation, and quality assurance of case studies. However, the enhancement of blended learning existing models, the specification of their online parts, and the quality assurance related specifically to them have not received enough attention.…
Chemical structure determines target organ carcinogenesis in rats
Carrasquer, C. A.; Malik, N.; States, G.; Qamar, S.; Cunningham, S.L.; Cunningham, A.R.
2012-01-01
SAR models were developed for 12 rat tumour sites using data derived from the Carcinogenic Potency Database. Essentially, the models fall into two categories: Target Site Carcinogen – Non-Carcinogen (TSC-NC) and Target Site Carcinogen – Non-Target Site Carcinogen (TSC-NTSC). The TSC-NC models were composed of active chemicals that were carcinogenic to a specific target site and inactive ones that were whole animal non-carcinogens. On the other hand, the TSC-NTSC models used an inactive category also composed of carcinogens, but to any/all sites other than the target site. Leave-one-out validation produced an overall average concordance value for all 12 models of 0.77 for the TSC-NC models and 0.73 for the TSC-NTSC models. Overall, these findings suggest that while the TSC-NC models are able to distinguish between carcinogens and non-carcinogens, the TSC-NTSC models are identifying structural attributes that associate carcinogens with specific tumour sites. Since the TSC-NTSC models are composed of active and inactive compounds that are genotoxic and non-genotoxic carcinogens, the TSC-NTSC models may be capable of deciphering non-genotoxic mechanisms of carcinogenesis. Together, models of this type may also prove useful in anticancer drug development, since they essentially contain chemical moieties that target specific tumour sites. PMID:23066888
NASA Technical Reports Server (NTRS)
Parkinson, R. C. H.
1983-01-01
A fuel-efficient cruise performance model which facilitates maximizing the specific range of General Aviation airplanes powered by spark-ignition piston engines and propellers is presented. Airplanes of fixed design only are considered. The uses and limitations of typical Pilot Operating Handbook cruise performance data, for constructing cruise performance models suitable for maximizing specific range, are first examined. These data are found to be inadequate for constructing such models. A new model of General Aviation piston-prop airplane cruise performance is then developed. This model consists of two subsystem models: the airframe-propeller-atmosphere subsystem model; and the engine-atmosphere subsystem model. The new model facilitates maximizing specific range; and by virtue of its simplicity and low volume data storage requirements, appears suitable for airborne microprocessor implementation.
Automated real time constant-specificity surveillance for disease outbreaks.
Wieland, Shannon C; Brownstein, John S; Berger, Bonnie; Mandl, Kenneth D
2007-06-13
For real time surveillance, detection of abnormal disease patterns is based on a difference between patterns observed, and those predicted by models of historical data. The usefulness of outbreak detection strategies depends on their specificity; the false alarm rate affects the interpretation of alarms. We evaluate the specificity of five traditional models: autoregressive, Serfling, trimmed seasonal, wavelet-based, and generalized linear. We apply each to 12 years of emergency department visits for respiratory infection syndromes at a pediatric hospital, finding that the specificity of the five models was almost always a non-constant function of the day of the week, month, and year of the study (p < 0.05). We develop an outbreak detection method, called the expectation-variance model, based on generalized additive modeling to achieve a constant specificity by accounting for not only the expected number of visits, but also the variance of the number of visits. The expectation-variance model achieves constant specificity on all three time scales, as well as earlier detection and improved sensitivity compared to traditional methods in most circumstances. Modeling the variance of visit patterns enables real-time detection with known, constant specificity at all times. With constant specificity, public health practitioners can better interpret the alarms and better evaluate the cost-effectiveness of surveillance systems.
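The core of the expectation-variance idea described above can be sketched in a few lines: if a model supplies both the expected visit count and its variance for each day, a fixed z-score threshold yields the same model-implied false-alarm rate every day. The sketch below assumes approximately Gaussian daily counts; all numbers are illustrative assumptions, not the paper's data or method details.

```python
# Minimal sketch of the expectation-variance idea: flag a day as an
# alarm when the observed count exceeds mu + z * sigma, where mu and
# sigma come from a model of that day's expected visits and their
# variance. With z fixed, the false-alarm rate (1 - specificity) is
# constant across days under the model, despite day-of-week effects.
from statistics import NormalDist

def alarm(observed, mu, sigma, specificity=0.95):
    z = NormalDist().inv_cdf(specificity)   # about 1.645 for 95% specificity
    return observed > mu + z * sigma

# Weekdays are busier *and* more variable than weekends in this toy model.
print(alarm(observed=120, mu=100, sigma=15))  # False: within weekday noise
print(alarm(observed=45,  mu=30,  sigma=6))   # True: large for a weekend
```

A threshold calibrated to the mean alone (ignoring sigma) would over-alarm on high-variance days and under-alarm on quiet ones, which is the non-constant specificity the study documents in the traditional models.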
Event-based total suspended sediment particle size distribution model
NASA Astrophysics Data System (ADS)
Thompson, Jennifer; Sattar, Ahmed M. A.; Gharabaghi, Bahram; Warner, Richard C.
2016-05-01
One of the most challenging modelling tasks in hydrology is prediction of the total suspended sediment particle size distribution (TSS-PSD) in stormwater runoff generated from exposed soil surfaces at active construction sites and surface mining operations. The main objective of this study is to employ gene expression programming (GEP) and artificial neural networks (ANN) to develop a new model with the ability to more accurately predict the TSS-PSD by taking advantage of both event-specific and site-specific factors in the model. To compile the data for this study, laboratory scale experiments using rainfall simulators were conducted on fourteen different soils to obtain TSS-PSD. This data is supplemented with field data from three construction sites in Ontario over a period of two years to capture the effect of transport and deposition within the site. The combined data sets provide a wide range of key overlooked site-specific and storm event-specific factors. Both parent soil and TSS-PSD in runoff are quantified by fitting each to a lognormal distribution. Compared to existing regression models, the developed model more accurately predicted the TSS-PSD using a more comprehensive list of key model input parameters. Employment of the new model will increase the efficiency of deployment of required best management practices, designed based on TSS-PSD, to minimize potential adverse effects of construction site runoff on aquatic life in the receiving watercourses.
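The abstract notes that both the parent soil and the runoff TSS-PSD are quantified by fitting a lognormal distribution. A minimal way to do that is to estimate the mean and standard deviation of log-diameter, giving the geometric mean and geometric standard deviation that characterize the fitted curve. The sample diameters below are invented for illustration and do not come from the study.

```python
# Sketch: fit a lognormal particle-size distribution by computing the
# mean and standard deviation of log-diameter. Sample diameters are
# hypothetical, not the study's laboratory or field data.
import math
import statistics

def fit_lognormal(diameters_um):
    """Return (geometric mean, geometric standard deviation) of diameters."""
    logs = [math.log(d) for d in diameters_um]
    mu = statistics.fmean(logs)       # mean of log-diameter
    sigma = statistics.stdev(logs)    # sample std dev of log-diameter
    return math.exp(mu), math.exp(sigma)

gm, gsd = fit_lognormal([2.0, 5.0, 8.0, 12.0, 20.0, 35.0, 60.0])
print(round(gm, 1), round(gsd, 2))  # geometric mean (um) and geometric SD
```

The two fitted parameters per distribution are what a regression or machine-learning model (such as the GEP/ANN model above) would then predict from site- and storm-specific inputs.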
Development and Characterization of High-Efficiency, High-Specific Impulse Xenon Hall Thrusters
NASA Technical Reports Server (NTRS)
Hofer, Richard R.; Jacobson, David (Technical Monitor)
2004-01-01
This dissertation presents research aimed at extending the efficient operation of 1600 s specific impulse Hall thruster technology to the 2000 to 3000 s range. Motivated by previous industry efforts and mission studies, the aim of this research was to develop and characterize xenon Hall thrusters capable of both high-specific impulse and high-efficiency operation. During the development phase, the laboratory-model NASA-173M Hall thrusters were designed and their performance and plasma characteristics were evaluated. Experiments with the NASA-173M version 1 (v1) validated the plasma lens magnetic field design. Experiments with the NASA-173M version 2 (v2) showed there was a minimum current density and optimum magnetic field topography at which efficiency monotonically increased with voltage. Comparison of the thrusters showed that efficiency can be optimized for specific impulse by varying the plasma lens. During the characterization phase, additional plasma properties of the NASA-173Mv2 were measured and a performance model was derived. Results from the model and experimental data showed how efficient operation at high-specific impulse was enabled through regulation of the electron current with the magnetic field. The electron Hall parameter was approximately constant with voltage, which confirmed efficient operation can be realized only over a limited range of Hall parameters.
New Space Weather Systems Under Development and Their Contribution to Space Weather Management
NASA Astrophysics Data System (ADS)
Tobiska, W.; Bouwer, D.; Schunk, R.; Garrett, H.; Mertens, C.; Bowman, B.
2008-12-01
There have been notable successes during the past decade in the development of operational space environment systems. Examples include the Magnetospheric Specification Model (MSM) of the Earth's magnetosphere, 2000; SOLAR2000 (S2K) solar spectral irradiances, 2001; High Accuracy Satellite Drag Model (HASDM) neutral atmosphere densities, 2004; Global Assimilation of Ionospheric Measurements (GAIM) ionosphere specification, 2006; Hakamada-Akasofu-Fry (HAF) solar wind parameters, 2007; Communication Alert and Prediction System (CAPS) ionosphere, high frequency radio, and scintillation S4 index prediction, 2008; and GEO Alert and Prediction System (GAPS) geosynchronous environment satellite charging specification and forecast, 2008. Systems in active operational implementation include the Jacchia-Bowman 2006/2008 (JB2006/2008) neutral atmosphere, 2009, and the Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) aviation radiation model using the Radiation Alert and Prediction System (RAPS), 2010. U.S. national agency and commercial assets will soon reach a state where specification and prediction will become ubiquitous and where coordinated management of the space environment and space weather will become a necessity. We describe the status of the CAPS, GAPS, RAPS, and JB2008 operational development. We additionally discuss the conditions that are laying the groundwork for space weather management and estimate the unfilled needs as we move beyond specification and prediction efforts.
Henkes, Luiz E; Davis, John S; Rueda, Bo R
2003-11-10
The corpus luteum is a unique organ, which is transitory in nature. The development, maintenance and regression of the corpus luteum are regulated by endocrine, paracrine and autocrine signaling events. Defining the specific mediators of luteal development, maintenance and regression has been difficult and often perplexing due to the complexity that stems from the variety of cell types that make up the luteal tissue. Moreover, some regulators may serve dual functions as a luteotropic and luteolytic agent depending on the temporal and spatial environment in which they are expressed. As a result, some confusion is present in the interpretation of in vitro and in vivo studies. More recently investigators have utilized mutant mouse models to define the functional significance of specific gene products. The goal of this mini-review is to identify and discuss mutant mouse models that have luteal anomalies, which may provide some clues as to the significance of specific regulators of corpus luteum function.
NASA Astrophysics Data System (ADS)
Trugman, A. T.; Fenton, N.; Bergeron, Y.; Xu, X.; Welp, L.; Medvigy, D.
2015-12-01
Soil organic layer dynamics strongly affect boreal forest development after fire. Field studies show that soil organic layer thickness exerts a species-specific control on propagule establishment in the North American boreal forest. On organic soils thicker than a few centimeters, all propagules are less able to recruit, but broadleaf trees recruit less effectively than needleleaf trees. In turn, forest growth controls organic layer accumulation through modulating litter input and litter quality. These dynamics have not been fully incorporated into models, but may be essential for accurate projections of ecosystem carbon storage. Here, we develop a data-constrained model for understanding boreal forest development after fire. We update the ED2 model to include new aspen and black spruce species-types, species-specific propagule survivorship dependent on soil organic layer depth, species-specific litter decay rates, dynamically accumulating moss and soil organic layers, and nitrogen fixation by cyanobacteria associated with moss. The model is validated against diverse observations ranging from monthly to centennial timescales and spanning a climate gradient in Alaska, central Canada, and Quebec. We then quantify differences in forest development that result from changes in organic layer accumulation, temperature, and nitrogen. 
We find that (1) the model accurately reproduces a range of observations throughout the North American boreal forest; (2) the presence of a thick organic layer results in decreased decomposition and decreased aboveground productivity, effects that can increase or decrease ecosystem carbon uptake depending on location-specific attributes; (3) with a mean warming of 4°C, some forests switch from undergoing succession to needleleaf forests to recruiting multiple cohorts of broadleaf trees, decreasing ecosystem carbon accumulation by ~30% after 300 years; (4) the availability of nitrogen regulates successional dynamics such that broadleaf species are less able to compete with needleleaf trees under low nitrogen regimes. We conclude that joint regulation by the soil organic layer, temperature, and nitrogen will likely play an important role in influencing boreal forest development after fire in future climates, and should be represented in models.
Asquith, William H.; Roussel, Meghan C.
2007-01-01
Estimation of representative hydrographs from design storms, which are known as design hydrographs, provides for cost-effective, risk-mitigated design of drainage structures such as bridges, culverts, roadways, and other infrastructure. During 2001–07, the U.S. Geological Survey (USGS), in cooperation with the Texas Department of Transportation, investigated runoff hydrographs, design storms, unit hydrographs, and watershed-loss models to enhance design hydrograph estimation in Texas. Design hydrographs ideally should mimic the general volume, peak, and shape of observed runoff hydrographs. Design hydrographs commonly are estimated in part by unit hydrographs. A unit hydrograph is defined as the runoff hydrograph that results from a unit pulse of excess rainfall uniformly distributed over the watershed at a constant rate for a specific duration. A time-distributed, watershed-loss model is required for modeling by unit hydrographs. This report develops a specific time-distributed, watershed-loss model known as an initial-abstraction, constant-loss model. For this watershed-loss model, a watershed is conceptualized to have the capacity to store or abstract an absolute depth of rainfall at and near the beginning of a storm. Depths of total rainfall less than this initial abstraction do not produce runoff. The watershed also is conceptualized to have the capacity to remove rainfall at a constant rate (loss) after the initial abstraction is satisfied. Additional rainfall inputs after the initial abstraction is satisfied contribute to runoff if the rainfall rate (intensity) is larger than the constant loss. The initial-abstraction, constant-loss model thus is a two-parameter model. The initial-abstraction, constant-loss model is investigated through detailed computational and statistical analysis of observed rainfall and runoff data for 92 USGS streamflow-gaging stations (watersheds) in Texas with contributing drainage areas from 0.26 to 166 square miles.
The analysis is limited to a previously described, watershed-specific, gamma distribution model of the unit hydrograph. In particular, the initial-abstraction, constant-loss model is tuned to the gamma distribution model of the unit hydrograph. A complex computational analysis of observed rainfall and runoff for the 92 watersheds was done to determine, by storm, optimal values of initial abstraction and constant loss. Optimal parameter values for a given storm were defined as those values that produced a modeled runoff hydrograph with volume equal to the observed runoff hydrograph and also minimized the residual sum of squares of the two hydrographs. Subsequently, the means of the optimal parameters were computed on a watershed-specific basis. These means for each watershed are considered the most representative, are tabulated, and are used in further statistical analyses. Statistical analyses of watershed-specific, initial abstraction and constant loss include documentation of the distribution of each parameter using the generalized lambda distribution. The analyses show that watershed development has substantial influence on initial abstraction and limited influence on constant loss. The means and medians of the 92 watershed-specific parameters are tabulated with respect to watershed development; although they have considerable uncertainty, these parameters can be used for parameter prediction for ungaged watersheds. The statistical analyses of watershed-specific, initial abstraction and constant loss also include development of predictive procedures for estimation of each parameter for ungaged watersheds. Both regression equations and regression trees for estimation of initial abstraction and constant loss are provided. 
The watershed characteristics included in the regression analyses are (1) main-channel length, (2) a binary factor representing watershed development, (3) a binary factor representing watersheds with an abundance of rocky and thin-soiled terrain, and (4) curve number.
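The initial-abstraction, constant-loss conceptualization described above maps directly to a short computation: rainfall first fills the initial abstraction, and afterwards only the intensity in excess of the constant loss rate produces runoff. The sketch below implements that two-parameter rule; the parameter values and rainfall series are hypothetical, not the report's tabulated Texas values.

```python
# Sketch of the initial-abstraction, constant-loss watershed model:
# rainfall first fills an initial abstraction Ia; afterwards, excess
# (runoff-producing) rainfall is whatever exceeds a constant loss rate.
# Parameter values are hypothetical illustrations only.

def excess_rainfall(intensities, dt, ia, loss_rate):
    """intensities: rainfall rates per step (in/hr); dt: step length (hr);
    ia: initial abstraction (in); loss_rate: constant loss (in/hr).
    Returns excess rainfall depth per step (in)."""
    remaining_ia = ia
    excess = []
    for p in intensities:
        depth = p * dt
        absorbed = min(depth, remaining_ia)   # satisfy the abstraction first
        remaining_ia -= absorbed
        effective = (depth - absorbed) / dt   # rate left after abstraction
        excess.append(max(0.0, effective - loss_rate) * dt)
    return excess

# Four half-hour steps at 1 in/hr, Ia = 0.5 in, constant loss 0.4 in/hr:
print(excess_rainfall([1.0, 1.0, 1.0, 1.0], dt=0.5, ia=0.5, loss_rate=0.4))
# First step is fully abstracted; later steps each yield 0.3 in of excess.
```

Convolving the resulting excess-rainfall series with a unit hydrograph (the gamma-distribution form used in the report) would then give the modeled runoff hydrograph.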
Predictive Software Cost Model Study. Volume I. Final Technical Report.
1980-06-01
development phase to identify computer resources necessary to support computer programs after transfer of program management responsibility and system... classical model development with refinements specifically applicable to avionics systems. The refinements are the result of the Phase I literature search
Bian, Yue-Hong; Xu, Cheng; Li, Junling; Xu, Jin; Zhang, Hongwei; Du, Shao Jun
2011-08-01
Hemojuvelin, also known as RGMc, is encoded by the hfe2 gene that plays an important role in iron homeostasis. hfe2 is specifically expressed in the notochord, developing somite and skeletal muscles during development. The molecular regulation of hfe2 expression is, however, not clear. We report here the characterization of hfe2 gene expression and the regulation of its tissue-specific expression in zebrafish embryos. We demonstrated that the 6 kb 5'-flanking sequence upstream of the ATG start codon in the zebrafish hfe2 gene could direct specific GFP expression in the notochord, somites, and skeletal muscle of zebrafish embryos, recapitulating the expression pattern of the endogenous gene. However, the Tg(hfe2:gfp) transgene is also expressed in the liver of fish embryos, which did not mimic the expression of the endogenous hfe2 at the early stage. Nevertheless, the Tg(hfe2:gfp) transgenic zebrafish provides a useful model to study liver development. Treating Tg(hfe2:gfp) transgenic zebrafish embryos with valproic acid, a liver development inhibitor, significantly inhibited GFP expression in zebrafish. Together, these data indicate that the tissue-specific expression of hfe2 in the notochord, somites and muscles is regulated by regulatory elements within the 6 kb 5'-flanking sequence of the hfe2 gene. Moreover, the Tg(hfe2:gfp) transgenic zebrafish line provides a useful model system for analyzing liver development in zebrafish.
Gillis, Peter A; Hernandez-Alvarado, Nelmary; Gnanandarajah, Josephine S; Wussow, Felix; Diamond, Don J; Schleiss, Mark R
2014-06-30
The guinea pig (Cavia porcellus) provides a useful animal model for studying the pathogenesis of many infectious diseases, and for preclinical evaluation of vaccines. However, guinea pig models are limited by the lack of immunological reagents required for characterization and quantification of antigen-specific T cell responses. To address this deficiency, an enzyme-linked immunospot (ELISPOT) assay for guinea pig interferon (IFN)-γ was developed to measure antigen/epitope-specific T cell responses to guinea pig cytomegalovirus (GPCMV) vaccines. Using splenocytes harvested from animals vaccinated with a modified vaccinia virus Ankara (MVA) vector encoding the GPCMV GP83 (homolog of human CMV pp65 [gpUL83]) protein, we were able to enumerate and map antigen-specific responses, both in vaccinated as well as GPCMV-infected animals, using a panel of GP83-specific peptides. Several potential immunodominant GP83-specific peptides were identified, including one epitope, LGIVHFFDN, that was noted in all guinea pigs that had a detectable CD8+ response to GP83. Development of a guinea pig IFN-γ ELISPOT should be useful in characterization of additional T cell-specific responses to GPCMV, as well as other pathogens. This information in turn can help focus future experimental evaluation of immunization strategies, both for GPCMV as well as for other vaccine-preventable illnesses studied in the guinea pig model.
NASA Technical Reports Server (NTRS)
Connolly, Joseph W.; Kopasakis, George
2010-01-01
This paper covers the propulsion system component modeling and controls development of an integrated mixed-compression inlet and turbojet engine that will be used for an overall vehicle Aero-Propulso-Servo-Elastic (APSE) model. Using previously created nonlinear component-level propulsion system models, a linear integrated propulsion system model and loop-shaping control design have been developed. The design includes both inlet normal shock position control and jet engine rotor speed control for a potential supersonic commercial transport. A preliminary investigation of the impacts of the aero-elastic effects on the incoming flow field to the propulsion system is discussed; however, the focus here is on developing a methodology for the propulsion controls design that prevents unstart in the inlet and minimizes the thrust oscillation experienced by the vehicle. Quantitative Feedback Theory (QFT) specifications and bounds, and aspects of classical loop shaping, are used in the control design process. Model uncertainty is incorporated in the design to address possible error in the system identification mapping of the nonlinear component models into the integrated linear model.
Class Model Development Using Business Rules
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Gudas, Saulius
New developments in the area of computer-aided system engineering (CASE) greatly improve processes of the information systems development life cycle (ISDLC). Much effort is put into quality improvement issues, but IS development projects still suffer from poor model quality during the system analysis and design cycles. To some degree, the quality of models developed using CASE tools can be assured using various automated model-comparison and syntax-checking procedures. It is also reasonable to check these models against business domain knowledge, but the domain knowledge stored in the repository of a CASE tool (enterprise model) is insufficient (Gudas et al. 2004). Involving business domain experts in these processes is complicated because non-IT people often find it difficult to understand models that were developed by IT professionals using some specific modeling language.
ERIC Educational Resources Information Center
Lee, Il-Sun; Byeon, Jung-Ho; Kim, Young-shin; Kwon, Yong-Ju
2014-01-01
The purpose of this study was to develop a model for measuring experimental design ability based on functional magnetic resonance imaging (fMRI) during biological inquiry. More specifically, the researchers developed an experimental design task that measures experimental design ability. Using the developed experimental design task, they measured…
Chang, Sun Ju; Im, Eun-Ok
2014-01-01
The purpose of the study was to develop a situation-specific theory for explaining health-related quality of life (QOL) among older South Korean adults with type 2 diabetes. To develop a situation-specific theory, three sources were considered: (a) the conceptual model of health promotion and QOL for people with chronic and disabling conditions (an existing theory related to the QOL in patients with chronic diseases); (b) a literature review using multiple databases including Cumulative Index for Nursing and Allied Health Literature (CINAHL), PubMed, PsycINFO, and two Korean databases; and (c) findings from our structural equation modeling study on health-related QOL in older South Korean adults with type 2 diabetes. The proposed situation-specific theory is constructed with six major concepts including barriers, resources, perceptual factors, psychosocial factors, health-promoting behaviors, and health-related QOL. The theory also provides the interrelationships among concepts. Health care providers and nurses could incorporate the proposed situation-specific theory into development of diabetes education programs for improving health-related QOL in older South Korean adults with type 2 diabetes.
Application of chimeric mice with humanized liver for study of human-specific drug metabolism.
Bateman, Thomas J; Reddy, Vijay G B; Kakuni, Masakazu; Morikawa, Yoshio; Kumar, Sanjeev
2014-06-01
Human-specific or disproportionately abundant human metabolites of drug candidates that are not adequately formed and qualified in preclinical safety assessment species pose an important drug development challenge. Furthermore, the overall metabolic profile of drug candidates in humans is an important determinant of their drug-drug interaction susceptibility. These risks can be effectively assessed and/or mitigated if human metabolic profile of the drug candidate could reliably be determined in early development. However, currently available in vitro human models (e.g., liver microsomes, hepatocytes) are often inadequate in this regard. Furthermore, the conduct of definitive radiolabeled human ADME studies is an expensive and time-consuming endeavor that is more suited for later in development when the risk of failure has been reduced. We evaluated a recently developed chimeric mouse model with humanized liver on uPA/SCID background for its ability to predict human disposition of four model drugs (lamotrigine, diclofenac, MRK-A, and propafenone) that are known to exhibit human-specific metabolism. The results from these studies demonstrate that chimeric mice were able to reproduce the human-specific metabolite profile for lamotrigine, diclofenac, and MRK-A. In the case of propafenone, however, the human-specific metabolism was not detected as a predominant pathway, and the metabolite profiles in native and humanized mice were similar; this was attributed to the presence of residual highly active propafenone-metabolizing mouse enzymes in chimeric mice. Overall, the data indicate that the chimeric mice with humanized liver have the potential to be a useful tool for the prediction of human-specific metabolism of xenobiotics and warrant further investigation.
A Feature-Based Approach to Modeling Protein–DNA Interactions
Segal, Eran
2008-01-01
Transcription factor (TF) binding to its DNA target site is a fundamental regulatory interaction. The most common model used to represent TF binding specificities is a position specific scoring matrix (PSSM), which assumes independence between binding positions. However, in many cases, this simplifying assumption does not hold. Here, we present feature motif models (FMMs), a novel probabilistic method for modeling TF–DNA interactions, based on log-linear models. Our approach uses sequence features to represent TF binding specificities, where each feature may span multiple positions. We develop the mathematical formulation of our model and devise an algorithm for learning its structural features from binding site data. We also developed a discriminative motif finder, which discovers de novo FMMs that are enriched in target sets of sequences compared to background sets. We evaluate our approach on synthetic data and on the widely used TF chromatin immunoprecipitation (ChIP) dataset of Harbison et al. We then apply our algorithm to high-throughput TF ChIP data from mouse and human, reveal sequence features that are present in the binding specificities of mouse and human TFs, and show that FMMs explain TF binding significantly better than PSSMs. Our FMM learning and motif finder software are available at http://genie.weizmann.ac.il/. PMID:18725950
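The independence assumption that the abstract above says FMMs relax is visible in how a PSSM scores a site: the total is simply a sum of independent per-position log-odds contributions. The sketch below uses a made-up 4-position motif, not a matrix from the paper.

```python
# A PSSM scores a candidate site as a sum of independent per-position
# log-odds contributions -- the independence assumption that feature
# motif models (FMMs) relax. The matrix below is a hypothetical
# 4-position motif, not data from the study.
import math

BACKGROUND = 0.25  # uniform background base frequencies

pssm = [  # per-position base probabilities for a hypothetical TF
    {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
    {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1},
    {"A": 0.4, "C": 0.4, "G": 0.1, "T": 0.1},
    {"A": 0.1, "C": 0.1, "G": 0.1, "T": 0.7},
]

def pssm_score(site):
    """Log-odds (bits) of the site under the motif vs. uniform background."""
    return sum(math.log2(pssm[i][b] / BACKGROUND) for i, b in enumerate(site))

print(round(pssm_score("AGAT"), 2))  # strong site: high positive score
print(round(pssm_score("CCGG"), 2))  # poor site: negative score
```

An FMM, by contrast, would add log-linear features spanning multiple positions (for example, a joint feature on positions 1 and 2), so the score is no longer forced to decompose position by position.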
Jenni, Karen E.; Naftz, David L.; Presser, Theresa S.
2017-10-16
The U.S. Geological Survey, working with the Montana Department of Environmental Quality and the British Columbia Ministry of the Environment and Climate Change Strategy, has developed a conceptual modeling framework that can be used to provide structured and scientifically based input to the Lake Koocanusa Monitoring and Research Working Group as they consider potential site-specific selenium criteria for Lake Koocanusa, a transboundary reservoir located in Montana and British Columbia. This report describes that modeling framework, provides an example of how it can be applied, and outlines possible next steps for implementing the framework.
Norris, C R; Byerly, J R; Decile, K C; Berghaus, R D; Walby, W F; Schelegle, E S; Hyde, D M; Gershwin, L J
2003-12-15
Allergic asthma, a Th2 cell driven response to inhaled allergens, has classically been thought of as predominantly mediated by IgE antibodies. To investigate the role of other immunoglobulin classes (e.g., IgG and IgA) in the immunopathogenesis of allergic asthma, levels of these allergen-specific immunoglobulins were measured in serum and mucosal fluids. Bermuda grass allergen (BGA)-specific IgG and IgA ELISAs in serum and bronchoalveolar lavage fluid (BALF) were developed and optimized in an experimental model of BGA-induced feline asthma. Levels of BGA-specific IgG and IgA significantly increased over time in serum and BALF after allergen sensitization. Additionally, these elevated levels of BGA-specific IgG and IgA were seen in conjunction with the development of an asthmatic phenotype indicated by positive intradermal skin tests, enhanced airways hyperreactivity, and increased eosinophil percentages in the BALF.
Pohl, Calvin S.; Medland, Julia E.
2015-01-01
Early-life stress and adversity are major risk factors in the onset and severity of gastrointestinal (GI) disease in humans later in life. The mechanisms by which early-life stress leads to increased GI disease susceptibility in adult life remain poorly understood. Animal models of early-life stress have provided a foundation from which to gain a more fundamental understanding of this important GI disease paradigm. This review focuses on animal models of early-life stress-induced GI disease, with a specific emphasis on translational aspects of each model to specific human GI disease states. Early postnatal development of major GI systems and the consequences of stress on their development are discussed in detail. Relevant translational differences between species and models are highlighted. PMID:26451004
Colour Model for Outdoor Machine Vision for Tropical Regions and its Comparison with the CIE Model
NASA Astrophysics Data System (ADS)
Sahragard, Nasrolah; Ramli, Abdul Rahman B.; Hamiruce Marhaban, Mohammad; Mansor, Shattri B.
2011-02-01
Accurate modeling of daylight and surface reflectance is very useful for most outdoor machine vision applications, specifically those based on color recognition. The existing CIE daylight model has drawbacks that limit its ability to predict the color of incident light. These limitations include failure to consider ambient light, the effects of light reflected off the ground, and context-specific information. The previously developed color model was tested only for a few geographical locations in North America, and its validity for other places in the world is open to question. In addition, existing surface reflectance models are not easily applied to outdoor images. A reflectance model with combined diffuse and specular reflection in normalized HSV color space could be used to predict color. In this paper, a new daylight color model is developed that describes the color of daylight over a broad range of sky conditions and suits the weather of tropical places such as Malaysia. A comparison of this daylight color model and the CIE daylight model is discussed. The colors of matte and specular surfaces have been estimated by use of the developed color model and surface reflection function in this paper. The results are shown to be highly reliable.
Animal Models of Colorectal Cancer
Johnson, Robert L.; Fleet, James C.
2012-01-01
Colorectal cancer is a heterogeneous disease that afflicts a large number of people in the United States. The use of animal models has the potential to increase our understanding of carcinogenesis, tumor biology, and the impact of specific molecular events on colon biology. In addition, animal models with features of specific human colorectal cancers can be used to test strategies for cancer prevention and treatment. In this review we provide an overview of the mechanisms driving human cancer, we discuss the approaches one can take to model colon cancer in animals, and we describe a number of specific animal models that have been developed for the study of colon cancer. We believe that there are many valuable animal models to study various aspects of human colorectal cancer. However, opportunities for improving upon these models exist. PMID:23076650
Animal models of contraception: utility and limitations
Liechty, Emma R; Bergin, Ingrid L; Bell, Jason D
2015-01-01
Appropriate animal modeling is vital for the successful development of novel contraceptive devices. Advances in reproductive biology have identified novel pathways for contraceptive intervention. Here we review species-specific anatomic and physiologic considerations impacting preclinical contraceptive testing, including efficacy testing, mechanistic studies, device design, and modeling off-target effects. Emphasis is placed on the use of nonhuman primate models in contraceptive device development. PMID:29386922
Development of a patient-specific anatomical foot model from structured light scan data.
Lochner, Samuel J; Huissoon, Jan P; Bedi, Sanjeev S
2014-01-01
The use of anatomically accurate finite element (FE) models of the human foot in research studies has increased rapidly in recent years. Uses for FE foot models include advancing knowledge of orthotic design, shoe design, ankle-foot orthoses, pathomechanics, locomotion, plantar pressure, tissue mechanics, plantar fasciitis, joint stress and surgical interventions. Similar applications but for clinical use on a per-patient basis would also be on the rise if it were not for the high costs associated with developing patient-specific anatomical foot models. High costs arise primarily from the expense and challenges of acquiring anatomical data via magnetic resonance imaging (MRI) or computed tomography (CT) and reconstructing the three-dimensional models. The proposed solution morphs detailed anatomy from skin surface geometry and anatomical landmarks of a generic foot model (developed from CT or MRI) to surface geometry and anatomical landmarks acquired from an inexpensive structured light scan of a foot. The method yields a patient-specific anatomical foot model at a fraction of the cost of standard methods. Average error for bone surfaces was 2.53 mm for the six experiments completed. Highest accuracy occurred in the mid-foot and lowest in the forefoot due to the small, irregular bones of the toes. The method must be validated in the intended application to determine if the resulting errors are acceptable.
Quantitative Predictive Models for Systemic Toxicity (SOT)
Models to identify systemic and specific target organ toxicity were developed to help transition the field of toxicology towards computational models. By leveraging multiple data sources to incorporate read-across and machine learning approaches, a quantitative model of systemic ...
USDA-ARS?s Scientific Manuscript database
Improving strategies for monitoring subsurface contaminant transport includes performance comparison of competing models, developed independently or obtained via model abstraction. Model comparison and parameter discrimination involve specific performance indicators selected to better understand s...
Testing the World with Simulations.
ERIC Educational Resources Information Center
Roberts, Nancy
1983-01-01
Discusses steps involved in model building and simulation: understanding a problem, building a model, and simulation. Includes a mathematical model (focusing on a problem dealing with influenza) written in the DYNAMO computer language, developed specifically for writing simulation models. (Author/JN)
Manoharan, Prabu; Chennoju, Kiranmai; Ghoshal, Nanda
2015-07-01
BACE1 is an attractive target in Alzheimer's disease (AD) treatment. A rational drug design effort for the inhibition of BACE1 is actively pursued by researchers in both academia and the pharmaceutical industry. This continued effort has led to the steady accumulation of BACE1 crystal structures co-complexed with different classes of inhibitors. This wealth of information is used in this study to develop target-specific proteochemometric models, which are then exploited for predicting prospective BACE1 inhibitors. The models developed in this study performed excellently in predicting the computationally generated poses, obtained separately from single and ensemble docking approaches. In virtual screening performance, the simple protein-ligand contact (SPLC) model outperforms the other, more sophisticated models developed during this study. In an attempt to account for BACE1 active-site flexibility in the predictive models, we included the changes in the area and in the volume of the solvent-accessible surface as descriptors. The ensemble and single-receptor docking results obtained in this study indicate that structural-water-mediated interactions improve the virtual screening results; these waters are also essential for recapitulating the bioactive conformation during docking. The proteochemometric models developed in this study can be used for the prediction of BACE1 inhibitors during the early stage of AD drug discovery.
VIP: A knowledge-based design aid for the engineering of space systems
NASA Technical Reports Server (NTRS)
Lewis, Steven M.; Bellman, Kirstie L.
1990-01-01
The Vehicles Implementation Project (VIP), a knowledge-based design aid for the engineering of space systems is described. VIP combines qualitative knowledge in the form of rules, quantitative knowledge in the form of equations, and other mathematical modeling tools. The system allows users rapidly to develop and experiment with models of spacecraft system designs. As information becomes available to the system, appropriate equations are solved symbolically and the results are displayed. Users may browse through the system, observing dependencies and the effects of altering specific parameters. The system can also suggest approaches to the derivation of specific parameter values. In addition to providing a tool for the development of specific designs, VIP aims at increasing the user's understanding of the design process. Users may rapidly examine the sensitivity of a given parameter to others in the system and perform tradeoffs or optimizations of specific parameters. A second major goal of VIP is to integrate the existing corporate knowledge base of models and rules into a central, symbolic form.
ERIC Educational Resources Information Center
Warren, Elizabeth
2009-01-01
The implementation of a new mathematics syllabus in the elementary context is problematic, especially if it contains a new content area. A professional development model, Transformative Teaching in the Early Years Mathematics (TTEYM) was specifically developed to support the implementation of the new Patterns and Algebra strand. The model was…
School-Based Job Placement Service Model: Phase I, Planning. Final Report.
ERIC Educational Resources Information Center
Gingerich, Garland E.
To assist school administrators and guidance personnel in providing job placement services, a study was conducted to: (1) develop a model design for a school-based job placement system, (2) identify students to be served by the model, (3) list specific services provided to students, and (4) develop job descriptions for each individual responsible…
Investigation of traveler acceptance factors in short haul air carrier operations
NASA Technical Reports Server (NTRS)
Kuhlthau, A. R.; Jacobson, I. D.
1972-01-01
The development of a mathematical model for human reaction to variables involved in transportation systems is discussed. The techniques, activities, and results related to defining certain specific inputs to the model are presented. A general schematic diagram of the problem solution is developed. The application of the model to short haul air carrier operations is examined.
Neural network modeling of emotion
NASA Astrophysics Data System (ADS)
Levine, Daniel S.
2007-03-01
This article reviews the history and development of computational neural network modeling of cognitive and behavioral processes that involve emotion. The exposition starts with models of classical conditioning dating from the early 1970s. It then proceeds to models of interactions between emotion and attention, and then to models of emotional influences on decision making, including some speculative (not yet simulated) models of the evolution of decision rules. Through the late 1980s, the neural networks developed to model emotional processes were mainly embodiments of significant functional principles motivated by psychological data. In the last two decades, network models of these processes have become much more detailed in their incorporation of known physiological properties of specific brain regions, while preserving many of the psychological principles from the earlier models. Most network models of emotional processes so far have dealt with positive and negative emotion in general, rather than specific emotions such as fear, joy, sadness, and anger. A later section of this article reviews a few models relevant to specific emotions: one family of models of auditory fear conditioning in rats, and one model of induced pleasure enhancing creativity in humans. Models of emotional disorders are then reviewed. The article concludes with philosophical statements about the essential contributions of emotion to intelligent behavior and the importance of quantitative theories and models to the interdisciplinary enterprise of understanding the interactions of emotion, cognition, and behavior.
Nogueira, Waldo; Schurzig, Daniel; Büchner, Andreas; Penninger, Richard T.; Würfel, Waldemar
2016-01-01
Cochlear Implants (CIs) are medical implantable devices that can restore the sense of hearing in people with profound hearing loss. Clinical trials assessing speech intelligibility in CI users have found large intersubject variability. One possibility to explain the variability is the individual differences in the interface created between electrodes of the CI and the auditory nerve. In order to understand the variability, models of the voltage distribution of the electrically stimulated cochlea may be useful. With this purpose in mind, we developed a parametric model that can be adapted to each CI user based on landmarks from individual cone beam computed tomography (CBCT) scans of the cochlea before and after implantation. The conductivity values of each cochlea compartment as well as the weighting factors of different grounding modes have also been parameterized. Simulations were performed modeling the cochlea and electrode positions of 12 CI users. Three models were compared with different levels of detail: a homogeneous model (HM), a non-patient-specific model (NPSM), and a patient-specific model (PSM). The model simulations were compared with voltage distribution measurements obtained from the backward telemetry of the 12 CI users. Results show that the PSM produces the lowest error when predicting individual voltage distributions. Given a patient-specific geometry and electrode positions, we show an example on how to optimize the parameters of the model and how to couple it to an auditory nerve model. The model here presented may help to understand speech performance variability and support the development of new sound coding strategies for CIs. PMID:27933290
Initiating Formal Requirements Specifications with Object-Oriented Models
NASA Technical Reports Server (NTRS)
Ampo, Yoko; Lutz, Robyn R.
1994-01-01
This paper reports results of an investigation into the suitability of object-oriented models as an initial step in developing formal specifications. The requirements for two critical system-level software modules were used as target applications. It was found that creating object-oriented diagrams prior to formally specifying the requirements enhanced the accuracy of the initial formal specifications and reduced the effort required to produce them. However, the formal specifications incorporated some information not found in the object-oriented diagrams, such as higher-level strategy or goals of the software.
Bhandari, Ammar B; Nelson, Nathan O; Sweeney, Daniel W; Baffaut, Claire; Lory, John A; Senaviratne, Anomaa; Pierzynski, Gary M; Janssen, Keith A; Barnes, Philip L
2017-11-01
Process-based computer models have been proposed as a tool to generate data for phosphorus (P) Index assessment and development. Although models are commonly used to simulate P loss from agriculture under managements that differ from the calibration data, this use of models has not been fully tested. The objective of this study is to determine if the Agricultural Policy Environmental eXtender (APEX) model can accurately simulate runoff, sediment, total P, and dissolved P loss from 0.4- to 1.5-ha agricultural fields under managements that differ from the calibration data. The APEX model was calibrated with field-scale data from eight different managements at two locations (management-specific models). The calibrated models were then validated, either with the same management used for calibration or with different managements. Location models were also developed by calibrating APEX with data from all managements. The management-specific models resulted in satisfactory performance when used to simulate runoff, total P, and dissolved P within their respective systems, with r² > 0.50, Nash-Sutcliffe efficiency > 0.30, and percent bias within ±35% for runoff and ±70% for total and dissolved P. When applied outside the calibration management, the management-specific models met the minimum performance criteria in only one-third of the tests. The location models performed better when applied across all managements than the management-specific models did. Our results suggest that models be applied only within the managements used for calibration, and that data from multiple management systems be included in calibration when using models to assess management effects on P loss or to evaluate P Indices. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
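The performance criteria quoted in the abstract (Nash-Sutcliffe efficiency > 0.30, percent bias within ±35% for runoff and ±70% for P) are straightforward to compute from paired observed/simulated series. A minimal sketch, assuming the common Moriasi-style sign convention for percent bias:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; values <= 0 mean the
    model predicts no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def percent_bias(obs, sim):
    """Percent bias; 0 is unbiased, positive values indicate underestimation
    under the assumed (obs - sim) convention."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def meets_criteria(obs, sim, pbias_limit):
    """Check the study's minimum criteria: NSE > 0.30 and |PBIAS| within the
    stated limit (35 for runoff, 70 for total and dissolved P)."""
    return nash_sutcliffe(obs, sim) > 0.30 and abs(percent_bias(obs, sim)) <= pbias_limit
```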
Integrative change model in psychotherapy: Perspectives from Indian thought.
Manickam, L S S
2013-01-01
Different psychotherapeutic approaches claim positive changes in patients as a result of therapy. Explanations of the change process have led to different change models. Some change models are empirically oriented whereas others are theoretical. Apart from the core behavioral, psychodynamic, humanistic, cognitive and spiritually oriented models, there are specific models within psychotherapy that explain the change process. The integrative theory of a person depicted in Indian thought provides a common ground for the integration of various therapies. An integrative model of change based on Indian thought, with specific reference to psychological concepts in the Upanishads, Ayurveda, the Bhagavad Gita and Yoga, is presented. Appropriate psychological tools may be developed to help clinicians choose techniques that match the problem and the origin of the dimension. Explorations have to be conducted to develop more techniques that are culturally appropriate and clinically useful, and research has to be initiated to validate the identified concepts.
NASA Astrophysics Data System (ADS)
Ammouri, Aymen; Ben Salah, Walid; Khachroumi, Sofiane; Ben Salah, Tarek; Kourda, Ferid; Morel, Hervé
2014-05-01
Design of integrated power converters needs prototype-less approaches. Specific simulations are required for the investigation and validation process. Simulation relies on active and passive device models. Models of planar devices, for instance, are still not available in power simulator tools, which creates a specific limitation when simulating integrated power systems. The paper focuses on the development of a physically based planar inductor model and its validation inside a power converter during transient switching. The planar inductor remains a complex device to model, particularly when the skin, proximity and parasitic capacitance effects are taken into account. A heterogeneous simulation scheme, including circuit and device models, is successfully implemented in the VHDL-AMS language and simulated on the Simplorer platform. The mixed simulation results have been favorably tested and compared with practical measurements; the multi-domain simulation results and measurement data are in close agreement.
Risk Prediction Models for Other Cancers or Multiple Sites
Developing statistical models that estimate the probability of developing other multiple cancers over a defined period of time will help clinicians identify individuals at higher risk of specific cancers, allowing for earlier or more frequent screening and counseling of behavioral changes to decrease risk.
HEAVY DUTY DIESEL VEHICLE LOAD ESTIMATION: DEVELOPMENT OF VEHICLE ACTIVITY OPTIMIZATION ALGORITHM
The Heavy-Duty Vehicle Modal Emission Model (HDDV-MEM) developed by the Georgia Institute of Technology (Georgia Tech) has the capability to model link-specific second-by-second emissions using speed/acceleration matrices. To estimate emissions, engine power demand calculated usin...
Simulation model for electron irradiated IGZO thin film transistors
NASA Astrophysics Data System (ADS)
Dayananda, G. K.; Shantharama Rai, C.; Jayarama, A.; Kim, Hyun Jae
2018-02-01
An efficient drain current simulation model for the electron irradiation effect on the electrical parameters of amorphous In-Ga-Zn-O (IGZO) thin-film transistors is developed. The model is based on device specifications such as gate capacitance, channel length, channel width, flat-band voltage, etc. Electrical parameters of un-irradiated IGZO samples were simulated and compared with the experimental parameters and with parameters measured after 1 kGy electron irradiation. The effect of electron irradiation on the IGZO sample was analysed by developing a mathematical model.
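The abstract names the model's inputs (gate capacitance, channel dimensions, flat-band/threshold voltage) but not its equations. As a generic illustration only, the sketch below uses the textbook square-law TFT drain-current expression and models irradiation as a simple threshold-voltage shift; both the functional form and the parameter values are assumptions, not the paper's fitted model.

```python
def drain_current(vgs, vds, vth, mu, cox, w, l):
    """Square-law TFT drain current (A).

    vth: threshold voltage (V), mu: channel mobility (m^2/V.s),
    cox: gate capacitance per unit area (F/m^2), w, l: channel width/length (m).
    """
    vov = vgs - vth                  # overdrive voltage
    if vov <= 0:
        return 0.0                   # sub-threshold current neglected in this sketch
    k = mu * cox * w / l
    if vds < vov:                    # linear (triode) region
        return k * (vov * vds - 0.5 * vds ** 2)
    return 0.5 * k * vov ** 2        # saturation region

def irradiated_current(vgs, vds, vth, dvth, **kw):
    """Assumed first-order irradiation effect: a threshold-voltage shift dvth."""
    return drain_current(vgs, vds, vth + dvth, **kw)
```

With a positive threshold shift, the sketch reproduces the qualitative effect of reduced on-current after irradiation.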
Expert systems and simulation models; Proceedings of the Seminar, Tucson, AZ, November 18, 19, 1985
NASA Technical Reports Server (NTRS)
1986-01-01
The seminar presents papers on modeling and simulation methodology, artificial intelligence and expert systems, environments for simulation/expert system development, and methodology for simulation/expert system development. Particular attention is given to simulation modeling concepts and their representation, modular hierarchical model specification, knowledge representation, and rule-based diagnostic expert system development. Other topics include the combination of symbolic and discrete event simulation, real time inferencing, and the management of large knowledge-based simulation projects.
Facility Energy Performance Benchmarking in a Data-Scarce Environment
2017-08-01
environment, and analyze occupant-, system-, and component-level faults contributing to energy inefficiency. A methodology for developing DoD-specific...Research, Development, Test, and Evaluation (RDTE) Program to develop an intelligent framework, encompassing methodology and modeling, that...energy performers by installation, climate zone, and other criteria. A methodology for creating the DoD-specific EUIs would be an important part of a
Concise review: modeling central nervous system diseases using induced pluripotent stem cells.
Zeng, Xianmin; Hunsberger, Joshua G; Simeonov, Anton; Malik, Nasir; Pei, Ying; Rao, Mahendra
2014-12-01
Induced pluripotent stem cells (iPSCs) offer an opportunity to delve into the mechanisms underlying development while also affording the potential to take advantage of a number of naturally occurring mutations that contribute to either disease susceptibility or resistance. Just as with any new field, several models of screening are being explored, and innovators are working on the most efficient methods to overcome the inherent limitations of primary cell screens using iPSCs. In the present review, we provide a background regarding why iPSCs represent a paradigm shift for central nervous system (CNS) disease modeling. We describe the efforts in the field to develop more biologically relevant CNS disease models, which should provide screening assays useful for the pharmaceutical industry. We also provide some examples of successful uses for iPSC-based screens and suggest that additional development could revolutionize the field of drug discovery. The development and implementation of these advanced iPSC-based screens will create a more efficient disease-specific process underpinned by the biological mechanism in a patient- and disease-specific manner rather than by trial-and-error. Moreover, with careful and strategic planning, shared resources can be developed that will enable exponential advances in the field. This will undoubtedly lead to more sensitive and accurate screens for early diagnosis and allow the identification of patient-specific therapies, thus, paving the way to personalized medicine. ©AlphaMed Press.
DOT National Transportation Integrated Search
2010-09-01
This project focused on the evaluation of traffic sign sheeting performance in terms of meeting the nighttime : driver needs. The goal was to develop a nighttime driver needs specification for traffic signs. The : researchers used nighttime sign legi...
Trevisan, Marta; Sinigaglia, Alessandro; Desole, Giovanna; Berto, Alessandro; Pacenti, Monia; Palù, Giorgio; Barzon, Luisa
2015-07-13
The recent biotechnology breakthrough of cell reprogramming and generation of induced pluripotent stem cells (iPSCs), which has revolutionized the approaches to study the mechanisms of human diseases and to test new drugs, can be exploited to generate patient-specific models for the investigation of host-pathogen interactions and to develop new antimicrobial and antiviral therapies. Applications of iPSC technology to the study of viral infections in humans have included in vitro modeling of viral infections of neural, liver, and cardiac cells; modeling of human genetic susceptibility to severe viral infectious diseases, such as encephalitis and severe influenza; genetic engineering and genome editing of patient-specific iPSC-derived cells to confer antiviral resistance.
Modelling the development and arrangement of the primary vascular structure in plants.
Cartenì, Fabrizio; Giannino, Francesco; Schweingruber, Fritz Hans; Mazzoleni, Stefano
2014-09-01
The process of vascular development in plants results in the formation of a specific array of bundles that run throughout the plant in a characteristic spatial arrangement. Although much is known about the genes involved in the specification of procambium, phloem and xylem, the dynamic processes and interactions that define the development of the radial arrangement of such tissues remain elusive. This study presents a spatially explicit reaction-diffusion model defining a set of logical and functional rules to simulate the differentiation of procambium, phloem and xylem and their spatial patterns, starting from a homogeneous group of undifferentiated cells. Simulation results showed that the model is capable of reproducing most vascular patterns observed in plants, from primitive and simple structures made up of a single strand of vascular bundles (protostele), to more complex and evolved structures, with separated vascular bundles arranged in an ordered pattern within the plant section (e.g. eustele). The results presented demonstrate, as a proof of concept, that a common genetic-molecular machinery can be the basis of different spatial patterns of plant vascular development. Moreover, the model has the potential to become a useful tool to test different hypotheses of genetic and molecular interactions involved in the specification of vascular tissues.
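The abstract does not reproduce the paper's specific differentiation rules; as a generic illustration of the reaction-diffusion mechanism such models build on, the sketch below integrates the well-known Gray-Scott system on a 1-D ring, where a small local perturbation self-organizes into a spatial pattern. Parameter values are standard textbook choices, not the authors'.

```python
import numpy as np

def laplacian_1d(a):
    # periodic 1-D Laplacian (unit grid spacing)
    return np.roll(a, 1) + np.roll(a, -1) - 2.0 * a

def gray_scott(n=200, steps=5000, du=0.16, dv=0.08, f=0.04, k=0.06, seed=0):
    """Integrate the Gray-Scott reaction-diffusion system on a 1-D ring
    with explicit Euler steps (unit time step). Returns final fields (u, v)."""
    rng = np.random.default_rng(seed)
    u = np.ones(n)
    v = np.zeros(n)
    mid = slice(n // 2 - 5, n // 2 + 5)   # seed a small central perturbation
    u[mid], v[mid] = 0.50, 0.25
    u += 0.01 * rng.random(n)
    for _ in range(steps):
        uvv = u * v * v
        u += du * laplacian_1d(u) - uvv + f * (1.0 - u)
        v += dv * laplacian_1d(v) + uvv - (f + k) * v
    return u, v
```

Starting from a near-homogeneous state and obtaining an ordered spatial arrangement is the same qualitative behavior the paper exploits to pattern procambium, phloem and xylem.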
A description of the new 3D electron gun and collector modeling tool: MICHELLE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petillo, J.; Mondelli, A.; Krueger, W.
1999-07-01
A new 3D finite element gun and collector modeling code is under development at SAIC in collaboration with industrial partners and national laboratories. This development program has been designed specifically to address the shortcomings of current simulation and modeling tools. In particular, although 3D gun codes exist today, their ability to address fine-scale features is somewhat limited due to the disparate length scales of certain classes of devices. Additionally, features like advanced emission rules, including a thermionic Child's law and comprehensive secondary emission models, also need attention. The program specifically targets problem classes including gridded guns, sheet-beam guns, multi-beam devices, and anisotropic collectors. The presentation will provide an overview of the program objectives, the approach to be taken by the development team, and the status of the project.
2013-08-01
surgeries, hospitalizations, etc.). Once our model is developed, we hope to apply it at an outside institution, specifically the University of... ...to build predictive models with the hope of improving disease management. It is difficult to find these factors in EMR systems as the... ...death, surgeries, hospitalizations, etc.). Once our model is developed, we hope to apply the model to a de-identified data set from the University of...
A physical model for evaluating uranium nitride specific heat
NASA Astrophysics Data System (ADS)
Baranov, V. G.; Devyatko, Yu. N.; Tenishev, A. V.; Khlunov, A. V.; Khomyakov, O. V.
2013-03-01
Nitride fuel is one of the promising materials for the nuclear industry, but unlike oxide and carbide uranium and mixed uranium-plutonium fuels, it is less well studied. The present article is devoted to the development of a model for calculating UN specific heat on the basis of phonon spectrum data within solid state theory.
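The abstract gives no equations; the textbook starting point for a phonon-spectrum-based lattice heat capacity is the Debye model, sketched below. The Debye temperature used here is an illustrative assumption for UN, not a value taken from the paper, and the sketch omits the electronic and anharmonic contributions a full fuel model would need.

```python
import numpy as np

R_GAS = 8.314462618  # molar gas constant, J/(mol.K)

def debye_cv(temp, theta_d=370.0, n_atoms=2):
    """Molar lattice heat capacity C_v (J/(mol.K)) from the Debye model.

    theta_d: Debye temperature (K) -- 370 K is an assumed, illustrative
    value for UN. n_atoms: atoms per formula unit (2 for UN: one U, one N).
    """
    x_d = theta_d / temp
    n = 20000
    dx = x_d / n
    x = (np.arange(n) + 0.5) * dx                  # midpoint-rule nodes
    integrand = x ** 4 * np.exp(x) / np.expm1(x) ** 2
    integral = float(np.sum(integrand) * dx)
    return 9.0 * n_atoms * R_GAS * (temp / theta_d) ** 3 * integral
```

At high temperature the result approaches the Dulong-Petit limit 3nR (about 49.9 J/(mol.K) for two atoms per formula unit), a quick sanity check on any such implementation.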
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1977-01-01
Models, measures and techniques were developed for evaluating the effectiveness of aircraft computing systems. The concept of effectiveness involves aspects of system performance, reliability and worth. Specifically, a detailed model hierarchy was developed at the mission, functional task, and computational task levels. An appropriate class of stochastic models was investigated to serve as the bottom-level models in the hierarchical scheme. A unified measure of effectiveness called 'performability' was defined and formulated.
Elements of episodic-like memory in animal models.
Crystal, Jonathon D
2009-03-01
Representations of unique events from one's past constitute the content of episodic memories. A number of studies with non-human animals have revealed that animals remember specific episodes from their past (referred to as episodic-like memory). The development of animal models of memory holds enormous potential for gaining insight into the biological bases of human memory. Specifically, given the extensive knowledge of the rodent brain, the development of rodent models of episodic memory would open new opportunities to explore the neuroanatomical, neurochemical, neurophysiological, and molecular mechanisms of memory. Development of such animal models holds enormous potential for studying functional changes in episodic memory in animal models of Alzheimer's disease, amnesia, and other human memory pathologies. This article reviews several approaches that have been used to assess episodic-like memory in animals. The approaches reviewed include the discrimination of what, where, and when in a radial arm maze, dissociation of recollection and familiarity, object recognition, binding, unexpected questions, and anticipation of a reproductive state. The diversity of approaches may promote the development of converging lines of evidence on the difficult problem of assessing episodic-like memory in animals.
ImSET: Impact of Sector Energy Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roop, Joseph M.; Scott, Michael J.; Schultz, Robert W.
2005-07-19
This version of the Impact of Sector Energy Technologies (ImSET) model represents the "next generation" of the previously developed Visual Basic model (ImBUILD 2.0) that was developed in 2003 to estimate the macroeconomic impacts of energy-efficient technology in buildings. More specifically, a special-purpose version of the 1997 benchmark national Input-Output (I-O) model was designed specifically to estimate the national employment and income effects of the deployment of Office of Energy Efficiency and Renewable Energy (EERE)-developed energy-saving technologies. In comparison with the previous versions of the model, this version allows for more complete and automated analysis of the essential features of energy efficiency investments in buildings, industry, transportation, and the electric power sectors. This version also incorporates improvements in the treatment of operations and maintenance costs, and improves the treatment of financing of investment options. ImSET is also easier to use than extant macroeconomic simulation models and incorporates information developed by each of the EERE offices as part of the requirements of the Government Performance and Results Act.
Xu, Y; Li, Y F; Zhang, D; Dockendorf, M; Tetteh, E; Rizk, M L; Grobler, J A; Lai, M-T; Gobburu, J; Ankrom, W
2016-08-01
We applied model-based meta-analysis of viral suppression as a function of drug exposure and in vitro potency for short-term monotherapy in human immunodeficiency virus type 1 (HIV-1)-infected treatment-naïve patients to set pharmacokinetic targets for development of nonnucleoside reverse transcriptase inhibitors (NNRTIs) and integrase strand transfer inhibitors (InSTIs). We developed class-specific models relating viral load kinetics from monotherapy studies to potency-normalized steady-state trough plasma concentrations. These models were integrated with a literature assessment of doses demonstrated to have long-term efficacy in combination therapy, in order to set steady-state trough concentration targets of 6.17- and 2.15-fold above potency for NNRTIs and InSTIs, respectively. Both the models developed and the pharmacokinetic targets derived can be used to guide compound selection during preclinical development and to predict the dose-response of new antiretrovirals to inform early clinical trial design. © 2016 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
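The derived targets translate directly into a compound-screening rule. In the sketch below only the 6.17-fold and 2.15-fold multiples come from the study; the function names, units, and example numbers are hypothetical.

```python
# Class-specific trough-to-potency multiples derived in the study.
TARGET_MULTIPLES = {"NNRTI": 6.17, "InSTI": 2.15}

def trough_target(potency_nm, drug_class):
    """Steady-state trough concentration target (same units as potency,
    e.g. nM) as a class-specific multiple of in vitro potency."""
    return TARGET_MULTIPLES[drug_class] * potency_nm

def meets_target(predicted_trough_nm, potency_nm, drug_class):
    """Screen a candidate: does its predicted steady-state trough clear
    the potency-normalized target for its class?"""
    return predicted_trough_nm >= trough_target(potency_nm, drug_class)
```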
Development of emergency department load relief area--gauging benefits in empirical terms.
Rasheed, Farrukh; Lee, Young Hoon; Kim, Seung Ho; Park, In Cheol
2012-12-01
The primary goal of this investigation was to develop a simulation model to evaluate the various internal and external factors affecting patient flow and crowding in the emergency department (ED). In addition, a few recommendations are proposed to reconfigure patient flow to improve ED capacity while maintaining service quality. In this research, we present a simulation study conducted in the ED of the "S Hospital" in Seoul. Based on patient flow data and process analysis, a simulation model of patient throughput in the ED was developed. In light of our proposed recommendations, we simulated diverting a specific patient load to a separately managed area, termed the ED load relief area (ED-LRA), and analyzed the potential effects on overall length of stay (LOS) and waiting time (WT). What-if analyses were used to identify key issues and investigate the improvements under the proposed recommendations. The simulation results suggest that diversion of a specific patient load is needed to ensure the desired outcomes. With the diversion of this patient load to the ED-LRA, mean LOS is reduced by 40.60% and WT by 42.5%, with improved resource utilization. As a result, opening an ED-LRA is justified. Real-world systems are often too intricate for analytical models and often too expensive to experiment with directly. Simulation models capture this intricacy and enable experimentation to make inferences about how the actual system might perform. Our simulation study showed that diverting the specific patient load to the ED-LRA produced an improvement in overall ED LOS and WT.
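The abstract describes the model's structure but not its parameters. A minimal queueing sketch of the diversion idea, with entirely hypothetical arrival and service rates, shows the mechanism: shifting a share of (typically lower-acuity, faster-to-treat) patients to a separate area reduces mean LOS in the main ED.

```python
import heapq
import random

def simulate_ed(n_patients=20000, arrival_rate=10.0, service_rate=0.4,
                n_beds=30, divert_frac=0.0, lra_beds=0, lra_service_rate=1.0,
                seed=1):
    """FCFS multi-server queue sketch of an ED (rates per hour, exponential
    interarrival and service times). A fraction of patients is diverted to a
    separate load relief area (ED-LRA). Returns mean LOS = wait + service."""
    rng = random.Random(seed)

    def run(n, lam, mu, servers):
        # heap holds the times at which each bed next becomes free
        free = [0.0] * servers
        heapq.heapify(free)
        t, total = 0.0, 0.0
        for _ in range(n):
            t += rng.expovariate(lam)            # next arrival
            start = max(t, heapq.heappop(free))  # wait for earliest free bed
            service = rng.expovariate(mu)
            heapq.heappush(free, start + service)
            total += (start - t) + service       # waiting time + treatment
        return total / n

    n_div = int(n_patients * divert_frac)
    los_main = run(n_patients - n_div, arrival_rate * (1 - divert_frac),
                   service_rate, n_beds)
    if n_div == 0:
        return los_main
    los_lra = run(n_div, arrival_rate * divert_frac, lra_service_rate, lra_beds)
    return ((n_patients - n_div) * los_main + n_div * los_lra) / n_patients
```

Comparing a baseline run against one with, say, 30% of arrivals diverted to a 10-bed ED-LRA with shorter treatment times reproduces the qualitative LOS reduction the study reports.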
Meguid, Robert A; Bronsert, Michael R; Juarez-Colunga, Elizabeth; Hammermeister, Karl E; Henderson, William G
2016-07-01
To develop parsimonious prediction models for postoperative mortality, overall morbidity, and 6 complication clusters applicable to a broad range of surgical operations in adult patients. Quantitative risk assessment tools are not routinely used for preoperative patient assessment, shared decision making, informed consent, and preoperative patient optimization, likely due in part to the burden of data collection and the complexity of incorporation into routine surgical practice. Multivariable forward selection stepwise logistic regression analyses were used to develop predictive models for 30-day mortality, overall morbidity, and 6 postoperative complication clusters, using 40 preoperative variables from 2,275,240 surgical cases in the American College of Surgeons National Surgical Quality Improvement Program data set, 2005 to 2012. For the mortality and overall morbidity outcomes, prediction models were compared with and without preoperative laboratory variables, and generic models (based on all of the data from 9 surgical specialties) were compared with specialty-specific models. In each model, the cumulative c-index was used to examine the contribution of each added predictor variable. C-indexes, Hosmer-Lemeshow analyses, and Brier scores were used to compare discrimination and calibration between models. For the mortality and overall morbidity outcomes, the prediction models without the preoperative laboratory variables performed as well as the models with the laboratory variables, and the generic models performed as well as the specialty-specific models. The c-indexes were 0.938 for mortality, 0.810 for overall morbidity, and for the 6 complication clusters ranged from 0.757 for infectious to 0.897 for pulmonary complications. Across the 8 prediction models, the first 7 to 11 variables entered accounted for at least 99% of the c-index of the full model (using up to 28 nonlaboratory predictor variables). 
Our results suggest that it will be possible to develop parsimonious models to predict 8 important postoperative outcomes for a broad surgical population, without the need for surgeon specialty-specific models or inclusion of laboratory variables.
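The forward-selection procedure with a cumulative c-index, as described above, can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the authors' NSQIP code; the number of predictors, effect sizes, and the use of training-set AUC as the c-index are all assumptions made for the sketch.

```python
# Sketch: forward-selection logistic regression, tracking the cumulative
# c-index (ROC AUC) as each predictor is added. Synthetic data throughout.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n, p = 2000, 10
X = rng.normal(size=(n, p))
# Outcome driven mainly by the first three predictors (assumed effects).
logit = 1.5 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

selected, remaining, history = [], list(range(p)), []
for _ in range(p):
    best_auc, best_j = -1.0, None
    for j in remaining:
        cols = selected + [j]
        model = LogisticRegression(max_iter=1000).fit(X[:, cols], y)
        auc = roc_auc_score(y, model.predict_proba(X[:, cols])[:, 1])
        if auc > best_auc:
            best_auc, best_j = auc, j
    selected.append(best_j)
    remaining.remove(best_j)
    history.append(best_auc)

full_auc = history[-1]
# Number of variables needed to reach 99% of the full model's c-index,
# mirroring the abstract's "first 7 to 11 variables" observation.
k99 = next(i + 1 for i, a in enumerate(history) if a >= 0.99 * full_auc)
print(selected[:3], round(full_auc, 3), k99)
```

With a few strong predictors, `k99` is typically much smaller than `p`, which is the parsimony effect the abstract reports.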
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sathaye, Jayant A.
2000-04-01
Integrated assessment (IA) modeling of climate policy is increasingly global in nature, with models incorporating regional disaggregation. The existing empirical basis for IA modeling, however, largely arises from research on industrialized economies. Given the growing importance of developing countries in determining long-term global energy and carbon emissions trends, filling this gap with improved statistical information on developing countries' energy and carbon-emissions characteristics is an important priority for enhancing IA modeling. Earlier research at LBNL on this topic has focused on assembling and analyzing statistical data on productivity trends and technological change in the energy-intensive manufacturing sectors of five developing countries: India, Brazil, Mexico, Indonesia, and South Korea. The proposed work will extend this analysis to the agriculture and electric power sectors in India, South Korea, and two other developing countries. It will also examine the impact of alternative model specifications on estimates of productivity growth and technological change for each of the three sectors, and estimate the contribution of various capital inputs (imported vs. indigenous, rigid vs. malleable) to productivity growth and technological change. The project has already produced a data resource on the manufacturing sector which is being shared with IA modelers. This will be extended to the agriculture and electric power sectors, and would also be made accessible to IA modeling groups seeking to enhance the empirical descriptions of developing country characteristics. The project will entail basic statistical and econometric analysis of productivity and energy trends in these developing country sectors, with parameter estimates also made available to modeling groups.
The parameter estimates will be developed using alternative model specifications that could be directly utilized by the existing IAMs for the manufacturing, agriculture, and electric power sectors.
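The kind of productivity analysis described above typically builds on growth accounting. The following is a hedged sketch of a Solow-residual calculation; all growth rates and cost shares are invented illustrative numbers, not LBNL data.

```python
# Sketch: growth accounting (Solow residual). TFP growth is output growth
# minus cost-share-weighted input growth. All figures are illustrative.
import numpy as np

# Annual log growth rates for a hypothetical manufacturing sector.
d_ln_output = np.array([0.05, 0.06, 0.04, 0.07])
d_ln_capital = np.array([0.04, 0.04, 0.03, 0.05])
d_ln_labor = np.array([0.02, 0.01, 0.02, 0.02])
d_ln_energy = np.array([0.03, 0.05, 0.02, 0.04])

# Assumed cost shares (constant; sum to 1 under constant returns to scale).
s_k, s_l, s_e = 0.35, 0.45, 0.20

# TFP growth = output growth - share-weighted input growth.
tfp_growth = d_ln_output - (s_k * d_ln_capital + s_l * d_ln_labor + s_e * d_ln_energy)
print(np.round(tfp_growth, 4), round(float(tfp_growth.mean()), 4))
```

Alternative model specifications (e.g., translog rather than Cobb-Douglas, or distinguishing imported from indigenous capital) change how the input aggregate is formed, which is exactly the sensitivity the project proposes to examine.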
Development of an integrated CAD-FEA system for patient-specific design of spinal cages.
Zhang, Mingzheng; Pu, Fang; Xu, Liqiang; Zhang, Linlin; Liang, Hang; Li, Deyu; Wang, Yu; Fan, Yubo
2017-03-01
Spinal cages are used to create a suitable mechanical environment for interbody fusion in cases of degenerative spinal instability. Due to individual variations in bone structures and pathological conditions, patient-specific cages can provide optimal biomechanical conditions for fusion, strengthening patient recovery. Finite element analysis (FEA) is a valuable tool in the biomechanical evaluation of patient-specific cage designs, but the time- and labor-intensive process of modeling limits its clinical application. In an effort to facilitate the design and analysis of patient-specific spinal cages, an integrated CAD-FEA system (CASCaDeS, comprehensive analytical spinal cage design system) was developed. This system produces a biomechanical-based patient-specific design of spinal cages and is capable of rapid implementation of finite element modeling. By comparison with commercial software, this system was validated and proven to be both accurate and efficient. CASCaDeS can be used to design patient-specific cages with a superior biomechanical performance to commercial spinal cages.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-09
... assessments of site specific, generic, and process-oriented multimedia environmental models as they pertain to human and environmental health risk assessment. Multimedia model development and simulation supports...
Evaluating a Control System Architecture Based on a Formally Derived AOCS Model
NASA Astrophysics Data System (ADS)
Ilic, Dubravka; Latvala, Timo; Varpaaniemi, Kimmo; Vaisanen, Pauli; Troubitsyna, Elena; Laibinis, Linas
2010-08-01
The Attitude & Orbit Control System (AOCS) is a class of control systems used to determine and control the attitude of a spacecraft while in orbit, based on information obtained from various sensors. In this paper, we propose an approach to evaluate a typical (yet somewhat simplified) AOCS architecture using formal development based on the Event-B method. As a starting point, an Ada specification of the AOCS is translated into a formal specification and further refined to incorporate all the details of its original source code specification. This way we are able not only to evaluate the Ada specification by expressing and verifying specific system properties in our formal models, but also to determine how well the chosen modelling framework copes with the level of detail required for an actual implementation and code generation from the derived models.
Specific heat and thermal conductivity of nanomaterials
NASA Astrophysics Data System (ADS)
Bhatt, Sandhya; Kumar, Raghuvesh; Kumar, Munish
2017-01-01
A model is proposed to study size and shape effects on the specific heat and thermal conductivity of nanomaterials. The formulation developed for specific heat is based on the basic concepts of cohesive energy and melting temperature. The specific heat of Ag and Au nanoparticles is reported, and the effects of size and shape have been studied. We observed that specific heat increases with the reduction of particle size, with the maximum shape effect for spherical nanoparticles. To provide a more critical test, we extended our model to thermal conductivity and used it to study Si, diamond, Cu, Ni, Ar, ZrO2, BaTiO3 and SrTiO3 nanomaterials. A significant reduction in thermal conductivity is found for nanomaterials with decreasing size. The model predictions are consistent with the available experimental and simulation results, demonstrating the suitability of the proposed model.
Sound quality indicators for urban places in Paris cross-validated by Milan data.
Ricciardi, Paola; Delaitre, Pauline; Lavandier, Catherine; Torchia, Francesca; Aumond, Pierre
2015-10-01
A specific smartphone application was developed to collect perceptive and acoustic data in Paris. About 3400 questionnaires were analyzed, regarding the characterization of the global sound environment, the perceived loudness of some emergent sources, and the presence time ratio of sources that do not emerge from the background. Sound pressure level was recorded each second from the mobile phone's microphone during a 10-min period. The aim of this study is to propose indicators of urban sound quality based on linear regressions with perceptive variables. A cross-validation of the quality models extracted from the Paris data was carried out by conducting the same survey in Milan. The proposed general sound quality model is correlated with the real perceived sound quality (72%). Another model, without visual amenity and familiarity, is 58% correlated with perceived sound quality. In order to improve the sound quality indicator, a site classification was performed with Kohonen's artificial neural network algorithm, and seven class-specific models were developed. These specific models assign more importance to source events and are slightly closer to the individual data than the global model. In general, the Parisian models underestimate the sound quality of Milan environments as assessed by Italian people.
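The two-stage approach above (cluster sites with a Kohonen self-organizing map, then fit one linear sound-quality regression per class) can be sketched as below. This is not the study's code: the features, the synthetic quality ratings, and the tiny 1-D SOM are all assumptions made for illustration; only the choice of seven classes echoes the abstract.

```python
# Sketch: minimal 1-D Kohonen SOM for site classification, followed by a
# per-class linear regression of a synthetic "sound quality" rating.
import numpy as np

rng = np.random.default_rng(1)
# Invented per-site features: [loudness, traffic presence, bird presence].
sites = rng.normal(size=(200, 3)) + rng.choice([0.0, 3.0], size=(200, 1))

# Minimal 1-D SOM with 7 nodes (seven site classes, as in the study).
nodes = rng.normal(size=(7, 3))
for t in range(2000):
    x = sites[rng.integers(len(sites))]
    bmu = int(np.argmin(((nodes - x) ** 2).sum(axis=1)))  # best-matching unit
    lr = 0.5 * (1 - t / 2000)                             # decaying learning rate
    for j in range(7):
        h = np.exp(-((j - bmu) ** 2) / 2.0)               # neighborhood kernel
        nodes[j] += lr * h * (x - nodes[j])

labels = np.argmin(((sites[:, None, :] - nodes[None]) ** 2).sum(axis=2), axis=1)

# Per-class linear regression of an invented quality rating.
quality = 5.0 - 0.8 * sites[:, 1] + 0.5 * sites[:, 2] + rng.normal(scale=0.3, size=200)
coefs = {}
for c in np.unique(labels):
    m = labels == c
    A = np.column_stack([np.ones(m.sum()), sites[m]])
    coefs[c], *_ = np.linalg.lstsq(A, quality[m], rcond=None)
print(len(coefs))
```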
Guideline validation in multiple trauma care through business process modeling.
Stausberg, Jürgen; Bilir, Hüseyin; Waydhas, Christian; Ruchholtz, Steffen
2003-07-01
Clinical guidelines can improve the quality of care in multiple trauma. In our Department of Trauma Surgery a specific guideline is available paper-based as a set of flowcharts. This format is appropriate for the use by experienced physicians but insufficient for electronic support of learning, workflow and process optimization. A formal and logically consistent version represented with a standardized meta-model is necessary for automatic processing. In our project we transferred the paper-based into an electronic format and analyzed the structure with respect to formal errors. Several errors were detected in seven error categories. The errors were corrected to reach a formally and logically consistent process model. In a second step the clinical content of the guideline was revised interactively using a process-modeling tool. Our study reveals that guideline development should be assisted by process modeling tools, which check the content in comparison to a meta-model. The meta-model itself could support the domain experts in formulating their knowledge systematically. To assure sustainability of guideline development a representation independent of specific applications or specific provider is necessary. Then, clinical guidelines could be used for eLearning, process optimization and workflow management additionally.
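The formal errors mentioned above are the kind a process-modeling tool detects mechanically. The following is a hypothetical sketch of two such checks on a flowchart-style guideline, unreachable steps and terminal nodes; the node names are invented, not taken from the trauma guideline itself.

```python
# Sketch: structural checks on a flowchart-style guideline represented as a
# directed graph. Node names are invented placeholders.
from collections import deque

flowchart = {
    "start": ["assess_airway"],
    "assess_airway": ["stable", "intubate"],
    "intubate": ["stable"],
    "stable": [],               # terminal node (no outgoing steps)
    "orphan_step": ["stable"],  # formal error: unreachable from start
}

def reachable(graph, root):
    """Breadth-first search; returns all nodes reachable from root."""
    seen, queue = {root}, deque([root])
    while queue:
        for nxt in graph[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

seen = reachable(flowchart, "start")
unreachable = sorted(set(flowchart) - seen)
dead_ends = sorted(n for n in seen if not flowchart[n])
print(unreachable, dead_ends)
```

A meta-model adds further checks of this kind (e.g., every decision node has at least two outgoing branches), which is how formal and logical consistency can be verified automatically.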
The Shadow of Muhammad: Developing a Charismatic Leadership Model for the Islamic World
2002-06-01
This thesis examines leadership and a specific “type” of leader in the Islamic world; it is a work of synthesis in which a theory about one form of successful Islamic leadership is developed. [Thesis by Edward W. Kostrzebski; thesis advisor: Anna Simons; June 2002.]
A predictive model for biomimetic plate type broadband frequency sensor
NASA Astrophysics Data System (ADS)
Ahmed, Riaz U.; Banerjee, Sourav
2016-04-01
In this work, a predictive model for a bio-inspired broadband frequency sensor is developed. Broadband frequency sensing is essential in many domains of science and technology. A prime example of such a sensor is the human cochlea, which senses a frequency band of 20 Hz to 20 kHz. Developing broadband sensors that adopt the physics of the human cochlea has attracted tremendous interest in recent years. Although a few experimental studies have been reported, a true predictive model to design such sensors is missing. A predictive model is essential for the accurate design of selective broadband sensors capable of sensing a very selective band of frequencies. Hence, in this study, we propose a novel predictive model for the cochlea-inspired broadband sensor, aiming to select the frequency band and model parameters predictively. A tapered plate geometry is considered, mimicking the real shape of the basilar membrane in the human cochlea. The predictive model is designed to be flexible enough to be employed in a wide variety of scientific domains. To that end, it can handle not only homogeneous but also functionally graded model parameters, and it can manage various types of boundary conditions. It has been found that, using homogeneous model parameters, it is possible to sense a specific frequency band from a specific portion (B) of the model length (L). It is also possible to alter the attributes of `B' using functionally graded model parameters, which confirms the predictive frequency-selection ability of the developed model.
Lee, Chu-Hee; Landham, Priyan R; Eastell, Richard; Adams, Michael A; Dolan, Patricia; Yang, Lang
2017-09-01
Finite element models of an isolated vertebral body cannot accurately predict the compressive strength of the spinal column because, in life, compressive load is variably distributed across the vertebral body and neural arch. The purpose of this study was to develop and validate a patient-specific finite element model of a functional spinal unit, and then use the model to predict vertebral strength from medical images. A total of 16 cadaveric functional spinal units were scanned and then tested mechanically in bending and compression to generate a vertebral wedge fracture. Before testing, an image processing and finite element analysis framework (SpineVox-Pro), developed previously in MATLAB using ANSYS APDL, was used to generate a subject-specific finite element model with eight-node hexahedral elements. Transversely isotropic linear-elastic material properties were assigned to vertebrae, and simple homogeneous linear-elastic properties were assigned to the intervertebral disc. Forward bending loading conditions were applied to simulate manual handling. Results showed that vertebral strengths measured by experiment were positively correlated with strengths predicted by the functional spinal unit finite element model with von Mises or Drucker-Prager failure criteria (R2 = 0.80-0.87), with areal bone mineral density measured by dual-energy X-ray absorptiometry (R2 = 0.54), and with volumetric bone mineral density from quantitative computed tomography (R2 = 0.79). Large-displacement non-linear analyses on all specimens did not improve predictions. We conclude that subject-specific finite element models of a functional spinal unit have the potential to estimate vertebral strength better than bone mineral density alone.
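A von Mises failure criterion, as used in the strength predictions above, reduces a full stress tensor to a single equivalent stress that is compared with a strength limit. The sketch below illustrates that calculation; the stress tensor and the strength value are invented, not taken from the study.

```python
# Sketch: von Mises equivalent stress from a 3x3 Cauchy stress tensor,
# compared against an assumed strength limit. Values are illustrative.
import numpy as np

def von_mises(s):
    """Von Mises equivalent stress (same units as the input tensor)."""
    dev = s - np.trace(s) / 3.0 * np.eye(3)   # deviatoric part
    return float(np.sqrt(1.5 * np.sum(dev * dev)))

stress = np.array([[40.0, 5.0, 0.0],
                   [5.0, -10.0, 0.0],
                   [0.0, 0.0, 12.0]])          # MPa, invented
sigma_eq = von_mises(stress)
yield_strength = 45.0                          # MPa, assumed limit
failed = sigma_eq >= yield_strength
print(round(sigma_eq, 2), failed)
```

In an FE strength prediction, this check is applied element by element, and the load at which a chosen fraction of elements exceeds the criterion defines the predicted strength.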
Development of the NASA Digital Astronaut Project Muscle Model
NASA Technical Reports Server (NTRS)
Lewandowski, Beth E.; Pennline, James A.; Thompson, W. K.; Humphreys, B. T.; Ryder, J. W.; Ploutz-Snyder, L. L.; Mulugeta, L.
2015-01-01
This abstract describes development work performed on the NASA Digital Astronaut Project Muscle Model. Muscle atrophy is a known physiological response to exposure to a low gravity environment. The DAP muscle model computationally predicts the change in muscle structure and function vs. time in a reduced gravity environment. The spaceflight muscle model can then be used in biomechanical models of exercise countermeasures and spaceflight tasks to: 1) develop site specific bone loading input to the DAP bone adaptation model over the course of a mission; 2) predict astronaut performance of spaceflight tasks; 3) inform effectiveness of new exercise countermeasures concepts.
NASA Technical Reports Server (NTRS)
Briggs, Maxwell H.
2011-01-01
The Fission Power System (FPS) project is developing a Technology Demonstration Unit (TDU) to verify the performance and functionality of a subscale version of the FPS reference concept in a relevant environment, and to verify component and system models. As hardware is developed for the TDU, component and system models must be refined to include the details of specific component designs. This paper describes the development of a Sage-based pseudo-steady-state Stirling convertor model and its implementation into a system-level model of the TDU.
2013-01-01
In this paper, we develop and validate a method to identify computationally efficient site- and patient-specific models of ultrasound thermal therapies from MR thermal images. The models of the specific absorption rate of the transduced energy and the temperature response of the therapy target are identified in the reduced basis of proper orthogonal decomposition of thermal images, acquired in response to a mild thermal test excitation. The method permits dynamic reidentification of the treatment models during the therapy by recursively utilizing newly acquired images. Such adaptation is particularly important during high-temperature therapies, which are known to substantially and rapidly change tissue properties and blood perfusion. The developed theory was validated for the case of focused ultrasound heating of a tissue phantom. The experimental and computational results indicate that the developed approach produces accurate low-dimensional treatment models despite temporal and spatial noises in MR images and slow image acquisition rate. PMID:22531754
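The core step described above, projecting thermal images onto a reduced basis obtained by proper orthogonal decomposition, can be sketched via the SVD of a snapshot matrix. The data here are synthetic (two invented spatial modes plus noise), not MR images.

```python
# Sketch: proper orthogonal decomposition (POD) of a stack of images via
# SVD, keeping the modes that capture 99% of the energy. Synthetic data.
import numpy as np

rng = np.random.default_rng(2)
nx, ny, nt = 32, 32, 50
x = np.linspace(-1, 1, nx)[:, None]
y = np.linspace(-1, 1, ny)[None, :]
mode1 = np.exp(-(x**2 + y**2) / 0.1)            # focal heating spot
mode2 = np.exp(-((x - 0.4)**2 + y**2) / 0.2)    # secondary diffusion lobe

# Snapshot matrix: each column is one vectorized image in time.
t = np.linspace(0, 1, nt)
snapshots = (np.outer(mode1.ravel(), 1 - np.exp(-5 * t))
             + 0.3 * np.outer(mode2.ravel(), t)
             + 0.01 * rng.normal(size=(nx * ny, nt)))

U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.99) + 1)      # modes capturing 99% energy
reduced = U[:, :r].T @ snapshots                # r-dimensional trajectories
print(r, reduced.shape)
```

Recursive reidentification during therapy amounts to appending newly acquired images as columns and recomputing (or updating) this decomposition, so the treatment model tracks changing tissue properties.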
Analytical determination of critical crack size in solar cells
NASA Technical Reports Server (NTRS)
Chen, C. P.
1988-01-01
Although solar cells usually have chips and cracks, no material specifications concerning the allowable crack size on solar cells are available for quality assurance and engineering design usage. Any material specifications that the cell manufacturers use were developed for cosmetic reasons that have no technical basis. Therefore, the Applied Solar Energy Corporation (ASEC) has sponsored a continuing program for the fracture mechanics evaluation of GaAs. Fracture mechanics concepts were utilized to develop an analytical model that can predict the critical crack size of solar cells. This model indicates that the edge cracks of a solar cell are more critical than its surface cracks. In addition, the model suggests that the material specifications on the allowable crack size used for Si solar cells should not be applied to GaAs solar cells. The analytical model was applied to Si and GaAs solar cells, but it would also be applicable to the semiconductor wafers of other materials, such as a GaAs thin film on a Ge substrate, using appropriate input data.
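The fracture mechanics relation behind such a model is standard: a crack becomes critical when the stress intensity factor K = Y·σ·sqrt(π·a) reaches the fracture toughness K_Ic, giving a_c = (K_Ic/(Y·σ))²/π. The sketch below uses representative geometry factors and assumed toughness and stress values for illustration, not the paper's data.

```python
# Sketch: critical crack size from linear elastic fracture mechanics.
# K_Ic values and the applied stress are assumed, representative numbers.
import math

def critical_crack_size(k_ic, stress, geometry_factor):
    """a_c in meters, from K_Ic (MPa*sqrt(m)) and applied stress (MPa)."""
    return (k_ic / (geometry_factor * stress)) ** 2 / math.pi

stress = 30.0                    # assumed applied tensile stress, MPa
Y_edge, Y_surface = 1.12, 0.71   # common edge-crack vs. surface-flaw factors
for name, k_ic in [("Si", 0.9), ("GaAs", 0.44)]:  # representative K_Ic values
    a_edge = critical_crack_size(k_ic, stress, Y_edge)
    a_surf = critical_crack_size(k_ic, stress, Y_surface)
    print(name, round(a_edge * 1e6, 1), round(a_surf * 1e6, 1))  # micrometers
```

Because Y is larger for edge cracks, their critical size is smaller, and GaAs's lower toughness yields smaller critical cracks than Si: both trends match the abstract's conclusions.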
Morelato, Marie; Baechler, Simon; Ribaux, Olivier; Beavis, Alison; Tahtouh, Mark; Kirkbride, Paul; Roux, Claude; Margot, Pierre
2014-03-01
Forensic intelligence is a distinct dimension of forensic science. Forensic intelligence processes have mostly been developed to address either a specific type of trace or a specific problem. Even though these empirical developments have led to successes, they are trace-specific in nature and contribute to the generation of silos which hamper the establishment of a more general and transversal model. Forensic intelligence has shown some important perspectives but more general developments are required to address persistent challenges. This will ensure the progress of the discipline as well as its widespread implementation in the future. This paper demonstrates that the description of forensic intelligence processes, their architectures, and the methods for building them can, at a certain level, be abstracted from the type of traces considered. A comparative analysis is made between two forensic intelligence approaches developed independently in Australia and in Europe regarding the monitoring of apparently very different kind of problems: illicit drugs and false identity documents. An inductive effort is pursued to identify similarities and to outline a general model. Besides breaking barriers between apparently separate fields of study in forensic science and intelligence, this transversal model would assist in defining forensic intelligence, its role and place in policing, and in identifying its contributions and limitations. The model will facilitate the paradigm shift from the current case-by-case reactive attitude towards a proactive approach by serving as a guideline for the use of forensic case data in an intelligence-led perspective. A follow-up article will specifically address issues related to comparison processes, decision points and organisational issues regarding forensic intelligence (part II). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Norris Reinero, Carol R; Decile, Kendra C; Berghaus, Roy D; Williams, Kurt J; Leutenegger, Christian M; Walby, William F; Schelegle, Edward S; Hyde, Dallas M; Gershwin, Laurel J
2004-10-01
Animal models are used to mimic human asthma, however, not all models replicate the major characteristics of the human disease. Spontaneous development of asthma with hallmark features similar to humans has been documented to occur with relative frequency in only one animal species, the cat. We hypothesized that we could develop an experimental model of feline asthma using clinically relevant aeroallergens identified from cases of naturally developing feline asthma, and characterize immunologic, physiologic, and pathologic changes over 1 year. House dust mite (HDMA) and Bermuda grass (BGA) allergen were selected by screening 10 privately owned pet cats with spontaneous asthma using a serum allergen-specific IgE ELISA. Parenteral sensitization and aerosol challenges were used to replicate the naturally developing disease in research cats. The asthmatic phenotype was characterized using intradermal skin testing, serum allergen-specific IgE ELISA, serum and bronchoalveolar lavage fluid (BALF) IgG and IgA ELISAs, airway hyperresponsiveness testing, BALF cytology, cytokine profiles using TaqMan PCR, and histopathologic evaluation. Sensitization with HDMA or BGA in cats led to allergen-specific IgE production, allergen-specific serum and BALF IgG and IgA production, airway hyperreactivity, airway eosinophilia, an acute T helper 2 cytokine profile in peripheral blood mononuclear cells and BALF cells, and histologic evidence of airway remodeling. Using clinically relevant aeroallergens to sensitize and challenge the cat provides an additional animal model to study the immunopathophysiologic mechanisms of allergic asthma. Chronic exposure to allergen in the cat leads to a variety of immunologic, physiologic, and pathologic changes that mimic the features seen in human asthma.
Generic solar photovoltaic system dynamic simulation model specification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ellis, Abraham; Behnke, Michael Robert; Elliott, Ryan Thomas
This document is intended to serve as a specification for generic solar photovoltaic (PV) system positive-sequence dynamic models to be implemented by software developers and approved by the WECC MVWG for use in bulk system dynamic simulations in accordance with NERC MOD standards. Two specific dynamic models are included in the scope of this document. The first, a Central Station PV System model, is intended to capture the most important dynamic characteristics of large scale (> 10 MW) PV systems with a central Point of Interconnection (POI) at the transmission level. The second, a Distributed PV System model, is intended to represent an aggregation of smaller, distribution-connected systems that comprise a portion of a composite load that might be modeled at a transmission load bus.
Development of the Patient-specific Cardiovascular Modeling System Using Immersed Boundary Technique
NASA Astrophysics Data System (ADS)
Tay, Wee-Beng; Lin, Liang-Yu; Tseng, Wen-Yih; Tseng, Yu-Heng
2010-05-01
A computational fluid dynamics (CFD) based, patient-specific cardiovascular modeling system is under development. The system can identify possible diseased conditions and facilitate physicians' diagnosis at an early stage through hybrid CFD simulation and time-resolved magnetic resonance imaging (MRI). The CFD simulation is initially based on the three-dimensional heart model developed by McQueen and Peskin, which can simultaneously compute fluid motions and elastic boundary motions using the immersed boundary method. We extend and improve the three-dimensional heart model for clinical application by including patient-specific hemodynamic information. The flow features in the ventricles and their responses are investigated under different inflow and outflow conditions during the diastole and systole phases based on the quasi-realistic heart model, which takes advantage of the observed flow scenarios. Our results indicate distinct differences between the two groups of participants, including the vortex formation process in the left ventricle (LV), as well as the flow rate distributions at different identified sources such as the aorta, vena cava and pulmonary veins/artery. We further identify some key parameters which may affect vortex formation in the LV. Thus it is hypothesized that disease-related dysfunctions in the intervals before complete heart failure can be observed in the dynamics of transmitral blood flow during early LV diastole.
Novel In Vivo Model for Combinatorial Fluorescence Labeling in Mouse Prostate
Fang, Xiaolan; Gyabaah, Kenneth; Nickkholgh, Bita; Cline, J. Mark; Balaji, K.C.
2015-01-01
BACKGROUND The epithelial layer of prostate glands contains several types of cells, including luminal and basal cells. Yet there is paucity of animal models to study the cellular origin of normal or neoplastic development in the prostate to facilitate the treatment of heterogenous prostate diseases by targeting individual cell lineages. METHODS We developed a mouse model that expresses different types of fluorescent proteins (XFPs) specifically in prostatic cells. Using an in vivo stochastic fluorescent protein combinatorial strategy, XFP signals were expressed specifically in prostate of Protein Kinase D1 (PKD1) knock-out, K-RasG12D knock-in, and Phosphatase and tensin homolog (PTEN) and PKD1 double knock-out mice under the control of PB-Cre promoter. RESULTS In vivo XFP signals were observed in prostate of PKD1 knock-out, K-RasG12D knock-in, and PTEN PKD1 double knock-out mice, which developed normal, hyperplastic, and neoplastic prostate, respectively. The patchy expression pattern of XFPs in neoplasia tissue indicated the clonal origin of cancer cells in the prostate. CONCLUSIONS The transgenic mouse models demonstrate combinatorial fluorescent protein expression in normal and cancerous prostatic tissues. This novel prostate-specific fluorescent labeled mouse model, which we named Prorainbow, could be useful in studying benign and malignant pathology of prostate. PMID:25753731
Curado, Silvia; Ober, Elke A.; Walsh, Susan; Cortes-Hernandez, Paulina; Verkade, Heather; Koehler, Carla M.; Stainier, Didier Y. R.
2010-01-01
SUMMARY Understanding liver development should lead to greater insights into liver diseases and improve therapeutic strategies. In a forward genetic screen for genes regulating liver development in zebrafish, we identified a mutant – oliver – that exhibits liver-specific defects. In oliver mutants, the liver is specified, bile ducts form and hepatocytes differentiate. However, the hepatocytes die shortly after their differentiation, and thus the resulting mutant liver consists mainly of biliary tissue. We identified a mutation in the gene encoding translocase of the outer mitochondrial membrane 22 (Tomm22) as responsible for this phenotype. Mutations in tomm genes have been associated with mitochondrial dysfunction, but most studies on the effect of defective mitochondrial protein translocation have been carried out in cultured cells or unicellular organisms. Therefore, the tomm22 mutant represents an important vertebrate genetic model to study mitochondrial biology and hepatic mitochondrial diseases. We further found that the temporary knockdown of Tomm22 levels by morpholino antisense oligonucleotides causes a specific hepatocyte degeneration phenotype that is reversible: new hepatocytes repopulate the liver as Tomm22 recovers to wild-type levels. The specificity and reversibility of hepatocyte ablation after temporary knockdown of Tomm22 provides an additional model to study liver regeneration, under conditions where most hepatocytes have died. We used this regeneration model to analyze the signaling commonalities between hepatocyte development and regeneration. PMID:20483998
Miao, Jinxin; Ying, Baoling; Li, Rong; Tollefson, Ann E; Spencer, Jacqueline F; Wold, William S M; Song, Seok-Hwan; Kong, Il-Keun; Toth, Karoly; Wang, Yaohe; Wang, Zhongde
2018-05-06
The accumulating evidence demonstrates that Syrian hamsters have advantages as models for various diseases. To develop a Syrian hamster (Mesocricetus auratus) model of human immunodeficiency caused by RAG1 gene mutations, we employed the CRISPR/Cas9 system and introduced an 86-nucleotide frameshift deletion in the hamster RAG1 gene encoding part of the N-terminal non-core domain of RAG1. Histological and immunohistochemical analyses demonstrated that these hamsters (referred to herein as RAG1-86nt hamsters) had atrophic spleens and thymuses, developed significantly less white pulp, and were almost completely devoid of splenic lymphoid follicles. The RAG1-86nt hamsters had barely detectable CD3⁺ and CD4⁺ T cells. The expression of T cell-specific genes (CD3γ and CD4) and B cell-specific genes (CD22 and FCMR) was dramatically reduced, whereas the expression of macrophage-specific (CD68) and natural killer (NK) cell-specific (CD94 and KLRG1) marker genes was increased in the spleen of RAG1-86nt hamsters compared to wild-type hamsters. Interestingly, despite the impaired development of B and T lymphocytes, the RAG1-86nt hamsters still developed neutralizing antibodies against human adenovirus type C6 (HAdV-C6) upon intranasal infection and were capable of clearing the infectious viruses, albeit with slower kinetics. Therefore, the RAG1-86nt hamster reported herein (similar to the hypomorphic RAG1 mutations in humans that cause Omenn syndrome) may provide a useful model for studying the pathogenesis of specific RAG1-mutation-induced human immunodeficiency and the host immune response to adenovirus infection and other pathogens, as well as for the evaluation of cell and gene therapies for treatment of this subset of RAG1 mutation patients.
Sex-specific habitat suitability models for Panthera tigris in Chitwan National Park, Nepal
NASA Astrophysics Data System (ADS)
Battle, Curtis Scott
Although research on wildlife species across taxa has shown that males and females differentially select habitat, sex-specific models of habitat suitability for endangered species are uncommon. Here, we developed such models for Bengal Tigers (Panthera tigris) based on camera trap data collected from 20 January to 22 March, 2010, within Chitwan National Park, Nepal, and its buffer zone. We compared these to a sex-indiscriminate habitat suitability model in order to identify information that is lost when occurrence data for both sexes are included in the same model, as well as to assess the benefits of a sex-specific approach to habitat suitability modelling. Our sex-specific models allowed us to produce more informative and detailed habitat suitability maps, highlighting key differences in the distribution of suitable habitats for males and females, preferences in vegetation structure, and habitat use near human settlements. In the context of global tiger conservation, such information is essential to fulfilling established conservation goals and population recovery targets.
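The comparison above, sex-specific versus sex-indiscriminate habitat models, can be sketched with logistic occurrence models. This is a toy reconstruction: the covariates, simulated preferences, and sample sizes are invented stand-ins, not the camera-trap data.

```python
# Sketch: fit separate male/female habitat suitability models and a pooled
# model; sex-specific coefficients diverge where pooling averages them out.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 600
# Invented covariates: [vegetation density, distance to settlement].
X = rng.uniform(0, 1, size=(n, 2))
sex = rng.integers(0, 2, size=n)  # 0 = female, 1 = male

# Assumed preferences: females avoid settlements strongly; males less so.
logit = np.where(sex == 0,
                 3.0 * X[:, 0] - 3.0 * X[:, 1],
                 1.0 * X[:, 0] - 0.5 * X[:, 1]) - 0.5
present = rng.random(n) < 1 / (1 + np.exp(-logit))

pooled = LogisticRegression(max_iter=1000).fit(X, present)
female = LogisticRegression(max_iter=1000).fit(X[sex == 0], present[sex == 0])
male = LogisticRegression(max_iter=1000).fit(X[sex == 1], present[sex == 1])

print(np.round(pooled.coef_, 2), np.round(female.coef_, 2), np.round(male.coef_, 2))
```

Mapping each model's predicted probability over a habitat raster yields the sex-specific suitability maps the study describes; the pooled model blurs exactly the settlement-avoidance contrast built into this simulation.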
Specific acoustic models for spontaneous and dictated style in indonesian speech recognition
NASA Astrophysics Data System (ADS)
Vista, C. B.; Satriawan, C. H.; Lestari, D. P.; Widyantoro, D. H.
2018-03-01
The performance of an automatic speech recognition system is affected by differences in speech style between the data the model was originally trained on and the incoming speech to be recognized. In this paper, the use of GMM-HMM acoustic models for specific speech styles is investigated. We develop two systems for the experiments: the first employs a speech style classifier to predict the speech style of incoming speech, either spontaneous or dictated, and then decodes this speech using an acoustic model specifically trained for that style. The second system uses both acoustic models to recognize the incoming speech and decides on a final result by comparing the confidence scores of the two decodings. Results show that training specific acoustic models for spontaneous and dictated speech styles confers a slight recognition advantage compared to a baseline model trained on a mixture of spontaneous and dictated training data. In addition, the speech style classifier approach of the first system produced slightly more accurate results than the confidence scoring employed in the second system.
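The second system's decision rule reduces to picking whichever style-specific decoding is more confident. A minimal sketch, assuming a hypothetical decoder interface that returns a (transcript, confidence) pair per style (the paper does not specify its confidence formula):

```python
# Sketch of confidence-based selection between two style-specific
# decoders (hypothetical interface; not the paper's actual system).

def select_transcript(hypotheses):
    """Pick the decoding result with the highest confidence score.

    `hypotheses` maps a style label ("spontaneous" / "dictated")
    to a (transcript, confidence) pair produced by the acoustic
    model trained for that style.
    """
    best_style = max(hypotheses, key=lambda s: hypotheses[s][1])
    transcript, confidence = hypotheses[best_style]
    return best_style, transcript, confidence

# Example: the dictated-style model is more confident here,
# so its transcript is chosen.
style, text, conf = select_transcript({
    "spontaneous": ("saya pergi ke pasar", -42.7),
    "dictated": ("saya pergi ke pasar.", -35.1),
})
```

In the full system, each confidence would come from the decoder's own scoring (e.g., a normalized acoustic log-likelihood).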
A three-dimensional inverse finite element analysis of the heel pad.
Chokhandre, Snehal; Halloran, Jason P; van den Bogert, Antonie J; Erdemir, Ahmet
2012-03-01
Quantification of the plantar tissue behavior of the heel pad is essential in developing computational models for predictive analysis of preventive treatment options, such as footwear for patients with diabetes. Simulation-based studies in the past have generally adopted heel pad properties from the literature, in effect pairing heel-specific geometry with material properties from a different heel. In exceptional cases, patient-specific material characterization was performed with simplified two-dimensional models, without further evaluation of the heel-specific response under different loading conditions. The aim of this study was to conduct an inverse finite element analysis of the heel in order to calculate heel-specific material properties in situ. Multidimensional experimental data available from a previous cadaver study by Erdemir et al. ("An Elaborate Data Set Characterizing the Mechanical Response of the Foot," ASME J. Biomech. Eng., 131(9), pp. 094502) was used for model development, optimization, and evaluation of material properties. A specimen-specific three-dimensional finite element representation was developed. Heel pad material properties were determined using inverse finite element analysis by fitting the model behavior to the experimental data. Compression-dominant loading, applied using a spherical indenter, was used for optimization of the material properties. The optimized material properties were evaluated through simulations representative of a combined loading scenario (compression and anterior-posterior shear) with a spherical indenter and also of a compression-dominant loading applied using an elevated platform. Optimized heel pad material coefficients were 0.001084 MPa (μ) and 9.780 (α) (with an effective Poisson's ratio (ν) of 0.475) for a first-order nearly incompressible Ogden material model.
The model-predicted structural response of the heel pad was in good agreement for both the optimization (<1.05% maximum tool force, 0.9% maximum tool displacement) and validation cases (6.5% maximum tool force, 15% maximum tool displacement). The inverse analysis successfully predicted the material properties for the given specimen-specific heel pad using the experimental data for that specimen. The modeling framework and results can be used for accurate predictions of the three-dimensional interaction of the heel pad with its surroundings.
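For an incompressible first-order Ogden solid in uniaxial loading, the Cauchy stress follows directly from the reported coefficients, σ = (2μ/α)(λ^α − λ^(−α/2)). A minimal sketch using the study's optimized values (and assuming full rather than near incompressibility):

```python
# Uniaxial Cauchy stress for a first-order Ogden material:
#   sigma = (2*mu/alpha) * (lam**alpha - lam**(-alpha/2))
# This assumes full incompressibility; the study actually used a
# *nearly* incompressible formulation with an effective nu of 0.475.

MU = 0.001084   # MPa, optimized heel pad coefficient from the study
ALPHA = 9.780   # dimensionless Ogden exponent from the study

def ogden_uniaxial_stress(lam, mu=MU, alpha=ALPHA):
    """Cauchy stress (MPa) at uniaxial stretch `lam` (lam = 1 is undeformed)."""
    return (2.0 * mu / alpha) * (lam ** alpha - lam ** (-alpha / 2.0))
```

At λ = 1 the stress vanishes, and the steep exponent α ≈ 9.8 produces the strong stiffening typical of plantar soft tissue.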
Photochemical Phenomenology Model for the New Millennium
NASA Technical Reports Server (NTRS)
Bishop, James; Evans, J. Scott
2000-01-01
This project tackles the problem of conversion of validated a priori physics-based modeling capabilities, specifically those relevant to the analysis and interpretation of planetary atmosphere observations, to application-oriented software for use in science and science-support activities. The software package under development, named the Photochemical Phenomenology Modeling Tool (PPMT), has particular focus on the atmospheric remote sensing data to be acquired by the CIRS instrument during the CASSINI Jupiter flyby and orbital tour of the Saturnian system. Overall, the project has followed the development outline given in the original proposal, and the Year 1 design and architecture goals have been met. Specific accomplishments and the difficulties encountered are summarized in this report. Most of the effort has gone into complete definition of the PPMT interfaces within the context of today's IT arena: adoption and adherence to the CORBA Component Model (CCM) has yielded a solid architecture basis, and CORBA-related issues (services, specification options, development plans, etc.) have been largely resolved. Implementation goals have been redirected somewhat so as to be more relevant to the upcoming CASSINI flyby of Jupiter, with focus now being more on data analysis and remote sensing retrieval applications.
2014-01-01
Spatial heterogeneity in the incidence of visceral leishmaniasis (VL) is an important aspect to be considered in planning control actions for the disease. The objective of this study was to predict areas at high risk for VL based on socioeconomic indicators and remote sensing data. We applied classification and regression trees to develop and validate prediction models. Performance of the models was assessed by means of sensitivity, specificity, and area under the ROC curve. The model developed was able to discriminate 15 subsets of census tracts (CTs) with different probabilities of containing CTs at high risk of VL occurrence. In the validation and learning samples, respectively, the model showed sensitivity of 79% and 52%, specificity of 75% and 66%, and area under the ROC curve of 83% and 66%. Considering the complex network of factors involved in the occurrence of VL in urban areas, the results of this study show that developing a predictive model for VL may be feasible and useful for guiding interventions against the disease, but it remains a challenge, as demonstrated by the unsatisfactory predictive performance of the model developed. PMID:24885128
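The reported validation metrics can be reproduced from a model's predictions with a few lines; the sketch below is a generic implementation (the AUC uses the rank-based Mann-Whitney estimate), not the authors' code:

```python
# Generic sketch of the three performance measures used in the study:
# sensitivity and specificity from confusion counts, plus a rank-based
# (Mann-Whitney) estimate of the area under the ROC curve.

def sensitivity(tp, fn):
    """Fraction of true positives correctly identified."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of true negatives correctly identified."""
    return tn / (tn + fp)

def auc(scores_pos, scores_neg):
    """Probability that a random positive scores above a random negative."""
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in scores_pos for n in scores_neg
    )
    return wins / (len(scores_pos) * len(scores_neg))
```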
Bottomside Ionospheric Electron Density Specification using Passive High Frequency Signals
NASA Astrophysics Data System (ADS)
Kaeppler, S. R.; Cosgrove, R. B.; Mackay, C.; Varney, R. H.; Kendall, E. A.; Nicolls, M. J.
2016-12-01
The vertical bottomside electron density profile is influenced by a variety of natural sources, most notably traveling ionospheric disturbances (TIDs). These disturbances move plasma up or down along the local geomagnetic field and can strongly impact the propagation of high frequency radio waves. While the basic physics of these perturbations has been well studied, practical bottomside models are not well developed. We present initial results from an assimilative bottomside ionosphere model. This model uses empirical orthogonal functions based on the International Reference Ionosphere (IRI) to develop a vertical electron density profile, and features a built-in HF ray tracing function. This parameterized model is then perturbed to represent electron density perturbations associated with TIDs or ionospheric gradients. Using the ray tracing feature, the model assimilates angle-of-arrival measurements from passive HF transmitters. We demonstrate the effectiveness of the model using angle-of-arrival data. Modeling results of bottomside electron density specification are compared against suitable ancillary observations to quantify the accuracy of our model.
NASA Technical Reports Server (NTRS)
Hoffler, Keith D.; Fears, Scott P.; Carzoo, Susan W.
1997-01-01
A generic airplane model concept was developed to allow configurations with various agility, performance, handling qualities, and pilot-vehicle interfaces to be generated rapidly for piloted simulation studies. The simple concept allows stick shaping and various stick command types or modes to drive an airplane with both linear and nonlinear components. Output from the stick shaping goes to linear models, or a series of linear models, that can represent an entire flight envelope. The generic model also has provisions for control power limitations, a nonlinear feature; therefore, departures from controlled flight are possible. Note that only loss of control is modeled; the generic airplane does not accurately model post-departure phenomena. The model concept is presented herein, along with four example airplanes. Agility was varied across the four example airplanes without altering specific excess energy or significantly altering handling qualities. A new feedback scheme to provide angle-of-attack cueing to the pilot, while using a pitch rate command system, was implemented and tested.
Conceptualizing a model: a report of the ISPOR-SMDM Modeling Good Research Practices Task Force-2.
Roberts, Mark; Russell, Louise B; Paltiel, A David; Chambers, Michael; McEwan, Phil; Krahn, Murray
2012-01-01
The appropriate development of a model begins with understanding the problem that is being represented. The aim of this article is to provide a series of consensus-based best practices regarding the process of model conceptualization. For the purpose of this series of papers, the authors consider the development of models whose purpose is to inform medical decisions and health-related resource allocation questions. They specifically divide the conceptualization process into two distinct components: the conceptualization of the problem, which converts knowledge of the health care process or decision into a representation of the problem, followed by the conceptualization of the model itself, which matches the attributes and characteristics of a particular modeling type to the needs of the problem being represented. Recommendations are made regarding the structure of the modeling team, agreement on the statement of the problem, the structure, perspective and target population of the model, and the interventions and outcomes represented. Best practices relating to the specific characteristics of model structure, and which characteristics of the problem might be most easily represented in a specific modeling method, are presented. Each section contains a number of recommendations that were iterated among the authors, as well as the wider modeling taskforce, jointly set up by the International Society for Pharmacoeconomics and Outcomes Research and the Society for Medical Decision Making.
Microscopic model of road capacity for highway systems in port based metropolitan areas.
DOT National Transportation Integrated Search
2011-03-01
In this report, we present our approach to use microscopic modeling to assess highway traffic mobility during lane blockage situations. A test microscopic model using ARENA software is developed. In this model, we specifically aim to simulate the ...
Nondestructive pavement evaluation using ILLI-PAVE based artificial neural network models.
DOT National Transportation Integrated Search
2008-09-01
The overall objective in this research project is to develop advanced pavement structural analysis models for more accurate solutions with fast computation schemes. Soft computing and modeling approaches, specifically the Artificial Neural Network (A...
Space Generic Open Avionics Architecture (SGOAA) standard specification
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1994-01-01
This standard establishes the Space Generic Open Avionics Architecture (SGOAA). The SGOAA includes a generic functional model, a processing structural model, and an architecture interface model. This standard defines the requirements for applying these models to the development of spacecraft core avionics systems. The purpose of this standard is to provide an umbrella set of requirements for applying the generic architecture models to the design of a specific avionics hardware/software processing system. This standard defines a generic set of system interface points to facilitate identification of critical services and interfaces. It establishes the requirement for applying appropriate low-level detailed implementation standards to those interface points. The generic core avionics functions and processing structural models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.
Nagy, Christopher J; Fitzgerald, Brian M; Kraus, Gregory P
2014-01-01
Anesthesiology residency programs will be expected to have Milestones-based evaluation systems in place by July 2014 as part of the Next Accreditation System. The San Antonio Uniformed Services Health Education Consortium (SAUSHEC) anesthesiology residency program developed and implemented a Milestones-based feedback and evaluation system a year ahead of schedule. It has been named the Milestone-specific, Observed Data points for Evaluating Levels of performance (MODEL) assessment strategy. The "MODEL Menu" and the "MODEL Blueprint" are tools that other anesthesiology residency programs can use in developing their own Milestones-based feedback and evaluation systems prior to ACGME-required implementation. Data from our early experience with the streamlined MODEL blueprint assessment strategy showed substantially improved faculty compliance with reporting requirements. The MODEL assessment strategy provides programs with a workable assessment method for residents, and important Milestones data points to programs for ACGME reporting.
A Model-Driven Development Method for Management Information Systems
NASA Astrophysics Data System (ADS)
Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki
Traditionally, a Management Information System (MIS) has been developed without using formal methods. With these informal methods, the MIS is developed over its lifecycle without any models, which causes many problems, such as a lack of reliability in system design specifications. In order to overcome these problems, a model theory approach was proposed. The approach is based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, there is a model-driven development method that can flexibly accommodate changes in business logic or implementation technologies. In model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies the model-driven development method to a component of the model theory approach. The experiment showed that the proposed method reduced development effort by more than 30%.
NASA Technical Reports Server (NTRS)
Bekey, G. A.
1971-01-01
Studies are summarized on the application of advanced analytical and computational methods to the development of mathematical models of human controllers in multiaxis manual control systems. Specific accomplishments include the following: (1) The development of analytical and computer methods for the measurement of random parameters in linear models of human operators. (2) Discrete models of human operator behavior in a multiple display situation were developed. (3) Sensitivity techniques were developed which make possible the identification of unknown sampling intervals in linear systems. (4) The adaptive behavior of human operators following particular classes of vehicle failures was studied and a model structure proposed.
Strategies for developing competency models.
Marrelli, Anne F; Tondora, Janis; Hoge, Michael A
2005-01-01
There is an emerging trend within healthcare to introduce competency-based approaches in the training, assessment, and development of the workforce. The trend is evident in various disciplines and specialty areas within the field of behavioral health. This article is designed to inform those efforts by presenting a step-by-step process for developing a competency model. An introductory overview of competencies, competency models, and the legal implications of competency development is followed by a description of the seven steps involved in creating a competency model for a specific function, role, or position. This modeling process is drawn from advanced work on competencies in business and industry.
A model of urban rational growth based on grey prediction
NASA Astrophysics Data System (ADS)
Xiao, Wenjing
2017-04-01
Smart growth focuses on building sustainable cities, using compact development to prevent urban sprawl. This paper establishes a series of models to implement smart growth theories in city design, and two specific city design cases are presented. First, we establish a Smart Growth Measure Model to gauge the success of a city's smart growth, using the Full Permutation Polygon Synthetic Indicator Method to calculate a Comprehensive Indicator (CI) that measures this success. Second, we use the principles of smart growth to develop a new growth plan for two cities: we establish an optimization model to maximize the CI value and solve it with the Particle Swarm Optimization (PSO) algorithm. Combining the calculation results with the specific circumstances of each city, we develop a smart growth plan for each.
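Grey prediction in this context typically refers to the GM(1,1) model. A textbook sketch under that assumption (the paper's indicator series and its coupling with PSO are not reproduced here):

```python
# Textbook GM(1,1) grey prediction sketch (generic formulation;
# not the paper's exact variables or data).
import math

def gm11_forecast(x0, steps=1):
    """Fit GM(1,1) to the series x0 and forecast `steps` values ahead."""
    n = len(x0)
    # 1-AGO: accumulated generating operation
    x1 = [sum(x0[:k + 1]) for k in range(n)]
    # background values z1(k) = mean of consecutive x1 entries
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]
    # least-squares solution of x0(k) = -a*z1(k) + b
    sz, szz = sum(z), sum(v * v for v in z)
    sy, szy = sum(x0[1:]), sum(v * y for v, y in zip(z, x0[1:]))
    m = n - 1
    denom = m * szz - sz * sz
    a = (sz * sy - m * szy) / denom
    b = (szz * sy - sz * szy) / denom

    # time-response function, then inverse AGO to recover forecasts
    def xhat1(k):
        return (x0[0] - b / a) * math.exp(-a * k) + b / a

    return [xhat1(k + 1) - xhat1(k) for k in range(n - 1, n - 1 + steps)]
```

GM(1,1) works well for short, near-exponential series, which is why it is popular for city-scale indicator forecasting.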
Pohl, Calvin S; Medland, Julia E; Moeser, Adam J
2015-12-15
Early-life stress and adversity are major risk factors in the onset and severity of gastrointestinal (GI) disease in humans later in life. The mechanisms by which early-life stress leads to increased GI disease susceptibility in adult life remain poorly understood. Animal models of early-life stress have provided a foundation from which to gain a more fundamental understanding of this important GI disease paradigm. This review focuses on animal models of early-life stress-induced GI disease, with a specific emphasis on translational aspects of each model to specific human GI disease states. Early postnatal development of major GI systems and the consequences of stress on their development are discussed in detail. Relevant translational differences between species and models are highlighted. Copyright © 2015 the American Physiological Society.
Ma, Ye; Xie, Shengquan; Zhang, Yanxin
2016-03-01
A patient-specific electromyography (EMG)-driven neuromuscular model (PENm) is developed for potential use in human-inspired gait rehabilitation robots. The PENm modifies current EMG-driven models to decrease calculation time while maintaining good prediction accuracy. To ensure calculation efficiency, the PENm is simplified to two EMG channels around one joint with minimal physiological parameters. In addition, a dynamic computation model is developed to achieve real-time calculation. To ensure calculation accuracy, patient-specific muscle kinematics information, such as the musculotendon lengths and the muscle moment arms during the entire gait cycle, is employed based on a patient-specific musculoskeletal model. Moreover, an improved force-length-velocity relationship is implemented to generate accurate muscle forces. Gait analysis data, including kinematics, ground reaction forces, and raw EMG signals from six adolescents at three different speeds, were used to evaluate the PENm. The simulation results show that the PENm has the potential to predict accurate joint moments in real time. The design of advanced human-robot interaction control strategies and human-inspired gait rehabilitation robots can benefit from the human internal state provided by the PENm. Copyright © 2016 Elsevier Ltd. All rights reserved.
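The core of such EMG-driven models is a Hill-type muscle whose active force is the product of activation, a force-length curve, and a force-velocity curve. A generic sketch with illustrative curve shapes (assumed for illustration; the paper's improved relationship is not reproduced):

```python
# Generic Hill-type active muscle force: F = a * Fmax * fL(l) * fV(v),
# with an illustrative Gaussian force-length curve and a rational
# force-velocity curve (assumed shapes, not the paper's formulation).
import math

def muscle_force(activation, fmax, l_norm, v_norm):
    """Active force (N). `activation` in [0,1]; `l_norm` and `v_norm`
    are normalized to optimal fiber length and maximum shortening
    velocity (shortening velocities are negative)."""
    f_length = math.exp(-((l_norm - 1.0) ** 2) / 0.45)  # peaks at l_norm = 1
    if v_norm < 0:                                       # shortening limb
        f_velocity = (1.0 + v_norm) / (1.0 - v_norm / 0.25)
    else:                                                # lengthening limb
        f_velocity = 1.8 - 0.8 * (1.0 + v_norm) / (1.0 + 7.56 * v_norm / 0.25)
    return activation * fmax * f_length * f_velocity
```

Isometric force at optimal length equals activation times Fmax; force drops to zero at maximum shortening velocity and is enhanced during lengthening.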
Design Specifications for the Advanced Instructional Design Advisor (AIDA). Volume 1
1992-01-01
research; (3) Describe the knowledge base sufficient to support the varieties of knowledge to be represented in the AIDA model; (4) Document the ... feasibility of continuing the development of the AIDA model. 2.3 Background: In Phase I of the AIDA project (Task 0006), (1) the AIDA concept was defined ... the AIDA Model: A paper-based demonstration of the AIDA instructional design model was performed by using the model to develop a minimal application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Utgikar, Vivek; Sun, Xiaodong; Christensen, Richard
2016-12-29
The overall goal of the research project was to model the behavior of the advanced reactor-intermediate heat exchanger system and to develop advanced control techniques for off-normal conditions. The specific objectives defined for the project were: 1. To develop the steady-state thermal hydraulic design of the intermediate heat exchanger (IHX); 2. To develop mathematical models to describe the advanced nuclear reactor-IHX-chemical process/power generation coupling during normal and off-normal operations, and to simulate the models using multiphysics software; 3. To develop control strategies using genetic algorithm or neural network techniques and couple these techniques with the multiphysics software; 4. To validate the models experimentally. The project objectives were accomplished by defining and executing four different tasks corresponding to these specific objectives. The first task involved selection of IHX candidates and developing steady state designs for those. The second task involved modeling of the transient and off-normal operation of the reactor-IHX system. The subsequent task dealt with the development of control strategies and involved algorithm development and simulation. The last task involved experimental validation of the thermal hydraulic performance of the two prototype heat exchangers designed and fabricated for the project at steady state and transient conditions to simulate the coupling of the reactor-IHX-process plant system. The experimental work utilized two test facilities at The Ohio State University (OSU): the existing High-Temperature Helium Test Facility (HTHF) and the newly developed high-temperature molten salt facility.
A Model of Developing Communication Skills among Adolescents with Behavioral Problems
ERIC Educational Resources Information Center
Novik, Natalia N.; Podgórecki, Józef
2015-01-01
The urgency of the problem under investigation is determined by the need to help the adolescents with behavioral problems to develop communication skills in the specific bilingual conditions in such regions as the Republic of Tatarstan where education should consider not only the specific skills of verbal behavior but also take into account the…
OpenADR Specification to Ease Saving Power in Buildings
Piette, Mary Ann
2017-12-09
A new data model developed by researchers at the Department of Energy's Lawrence Berkeley National Laboratory and their colleagues at other universities and in the private sector will help facilities and buildings save power through automated demand response technology, and advance the development of the Smart Grid. http://newscenter.lbl.gov/press-releases/2009/04/27/openadr-specification/
A Methodology for Developing Learning Objects for Web Course Delivery
ERIC Educational Resources Information Center
Stauffer, Karen; Lin, Fuhua; Koole, Marguerite
2008-01-01
This article presents a methodology for developing learning objects for web-based courses using the IMS Learning Design (IMS LD) specification. We first investigated the IMS LD specification, determining how to use it with online courses and the student delivery model, and then applied this to a Unit of Learning (UOL) for online computer science…
Studying the Brain in a Dish: 3D Cell Culture Models of Human Brain Development and Disease.
Brown, Juliana; Quadrato, Giorgia; Arlotta, Paola
2018-01-01
The study of the cellular and molecular processes of the developing human brain has been hindered by access to suitable models of living human brain tissue. Recently developed 3D cell culture models offer the promise of studying fundamental brain processes in the context of human genetic background and species-specific developmental mechanisms. Here, we review the current state of 3D human brain organoid models and consider their potential to enable investigation of complex aspects of human brain development and the underpinning of human neurological disease. © 2018 Elsevier Inc. All rights reserved.
Job Aid Manuals for Phase II--DESIGN of the Instructional Systems Development Model.
ERIC Educational Resources Information Center
Schulz, Russel E.; Farrell, Jean R.
Designed to supplement the descriptive authoring flowcharts presented in a companion volume, this manual includes specific guidance, examples, and other information referred to in the flowcharts for the implementation of the second phase of the Instructional Systems Development Model (ISD). The introductory section includes definitions;…
Job Aid Manuals for Phase I--ANALYZE of the Instructional Systems Development Model.
ERIC Educational Resources Information Center
Schulz, Russel E.; Farrell, Jean R.
Designed to supplement the descriptive authoring flowcharts in a companion volume, this manual includes specific guidance, examples, and other information referred to in the flowcharts for the implementation of the first phase of the Instructional Systems Development Model (ISD). The introductory section includes definitions; descriptions of…
DEVELOPMENT OF AN EMPIRICAL MODEL OF METHANE EMISSIONS FROM LANDFILLS
The report gives results of a field study of 21 U.S. landfills with gas recovery systems, to gather information that can be used to develop an empirical model of methane (CH4) emissions. Site-specific information includes average CH4 recovery rate, landfill size, tons of refuse (...
The Mystery Tubes: Teaching Pupils about Hypothetical Modelling
ERIC Educational Resources Information Center
Kenrick, Carole
2017-01-01
This article recounts the author's working experience of one method by which pupils' understanding of the epistemologies of science can be developed, specifically how scientists can develop hypothetical models and test them through simulations. She currently uses this approach for transition lessons with pupils in upper primary or lower secondary…
Participatory Model of Mental Health Programming: Lessons Learned from Work in a Developing Country.
ERIC Educational Resources Information Center
Nastasi, Bonnie K.; Varjas, Kristen; Sarkar, Sreeroopa; Jayasena, Asoka
1998-01-01
Describes application of participatory model for creating school-based mental health services in a developing country. Describes process of identifying individual and cultural factors relevant to mental health. Discusses importance of formative research and collaboration with stakeholders to ensure cultural specificity of interventions, and the…
Parental influences on 7-9 year olds' physical activity: a conceptual model.
Leary, Janie M; Lilly, Christa L; Dino, Geri; Loprinzi, Paul D; Cottrell, Lesley
2013-05-01
Models characterizing parental influence on child and adolescent physical activity (PA) over time are limited. Preschool and Adolescent Models (PM and AM) of PA are available, leaving a need for models focused on elementary-aged children. We tested the current models (PM and AM) with a sample of 7-9 year-olds, and then developed a model appropriate to this specific target population. Parent-child dyads completed questionnaires in 2010-2011. All models were assessed using path analysis and model fit indices. For adequate power, 90 families were needed; 174 dyads participated. The PM and AM exhibited poor fit when applied to the study population. A gender-specific model was developed and demonstrated acceptable fit. To develop an acceptable model for this population, constructs from both the PM (i.e., parental perception of child competency) and AM (i.e., child-reported self-efficacy) were used. For boys, self-efficacy was a strong predictor of PA and was influenced by various parental variables. For girls, parental PA demonstrated the greatest strength of association with child PA. This new model can be used to promote PA and guide future research and interventions. Future studies, particularly longitudinal designs, are needed to confirm the utility of this model as a bridge between currently available models. Copyright © 2013 Elsevier Inc. All rights reserved.
Antiepileptic Drugs in Clinical Development: Differentiate or Die?
Zaccara, Gaetano; Schmidt, D
2017-01-01
Animal models, when carefully selected, designed, and conducted, are important parts of any translational drug development strategy. However, research on new compounds for patients with drug-resistant epilepsies is still based on animal experiments, mostly in rodents, which are far from being a model of chronic human epilepsy and have failed to differentiate the efficacy of new compounds versus standard drug treatment. The objective was to identify and describe compounds in clinical development in 2016. The search was conducted on the U.S. National Institutes of Health website and in the literature. Identified compounds have been divided into two groups: 1) compounds initially developed for the treatment of diseases other than epilepsy: biperiden, bumetanide, everolimus, fenfluramine, melatonin, minocycline, verapamil; 2) compounds specifically developed for the treatment of epilepsy: allopregnanolone, cannabidiol, cannabidivarin, ganaxolone, nalutozan, PF-06372865, UCB0942, and cenobamate. Everolimus and, perhaps, fenfluramine are effective in specific epileptic diseases and may be considered true disease-modifying antiepileptic drugs: tuberous sclerosis complex for everolimus and Dravet syndrome for fenfluramine. With the exception of a few other compounds such as cannabidiol, cannabidivarin, and minocycline, the vast majority of the other compounds have mechanisms of action similar to those of the anti-seizure drugs already on the market. Substantial improvements in efficacy, particularly for the pharmacological treatment of drug-resistant epilepsy, are not expected. New drugs should be developed to specifically target the biochemical alteration that characterizes the underlying disease and also include targets that contribute to epileptogenesis in relevant epilepsy models. Copyright © Bentham Science Publishers.
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis
2013-10-01
Approaches for the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common Information Systems Development practice, business modeling activities are still mostly empirical in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
Asymmetry of the Brain: Development and Implications.
Duboc, Véronique; Dufourcq, Pascale; Blader, Patrick; Roussigné, Myriam
2015-01-01
Although the left and right hemispheres of our brains develop with a high degree of symmetry at both the anatomical and functional levels, it has become clear that subtle structural differences exist between the two sides and that each is dominant in processing specific cognitive tasks. As the result of evolutionary conservation or convergence, lateralization of the brain is found in both vertebrates and invertebrates, suggesting that it provides significant fitness for animal life. This widespread feature of hemispheric specialization has allowed the emergence of model systems to study its development and, in some cases, to link anatomical asymmetries to brain function and behavior. Here, we present some of what is known about brain asymmetry in humans and model organisms as well as what is known about the impact of environmental and genetic factors on brain asymmetry development. We specifically highlight the progress made in understanding the development of epithalamic asymmetries in zebrafish and how this model provides an exciting opportunity to address brain asymmetry at different levels of complexity.
Modeling the Development of Goal-Specificity in Mirror Neurons.
Thill, Serge; Svensson, Henrik; Ziemke, Tom
2011-12-01
Neurophysiological studies have shown that parietal mirror neurons encode not only actions but also the goal of these actions. Although some mirror neurons will fire whenever a certain action is perceived (goal-independently), most will only fire if the motion is perceived as part of an action with a specific goal. This result is important for the action-understanding hypothesis as it provides a potential neurological basis for such a cognitive ability. It is also relevant for the design of artificial cognitive systems, in particular robotic systems that rely on computational models of the mirror system in their interaction with other agents. Yet, to date, no computational model has explicitly addressed the mechanisms that give rise to both goal-specific and goal-independent parietal mirror neurons. In the present paper, we present a computational model based on a self-organizing map, which receives artificial inputs representing information about both the observed or executed actions and the context in which they were executed. We show that the map develops a biologically plausible organization in which goal-specific mirror neurons emerge. We further show that the fundamental cause for both the appearance and the number of goal-specific neurons can be found in geometric relationships between the different inputs to the map. The results are important to the action-understanding hypothesis as they provide a mechanism for the emergence of goal-specific parietal mirror neurons and lead to a number of predictions: (1) Learning of new goals may mostly reassign existing goal-specific neurons rather than recruit new ones; (2) input differences between executed and observed actions can explain observed corresponding differences in the number of goal-specific neurons; and (3) the percentage of goal-specific neurons may differ between motion primitives.
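A minimal sketch of the kind of self-organizing map described above can be written in a few lines; the input encoding (two motion primitives crossed with two goal contexts), map size, and training schedule below are illustrative assumptions, not the authors' actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: each vector concatenates an "action" code with a
# "context" code, mirroring the idea that parietal mirror neurons receive
# both motion information and goal-context information.
def make_inputs(n_per_class=50):
    data = []
    for action in ([1.0, 0.0], [0.0, 1.0]):        # two motion primitives
        for context in ([1.0, 0.0], [0.0, 1.0]):   # two goal contexts
            base = np.array(action + context)
            data.append(base + 0.05 * rng.standard_normal((n_per_class, 4)))
    return np.vstack(data)

def train_som(data, grid=8, epochs=20, lr0=0.5, sigma0=3.0):
    """Classic Kohonen update: pull the winner and its neighbours toward x."""
    w = rng.random((grid, grid, data.shape[1]))
    ii, jj = np.meshgrid(np.arange(grid), np.arange(grid), indexing="ij")
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)              # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.5  # shrinking neighbourhood
        for x in rng.permutation(data):
            d = np.linalg.norm(w - x, axis=2)    # distance of every unit to x
            bi, bj = np.unravel_index(np.argmin(d), d.shape)
            h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma**2))
            w += lr * h[:, :, None] * (x - w)
    return w

data = make_inputs()
w = train_som(data)
```

With inputs of this form, map units that win only for one action-context pairing play the role of goal-specific mirror neurons, while units that win for an action regardless of context are goal-independent, so the balance between the two falls out of the geometry of the inputs, as the paper argues.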
Alternative supply specifications and estimates of regional supply and demand for stumpage.
Kent P. Connaughton; David H. Jackson; Gerard A. Majerus
1988-01-01
Four plausible sets of stumpage supply and demand equations were developed and estimated; the demand equation was the same for each set, although the supply equation differed. The supply specifications varied from the model of regional excess demand in which National Forest harvest levels were assumed fixed to a more realistic model in which the harvest on the National...
Physiologically relevant organs on chips
Yum, Kyungsuk; Hong, Soon Gweon; Lee, Luke P.
2015-01-01
Recent advances in integrating microengineering and tissue engineering have generated promising microengineered physiological models for experimental medicine and pharmaceutical research. Here we review the recent development of microengineered physiological systems, or organs on chips, that reconstitute the physiologically critical features of specific human tissues and organs and their interactions. This technology uses microengineering approaches to construct organ-specific microenvironments, reconstituting tissue structures, tissue–tissue interactions and interfaces, and dynamic mechanical and biochemical stimuli found in specific organs, to direct cells to assemble into functional tissues. We first discuss microengineering approaches to reproduce the key elements of physiologically important, dynamic mechanical microenvironments, biochemical microenvironments, and microarchitectures of specific tissues and organs in microfluidic cell culture systems. This is followed by examples of microengineered individual organ models that incorporate the key elements of physiological microenvironments into single microfluidic cell culture systems to reproduce organ-level functions. Finally, microengineered multiple organ systems that simulate multiple organ interactions to better represent human physiology, including human responses to drugs, are covered in this review. This emerging organs-on-chips technology has the potential to become an alternative to 2D and 3D cell culture and animal models for experimental medicine, human disease modeling, drug development, and toxicology. PMID:24357624
HyPEP FY06 Report: Models and Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
DOE report
2006-09-01
The Department of Energy envisions the next generation very high-temperature gas-cooled reactor (VHTR) as a single-purpose or dual-purpose facility that produces hydrogen and electricity. The Ministry of Science and Technology (MOST) of the Republic of Korea also selected the VHTR for the Nuclear Hydrogen Development and Demonstration (NHDD) Project. This research project aims at developing a user-friendly program for evaluating and optimizing cycle efficiencies of producing hydrogen and electricity in a Very-High-Temperature Reactor (VHTR). Systems for producing electricity and hydrogen are complex, and the calculations associated with optimizing these systems are intensive, involving a large number of operating parameter variations and many different system configurations. This research project will produce the HyPEP computer model, which is specifically designed to be an easy-to-use and fast-running tool for evaluating nuclear hydrogen and electricity production facilities. The model accommodates flexible system layouts, and its cost models will enable HyPEP to be well-suited for system optimization. Specific activities of this research are designed to develop the HyPEP model into a working tool, including (a) identifying major systems and components for modeling, (b) establishing system operating parameters and calculation scope, (c) establishing the overall calculation scheme, (d) developing component models, (e) developing cost and optimization models, and (f) verifying and validating the program. Once the HyPEP model is fully developed and validated, it will be used to execute calculations on candidate system configurations. The FY-06 report includes a description of reference designs, methods used in this study, and models and computational strategies developed for the first-year effort.
Results from computer codes such as HYSYS and GASS/PASS-H, used by Idaho National Laboratory and Argonne National Laboratory, respectively, will be benchmarked against HyPEP results in the following years.
An adaptive toolbox approach to the route to expertise in sport.
de Oliveira, Rita F; Lobinger, Babett H; Raab, Markus
2014-01-01
Expertise is characterized by fast decision-making which is highly adaptive to new situations. Here we propose that athletes use a toolbox of heuristics which they develop on their route to expertise. The development of heuristics occurs within the context of the athletes' natural abilities, past experiences, developed skills, and situational context, but does not pertain to any of these factors separately. The novelty of this approach lies in the integration of these separate factors determining expertise into a comprehensive heuristic description. It is our contention that talent identification methods and talent development models should therefore be geared toward the assessment and development of specific heuristics. Specifically, in addition to identifying and developing separate natural abilities and skills as per usual, heuristics should be identified and developed. The application of heuristics to talent and expertise models can bring the field one step away from dichotomized models of nature and nurture toward a comprehensive approach to the route to expertise.
ERIC Educational Resources Information Center
Foley-Nicpon, Megan; Assouline, Susan G.; Kivlighan, D. Martin; Fosenburg, Staci; Cederberg, Charles; Nanji, Michelle
2017-01-01
Contemporary models highlight the need to cultivate cognitive and psychosocial factors in developing domain-specific talent. This model was the basis for the current study where high ability youth with self-reported social difficulties (n = 28, 12 with a coexisting disability) participated in a social skills and talent development intervention…
ERIC Educational Resources Information Center
Sarfo, Frederick K.; Elen, Jan
2007-01-01
In this study, the effectiveness of learning environments, developed in line with the specifications of the four components instructional design model (4C/ID model) and the additional effect of ICT for fostering the development of technical expertise in traditional Ghanaian classrooms, was assessed. The study had a one-by-one-by-two…
Combat Service Support Model Development: BRASS - TRANSLOG - Army 21
1984-07-01
...throughout the system. Transitional problems may address specific hardware and related software, such as the Standard Army Ammunition System (SAAS)... Combat Service Support Model Development: BRASS -- TRANSLOG -- Army 21; Contract Number DAAK11-84-D-0004, Task Order #1; Draft Report, July 1984; Armament Systems, Inc., 211 West Bel Air Avenue, P.O. Box 158, Aberdeen, MD 21001.
Stone, Wesley W.; Gilliom, Robert J.; Crawford, Charles G.
2008-01-01
Regression models were developed for predicting annual maximum and selected annual maximum moving-average concentrations of atrazine in streams using the Watershed Regressions for Pesticides (WARP) methodology developed by the National Water-Quality Assessment Program (NAWQA) of the U.S. Geological Survey (USGS). The current effort builds on the original WARP models, which were based on the annual mean and selected percentiles of the annual frequency distribution of atrazine concentrations. Estimates of annual maximum and annual maximum moving-average concentrations for selected durations are needed to characterize the levels of atrazine and other pesticides for comparison to specific water-quality benchmarks for evaluation of potential concerns regarding human health or aquatic life. Separate regression models were derived for the annual maximum and annual maximum 21-day, 60-day, and 90-day moving-average concentrations. Development of the regression models used the same explanatory variables, transformations, model development data, model validation data, and regression methods as those used in the original development of WARP. The models accounted for 72 to 75 percent of the variability in the concentration statistics among the 112 sampling sites used for model development. Predicted concentration statistics from the four models were within a factor of 10 of the observed concentration statistics for most of the model development and validation sites. Overall, performance of the models for the development and validation sites supports the application of the WARP models for predicting annual maximum and selected annual maximum moving-average atrazine concentration in streams and provides a framework to interpret the predictions in terms of uncertainty. 
For streams with inadequate direct measurements of atrazine concentrations, the WARP model predictions for the annual maximum and the annual maximum moving-average atrazine concentrations can be used to characterize the probable levels of atrazine for comparison to specific water-quality benchmarks. Sites with a high probability of exceeding a benchmark for human health or aquatic life can be prioritized for monitoring.
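At its core, the WARP approach is regression on transformed concentration statistics against watershed characteristics, with predictions judged acceptable when they fall within a factor of 10 of observations. A toy sketch of that idea, with entirely synthetic predictors and coefficients (the real models use calibrated variables such as atrazine use intensity and soil and climate characteristics), might look like:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic watershed predictors for 112 hypothetical sites (the same count
# as the WARP model-development set); values and coefficients are invented.
n = 112
use_intensity = rng.uniform(0, 3, n)   # stand-in for log10 atrazine use
runoff = rng.uniform(0, 1, n)          # stand-in for a hydrologic variable
log_cmax = 0.8 * use_intensity + 1.2 * runoff - 1.0 \
    + 0.3 * rng.standard_normal(n)     # log10 annual-max concentration

# Ordinary least squares on the log-transformed concentration statistic
X = np.column_stack([np.ones(n), use_intensity, runoff])
beta, *_ = np.linalg.lstsq(X, log_cmax, rcond=None)
pred = X @ beta

# "Within a factor of 10" check: |log10(predicted/observed)| < 1
within_10x = np.mean(np.abs(pred - log_cmax) < 1.0)
```

The factor-of-10 criterion in the last line mirrors how the abstract summarizes model performance for the development and validation sites.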
Problem Solving in the General Mathematics Classroom
ERIC Educational Resources Information Center
Troutman, Andria Price; Lichtenberg, Betty Plunkett
1974-01-01
Five steps common to different problem solving models are listed. Next, seven specific abilities related to solving problems are discussed and examples given. Sample activities, appropriate to help in developing these specific abilities, are suggested. (LS)
Domain Specifications and Evaluative Criteria for Distributive Education.
ERIC Educational Resources Information Center
Weber, Larry J.; And Others
This paper attempts to provide a model that allows for individual evaluators to frame specific evaluation items based on a common set of beliefs (a philosophy) regarding the secondary school program for marketing and distributive education. It describes Popham's procedure for the development of domain specifications that frame the philosophical…
Response to "Transfer or Specificity?"
ERIC Educational Resources Information Center
Miller, Judith
2007-01-01
This article presents the author's response to "Transfer or Specificity?" and reports a research that supports a strong case for a fundamental motor skill as a precursor to two sport specific skills as in Gallahue and Ozmun's (2002) theoretical model of motor development. Reported changes in performance of the overarm throw are…
ERIC Educational Resources Information Center
Woolridge, Richard William
2010-01-01
Application and project domain specifications are an important aspect of Information Systems (IS) development. Observations of over thirty IS projects suggest dimly perceived structural patterns in specifications that are unaccounted for in research and practice. This investigation utilizes a theory building with case studies methodology to…
DeChenne, Sue Ellen; Koziol, Natalie; Needham, Mark; Enochs, Larry
2015-01-01
Graduate teaching assistants (GTAs) in science, technology, engineering, and mathematics (STEM) have a large impact on undergraduate instruction but are often poorly prepared to teach. Teaching self-efficacy, an instructor’s belief in his or her ability to teach specific student populations a specific subject, is an important predictor of teaching skill and student achievement. A model of sources of teaching self-efficacy is developed from the GTA literature. This model indicates that teaching experience, departmental teaching climate (including peer and supervisor relationships), and GTA professional development (PD) can act as sources of teaching self-efficacy. The model is pilot tested with 128 GTAs from nine different STEM departments at a midsized research university. Structural equation modeling reveals that K–12 teaching experience, hours and perceived quality of GTA PD, and perception of the departmental facilitating environment are significant factors that explain 32% of the variance in the teaching self-efficacy of STEM GTAs. This model highlights the important contributions of the departmental environment and GTA PD in the development of teaching self-efficacy for STEM GTAs. PMID:26250562
Lesmes Fabian, Camilo; Binder, Claudia R.
2015-01-01
In the field of occupational hygiene, researchers have been working on developing appropriate methods to estimate human exposure to pesticides in order to assess the risk and therefore to take the due decisions to improve the pesticide management process and reduce the health risks. This paper evaluates dermal exposure models to find the most appropriate. Eight models (i.e., COSHH, DERM, DREAM, EASE, PHED, RISKOFDERM, STOFFENMANAGER and PFAM) were evaluated according to a multi-criteria analysis and from these results five models (i.e., DERM, DREAM, PHED, RISKOFDERM and PFAM) were selected for the assessment of dermal exposure in the case study of the potato farming system in the Andean highlands of Vereda La Hoya, Colombia. The results show that the models provide different dermal exposure estimations which are not comparable. However, because of the simplicity of the algorithm and the specificity of the determinants, the DERM, DREAM and PFAM models were found to be the most appropriate although their estimations might be more accurate if specific determinants are included for the case studies in developing countries. PMID:25938911
Identity Profiles in Lesbian, Gay, and Bisexual Youth: The Role of Family Influences
Bregman, Hallie R.; Malik, Neena M.; Page, Matthew J. L.; Makynen, Emily; Lindahl, Kristin M.
2012-01-01
Sexual identity development is a central task of adolescence and young adulthood and can be especially challenging for sexual minority youth. Recent research has moved from a stage model of identity development in lesbian, gay, and bisexual (LGB) youth to examining identity in a non-linear, multidimensional manner. In addition, although families have been identified as important to youth's identity development, limited research has examined the influence of parental responses to youth's disclosure of their LGB sexual orientation on LGB identity. The current study examined a multidimensional model of LGB identity and its links with parental support and rejection. One hundred and sixty-nine LGB adolescents and young adults (ages 14–24, 56% male, 48% gay, 31% lesbian, 21% bisexual) described themselves on dimensions of LGB identity and reported on parental rejection, sexuality-specific social support, and non-sexuality-specific social support. Using latent profile analysis (LPA), two profiles were identified, indicating that youth experience both affirmed and struggling identities. Results indicated that parental rejection and sexuality-specific social support from families were salient links to LGB identity profile classification, while non-sexuality specific social support was unrelated. Parental rejection and sexuality-specific social support may be important to target in interventions for families to foster affirmed LGB identity development in youth. PMID:22847752
Using Live Dual Modeling to Help Preservice Teachers Develop TPACK
ERIC Educational Resources Information Center
Lu, Liangyue; Lei, Jing
2012-01-01
To help preservice teachers learn about teaching with technology--specifically, technological pedagogical content knowledge (TPACK)--the researchers designed and implemented a Live Dual Modeling strategy involving both live behavior modeling and cognitive modeling in this study. Using qualitative research methods, the researchers investigated…
Zeng, Liang; Proctor, Robert W; Salvendy, Gavriel
2011-06-01
This research is intended to empirically validate a general model of creative product and service development proposed in the literature. A current research gap inspired construction of a conceptual model to capture fundamental phases and pertinent facilitating metacognitive strategies in the creative design process. The model also depicts the mechanism by which design creativity affects consumer behavior. The validity and assets of this model have not yet been investigated. Four laboratory studies were conducted to demonstrate the value of the proposed cognitive phases and associated metacognitive strategies in the conceptual model. Realistic product and service design problems were used in creativity assessment to ensure ecological validity. Design creativity was enhanced by explicit problem analysis, whereby one formulates problems from different perspectives and at different levels of abstraction. Remote association in conceptual combination spawned more design creativity than did near association. Abstraction led to greater creativity in conducting conceptual expansion than did specificity, which induced mental fixation. Domain-specific knowledge and experience enhanced design creativity, indicating that design can be of a domain-specific nature. Design creativity added integrated value to products and services and positively influenced customer behavior. The validity and value of the proposed conceptual model is supported by empirical findings. The conceptual model of creative design could underpin future theory development. Propositions advanced in this article should provide insights and approaches to facilitate organizations pursuing product and service creativity to gain competitive advantage.
Dynamic Evaluation of Two Decades of CMAQ Simulations ...
This presentation focuses on the dynamic evaluation of the CMAQ model over the continental United States using multi-decadal simulations for the period from 1990 to 2010 to examine how well the changes in observed ozone air quality induced by variations in meteorology and/or emissions are simulated by the model. We applied spectral decomposition of the ozone time-series using the KZ filter to assess the variations in the strengths of synoptic (weather-induced variations) and baseline (long-term variation) forcings, embedded in the simulated and observed concentrations. The results reveal that CMAQ captured the year-to-year variability (more so in the later years than the earlier years) and the synoptic forcing in accordance with what the observations are showing. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
The VATES-Diamond as a Verifier's Best Friend
NASA Astrophysics Data System (ADS)
Glesner, Sabine; Bartels, Björn; Göthel, Thomas; Kleine, Moritz
Within a model-based software engineering process, it must be ensured that properties of abstract specifications are preserved by transformations down to executable code. This is even more important in the area of safety-critical real-time systems, where non-functional properties are also crucial. In the VATES project, we develop formal methods for the construction and verification of embedded systems. We follow a novel approach that allows us to formally relate abstract process-algebraic specifications to their implementation in a compiler intermediate representation. The idea is to extract a low-level process-algebraic description from the intermediate code and to formally relate it to previously developed abstract specifications. We apply this approach to a case study from the area of real-time operating systems and show that it has the potential to seamlessly integrate the modeling, implementation, transformation, and verification stages of embedded system development.
24 CFR 3285.1 - Administration.
Code of Federal Regulations, 2012 CFR
2012-04-01
... DEVELOPMENT MODEL MANUFACTURED HOME INSTALLATION STANDARDS General § 3285.1 Administration. (a) Scope. These Model Installation Standards provide minimum requirements for the initial installation of new... performing a specific operation or assembly, will be deemed to comply with these Model Installation Standards...
24 CFR 3285.1 - Administration.
Code of Federal Regulations, 2011 CFR
2011-04-01
... DEVELOPMENT MODEL MANUFACTURED HOME INSTALLATION STANDARDS General § 3285.1 Administration. (a) Scope. These Model Installation Standards provide minimum requirements for the initial installation of new... performing a specific operation or assembly, will be deemed to comply with these Model Installation Standards...
NASA Technical Reports Server (NTRS)
Solloway, C. B.; Wakeland, W.
1976-01-01
First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
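A first-order Markov population model of this kind reduces to a stochastic transition matrix applied repeatedly to a state vector; the states and rates below are hypothetical illustrations, not those of the NASA program:

```python
import numpy as np

# Minimal first-order Markov population model. Each row of P gives the
# probabilities of moving from one state to each state in one time step;
# rows of a proper stochastic matrix must sum to 1.
states = ["juvenile", "adult", "senescent"]
P = np.array([
    [0.60, 0.35, 0.05],
    [0.00, 0.80, 0.20],
    [0.00, 0.00, 1.00],   # absorbing state
])
assert np.allclose(P.sum(axis=1), 1.0)  # basic input check

x = np.array([1000.0, 0.0, 0.0])  # start with 1000 juveniles
for _ in range(5):
    x = x @ P   # propagate the population distribution one step
```

Because P is a proper stochastic matrix, the total population is conserved at every step; the same error-checking on inputs (rows summing to one, non-negative entries) is the kind of validation the abstract alludes to.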
Van Holsbeke, C; Ameye, L; Testa, A C; Mascilini, F; Lindqvist, P; Fischerova, D; Frühauf, F; Fransis, S; de Jonge, E; Timmerman, D; Epstein, E
2014-05-01
To develop and validate strategies, using new ultrasound-based mathematical models, for the prediction of high-risk endometrial cancer and compare them with strategies using previously developed models or the use of preoperative grading only. Women with endometrial cancer were prospectively examined using two-dimensional (2D) and three-dimensional (3D) gray-scale and color Doppler ultrasound imaging. More than 25 ultrasound, demographic and histological variables were analyzed. Two logistic regression models were developed: one 'objective' model using mainly objective variables; and one 'subjective' model including subjective variables (i.e. subjective impression of myometrial and cervical invasion, preoperative grade and demographic variables). The following strategies were validated: a one-step strategy using only preoperative grading and two-step strategies using preoperative grading as the first step and one of the new models, subjective assessment or previously developed models as a second step. One-hundred and twenty-five patients were included in the development set and 211 were included in the validation set. The 'objective' model retained preoperative grade and minimal tumor-free myometrium as variables. The 'subjective' model retained preoperative grade and subjective assessment of myometrial invasion. On external validation, the performance of the new models was similar to that on the development set. Sensitivity for the two-step strategy with the 'objective' model was 78% (95% CI, 69-84%) at a cut-off of 0.50, 82% (95% CI, 74-88%) for the strategy with the 'subjective' model and 83% (95% CI, 75-88%) for that with subjective assessment. Specificity was 68% (95% CI, 58-77%), 72% (95% CI, 62-80%) and 71% (95% CI, 61-79%) respectively. The two-step strategies detected up to twice as many high-risk cases as preoperative grading only. The new models had a significantly higher sensitivity than did previously developed models, at the same specificity. 
Two-step strategies with 'new' ultrasound-based models predict high-risk endometrial cancers with good accuracy and do so better than previously developed models. Copyright © 2013 ISUOG. Published by John Wiley & Sons Ltd.
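The two-step strategy described above can be sketched as a simple triage function: preoperative grade decides first, and a logistic model scores the remaining cases at the 0.50 cut-off. The coefficients below are made up for illustration; they are not the published 'objective' model:

```python
import math

def logistic_risk(tumor_free_myometrium_mm):
    # Hypothetical 'objective' model: a thin minimal tumor-free myometrium
    # raises the predicted probability of high-risk disease.
    z = 1.0 - 0.3 * tumor_free_myometrium_mm
    return 1.0 / (1.0 + math.exp(-z))

def two_step_high_risk(grade, tumor_free_myometrium_mm, cutoff=0.50):
    if grade == 3:  # step 1: grade 3 on preoperative sampling is high risk
        return True
    # step 2: apply the regression model to the remaining grade 1-2 cases
    return logistic_risk(tumor_free_myometrium_mm) >= cutoff
```

The point of the second step is visible in the structure: cases that preoperative grading alone would pass over can still be flagged, which is how the two-step strategies detect more high-risk cancers than grading only.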
Zhou, Xiangrong; Xu, Rui; Hara, Takeshi; Hirano, Yasushi; Yokoyama, Ryujiro; Kanematsu, Masayuki; Hoshi, Hiroaki; Kido, Shoji; Fujita, Hiroshi
2014-07-01
The shapes of the inner organs are important information for medical image analysis. Statistical shape modeling provides a way of quantifying and measuring shape variations of the inner organs in different patients. In this study, we developed a universal scheme that can be used for building statistical shape models for different inner organs efficiently. This scheme combines traditional point distribution modeling with a group-wise optimization method based on a measure called minimum description length to provide a practical means for 3D organ shape modeling. In experiments, the proposed scheme was applied to the building of five statistical shape models for hearts, livers, spleens, and right and left kidneys by use of 50 cases of 3D torso CT images. The performance of these models was evaluated by three measures: model compactness, model generalization, and model specificity. The experimental results showed that the constructed shape models have good "compactness" and satisfactory "generalization" performance for different organ shape representations; however, the "specificity" of these models should be improved in the future.
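The point-distribution side of such a scheme (leaving aside the minimum-description-length correspondence optimization) reduces to PCA over aligned landmark vectors, with "compactness" measured as the variance captured by the leading modes. A sketch on synthetic 2D contours, not real CT-derived shapes:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "organ" contours: 32 landmarks on an ellipse whose two radii
# vary across subjects, giving exactly two underlying modes of variation.
def make_shapes(n_shapes=50, n_points=32):
    t = np.linspace(0, 2 * np.pi, n_points, endpoint=False)
    shapes = []
    for _ in range(n_shapes):
        a = 1.0 + 0.1 * rng.standard_normal()   # horizontal scaling mode
        b = 1.0 + 0.1 * rng.standard_normal()   # vertical scaling mode
        shapes.append(np.concatenate([a * np.cos(t), b * np.sin(t)]))
    return np.array(shapes)

X = make_shapes()                 # one landmark vector per training shape
mean = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
var = S**2 / (len(X) - 1)         # variance of each shape mode

# "Compactness": cumulative fraction of variance explained by the first
# k modes; a compact model needs few modes to describe the training set.
compactness = np.cumsum(var) / var.sum()
```

Generalization and specificity are evaluated analogously, by reconstructing left-out shapes from the modes and by sampling new shapes from the model and measuring their distance to the training set.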
Knowledge sifters in MDA technologies
NASA Astrophysics Data System (ADS)
Kravchenko, Yuri; Kursitys, Ilona; Bova, Victoria
2018-05-01
The article considers a new approach to the efficient management of information processes on the basis of object models. With the help of special design tools, a generic, application-independent model is created, and the program is then implemented in a specific development environment. At the same time, the development process is completely based on a model that must contain all the information necessary for programming. The presence of a detailed model enables the automatic creation of typical parts of the application, the development of which is amenable to automation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
JaeHwa Koh; DuckJoo Yoon; Chang H. Oh
2010-07-01
An electrolyzer model for the analysis of a hydrogen-production system using a solid oxide electrolysis cell (SOEC) has been developed, and the effects of the principal parameters have been estimated through sensitivity studies based on the developed model. The main parameters considered are current density, area specific resistance, temperature, pressure, and the molar fractions and flow rates at the inlet and outlet. Finally, a simple model for a high-temperature hydrogen-production system using the solid oxide electrolysis cell integrated with very-high-temperature reactors is presented.
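A minimal operating-point sketch of an SOEC along these lines can be built from an ohmic cell equation (V = V_rev + i · ASR) and Faraday's law; the numbers below are illustrative assumptions, not values from the report:

```python
F = 96485.0   # C/mol, Faraday constant

def cell_voltage(i, asr, v_rev=0.95):
    """Ohmic cell model: i is current density (A/cm^2),
    asr is area specific resistance (ohm*cm^2)."""
    return v_rev + i * asr

def h2_production(i, area_cm2):
    """Hydrogen molar flow (mol/s) from Faraday's law: n = I / (2F),
    since two electrons are transferred per H2 molecule."""
    return i * area_cm2 / (2 * F)

v = cell_voltage(i=0.5, asr=1.2)           # 0.95 + 0.6 = 1.55 V
n_h2 = h2_production(i=0.5, area_cm2=100)  # 50 A across the cell
```

This exposes directly why current density and area specific resistance are the leading sensitivity parameters: the resistance sets the cell voltage (and hence electrical power per unit of hydrogen), while the current sets the production rate.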
In Silico Models of Aerosol Delivery to the Respiratory Tract – Development and Applications
Longest, P. Worth; Holbrook, Landon T.
2011-01-01
This review discusses the application of computational models to simulate the transport and deposition of inhaled pharmaceutical aerosols from the site of particle or droplet formation to deposition within the respiratory tract. Traditional one-dimensional (1-D) whole-lung models are discussed briefly followed by a more in-depth review of three-dimensional (3-D) computational fluid dynamics (CFD) simulations. The review of CFD models is organized into sections covering transport and deposition within the inhaler device, the extrathoracic (oral and nasal) region, conducting airways, and alveolar space. For each section, a general review of significant contributions and advancements in the area of simulating pharmaceutical aerosols is provided followed by a more in-depth application or case study that highlights the challenges, utility, and benefits of in silico models. Specific applications presented include the optimization of an existing spray inhaler, development of charge-targeted delivery, specification of conditions for optimal nasal delivery, analysis of a new condensational delivery approach, and an evaluation of targeted delivery using magnetic aerosols. The review concludes with recommendations on the need for more refined model validations, use of a concurrent experimental and CFD approach for developing aerosol delivery systems, and development of a stochastic individual path (SIP) model of aerosol transport and deposition throughout the respiratory tract. PMID:21640772
Reference and Standard Atmosphere Models
NASA Technical Reports Server (NTRS)
Johnson, Dale L.; Roberts, Barry C.; Vaughan, William W.; Parker, Nelson C. (Technical Monitor)
2002-01-01
This paper describes the development of standard and reference atmosphere models along with the history of their origin and use since the mid-19th century. The first "Standard Atmospheres" were established by international agreement in the 1920s. Later, some countries, notably the United States, also developed and published "Standard Atmospheres". The term "Reference Atmospheres" is used to identify atmosphere models for specific geographical locations. Range Reference Atmosphere Models, first developed during the 1960s, are examples of these descriptions of the atmosphere. This paper discusses the various models, their scopes, applications, and limitations relative to use in aerospace industry activities.
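As an example of what such a standard model specifies, the tropospheric layer of the U.S. Standard Atmosphere 1976 follows from a linear temperature lapse rate and the hydrostatic relation; a minimal sketch:

```python
def standard_atmosphere(h_m):
    """Temperature (K) and pressure (Pa) for 0 <= h_m <= 11000 m
    (troposphere only; higher layers use different lapse rates)."""
    T0, p0, L = 288.15, 101325.0, 0.0065    # sea level values, lapse rate K/m
    g0, M, R = 9.80665, 0.0289644, 8.31446  # gravity, molar mass, gas constant
    T = T0 - L * h_m                        # linear temperature profile
    p = p0 * (T / T0) ** (g0 * M / (R * L)) # hydrostatic + ideal gas law
    return T, p

T, p = standard_atmosphere(11000.0)  # tropopause: about 216.65 K, ~22.6 kPa
```

Reference atmospheres for specific launch ranges replace these globally averaged profiles with location-specific ones, which is the distinction the paper draws.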
To grow or not to grow: Hair morphogenesis and human genetic hair disorders
Duverger, Olivier; Morasso, Maria I.
2014-01-01
Mouse models have greatly helped in elucidating the molecular mechanisms involved in hair formation and regeneration. Recent publications have reviewed the genes involved in mouse hair development based on the phenotype of transgenic, knockout and mutant animal models. While much of this information has been instrumental in determining molecular aspects of human hair development and cycling, mice exhibit a specific pattern of hair morphogenesis and hair distribution throughout the body that cannot be directly correlated to human hair. In this mini-review, we discuss specific aspects of human hair follicle development and present an up-to-date summary of human genetic disorders associated with abnormalities in hair follicle morphogenesis, structure or regeneration. PMID:24361867
Building confidence and credibility amid growing model and computing complexity
NASA Astrophysics Data System (ADS)
Evans, K. J.; Mahajan, S.; Veneziani, C.; Kennedy, J. H.
2017-12-01
As global Earth system models are developed to answer an ever-wider range of science questions, software products that provide robust verification, validation, and evaluation must evolve in tandem. Measuring the degree to which these new models capture past behavior, predict the future, and provide the certainty of predictions is becoming ever more challenging for reasons that are generally well known, yet are still challenging to address. Two specific and divergent needs for analysis of the Accelerated Climate Model for Energy (ACME) model - but with a similar software philosophy - are presented to show how a model developer-based focus can address analysis needs during expansive model changes to provide greater fidelity and execute on multi-petascale computing facilities. A-PRIME is a python script-based quick-look overview of a fully-coupled global model configuration to determine quickly if it captures specific behavior before significant computer time and expense is invested. EVE is an ensemble-based software framework that focuses on verification of performance-based ACME model development, such as compiler or machine settings, to determine the equivalence of relevant climate statistics. The challenges and solutions for analysis of multi-petabyte output data are highlighted from the aspect of the scientist using the software, with the aim of fostering discussion and further input from the community about improving developer confidence and community credibility.
Descriptive modelling to predict deoxynivalenol in winter wheat in the Netherlands.
Van Der Fels-Klerx, H J; Burgers, S L G E; Booij, C J H
2010-05-01
Predictions of deoxynivalenol (DON) content in wheat at harvest can be useful for decision-making by stakeholders of the wheat feed and food supply chain. The objective of the current research was to develop quantitative predictive models for DON in mature winter wheat in the Netherlands for two specific groups of end-users. One model was developed for use by farmers in underpinning Fusarium spp. disease management, specifically the application of fungicides around wheat flowering (model A). The second model was developed for industry and food safety authorities, and considered the entire wheat cultivation period (model B). Model development was based on observational data collected from 425 fields throughout the Netherlands between 2001 and 2008. For each field, agronomical information, climatic data and DON levels in mature wheat were collected. Using multiple regression analyses, the set of biological relevant variables that provided the highest statistical performance was selected. The two final models include the following variables: region, wheat resistance level, spraying, flowering date, several climatic variables in the different stages of wheat growing, and length of the period between flowering and harvesting (model B only). The percentages of variance accounted for were 64.4% and 65.6% for models A and B, respectively. Model validation showed high correlation between the predicted and observed DON levels. The two models may be applied by various groups of end-users to reduce DON contamination in wheat-derived feed and food products and, ultimately, reduce animal and consumer health risks.
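The multiple-regression approach described in this abstract can be illustrated with a minimal least-squares sketch. The predictor names and synthetic data below are purely illustrative stand-ins (they are not the authors' 425-field dataset); the sketch only shows how regression coefficients and a percentage of variance accounted for are obtained:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the kinds of predictors the models use
# (fungicide spraying, resistance level, climate around flowering);
# names, values, and coefficients are illustrative, not the authors' data.
n = 200
X = np.column_stack([
    np.ones(n),                     # intercept
    rng.integers(0, 2, n),          # fungicide sprayed (0/1)
    rng.normal(5.0, 1.5, n),        # wheat resistance score
    rng.normal(60.0, 20.0, n),      # rainfall around flowering (mm)
])
beta_true = np.array([2.0, -0.8, -0.3, 0.02])
log_don = X @ beta_true + rng.normal(0.0, 0.4, n)   # log DON level (illustrative)

# Ordinary least-squares fit, as in a multiple-regression DON model
beta_hat, *_ = np.linalg.lstsq(X, log_don, rcond=None)

# Percentage of variance accounted for (the paper reports ~64-66%)
pred = X @ beta_hat
r2 = 1.0 - np.sum((log_don - pred) ** 2) / np.sum((log_don - log_don.mean()) ** 2)
print(round(r2 * 100, 1))
```

In practice, as the authors describe, selection over biologically relevant candidate variables (region, resistance level, climate in each growth stage) would precede such a fit.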
Metal Mixture Modeling Evaluation project: 2. Comparison of four modeling approaches
Farley, Kevin J.; Meyer, Joe; Balistrieri, Laurie S.; DeSchamphelaere, Karl; Iwasaki, Yuichi; Janssen, Colin; Kamo, Masashi; Lofts, Steve; Mebane, Christopher A.; Naito, Wataru; Ryan, Adam C.; Santore, Robert C.; Tipping, Edward
2015-01-01
As part of the Metal Mixture Modeling Evaluation (MMME) project, models were developed by the National Institute of Advanced Industrial Science and Technology (Japan), the U.S. Geological Survey (USA), HDR|HydroQual, Inc. (USA), and the Centre for Ecology and Hydrology (UK) to address the effects of metal mixtures on biological responses of aquatic organisms. A comparison of the 4 models, as they were presented at the MMME Workshop in Brussels, Belgium (May 2012), is provided herein. Overall, the models were found to be similar in structure (free ion activities computed by WHAM; specific or non-specific binding of metals/cations in or on the organism; specification of metal potency factors and/or toxicity response functions to relate metal accumulation to biological response). Major differences in modeling approaches are attributed to various modeling assumptions (e.g., single versus multiple types of binding site on the organism) and specific calibration strategies that affected the selection of model parameters. The models provided a reasonable description of additive (or nearly additive) toxicity for a number of individual toxicity test results. Less-than-additive toxicity was more difficult to describe with the available models. Because of limitations in the available datasets and the strong inter-relationships among the model parameters (log KM values, potency factors, toxicity response parameters), further evaluation of specific model assumptions and calibration strategies is needed.
Model development for Ulysses and SOHO
NASA Technical Reports Server (NTRS)
Wu, S. T.
1993-01-01
The purpose of this research is to provide scientific expertise in solar physics and in the development and use of magnetohydrodynamic (MHD) models of coronal structures for the computation of Lyman alpha scattered radiation in these structures. The specific objectives will be to run MHD models with new boundary conditions and compute resulting scattered solar Lyman alpha intensities, guided by results from the first series of boundary conditions.
ERIC Educational Resources Information Center
Luecht, Richard M.
2013-01-01
Assessment engineering is a new way to design and implement scalable, sustainable and ideally lower-cost solutions to the complexities of designing and developing tests. It represents a merger of sorts between cognitive task modeling and engineering design principles--a merger that requires some new thinking about the nature of score scales, item…
Construction of an advanced software tool for planetary atmospheric modeling
NASA Technical Reports Server (NTRS)
Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.
1993-01-01
Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a testbed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.
Construction of an advanced software tool for planetary atmospheric modeling
NASA Technical Reports Server (NTRS)
Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.
1992-01-01
Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.
Theoretical and Numerical Modeling of Transport of Land Use-Specific Fecal Source Identifiers
NASA Astrophysics Data System (ADS)
Bombardelli, F. A.; Sirikanchana, K. J.; Bae, S.; Wuertz, S.
2008-12-01
Microbial contamination in coastal and estuarine waters is of particular concern to public health officials. In this work, we advocate that well-formulated and developed mathematical and numerical transport models can be combined with modern molecular techniques in order to predict continuous concentrations of microbial indicators under diverse scenarios of interest, and that they can help in source identification of fecal pollution. As a proof of concept, we first present the theory, numerical implementation and validation of one- and two-dimensional numerical models aimed at computing the distribution of fecal source identifiers in water bodies (based on Bacteroidales marker DNA sequences) coming from different land uses such as wildlife, livestock, humans, dogs or cats. These models have been developed to allow for source identification of fecal contamination in large bodies of water. We test the model predictions using diverse velocity fields and boundary conditions. Then, we present some preliminary results of an application of a three-dimensional water quality model to address the source of fecal contamination in the San Pablo Bay (SPB), United States, which constitutes an important sub-embayment of the San Francisco Bay. The transport equations for Bacteroidales include the processes of advection, diffusion, and decay of Bacteroidales. We discuss the validation of the developed models through comparisons of numerical results with field campaigns conducted in the SPB. We determine the extent and importance of the contamination in the bay for two decay rates obtained from field observations, corresponding to total host-specific Bacteroidales DNA and host-specific viable Bacteroidales cells, respectively. Finally, we infer transport conditions in the SPB based on the numerical results, characterizing the fate of outflows coming from the Napa, Petaluma and Sonoma rivers.
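The transport processes named in the abstract (advection, diffusion, and first-order decay of Bacteroidales markers) can be sketched with a minimal one-dimensional explicit finite-difference scheme. All parameter values below are invented for illustration and are not calibrated to San Pablo Bay; the two decay rates stand in for total marker DNA (slow) versus viable cells (fast):

```python
import numpy as np

def transport_1d(c0, u, D, k, dx, dt, steps):
    """Explicit upwind advection + central diffusion + first-order decay.
    A minimal sketch of dc/dt = -u dc/dx + D d2c/dx2 - k c on a periodic domain."""
    c = c0.copy()
    for _ in range(steps):
        adv = -u * (c - np.roll(c, 1)) / dx                          # upwind, u > 0
        dif = D * (np.roll(c, -1) - 2.0 * c + np.roll(c, 1)) / dx**2  # central diffusion
        c = c + dt * (adv + dif - k * c)                              # forward Euler
    return c

x = np.linspace(0.0, 1000.0, 201)       # 1-D channel (m), illustrative
dx = x[1] - x[0]
c0 = np.exp(-((x - 200.0) / 50.0) ** 2)  # initial marker patch near an outfall

# Illustrative rates: slow decay (total DNA) vs. fast decay (viable cells)
c_dna  = transport_1d(c0, u=0.1, D=1.0, k=1e-5, dx=dx, dt=10.0, steps=500)
c_cell = transport_1d(c0, u=0.1, D=1.0, k=1e-4, dx=dx, dt=10.0, steps=500)
```

With periodic boundaries the advection and diffusion terms conserve mass exactly, so the two runs differ only through decay: the faster-decaying "viable cell" field retains less total marker, mirroring how the choice of decay rate changes the inferred extent of contamination.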
Comparison of specificity and information for fuzzy domains
NASA Technical Reports Server (NTRS)
Ramer, Arthur
1992-01-01
This paper demonstrates how an integrated theory can be built on the foundation of possibility theory. Information and uncertainty have been considered in the 'fuzzy' literature since 1982. Our starting point is the model proposed by Klir for the discrete case. It was elaborated axiomatically by Ramer, who also introduced the continuous model. Specificity as a numerical function was considered mostly within Dempster-Shafer evidence theory. An explicit definition was first given by Yager, who also introduced it in the context of possibility theory. The axiomatic approach and the continuous model have been developed very recently by Ramer and Yager, who also establish a close analytical correspondence between specificity and information. In the literature to date, specificity and uncertainty are defined only for discrete finite domains, with a sole exception. Our presentation removes these limitations: we define specificity measures for arbitrary measurable domains.
Combinatorial Histone Acetylation Patterns Are Generated by Motif-Specific Reactions.
Blasi, Thomas; Feller, Christian; Feigelman, Justin; Hasenauer, Jan; Imhof, Axel; Theis, Fabian J; Becker, Peter B; Marr, Carsten
2016-01-27
Post-translational modifications (PTMs) are pivotal to cellular information processing, but how combinatorial PTM patterns ("motifs") are set remains elusive. We develop a computational framework, which we provide as open source code, to investigate the design principles generating the combinatorial acetylation patterns on histone H4 in Drosophila melanogaster. We find that models assuming purely unspecific or lysine site-specific acetylation rates were insufficient to explain the experimentally determined motif abundances. Rather, these abundances were best described by an ensemble of models with acetylation rates that were specific to motifs. The model ensemble converged upon four acetylation pathways; we validated three of these using independent data from a systematic enzyme depletion study. Our findings suggest that histone acetylation patterns originate through specific pathways involving motif-specific acetylation activity. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Loncich, Kristen Teczar
This study investigates bat echolocation strategies and the neural processing of acoustic information, with a focus on cluttered environments. How a bat processes the dense field of echoes received while navigating and foraging in the dark is not well understood. While several models have been developed to describe the mechanisms behind bat echolocation, most are based in mathematics rather than biology, and focus on either peripheral or neural processing without exploring how these two levels of processing are vitally connected. Current echolocation models also do not use habitat-specific acoustic input, or account for field observations of echolocation strategies. Here, a new approach to echolocation modeling is described that captures the full picture of echolocation from signal generation to a neural picture of the acoustic scene. A biologically inspired echolocation model is developed using field measurements of the interpulse interval timing used by a frequency-modulating (FM) bat in the wild, with a whole-method approach to modeling echolocation that includes habitat-specific acoustic inputs; a biologically accurate peripheral model of sound processing by the outer, middle, and inner ear; and a neural model incorporating established auditory pathways and neuron types with echolocation adaptations. The field recordings analyzed underscore bat sonar design differences observed in the laboratory and the wild, and suggest a correlation between interpulse interval groupings and increased clutter. The scenario model provides habitat- and behavior-specific echoes and is a useful tool for both modeling and behavioral studies, and the peripheral and neural models show that spike-time information and echolocation-specific neuron types can produce target localization in the midbrain.
A standard satellite control reference model
NASA Technical Reports Server (NTRS)
Golden, Constance
1994-01-01
This paper describes a Satellite Control Reference Model that provides the basis for an approach to identify where standards would be beneficial in supporting space operations functions. The background and context for the development of the model and the approach are described. A process for using this reference model to trace top level interoperability directives to specific sets of engineering interface standards that must be implemented to meet these directives is discussed. Issues in developing a 'universal' reference model are also identified.
Cardiac image modelling: Breadth and depth in heart disease.
Suinesiaputra, Avan; McCulloch, Andrew D; Nash, Martyn P; Pontre, Beau; Young, Alistair A
2016-10-01
With the advent of large-scale imaging studies and big health data, and the corresponding growth in analytics, machine learning and computational image analysis methods, there are now exciting opportunities for deepening our understanding of the mechanisms and characteristics of heart disease. Two emerging fields are computational analysis of cardiac remodelling (shape and motion changes due to disease) and computational analysis of physiology and mechanics to estimate biophysical properties from non-invasive imaging. Many large cohort studies now underway around the world have been specifically designed based on non-invasive imaging technologies in order to gain new information about the development of heart disease from asymptomatic to clinical manifestations. These give an unprecedented breadth to the quantification of population variation and disease development. Also, for the individual patient, it is now possible to determine biophysical properties of myocardial tissue in health and disease by interpreting detailed imaging data using computational modelling. For these population and patient-specific computational modelling methods to develop further, we need open benchmarks for algorithm comparison and validation, open sharing of data and algorithms, and demonstration of clinical efficacy in patient management and care. The combination of population and patient-specific modelling will give new insights into the mechanisms of cardiac disease, in particular the development of heart failure, congenital heart disease, myocardial infarction, contractile dysfunction and diastolic dysfunction. Copyright © 2016. Published by Elsevier B.V.
Improved prediction models for PCC pavement performance-related specifications
DOT National Transportation Integrated Search
2000-01-01
Performance-related specifications (PRS) for the acceptance of newly constructed jointed plain concrete pavements (JPCP) have been developed over the past decade. The main objectives of this study were to improve the distress and smoothness predictio...
Sentinel site data for model improvement – Definition and characterization
USDA-ARS?s Scientific Manuscript database
Crop models are increasingly being used to assess the impacts of future climate change on production and food security. High quality site-specific data on weather, soils, management, and cultivar are needed for those model applications. Also important, is that model development, evaluation, improvem...
Merema, B J; Kraeima, J; Ten Duis, K; Wendt, K W; Warta, R; Vos, E; Schepers, R H; Witjes, M J H; IJpma, F F A
2017-11-01
An innovative procedure for the development of 3D patient-specific implants with drilling guides for acetabular fracture surgery is presented. By using CT data and 3D surgical planning software, a virtual model of the fractured pelvis was created. During this process the fracture was virtually reduced. Based on the reduced fracture model, patient-specific titanium plates including polyamide drilling guides were designed, 3D printed and milled for intra-operative use. One of the advantages of this procedure is that the personalised plates could be tailored to both the shape of the pelvis and the type of fracture. The optimal screw directions and sizes were predetermined in the 3D model. The virtual plan was translated towards the surgical procedure by using the surgical guides and patient-specific osteosynthesis. Besides the description of the newly developed multi-disciplinary workflow, a clinical case example is presented to demonstrate that this technique is feasible and promising for the operative treatment of complex acetabular fractures. Copyright © 2017 Elsevier Ltd. All rights reserved.
[Research model on commodity specification standard of radix Chinese materia medica].
Kang, Chuan-Zhi; Zhou, Tao; Jiang, Wei-Ke; Huang, Lu-Qi; Guo, Lan-Ping
2016-03-01
As an important part of market commodity circulation, the grade standard for traditional Chinese medicine commodities is essential for regulating market order and guaranteeing the quality of medicinal materials. The State Council, in issuing the "Protection and Development of Chinese Herbal Medicine (2015-2020)" plan, also made clear the important tasks of improving industry norms for the circulation of Chinese herbal medicine and establishing commodity specification standards for common traditional Chinese medicinal materials. However, for radix herbs, a large class of Chinese herbal medicines, grade standards remain confused in market circulation, and a reasonable research model for developing such standards is lacking. Thus, this paper summarizes the research background, present situation and problems, and several key points of commodity specification and grade standards for radix herbs. The research model is then introduced using Pseudostellariae Radix as an example, so as to provide technical support and a reference for formulating commodity specification and grade standards for other radix traditional Chinese medicinal materials. Copyright© by the Chinese Pharmaceutical Association.
nmsBuilder: Freeware to create subject-specific musculoskeletal models for OpenSim.
Valente, Giordano; Crimi, Gianluigi; Vanella, Nicola; Schileo, Enrico; Taddei, Fulvia
2017-12-01
Musculoskeletal modeling and simulations of movement have been increasingly used in orthopedic and neurological scenarios, with increased attention to subject-specific applications. In general, musculoskeletal modeling applications have been facilitated by the development of dedicated software tools; however, subject-specific studies have been limited also by time-consuming modeling workflows and high skilled expertise required. In addition, no reference tools exist to standardize the process of musculoskeletal model creation and make it more efficient. Here we present a freely available software application, nmsBuilder 2.0, to create musculoskeletal models in the file format of OpenSim, a widely-used open-source platform for musculoskeletal modeling and simulation. nmsBuilder 2.0 is the result of a major refactoring of a previous implementation that moved a first step toward an efficient workflow for subject-specific model creation. nmsBuilder includes a graphical user interface that provides access to all functionalities, based on a framework for computer-aided medicine written in C++. The operations implemented can be used in a workflow to create OpenSim musculoskeletal models from 3D surfaces. A first step includes data processing to create supporting objects necessary to create models, e.g. surfaces, anatomical landmarks, reference systems; and a second step includes the creation of OpenSim objects, e.g. bodies, joints, muscles, and the corresponding model. We present a case study using nmsBuilder 2.0: the creation of an MRI-based musculoskeletal model of the lower limb. The model included four rigid bodies, five degrees of freedom and 43 musculotendon actuators, and was created from 3D surfaces of the segmented images of a healthy subject through the modeling workflow implemented in the software application. 
We have presented nmsBuilder 2.0 for the creation of musculoskeletal OpenSim models from image-based data, and made it freely available via nmsbuilder.org. This application provides an efficient workflow for model creation and helps standardize the process. We hope this would help promote personalized applications in musculoskeletal biomechanics, including larger sample size studies, and might also represent a basis for future developments for specific applications. Copyright © 2017 Elsevier B.V. All rights reserved.
Social work perspectives on human behavior.
Wodarski, J S
1993-01-01
This manuscript addresses recent developments in human behavior research that are relevant to social work practice. Specific items addressed are biological aspects of behavior, life span development, cognitive variables, the self-efficacy learning process, the perceptual process, the exchange model, group level variables, macro level variables, and gender and ethnic-racial variables. Where relevant, specific applications to social work practice are provided.
NASA Technical Reports Server (NTRS)
Thorp, Scott A.
1992-01-01
This presentation will discuss the development of a NASA Geometry Exchange Specification for transferring aerodynamic surface geometry between LeRC systems and grid generation software used for computational fluid dynamics research. The proposed specification is based on a subset of the Initial Graphics Exchange Specification (IGES). The presentation will include discussion of how the NASA-IGES standard will accommodate improved computer aided design inspection methods and reverse engineering techniques currently being developed. The presentation is in viewgraph format.
NASA Astrophysics Data System (ADS)
Bouya, Zahra; Terkildsen, Michael
2016-07-01
The Australian Space Forecast Centre (ASFC) provides space weather forecasts to a diverse group of customers. Space Weather Services (SWS) within the Australian Bureau of Meteorology is focussed both on developing tailored products and services for the key customer groups, and on supporting ASFC operations. Research in SWS is largely centred on the development of data-driven models using a range of solar-terrestrial data. This paper covers some data requirements, approaches and recent SWS activities for data-driven modelling, with a focus on regional ionospheric specification and forecasting.
Rapaport, Sivan; Leshno, Moshe; Fink, Lior
2014-12-01
Shared decision making (SDM) encourages the patient to play a more active role in the process of medical consultation and its primary objective is to find the best treatment for a specific patient. Recent findings, however, show that patient preferences cannot be easily or accurately judged on the basis of communicative exchange during routine office visits, even for patients who seek to expand their role in medical decision making (MDM). The objective of this study is to improve the quality of patient-physician communication by developing a novel design process for SDM and then demonstrating, through a case study, the applicability of this process in enabling the use of a normative model for a specific medical situation. Our design process goes through the following stages: definition of medical situation and decision problem, development/identification of normative model, adaptation of normative model, empirical analysis and development of decision support systems (DSS) tools that facilitate the SDM process in the specific medical situation. This study demonstrates the applicability of the process through the implementation of the general normative theory of MDM under uncertainty for the medical-financial dilemma of choosing a physician to perform amniocentesis. The use of normative models in SDM raises several issues, such as the goal of the normative model, the relation between the goals of prediction and recommendation, and the general question of whether it is valid to use a normative model for people who do not behave according to the model's assumptions. © 2012 John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
James, Jenee; Ellis, Bruce J.; Schlomer, Gabriel L.; Garber, Judy
2012-01-01
The current study tested sex-specific pathways to early puberty, sexual debut, and sexual risk taking, as specified by an integrated evolutionary-developmental model of adolescent sexual development and behavior. In a prospective study of 238 adolescents (n = 129 girls and n = 109 boys) followed from approximately 12-18 years of age, we tested for…
Kataoka, Takeshi; Tsutahara, Michihisa
2004-03-01
We have developed a lattice Boltzmann model for the compressible Navier-Stokes equations with a flexible specific-heat ratio. Several numerical results are presented, and they agree well with the corresponding solutions of the Navier-Stokes equations. In addition, an explicit finite-difference scheme is proposed for the numerical calculation that can make a stable calculation with a large Courant number.
Publishing web-based guidelines using interactive decision models.
Sanders, G D; Nease, R F; Owens, D K
2001-05-01
Commonly used methods for guideline development and dissemination do not enable developers to tailor guidelines systematically to specific patient populations and update guidelines easily. We developed a web-based system, ALCHEMIST, that uses decision models and automatically creates evidence-based guidelines that can be disseminated, tailored and updated over the web. Our objective was to demonstrate the use of this system with clinical scenarios that provide challenges for guideline development. We used the ALCHEMIST system to develop guidelines for three clinical scenarios: (1) Chlamydia screening for adolescent women; (2) antiarrhythmic therapy for the prevention of sudden cardiac death; and (3) genetic testing for the BRCA breast-cancer mutation. ALCHEMIST uses information extracted directly from the decision model, combined with additional information from the author of the decision model, to generate global guidelines. ALCHEMIST generated electronic web-based guidelines for each of the three scenarios. Using ALCHEMIST, we demonstrate that tailoring a guideline for a population at high risk for Chlamydia changes the recommended policy for control of Chlamydia from contact tracing of reported cases to a population-based screening programme. We used ALCHEMIST to incorporate new evidence about the effectiveness of implantable cardioverter defibrillators (ICDs) and demonstrate that the cost-effectiveness of ICD use improves from $74,400 per quality-adjusted life year (QALY) gained to $34,500 per QALY gained. Finally, we demonstrate how a clinician could use ALCHEMIST to incorporate a woman's utilities for relevant health states and thereby develop patient-specific recommendations for BRCA testing; the patient-specific recommendation improved quality-adjusted life expectancy by 37 days. The ALCHEMIST system enables guideline developers to publish both a guideline and an interactive decision model on the web.
This web-based tool enables guideline developers to tailor guidelines systematically, to update guidelines easily, and to make the underlying evidence and analysis transparent for users.
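The cost-effectiveness figures quoted in this abstract are incremental cost-effectiveness ratios (extra dollars per QALY gained). A minimal sketch of that calculation, with made-up inputs rather than outputs of ALCHEMIST's decision model:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: additional cost per QALY gained.
    A generic textbook calculation, not ALCHEMIST's internal model."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Illustrative numbers only: an intervention adding 0.5 QALY at $20,000 extra cost
ratio = icer(cost_new=70_000, cost_old=50_000, qaly_new=6.5, qaly_old=6.0)
print(ratio)
```

Updating the evidence behind either the incremental cost or the incremental QALYs (as the authors did for ICD effectiveness) changes this ratio, which is how new evidence propagates into the published guideline.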
Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.
2018-01-01
Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. 
We demonstrate that model selection results changed between the two approaches, and incorporating observer knowledge in cause-of-death increased the variability associated with parameter estimates when compared to the traditional approach. These differences between the two approaches can impact reported results, and therefore, it is critical to explicitly incorporate expert knowledge in statistical methods to ensure rigorous inference.
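The augmentation step described above can be sketched in simplified form. This is a minimal Monte Carlo illustration, not the authors' Bayesian hierarchical model: each draw samples a latent cause-of-death from the observer's elicited probabilities and tallies cause-specific mortality fractions, so observer uncertainty widens the resulting interval estimates. All function names and data below are hypothetical.

```python
import random

def augmented_cause_rates(deaths, n_causes, n_draws=2000, seed=1):
    """Monte Carlo sketch of cause-of-death data augmentation.

    deaths: one elicited probability vector per mortality event,
    e.g. [0.7, 0.2, 0.1] = observer's belief for each potential cause.
    Returns mean and std of the cause-specific mortality fractions
    across draws; the spread carries the observer uncertainty.
    """
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        counts = [0] * n_causes
        for probs in deaths:
            # draw a latent 'true' cause from the elicited belief
            u, acc, c = rng.random(), 0.0, n_causes - 1
            for k, p in enumerate(probs):
                acc += p
                if u < acc:
                    c = k
                    break
            counts[c] += 1
        draws.append([cnt / len(deaths) for cnt in counts])
    means = [sum(d[k] for d in draws) / n_draws for k in range(n_causes)]
    sds = [(sum((d[k] - means[k]) ** 2 for d in draws) / n_draws) ** 0.5
           for k in range(n_causes)]
    return means, sds

def naive_cause_rates(deaths, n_causes):
    """Traditional implicit approach: assign each death to the single
    most likely cause, discarding the observer's uncertainty."""
    counts = [0] * n_causes
    for probs in deaths:
        counts[probs.index(max(probs))] += 1
    return [c / len(deaths) for c in counts]
```

With records elicited as [0.7, 0.3], the naive rule assigns every death to the first cause (rate 1.0 with no stated uncertainty), while the augmented estimate centers near 0.7 with a nonzero spread.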
ERIC Educational Resources Information Center
Simmering, Vanessa R.; Patterson, Rebecca
2012-01-01
Numerous studies have established that visual working memory has a limited capacity that increases during childhood. However, debate continues over the source of capacity limits and its developmental increase. Simmering (2008) adapted a computational model of spatial cognitive development, the Dynamic Field Theory, to explain not only the source…
Development of a Medicaid Behavioral Health Case-Mix Model
ERIC Educational Resources Information Center
Robst, John
2009-01-01
Many Medicaid programs have either fully or partially carved out mental health services. The evaluation of carve-out plans requires a case-mix model that accounts for differing health status across Medicaid managed care plans. This article develops a diagnosis-based case-mix adjustment system specific to Medicaid behavioral health care. Several…
DOT National Transportation Integrated Search
1977-04-01
Noise reduction option development work was carried out on two inservice diesel powered IH trucks, consisting of a Cab-over model and a Conventional model with a baseline exterior noise level of 87 dB(A) each. Since no specific noise goals were set, ...
A Model for Online Support in Classroom Management: Perceptions of Beginning Teachers
ERIC Educational Resources Information Center
Baker, Credence; Gentry, James; Larmer, William
2016-01-01
Classroom management is a challenge for beginning teachers. To address this challenge, a model to provide support for beginning teachers was developed, consisting of a one-day workshop on classroom management, followed with online support extending over eight weeks. Specific classroom management strategies included (a) developing a foundation…
Demographic Accounting and Model-Building. Education and Development Technical Reports.
ERIC Educational Resources Information Center
Stone, Richard
This report describes and develops a model for coordinating a variety of demographic and social statistics within a single framework. The framework proposed, together with its associated methods of analysis, serves both general and specific functions. The general aim of these functions is to give numerical definition to the pattern of society and…
ERIC Educational Resources Information Center
Champagne, Tiffany
2013-01-01
The purpose of this dissertation research was to critically examine the development of community-based health information exchanges (HIEs) and to comparatively analyze the various models of exchanges in operation today nationally. Specifically this research sought to better understand several aspects of HIE: policy influences, organizational…
Developing of Indicators of an E-Learning Benchmarking Model for Higher Education Institutions
ERIC Educational Resources Information Center
Sae-Khow, Jirasak
2014-01-01
This study was the development of e-learning indicators used as an e-learning benchmarking model for higher education institutes. Specifically, it aimed to: 1) synthesize the e-learning indicators; 2) examine content validity by specialists; and 3) explore appropriateness of the e-learning indicators. Review of related literature included…
DOT National Transportation Integrated Search
2017-07-16
The primary objective of this project is to develop multiple simulation testbeds/transportation models to evaluate the impacts of Dynamic Mobility Applications (DMA) and the Active Transportation and Demand Management (ATDM) strategies. Specifically,...
"Special Issue": Regional Dimensions of the Triple Helix Model
ERIC Educational Resources Information Center
Todeva, Emanuela; Danson, Mike
2016-01-01
This paper introduces the rationale for the special issue and its contributions, which bridge the literature on regional development and the Triple Helix model. The concept of the Triple Helix at the sub-national, and specifically regional, level is established and examined, with special regard to regional economic development founded on…
Due to its presence in water as a volatile disinfection byproduct, BDCM, which is mutagenic and a rodent carcinogen, poses a risk for exposure via multiple routes. We developed a refined human PBPK model for BDCM (including new chemical-specific human parameters) to evaluate the...
Capacity Levels of Academic Staff in a Malaysian Public University: Students' Perspective
ERIC Educational Resources Information Center
Tajuddin, Muhammad Jawad; Ghani, Muhammad Faizal A.; Siraj, Saedah; Saifuldin, Mohd Helmi Firdaus; Kenayatulla, Husaina Banu; Elham, Faisol
2013-01-01
This research aims to develop a competency model for staff of higher education institutions in Malaysia. The model involves the listing of the main features and implementation strategy for the development of academic competence. Specifically, this research aims to achieve the following research objectives: a) to identify if there is any…
Gender and Infertility: A Relational Approach To Counseling Women.
ERIC Educational Resources Information Center
Gibson, Donna M.; Myers, Jane E.
2000-01-01
The Relational Model (J. V. Jordan, 1995) of women's development is a theory that explains women's development in a context of relationships, specifically relationships that promote growth for self and others. This model is applied to counseling women who are experiencing infertility, and a case presentation is provided to illustrate the approach.…
Reflections on Wittrock's Generative Model of Learning: A Motivation Perspective
ERIC Educational Resources Information Center
Anderman, Eric M.
2010-01-01
In this article, I examine developments in research on achievement motivation and comment on how those developments are reflected in Wittrock's generative model of learning. Specifically, I focus on the roles of prior knowledge, the generation of knowledge, and beliefs about ability. Examples from Wittrock's theory and from current motivational…
Virtual Systems Pharmacology (ViSP) software for simulation from mechanistic systems-level models.
Ermakov, Sergey; Forster, Peter; Pagidala, Jyotsna; Miladinov, Marko; Wang, Albert; Baillie, Rebecca; Bartlett, Derek; Reed, Mike; Leil, Tarek A
2014-01-01
Multiple software programs are available for designing and running large scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools, which can increase model development time, IT costs and so on. Therefore, it is desirable to have a single platform that allows setting up and running large-scale simulations for models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time, the full model specification is preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic, web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database stores and manages all aspects of the system, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients.
Lee II, Henry; Reusser, Deborah A.; Frazier, Melanie R; McCoy, Lee M; Clinton, Patrick J.; Clough, Jonathan S.
2014-01-01
The “Sea-Level Affecting Marshes Model” (SLAMM) is a moderate-resolution model used to predict the effects of sea level rise on marsh habitats (Craft et al. 2009). SLAMM has been used extensively on both the west coast (e.g., Glick et al., 2007) and east coast (e.g., Geselbracht et al., 2011) of the United States to evaluate potential changes in the distribution and extent of tidal marsh habitats. However, a limitation of the current version of SLAMM (Version 6.2) is that it lacks the ability to model distribution changes in seagrass habitat resulting from sea level rise. Because of the ecological importance of submerged aquatic vegetation (SAV) habitats, U.S. EPA, USGS, and USDA partnered with Warren Pinnacle Consulting to enhance the SLAMM modeling software with new functionality to predict changes in Zostera marina distribution within Pacific Northwest estuaries in response to sea level rise. Specifically, the objective was to develop an SAV model that used generally available GIS data and predictive parameters, and that could be customized for other estuaries that have GIS layers of existing SAV distribution. This report describes the procedure used to develop the SAV model for the Yaquina Bay Estuary, Oregon, appends a statistical script based on the open-source R software to generate a similar SAV model for other estuaries that have data layers of existing SAV, and describes how to incorporate the coefficients from the site-specific SAV model into SLAMM to predict the effects of sea level rise on Zostera marina distributions. To demonstrate the applicability of the R tools, we use them to develop model coefficients for Willapa Bay, Washington using site-specific SAV data.
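The report's actual R script is not reproduced in the abstract. As a rough stand-in, a site-specific SAV presence model of the kind described could be a logistic regression of presence/absence on a GIS-derived predictor such as water depth. The sketch below is a hypothetical single-predictor version fit by gradient ascent; the predictor choice, names, and data are illustrative assumptions, not the report's model.

```python
import math

def fit_sav_logistic(depth, present, lr=0.5, iters=5000):
    """Fit P(SAV present) = sigmoid(b0 + b1*depth) by gradient ascent
    on the log-likelihood. A stand-in for a site-specific SAV model:
    coefficients fit on one estuary's GIS-derived layers could then be
    carried into a SLAMM-style projection for another estuary."""
    b0, b1 = 0.0, 0.0
    n = len(depth)
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(depth, present):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (y - p)          # gradient w.r.t. intercept
            g1 += (y - p) * x      # gradient w.r.t. depth coefficient
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

def predict(b0, b1, x):
    """Predicted probability of SAV presence at depth x."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
```

Fitted on cells where shallow water is vegetated and deep water is bare, the depth coefficient comes out negative, i.e. presence probability declines with depth.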
Prakash, Punit; Salgaonkar, Vasant A.; Diederich, Chris J.
2014-01-01
Endoluminal and catheter-based ultrasound applicators are currently under development and are in clinical use for minimally invasive hyperthermia and thermal ablation of various tissue targets. Computational models play a critical role in device design and optimization, assessment of therapeutic feasibility and safety, devising treatment monitoring and feedback control strategies, and performing patient-specific treatment planning with this technology. The critical aspects of theoretical modeling, applied specifically to endoluminal and interstitial ultrasound thermotherapy, are reviewed. Principles and practical techniques for modeling acoustic energy deposition, bioheat transfer, thermal tissue damage, and dynamic changes in the physical and physiological state of tissue are reviewed. The integration of these models and applications of simulation techniques in identification of device design parameters, development of real-time feedback-control platforms, assessing the quality and safety of treatment delivery strategies, and optimization of inverse treatment plans are presented. PMID:23738697
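Two of the model components reviewed above have standard textbook forms that can be sketched compactly: the Pennes bioheat equation with an acoustic power-deposition source term, and an Arrhenius thermal-damage integral. The 1-D explicit finite-difference sketch below uses illustrative tissue parameter values and is not any specific applicator model from the review.

```python
import math

def pennes_1d(q_acoustic, nx=50, dx=1e-3, dt=0.05, t_end=60.0):
    """Explicit 1-D Pennes bioheat equation:
      rho*c*dT/dt = k*d2T/dx2 - w_b*rho*c_b*(T - T_art) + Q_acoustic
    q_acoustic: deposited acoustic power density per node (W/m^3).
    Boundary nodes are held at 37 C. Illustrative tissue values."""
    rho, c, k = 1050.0, 3600.0, 0.5          # density, heat capacity, conductivity
    w_b, c_b, t_art = 0.5e-3, 3800.0, 37.0   # perfusion (1/s), blood c_p, arterial T
    T = [37.0] * nx
    for _ in range(int(t_end / dt)):
        Tn = T[:]
        for i in range(1, nx - 1):
            cond = k * (T[i + 1] - 2.0 * T[i] + T[i - 1]) / dx ** 2
            perf = w_b * rho * c_b * (T[i] - t_art)
            Tn[i] = T[i] + dt * (cond - perf + q_acoustic[i]) / (rho * c)
        T = Tn
    return T

def arrhenius_damage(temps_c, dt, A=3.1e98, Ea=6.28e5):
    """Arrhenius thermal damage: Omega = sum A*exp(-Ea/(R*T))*dt,
    with T in kelvin; A (1/s) and Ea (J/mol) are literature-style values."""
    R = 8.314
    return sum(A * math.exp(-Ea / (R * (t + 273.15))) * dt for t in temps_c)
```

With a 10 mm heated zone at 5e5 W/m^3 for 60 s, the sketch yields a focal rise of several degrees, and the damage integral grows orders of magnitude faster at 43 C than at baseline, which is the behavior such treatment-planning models exploit.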
LinkEHR-Ed: a multi-reference model archetype editor based on formal semantics.
Maldonado, José A; Moner, David; Boscá, Diego; Fernández-Breis, Jesualdo T; Angulo, Carlos; Robles, Montserrat
2009-08-01
Our objective was to develop a powerful archetype editing framework capable of handling multiple reference models, oriented towards the semantic description and standardization of legacy data. The main prerequisite for implementing tools with enhanced support for archetypes is a clear specification of archetype semantics. We propose a formalization of the definition section of archetypes based on types over tree-structured data, covering the specialization of archetypes, the relationship between reference models and archetypes, and the conformance of data instances to archetypes. On this formalization we built LinkEHR-Ed, a visual archetype editor with advanced processing capabilities that supports multiple reference models, the editing and semantic validation of archetypes, the specification of mappings to data sources, and the automatic generation of data transformation scripts. LinkEHR-Ed is a useful tool for building, processing and validating archetypes based on any reference model.
An interpersonal neurobiological-informed treatment model for childhood traumatic grief.
Crenshaw, David A
This article expands an earlier model of the tasks of grieving (1990, 1995, 2001) by building on science-based findings derived from research in attachment theory, neuroscience, interpersonal neurobiology, and childhood traumatic grief (CTG). The proposed treatment model is a prescriptive approach that spells out specific tasks to be undertaken by children suffering traumatic grief under the direction of a therapist who is trained in trauma-informed therapy approaches, and draws heavily on the empirically derived childhood traumatic grief treatment model developed by Cohen and Mannarino (2004; Cohen, Mannarino, & Deblinger, 2006). This model expands on their work by proposing specific tasks that are informed by attachment theory research and interpersonal neurobiological research (Schore, 2003a, 2003b; Siegel, 1999). Particular emphasis is placed on developing a coherent and meaningful narrative, since this has been found to be a crucial factor in recovery from trauma in attachment research (Siegel, 1999; Siegel & Hartzell, 2003).
Hadwin, Julie A; Garner, Matthew; Perez-Olivas, Gisela
2006-11-01
The aim of this paper is to explore parenting as one potential route through which information processing biases for threat develop in children. It reviews information processing biases in childhood anxiety in the context of theoretical models and empirical research in the adult anxiety literature. Specifically, it considers how adult models have been used and adapted to develop a theoretical framework with which to investigate information processing biases in children. The paper then considers research which specifically aims to understand the relationship between parenting and the development of information processing biases in children. It concludes that a clearer theoretical framework is required to understand the significance of information biases in childhood anxiety, as well as their origins in parenting.
Empathy and child neglect: a theoretical model.
De Paul, Joaquín; Guibert, María
2008-11-01
To present an explanatory theory-based model of child neglect. This model does not address neglectful behaviors of parents with mental retardation, alcohol or drug abuse, or severe mental health problems. In this model, parental behavior aimed at satisfying a child's need is considered a helping behavior and, as a consequence, child neglect is considered a specific type of non-helping behavior. The central hypothesis of the theoretical model presented here suggests that neglectful parents cannot develop the helping response set to care for their children either because the observation of a child's signal of need does not lead to the experience of emotions that motivate helping, or because the parents experience these emotions but specific cognitions modify the motivation to help. The model suggests that different typologies of neglectful parents could be developed based on the different reasons why parents might not experience the emotions that motivate helping behaviors. The model can be helpful in promoting new empirical studies on the etiology of different groups of neglectful families.
Buoli, Massimiliano; Serati, Marta; Caldiroli, Alice; Cremaschi, Laura; Altamura, Alfredo Carlo
2017-03-01
Available data support a contribution of both neurodevelopmental and neurodegenerative factors to the etiology of schizophrenia (SCH) and bipolar disorder (BD). Of note, one of the most important issues in current psychiatric research is to identify the specific factors that contribute to impaired brain development and neurodegeneration in SCH and BD, and especially how these factors alter normal brain development and the physiological aging process. Our hypothesis is that only specific damages, taking place in precise brain development stages, are associated with future SCH/BD onset, and that neurodegeneration consists of an acceleration of brain aging after SCH/BD onset. In support of our hypothesis, the results of the present narrative mini-review show that neurodevelopmental damage generally contributes to neuropsychiatric syndromes (e.g. hypothyroidism or Treponema pallidum infection), but only some insults are specifically associated with adult SCH and BD (e.g. Toxoplasma infection or substance abuse), particularly if they happen in specific stages of brain development. On the other hand, the cognitive impairment and brain changes associated with a long duration of SCH/BD resemble what happens during aging: memory, executive domains and the prefrontal cortex are implicated both in aging and in SCH/BD progression. Future research will explore the possible validity of this etiological model for SCH and BD.
Franz, A; Triesch, J
2010-12-01
The perception of the unity of objects, their permanence when out of sight, and the ability to perceive continuous object trajectories even during occlusion belong to the first and most important capacities that infants have to acquire. Despite much research, a unified model of the development of these abilities is still missing. Here we make an attempt to provide such a unified model. We present a recurrent artificial neural network that learns to predict the motion of stimuli occluding each other and that develops representations of occluded object parts. It represents completely occluded, moving objects for several time steps and successfully predicts their reappearance after occlusion. This framework allows us to account for a broad range of experimental data. Specifically, the model explains how the perception of object unity develops, the role of the width of the occluders, and it also accounts for differences between data for moving and stationary stimuli. We demonstrate that these abilities can be acquired by learning to predict the sensory input. The model makes specific predictions and provides a unifying framework that has the potential to be extended to other visual event categories.
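The authors' recurrent network cannot be reconstructed from the abstract, but the core computational claim, that predicting sensory input lets a system keep representing a fully occluded moving object and anticipate its reappearance, can be illustrated with a deliberately minimal constant-velocity internal model. This is a simplification for illustration only, not the paper's architecture.

```python
def track_through_occlusion(observations, init_pos, init_vel):
    """Maintain an internal estimate of a moving object's position.

    observations: measured positions per time step, with None while the
    object is fully occluded. When visible, the estimate follows the
    input and refreshes the velocity; when occluded, the internal state
    is propagated forward, so the object stays represented and its
    reappearance position is anticipated.
    """
    pos, vel = init_pos, init_vel
    estimates = []
    for obs in observations:
        if obs is None:
            pos = pos + vel      # occluded: predict from internal state
        else:
            vel = obs - pos      # visible: update velocity estimate
            pos = obs
        estimates.append(pos)
    return estimates
```

For a stimulus moving at one unit per step that disappears behind an occluder for three steps, the internal estimate continues 4, 5, 6 during occlusion and matches the true reappearance position.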
A Subject-Specific Acoustic Model of the Upper Airway for Snoring Sounds Generation
Saha, Shumit; Bradley, T. Douglas; Taheri, Mahsa; Moussavi, Zahra; Yadollahi, Azadeh
2016-01-01
Monitoring variations in upper airway narrowing during sleep is invasive and expensive. Since snoring sounds are generated by air turbulence and vibrations of the upper airway due to its narrowing, snoring sounds may be used as a non-invasive technique to assess upper airway narrowing. Our goal was to develop a subject-specific acoustic model of the upper airway to investigate the impacts of upper airway anatomy, e.g. length, wall thickness and cross-sectional area, on snoring sound features. To have a subject-specific model for snoring generation, we used measurements of the upper airway length, cross-sectional area and wall thickness from every individual to develop the model. To validate the proposed model, in 20 male individuals, the intensity and resonant frequencies of modeled snoring sounds were compared with those measured from snoring sounds recorded during sleep. Based on both modeled and measured results, we found that the only factor that may positively and significantly contribute to snoring intensity was narrowing of the upper airway. Furthermore, measured resonant frequencies of snoring were inversely correlated with upper airway length, which is a risk factor for upper airway collapsibility. These results encourage the use of snoring sound analysis to assess upper airway anatomy during sleep. PMID:27210576
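The subject-specific model itself is not given in the abstract, but the reported inverse relation between resonant frequencies and airway length follows from even the crudest acoustic idealization: a tube closed at one end, whose resonances are f_n = (2n-1)·c/(4L). The sketch below uses that textbook formula with an assumed sound speed; it is an illustration of the relationship, not the study's model.

```python
def tube_resonances(length_m, n_modes=3, c=350.0):
    """Resonant frequencies (Hz) of a tube closed at one end, a crude
    stand-in for the upper airway: f_n = (2n-1)*c/(4*L).
    c ~ 350 m/s is an assumed speed of sound in warm, humid air."""
    return [(2 * n - 1) * c / (4.0 * length_m) for n in range(1, n_modes + 1)]
```

A longer airway yields lower resonances across all modes, consistent with the inverse correlation the study reports between snoring resonant frequencies and upper airway length.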
NASA Astrophysics Data System (ADS)
Mia, Mozammel; Al Bashir, Mahmood; Dhar, Nikhil Ranjan
2016-10-01
Hard turning is increasingly employed in machining to replace time-consuming conventional turning followed by grinding. An excessive amount of tool wear in hard turning is one of the main hurdles to be overcome. Many researchers have developed tool wear models, but most apply only to a particular work-tool-environment combination; no aggregate model has been developed that can predict the amount of principal flank wear for a specific machining time. An empirical model of principal flank wear (VB) has been developed for different workpiece hardnesses (HRC40, HRC48 and HRC56) in turning by coated carbide inserts with different configurations (SNMM and SNMG) under both dry and high-pressure coolant conditions. Unlike other models, this model uses dummy variables along with the base empirical equation to capture the effect of any change in the input conditions on the response. The base empirical equation for principal flank wear is formulated by adopting the exponential associate function using the experimental results. The coefficient of each dummy variable reflects the shift of the response from one set of machining conditions to another and is determined by simple linear regression. The independent cutting parameters (cutting speed, feed rate, depth of cut) are kept constant while formulating and analyzing this model. The developed model is validated against different sets of machining responses in turning hardened medium carbon steel with coated carbide inserts. For any particular set, the model can be used to predict the amount of principal flank wear for a specific machining time. Since the predicted results agree well with the experimental data and the average percentage error is below 10%, this model can be used to predict principal flank wear under the stated conditions.
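The abstract does not give the fitted coefficients, but the stated structure, an exponential-associate base curve for VB plus a dummy-variable shift between machining conditions estimated by simple linear regression, can be sketched as follows. All parameter values are illustrative, and the regression is reduced to an intercept-only shift for a single dummy.

```python
import math

def base_wear(t, a, b):
    """Exponential associate base curve: VB(t) = a*(1 - exp(-t/b)),
    wear rising quickly at first, then saturating toward a."""
    return a * (1.0 - math.exp(-t / b))

def dummy_shift(times, wear_obs, a, b):
    """Estimate the dummy-variable coefficient c for a changed machining
    condition: with VB = base + c*D and D = 1 for all observations under
    the new condition, least squares reduces to the mean residual."""
    resid = [w - base_wear(t, a, b) for t, w in zip(times, wear_obs)]
    return sum(resid) / len(resid)

def predict_wear(t, a, b, c=0.0, dummy=0):
    """Predicted principal flank wear at machining time t; dummy=1
    activates the condition shift c."""
    return base_wear(t, a, b) + c * dummy
```

A condition that wears the tool faster shows up as a positive dummy coefficient, shifting the whole base curve upward without refitting it.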
NASA Astrophysics Data System (ADS)
Bouaziz, Laurène; de Boer-Euser, Tanja; Brauer, Claudia; Drogue, Gilles; Fenicia, Fabrizio; Grelier, Benjamin; de Niel, Jan; Nossent, Jiri; Pereira, Fernando; Savenije, Hubert; Thirel, Guillaume; Willems, Patrick
2016-04-01
International collaboration between institutes and universities is a promising way to reach consensus on hydrological model development. Education, experience and expert knowledge of the hydrological community have resulted in the development of a great variety of model concepts, calibration methods and analysis techniques. Although comparison studies are very valuable for international cooperation, they often do not lead to clear new insights regarding the relevance of the modelled processes. We hypothesise that this is partly caused by model complexity and by the comparison methods used, which focus on good overall performance instead of on specific events. We propose an approach that focuses on the evaluation of specific events. Eight international research groups calibrated their model for the Ourthe catchment in Belgium (1607 km²) and carried out a validation in time for the Ourthe (i.e. on two different periods, one of them in blind mode for the modellers) and a validation in space for nested and neighbouring catchments of the Meuse in completely blind mode. For each model, the same protocol was followed and an ensemble of best-performing parameter sets was selected. Signatures were first used to assess model performance in the different catchments during validation. Comparison of the models was then followed by evaluation of selected events, including low flows, high flows and the transition from low to high flows. While the models show rather similar performance based on general metrics (i.e. Nash-Sutcliffe Efficiency), clear differences can be observed for specific events. While most models are able to simulate high flows well, large differences are observed during low flows and in the ability to capture the first peaks after drier months. The transferability of model parameters to neighbouring and nested catchments is assessed as an additional measure in the model evaluation.
This suggested approach helps to select, among competing model alternatives, the most suitable model for a specific purpose.
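The general metric mentioned above, the Nash-Sutcliffe Efficiency, is easy to state and helps explain why event-specific evaluation is needed: squared errors are dominated by flood peaks, so a model can score well overall while badly missing low flows.

```python
def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 - SSE / variance of obs about its mean.
    1 = perfect fit; 0 = no better than predicting the observed mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    svar = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / svar
```

In the check below, a simulation that misses every low-flow value still scores above 0.99 because the single peak dominates the variance, which is precisely the insensitivity that motivates evaluating low flows, high flows, and transitions separately.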
DOE Office of Scientific and Technical Information (OSTI.GOV)
Worley, Rachel Rogers, E-mail: idz7@cdc.gov; Interdisciplinary Toxicology Program, University of Georgia, 341 Pharmacy South, Athens, GA 30602; Fisher, Jeffrey
ABSTRACT: Renal elimination and the resulting clearance of perfluorooctanoic acid (PFOA) from the serum exhibit pronounced sex differences in the adult rat. The literature suggests that this is largely due to hormonally regulated expression of organic anion transporters (OATs) on the apical and basolateral membranes of the proximal tubule cells that facilitate excretion and reabsorption of PFOA from the filtrate into the blood. Previously developed PBPK models of PFOA exposure in the rat have not been parameterized to specifically account for transporter-mediated renal elimination. We developed a PBPK model for PFOA in male and female rats to explore the role of Oat1, Oat3, and Oatp1a1 in sex-specific renal reabsorption and excretion of PFOA. Descriptions of the kinetic behavior of these transporters were extrapolated from in vitro studies and the model was used to simulate time-course serum, liver, and urine data for intravenous (IV) and oral exposures in both sexes. Model predicted concentrations of PFOA in the liver, serum, and urine showed good agreement with experimental data for both male and female rats indicating that in vitro derived physiological descriptions of transporter-mediated renal reabsorption can successfully predict sex-dependent excretion of PFOA in the rat. This study supports the hypothesis that sex-specific serum half-lives for PFOA are largely driven by expression of transporters in the kidney and contribute to the development of PBPK modeling as a tool for evaluating the role of transporters in renal clearance. - Highlights: • The PBPK model for PFOA in the rat explores the role of OATs in sex-specific clearance. • Descriptions of OAT kinetics were extrapolated from in vitro studies. • Model predictions showed good fit with experimental data for male and female rats.
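The published PBPK model is not reproduced here. A one-compartment caricature with first-order filtration and saturable, transporter-mediated reabsorption is enough to show the qualitative mechanism: higher reabsorption capacity (male-like OAT expression) produces a much longer serum half-life than lower capacity (female-like). All parameter values below are invented for illustration.

```python
def serum_timecourse(c0, vmax_reabs, km=1.0, k_filt=0.1, dt=0.1, t_end=100.0):
    """Euler integration of serum concentration with first-order renal
    filtration and saturable transporter-mediated reabsorption that
    returns filtrate to the blood:
        dC/dt = -k_filt*C + vmax*C/(km + C)
    Higher vmax_reabs (more OAT-mediated reuptake) slows net clearance.
    Units and values are arbitrary/illustrative."""
    c, t, series = c0, 0.0, []
    while t < t_end:
        filt = k_filt * c
        reabs = vmax_reabs * c / (km + c)
        reabs = min(reabs, filt)   # cannot reabsorb more than was filtered
        c += dt * (-filt + reabs)
        t += dt
        series.append(c)
    return series
```

With identical starting concentrations, the high-reabsorption ("male-like") simulation retains most of the dose at the end of the run while the low-reabsorption ("female-like") one clears nearly completely, mirroring the sex-specific half-lives the study attributes to renal transporters.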
NASA Astrophysics Data System (ADS)
Knierim, Katherine J.; Nottmeier, Anna M.; Worland, Scott; Westerman, Drew A.; Clark, Brian R.
2017-09-01
Hydrologic budgets to determine groundwater availability are important tools for water-resource managers. One challenging component for developing hydrologic budgets is quantifying water use through time because historical and site-specific water-use data can be sparse or poorly documented. This research developed a groundwater-use record for the Ozark Plateaus aquifer system (central USA) from 1900 to 2010 that related county-level aggregated water-use data to site-specific well locations and aquifer units. A simple population-based linear model, constrained to 0 million liters per day in 1900, provided the best means to extrapolate groundwater-withdrawal rates pre-1950s when there was a paucity of water-use data. To disaggregate county-level data to individual wells across a regional aquifer system, a programmatic hierarchical process was developed, based on the level of confidence that a well pumped groundwater for a specific use during a specific year. Statistical models tested on a subset of the best-available site-specific water-use data provided a mechanism to bracket historic groundwater use, such that groundwater-withdrawal rates ranged, on average, plus or minus 38% from modeled values. Groundwater withdrawn for public supply and domestic use accounted for between 48 and 74% of total groundwater use since 1901, highlighting that groundwater provides an important drinking-water resource. The compilation, analysis, and spatial and temporal extrapolation of water-use data remain a challenging task for water scientists, but is of paramount importance to better quantify groundwater use and availability.
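The described extrapolation can be sketched directly: a through-origin linear fit of withdrawals against population growth since 1900, so that predicted withdrawals are zero in 1900, bracketed by the reported plus-or-minus 38% uncertainty. Variable names and data are illustrative, not the study's.

```python
def fit_withdrawals(pop, withdrawals, pop_1900):
    """Through-origin least squares for W = b*(P - P_1900), which pins
    predicted withdrawals to 0 in 1900 as in the described constraint.
    pop and withdrawals are paired observations from years with data."""
    x = [p - pop_1900 for p in pop]
    b = sum(xi * wi for xi, wi in zip(x, withdrawals)) / sum(xi * xi for xi in x)
    return b

def hindcast(b, pop_t, pop_1900, bracket=0.38):
    """Extrapolated withdrawal (e.g. million liters per day) with the
    +/-38% bracket the study reports for its site-specific models."""
    w = b * (pop_t - pop_1900)
    return w * (1.0 - bracket), w, w * (1.0 + bracket)
```

The slope is fit on the post-1950 record and then run backward: at the 1900 population the hindcast is exactly zero, and every earlier-century estimate carries the stated bracket.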
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1978-01-01
The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.
Toward modeling locomotion using electromyography-informed 3D models: application to cerebral palsy.
Sartori, M; Fernandez, J W; Modenese, L; Carty, C P; Barber, L A; Oberhofer, K; Zhang, J; Handsfield, G G; Stott, N S; Besier, T F; Farina, D; Lloyd, D G
2017-03-01
This position paper proposes a modeling pipeline to develop clinically relevant neuromusculoskeletal models to understand and treat complex neurological disorders. Although applicable to a variety of neurological conditions, we provide direct examples of the pipeline's application in the context of cerebral palsy (CP). This paper highlights technologies in: (1) patient-specific segmental rigid body models developed from magnetic resonance imaging for use in inverse kinematics and inverse dynamics pipelines; (2) efficient population-based approaches to derive skeletal models and muscle origins/insertions that are useful for population statistics and consistent creation of continuum models; (3) continuum muscle descriptions to account for complex muscle architecture including spatially varying material properties with muscle wrapping; (4) muscle and tendon properties specific to CP; and (5) neural-based electromyography-informed methods for muscle force prediction. This represents a novel modeling pipeline that couples, for the first time, electromyography-extracted features of disrupted neuromuscular behavior with advanced numerical methods for modeling CP-specific musculoskeletal morphology and function. The translation of such a pipeline to the clinical level will provide a new class of biomarkers that objectively describe the neuromusculoskeletal determinants of pathological locomotion and complement current clinical assessment techniques, which often rely on subjective judgment. WIREs Syst Biol Med 2017, 9:e1368. doi: 10.1002/wsbm.1368 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.
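Step (5) above, EMG-informed muscle force prediction, can be illustrated with a deliberately minimal sketch: normalized EMG is shaped into activation, which scales a Hill-type maximum isometric force through an active force-length factor. The shaping function, gaussian force-length curve, and all parameter values are generic textbook-style assumptions, not the pipeline's actual formulation:

```python
import math

def activation_from_emg(emg_norm, A=-2.0):
    """Common nonlinear EMG-to-activation shaping; maps 0 -> 0 and 1 -> 1."""
    return (math.exp(A * emg_norm) - 1.0) / (math.exp(A) - 1.0)

def muscle_force(emg_norm, fiber_len_norm, f_max=1000.0):
    """Activation * active force-length factor * max isometric force (N)."""
    a = activation_from_emg(emg_norm)
    # Gaussian active force-length relation, peaking at optimal fiber length.
    f_length = math.exp(-((fiber_len_norm - 1.0) ** 2) / 0.45)
    return a * f_length * f_max
```

In a full pipeline such a function would be driven by processed EMG envelopes and fiber kinematics from the inverse kinematics step; here it only shows the shape of the computation.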
Hewlin, Rodward L; Kizito, John P
2018-03-01
The ultimate goal of the present work is to aid in the development of tools to assist in the treatment of cardiovascular disease. Gaining an understanding of hemodynamic parameters for medical implants allows clinicians to make patient-specific proposals for intervention planning. In the present work, experimental and digital computational fluid dynamics (CFD) arterial models consisting of a number of major arteries (aorta, carotid bifurcation, cranial, femoral, jejunal, and subclavian arteries) were fabricated to study: (1) the effects of local hemodynamics (flow parameters) on global hemodynamics, (2) the effects of transition from bedrest to upright position (postural change) on hemodynamics, and (3) diffusion of dye (simulating medical drug diffusion) in the arterial system via experimental and numerical techniques. The experimental and digital arterial models used in the present study are the first 3-D systems reported in the literature to incorporate the major arterial vessels that deliver blood from the heart to the cranial and femoral arteries. These models are also the first reported in the literature to be used for flow-parameter assessment via medical drug delivery and orthostatic postural change studies. The present work addresses the design of the experimental and digital arterial models, in addition to the design of the measuring tools used to obtain hemodynamic parameters. The experimental and digital arterial models analyzed in the present study were developed from patient-specific computed tomography angiography (CTA) scans and simplified geometric data. Segments such as the aorta (ascending and descending) and carotid bifurcation arteries were created from publicly available patient-specific CTA scan data provided by the Charité Clinical and Research Hospital. The cranial and coronary arteries were simplified arterial geometries developed from dimensional specification data used in previous work.
For the patient-specific geometries, a MATLAB code was written to upload the CTA scans of each artery, calculate the centroids, and produce surface splines at each discrete cross section along the lumen centerline to create the patient-specific arterial geometries. The MATLAB code worked in conjunction with the computer-aided design (CAD) software SolidWorks to produce solid models of the patient-specific geometries and unite them with the simplified geometries to produce the full arterial model (CAD model). The CAD model was also used as a blueprint to fabricate the experimental model, which was used for flow visualization via particle imaging velocimetry (PIV) and postural change studies. A custom pulse duplicator (pulsatile pump) was also designed and developed for the present work. The pulse duplicator is capable of producing patient-specific volumetric waveforms for inlet flow to the experimental arterial model. A simple fluid-structure interaction (FSI) study was also conducted via optical techniques to establish the magnitude of vessel diameter change due to the pulsatile flow. A medical drug delivery (dye dispersion and tracing) case was simulated via a dye dispersed into the pulsatile flow stream to measure the transit time of the dye front. Pressure waveforms for diseased cases (hypertension and stenotic cases) were also obtained from the experimental arterial model during postural changes from bedrest (0°) to upright position (90°). The postural changes were simulated by attaching the experimental model to a tilt table that can transition from 0° to 90°. The PIV results obtained from the experimental model provided parametric data such as velocity and wall shear stress. The medical drug delivery (experimental and numerical) studies produced time-dependent data useful for predicting the flow trajectory and transit time of medical drug dispersion.
In the case of the postural change studies, pressure waveforms were obtained from the common carotid artery and the femoral sections to yield pressure-difference data useful for orthostatic hypotension analysis. Flow parametric data such as vorticity (flow reversal), wall shear stress, normal stress, and medical drug transit data were also obtained from the digital arterial model CFD simulations. Although the present work is preliminary, the experimental and digital models prove useful in providing flow parametric data of interest such as: (1) normal stress, which is useful for predicting the magnitude of forces that could promote arterial rupture or dislodging of medical implants; (2) wall shear stress, which is useful for analyzing the magnitude of drug transport at the arterial wall; (3) vorticity, which is useful for predicting the magnitude of flow reversal; and (4) arterial compliance, in the case of the experimental model, which could be useful in developing FSI numerical simulations that incorporate compliance and more realistically model the flow in the arterial system.
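The centroid step of the patient-specific geometry workflow described above (upload CTA cross sections, calculate centroids, loft splines along the lumen centerline) can be sketched with the standard shoelace formula; the point data and function name are illustrative, not the authors' MATLAB code:

```python
def polygon_centroid(points):
    """Area-weighted centroid of a closed 2-D polygon (shoelace formula)."""
    a = cx = cy = 0.0
    n = len(points)
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]  # wrap around to close the contour
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return cx / (6.0 * a), cy / (6.0 * a)

# Idealized unit-square "cross section": centroid should be (0.5, 0.5).
square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
cx, cy = polygon_centroid(square)
```

Applied per slice, such centroids trace the lumen centerline along which the surface splines are lofted.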
Requirements for Medical Modeling Languages
van der Maas, Arnoud A.F.; Ter Hofstede, Arthur H.M.; Ten Hoopen, A. Johannes
2001-01-01
Objective: The development of tailor-made domain-specific modeling languages is sometimes desirable in medical informatics. Naturally, the development of such languages should be guided. The purpose of this article is to introduce a set of requirements for such languages and show their application in analyzing and comparing existing modeling languages. Design: The requirements arise from the practical experience of the authors and others in the development of modeling languages in both general informatics and medical informatics. The requirements initially emerged from the analysis of information modeling techniques. The requirements are designed to be orthogonal, i.e., one requirement can be violated without violation of the others. Results: The proposed requirements for any modeling language are that it be “formal” with regard to syntax and semantics, “conceptual,” “expressive,” “comprehensible,” “suitable,” and “executable.” The requirements are illustrated using both the medical logic modules of the Arden Syntax as a running example and selected examples from other modeling languages. Conclusion: Activity diagrams of the Unified Modeling Language, task structures for work flows, and Petri nets are discussed with regard to the list of requirements, and various tradeoffs are thus made explicit. It is concluded that this set of requirements has the potential to play a vital role in both the evaluation of existing domain-specific languages and the development of new ones. PMID:11230383
The International Reference Ionosphere 2012 - a model of international collaboration
NASA Astrophysics Data System (ADS)
Bilitza, Dieter; Altadill, David; Zhang, Yongliang; Mertens, Chris; Truhlik, Vladimir; Richards, Phil; McKinnell, Lee-Anne; Reinisch, Bodo
2014-02-01
The International Reference Ionosphere (IRI) project was established jointly by the Committee on Space Research (COSPAR) and the International Union of Radio Science (URSI) in the late sixties with the goal to develop an international standard for the specification of plasma parameters in the Earth's ionosphere. COSPAR needed such a specification for the evaluation of environmental effects on spacecraft and experiments in space, and URSI for radiowave propagation studies and applications. At the request of COSPAR and URSI, IRI was developed as a data-based model to avoid the uncertainty of theory-based models which are only as good as the evolving theoretical understanding. Being based on most of the available and reliable observations of the ionospheric plasma from the ground and from space, IRI describes monthly averages of electron density, electron temperature, ion temperature, ion composition, and several additional parameters in the altitude range from 60 km to 2000 km. A working group of about 50 international ionospheric experts is in charge of developing and improving the IRI model. Over time as new data became available and new modeling techniques emerged, steadily improved editions of the IRI model have been published. This paper gives a brief history of the IRI project and describes the latest version of the model, IRI-2012. It also briefly discusses efforts to develop a real-time IRI model. The IRI homepage is at http://IRImodel.org.
NASA Technical Reports Server (NTRS)
Santi, L. Michael; Helmicki, Arthur J.
1993-01-01
The objective of Phase I of this research effort was to develop an advanced mathematical-empirical model of SSME steady-state performance. Task 6 of Phase I is to develop a component-specific modification strategy for the baseline-case influence coefficient matrices. This report describes the background of SSME performance characteristics and provides a description of the control-variable basis of three different gains models. The procedure used to establish influence coefficients for each of these three models is also described. Gains model analysis results are compared to Rocketdyne's power balance model (PBM).
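An influence coefficient of the kind used in such gains models is commonly defined as a normalized sensitivity, C_ij = (x_j / y_i) * dy_i/dx_j, evaluated at a baseline operating point. A sketch estimating these by finite differences around a stand-in model (not the SSME power balance model):

```python
def influence_coefficients(model, x0, rel_step=1e-6):
    """C[i][j] ~ (x_j / y_i) * dy_i/dx_j via one-sided finite differences at x0."""
    y0 = model(x0)
    C = []
    for i, yi in enumerate(y0):
        row = []
        for j, xj in enumerate(x0):
            xp = list(x0)
            dx = xj * rel_step
            xp[j] = xj + dx
            row.append((xj / yi) * ((model(xp)[i] - yi) / dx))
        C.append(row)
    return C

# Stand-in two-input, two-output model: y1 = x1*x2, y2 = x1/x2.
# Exact influence coefficients are [[1, 1], [1, -1]].
demo = lambda x: [x[0] * x[1], x[0] / x[1]]
C = influence_coefficients(demo, [2.0, 4.0])
```

Because the coefficients are dimensionless (percent change in output per percent change in input), they can be compared directly across components, which is what makes a component-specific modification strategy for the matrices tractable.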
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dobromir Panayotov; Andrew Grief; Brad J. Merrill
'Fusion for Energy' (F4E) designs, develops, and implements the European Test Blanket Systems (TBS) in ITER - Helium-Cooled Lithium-Lead (HCLL) and Helium-Cooled Pebble-Bed (HCPB). Safety demonstration is an essential element for the integration of TBS in ITER, and accident analyses are one of its critical segments. A systematic approach to the accident analyses was acquired under the F4E contract on TBS safety analyses. F4E technical requirements and AMEC and INL efforts resulted in the development of a comprehensive methodology for fusion breeding blanket accident analyses. It addresses the specificity of the breeding blanket designs, materials, and phenomena, and at the same time is consistent with the methodology already applied to ITER accident analyses. The methodology consists of several phases. First, the reference scenarios are selected on the basis of FMEA studies. Second, in elaborating the accident-analysis specifications, phenomena identification and ranking tables are used to identify the requirements to be met by the code(s) and TBS models. The limitations of the codes are thus identified, and possible solutions to be built into the models are proposed. These include, among others, the loose coupling of different codes or code versions in order to simulate multi-fluid flows and phenomena. Code selection and the issue of the accident-analysis specifications conclude this second step. Next, the breeding blanket and ancillary-system models are built. In this work, challenges met and solutions used in the development of MELCOR and RELAP5 code models of the HCLL and HCPB TBSs are shared. The developed models are then qualified by comparison with finite-element analyses, by code-to-code comparison, and by sensitivity studies. Finally, the qualified models are used to execute the accident analyses of specific scenarios. Where possible, the methodology phases are illustrated in the paper by a limited number of tables and figures.
A detailed description of each phase and its results, as well as the application of the methodology to the EU HCLL and HCPB TBSs, will be published in separate papers. The developed methodology is applicable to accident analyses of other TBSs to be tested in ITER, as well as to DEMO breeding blankets.
Conceptualizing a model: a report of the ISPOR-SMDM Modeling Good Research Practices Task Force--2.
Roberts, Mark; Russell, Louise B; Paltiel, A David; Chambers, Michael; McEwan, Phil; Krahn, Murray
2012-01-01
The appropriate development of a model begins with understanding the problem that is being represented. The aim of this article was to provide a series of consensus-based best practices regarding the process of model conceptualization. For the purpose of this series of articles, we consider the development of models whose purpose is to inform medical decisions and health-related resource allocation questions. We specifically divide the conceptualization process into two distinct components: the conceptualization of the problem, which converts knowledge of the health care process or decision into a representation of the problem, followed by the conceptualization of the model itself, which matches the attributes and characteristics of a particular modeling type with the needs of the problem being represented. Recommendations are made regarding the structure of the modeling team, agreement on the statement of the problem, the structure, perspective, and target population of the model, and the interventions and outcomes represented. Best practices relating to the specific characteristics of model structure and which characteristics of the problem might be most easily represented in a specific modeling method are presented. Each section contains a number of recommendations that were iterated among the authors, as well as among the wider modeling taskforce, jointly set up by the International Society for Pharmacoeconomics and Outcomes Research and the Society for Medical Decision Making. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Toxin-Induced Experimental Models of Learning and Memory Impairment
More, Sandeep Vasant; Kumar, Hemant; Cho, Duk-Yeon; Yun, Yo-Sep; Choi, Dong-Kug
2016-01-01
Animal models for learning and memory have significantly contributed to novel strategies for drug development and hence are an imperative part of the assessment of therapeutics. Learning and memory involve different stages, including acquisition, consolidation, and retrieval, and each stage can be characterized using a specific toxin. Recent studies have postulated the molecular basis of these processes and have also demonstrated many signaling molecules that are involved in several stages of memory. Most insights into learning and memory impairment, and into developing novel compounds, stem from investigations performed in experimental models, especially those produced by neurotoxins. Several toxins have been utilized based on their mechanism of action for learning and memory impairment, such as scopolamine, streptozotocin, quinolinic acid, and domoic acid. Further, some toxins like 6-hydroxydopamine (6-OHDA), 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP), and amyloid-β are known to cause specific learning and memory impairments that imitate the disease pathology of Parkinson’s disease dementia and Alzheimer’s disease dementia. Apart from these, several other toxins fall into a miscellaneous category, including environmental pollutants, snake venoms, botulinum toxin, and lipopolysaccharide. This review will focus on the various classes of neurotoxin models for learning and memory impairment, with their specific mechanisms of action, which could assist the process of drug discovery and development for dementia and cognitive disorders. PMID:27598124
Artificial intelligence support for scientific model-building
NASA Technical Reports Server (NTRS)
Keller, Richard M.
1992-01-01
Scientific model-building can be a time-intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientific development team to understand. We believe that artificial intelligence techniques can facilitate both the model-building and model-sharing process. In this paper, we overview our effort to build a scientific modeling software tool that aids the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high-level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities.
A Cognitive Diagnosis Model for Cognitively Based Multiple-Choice Options
ERIC Educational Resources Information Center
de la Torre, Jimmy
2009-01-01
Cognitive or skills diagnosis models are discrete latent variable models developed specifically for the purpose of identifying the presence or absence of multiple fine-grained skills. However, applications of these models typically involve dichotomous or dichotomized data, including data from multiple-choice (MC) assessments that are scored as…
Multilevel Modeling: A Review of Methodological Issues and Applications
ERIC Educational Resources Information Center
Dedrick, Robert F.; Ferron, John M.; Hess, Melinda R.; Hogarty, Kristine Y.; Kromrey, Jeffrey D.; Lang, Thomas R.; Niles, John D.; Lee, Reginald S.
2009-01-01
This study analyzed the reporting of multilevel modeling applications of a sample of 99 articles from 13 peer-reviewed journals in education and the social sciences. A checklist, derived from the methodological literature on multilevel modeling and focusing on the issues of model development and specification, data considerations, estimation, and…
Iteration and Prototyping in Creating Technical Specifications.
ERIC Educational Resources Information Center
Flynt, John P.
1994-01-01
Claims that the development process for computer software can be greatly aided by the writers of specifications if they employ basic iteration and prototyping techniques. Asserts that computer software configuration management practices provide ready models for iteration and prototyping. (HB)
Development and characterization of high-efficiency, high-specific impulse xenon Hall thrusters
NASA Astrophysics Data System (ADS)
Hofer, Richard Robert
This dissertation presents research aimed at extending the efficient operation of 1600 s specific impulse Hall thruster technology to the 2000--3000 s range. While recent studies of commercially developed Hall thrusters demonstrated greater than 4000 s specific impulse, maximum efficiency occurred at less than 3000 s. It was hypothesized that the efficiency maximum resulted as a consequence of modern magnetic field designs, optimized for 1600 s, which were unsuitable at high-specific impulse. Motivated by the industry efforts and mission studies, the aim of this research was to develop and characterize xenon Hall thrusters capable of both high-specific impulse and high-efficiency operation. The research divided into development and characterization phases. During the development phase, the laboratory-model NASA-173M Hall thrusters were designed with plasma lens magnetic field topographies and their performance and plasma characteristics were evaluated. Experiments with the NASA-173M version 1 (v1) validated the plasma lens design by showing how changing the magnetic field topography at high-specific impulse improved efficiency. Experiments with the NASA-173M version 2 (v2) showed there was a minimum current density and optimum magnetic field topography at which efficiency monotonically increased with voltage. Between 300--1000 V, total specific impulse and total efficiency of the NASA-173Mv2 operating at 10 mg/s ranged from 1600--3400 s and 51--61%, respectively. Comparison of the thrusters showed that efficiency can be optimized for specific impulse by varying the plasma lens design. During the characterization phase, additional plasma properties of the NASA-173Mv2 were measured and a performance model was derived accounting for a multiply-charged, partially-ionized plasma. Results from the model based on experimental data showed how efficient operation at high-specific impulse was enabled through regulation of the electron current with the magnetic field. 
The decrease of efficiency due to multiply-charged ions was minor. Efficiency was largely determined by the current utilization, which suggested maximum Hall thruster efficiency has yet to be reached. The electron Hall parameter was approximately constant with voltage, decreasing from an average of 210 at 300 V to an average of 160 between 400--900 V, which confirmed efficient operation can be realized only over a limited range of Hall parameters.
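The performance quantities reported above follow from the standard definitions Isp = T / (mdot * g0) and total efficiency eta = T^2 / (2 * mdot * P). A sketch with illustrative numbers in the thruster's reported operating range (the thrust and power values are assumptions, not measured NASA-173Mv2 data):

```python
G0 = 9.80665  # standard gravity, m/s^2

def specific_impulse(thrust_n, mdot_kg_s):
    """Total specific impulse in seconds: Isp = T / (mdot * g0)."""
    return thrust_n / (mdot_kg_s * G0)

def total_efficiency(thrust_n, mdot_kg_s, power_w):
    """Total thrust efficiency: eta = T^2 / (2 * mdot * P)."""
    return thrust_n ** 2 / (2.0 * mdot_kg_s * power_w)

# Illustrative point at the 10 mg/s flow rate: 250 mN thrust, 5.4 kW input power.
isp = specific_impulse(0.250, 10e-6)          # ~2549 s, inside the 1600-3400 s range
eta = total_efficiency(0.250, 10e-6, 5400.0)  # ~0.58, inside the 51-61% range
```

These definitions make the trade explicit: at fixed mass flow, raising discharge voltage raises exhaust velocity (and hence Isp), while efficiency depends on how much of the input power ends up as directed thrust power.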
Youth Sport Readiness: A Predictive Model for Success.
ERIC Educational Resources Information Center
Aicinena, Steven
1992-01-01
A model for predicting organized youth sport participation readiness has four predictive components: sport-related fundamental motor skill development; sport-specific knowledge; motivation; and socialization. Physical maturation is also important. The model emphasizes the importance of preparing children for successful participation through…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-27
... applications and assessment of site specific, generic, and process-oriented multimedia environmental models as... development and simulation supports interagency interests in risk assessment, uncertainty analyses, management...
PHYSICOCHEMICAL PROPERTY CALCULATIONS
Computer models have been developed to estimate a wide range of physical-chemical properties from molecular structure. The SPARC modeling system approaches calculations as site specific reactions (pKa, hydrolysis, hydration) and `whole molecule' properties (vapor pressure, boilin...
Progress in developing Poisson-Boltzmann equation solvers
Li, Chuan; Li, Lin; Petukh, Marharyta; Alexov, Emil
2013-01-01
This review outlines the recent progress made in developing more accurate and efficient solutions to model electrostatics in systems comprised of bio-macromolecules and nano-objects, the latter referring to objects that do not have biological function themselves but nowadays are frequently used in biophysical and medical approaches in conjunction with bio-macromolecules. The problem of modeling macromolecular electrostatics is reviewed from two different angles: as a mathematical task, given the specific definition of the system to be modeled, and as a physical problem aiming to better capture the phenomena occurring in real experiments. In addition, specific attention is paid to methods that extend the capabilities of the existing solvers to model large systems, toward applications such as calculations of the electrostatic potential and energies in molecular motors, the mitochondria complex, photosynthetic machinery, and systems involving large nano-objects. PMID:24199185
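One analytic benchmark commonly used when validating such solvers is the linearized (Debye-Hückel) limit of the Poisson-Boltzmann equation, where a point charge in a uniform dielectric with inverse screening length kappa has the closed-form potential phi(r) = q * exp(-kappa*r) / (4*pi*eps0*eps_r*r). A generic sketch, not tied to any specific solver reviewed:

```python
import math

EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def screened_potential(q, r, kappa, eps_rel=80.0):
    """Debye-Hückel potential (V) at distance r (m) from point charge q (C)."""
    return q * math.exp(-kappa * r) / (4.0 * math.pi * EPS0 * eps_rel * r)

# kappa = 0 recovers the bare Coulomb potential; kappa > 0 screens it.
q_e = 1.602176634e-19  # elementary charge, C
```

Comparing a numerical solver's output against this closed form for a single ion in solvent is a cheap correctness check before attempting the large systems discussed above.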
A Logic Model for Evaluating the Academic Health Department.
Erwin, Paul Campbell; McNeely, Clea S; Grubaugh, Julie H; Valentine, Jennifer; Miller, Mark D; Buchanan, Martha
2016-01-01
Academic Health Departments (AHDs) are collaborative partnerships between academic programs and practice settings. While case studies have informed our understanding of the development and activities of AHDs, there has been no formal published evaluation of AHDs, either singularly or collectively. Developing a framework for evaluating AHDs has potential to further aid our understanding of how these relationships may matter. In this article, we present a general theory of change, in the form of a logic model, for how AHDs impact public health at the community level. We then present a specific example of how the logic model has been customized for a specific AHD. Finally, we end with potential research questions on the AHD based on these concepts. We conclude that logic models are valuable tools, which can be used to assess the value and ultimate impact of the AHD.
Francy, Donna S.; Brady, Amie M.G.; Carvin, Rebecca B.; Corsi, Steven R.; Fuller, Lori M.; Harrison, John H.; Hayhurst, Brett A.; Lant, Jeremiah; Nevers, Meredith B.; Terrio, Paul J.; Zimmerman, Tammy M.
2013-01-01
Predictive models have been used at beaches to improve the timeliness and accuracy of recreational water-quality assessments over the most common current approach to water-quality monitoring, which relies on culturing fecal-indicator bacteria such as Escherichia coli (E. coli). Beach-specific predictive models use environmental and water-quality variables that are easily and quickly measured as surrogates to estimate concentrations of fecal-indicator bacteria or to provide the probability that a State recreational water-quality standard will be exceeded. When predictive models are used for beach closure or advisory decisions, they are referred to as “nowcasts.” During the recreational seasons of 2010-12, the U.S. Geological Survey (USGS), in cooperation with 23 local and State agencies, worked to improve existing nowcasts at 4 beaches, validate predictive models at another 38 beaches, and collect data for predictive-model development at 7 beaches throughout the Great Lakes. This report summarizes efforts to collect data and develop predictive models by multiple agencies and to compile existing information on the beaches and beach-monitoring programs into one comprehensive report. Local agencies measured E. coli concentrations and variables expected to affect E. coli concentrations, such as wave height, turbidity, water temperature, and numbers of birds at the time of sampling. In addition to these field measurements, equipment was installed by the USGS or local agencies at or near several beaches to collect water-quality and meteorological measurements in near real time, including nearshore buoys, weather stations, and tributary staff gages and monitors. The USGS worked with local agencies to retrieve data from existing sources either manually or by use of tools designed specifically to compile and process data for predictive-model development.
Predictive models were developed by use of linear regression and (or) partial least squares techniques for 42 beaches that had at least 2 years of data (2010-11 and sometimes earlier) and for 1 beach that had 1 year of data. For most models, software designed for model development by the U.S. Environmental Protection Agency (Virtual Beach) was used. The selected model for each beach was based on a combination of explanatory variables including, most commonly, turbidity, day of the year, change in lake level over 24 hours, wave height, wind direction and speed, and antecedent rainfall for various time periods. Forty-two predictive models were validated against data collected during an independent year (2012) and compared to the current method for assessing recreational water quality, which uses the previous day’s E. coli concentration (the persistence model). Goals for good predictive-model performance were responses that were at least 5 percent greater than the persistence model and overall correct responses greater than or equal to 80 percent, sensitivities (percentage of exceedances of the bathing-water standard that were correctly predicted by the model) greater than or equal to 50 percent, and specificities (percentage of nonexceedances correctly predicted by the model) greater than or equal to 85 percent. Out of 42 predictive models, 24 models yielded overall correct responses that were at least 5 percent greater than the use of the persistence model. Predictive-model responses met the performance goals more often than the persistence-model responses in terms of overall correctness (28 versus 17 models, respectively), sensitivity (17 versus 4 models), and specificity (34 versus 25 models). Gaining knowledge of each beach and the factors that affect E. coli concentrations is important for developing good predictive models. Collection of additional years of data with a wide range of environmental conditions may also help to improve future model performance.
The USGS will continue to work with local agencies in 2013 and beyond to develop and validate predictive models at beaches and improve existing nowcasts, restructuring monitoring activities to accommodate future uncertainties in funding and resources.
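The performance goals quoted above are standard binary-classification metrics computed on predicted versus observed standard exceedances. A sketch with made-up predictions (not the study's data):

```python
def model_metrics(predicted_exceed, actual_exceed):
    """Overall correctness, sensitivity, and specificity for exceedance predictions."""
    pairs = list(zip(predicted_exceed, actual_exceed))
    tp = sum(1 for p, a in pairs if p and a)          # exceedance correctly predicted
    tn = sum(1 for p, a in pairs if not p and not a)  # nonexceedance correctly predicted
    fp = sum(1 for p, a in pairs if p and not a)
    fn = sum(1 for p, a in pairs if not p and a)
    return {
        "overall_correct": (tp + tn) / len(pairs),  # goal: >= 0.80
        "sensitivity": tp / (tp + fn),              # goal: >= 0.50
        "specificity": tn / (tn + fp),              # goal: >= 0.85
    }

# Ten hypothetical beach-days: True = bathing-water standard exceeded.
predicted = [True, True, False, False, False, False, False, True, False, False]
observed  = [True, False, False, False, False, False, False, True, True, False]
metrics = model_metrics(predicted, observed)
```

The same function applied to a persistence model's predictions gives the baseline against which the report's 5-percent improvement goal is judged.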
Xu, Y; Li, YF; Zhang, D; Dockendorf, M; Tetteh, E; Rizk, ML; Grobler, JA; Lai, M‐T; Gobburu, J
2016-01-01
We applied model‐based meta‐analysis of viral suppression as a function of drug exposure and in vitro potency for short‐term monotherapy in human immunodeficiency virus type 1 (HIV‐1)‐infected treatment‐naïve patients to set pharmacokinetic targets for development of nonnucleoside reverse transcriptase inhibitors (NNRTIs) and integrase strand transfer inhibitors (InSTIs). We developed class‐specific models relating viral load kinetics from monotherapy studies to potency‐normalized steady‐state trough plasma concentrations. These models were integrated with a literature assessment of doses demonstrated to have long‐term efficacy in combination therapy, in order to set steady‐state trough concentration targets of 6.17‐ and 2.15‐fold above potency for NNRTIs and InSTIs, respectively. Both the models developed and the pharmacokinetic targets derived can be used to guide compound selection during preclinical development and to predict the dose–response of new antiretrovirals to inform early clinical trial design. PMID:27171172
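Applying the class-specific trough targets reported above amounts to a simple potency-normalized comparison. The sketch below is illustrative, not the authors' model: the fold-above-potency multipliers (6.17 for NNRTIs, 2.15 for InSTIs) come from the abstract, while the function name and example concentrations are hypothetical.

```python
# Class-specific steady-state trough targets, expressed as fold above
# in vitro potency (values from the abstract; units must match, e.g. ng/mL).
TROUGH_FOLD_TARGET = {"NNRTI": 6.17, "InSTI": 2.15}

def meets_trough_target(drug_class, trough, potency):
    """True if a candidate's projected steady-state trough concentration
    is at least the class-specific multiple of its in vitro potency."""
    return trough >= TROUGH_FOLD_TARGET[drug_class] * potency
```

For example, a hypothetical NNRTI with a potency of 10 ng/mL would need a projected trough of at least 61.7 ng/mL to meet the target.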
Diabetes-associated dry eye syndrome in a new humanized transgenic model of type 1 diabetes.
Imam, Shahnawaz; Elagin, Raya B; Jaume, Juan Carlos
2013-01-01
Patients with Type 1 Diabetes (T1D) are at high risk of developing lacrimal gland dysfunction. We have developed a new model of human T1D using double-transgenic mice carrying HLA-DQ8 diabetes-susceptibility haplotype instead of mouse MHC-class II and expressing the human beta cell autoantigen Glutamic Acid Decarboxylase in pancreatic beta cells. We report here the development of dry eye syndrome (DES) after diabetes induction in our humanized transgenic model. Double-transgenic mice were immunized with DNA encoding human GAD65, either naked or in adenoviral vectors, to induce T1D. Mice monitored for development of diabetes developed lacrimal gland dysfunction. Animals developed lacrimal gland disease (classically associated with diabetes in Non Obese Diabetic [NOD] mice and with T1D in humans) as they developed glucose intolerance and diabetes. Animals manifested obvious clinical signs of dry eye syndrome (DES), from corneal erosions to severe keratitis. Histological studies of peri-bulbar areas revealed lymphocytic infiltration of glandular structures. Indeed, infiltrative lesions were observed in lacrimal/Harderian glands within weeks following development of glucose intolerance. Lesions ranged from focal lymphocytic infiltration to complete acinar destruction. We observed a correlation between the severity of the pancreatic infiltration and the severity of the ocular disease. Our results demonstrate development of DES in association with antigen-specific insulitis and diabetes following immunization with clinically relevant human autoantigen concomitantly expressed in pancreatic beta cells of diabetes-susceptible mice. As in the NOD mouse model and as in human T1D, our animals developed diabetes-associated DES. This specific finding stresses the relevance of our model for studying these human diseases. We believe our model will facilitate studies to prevent/treat diabetes-associated DES as well as human diabetes.
Madaniyazi, Lina; Guo, Yuming; Chen, Renjie; Kan, Haidong; Tong, Shilu
2016-01-01
Estimating the burden of mortality associated with particulates requires knowledge of exposure-response associations. However, the evidence on exposure-response associations is limited in many cities, especially in developing countries. In this study, we predicted associations of particulates smaller than 10 μm in aerodynamic diameter (PM10) with mortality in 73 Chinese cities. The meta-regression model was used to test and quantify which city-specific characteristics contributed significantly to the heterogeneity of PM10-mortality associations for 16 Chinese cities. Then, those city-specific characteristics with statistically significant regression coefficients were treated as independent variables to build multivariate meta-regression models. The model with the best fit was used to predict PM10-mortality associations in 73 Chinese cities in 2010. Mean temperature, PM10 concentration and green space per capita could best explain the heterogeneity in PM10-mortality associations. Based on city-specific characteristics, we were able to develop multivariate meta-regression models to predict associations between air pollutants and health outcomes reasonably well. Copyright © 2015 Elsevier Ltd. All rights reserved.
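The second-stage meta-regression step described above can be sketched as an inverse-variance-weighted least squares fit: city-specific PM10-mortality coefficients are regressed on city characteristics, and the fitted model predicts the association for a city that lacks its own estimate. This is a minimal NumPy sketch; all numbers are fabricated for illustration, and the three predictors (mean temperature, PM10 level, green space per capita) are the ones the abstract names.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cities = 16

# City characteristics: intercept, mean temperature (deg C),
# PM10 concentration (ug/m3), green space per capita (m2). All fabricated.
X = np.column_stack([
    np.ones(n_cities),
    rng.uniform(5, 25, n_cities),
    rng.uniform(40, 150, n_cities),
    rng.uniform(2, 20, n_cities),
])

# Hypothetical "true" meta-regression coefficients and noisy first-stage
# estimates (e.g., % increase in mortality per 10 ug/m3 PM10).
true_beta = np.array([0.8, 0.01, -0.002, 0.005])
effects = X @ true_beta + rng.normal(0, 0.02, n_cities)

# Inverse-variance weights from the first-stage standard errors.
weights = 1.0 / rng.uniform(0.01, 0.05, n_cities) ** 2
W = np.diag(weights)

# Weighted least squares: solve (X'WX) beta = X'W y.
beta_hat = np.linalg.solve(X.T @ W @ X, X.T @ W @ effects)

# Predict the PM10-mortality association for a new city from its
# characteristics alone.
new_city = np.array([1.0, 18.0, 90.0, 6.0])
predicted_effect = new_city @ beta_hat
```

In practice a random-effects meta-regression would also estimate between-city heterogeneity; the fixed-effect WLS step above is the core of the prediction.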
NASA Astrophysics Data System (ADS)
Aycock, Kenneth; Sastry, Shankar; Kim, Jibum; Shontz, Suzanne; Campbell, Robert; Manning, Keefe; Lynch, Frank; Craven, Brent
2013-11-01
A computational methodology for simulating inferior vena cava (IVC) filter placement and IVC hemodynamics was developed and tested on two patient-specific IVC geometries: a left-sided IVC, and an IVC with a retroaortic left renal vein. Virtual IVC filter placement was performed with finite element analysis (FEA) using non-linear material models and contact modeling, yielding maximum vein displacements of approximately 10% of the IVC diameters. Blood flow was then simulated using computational fluid dynamics (CFD) with four cases for each patient IVC: 1) an IVC only, 2) an IVC with a placed filter, 3) an IVC with a placed filter and a model embolus, all at resting flow conditions, and 4) an IVC with a placed filter and a model embolus at exercise flow conditions. Significant hemodynamic differences were observed between the two patient IVCs, with the development of a right-sided jet (all cases) and a larger stagnation region (cases 3-4) in the left-sided IVC. These results support further investigation of the effects of IVC filter placement on a patient-specific basis.
Fukuda, Haruhisa; Kuroki, Manabu
2016-03-01
To develop and internally validate a surgical site infection (SSI) prediction model for Japan. Retrospective observational cohort study. We analyzed surveillance data submitted to the Japan Nosocomial Infections Surveillance system for patients who had undergone target surgical procedures from January 1, 2010, through December 31, 2012. Logistic regression analyses were used to develop statistical models for predicting SSIs. An SSI prediction model was constructed for each of the procedure categories by statistically selecting the appropriate risk factors from among the collected surveillance data and determining their optimal categorization. Standard bootstrapping techniques were applied to assess potential overfitting. The C-index was used to compare the predictive performances of the new statistical models with those of models based on conventional risk index variables. The study sample comprised 349,987 cases from 428 participant hospitals throughout Japan, and the overall SSI incidence was 7.0%. The C-indices of the new statistical models were significantly higher than those of the conventional risk index models in 21 (67.7%) of the 31 procedure categories (P<.05). No significant overfitting was detected. Japan-specific SSI prediction models were shown to generally have higher accuracy than conventional risk index models. These new models may have applications in assessing hospital performance and identifying high-risk patients in specific procedure categories.
[Model-based biofuels system analysis: a review].
Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin
2011-03-01
Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.
Falcinelli, Shane; Gowen, Brian B.; Trost, Brett; Napper, Scott; Kusalik, Anthony; Johnson, Reed F.; Safronetz, David; Prescott, Joseph; Wahl-Jensen, Victoria; Jahrling, Peter B.; Kindrachuk, Jason
2015-01-01
The Syrian golden hamster has been increasingly used to study viral hemorrhagic fever (VHF) pathogenesis and countermeasure efficacy. As VHFs are a global health concern, well-characterized animal models are essential for both the development of therapeutics and vaccines as well as for increasing our understanding of the molecular events that underlie viral pathogenesis. However, the paucity of reagents or platforms that are available for studying hamsters at a molecular level limits the ability to extract biological information from this important animal model. As such, there is a need to develop platforms/technologies for characterizing host responses of hamsters at a molecular level. To this end, we developed hamster-specific kinome peptide arrays to characterize the molecular host response of the Syrian golden hamster. After validating the functionality of the arrays using immune agonists of defined signaling mechanisms (lipopolysaccharide (LPS) and tumor necrosis factor (TNF)-α), we characterized the host response in a hamster model of VHF based on Pichinde virus (PICV) infection by performing temporal kinome analysis of lung tissue. Our analysis revealed key roles for vascular endothelial growth factor (VEGF), interleukin (IL) responses, nuclear factor kappa-light-chain-enhancer of activated B cells (NF-κB) signaling, and Toll-like receptor (TLR) signaling in the response to PICV infection. These findings were validated through phosphorylation-specific Western blot analysis. Overall, we have demonstrated that hamster-specific kinome arrays are a robust tool for characterizing the species-specific molecular host response in a VHF model. Further, our results provide key insights into the hamster host response to PICV infection and will inform future studies with high-consequence VHF pathogens. PMID:25573744
NASA Astrophysics Data System (ADS)
Rivera, I.; Chadwick, B.; Rosen, G.; Wang, P. F.; Paquin, P.; Santore, R.; Ryan, A.
2015-12-01
Understanding the bioavailability of metals in the aquatic environment is important for defining appropriate regulatory constraints. A failure to recognize the importance of bioavailability factors on metal toxicity can result in criteria that are over- or under-protective. USEPA addresses the tendency of the national Water Quality Criterion (WQC) for regulation of copper in marine waters to underestimate the natural attenuation of copper toxicity in harbors by the application of site-specific Water Quality Standards (WQS), which provide the level of protection intended by the WQC and establish realistic regulatory objectives. However, development of site-specific WQS involves a long-term effort and does not account for temporal variation. The seawater Biotic Ligand Model (BLM), a toxicity model, was developed and integrated with the existing Curvilinear Hydrodynamics in 3 Dimensions (CH3D) transport and fate model to create an efficient tool for development of site-specific WQS in harbors. The integrated model was demonstrated at a harbor-wide scale in San Diego Bay and Pearl Harbor, and accounted for the natural physical, chemical, biological, and toxicological characteristics of each harbor to achieve more scientifically based compliance. In both harbors, the spatial and temporal distributions of copper species, toxic effects, and Water Effect Ratio predicted by the integrated model are comparable to previous data. The model was further demonstrated in the Shelter Island Yacht Basin (SIYB) marina in San Diego Bay. The integrated model agreed with toxicological and chemical approaches by indicating negligible bioavailability and, but for a single event, no toxicity, even though an increasing gradient in Cu was observed both horizontally and vertically, with concentrations that reached levels well above current regulatory thresholds. These results support the incorporation by USEPA of the seawater BLM in a full-strength seawater criterion.
2006-05-01
The guinea pig model does present a significant problem...trying to correlate behavioral and protein changes due to the absence of guinea pig-specific antibodies. We have developed a procedure to determine the specificity of commercially available, non-guinea pig-specific antibodies in guinea pig lysates.
NASA Astrophysics Data System (ADS)
Creutzig, Felix; Corbera, Esteve; Bolwig, Simon; Hunsberger, Carol
2013-09-01
Integrated assessment models suggest that the large-scale deployment of bioenergy could contribute to ambitious climate change mitigation efforts. However, such a shift would intensify the global competition for land, with possible consequences for 1.5 billion smallholder livelihoods that these models do not consider. Maintaining and enhancing robust livelihoods upon bioenergy deployment is an equally important sustainability goal that warrants greater attention. The social implications of biofuel production are complex, varied and place-specific, difficult to model, operationalize and quantify. However, a rapidly developing body of social science literature is advancing the understanding of these interactions. In this letter we link human geography research on the interaction between biofuel crops and livelihoods in developing countries to integrated assessments on biofuels. We review case-study research focused on first-generation biofuel crops to demonstrate that food, income, land and other assets such as health are key livelihood dimensions that can be impacted by such crops and we highlight how place-specific and global dynamics influence both aggregate and distributional outcomes across these livelihood dimensions. We argue that place-specific production models and land tenure regimes mediate livelihood outcomes, which are also in turn affected by global and regional markets and their resulting equilibrium dynamics. The place-specific perspective suggests that distributional consequences are a crucial complement to aggregate outcomes; this has not been given enough weight in comprehensive assessments to date. By narrowing the gap between place-specific case studies and global models, our discussion offers a route towards integrating livelihood and equity considerations into scenarios of future bioenergy deployment, thus contributing to a key challenge in sustainability sciences.
On the pursuit of a nuclear development capability: The case of the Cuban nuclear program
NASA Astrophysics Data System (ADS)
Benjamin-Alvarado, Jonathan Calvert
1998-09-01
While there have been many excellent descriptive accounts of modernization schemes in developing states, energy development studies based on prevalent modernization theory have been rare. Moreover, heretofore there have been very few analyses of efforts to develop a nuclear energy capability by developing states. Rarely have these analyses employed social science research methodologies. The purpose of this study was to develop a general analytical framework, based on such a methodology to analyze nuclear energy development and to utilize this framework for the study of the specific case of Cuba's decision to develop nuclear energy. The analytical framework developed focuses on a qualitative tracing of the process of Cuban policy objectives and implementation to develop a nuclear energy capability, and analyzes the policy in response to three models of modernization offered to explain the trajectory of policy development. These different approaches are the politically motivated modernization model, the economic and technological modernization model and the economic and energy security model. Each model provides distinct and functionally differentiated expectations for the path of development toward this objective. Each model provides expected behaviors to external stimuli that would result in specific policy responses. In the study, Cuba's nuclear policy responses to stimuli from domestic constraints and intensities, institutional development, and external influences are analyzed. The analysis revealed that in pursuing the nuclear energy capability, Cuba primarily responded by filtering most of the stimuli through the twin objectives of economic rationality and technological advancement. 
Based upon the Cuban policy responses to the domestic and international stimuli, the study concluded that the economic and technological modernization model of nuclear energy development offered a more complete explanation of the trajectory of policy development than either the politically-motivated or economic and energy security models. The findings of this case pose some interesting questions for the general study of energy programs in developing states. By applying the analytical framework employed in this study to a number of other cases, perhaps the understanding of energy development schemes may be expanded through future research.
Hardware proofs using EHDM and the RSRE verification methodology
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Sjogren, Jon A.
1988-01-01
Examined is a methodology for hardware verification developed by the Royal Signals and Radar Establishment (RSRE) in the context of SRI International's Enhanced Hierarchical Design Methodology (EHDM) specification/verification system. The methodology utilizes a four-level specification hierarchy with the following levels: functional level, finite automata model, block model, and circuit level. The properties of a level are proved as theorems in the level below it. This methodology is applied to a 6-bit counter problem and is critically examined. The specifications are written in EHDM's specification language, Extended Special, and the proofs are carried out in EHDM, suggesting improvements to both the RSRE methodology and the EHDM system.
Fresques, Tara; Swartz, S. Zachary; Juliano, Celina; Morino, Yoshiaki; Kikuchi, Mani; Akasaka, Koji; Wada, Hiroshi; Yajima, Mamiko; Wessel, Gary M.
2016-01-01
Specification of the germ cell lineage is required for sexual reproduction in all animals. However, the timing and mechanisms of germ cell specification are remarkably diverse in animal development. Echinoderms, such as sea urchins and sea stars, are excellent model systems to study the molecular and cellular mechanisms that contribute to germ cell specification. In several echinoderm embryos tested, the germ cell factor Vasa accumulates broadly during early development and is restricted after gastrulation to cells that contribute to the germ cell lineage. In the sea urchin, however, the germ cell factor Vasa is restricted to a specific lineage by the 32-cell stage. We therefore hypothesized that the germ cell specification program in the sea urchin/Euechinoid lineage has evolved to an earlier developmental time point. To test this hypothesis we determined the expression pattern of a second germ cell factor, Nanos, in four out of five extant echinoderm clades. Here we find that Nanos mRNA does not accumulate until the blastula stage or later during the development of all other echinoderm embryos except those that belong to the Echinoid lineage. Instead, Nanos is expressed in a restricted domain at the 32–128 cell stage in Echinoid embryos. Our results support the model that the germ cell specification program underwent a heterochronic shift in the Echinoid lineage. A comparison of Echinoid and non-Echinoid germ cell specification mechanisms will contribute to our understanding of how these mechanisms have changed during animal evolution. PMID:27402572
NASA Technical Reports Server (NTRS)
Spirka, T. A.; Myers, J. G.; Setser, R. M.; Halliburton, S. S.; White, R. D.; Chatzimavroudis, G. P.
2005-01-01
A priority of NASA is to identify and study possible risks to astronauts' health during prolonged space missions [1]. The goal is to develop a procedure for a preflight evaluation of the cardiovascular system of an astronaut and to forecast how it will be affected during the mission. To predict these changes, a computational cardiovascular model must be constructed. Although physiology data can be used to make a general model, a more desirable subject-specific model requires anatomical, functional, and flow data from the specific astronaut. MRI has the unique advantage of providing images with all of the above information, including three-directional velocity data which can be used as boundary conditions in a computational fluid dynamics (CFD) program [2,3]. MRI-based CFD is very promising for reproduction of the flow patterns of a specific subject and prediction of changes in the absence of gravity. The aim of this study was to test the feasibility of this approach by reconstructing the geometry of MRI-scanned arterial models and reproducing the MRI-measured velocities using CFD simulations on these geometries.
NASA Technical Reports Server (NTRS)
Humphreys, B. T.; Thompson, W. K.; Lewandowski, B. E.; Cadwell, E. E.; Newby, N. J.; Fincke, R. S.; Sheehan, C.; Mulugeta, L.
2012-01-01
NASA's Digital Astronaut Project (DAP) implements well-vetted computational models to predict and assess spaceflight health and performance risks, and enhance countermeasure development. DAP provides expertise and computation tools to its research customers for model development, integration, or analysis. DAP is currently supporting the NASA Exercise Physiology and Countermeasures (ExPC) project by integrating their biomechanical models of specific exercise movements with dynamic models of the devices on which the exercises were performed. This presentation focuses on the development of a high-fidelity dynamic module of the Advanced Resistive Exercise Device (ARED) on board the ISS. The ARED module, illustrated in the figure below, was developed using the Adams (MSC, Santa Ana, California) simulation package. The Adams package provides the capabilities to perform multi-rigid-body, flexible-body, and mixed dynamic analyses of complex mechanisms. These capabilities were applied to accurately simulate: inertial and mass properties of the device, such as the vibration isolation system (VIS) effects and other ARED components; non-linear joint friction effects; the gas-law dynamics of the vacuum cylinders and VIS components, using custom-written differential state equations; and the ARED flywheel dynamics, including the torque-limiting clutch. Design data from the JSC ARED Engineering team was utilized in developing the model. This included solid-modeling geometry files, component/system specifications, engineering reports, and available data sets. The Adams ARED module is importable into LifeMOD (Life Modeler, Inc., San Clemente, CA) for biomechanical analyses of different resistive exercises such as the squat and dead-lift. Using motion capture data from ground test subjects, the ExPC developed biomechanical exercise models in LifeMOD. The Adams ARED device module was then integrated with the exercise subject model into one integrated dynamic model.
This presentation will describe the development of the Adams ARED module including its capabilities, limitations, and assumptions. Preliminary results, validation activities, and a practical application of the module to inform the relative effect of the flywheels on exercise will be discussed.
NASA Technical Reports Server (NTRS)
Valdivia, Roberto O.; Antle, John M.; Rosenzweig, Cynthia; Ruane, Alexander C.; Vervoort, Joost; Ashfaq, Muhammad; Hathie, Ibrahima; Tui, Sabine Homann-Kee; Mulwa, Richard; Nhemachena, Charles;
2015-01-01
The global change research community has recognized that new pathway and scenario concepts are needed to implement impact and vulnerability assessment where precise prediction is not possible, and also that these scenarios need to be logically consistent across local, regional, and global scales. For global climate models, representative concentration pathways (RCPs) have been developed that provide a range of time-series of atmospheric greenhouse-gas concentrations into the future. For impact and vulnerability assessment, new socio-economic pathway and scenario concepts have also been developed, with leadership from the Integrated Assessment Modeling Consortium (IAMC). This chapter presents concepts and methods for development of regional representative agricultural pathways (RAPs) and scenarios that can be used for agricultural model intercomparison, improvement, and impact assessment in a manner consistent with the new global pathways and scenarios. The development of agriculture-specific pathways and scenarios is motivated by the need for a protocol-based approach to climate impact, vulnerability, and adaptation assessment. Until now, the various global and regional models used for agricultural-impact assessment have been implemented with individualized scenarios using various data and model structures, often without transparent documentation, public availability, and consistency across disciplines. These practices have reduced the credibility of assessments, and also hampered the advancement of the science through model intercomparison, improvement, and synthesis of model results across studies. The recognition of the need for better coordination among the agricultural modeling community, including the development of standard reference scenarios with adequate agriculture-specific detail, led to the creation of the Agricultural Model Intercomparison and Improvement Project (AgMIP) in 2010.
The development of RAPs is one of the cross-cutting themes in AgMIP's work plan, and has been the subject of ongoing work by AgMIP since its creation.
Path Models of Vocal Emotion Communication
Bänziger, Tanja; Hosoya, Georg; Scherer, Klaus R.
2015-01-01
We propose to use a comprehensive path model of vocal emotion communication, encompassing encoding, transmission, and decoding processes, to empirically model data sets on emotion expression and recognition. The utility of the approach is demonstrated for two data sets from two different cultures and languages, based on corpora of vocal emotion enactment by professional actors and emotion inference by naïve listeners. Lens model equations, hierarchical regression, and multivariate path analysis are used to compare the relative contributions of objectively measured acoustic cues in the enacted expressions and subjective voice cues as perceived by listeners to the variance in emotion inference from vocal expressions for four emotion families (fear, anger, happiness, and sadness). While the results confirm the central role of arousal in vocal emotion communication, the utility of applying an extended path modeling framework is demonstrated by the identification of unique combinations of distal cues and proximal percepts carrying information about specific emotion families, independent of arousal. The statistical models generated show that more sophisticated acoustic parameters need to be developed to explain the distal underpinnings of subjective voice quality percepts that account for much of the variance in emotion inference, in particular voice instability and roughness. The general approach advocated here, as well as the specific results, open up new research strategies for work in psychology (specifically emotion and social perception research) and engineering and computer science (specifically research and development in the domain of affective computing, particularly on automatic emotion detection and synthetic emotion expression in avatars). PMID:26325076
To grow or not to grow: hair morphogenesis and human genetic hair disorders.
Duverger, Olivier; Morasso, Maria I
2014-01-01
Mouse models have greatly helped in elucidating the molecular mechanisms involved in hair formation and regeneration. Recent publications have reviewed the genes involved in mouse hair development based on the phenotype of transgenic, knockout and mutant animal models. While much of this information has been instrumental in determining molecular aspects of human hair development and cycling, mice exhibit a specific pattern of hair morphogenesis and hair distribution throughout the body that cannot be directly correlated to human hair. In this mini-review, we discuss specific aspects of human hair follicle development and present an up-to-date summary of human genetic disorders associated with abnormalities in hair follicle morphogenesis, structure or regeneration. Published by Elsevier Ltd.
Dyer, Michael A
2016-10-01
Retinoblastoma is a rare childhood cancer of the developing retina, and studies on this orphan disease have led to fundamental discoveries in cancer biology. Retinoblastoma has also emerged as a model for translational research for pediatric solid tumors, which is particularly important as personalized medicine expands in oncology. Research on retinoblastomas has been combined with the exploration of retinal development and retinal degeneration to advance a new model of cell type-specific disease susceptibility termed 'cellular pliancy'. The concept can even be extended to species-specific regeneration. This review discusses the remarkable path of retinoblastoma research and how it has shaped the most current efforts in basic, translational, and clinical research in oncology and beyond. Copyright © 2016 Elsevier Ltd. All rights reserved.
Chang, Esther; Hancock, Karen; Hickman, Louise; Glasson, Janet; Davidson, Patricia
2007-09-01
There is a lack of research investigating models of nursing care for older hospitalised patients that address the nursing needs of this group. The objective of this study is to evaluate the efficacy of models of care for acutely older patients tailored to two contexts: an aged care specific ward and a medical ward. This is a repeated measures design. Efficacy of the models was evaluated in terms of: patient and nurses' satisfaction with care provided; increased activities of daily living; reduced unplanned hospital readmissions; and medication knowledge. An aged care specific ward and a medical ward in two Sydney teaching hospitals. There were two groups of patients aged 65 years or older who were admitted to hospital for an acute illness: those admitted prior to model implementation (n=232) and those admitted during model implementation (n=116). Patients with moderate or severe dementia were excluded. The two groups of nurses were the pre-model group (n=90) who were working on the medical and aged care wards for the study prior to model implementation, and the post-model group (n=22), who were the nurses working on the wards during model implementation. Action research was used to develop the models of care in two wards: one for an aged care specific ward and another for a general medical ward where older patients were admitted. The models developed were based on empirical data gathered in an earlier phase of this study. The models were successful in both wards in terms of increasing satisfaction levels in patients and nurses (p<0.001), increasing functional independence as measured by activities of daily living (p<0.01), and increasing medication knowledge (p<0.001). Findings indicate that models of care developed by nurses using an evidence-based action research strategy can enhance both satisfaction and health outcomes in older patients.
Scherf, K. Suzanne; Behrmann, Marlene; Dahl, Ronald E.
2015-01-01
Adolescence is a time of dramatic physical, cognitive, emotional, and social changes as well as a time for the development of many social-emotional problems. These characteristics raise compelling questions about accompanying neural changes that are unique to this period of development. Here, we propose that studying adolescent-specific changes in face processing and its underlying neural circuitry provides an ideal model for addressing these questions. We also use this model to formulate new hypotheses. Specifically, pubertal hormones are likely to increase motivation to master new peer-oriented developmental tasks, which will in turn, instigate the emergence of new social/affective components of face processing. We also predict that pubertal hormones have a fundamental impact on the reorganization of neural circuitry supporting face processing and propose, in particular, that, the functional connectivity, or temporal synchrony, between regions of the face-processing network will change with the emergence of these new components of face processing in adolescence. Finally, we show how this approach will help reveal why adolescence may be a period of vulnerability in brain development and suggest how it could lead to prevention and intervention strategies that facilitate more adaptive functional interactions between regions within the broader social information processing network. PMID:22483070
Interactive Structure (EUCLID) For Static And Dynamic Representation Of Human Body
NASA Astrophysics Data System (ADS)
Renaud, Ch.; Steck, R.
1983-07-01
A specific software package (EUCLID) for static and dynamic representation of human models is described. The data processing system is connected with ERGODATA and used in interactive mode through intrinsic or specific functions. More or less complex 3-D representations of human body models are developed. Biostereometric and conventional anthropometric raw data from the data bank are processed for different applications in ergonomics.
Learning LM Specificity for Ganglion Cells
NASA Technical Reports Server (NTRS)
Ahumada, Albert J.
2015-01-01
Unsupervised learning models have been proposed (Ahumada and Mulligan, 1990; Wachtler, Doi, Lee and Sejnowski, 2007) that allow the cortex to develop, on the basis of visual experience, units with LM-specific color-opponent receptive fields like the blob cells reported by Hubel and Wiesel. These models used ganglion cells with LM-indiscriminate wiring as inputs to the learning mechanism, which was presumed to occur at the cortical level.
Physiologically relevant organs on chips.
Yum, Kyungsuk; Hong, Soon Gweon; Healy, Kevin E; Lee, Luke P
2014-01-01
Recent advances in integrating microengineering and tissue engineering have generated promising microengineered physiological models for experimental medicine and pharmaceutical research. Here we review the recent development of microengineered physiological systems, also known as "organs-on-chips", that reconstitute the physiologically critical features of specific human tissues and organs and their interactions. This technology uses microengineering approaches to construct organ-specific microenvironments, reconstituting tissue structures, tissue-tissue interactions and interfaces, and dynamic mechanical and biochemical stimuli found in specific organs, to direct cells to assemble into functional tissues. We first discuss microengineering approaches to reproduce the key elements of physiologically important, dynamic mechanical microenvironments, biochemical microenvironments, and microarchitectures of specific tissues and organs in microfluidic cell culture systems. This is followed by examples of microengineered individual organ models that incorporate the key elements of physiological microenvironments into single microfluidic cell culture systems to reproduce organ-level functions. Finally, microengineered multiple organ systems that simulate multiple organ interactions to better represent human physiology, including human responses to drugs, are also covered in this review. This emerging organs-on-chips technology has the potential to become an alternative to 2D and 3D cell culture and animal models for experimental medicine, human disease modeling, drug development, and toxicology. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ogada, Pamella Akoth; Moualeu, Dany Pascal; Poehling, Hans-Michael
2016-01-01
Several models of predictive epidemics of arthropod-vectored plant viruses have been studied in an attempt to bring understanding to the complex but specific relationships within the three-cornered pathosystem (virus, vector and host plant), as well as their interactions with the environment. A large body of studies mainly focuses on weather-based models as management tools for monitoring pests and diseases, with very few incorporating the contribution of the vector's life processes to the disease dynamics, which is an essential aspect when mitigating virus incidence in a crop stand. In this study, we hypothesized that the multiplication and spread of tomato spotted wilt virus (TSWV) in a crop stand is strongly related to its influences on Frankliniella occidentalis preferential behavior and life expectancy. Model dynamics of important aspects in disease development within TSWV-F. occidentalis-host plant interactions were developed, focusing on F. occidentalis' life processes as influenced by TSWV. The results show that the influence of TSWV on F. occidentalis preferential behaviour leads to an estimated increase in the relative acquisition rate of the virus, and up to a 33% increase in the transmission rate to healthy plants. Also, increased life expectancy, which relates to improved fitness, is dependent on the virus-induced preferential behaviour, consequently promoting multiplication and spread of the virus in a crop stand. The development of vector-based models could further help in elucidating the role of tri-trophic interactions in agricultural disease systems. Use of the model to examine the components of the disease process could also boost our understanding of how specific epidemiological characteristics interact to cause diseases in crops. With this level of understanding we can efficiently develop more precise control strategies for the virus and the vector. PMID:27159134
Pre-Launch Tasks Proposed in our Contract of December 1991
NASA Technical Reports Server (NTRS)
1998-01-01
We propose, during the pre-EOS phase to: (1) develop, with other MODIS Team Members, a means of discriminating different major biome types with NDVI and other AVHRR-based data; (2) develop a simple ecosystem process model for each of these biomes, BIOME-BGC; (3) relate the seasonal trend of weekly composite NDVI to vegetation phenology and temperature limits to develop a satellite defined growing season for vegetation; and (4) define physiologically based energy to mass conversion factors for carbon and water for each biome. Our final core at-launch product will be simplified, completely satellite driven biome specific models for net primary production. We will build these biome specific satellite driven algorithms using a family of simple ecosystem process models as calibration models, collectively called BIOME-BGC, and establish coordination with an existing network of ecological study sites in order to test and validate these products. Field datasets will then be available for both BIOME-BGC development and testing, use for algorithm developments of other MODIS Team Members, and ultimately be our first test point for MODIS land vegetation products upon launch. We will use field sites from the National Science Foundation Long-Term Ecological Research network, and develop Glacier National Park as a major site for intensive validation.
Pre-Launch Tasks Proposed in our Contract of December 1991
NASA Technical Reports Server (NTRS)
Running, Steven W.; Nemani, Ramakrishna R.; Glassy, Joseph
1997-01-01
We propose, during the pre-EOS phase to: (1) develop, with other MODIS Team Members, a means of discriminating different major biome types with NDVI and other AVHRR-based data; (2) develop a simple ecosystem process model for each of these biomes, BIOME-BGC; (3) relate the seasonal trend of weekly composite NDVI to vegetation phenology and temperature limits to develop a satellite defined growing season for vegetation; and (4) define physiologically based energy to mass conversion factors for carbon and water for each biome. Our final core at-launch product will be simplified, completely satellite driven biome specific models for net primary production. We will build these biome specific satellite driven algorithms using a family of simple ecosystem process models as calibration models, collectively called BIOME-BGC, and establish coordination with an existing network of ecological study sites in order to test and validate these products. Field datasets will then be available for both BIOME-BGC development and testing, use for algorithm developments of other MODIS Team Members, and ultimately be our first test point for MODIS land vegetation products upon launch. We will use field sites from the National Science Foundation Long-Term Ecological Research network, and develop Glacier National Park as a major site for intensive validation.
Challenges and opportunities for integrating lake ecosystem modelling approaches
Mooij, Wolf M.; Trolle, Dennis; Jeppesen, Erik; Arhonditsis, George; Belolipetsky, Pavel V.; Chitamwebwa, Deonatus B.R.; Degermendzhy, Andrey G.; DeAngelis, Donald L.; Domis, Lisette N. De Senerpont; Downing, Andrea S.; Elliott, J. Alex; Ruberto, Carlos Ruberto; Gaedke, Ursula; Genova, Svetlana N.; Gulati, Ramesh D.; Hakanson, Lars; Hamilton, David P.; Hipsey, Matthew R.; Hoen, Jochem 't; Hulsmann, Stephan; Los, F. Hans; Makler-Pick, Vardit; Petzoldt, Thomas; Prokopkin, Igor G.; Rinke, Karsten; Schep, Sebastiaan A.; Tominaga, Koji; Van Dam, Anne A.; Van Nes, Egbert H.; Wells, Scott A.; Janse, Jan H.
2010-01-01
A large number and wide variety of lake ecosystem models have been developed and published during the past four decades. We identify two challenges for making further progress in this field. One such challenge is to avoid developing more models largely following the concept of others ('reinventing the wheel'). The other challenge is to avoid focusing on only one type of model, while ignoring new and diverse approaches that have become available ('having tunnel vision'). In this paper, we aim at improving the awareness of existing models and knowledge of concurrent approaches in lake ecosystem modelling, without covering all possible model tools and avenues. First, we present a broad variety of modelling approaches. To illustrate these approaches, we give brief descriptions of rather arbitrarily selected sets of specific models. We deal with static models (steady state and regression models), complex dynamic models (CAEDYM, CE-QUAL-W2, Delft 3D-ECO, LakeMab, LakeWeb, MyLake, PCLake, PROTECH, SALMO), structurally dynamic models and minimal dynamic models. We also discuss a group of approaches that could all be classified as individual based: super-individual models (Piscator, Charisma), physiologically structured models, stage-structured models and trait-based models. We briefly mention genetic algorithms, neural networks, Kalman filters and fuzzy logic. Thereafter, we zoom in, as an in-depth example, on the multi-decadal development and application of the lake ecosystem model PCLake and related models (PCLake Metamodel, Lake Shira Model, IPH-TRIM3D-PCLake). In the discussion, we argue that while the historical development of each approach and model is understandable given its 'leading principle', there are many opportunities for combining approaches. We take the point of view that a single 'right' approach does not exist and should not be strived for. 
Instead, multiple modelling approaches, applied concurrently to a given problem, can help develop an integrative view on the functioning of lake ecosystems. We end with a set of specific recommendations that may be of help in the further development of lake ecosystem models.
A logical model of cooperating rule-based systems
NASA Technical Reports Server (NTRS)
Bailin, Sidney C.; Moore, John M.; Hilberg, Robert H.; Murphy, Elizabeth D.; Bahder, Shari A.
1989-01-01
A model is developed to assist in the planning, specification, development, and verification of space information systems involving distributed rule-based systems. The model is based on an analysis of possible uses of rule-based systems in control centers. This analysis is summarized as a data-flow model for a hypothetical intelligent control center. From this data-flow model, the logical model of cooperating rule-based systems is extracted. This model consists of four layers of increasing capability: (1) communicating agents, (2) belief-sharing knowledge sources, (3) goal-sharing interest areas, and (4) task-sharing job roles.
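The lowest two layers of such a model can be illustrated with a minimal sketch (not from the paper; the rules and fact names are hypothetical): each agent runs forward chaining over its own rules, and derived facts ("beliefs") are shared between agents.

```python
# Minimal sketch (not the paper's system) of two cooperating rule-based
# agents: each performs forward chaining over its own rules, and derived
# facts are shared, illustrating the "communicating agents" and
# "belief-sharing" layers. Rule and fact names are hypothetical.

def forward_chain(facts, rules):
    """Apply rules (premises -> conclusion) to a fact set until fixpoint."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical control-center rules for two agents.
telemetry_agent = [({"temp_high", "pressure_high"}, "alarm")]
planning_agent = [({"alarm"}, "abort_burn")]

beliefs = {"temp_high", "pressure_high"}
beliefs = forward_chain(beliefs, telemetry_agent)  # agent 1 derives "alarm"
beliefs = forward_chain(beliefs, planning_agent)   # shared belief triggers agent 2
print(sorted(beliefs))
```

The shared fact set plays the role of the belief-sharing knowledge sources; the higher goal-sharing and task-sharing layers would add coordination on top of this exchange.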
ERIC Educational Resources Information Center
Zigic, Sasha; Lemckert, Charles J.
2007-01-01
The following paper presents a computer-based learning strategy to assist in introducing and teaching water quality modelling to undergraduate civil engineering students. As part of the learning strategy, an interactive computer-based instructional (CBI) aid was specifically developed to assist students to set up, run and analyse the output from a…
ERIC Educational Resources Information Center
Koper, Rob; Manderveld, Jocelyn
2004-01-01
Nowadays there is a huge demand for flexible, independent learning without the constraints of time and place. Various trends in the field of education and training are the bases for the development of new technologies for education. This article describes the development of a learning technology specification, which supports these new demands for…
ERIC Educational Resources Information Center
Waight, Noemi; Liu, Xiufeng; Gregorius, Roberto Ma.
2015-01-01
This paper examined the nuances of the background process of design and development and follow up classroom implementation of computer-based models for high school chemistry. More specifically, the study examined the knowledge contributions of an interdisciplinary team of experts; points of tensions, negotiations and non-negotiable aspects of…
Development and Evaluation of Land-Use Regression Models Using Modeled Air Quality Concentrations
Abstract Land-use regression (LUR) models have emerged as a preferred methodology for estimating individual exposure to ambient air pollution in epidemiologic studies in absence of subject-specific measurements. Although there is a growing literature focused on LUR evaluation, fu...
A diagnostic model for chronic hypersensitivity pneumonitis
Johannson, Kerri A; Elicker, Brett M; Vittinghoff, Eric; Assayag, Deborah; de Boer, Kaïssa; Golden, Jeffrey A; Jones, Kirk D; King, Talmadge E; Koth, Laura L; Lee, Joyce S; Ley, Brett; Wolters, Paul J; Collard, Harold R
2017-01-01
The objective of this study was to develop a diagnostic model that allows for a highly specific diagnosis of chronic hypersensitivity pneumonitis using clinical and radiological variables alone. Chronic hypersensitivity pneumonitis and other interstitial lung disease cases were retrospectively identified from a longitudinal database. High-resolution CT scans were blindly scored for radiographic features (eg, ground-glass opacity, mosaic perfusion) as well as the radiologist’s diagnostic impression. Candidate models were developed then evaluated using clinical and radiographic variables and assessed by the cross-validated C-statistic. Forty-four chronic hypersensitivity pneumonitis and eighty other interstitial lung disease cases were identified. Two models were selected based on their statistical performance, clinical applicability and face validity. Key model variables included age, down feather and/or bird exposure, radiographic presence of ground-glass opacity and mosaic perfusion and moderate or high confidence in the radiographic impression of chronic hypersensitivity pneumonitis. Models were internally validated with good performance, and cut-off values were established that resulted in high specificity for a diagnosis of chronic hypersensitivity pneumonitis. PMID:27245779
Age-dependent Fourier model of the shape of the isolated ex vivo human crystalline lens.
Urs, Raksha; Ho, Arthur; Manns, Fabrice; Parel, Jean-Marie
2010-06-01
To develop an age-dependent mathematical model of the zero-order shape of the isolated ex vivo human crystalline lens, using a single mathematical function that can subsequently be used to facilitate the development of other models for specific purposes such as optical modeling and analytical and numerical modeling of the lens. Profiles of whole isolated human lenses (n=30), aged 20-69, were measured from shadow-photogrammetric images. The profiles were fit to a 10th-order Fourier series consisting of cosine functions in a polar coordinate system that included terms for tilt and decentration. The profiles were corrected using these terms and processed in two ways. In the first, each lens was fit to a 10th-order Fourier series to obtain thickness and diameter, while in the second, all lenses were simultaneously fit to a Fourier series equation that explicitly included linear terms for age to develop an age-dependent mathematical model for the whole lens shape. Thickness and diameter obtained from the Fourier series fits exhibited high correlation with manual measurements made from shadow-photogrammetric images. The root-mean-square error of the age-dependent fit was 205 μm. The age-dependent equations provide a reliable lens model for ages 20-60 years. The contour of the whole human crystalline lens can be modeled with a Fourier series. Shape obtained from the age-dependent model described in this paper can be used to facilitate the development of other models for specific purposes such as optical modeling and analytical and numerical modeling of the lens. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
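The core idea of the shape model can be sketched with synthetic data (the contour below is hypothetical, not the paper's measurements): the lens contour r(θ) in polar coordinates is written as a cosine series, and the coefficients are recovered by discrete projection onto the cosine basis.

```python
import math

# Sketch of the cosine-series lens shape idea with a synthetic contour
# (hypothetical numbers, not the study's data):
#   r(theta) = a0 + sum_k a_k * cos(k * theta)
N = 360
thetas = [2 * math.pi * i / N for i in range(N)]
# Hypothetical lens-like contour: mean radius 3.0 with a cos(2*theta)
# term (equatorial diameter larger than axial thickness).
r = [3.0 + 0.5 * math.cos(2 * t) for t in thetas]

order = 10
coeffs = [sum(r) / N]  # a0: mean radius
for k in range(1, order + 1):
    # Discrete projection onto cos(k*theta) over the evenly spaced grid.
    coeffs.append(2 / N * sum(ri * math.cos(k * t) for ri, t in zip(r, thetas)))

print([round(a, 6) for a in coeffs[:3]])
```

The recovered coefficients reproduce the generating series (a0 = 3.0, a2 = 0.5, other terms near zero); the paper's actual model additionally makes each coefficient a linear function of age.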
A parallel calibration utility for WRF-Hydro on high performance computers
NASA Astrophysics Data System (ADS)
Wang, J.; Wang, C.; Kotamarthi, V. R.
2017-12-01
A successful modeling of complex hydrological processes comprises establishing an integrated hydrological model that simulates the hydrological processes in each water regime, calibrating and validating the model's performance against observation data, and estimating the uncertainties from different sources, especially those associated with parameters. Such a model system requires large computing resources and often has to be run on High Performance Computers (HPC). The recently developed WRF-Hydro modeling system provides a significant advancement in the capability to simulate regional water cycles more completely. The WRF-Hydro model has a large range of parameters, such as those in the input table files (GENPARM.TBL, SOILPARM.TBL and CHANPARM.TBL) and several distributed scaling factors such as OVROUGHRTFAC. These parameters affect the behavior and outputs of the model and thus may need to be calibrated against observations in order to obtain a good modeling performance. Having a parameter calibration tool specifically for automated calibration and uncertainty estimation of the WRF-Hydro model can provide significant convenience for the modeling community. In this study, we developed a customized tool using the parallel version of the model-independent parameter estimation and uncertainty analysis tool PEST, enabling it to run on HPC systems with the PBS and SLURM workload managers and job schedulers. We also developed a series of PEST input file templates specifically for WRF-Hydro model calibration and uncertainty analysis. Here we will present a flood case study that occurred in April 2013 over the Midwest. The sensitivities and uncertainties are analyzed using the customized PEST tool we developed.
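The model-independent calibration loop that tools like PEST automate can be reduced to a toy sketch (this is not PEST, and run_model is a hypothetical stand-in for an actual WRF-Hydro run): propose parameter values, run the model, score the simulation against observations, and keep the best candidate.

```python
import math

# Toy stand-in (not PEST itself) for model-independent parameter
# estimation: propose parameter values, run the model, score against
# observations with RMSE, and keep the best candidate. The "model" is
# a hypothetical linear runoff response; a real workflow would replace
# run_model() with a WRF-Hydro execution.

def run_model(roughness_factor, rainfall):
    # Hypothetical stand-in for a hydrological model run.
    return [roughness_factor * p for p in rainfall]

def rmse(sim, obs):
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs))

rainfall = [0.0, 2.0, 5.0, 3.0]
observed = run_model(0.7, rainfall)           # synthetic "truth"

candidates = [0.1 * k for k in range(1, 11)]  # coarse grid over the factor
best = min(candidates, key=lambda f: rmse(run_model(f, rainfall), observed))
print(best)
```

PEST replaces the brute-force grid with gradient-based estimation and parallelizes the model runs, which is why HPC scheduler integration matters, but the objective-function structure is the same.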
Proposal for constructing an advanced software tool for planetary atmospheric modeling
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.
1990-01-01
Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.
A software quality model and metrics for risk assessment
NASA Technical Reports Server (NTRS)
Hyatt, L.; Rosenberg, L.
1996-01-01
A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.
.NET INTEROPERABILITY GUIDELINES
The CAPE-OPEN middleware standards were created to allow process modelling components (PMCs) developed by third parties to be used in any process modelling environment (PME) utilizing these standards. The CAPE-OPEN middleware specifications were based upon both Microsoft's Compo...
Developing Metrics in Systems Integration (ISS Program COTS Integration Model)
NASA Technical Reports Server (NTRS)
Lueders, Kathryn
2007-01-01
This viewgraph presentation reviews some of the complications in developing metrics for systems integration. Specifically it reviews a case study of how two programs within NASA try to develop and measure performance while meeting the encompassing organizational goals.
The PDS4 Information Model and its Role in Agile Science Data Curation
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Crichton, D.
2017-12-01
PDS4 is an information model-driven service architecture supporting the capture, management, distribution and integration of massive planetary science data captured in distributed data archives world-wide. The PDS4 Information Model (IM), the core element of the architecture, was developed using lessons learned from 20 years of archiving Planetary Science Data and best practices for information model development. The foundational principles were adopted from the Open Archival Information System (OAIS) Reference Model (ISO 14721), the Metadata Registry Specification (ISO/IEC 11179), and W3C XML (Extensible Markup Language) specifications. These provided, respectively, an object-oriented model for archive information systems, a comprehensive schema for data dictionaries and hierarchical governance, and rules for encoding documents electronically. The PDS4 Information Model is unique in that it drives the PDS4 infrastructure by providing the representation of concepts and their relationships, constraints, rules, and operations; a sharable, stable, and organized set of information requirements; and machine-parsable definitions that are suitable for configuring and generating code. This presentation will provide an overview of the PDS4 Information Model and how it is being leveraged to develop and evolve the PDS4 infrastructure and enable agile curation of over 30 years of science data collected by the international Planetary Science community.
ERIC Educational Resources Information Center
Tarhini, Ali; Elyas, Tariq; Akour, Mohammad Ali; Al-Salti, Zahran
2016-01-01
The main aim of this paper is to develop an amalgamated conceptual model of technology acceptance that explains how individual, social, cultural and organizational factors affect the students' acceptance and usage behaviour of the Web-based learning systems. More specifically, the proposed model extends the Technology Acceptance Model (TAM) to…
Impacts of forest fragmentation on species richness: a hierarchical approach to community modelling
Zipkin, Elise F.; DeWan, Amielle; Royle, J. Andrew
2009-01-01
1. Species richness is often used as a tool for prioritizing conservation action. One method for predicting richness and other summaries of community structure is to develop species-specific models of occurrence probability based on habitat or landscape characteristics. However, this approach can be challenging for rare or elusive species for which survey data are often sparse. 2. Recent developments have allowed for improved inference about community structure based on species-specific models of occurrence probability, integrated within a hierarchical modelling framework. This framework offers advantages to inference about species richness over typical approaches by accounting for both species-level effects and the aggregated effects of landscape composition on a community as a whole, thus leading to increased precision in estimates of species richness by improving occupancy estimates for all species, including those that were observed infrequently. 3. We developed a hierarchical model to assess the community response of breeding birds in the Hudson River Valley, New York, to habitat fragmentation and analysed the model using a Bayesian approach. 4. The model was designed to estimate species-specific occurrence and the effects of fragment area and edge (as measured through the perimeter and the perimeter/area ratio, P/A), while accounting for imperfect detection of species. 5. We used the fitted model to make predictions of species richness within forest fragments of variable morphology. The model revealed that species richness of the observed bird community was maximized in small forest fragments with a high P/A. However, the number of forest interior species, a subset of the community with high conservation value, was maximized in large fragments with low P/A. 6. Synthesis and applications. 
Our results demonstrate the importance of understanding the responses of both individual, and groups of species, to environmental heterogeneity while illustrating the utility of hierarchical models for inference about species richness for conservation. This framework can be used to investigate the impacts of land-use change and fragmentation on species or assemblage richness, and to further understand trade-offs in species-specific occupancy probabilities associated with landscape variability.
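The species-to-richness logic of such a hierarchical model can be illustrated in miniature (the coefficients below are hypothetical placeholders, not the fitted Hudson River Valley model, and detection error is omitted): each species' occupancy probability is a logit-linear function of fragment covariates, and expected richness is the sum of those probabilities.

```python
import math

# Illustrative sketch (hypothetical coefficients, not the fitted model):
# species-specific occupancy is modeled on the logit scale as a function
# of log fragment area, and expected richness is the sum of the
# species-level occupancy probabilities. Imperfect detection, which the
# actual hierarchical model accounts for, is omitted here.

def occupancy(alpha, beta, log_area):
    return 1 / (1 + math.exp(-(alpha + beta * log_area)))

# Hypothetical species-level intercepts/slopes: generalists respond
# weakly to area, forest-interior species respond strongly.
species = {"generalist_1": (0.5, 0.1),
           "generalist_2": (0.2, -0.2),
           "interior_1": (-2.0, 1.5),
           "interior_2": (-2.5, 1.8)}

def expected_richness(log_area):
    return sum(occupancy(a, b, log_area) for a, b in species.values())

small, large = expected_richness(0.0), expected_richness(3.0)
print(round(small, 2), round(large, 2))
```

Even this toy version shows the paper's trade-off: total richness and interior-species richness can respond differently to fragment size depending on the species-level slopes.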
Prototypic automated continuous recreational water quality monitoring of nine Chicago beaches
Shively, Dawn; Nevers, Meredith; Breitenbach, Cathy; Phanikumar, Mantha S.; Przybyla-Kelly, Kasia; Spoljaric, Ashley M.; Whitman, Richard L.
2015-01-01
Predictive empirical modeling is used in many locations worldwide as a rapid, alternative recreational water quality management tool to eliminate delayed notifications associated with traditional fecal indicator bacteria (FIB) culturing (referred to as the persistence model, PM) and to prevent errors in releasing swimming advisories. The goal of this study was to develop a fully automated water quality management system for multiple beaches using predictive empirical models (EM) and state-of-the-art technology. Many recent EMs rely on samples or data collected manually, which adds to analysis time and increases the burden to the beach manager. In this study, data from water quality buoys and weather stations were transmitted through cellular telemetry to a web hosting service. An executable program simultaneously retrieved and aggregated data for regression equations and calculated EM results each morning at 9:30 AM; results were transferred through RSS feed to a website, mapped to each beach, and received by the lifeguards to be posted at the beach. Models were initially developed for five beaches, but by the third year, 21 beaches were managed using refined and validated modeling systems. The adjusted R2 of the regressions relating Escherichia coli to hydrometeorological variables for the EMs were greater than those for the PMs, and ranged from 0.220 to 0.390 (2011) and 0.103 to 0.381 (2012). Validation results in 2013 revealed reduced predictive capabilities; however, three of the originally modeled beaches showed improvement in 2013 compared to 2012. The EMs generally showed higher accuracy and specificity than those of the PMs, and sensitivity was low for both approaches. In 2012 EM accuracy was 70–97%; specificity, 71–100%; and sensitivity, 0–64% and in 2013 accuracy was 68–97%; specificity, 73–100%; and sensitivity 0–36%. Factors that may have affected model capabilities include instrument malfunction, non-point source inputs, and sparse calibration data. 
The modeling system is the most extensive fully automated system for recreational water quality management developed to date. Key insights for refining and improving large-scale empirical models for beach management have emerged from this multi-year effort.
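The empirical-model core of such a system is a regression relating fecal indicator bacteria to hydrometeorological predictors. A minimal single-predictor sketch (synthetic numbers, not the study's data; the 235 CFU/100 mL threshold is used here only as an illustrative advisory cutoff):

```python
# Minimal sketch (synthetic numbers, not the study's data) of the
# empirical-model idea: fit a least-squares line relating log10 E. coli
# to a hydrometeorological predictor (here, turbidity), then compare the
# prediction with an advisory threshold for beach notifications.

turbidity = [1.0, 2.0, 3.0, 4.0, 5.0]   # hypothetical buoy readings
log_ecoli = [1.2, 1.5, 1.8, 2.1, 2.4]   # hypothetical log10 CFU/100 mL

n = len(turbidity)
mx = sum(turbidity) / n
my = sum(log_ecoli) / n
slope = sum((x - mx) * (y - my) for x, y in zip(turbidity, log_ecoli)) \
        / sum((x - mx) ** 2 for x in turbidity)
intercept = my - slope * mx

predicted = 10 ** (intercept + slope * 6.0)  # forecast for turbidity = 6
advisory = predicted > 235                   # illustrative threshold, CFU/100 mL
print(round(slope, 3), round(intercept, 3), advisory)
```

The operational system described above does this with multiple predictors, refits the regressions each season, and fully automates the data retrieval, prediction, and posting steps.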
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tome, Carlos N; Caro, J A; Lebensohn, R A
2010-01-01
Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model the nuclear fuel systems to develop predictive tools is critical. Not only are fabrication and performance models needed to understand specific aspects of the nuclear fuel, fully coupled fuel simulation codes are required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding and predictive capability for simulating the phase and microstructural behavior of the nuclear fuel system materials and matrices. In this paper we review the current status of the advanced modeling and simulation of nuclear reactor cladding, with emphasis on what is available and what is to be developed in each scale of the project, how we propose to pass information from one scale to the next, and what experimental information is required for benchmarking and advancing the modeling at each scale level.
TSAPA: identification of tissue-specific alternative polyadenylation sites in plants.
Ji, Guoli; Chen, Moliang; Ye, Wenbin; Zhu, Sheng; Ye, Congting; Su, Yaru; Peng, Haonan; Wu, Xiaohui
2018-06-15
Alternative polyadenylation (APA) is now emerging as a widespread mechanism modulated in a tissue-specific manner, which highlights the need to define tissue-specific poly(A) sites for profiling APA dynamics across tissues. We have developed an R package called TSAPA, based on a machine learning model, for identifying tissue-specific poly(A) sites in plants. A feature space including more than 200 features was assembled to specifically characterize poly(A) sites in plants. The classification model in TSAPA can be customized by selecting desirable features or classifiers. TSAPA is also capable of predicting tissue-specific poly(A) sites in unannotated intergenic regions. TSAPA will be a valuable addition to the community for studying the dynamics of APA in plants. https://github.com/BMILAB/TSAPA. Supplementary data are available at Bioinformatics online.
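The general classification idea can be sketched generically (this is not the TSAPA R package; the features and numbers below are hypothetical placeholders): represent each poly(A) site as a feature vector and assign a label with a simple classifier, here a nearest-centroid rule.

```python
# Generic sketch of feature-based site classification (not the actual
# TSAPA implementation): each poly(A) site is a feature vector, labeled
# "tissue-specific" or "constitutive" by a nearest-centroid rule.
# Features and training values are hypothetical placeholders.

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Toy training data: [usage variance across tissues, mean read support]
tissue_specific = [[0.9, 0.2], [0.8, 0.3], [0.85, 0.25]]
constitutive = [[0.1, 0.8], [0.2, 0.9], [0.15, 0.85]]

centroids = {"tissue-specific": centroid(tissue_specific),
             "constitutive": centroid(constitutive)}

def classify(site):
    return min(centroids, key=lambda label: dist2(site, centroids[label]))

print(classify([0.95, 0.1]))  # high cross-tissue usage variance
```

TSAPA's customizable classifiers over its 200+ plant-specific features follow this same train-then-predict pattern, just with richer features and models.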
ERIC Educational Resources Information Center
Human Engineering Inst., Cleveland, OH.
THIS MODULE OF A 25-MODULE COURSE IS DESIGNED TO DEVELOP AN UNDERSTANDING OF THE OPERATION AND MAINTENANCE OF SPECIFIC MODELS OF AUTOMATIC TRANSMISSIONS USED ON DIESEL POWERED VEHICLES. TOPICS ARE (1) GENERAL SPECIFICATION DATA, (2) OPTIONS FOR VARIOUS APPLICATIONS, (3) ROAD TEST INSTRUCTIONS, (4) IDENTIFICATION AND SPECIFICATION DATA, (5) ALLISON…
Model Checking - My 27-Year Quest to Overcome the State Explosion Problem
NASA Technical Reports Server (NTRS)
Clarke, Ed
2009-01-01
Model Checking is an automatic verification technique for state-transition systems that are finite-state or that have finite-state abstractions. In the early 1980s, in a series of joint papers with my graduate students E.A. Emerson and A.P. Sistla, we proposed that Model Checking could be used for verifying concurrent systems and gave algorithms for this purpose. At roughly the same time, Joseph Sifakis and his student J.P. Queille at the University of Grenoble independently developed a similar technique. Model Checking has been used successfully to reason about computer hardware and communication protocols and is beginning to be used for verifying computer software. Specifications are written in temporal logic, which is particularly valuable for expressing concurrency properties. An intelligent, exhaustive search is used to determine if the specification is true or not. If the specification is not true, the Model Checker will produce a counterexample execution trace that shows why the specification does not hold. This feature is extremely useful for finding obscure errors in complex systems. The main disadvantage of Model Checking is the state-explosion problem, which can occur if the system under verification has many processes or complex data structures. Although the state-explosion problem is inevitable in the worst case, over the past 27 years considerable progress has been made on the problem for certain classes of state-transition systems that occur often in practice. In this talk, I will describe what Model Checking is, how it works, and the main techniques that have been developed for combating the state explosion problem.
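The exhaustive search at the heart of explicit-state model checking can be sketched as a breadth-first reachability check that returns a counterexample trace when a bad state is found (a toy illustration of the idea, not any particular model checker; the deliberately flawed mutual-exclusion model is invented for the example):

```python
from collections import deque

def model_check(initial, transitions, is_bad):
    """Exhaustive BFS over a finite state space. Returns None if no bad
    state is reachable, else the shortest counterexample trace."""
    parent = {initial: None}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if is_bad(state):
            trace = []
            while state is not None:
                trace.append(state)
                state = parent[state]
            return trace[::-1]
        for nxt in transitions(state):
            if nxt not in parent:
                parent[nxt] = state
                queue.append(nxt)
    return None

# Toy model: two processes, each idle(0)/trying(1)/critical(2),
# with no lock guarding entry -- so mutual exclusion is violated.
def transitions(state):
    for i in (0, 1):
        s = list(state)
        if s[i] < 2:
            s[i] += 1            # advance toward the critical section
            yield tuple(s)
        else:
            s[i] = 0             # leave the critical section
            yield tuple(s)

both_critical = lambda s: s == (2, 2)   # the safety violation
trace = model_check((0, 0), transitions, both_critical)
```

Because both processes can advance independently, the checker finds a 4-step trace ending in (2, 2), exactly the kind of counterexample execution trace described above.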
Specimen-specific modeling of hip fracture pattern and repair.
Ali, Azhar A; Cristofolini, Luca; Schileo, Enrico; Hu, Haixiang; Taddei, Fulvia; Kim, Raymond H; Rullkoetter, Paul J; Laz, Peter J
2014-01-22
Hip fracture remains a major health problem for the elderly. Clinical studies have assessed fracture risk based on bone quality in the aging population and cadaveric testing has quantified bone strength and fracture loads. Prior modeling has primarily focused on quantifying the strain distribution in bone as an indicator of fracture risk. Recent advances in the extended finite element method (XFEM) enable prediction of the initiation and propagation of cracks without requiring a priori knowledge of the crack path. Accordingly, the objectives of this study were to predict femoral fracture in specimen-specific models using the XFEM approach, to perform one-to-one comparisons of predicted and in vitro fracture patterns, and to develop a framework to assess the mechanics and load transfer in the fractured femur when it is repaired with an osteosynthesis implant. Five specimen-specific femur models were developed from in vitro experiments under a simulated stance loading condition. Predicted fracture patterns closely matched the in vitro patterns; however, predictions of fracture load differed by approximately 50% due to sensitivity to local material properties. Specimen-specific intertrochanteric fractures were induced by subjecting the femur models to a sideways fall and repaired with a contemporary implant. Under a post-surgical stance loading, model-predicted load sharing between the implant and bone across the fracture surface varied from 59%:41% to 89%:11%, underscoring the importance of considering anatomic and fracture variability in the evaluation of implants. XFEM modeling shows potential as a macro-level analysis enabling fracture investigations of clinical cohorts, including at-risk groups, and the design of robust implants. © 2013 Published by Elsevier Ltd.
A Simulation and Modeling Framework for Space Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olivier, S S
This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.
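Among the listed components, orbital propagation is the easiest to illustrate. The sketch below is a minimal analytic two-body propagator for a circular orbit (a deliberately simplified stand-in for the framework's actual algorithms, which would handle eccentric orbits, perturbations, and full 3D state vectors):

```python
import math

MU_EARTH = 398600.4418  # km^3/s^2, Earth's standard gravitational parameter

def propagate_circular(r_km, theta0, dt_s):
    """Analytic two-body propagation of a circular orbit: advance the
    anomaly at the constant mean motion n = sqrt(mu / r^3)."""
    n = math.sqrt(MU_EARTH / r_km ** 3)          # rad/s
    theta = (theta0 + n * dt_s) % (2 * math.pi)
    return (r_km * math.cos(theta), r_km * math.sin(theta))

r = 6778.0                                        # ~400 km altitude orbit
period = 2 * math.pi * math.sqrt(r ** 3 / MU_EARTH)
x, y = propagate_circular(r, 0.0, period)         # one full revolution
```

After exactly one orbital period (about 92 minutes for this altitude) the object returns to its starting position, a cheap sanity check used when validating propagators.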
Zheng, Ming-Jie; Wang, Jue; Xu, Lu; Zha, Xiao-Ming; Zhao, Yi; Ling, Li-Jun; Wang, Shui
2015-02-01
During the past decades, many efforts have been made in mimicking the clinical progress of human cancer in mouse models. Previously, we developed a human breast tissue-derived (HB) mouse model. Theoretically, it may mimic the interactions between "species-specific" mammary microenvironment of human origin and human breast cancer cells. However, detailed evidences are absent. The present study (in vivo, cellular, and molecular experiments) was designed to explore the regulatory role of human mammary microenvironment in the progress of human breast cancer cells. Subcutaneous (SUB), mammary fat pad (MFP), and HB mouse models were developed for in vivo comparisons. Then, the orthotopic tumor masses from three different mouse models were collected for primary culture. Finally, the biology of primary cultured human breast cancer cells was compared by cellular and molecular experiments. Results of in vivo mouse models indicated that human breast cancer cells grew better in human mammary microenvironment. Cellular and molecular experiments confirmed that primary cultured human breast cancer cells from the HB mouse model showed a better proliferative and anti-apoptotic biology than those from the SUB and MFP mouse models. Meanwhile, primary cultured human breast cancer cells from the HB mouse model also obtained the migratory and invasive biology for "species-specific" tissue metastasis to human tissues. Comprehensive analyses suggest that "species-specific" mammary microenvironment of human origin better regulates the biology of human breast cancer cells in our humanized mouse model of breast cancer, which is more consistent with the clinical progress of human breast cancer.
The development of a classification system for maternity models of care.
Donnolley, Natasha; Butler-Henderson, Kerryn; Chapman, Michael; Sullivan, Elizabeth
2016-08-01
A lack of standard terminology or means to identify and define models of maternity care in Australia has prevented accurate evaluations of outcomes for mothers and babies in different models of maternity care. As part of the Commonwealth-funded National Maternity Data Development Project, a classification system was developed utilising a data set specification that defines characteristics of models of maternity care. The Maternity Care Classification System or MaCCS was developed using a participatory action research design that built upon the published and grey literature. The study identified the characteristics that differentiate models of care and classifies models into eleven different Major Model Categories. The MaCCS will enable individual health services, local health districts (networks), jurisdictional and national health authorities to make better informed decisions for planning, policy development and delivery of maternity services in Australia. © The Author(s) 2016.
Estimating Radiation Dose Metrics for Patients Undergoing Tube Current Modulation CT Scans
NASA Astrophysics Data System (ADS)
McMillan, Kyle Lorin
Computed tomography (CT) has long been a powerful tool in the diagnosis of disease, identification of tumors and guidance of interventional procedures. With CT examinations comes the concern of radiation exposure and the associated risks. In order to properly understand those risks on a patient-specific level, organ dose must be quantified for each CT scan. Some of the most widely used organ dose estimates are derived from fixed tube current (FTC) scans of a standard sized idealized patient model. However, in current clinical practice, patient size varies from neonates weighing just a few kg to morbidly obese patients weighing over 200 kg, and nearly all CT exams are performed with tube current modulation (TCM), a scanning technique that adjusts scanner output according to changes in patient attenuation. Methods to account for TCM in CT organ dose estimates have been previously demonstrated, but these methods are limited in scope and/or restricted to idealized TCM profiles that are not based on physical observations and are not scanner-specific (e.g., they do not account for tube limits, scanner-specific effects, etc.). The goal of this work was to develop methods to estimate organ doses to patients undergoing CT scans that take into account both the patient size as well as the effects of TCM. This work started with the development and validation of methods to estimate scanner-specific TCM schemes for any voxelized patient model. An approach was developed to generate estimated TCM schemes that match actual TCM schemes that would have been acquired on the scanner for any patient model. Using this approach, TCM schemes were then generated for a variety of body CT protocols for a set of reference voxelized phantoms for which TCM information does not currently exist. These are whole body patient models representing a variety of sizes, ages and genders that have all radiosensitive organs identified.
TCM schemes for these models facilitated Monte Carlo-based estimates of fully-, partially- and indirectly-irradiated organ dose from TCM CT exams. By accounting for the effects of patient size in the organ dose estimates, a comprehensive set of patient-specific dose estimates from TCM CT exams was developed. These patient-specific organ dose estimates from TCM CT exams will provide a more complete understanding of the dose impact and risks associated with modern body CT scanning protocols.
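The core idea of estimating a TCM scheme, scanner output scaled to local patient attenuation and clipped to tube-current limits, can be sketched as follows (the exponential scaling constant, reference attenuation, and mA limits are illustrative assumptions, not the dissertation's validated scanner model):

```python
import math

def estimate_tcm(attenuation_by_slice, base_mA=200.0, ref_atten=20.0,
                 min_mA=80.0, max_mA=500.0):
    """Hypothetical z-axis tube current modulation: scale the tube current
    exponentially with local water-equivalent attenuation (a common TCM
    heuristic), then clip to the scanner's tube-current limits."""
    scheme = []
    for a in attenuation_by_slice:
        mA = base_mA * math.exp(0.05 * (a - ref_atten))
        scheme.append(min(max_mA, max(min_mA, mA)))
    return scheme

# Attenuation rises at the shoulders, dips at the lungs, rises at the abdomen.
profile = [15, 25, 40, 18, 12, 22, 30]
scheme = estimate_tcm(profile)
```

The clipping step is what distinguishes scanner-aware schemes from idealized TCM profiles: the high-attenuation shoulder slice saturates at the tube limit instead of scaling without bound.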
Wang, Zhi-Bo; Zhang, Xiaoqing; Li, Xue-Jun
2013-01-01
Establishing human cell models of spinal muscular atrophy (SMA) to mimic motor neuron-specific phenotypes holds the key to understanding the pathogenesis of this devastating disease. Here, we developed a closely representative cell model of SMA by knocking down the disease-determining gene, survival motor neuron (SMN), in human embryonic stem cells (hESCs). Our study with this cell model demonstrated that knocking down of SMN does not interfere with neural induction or the initial specification of spinal motor neurons. Notably, the axonal outgrowth of spinal motor neurons was significantly impaired and these disease-mimicking neurons subsequently degenerated. Furthermore, these disease phenotypes were caused by SMN-full length (SMN-FL) but not SMN-Δ7 (lacking exon 7) knockdown, and were specific to spinal motor neurons. Restoring the expression of SMN-FL completely ameliorated all of the disease phenotypes, including specific axonal defects and motor neuron loss. Finally, knockdown of SMN-FL led to excessive mitochondrial oxidative stress in human motor neuron progenitors. The involvement of oxidative stress in the degeneration of spinal motor neurons in the SMA cell model was further confirmed by the administration of N-acetylcysteine, a potent antioxidant, which prevented disease-related apoptosis and subsequent motor neuron death. Thus, we report here the successful establishment of an hESC-based SMA model, which exhibits disease gene isoform specificity, cell type specificity, and phenotype reversibility. Our model provides a unique paradigm for studying how motor neurons specifically degenerate and highlights the potential importance of antioxidants for the treatment of SMA. PMID:23208423
NASA Astrophysics Data System (ADS)
Butler, Samuel D.; Marciniak, Michael A.
2014-09-01
Since the development of the Torrance-Sparrow bidirectional reflectance distribution function (BRDF) model in 1967, several BRDF models have been created. Previous attempts to categorize BRDF models have relied upon somewhat vague descriptors, such as empirical, semi-empirical, and experimental. Our approach is to instead categorize BRDF models based on functional form: microfacet normal distribution, geometric attenuation, directional-volumetric and Fresnel terms, and cross section conversion factor. Several popular microfacet models are compared to a standardized notation for a microfacet BRDF model. A library of microfacet model components is developed, allowing for creation of unique microfacet models driven by experimentally measured BRDFs.
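The component-library idea can be illustrated by assembling a microfacet BRDF from interchangeable distribution (D), geometric attenuation (G), and Fresnel (F) terms. The sketch below uses the common GGX distribution, Schlick Fresnel approximation, and a Smith-style masking term as stand-ins for the paper's catalogue of components:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Interchangeable term library (common choices, not the paper's catalogue).
def ggx_distribution(n_dot_h, alpha):
    a2 = alpha * alpha
    d = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * d * d)

def schlick_fresnel(cos_theta, f0):
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def smith_g1(n_dot_v, alpha):
    k = alpha / 2.0
    return n_dot_v / (n_dot_v * (1.0 - k) + k)

def microfacet_brdf(wi, wo, normal, alpha=0.3, f0=0.04):
    """Standard microfacet composition: D * F * G / (4 (n.wi)(n.wo))."""
    h = normalize(tuple(a + b for a, b in zip(wi, wo)))  # half vector
    n_wi, n_wo = dot(normal, wi), dot(normal, wo)
    D = ggx_distribution(dot(normal, h), alpha)
    F = schlick_fresnel(dot(h, wo), f0)
    G = smith_g1(n_wi, alpha) * smith_g1(n_wo, alpha)
    return D * F * G / (4.0 * n_wi * n_wo)

n = (0.0, 0.0, 1.0)
wi = normalize((0.0, 0.5, 1.0))   # incident direction
wo = normalize((0.0, -0.5, 1.0))  # mirror (specular) direction
val = microfacet_brdf(wi, wo, n)
```

Swapping in a different D, G, or F function changes the model without touching the composition, which is the appeal of a standardized notation for microfacet components.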
Kriegel, Johannes; Reckwitz, Luise; Auinger, Klemens; Tuttle-Weidinger, Linda; Schmitt-Rüth, Stephanie; Kränzl-Nagl, Renate
2017-01-01
The development of eHealth and AAL (Ambient Assisted Living) services with the aim of reducing the complexity of living environments for the elderly often does not lead to the desired results on the market. The design of an eHealth/AAL-specific framework for continuous New Service Development is presented in this paper. Our research addresses this challenge with a new Service Excellence Model (SEM) and outlines the benefits of this specific approach. The research is based on data from the DALIA project (Assistant for DAily LIfe Activities at Home) and the PenAAL project (Performance Measurement Index for AAL solutions), parts of which were the classification of relevant business dimensions and the development of a related scoring tool for continuous benchmarking and improvement.
Davison, Kirsten K; Blake, Christine E; Blaine, Rachel E; Younginer, Nicholas A; Orloski, Alexandria; Hamtil, Heather A; Ganter, Claudia; Bruton, Yasmeen P; Vaughn, Amber E; Fisher, Jennifer O
2015-09-17
Snacking contributes to excessive energy intakes in children. Yet factors shaping child snacking are virtually unstudied. This study examines food parenting practices specific to child snacking among low-income caregivers. Semi-structured interviews were conducted in English or Spanish with 60 low-income caregivers of preschool-aged children (18 non-Hispanic white, 22 African American/Black, 20 Hispanic; 92% mothers). A structured interview guide was used to solicit caregivers' definitions of snacking and strategies they use to decide what, when and how much snack their child eats. Interviews were audio-recorded, transcribed verbatim and analyzed using an iterative theory-based and grounded approach. A conceptual model of food parenting specific to child snacking was developed to summarize the findings and inform future research. Caregivers' descriptions of food parenting practices specific to child snacking were consistent with previous models of food parenting developed based on expert opinion [1, 2]. A few noteworthy differences however emerged. More than half of participants mentioned permissive feeding approaches (e.g., my child is the boss when it comes to snacks). As a result, permissive feeding was included as a higher order feeding dimension in the resulting model. In addition, a number of novel feeding approaches specific to child snacking emerged including child-centered provision of snacks (i.e., responding to a child's hunger cues when making decisions about snacks), parent unilateral decision making (i.e., making decisions about a child's snacks without any input from the child), and excessive monitoring of snacks (i.e., monitoring all snacks provided to and consumed by the child). The resulting conceptual model includes four higher order feeding dimensions including autonomy support, coercive control, structure and permissiveness and 20 sub-dimensions. 
This study formulates a language around food parenting practices specific to child snacking, identifies dominant constructs, and proposes a conceptual framework to guide future research.
Carter, Jane V.; Roberts, Henry L.; Pan, Jianmin; Rice, Jonathan D.; Burton, James F.; Galbraith, Norman J.; Eichenberger, M. Robert; Jorden, Jeffery; Deveaux, Peter; Farmer, Russell; Williford, Anna; Kanaan, Ziad; Rai, Shesh N.; Galandiuk, Susan
2016-01-01
OBJECTIVE(S): Develop a plasma-based microRNA (miRNA) diagnostic assay specific for colorectal neoplasms, building upon our prior work. BACKGROUND: Colorectal neoplasms (colorectal cancer [CRC] and colorectal advanced adenoma [CAA]) frequently develop in individuals at ages when other common cancers also occur. Current screening methods lack sensitivity, specificity, and have poor patient compliance. METHODS: Plasma was screened for 380 miRNAs using microfluidic array technology from a “Training” cohort of 60 patients, (10 each) control, CRC, CAA, breast (BC), pancreatic (PC) and lung (LC) cancer. We identified uniquely dysregulated miRNAs specific for colorectal neoplasia (p<0.05, false discovery rate: 5%, adjusted α=0.0038). These miRNAs were evaluated using single assays in a “Test” cohort of 120 patients. A mathematical model was developed to predict blinded sample identity in a 150 patient “Validation” cohort using repeat-sub-sampling validation of the testing dataset with 1000 iterations each to assess model detection accuracy. RESULTS: Seven miRNAs (miR-21, miR-29c, miR-122, miR-192, miR-346, miR-372, miR-374a) were selected based upon p-value, area-under-the-curve (AUC), fold-change, and biological plausibility. AUC (±95% CI) for “Test” cohort comparisons were 0.91 (0.85-0.96), 0.79 (0.70-0.88) and 0.98 (0.96-1.0), respectively. Our mathematical model predicted blinded sample identity with 69-77% accuracy between all neoplasia and controls, 67-76% accuracy between colorectal neoplasia and other cancers, and 86-90% accuracy between colorectal cancer and colorectal adenoma. CONCLUSIONS: Our plasma miRNA assay and prediction model differentiates colorectal neoplasia from patients with other neoplasms and from controls with higher sensitivity and specificity compared to current clinical standards. PMID:27471839
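The repeat-sub-sampling validation scheme can be sketched generically: shuffle, split, fit, score on the held-out part, and average over many iterations (the trivial threshold "model" and synthetic expression values below are placeholders for the study's actual miRNA prediction model):

```python
import random
from statistics import mean

def repeated_subsampling_accuracy(samples, labels, train_frac=0.8,
                                  iterations=1000, seed=42):
    """Repeat: randomly split, fit a trivial threshold classifier on the
    training part, score on the held-out part; report mean accuracy."""
    rng = random.Random(seed)
    idx = list(range(len(samples)))
    scores = []
    for _ in range(iterations):
        rng.shuffle(idx)
        cut = int(train_frac * len(idx))
        train, test = idx[:cut], idx[cut:]
        # "Model": classify as positive when above the training-set mean.
        threshold = mean(samples[i] for i in train)
        correct = [int((samples[i] > threshold) == labels[i]) for i in test]
        scores.append(mean(correct))
    return mean(scores)

# Synthetic "miRNA expression": cases shifted upward relative to controls.
rng = random.Random(0)
controls = [(rng.gauss(1.0, 0.5), False) for _ in range(50)]
cases = [(rng.gauss(3.0, 0.5), True) for _ in range(50)]
values, truth = zip(*(controls + cases))
acc = repeated_subsampling_accuracy(list(values), list(truth), iterations=200)
```

Averaging over many random splits, rather than scoring one fixed split, is what gives the reported accuracy ranges (e.g., 69-77%) instead of a single point estimate.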
Conceptual model of iCAL4LA: Proposing the components using comparative analysis
NASA Astrophysics Data System (ADS)
Ahmad, Siti Zulaiha; Mutalib, Ariffin Abdul
2016-08-01
This paper discusses an on-going study that initiates an initial process in determining the common components for a conceptual model of interactive computer-assisted learning that is specifically designed for low achieving children. This group of children needs a specific learning support that can be used as an alternative learning material in their learning environment. In order to develop the conceptual model, this study extracts the common components from 15 strongly justified computer-assisted learning studies. A comparative analysis has been conducted to determine the most appropriate components by using a specific indication classification to prioritize applicability. The results of the extraction process reveal 17 common components for consideration. Later, based on scientific justifications, 16 of them were selected as the proposed components for the model.
Fan, Yu; Xi, Liu; Hughes, Daniel S T; Zhang, Jianjun; Zhang, Jianhua; Futreal, P Andrew; Wheeler, David A; Wang, Wenyi
2016-08-24
Subclonal mutations reveal important features of the genetic architecture of tumors. However, accurate detection of mutations in genetically heterogeneous tumor cell populations using next-generation sequencing remains challenging. We develop MuSE (http://bioinformatics.mdanderson.org/main/MuSE), Mutation calling using a Markov Substitution model for Evolution, a novel approach for modeling the evolution of the allelic composition of the tumor and normal tissue at each reference base. MuSE adopts a sample-specific error model that reflects the underlying tumor heterogeneity to greatly improve the overall accuracy. We demonstrate the accuracy of MuSE in calling subclonal mutations in the context of large-scale tumor sequencing projects using whole exome and whole genome sequencing.
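MuSE's actual approach models base substitution as a Markov process; as a much simpler hedged sketch of the sample-specific error-model idea, one can estimate an error rate from the matched normal and ask whether the tumor's alternate-allele count is improbable under that error rate alone (illustrative only, not MuSE's algorithm):

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i)
               for i in range(k, n + 1))

def call_somatic(tumor_alt, tumor_depth, normal_alt, normal_depth,
                 alpha=1e-3):
    """Toy caller: estimate a sample-specific error rate from the matched
    normal (pseudocounts keep it nonzero), then test whether the tumor alt
    count is improbable under sequencing error alone."""
    error_rate = (normal_alt + 1) / (normal_depth + 2)
    p_value = binom_sf(tumor_alt, tumor_depth, error_rate)
    return p_value < alpha, p_value

# A 10% subclonal variant at 100x depth vs a clean matched normal.
is_somatic, p = call_somatic(tumor_alt=10, tumor_depth=100,
                             normal_alt=0, normal_depth=100)
```

Fitting the error rate per sample, rather than using a fixed global rate, is the ingredient that lets such tests adapt to noisier or cleaner sequencing runs.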
Multiple-Input Subject-Specific Modeling of Plasma Glucose Concentration for Feedforward Control.
Kotz, Kaylee; Cinar, Ali; Mei, Yong; Roggendorf, Amy; Littlejohn, Elizabeth; Quinn, Laurie; Rollins, Derrick K
2014-11-26
The ability to accurately develop subject-specific input-causation models of blood glucose concentration (BGC) for large input sets can have a significant impact on tightening control for insulin dependent diabetes. More specifically, for Type 1 diabetics (T1Ds), it can lead to an effective artificial pancreas (i.e., an automatic control system that delivers exogenous insulin) under extreme changes in critical disturbances. These disturbances include food consumption, activity variations, and physiological stress changes. Thus, this paper presents a free-living, outpatient, multiple-input modeling method for BGC with strong causation attributes that is stable and guards against overfitting to provide an effective modeling approach for feedforward control (FFC). This approach is a Wiener block-oriented methodology, which has unique attributes for meeting critical requirements for effective, long-term FFC.
Sun, Yanqing; Qi, Li; Yang, Guangren; Gilbert, Peter B
2018-05-01
This article develops hypothesis testing procedures for the stratified mark-specific proportional hazards model with missing covariates where the baseline functions may vary with strata. The mark-specific proportional hazards model has been studied to evaluate mark-specific relative risks where the mark is the genetic distance of an infecting HIV sequence to an HIV sequence represented inside the vaccine. This research is motivated by analyzing the RV144 phase 3 HIV vaccine efficacy trial, to understand associations of immune response biomarkers on the mark-specific hazard of HIV infection, where the biomarkers are sampled via a two-phase sampling nested case-control design. We test whether the mark-specific relative risks are unity and how they change with the mark. The developed procedures enable assessment of whether risk of HIV infection with HIV variants close or far from the vaccine sequence are modified by immune responses induced by the HIV vaccine; this question is interesting because vaccine protection occurs through immune responses directed at specific HIV sequences. The test statistics are constructed based on augmented inverse probability weighted complete-case estimators. The asymptotic properties and finite-sample performances of the testing procedures are investigated, demonstrating double-robustness and effectiveness of the predictive auxiliaries to recover efficiency. The finite-sample performance of the proposed tests are examined through a comprehensive simulation study. The methods are applied to the RV144 trial. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
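The inverse-probability-weighting ingredient of the estimators can be illustrated in the simplest setting: estimating a mean when observations are sampled with known, unequal probabilities (a toy Horvitz-Thompson-style sketch; the two-phase case-control design and mark-specific hazard model in the paper are far richer):

```python
import random

def ipw_mean(values, observed, prob_observed):
    """Inverse probability weighting: reweight the complete cases by
    1/P(observed) to correct the bias from unequal sampling."""
    num = sum(v / p for v, o, p in zip(values, observed, prob_observed) if o)
    den = sum(1 / p for o, p in zip(observed, prob_observed) if o)
    return num / den

rng = random.Random(1)
# Biomarker is higher in one stratum; that stratum is oversampled,
# as in a nested case-control design with known sampling fractions.
strata = [rng.random() < 0.5 for _ in range(20000)]
values = [rng.gauss(2.0 if s else 1.0, 0.2) for s in strata]
probs = [0.9 if s else 0.3 for s in strata]      # known design probabilities
observed = [rng.random() < p for p in probs]

naive = sum(v for v, o in zip(values, observed) if o) / sum(observed)
weighted = ipw_mean(values, observed, probs)
```

The naive complete-case mean is pulled toward the oversampled stratum, while the weighted estimate recovers the population value; augmented IPW estimators add a regression term on top of this to gain the double-robustness mentioned in the abstract.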
ERIC Educational Resources Information Center
Hitt, William D.; Agostino, Norman R.
This study to develop an education and training (E&T) system for inmates in Federal correctional institutions described and evaluated existing E&T systems and needs at Milan, Michigan, and Terre Haute, Indiana; formulated an E&T model; and made specific recommendations for implementation of each point in the model. A systems analysis approach was…
Recombination Rates of Electrons with Interstellar PAH Molecules
NASA Technical Reports Server (NTRS)
Ballester, Jorge (Cartographer)
1996-01-01
The goal of this project is to develop a general model for the recombination of electrons with PAH molecules in an interstellar environment. The model is being developed such that it can be applied to a small number of families of PAHs without reference to specific molecular structures. Special attention will be focused on modeling the approximately circular compact PAHs in a way that only depends on the number of carbon atoms.
Model-centric approaches for the development of health information systems.
Tuomainen, Mika; Mykkänen, Juha; Luostarinen, Heli; Pöyhölä, Assi; Paakkanen, Esa
2007-01-01
Modeling is used increasingly in healthcare to increase shared knowledge, to improve the processes, and to document the requirements of the solutions related to health information systems (HIS). There are numerous modeling approaches which aim to support these aims, but a careful assessment of their strengths, weaknesses and deficiencies is needed. In this paper, we compare three model-centric approaches in the context of HIS development: the Model-Driven Architecture, Business Process Modeling with BPMN and BPEL and the HL7 Development Framework. The comparison reveals that all these approaches are viable candidates for the development of HIS. However, they have distinct strengths and abstraction levels, they require local and project-specific adaptation and offer varying levels of automation. In addition, illustration of the solutions to the end users must be improved.
Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling
NASA Astrophysics Data System (ADS)
Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.
2017-12-01
Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.
Towards Formal Implementation of PUS Standard
NASA Astrophysics Data System (ADS)
Ilić, D.
2009-05-01
As an effort to promote the reuse of on-board and ground systems, ESA developed a standard for packet telemetry and telecommand, PUS. It defines a set of standard service models with the corresponding structures of the associated telemetry and telecommand packets. Various missions can then choose to implement those standard PUS services that best conform to their specific requirements. In this paper we propose a formal development (based on the Event-B method) of reusable service patterns, which can be instantiated for concrete applications. Our formal models allow us to formally express and verify specific service properties, including various telecommand and telemetry packet structure validation.
Rorschach assessment of cognitive impairment from an object relations perspective.
Lerner, P M
1996-01-01
In 1986, H. Lerner and P. Lerner proposed an object relations model of thinking that integrated Piaget's theory of early cognitive development with Mahler's theory of separation-individuation. They identified three distinct, interdigitated stages, outlined the cognitive task for each stage, detailed the necessary role and function of the stage-specific caregiving object, and suggested potential cognitive impairments associated with the object not fulfilling its function. Herein, this conceptual model is extended to the Rorschach. Rorschach indices of cognitive impairments associated with each stage were developed. The indices are then applied to the Rorschach records of children who were selected as prototypical of specific developmental disorders.
NASA Technical Reports Server (NTRS)
Mcknight, R. L.
1985-01-01
A series of interdisciplinary modeling and analysis techniques, specialized to address three specific hot-section components, is presented. These techniques will incorporate data as well as theoretical methods from many diverse areas, including cycle and performance analysis, heat transfer analysis, linear and nonlinear stress analysis, and mission analysis. Building on the proven techniques already available in these fields, the new methods developed will be integrated into computer codes to provide an accurate and unified approach to analyzing combustor burner liners, hollow air-cooled turbine blades, and air-cooled turbine vanes. For these components, the methods developed will predict temperature, deformation, stress and strain histories throughout a complete flight mission.
Animal models for studying neural crest development: is the mouse different?
Barriga, Elias H; Trainor, Paul A; Bronner, Marianne; Mayor, Roberto
2015-05-01
The neural crest is a uniquely vertebrate cell type and has been well studied in a number of model systems. Zebrafish, Xenopus and chick embryos largely show consistent requirements for specific genes in early steps of neural crest development. By contrast, knockouts of homologous genes in the mouse often do not exhibit comparable early neural crest phenotypes. In this Spotlight article, we discuss these species-specific differences, suggest possible explanations for the divergent phenotypes in mouse and urge the community to consider these issues and the need for further research in complementary systems. © 2015. Published by The Company of Biologists Ltd.
NASA Technical Reports Server (NTRS)
Dalling, D. K.; Bailey, B. K.; Pugmire, R. J.
1984-01-01
A proton and carbon-13 nuclear magnetic resonance (NMR) study was conducted of Ashland shale oil refinery products, experimental referee broadened-specification jet fuels, and of related isoprenoid model compounds. Supercritical fluid chromatography techniques using carbon dioxide were developed on a preparative scale, so that samples could be quantitatively separated into saturates and aromatic fractions for study by NMR. An optimized average parameter treatment was developed, and the NMR results were analyzed in terms of the resulting average parameters; formulation of model mixtures was demonstrated. Application of novel spectroscopic techniques to fuel samples was investigated.