Approaches for the Application of Physiologically Based ...
EPA released the final report, Approaches for the Application of Physiologically Based Pharmacokinetic (PBPK) Models and Supporting Data in Risk Assessment, as announced in a September 22, 2006 Federal Register Notice. This final report addresses the application and evaluation of PBPK models for risk assessment purposes. These models represent an important class of dosimetry models that are useful for predicting internal dose at target organs for risk assessment applications.
Agent-based modeling: Methods and techniques for simulating human systems
Bonabeau, Eric
2002-01-01
Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed by using real-world applications: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407
An improved task-role-based access control model for G-CSCW applications
NASA Astrophysics Data System (ADS)
He, Chaoying; Chen, Jun; Jiang, Jie; Han, Gang
2005-10-01
Access control is an important and popular security mechanism for multi-user applications. GIS-based Computer Supported Cooperative Work (G-CSCW) is one such application. This paper presents an improved Task-Role-Based Access Control (X-TRBAC) model for G-CSCW applications. The new model inherits the basic concepts of earlier models, such as role and task. Moreover, it introduces two concepts, object hierarchy and operation hierarchy, together with corresponding rules, to improve the efficiency of permission definition in access control models. The experiments show that the method simplifies the definition of permissions and is more applicable to G-CSCW applications.
ERIC Educational Resources Information Center
Gu, X.; Blackmore, K. L.
2015-01-01
This paper presents the results of a systematic review of agent-based modelling and simulation (ABMS) applications in the higher education (HE) domain. Agent-based modelling is a "bottom-up" modelling paradigm in which system-level behaviour (macro) is modelled through the behaviour of individual local-level agent interactions (micro).…
Satoshi Hirabayashi; Chuck Kroll; David Nowak
2011-01-01
The Urban Forest Effects-Deposition model (UFORE-D) was developed with a component-based modeling approach. Functions of the model were separated into components that are responsible for user interface, data input/output, and core model functions. Taking advantage of the component-based approach, three UFORE-D applications were developed: a base application to estimate...
An investigation of modelling and design for software service applications.
Anjum, Maria; Budgen, David
2017-01-01
Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Armstrong, Robert C.; Ray, Jaideep; Malony, A.
2003-11-01
We present a case study of performance measurement and modeling of a CCA (Common Component Architecture) component-based application in a high performance computing environment. We explore issues peculiar to component-based HPC applications and propose a performance measurement infrastructure for HPC based loosely on recent work done for Grid environments. A prototypical implementation of the infrastructure is used to collect data for three components of a scientific application and to construct performance models for two of them. Both computational and message-passing performance are addressed.
Sakhteman, Amirhossein; Zare, Bijan
2016-01-01
An interactive application, Modelface, is presented for the Modeller software on the Windows platform. The application is able to run all steps of homology modeling, including PDB-to-FASTA generation, running Clustal, model building and loop refinement. Other modules of Modeller, including energy calculation, energy minimization and the ability to make single-point mutations in PDB structures, are also implemented inside Modelface. The application is a simple batch-based tool with no memory occupation and is free of charge for academic use. It is also able to repair missing atom types in PDB structures, making it suitable for many molecular modeling studies such as docking and molecular dynamics simulation. Some successful instances of modeling studies using Modelface are also reported. PMID:28243276
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macumber, Daniel L; Horowitz, Scott G; Schott, Marjorie
Across most industries, desktop applications are being rapidly migrated to web applications for a variety of reasons. Web applications are inherently cross platform, mobile, and easier to distribute than desktop applications. Fueling this trend are a wide range of free, open source libraries and frameworks that make it incredibly easy to develop powerful web applications. The building energy modeling community is just beginning to pick up on these larger trends, with a small but growing number of building energy modeling applications starting on or moving to the web. This paper presents a new, open source, web-based geometry editor for Building Energy Modeling (BEM). The editor is written completely in JavaScript and runs in a modern web browser. The editor works on a custom JSON file format and is designed to be integrated into a variety of web and desktop applications. The web-based editor is available to use as a standalone web application at: https://nrel.github.io/openstudio-geometry-editor/. An example integration is demonstrated with the OpenStudio desktop application. Finally, the editor can be easily integrated with a wide range of possible building energy modeling web applications.
Integration of Optimal Scheduling with Case-Based Planning.
1995-08-01
…integrates Case-Based Reasoning (CBR) and Rule-Based Reasoning (RBR) systems. 'Tachyon: A Constraint-Based Temporal Reasoning Model and Its Implementation' provides an overview of the Tachyon temporal reasoning system and discusses its possible applications. 'Dual-Use Applications of Tachyon: From Force Structure Modeling to Manufacturing Scheduling' discusses the application of Tachyon to real-world problems, specifically military force deployment and manufacturing scheduling.
USDA-ARS?s Scientific Manuscript database
This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...
NASA Astrophysics Data System (ADS)
Abisset-Chavanne, Emmanuelle; Duval, Jean Louis; Cueto, Elias; Chinesta, Francisco
2018-05-01
Traditionally, Simulation-Based Engineering Sciences (SBES) has relied on the use of static data inputs (model parameters, initial or boundary conditions, … obtained from adequate experiments) to perform simulations. A new paradigm in the field of Applied Sciences and Engineering has emerged in the last decade. Dynamic Data-Driven Application Systems [9, 10, 11, 12, 22] allow the linkage of simulation tools with measurement devices for real-time control of simulations and applications, entailing the ability to dynamically incorporate additional data into an executing application and, in reverse, the ability of an application to dynamically steer the measurement process. It is in that context that traditional "digital twins" are giving rise to a new generation of goal-oriented data-driven application systems, also known as "hybrid twins", embracing models based on physics and models exclusively based on data, adequately collected and assimilated to fill the gap between usual model predictions and measurements. Within this framework, new methodologies based on model learners, machine learning and kinetic goal-oriented design are defining a new paradigm in materials, processes and systems engineering.
Resource management and scheduling policy based on grid for AIoT
NASA Astrophysics Data System (ADS)
Zou, Yiqin; Quan, Li
2017-07-01
This paper presents research on resource management and scheduling policy based on grid technology for the Agricultural Internet of Things (AIoT). Given the variety of complex and heterogeneous agricultural resources in AIoT, it is difficult to represent them in a unified way; from an abstract perspective, however, there are common models that can express their characteristics and features. Based on this, we propose a high-level model called the Agricultural Resource Hierarchy Model (ARHM), which can be used for modeling various resources, and introduce the agricultural resource modeling method based on it. Compared with the traditional application-oriented three-layer model, ARHM can hide the differences between applications and give all applications a unified interface layer implemented without distinction. Furthermore, the paper proposes a Web Service Resource Framework (WSRF)-based resource management method and the encapsulation structure for it. Finally, it focuses on the multi-agent-based AG resource scheduler, which is a collaborative service provider pattern spanning multiple agricultural production domains.
Demonstration of the Web-based Interspecies Correlation Estimation (Web-ICE) modeling application
The Web-based Interspecies Correlation Estimation (Web-ICE) modeling application is available to the risk assessment community through a user-friendly internet platform (http://epa.gov/ceampubl/fchain/webice/). ICE models are log-linear least square regressions that predict acute...
Toward a Model-Based Predictive Controller Design in Brain–Computer Interfaces
Kamrunnahar, M.; Dias, N. S.; Schiff, S. J.
2013-01-01
A first step in designing a robust and optimal model-based predictive controller (MPC) for brain–computer interface (BCI) applications is presented in this article. An MPC has the potential to achieve improved BCI performance compared to the performance achieved by current ad hoc, nonmodel-based filter applications. The parameters in designing the controller were extracted as model-based features from motor imagery task-related human scalp electroencephalography. Although the parameters can be generated from any model, linear or non-linear, we here adopted a simple autoregressive model that has well-established applications in BCI task discriminations. It was shown that the parameters generated for the controller design can as well be used for motor imagery task discriminations with performance (with 8–23% task discrimination errors) comparable to the discrimination performance of commonly used features such as frequency-specific band powers and the AR model parameters directly used. An optimal MPC has significant implications for high performance BCI applications. PMID:21267657
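As an illustration of the feature-extraction step this abstract describes, the following is a minimal sketch of estimating autoregressive (AR) coefficients from a single-channel EEG epoch via the Yule-Walker equations; the data, model order, and function names are hypothetical stand-ins, not the authors' implementation.

```python
import numpy as np

def ar_features(epoch, order=6):
    """Estimate AR(p) coefficients of one EEG epoch via Yule-Walker.

    The coefficients serve as a feature vector for task discrimination;
    epoch is a 1-D array of samples from a single channel.
    """
    x = epoch - epoch.mean()
    n = len(x)
    # Biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:order + 1])   # AR coefficients

# Example: feature matrix for a set of imagined-movement epochs (synthetic data)
rng = np.random.default_rng(0)
epochs = rng.standard_normal((10, 512))          # 10 epochs, 512 samples each
features = np.vstack([ar_features(e) for e in epochs])
print(features.shape)                            # (10, 6)
```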
Web-based application for inverting one-dimensional magnetotelluric data using Python
NASA Astrophysics Data System (ADS)
Suryanto, Wiwit; Irnaka, Theodosius Marwan
2016-11-01
One-dimensional modeling of magnetotelluric (MT) data has been performed using an online application on a web-based virtual private server. The application was developed in Python using the Django framework with HTML and CSS components. The input data, including the apparent resistivity and phase as a function of period or frequency with standard deviation, can be entered through an interactive web page that can be freely accessed at https://komputasi.geofisika.ugm.ac.id. The subsurface models, represented by resistivity as a function of depth, are iteratively improved by changing the model parameters, such as the resistivity and the layer depth, based on the observed apparent resistivity and phase data. The output of the application displayed on the screen presents resistivity as a function of depth and includes the RMS error for each iteration. Synthetic and real data were used in comparative tests of the application's performance, and it is shown that the application produced accurate subsurface resistivity models. Hence, this application can be used for practical one-dimensional modeling of MT data.
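The forward calculation underlying such 1-D MT modeling is standard and compact; below is a sketch of the layered-earth (Wait) impedance recursion in Python with NumPy. It is a generic textbook implementation under assumed SI conventions, not code from the web application itself.

```python
import numpy as np

MU0 = 4e-7 * np.pi

def mt1d_forward(resistivities, thicknesses, periods):
    """Apparent resistivity (ohm-m) and phase (deg) of a layered earth.

    resistivities: one value per layer (last layer = halfspace);
    thicknesses: metres, for every layer except the halfspace.
    """
    rho_a, phase = [], []
    for T in periods:
        omega = 2 * np.pi / T
        # Intrinsic impedance of the bottom halfspace
        Z = np.sqrt(1j * omega * MU0 * resistivities[-1])
        # Recurse upward through the finite layers
        for rho, h in zip(resistivities[-2::-1], thicknesses[::-1]):
            k = np.sqrt(1j * omega * MU0 / rho)   # propagation constant
            Z0 = 1j * omega * MU0 / k             # intrinsic impedance
            t = np.tanh(k * h)
            Z = Z0 * (Z + Z0 * t) / (Z0 + Z * t)
        rho_a.append(abs(Z) ** 2 / (omega * MU0))
        phase.append(np.degrees(np.angle(Z)))
    return np.array(rho_a), np.array(phase)

# Hypothetical 3-layer model: 100, 10, 1000 ohm-m; 500 m and 1000 m thick
rho_a, ph = mt1d_forward([100, 10, 1000], [500, 1000], np.logspace(-2, 3, 20))
```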
Qualitative model-based diagnosis using possibility theory
NASA Technical Reports Server (NTRS)
Joslyn, Cliff
1994-01-01
The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.
Physiologically based pharmacokinetic (PBPK) models have great potential for application in regulatory and non-regulatory public health risk assessment. The development and application of PBPK models in chemical toxicology has grown steadily since their emergence in the 1980s. Ho...
Search algorithm complexity modeling with application to image alignment and matching
NASA Astrophysics Data System (ADS)
DelMarco, Stephen
2014-05-01
Search algorithm complexity modeling, in the form of penetration rate estimation, provides a useful way to estimate search efficiency in application domains which involve searching over a hypothesis space of reference templates or models, as in model-based object recognition, automatic target recognition, and biometric recognition. The penetration rate quantifies the expected portion of the database that must be searched, and is useful for estimating search algorithm computational requirements. In this paper we perform mathematical modeling to derive general equations for penetration rate estimates that are applicable to a wide range of recognition problems. We extend previous penetration rate analyses to use more general probabilistic modeling assumptions. In particular we provide penetration rate equations within the framework of a model-based image alignment application domain in which a prioritized hierarchical grid search is used to rank subspace bins based on matching probability. We derive general equations, and provide special cases based on simplifying assumptions. We show how previously-derived penetration rate equations are special cases of the general formulation. We apply the analysis to model-based logo image alignment in which a hierarchical grid search is used over a geometric misalignment transform hypothesis space. We present numerical results validating the modeling assumptions and derived formulation.
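For a concrete special case of the quantity being modeled: if the bins of the hypothesis space are searched in rank order and the search stops in the bin containing the true match, the expected penetration rate is the probability-weighted cumulative fraction of the database searched. The sketch below assumes this simplified stopping rule; it is an illustration, not the paper's general derivation.

```python
import numpy as np

def expected_penetration_rate(match_prob, bin_fraction):
    """Expected fraction of the database searched.

    Bins are visited in rank order and the search stops in the bin that
    contains the true match. match_prob[i] is the probability the match
    lies in bin i; bin_fraction[i] is that bin's share of the database.
    """
    p = np.asarray(match_prob, dtype=float)
    f = np.asarray(bin_fraction, dtype=float)
    order = np.argsort(-p)                  # search most probable bins first
    cum = np.cumsum(f[order])               # fraction searched after each bin
    return float(np.sum(p[order] * cum))

# Hypothetical 4-bin hypothesis space of equal-sized bins
print(expected_penetration_rate([0.6, 0.2, 0.15, 0.05], [0.25] * 4))
```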
2015-03-01
Biotechnology on the Battlefield: An Application of Agent-based Modelling for Emerging Technology Assessment. …wounds might be treatable using advanced biotechnologies to control haemorrhaging and reduce blood-loss until medical evacuation can be completed.
1993-01-01
…animals in toxicology research, the application of pharmacokinetics and physiologically based pharmacokinetic models in chemical risk assessment, selected… Index terms: metaplasia, neurotoxicity, nonmutagenic carcinogens, ozone, P450, PBPK modeling, perfluorohexane, peroxisome proliferators, pharmacokinetics, pharmacokinetic models, physiological modeling, physiologically based pharmacokinetic modeling, polycyclic organic matter, quantitative risk assessment, RAIRM model, rats
Application of Complex Adaptive Systems in Portfolio Management
ERIC Educational Resources Information Center
Su, Zheyuan
2017-01-01
Simulation-based methods are becoming a promising research tool in financial markets. A general Complex Adaptive System can be tailored to different application scenarios. Based on the current research, we built two models that would benefit portfolio management by utilizing Complex Adaptive Systems (CAS) in Agent-based Modeling (ABM) approach.…
Hydrological Simulation Program - FORTRAN (HSPF) Data Formatting Tool (HDFT)
The HSPF data formatting and unit conversion tool has two separate applications: a web-based application and a desktop application. The tool was developed to aid users in formatting data for HSPF stormwater modeling applications. Unlike traditional HSPF modeling applications, sto...
Web-based applications for building, managing and analysing kinetic models of biological systems.
Lee, Dong-Yup; Saha, Rajib; Yusufi, Faraaz Noor Khan; Park, Wonjun; Karimi, Iftekhar A
2009-01-01
Mathematical modelling and computational analysis play an essential role in improving our capability to elucidate the functions and characteristics of complex biological systems such as metabolic, regulatory and cell signalling pathways. The modelling and concomitant simulation render it possible to predict the cellular behaviour of systems under various genetically and/or environmentally perturbed conditions. This motivates systems biologists/bioengineers/bioinformaticians to develop new tools and applications, allowing non-experts to easily conduct such modelling and analysis. However, among a multitude of systems biology tools developed to date, only a handful of projects have adopted a web-based approach to kinetic modelling. In this report, we evaluate the capabilities and characteristics of current web-based tools in systems biology and identify desirable features, limitations and bottlenecks for further improvements in terms of usability and functionality. A short discussion on software architecture issues involved in web-based applications and the approaches taken by existing tools is included for those interested in developing their own simulation applications.
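The core task such web-based tools support, building and simulating a kinetic model, reduces to integrating a set of rate equations. Here is a minimal stand-alone example in Python (using SciPy as a generic stand-in rather than any of the surveyed web tools) for Michaelis-Menten kinetics:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Michaelis-Menten kinetics for S -> P, the simplest kind of kinetic
# model such tools let users build and simulate (illustrative parameters).
Vmax, Km = 1.0, 0.5    # mM/min, mM

def rhs(t, y):
    s, p = y
    v = Vmax * s / (Km + s)     # reaction rate
    return [-v, v]

sol = solve_ivp(rhs, (0.0, 30.0), [2.0, 0.0], dense_output=True)
print(sol.y[:, -1])   # substrate nearly exhausted, product approaching 2 mM
```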
A new service-oriented grid-based method for AIoT application and implementation
NASA Astrophysics Data System (ADS)
Zou, Yiqin; Quan, Li
2017-07-01
The traditional three-layer Internet of things (IoT) model, which includes physical perception layer, information transferring layer and service application layer, cannot express complexity and diversity in agricultural engineering area completely. It is hard to categorize, organize and manage the agricultural things with these three layers. Based on the above requirements, we propose a new service-oriented grid-based method to set up and build the agricultural IoT. Considering the heterogeneous, limitation, transparency and leveling attributes of agricultural things, we propose an abstract model for all agricultural resources. This model is service-oriented and expressed with Open Grid Services Architecture (OGSA). Information and data of agricultural things were described and encapsulated by using XML in this model. Every agricultural engineering application will provide service by enabling one application node in this service-oriented grid. Description of Web Service Resource Framework (WSRF)-based Agricultural Internet of Things (AIoT) and the encapsulation method were also discussed in this paper for resource management in this model.
A bootstrap based space-time surveillance model with an application to crime occurrences
NASA Astrophysics Data System (ADS)
Kim, Youngho; O'Kelly, Morton
2008-06-01
This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap-based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, using population-at-risk data to generate expected values, have resulting hotspots bounded by administrative area units and are of limited use for near-real-time applications because of the population data needed. However, this study generates expected values for local hotspots from past occurrences rather than population at risk. Also, bootstrap permutations of previous occurrences are used for significance tests. Consequently, the bootstrap-based model, without the requirement of population-at-risk data, (1) is free from administrative area restriction, (2) enables more frequent surveillance for continuously updated registry databases, and (3) is readily applicable to criminology and epidemiology surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic. This is shown by means of simulations and an application to residential crime occurrences in Columbus, OH, in the year 2000.
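A stripped-down sketch of the significance-testing idea: the expected behaviour of a cell comes from its own past counts, and resampling those counts yields a reference distribution for the current count. The function name and the add-one p-value correction are illustrative choices, not the paper's exact procedure.

```python
import numpy as np

def bootstrap_pvalue(past_counts, current_count, n_boot=9999, seed=1):
    """One-sided p-value that the current cell/window count is unusually high.

    Expected behaviour comes from past occurrence counts of the same cell
    rather than from population-at-risk data; resampling with replacement
    plays the role of the bootstrap permutations.
    """
    rng = np.random.default_rng(seed)
    boot = rng.choice(np.asarray(past_counts), size=n_boot, replace=True)
    # Add-one correction keeps the p-value away from exactly zero
    return (np.sum(boot >= current_count) + 1) / (n_boot + 1)

# Hypothetical weekly crime counts for one grid cell
print(bootstrap_pvalue([2, 3, 1, 4, 2, 3, 2, 1], current_count=8))
```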
Predictive Microbiology and Food Safety Applications
USDA-ARS?s Scientific Manuscript database
Mathematical modeling is the science of systematic study of recurrent events or phenomena. When models are properly developed, their applications may save costs and time. For microbial food safety research and applications, predictive microbiology models may be developed based on the fact that most ...
Atom-Role-Based Access Control Model
NASA Astrophysics Data System (ADS)
Cai, Weihong; Huang, Richeng; Hou, Xiaoli; Wei, Gang; Xiao, Shui; Chen, Yindong
Role-based access control (RBAC) has been widely recognized as an efficient access control model and is currently a hot research topic in information security. However, in large-scale enterprise application environments, the traditional RBAC model based on the role hierarchy has the following deficiencies: First, it is unable to reflect role relationships effectively in complicated cases, which does not accord with practical applications. Second, a senior role unconditionally inherits all permissions of the junior role; thus a user under the supervisor role may accumulate all permissions, which easily causes the abuse of permission and violates the least privilege principle, one of the main security principles. To deal with these problems, after analyzing permission types and role relationships, we propose the concept of the atom role and build an atom-role-based access control model, called ATRBAC, by dividing the permission set of each regular role based on inheritance path relationships. Application-specific analysis shows that this model can well meet access control requirements.
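A toy sketch of the atom-role idea follows: regular roles are granted sets of atom roles, each carrying a small permission set, so a senior role receives only what it is explicitly composed of rather than everything a junior role holds. The names and granularity are illustrative, not the paper's formal ATRBAC definitions.

```python
# Atom roles: minimal bundles of (object, operation) permissions.
ATOM_ROLES = {
    "report_read":  {("report", "read")},
    "report_write": {("report", "write")},
    "audit_read":   {("audit_log", "read")},
}

# Regular roles are compositions of atom roles; note that "supervisor"
# does NOT implicitly inherit the auditor's permissions.
ROLES = {
    "clerk":      {"report_read"},
    "supervisor": {"report_read", "report_write"},
    "auditor":    {"report_read", "audit_read"},
}

def permitted(role, obj, op):
    return any((obj, op) in ATOM_ROLES[a] for a in ROLES[role])

assert permitted("supervisor", "report", "write")
assert not permitted("supervisor", "audit_log", "read")   # least privilege
```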
40 CFR 93.159 - Procedures for conformity determinations of general Federal actions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... based on the applicable air quality models, data bases, and other requirements specified in the most... applicable air quality models, data bases, and other requirements specified in the most recent version of the... data are available, such as actual stack test data from stationary sources which are part of the...
The research on construction and application of machining process knowledge base
NASA Astrophysics Data System (ADS)
Zhao, Tan; Qiao, Lihong; Qie, Yifan; Guo, Kai
2018-03-01
To realize the application of knowledge in machining process design from the perspective of computer-aided process planning (CAPP), a hierarchical structure of knowledge classification is established according to the characteristics of the mechanical engineering field. The expression of machining process knowledge is structured by means of production rules and object-oriented methods. Three kinds of knowledge base models are constructed according to the representation of machining process knowledge. In this paper, the definition and classification of machining process knowledge, the knowledge model, and the application flow of process design based on the knowledge base are given, and the main steps of the design decision of the machine tool are carried out as an application by using the knowledge base.
Update on Small Modular Reactors Dynamic System Modeling Tool: Web Application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hale, Richard Edward; Cetiner, Sacit M.; Fugate, David L.
Previous reports focused on the development of component and system models as well as end-to-end system models using Modelica and Dymola for two advanced reactor architectures: (1) Advanced Liquid Metal Reactor and (2) fluoride high-temperature reactor (FHR). The focus of this report is the release of the first beta version of the web-based application for model use and collaboration, as well as an update on the FHR model. The web-based application allows novice users to configure end-to-end system models from preconfigured choices to investigate the instrumentation and controls implications of these designs and allows for the collaborative development of individual component models that can be benchmarked against test systems for potential inclusion in the model library. A description of this application is provided along with examples of its use and a listing and discussion of all the models that currently exist in the library.
Continuum Fatigue Damage Modeling for Use in Life Extending Control
NASA Technical Reports Server (NTRS)
Lorenzo, Carl F.
1994-01-01
This paper develops a simplified continuum (continuous with respect to time, stress, etc.) fatigue damage model for use in Life Extending Control (LEC) studies. The work is based on zero-mean-stress local strain cyclic damage modeling. New nonlinear explicit equation forms of cyclic damage in terms of stress amplitude are derived to facilitate the continuum modeling. Stress-based continuum models are derived. Extension to plastic strain-strain rate models is also presented. Application of these models to LEC applications is considered. Progress toward a nonzero-mean-stress-based continuum model is presented, and new nonlinear explicit equation forms in terms of stress amplitude are also derived for this case.
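To make the continuum notion concrete, one common way to turn a cyclic life curve into a continuous damage rate is dD/dt = f / N_f(sigma_a(t)), with failure at D = 1. The sketch below uses a Basquin-type life curve with illustrative material constants; it is an assumption-laden stand-in, not the paper's derived equation forms.

```python
import numpy as np

# Continuum (per-unit-time) fatigue damage sketch: convert a Basquin-type
# cycles-to-failure curve into a damage rate for cycling frequency f.
SIGMA_F, B = 900.0, -0.09          # fatigue strength coeff. (MPa), exponent

def cycles_to_failure(sigma_a):
    return (sigma_a / SIGMA_F) ** (1.0 / B)

def damage_accumulated(sigma_a_history, f_hz, dt):
    """Integrate dD/dt over a stress-amplitude time history (D = 1 -> failure)."""
    d_rate = f_hz / cycles_to_failure(np.asarray(sigma_a_history))
    return float(np.sum(d_rate) * dt)

# One hour of cycling at 10 Hz, amplitude ramping from 200 to 400 MPa
amps = np.linspace(200.0, 400.0, 3600)
print(damage_accumulated(amps, f_hz=10.0, dt=1.0))
```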
USDA-ARS?s Scientific Manuscript database
This work reviews the new scientific divisions that have emerged in agrophysics in the last 10-15 years. Among them are the following: 1) application of the Adaptive Neuro-Fuzzy Inference System (ANFIS), 2) development and application of fuzzy indicator modeling, 3) agrophysical and physic-tech...
NASA Astrophysics Data System (ADS)
El Akbar, R. Reza; Anshary, Muhammad Adi Khairul; Hariadi, Dennis
2018-02-01
Model MACP for HE ver.1 describes how to perform performance measurement and monitoring for higher education. A review of research related to the model identified several components to develop in further research, so this research has four main objectives. The first objective is differentiation of the CSF (critical success factor) components of the previous model; the second is exploration of the KPIs (key performance indicators) in the previous model; the third, based on the previous objectives, is the design of a new and more detailed model. The final objective is a prototype application for performance measurement in higher education, based on the new model. The methods used are explorative research and application design using the prototype method. The results of this study are, first, a more detailed new model for measurement and monitoring of performance in higher education, obtained by differentiation and exploration of Model MACP for HE ver.1; second, a dictionary of college performance measurement compiled by re-evaluating the existing indicators; and third, the design of a prototype application for performance measurement in higher education.
Surface-Potential-Based Metal-Oxide-Silicon-Varactor Model for RF Applications
NASA Astrophysics Data System (ADS)
Miyake, Masataka; Sadachika, Norio; Navarro, Dondee; Mizukane, Yoshio; Matsumoto, Kenji; Ezaki, Tatsuya; Miura-Mattausch, Mitiko; Mattausch, Hans Juergen; Ohguro, Tatsuya; Iizuka, Takahiro; Taguchi, Masahiko; Kumashiro, Shigetaka; Miyamoto, Shunsuke
2007-04-01
We have developed a surface-potential-based metal-oxide-silicon (MOS)-varactor model valid for RF applications up to 200 GHz. The model enables the calculation of the MOS-varactor capacitance seamlessly from the depletion region to the accumulation region and explicitly considers the carrier-response delay causing a non-quasi-static (NQS) effect. It has been observed that capacitance reduction due to this non-quasi-static effect limits the MOS-varactor application to an RF regime.
Open data models for smart health interconnected applications: the example of openEHR.
Demski, Hans; Garde, Sebastian; Hildebrand, Claudia
2016-10-22
Smart Health is known as a concept that enhances networking, intelligent data processing and combining patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application for the development of interoperable electronic health record systems. The following possible model-driven procedures were considered: provision of data schemes for data exchange, automated generation of artefacts for application development and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common practice software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed. They provide standard based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.
Measuring the Perceived Quality of an AR-Based Learning Application: A Multidimensional Model
ERIC Educational Resources Information Center
Pribeanu, Costin; Balog, Alexandru; Iordache, Dragos Daniel
2017-01-01
Augmented reality (AR) technologies could enhance learning in several ways. The quality of an AR-based educational platform is a combination of key features that manifests in usability, usefulness, and enjoyment for the learner. In this paper, we present a multidimensional model to measure the quality of an AR-based application as perceived by…
A physiologically based pharmacokinetic (PBPK) model was developed to investigate exposure scenarios of children to carbaryl following turf application. Physiological, pharmacokinetic and pharmacodynamic parameters describing the fate and effects of carbaryl in rats were scaled ...
Enterprise application architecture development based on DoDAF and TOGAF
NASA Astrophysics Data System (ADS)
Tao, Zhi-Gang; Luo, Yun-Feng; Chen, Chang-Xin; Wang, Ming-Zhe; Ni, Feng
2017-05-01
For the purpose of supporting the design and analysis of enterprise application architecture, here we report a tailored enterprise application architecture description framework and its corresponding design method. The presented framework can effectively support service-oriented architecting and cloud computing by creating the metadata model based on the architecture content framework (ACF), the DoDAF metamodel (DM2) and the Cloud Computing Modelling Notation (CCMN). The framework also makes an effort to extend and improve the mapping between The Open Group Architecture Framework (TOGAF) application architectural inputs/outputs, deliverables and Department of Defense Architecture Framework (DoDAF)-described models. The roadmap of 52 DoDAF-described models is constructed by creating the metamodels of these described models and analysing the constraint relationships among metamodels. By combining the tailored framework and the roadmap, this article proposes a service-oriented enterprise application architecture development process. Finally, a case study is presented to illustrate the results of implementing the tailored framework in the Southern Base Management Support and Information Platform construction project using the development process proposed by the paper.
gPKPDSim: a SimBiology®-based GUI application for PKPD modeling in drug development.
Hosseini, Iraj; Gajjala, Anita; Bumbaca Yadav, Daniela; Sukumaran, Siddharth; Ramanujan, Saroja; Paxson, Ricardo; Gadkar, Kapil
2018-04-01
Modeling and simulation (M&S) is increasingly used in drug development to characterize pharmacokinetic-pharmacodynamic (PKPD) relationships and support various efforts such as target feasibility assessment, molecule selection, human PK projection, and preclinical and clinical dose and schedule determination. While model development typically requires mathematical modeling expertise, model exploration and simulations could in many cases be performed by scientists in various disciplines to support the design, analysis and interpretation of experimental studies. To this end, we have developed a versatile graphical user interface (GUI) application to enable easy use of any model constructed in SimBiology® to execute various common PKPD analyses. The MATLAB®-based GUI application, called gPKPDSim, has a single-screen interface and provides functionalities including simulation, data fitting (parameter estimation), population simulation (exploring the impact of parameter variability on the outputs of interest), and non-compartmental PK analysis. Further, gPKPDSim is a user-friendly tool with capabilities including interactive visualization, exporting of results and generation of presentation-ready figures. gPKPDSim was designed primarily for use in preclinical and translational drug development, although broader applications exist. gPKPDSim is a MATLAB®-based open-source application and is publicly available to download from MATLAB® Central™. We illustrate the use and features of gPKPDSim using multiple PKPD models to demonstrate the wide applications of this tool in pharmaceutical sciences. Overall, gPKPDSim provides an integrated, multi-purpose, user-friendly GUI application to enable efficient use of PKPD models by scientists from various disciplines, regardless of their modeling expertise.
An ontology-based semantic configuration approach to constructing Data as a Service for enterprises
NASA Astrophysics Data System (ADS)
Cai, Hongming; Xie, Cheng; Jiang, Lihong; Fang, Lu; Huang, Chenxi
2016-03-01
To align business strategies with IT systems, enterprises should rapidly implement new applications based on existing information with complex associations to adapt to the continually changing external business environment. Thus, Data as a Service (DaaS) has become an enabling technology for enterprises through information integration and the configuration of existing distributed enterprise systems and heterogeneous data sources. However, business modelling, system configuration and model alignment face challenges at the design and execution stages. To provide a comprehensive solution to facilitate data-centric application design in a highly complex and large-scale situation, a configurable ontology-based service integrated platform (COSIP) is proposed to support business modelling, system configuration and execution management. First, a meta-resource model is constructed and used to describe and encapsulate information resources by way of multi-view business modelling. Then, based on ontologies, three semantic configuration patterns, namely composite resource configuration, business scene configuration and runtime environment configuration, are designed to systematically connect business goals with executable applications. Finally, a software architecture based on model-view-controller (MVC) is provided and used to assemble components for software implementation. The result of the case study demonstrates that the proposed approach provides a flexible method of implementing data-centric applications.
Park, Hahnbeom; Lee, Gyu Rie; Heo, Lim; Seok, Chaok
2014-01-01
Protein loop modeling is a tool for predicting protein local structures of particular interest, providing opportunities for applications involving protein structure prediction and de novo protein design. Until recently, the majority of loop modeling methods have been developed and tested by reconstructing loops in frameworks of experimentally resolved structures. In many practical applications, however, the protein loops to be modeled are located in inaccurate structural environments. These include loops in model structures, low-resolution experimental structures, or experimental structures of different functional forms. Accordingly, discrepancies in the accuracy of the structural environment assumed in development of the method and that in practical applications present additional challenges to modern loop modeling methods. This study demonstrates a new strategy for employing a hybrid energy function combining physics-based and knowledge-based components to help tackle this challenge. The hybrid energy function is designed to combine the strengths of each energy component, simultaneously maintaining accurate loop structure prediction in a high-resolution framework structure and tolerating minor environmental errors in low-resolution structures. A loop modeling method based on global optimization of this new energy function is tested on loop targets situated in different levels of environmental errors, ranging from experimental structures to structures perturbed in backbone as well as side chains and template-based model structures. The new method performs comparably to force field-based approaches in loop reconstruction in crystal structures and better in loop prediction in inaccurate framework structures. This result suggests that higher-accuracy predictions would be possible for a broader range of applications. The web server for this method is available at http://galaxy.seoklab.org/loop with the PS2 option for the scoring function.
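The combination scheme itself can be sketched very simply: a weighted sum of a physics-based term and a knowledge-based term, with the weights trading off high-resolution accuracy against tolerance of environmental errors. The component functions and weights below are stand-ins, not the authors' energy function.

```python
# Minimal sketch of a hybrid score in the spirit of the paper: a
# physics-based term (e.g., a force-field energy) combined with a
# knowledge-based term (e.g., a statistical backbone potential). Only
# the combination scheme is the point; the terms are hypothetical.
def physics_energy(conformation):
    return conformation["ff_energy"]            # hypothetical precomputed term

def knowledge_energy(conformation):
    return conformation["stat_potential"]       # hypothetical precomputed term

def hybrid_energy(conformation, w_phys=0.6, w_know=0.4):
    """Weighted hybrid energy; the weights balance accuracy in a
    high-resolution framework against tolerance of low-resolution errors."""
    return (w_phys * physics_energy(conformation)
            + w_know * knowledge_energy(conformation))

candidates = [{"ff_energy": -310.2, "stat_potential": -12.5},
              {"ff_energy": -305.8, "stat_potential": -20.1}]
best = min(candidates, key=hybrid_energy)   # pick the lowest-energy loop model
print(best)
```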
Application of optimization technique for flood damage modeling in river system
NASA Astrophysics Data System (ADS)
Barman, Sangita Deb; Choudhury, Parthasarathi
2018-04-01
A river system is defined as a network of channels that drains different parts of a basin, uniting downstream to form a common outflow. Applying the various models found in the literature to a river system having multiple upstream flows is not always straightforward: it involves a lengthy procedure, and when data sets are unavailable, model calibration and application may become difficult. In the case of a river system, flow modeling can be simplified to a large extent if the channel network is replaced by an equivalent single channel. The present work covers optimization model formulations based on equivalent flow and applications of a mixed-integer-programming-based pre-emptive goal programming model to evaluating flood control alternatives for a real-life river system in India.
Chen, Ran; Zhang, Yuntao; Sahneh, Faryad Darabi; Scoglio, Caterina M; Wohlleben, Wendel; Haase, Andrea; Monteiro-Riviere, Nancy A; Riviere, Jim E
2014-09-23
Quantitative characterization of nanoparticle interactions with their surrounding environment is vital for safe nanotechnological development and standardization. A recent quantitative measure, the biological surface adsorption index (BSAI), has demonstrated promising applications in nanomaterial surface characterization and biological/environmental prediction. This paper further advances the approach beyond the application of five descriptors in the original BSAI to address the concentration dependence of the descriptors, enabling better prediction of the adsorption profile and more accurate categorization of nanomaterials based on their surface properties. Statistical analysis on the obtained adsorption data was performed based on three different models: the original BSAI, a concentration-dependent polynomial model, and an infinite dilution model. These advancements in BSAI modeling showed a promising development in the application of quantitative predictive modeling in biological applications, nanomedicine, and environmental safety assessment of nanomaterials.
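A sketch of what concentration-dependent descriptors could look like in code: a BSAI-style log-adsorption regression over five solute descriptors, with each coefficient allowed to vary with concentration as a low-order polynomial in log C. The coefficient values and function names are hypothetical, not fitted BSAI parameters.

```python
import numpy as np

def bsai_coeffs(log_c, coeff_poly):
    """Evaluate concentration-dependent regression coefficients.

    coeff_poly: array of shape (6, deg+1), one polynomial (highest power
    first, as np.polyval expects) per coefficient: intercept + 5 descriptors.
    """
    return np.array([np.polyval(p, log_c) for p in coeff_poly])

def log_k(descriptors, log_c, coeff_poly):
    """Predicted log adsorption from the five descriptors at log C."""
    a = bsai_coeffs(log_c, coeff_poly)
    return a[0] + np.dot(a[1:], descriptors)

poly = np.array([[0.01, -1.2],   # intercept drifts slowly with log C
                 [0.0,   0.8],   # R  (excess molar refraction)
                 [0.0,  -0.4],   # pi (polarizability)
                 [0.0,  -0.6],   # alpha (H-bond acidity)
                 [0.0,   1.1],   # beta  (H-bond basicity)
                 [0.0,   2.3]])  # V  (molar volume)
print(log_k([1.0, 0.9, 0.3, 0.5, 1.2], log_c=-3.0, coeff_poly=poly))
```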
Computing Models for FPGA-Based Accelerators
Herbordt, Martin C.; Gu, Yongfeng; VanCourt, Tom; Model, Josh; Sukhwani, Bharat; Chiu, Matt
2011-01-01
Field-programmable gate arrays are widely considered as accelerators for compute-intensive applications. A critical phase of FPGA application development is finding and mapping to the appropriate computing model. FPGA computing enables models with highly flexible fine-grained parallelism and associative operations such as broadcast and collective response. Several case studies demonstrate the effectiveness of using these computing models in developing FPGA applications for molecular modeling. PMID:21603152
A generative tool for building health applications driven by ISO 13606 archetypes.
Menárguez-Tortosa, Marcos; Martínez-Costa, Catalina; Fernández-Breis, Jesualdo Tomás
2012-10-01
The use of Electronic Healthcare Records (EHR) standards in the development of healthcare applications is crucial for achieving the semantic interoperability of clinical information. Advanced EHR standards make use of the dual model architecture, which provides a solution for clinical interoperability based on the separation of the information and knowledge. However, the impact of such standards is biased by the limited availability of tools that facilitate their usage and practical implementation. In this paper, we present an approach for the automatic generation of clinical applications for the ISO 13606 EHR standard, which is based on the dual model architecture. This generator has been generically designed, so it can be easily adapted to other dual model standards and can generate applications for multiple technological platforms. Such good properties are based on the combination of standards for the representation of generic user interfaces and model-driven engineering techniques.
Mortensen, Stig B; Klim, Søren; Dammann, Bernd; Kristensen, Niels R; Madsen, Henrik; Overgaard, Rune V
2007-10-01
The non-linear mixed-effects model based on stochastic differential equations (SDEs) provides an attractive residual error model that is able to handle serially correlated residuals typically arising from structural mis-specification of the true underlying model. The use of SDEs also opens up new tools for model development and easily allows for tracking of unknown inputs and parameters over time. An algorithm for maximum likelihood estimation of the model has earlier been proposed, and the present paper presents the first general implementation of this algorithm. The implementation is done in Matlab and also demonstrates the use of parallel computing for improved estimation times. The use of the implementation is illustrated by two examples of application which focus on the ability of the model to estimate unknown inputs, facilitated by the extension to SDEs. The first application is a deconvolution-type estimation of the insulin secretion rate based on a linear two-compartment model for C-peptide measurements. In the second application the model is extended to also give an estimate of the time-varying liver extraction based on both C-peptide and insulin measurements.
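For readers unfamiliar with the model class, the following Euler-Maruyama sketch simulates the kind of SDE the estimation machinery targets: a one-compartment model with system noise on the state and an unknown input to be tracked. Parameters and the input profile are illustrative, not from the paper's examples.

```python
import numpy as np

# dC = (u(t)/V - k*C) dt + sigma_w dW, observed with measurement noise.
def simulate(k=0.2, V=10.0, sigma_w=0.05, sigma_e=0.1, dt=0.1, T=24.0, seed=3):
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    t = np.linspace(0.0, T, n)
    u = np.where(t < 6.0, 5.0, 0.0)          # "unknown" input: 6 h of infusion
    c = np.zeros(n)
    for i in range(1, n):
        drift = u[i - 1] / V - k * c[i - 1]
        c[i] = c[i - 1] + drift * dt + sigma_w * np.sqrt(dt) * rng.standard_normal()
    y = c + sigma_e * rng.standard_normal(n)  # serially uncorrelated meas. error
    return t, c, y

t, c, y = simulate()
```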
Integration of Web-based and PC-based clinical research databases.
Brandt, C A; Sun, K; Charpentier, P; Nadkarni, P M
2004-01-01
We have created a Web-based repository or data library of information about measurement instruments used in studies of multi-factorial geriatric health conditions (the Geriatrics Research Instrument Library - GRIL) based upon existing features of two separate clinical study data management systems. GRIL allows browsing, searching, and selecting measurement instruments based upon criteria such as keywords and areas of applicability. Measurement instruments selected can be printed and/or included in an automatically generated standalone microcomputer database application, which can be downloaded by investigators for use in data collection and data management. Integration of database applications requires the creation of a common semantic model, and mapping from each system to this model. Various database schema conflicts at the table and attribute level must be identified and resolved prior to integration. Using a conflict taxonomy and a mapping schema facilitates this process. Critical conflicts at the table level that required resolution included name and relationship differences. A major benefit of integration efforts is the sharing of features and cross-fertilization of applications created for similar purposes in different operating environments. Integration of applications mandates some degree of metadata model unification.
Mobile-Based Dictionary of Information and Communication Technology
NASA Astrophysics Data System (ADS)
Liando, O. E. S.; Mewengkang, A.; Kaseger, D.; Sangkop, F. I.; Rantung, V. P.; Rorimpandey, G. C.
2018-02-01
This study aims to design and build a mobile-based dictionary of information and communication technology, an application providing access to information in the form of a glossary of terms in the context of information and communication technologies. The application built in this study uses the Android platform with an SQLite database model. The research uses the prototype model development method, which covers the stages of communication, quick plan, quick design modeling, construction of prototype, deployment, delivery & feedback, and full system transformation. The application is designed to facilitate the user in learning and understanding new terms or vocabulary encountered in the world of information and communication technology. The mobile-based dictionary of information and communication technology that has been built can be an alternative learning resource. In its simplest form, this application is able to meet the need for a comprehensive and accurate dictionary of information and communication technology.
On the upscaling of process-based models in deltaic applications
NASA Astrophysics Data System (ADS)
Li, L.; Storms, J. E. A.; Walstra, D. J. R.
2018-03-01
Process-based numerical models are increasingly used to study the evolution of marine and terrestrial depositional environments. Whilst a detailed description of small-scale processes provides an accurate representation of reality, application on geological timescales is restrained by the associated increase in computational time. In order to reduce the computational time, a number of acceleration methods are combined and evaluated for a schematic supply-driven delta (static base level) and an accommodation-driven delta (variable base level). The performance of the combined acceleration methods is evaluated by comparing the morphological indicators such as distributary channel networking and delta volumes derived from the model predictions for various levels of acceleration. The results of the accelerated models are compared to the outcomes from a series of simulations to capture autogenic variability. Autogenic variability is quantified by re-running identical models on an initial bathymetry with 1 cm added noise. The overall results show that the variability of the accelerated models fall within the autogenic variability range, suggesting that the application of acceleration methods does not significantly affect the simulated delta evolution. The Time-scale compression method (the acceleration method introduced in this paper) results in an increased computational efficiency of 75% without adversely affecting the simulated delta evolution compared to a base case. The combination of the Time-scale compression method with the existing acceleration methods has the potential to extend the application range of process-based models towards geologic timescales.
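In the same spirit as (but not identical to) the paper's Time-scale compression method, the widely used morphological-acceleration idea multiplies each hydrodynamic step's bed change by a factor M, so far fewer hydrodynamic steps cover the same morphological time. A toy 1-D sketch, with all quantities illustrative:

```python
# Generic morphological-acceleration sketch: N/M hydrodynamic steps
# stand in for N morphological steps by amplifying each bed change.
def accelerated_bed_update(bed, transport_step, n_morph_steps, morfac=4):
    """bed: mutable elevation list; transport_step(bed) returns the bed
    change one hydrodynamic step would cause at each point."""
    hydro_steps = n_morph_steps // morfac     # 75% fewer steps for M = 4
    for _ in range(hydro_steps):
        dz = transport_step(bed)
        for i, d in enumerate(dz):
            bed[i] += morfac * d              # amplify each step's change
    return bed

# Toy diffusion-like transport on a 1-D profile
profile = [0.0, 0.5, 1.0, 0.5, 0.0]
step = lambda b: [0.01 * ((b[max(i - 1, 0)] + b[min(i + 1, len(b) - 1)]) / 2 - b[i])
                  for i in range(len(b))]
print(accelerated_bed_update(profile, step, n_morph_steps=400))
```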
Zou, Jiaqi; Li, Na
2013-09-01
Proper design of nucleic acid sequences is crucial for many applications. We have previously established a thermodynamics-based quantitative model to help design aptamer-based nucleic acid probes by predicting equilibrium concentrations of all interacting species. To facilitate customization of this thermodynamic model for different applications, here we present a generic and easy-to-use platform to implement the algorithm of the model with Microsoft® Excel formulas and VBA (Visual Basic for Applications) macros. Two Excel spreadsheets have been developed: one for the applications involving only nucleic acid species, the other for the applications involving both nucleic acid and non-nucleic acid species. The spreadsheets take the nucleic acid sequences and the initial concentrations of all species as input, guide the user to retrieve the necessary thermodynamic constants, and finally calculate equilibrium concentrations for all species in various bound and unbound conformations. The validity of both spreadsheets has been verified by comparing the modeling results with the experimental results on nucleic acid sequences reported in the literature. This Excel-based platform will allow biomedical researchers to rationalize the sequence design of nucleic acid probes using thermodynamics-based modeling even without relevant theoretical and computational skills. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
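The equilibrium calculation such spreadsheets automate reduces, for the simplest case of one probe and one target, to a quadratic in the duplex concentration. A Python rendering of that core step (with an illustrative free energy, not a value from the paper):

```python
import math

# Equilibrium of a simple probe-target hybridization A + B <=> AB.
R = 1.987e-3   # gas constant, kcal/(mol*K)

def duplex_conc(a_total, b_total, dG, T=298.15):
    """Equilibrium duplex concentration (M) from total strand
    concentrations and the duplex formation free energy (kcal/mol)."""
    K = math.exp(-dG / (R * T))               # association constant, 1/M
    # [AB]^2 - ([A]t + [B]t + 1/K)[AB] + [A]t[B]t = 0; take the smaller root
    b_coeff = a_total + b_total + 1.0 / K
    disc = math.sqrt(b_coeff ** 2 - 4.0 * a_total * b_total)
    return (b_coeff - disc) / 2.0

print(duplex_conc(1e-6, 1e-6, dG=-12.0))      # nearly all duplex at -12 kcal/mol
```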
Zeng, Mufan; de Vries, Wim; Bonten, Luc T C; Zhu, Qichao; Hao, Tianxiang; Liu, Xuejun; Xu, Minggang; Shi, Xiaojun; Zhang, Fusuo; Shen, Jianbo
2017-04-04
Agricultural soil acidification in China is known to be caused by the over-application of nitrogen (N) fertilizers, but the long-term impacts of different fertilization practices on intensive cropland soil acidification are largely unknown. Here, we further developed the soil acidification model VSD+ for intensive agricultural systems and validated it against observed data from three long-term fertilization experiments in China. The model simulated well the changes in soil pH and base saturation over the last 20 years. The validated model was adopted to quantify the contribution of N and base cation (BC) fluxes to soil acidification. Net NO3- leaching and NH4+ input accounted for 80% of the proton production under N application, whereas one-third of the acid was produced by BC uptake when N was not applied. The simulated long-term (1990-2050) effects of different fertilizations on soil acidification showed that balanced N application combined with manure application avoids reduction of both soil pH and base saturation, while application of calcium nitrate and liming increases these two soil properties. Reducing NH4+ input and NO3- leaching by optimizing N management and increasing BC inputs by manure application thus already seem to be effective approaches to mitigating soil acidification in intensive cropland systems.
Approaches for the Application of Physiologically Based ...
This draft report of Approaches for the Application of Physiologically Based Pharmacokinetic (PBPK) Models and Supporting Data in Risk Assessment addresses the application and evaluation of PBPK models for risk assessment purposes. These models represent an important class of dosimetry models that are useful for predicting internal dose at target organs for risk assessment applications. Topics covered include: the types of data required for use of PBPK models in risk assessment, evaluation of PBPK models for use in risk assessment, and the application of these models to address uncertainties resulting from extrapolations (e.g. interspecies extrapolation) often used in risk assessment. In addition, appendices are provided that include a compilation of chemical partition coefficients and rate constants, algorithms for estimating chemical-specific parameters, and a list of publications relating to PBPK modeling. This report is primarily meant to serve as a learning tool for EPA scientists and risk assessors who may be less familiar with the field. In addition, this report can be informative to PBPK modelers within and outside the Agency, as it provides an assessment of the types of data and models that the EPA requires for consideration of a model for use in risk assessment.
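For orientation, a minimal flow-limited PBPK model, the class of dosimetry model the report evaluates, is just a set of mass-balance ODEs over tissue compartments. The sketch below uses illustrative parameter values, not values from the report or its appendices:

```python
from scipy.integrate import solve_ivp

# Flow-limited PBPK sketch: venous blood, liver, and "rest of body"
# compartments with hepatic clearance (all parameters illustrative).
Q_l, Q_r = 90.0, 210.0            # tissue blood flows (L/h)
V_b, V_l, V_r = 5.0, 1.8, 40.0    # compartment volumes (L)
P_l, P_r = 2.0, 1.5               # tissue:blood partition coefficients
CL_int = 30.0                     # intrinsic hepatic clearance (L/h)

def pbpk(t, y):
    c_b, c_l, c_r = y
    dc_b = (Q_l * (c_l / P_l - c_b) + Q_r * (c_r / P_r - c_b)) / V_b
    dc_l = (Q_l * (c_b - c_l / P_l) - CL_int * c_l / P_l) / V_l
    dc_r = Q_r * (c_b - c_r / P_r) / V_r
    return [dc_b, dc_l, dc_r]

sol = solve_ivp(pbpk, (0.0, 24.0), [1.0, 0.0, 0.0])  # 1 mg/L bolus in blood
print(sol.y[1, -1])   # liver (target-organ) concentration after 24 h
```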
Information Sharing Modalities for Mobile Ad-Hoc Networks
NASA Astrophysics Data System (ADS)
de Spindler, Alexandre; Grossniklaus, Michael; Lins, Christoph; Norrie, Moira C.
Current mobile phone technologies have fostered the emergence of a new generation of mobile applications. Such applications allow users to interact and share information opportunistically when their mobile devices are in physical proximity or close to fixed installations. It has been shown how mobile applications such as collaborative filtering and location-based services can take advantage of ad-hoc connectivity to use physical proximity as a filter mechanism inherent to the application logic. We discuss the different modes of information sharing that arise in such settings based on the models of persistence and synchronisation. We present a platform that supports the development of applications that can exploit these modes of ad-hoc information sharing and, by means of an example, show how such an application can be realised based on the supported event model.
Models in Science Education: Applications of Models in Learning and Teaching Science
ERIC Educational Resources Information Center
Ornek, Funda
2008-01-01
In this paper, I discuss different types of models in science education and applications of them in learning and teaching science, in particular physics. Based on the literature, I categorize models as conceptual and mental models according to their characteristics. In addition to these models, there is another model called "physics model" by the…
A Linguistic Model in Component Oriented Programming
NASA Astrophysics Data System (ADS)
Crăciunean, Daniel Cristian; Crăciunean, Vasile
2016-12-01
It is a fact that component-oriented programming, well organized, can bring a large increase in efficiency in the development of large software systems. This paper proposes a model for building software systems by assembling components that can operate independently of each other. The model is based on a computing environment that runs parallel and distributed applications. This paper introduces concepts such as the abstract aggregation scheme and the aggregation application. Basically, an aggregation application is an application that is obtained by combining corresponding components. In our model an aggregation application is a word in a language.
Application of a data base management system to a finite element model
NASA Technical Reports Server (NTRS)
Rogers, J. L., Jr.
1980-01-01
In today's software market, much effort is being expended on the development of data base management systems (DBMS). Most commercially available DBMS were designed for business use. However, the need for such systems within the engineering and scientific communities is becoming apparent. A potential DBMS application that appears attractive is the handling of data for finite element engineering models. The application of a commercially available, business-oriented DBMS to a structural engineering finite element model is explored. The model, the DBMS, an approach to using the DBMS, and the advantages and disadvantages are described. Plans for research on a scientific and engineering DBMS are discussed.
Model-based metabolism design: constraints for kinetic and stoichiometric models
Stalidzans, Egils; Seiman, Andrus; Peebo, Karl; Komasilovs, Vitalijs; Pentjuss, Agris
2018-01-01
The implementation of model-based designs in metabolic engineering and synthetic biology may fail. One of the reasons for this failure is that only a part of the real-world complexity is included in models. Still, some knowledge can be simplified and taken into account in the form of optimization constraints to improve the feasibility of model-based designs of metabolic pathways in organisms. Some constraints (mass balance, energy balance, and the steady-state assumption) serve as a basis for many modelling approaches. Others (the total enzyme activity constraint and the homeostatic constraint) were proposed decades ago but are frequently ignored in design development. Several new approaches to cellular analysis have made possible the application of constraints like cell size, surface, and resource balance. Constraints for kinetic and stoichiometric models are grouped according to their applicability preconditions into (1) general constraints, (2) organism-level constraints, and (3) experiment-level constraints. General constraints are universal and applicable to any system. Organism-level constraints are applicable to biological systems and usually are organism-specific, but these constraints can be applied without information about experimental conditions. To apply experiment-level constraints, peculiarities of the organism and the experimental set-up have to be taken into account to calculate the values of the constraints. The limitations of applicability of particular constraints for kinetic and stoichiometric models are addressed. PMID:29472367
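The general constraints named above can be made concrete with a toy flux balance: the sketch below (reactions, capacities and objective are invented, not from the paper) imposes the mass-balance and steady-state constraint S v = 0 plus organism-level capacity bounds, then optimizes one flux.

```python
# Toy stoichiometric model: one metabolite A, three reactions.
# R1: uptake -> A, R2: A -> product of interest, R3: A -> byproduct.
import numpy as np
from scipy.optimize import linprog

S = np.array([[1.0, -1.0, -1.0]])   # rows: metabolites; columns: reactions
bounds = [(0, 10), (0, 5), (0, 8)]  # organism-level flux capacities (arbitrary)
c = [0.0, -1.0, 0.0]                # linprog minimizes, so -1 maximizes R2

res = linprog(c, A_eq=S, b_eq=[0.0], bounds=bounds)
print("optimal fluxes:", res.x)     # satisfies the steady state S v = 0
```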
Nonlocal continuum-based modeling of mechanical characteristics of nanoscopic structures
NASA Astrophysics Data System (ADS)
Rafii-Tabar, Hashem; Ghavanloo, Esmaeal; Fazelzadeh, S. Ahmad
2016-06-01
Insight into the mechanical characteristics of nanoscopic structures is of fundamental interest and indeed poses a great challenge to the research communities around the world. These structures are ultrafine in size and consequently performing standard experiments to measure their various properties is an extremely difficult and expensive endeavor. Hence, to predict the mechanical characteristics of nanoscopic structures, different theoretical models, numerical modeling techniques, and computer-based simulation methods have been developed. Among several proposed approaches, nonlocal continuum-based modeling is of particular significance because the results obtained from this modeling for different nanoscopic structures are in very good agreement with the data obtained from both experimental and atomistic-based studies. A review of the essentials of this model together with its applications is presented here. Our paper is a self-contained presentation of the nonlocal elasticity theory and contains the analysis of the recent works employing this model within the field of nanoscopic structures. In this review, the concepts from both the classical (local) and the nonlocal elasticity theories are presented and their applications to the static and dynamic behavior of nanoscopic structures with various morphologies are discussed. We first introduce the various nanoscopic structures, both carbon-based and non-carbon-based types, and then, after a brief review of the definitions and concepts from classical elasticity theory and the basic assumptions underlying size-dependent continuum theories, the mathematical details of the nonlocal elasticity theory are presented. A comprehensive discussion on the nonlocal versions of the beam, plate and shell theories that are employed in modeling of the mechanical properties and behavior of nanoscopic structures is then provided. Next, an overview of the current literature discussing the application of the nonlocal models to nanoscopic carbon allotropes is presented. We then discuss the application of the models to the investigation of the properties of nanoscopic structures from different materials and with different types of morphologies. Furthermore, we also present recent developments in the application of the nonlocal models. Finally, conclusions and discussions regarding the potentiality of these models for future research are provided.
Applications of SPICE for modeling miniaturized biomedical sensor systems
NASA Technical Reports Server (NTRS)
Mundt, C. W.; Nagle, H. T.
2000-01-01
This paper proposes a model for a miniaturized signal conditioning system for biopotential and ion-selective electrode arrays. The system consists of three main components: sensors, interconnections, and signal conditioning chip. The model for this system is based on SPICE. Transmission-line based equivalent circuits are used to represent the sensors, lumped resistance-capacitance circuits describe the interconnections, and a model for the signal conditioning chip is extracted from its layout. A system for measurements of biopotentials and ionic activities can be miniaturized and optimized for cardiovascular applications based on the development of an integrated SPICE system model of its electrochemical, interconnection, and electronic components.
Proposed applications of increasingly sophisticated biologically-based computational models, such as physiologically-based pharmacokinetic (PBPK) models, raise the issue of how to evaluate whether the models are adequate for proposed uses including safety or risk ...
NASA Astrophysics Data System (ADS)
Miner, Nadine Elizabeth
1998-09-01
This dissertation presents a new wavelet-based method for synthesizing perceptually convincing, dynamic sounds using parameterized sound models. The sound synthesis method is applicable to a variety of applications including Virtual Reality (VR), multi-media, entertainment, and the World Wide Web (WWW). A unique contribution of this research is the modeling of the stochastic, or non-pitched, sound components. This stochastic-based modeling approach leads to perceptually compelling sound synthesis. Two preliminary studies were conducted to provide data on multi-sensory interaction and audio-visual synchronization timing. These results contributed to the design of the new sound synthesis method. The method uses a four-phase development process, including analysis, parameterization, synthesis and validation, to create the wavelet-based sound models. A patent is pending for this dynamic sound synthesis method, which provides perceptually realistic, real-time sound generation. This dissertation also presents a battery of perceptual experiments developed to verify the sound synthesis results. These experiments are applicable for the validation of any sound synthesis technique.
Computational Modeling of Human Metabolism and Its Application to Systems Biomedicine.
Aurich, Maike K; Thiele, Ines
2016-01-01
Modern high-throughput techniques offer immense opportunities to investigate whole-systems behavior, such as those underlying human diseases. However, the complexity of the data presents challenges in interpretation, and new avenues are needed to address the complexity of both diseases and data. Constraint-based modeling is one formalism applied in systems biology. It relies on a genome-scale reconstruction that captures extensive biochemical knowledge regarding an organism. The human genome-scale metabolic reconstruction is increasingly used to understand normal cellular and disease states because metabolism is an important factor in many human diseases. The application of human genome-scale reconstruction ranges from mere querying of the model as a knowledge base to studies that take advantage of the model's topology and, most notably, to functional predictions based on cell- and condition-specific metabolic models built based on omics data. An increasing number and diversity of biomedical questions are being addressed using constraint-based modeling and metabolic models. One of the most successful biomedical applications to date is cancer metabolism, but constraint-based modeling also holds great potential for inborn errors of metabolism or obesity. In addition, it offers great prospects for individualized approaches to diagnostics and the design of disease prevention and intervention strategies. Metabolic models support this endeavor by providing easy access to complex high-throughput datasets. Personalized metabolic models have been introduced. Finally, constraint-based modeling can be used to model whole-body metabolism, which will enable the elucidation of metabolic interactions between organs and disturbances of these interactions as either causes or consequences of metabolic diseases. This chapter introduces constraint-based modeling and describes some of its contributions to systems biomedicine.
Information-Flow-Based Access Control for Web Browsers
NASA Astrophysics Data System (ADS)
Yoshihama, Sachiko; Tateishi, Takaaki; Tabuchi, Naoshi; Matsumoto, Tsutomu
The emergence of Web 2.0 technologies such as Ajax and Mashup has revealed the weakness of the same-origin policy [1], the current de facto standard for the Web browser security model. We propose a new browser security model that allows fine-grained access control in client-side Web applications for secure mashups and user-generated content. The model is based on information-flow-based access control (IBAC) to overcome the dynamic nature of client-side Web applications and to accurately determine the privilege of scripts in the event-driven programming model.
Tawhai, Merryn H.; Clark, Alys R.; Burrowes, Kelly S.
2011-01-01
Biophysically-based computational models provide a tool for integrating and explaining experimental data, observations, and hypotheses. Computational models of the pulmonary circulation have evolved from minimal and efficient constructs that have been used to study individual mechanisms that contribute to lung perfusion, to sophisticated multi-scale and -physics structure-based models that predict integrated structure-function relationships within a heterogeneous organ. This review considers the utility of computational models in providing new insights into the function of the pulmonary circulation, and their application in clinically motivated studies. We review mathematical and computational models of the pulmonary circulation based on their application; we begin with models that seek to answer questions in basic science and physiology and progress to models that aim to have clinical application. In looking forward, we discuss the relative merits and clinical relevance of computational models: what important features are still lacking; and how these models may ultimately be applied to further increasing our understanding of the mechanisms occurring in disease of the pulmonary circulation. PMID:22034608
Judging Alignment of Curriculum-Based Measures in Mathematics and Common Core Standards
ERIC Educational Resources Information Center
Morton, Christopher
2013-01-01
Measurement literature supports the utility of alignment models for application with state standards and large-scale assessments. However, the literature is lacking in the application of these models to curriculum-based measures (CBMs) and common core standards. In this study, I investigate the alignment of CBMs and standards, with specific…
USDA-ARS?s Scientific Manuscript database
Operational application of the two-source energy balance model (TSEB), which can estimate evapotranspiration (ET) and its components, evaporation (E) and transpiration (T), of the land surface in different climates, is very useful for many applications in hydrology and agriculture. The TSEB model uses an ...
Sun, Jianyu; Liang, Peng; Yan, Xiaoxu; Zuo, Kuichang; Xiao, Kang; Xia, Junlin; Qiu, Yong; Wu, Qing; Wu, Shijia; Huang, Xia; Qi, Meng; Wen, Xianghua
2016-04-15
Reducing the energy consumption of membrane bioreactors (MBRs) is highly important for their wider application in wastewater treatment engineering. Of particular significance is reducing aeration in aerobic tanks to reduce the overall energy consumption. This study proposed an in situ ammonia-N-based feedback control strategy for aeration in aerobic tanks; this was tested via model simulation and through a large-scale (50,000 m3/d) engineering application. A full-scale MBR model was developed based on the activated sludge model (ASM) and was calibrated to the actual MBR. The aeration control strategy took the form of a two-step cascaded proportional-integral (PI) feedback algorithm. Algorithmic parameters were optimized via model simulation. The strategy achieved real-time adjustment of aeration amounts based on feedback from effluent quality (i.e., ammonia-N). The effectiveness of the strategy was evaluated through both the model platform and the full-scale engineering application. In the former, the aeration flow rate was reduced by 15-20%. In the engineering application, the aeration flow rate was reduced by 20%, and the overall specific energy consumption was correspondingly reduced by 4% to 0.45 kWh/m3-effluent, using the present practice of regulating the angle of the guide vanes of fixed-frequency blowers. Potential energy savings are expected to be higher for MBRs with variable-frequency blowers. This study indicated that the ammonia-N-based aeration control strategy holds promise for application in full-scale MBRs.
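A two-step cascaded PI loop of the kind described can be sketched in a few lines of Python; the gains, limits and measurements below are hypothetical, and the real controller was tuned via the calibrated ASM model.

```python
# Outer loop: ammonia-N error -> dissolved oxygen (DO) setpoint (reverse acting,
# i.e. NH4-N above target raises the DO setpoint). Inner loop: DO error ->
# airflow command. No anti-windup beyond output clamping in this sketch.
def make_pi(kp, ki, out_min, out_max, dt=60.0):
    state = {"i": 0.0}
    def step(error):
        state["i"] += ki * error * dt
        return min(max(kp * error + state["i"], out_min), out_max)
    return step

outer = make_pi(kp=2.0, ki=1e-4, out_min=0.5, out_max=3.0)      # -> DO sp, mg/L
inner = make_pi(kp=800.0, ki=0.2, out_min=0.0, out_max=9000.0)  # -> airflow, m3/h

nh4_target, nh4_measured, do_measured = 1.5, 2.3, 1.2           # mg/L
do_setpoint = outer(nh4_measured - nh4_target)                  # reverse action
airflow = inner(do_setpoint - do_measured)
print(f"DO setpoint {do_setpoint:.2f} mg/L -> airflow {airflow:.0f} m3/h")
```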
Computational neuroanatomy: ontology-based representation of neural components and connectivity.
Rubin, Daniel L; Talos, Ion-Florin; Halle, Michael; Musen, Mark A; Kikinis, Ron
2009-02-05
A critical challenge in neuroscience is organizing, managing, and accessing the explosion in neuroscientific knowledge, particularly anatomic knowledge. We believe that explicit knowledge-based approaches to make neuroscientific knowledge computationally accessible will be helpful in tackling this challenge and will enable a variety of applications exploiting this knowledge, such as surgical planning. We developed ontology-based models of neuroanatomy to enable symbolic lookup, logical inference and mathematical modeling of neural systems. We built a prototype model of the motor system that integrates descriptive anatomic and qualitative functional neuroanatomical knowledge. In addition to modeling normal neuroanatomy, our approach provides an explicit representation of abnormal neural connectivity in disease states, such as common movement disorders. The ontology-based representation encodes both structural and functional aspects of neuroanatomy. The ontology-based models can be evaluated computationally, enabling development of automated computer reasoning applications. Neuroanatomical knowledge can be represented in machine-accessible format using ontologies. Computational neuroanatomical approaches such as described in this work could become a key tool in translational informatics, leading to decision support applications that inform and guide surgical planning and personalized care for neurological disease in the future.
Electro-thermal battery model identification for automotive applications
NASA Astrophysics Data System (ADS)
Hu, Y.; Yurkovich, S.; Guezennec, Y.; Yurkovich, B. J.
This paper describes a procedure for identifying an electro-thermal model of lithium-ion batteries used in automotive applications. The dynamic model structure adopted is based on an equivalent circuit model whose parameters are scheduled on the state-of-charge, temperature, and current direction. Linear spline functions are used as the functional form for the parametric dependence. The model identified in this way is valid over a large range of temperatures and states-of-charge, so the resulting model can be used for automotive applications such as on-board estimation of the state-of-charge and state-of-health. The model coefficients are identified using a multiple-step genetic algorithm based optimization procedure designed for large-scale optimization problems. The validity of the procedure is demonstrated experimentally for an A123 lithium iron-phosphate battery.
Global Futures: a multithreaded execution model for Global Arrays-based applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavarría-Miranda, Daniel; Krishnamoorthy, Sriram; Vishnu, Abhinav
2012-05-31
We present Global Futures (GF), an execution model extension to Global Arrays, which is based on a PGAS-compatible Active Message-based paradigm. We describe the design and implementation of Global Futures and illustrate its use in a computational chemistry application benchmark (Hartree-Fock matrix construction using the Self-Consistent Field method). Our results show how we used GF to increase the scalability of the Hartree-Fock matrix build to up to 6,144 cores of an InfiniBand cluster. We also show how GF's multithreaded execution has performance comparable to the traditional process-based SPMD model.
Lim, Cherry; Wannapinij, Prapass; White, Lisa; Day, Nicholas P J; Cooper, Ben S; Peacock, Sharon J; Limmathurotsakul, Direk
2013-01-01
Estimates of the sensitivity and specificity of new diagnostic tests based on evaluation against a known gold standard are imprecise when the accuracy of the gold standard is imperfect. Bayesian latent class models (LCMs) can be helpful under these circumstances, but the necessary analysis requires expertise in computational programming. Here, we describe open-access web-based applications that allow non-experts to apply Bayesian LCMs to their own data sets via a user-friendly interface. Applications for Bayesian LCMs were constructed on a web server using the R and WinBUGS programs. The models provided (http://mice.tropmedres.ac) include two Bayesian LCMs: the two-test, two-population model (Hui and Walter model) and the three-test, one-population model (Walter and Irwig model). Both models are available with simplified and advanced interfaces. In the former, all settings for the Bayesian statistics are fixed as defaults. Users input their data set into a table provided on the webpage. Disease prevalence and the accuracy of the diagnostic tests are then estimated using the Bayesian LCM, and provided on the web page within a few minutes. With the advanced interfaces, experienced researchers can modify all settings in the models as needed. These settings include correlation among diagnostic test results and prior distributions for all unknown parameters. The web pages provide worked examples with both models using the original data sets presented by Hui and Walter in 1980, and by Walter and Irwig in 1988. We also illustrate the utility of the advanced interface using the Walter and Irwig model on a data set from a recent melioidosis study. The results obtained from the web-based applications were comparable to those published previously. The newly developed web-based applications are open-access and provide an important new resource for researchers worldwide to evaluate new diagnostic tests.
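At the core of the first model is the Hui-Walter likelihood; the Python sketch below (with invented cell counts) shows the conditional-independence version of that likelihood, which the web applications fit with Bayesian priors in WinBUGS.

```python
# Hui-Walter (two tests, two populations) log-likelihood under conditional
# independence; counts are ordered (++, +-, -+, --) and are made up here.
import numpy as np

def cell_probs(prev, se1, sp1, se2, sp2):
    d, nd = prev, 1.0 - prev
    return np.array([
        d * se1 * se2             + nd * (1 - sp1) * (1 - sp2),
        d * se1 * (1 - se2)       + nd * (1 - sp1) * sp2,
        d * (1 - se1) * se2       + nd * sp1 * (1 - sp2),
        d * (1 - se1) * (1 - se2) + nd * sp1 * sp2,
    ])

def log_lik(counts1, counts2, prev1, prev2, se1, sp1, se2, sp2):
    ll = 0.0
    for counts, prev in ((counts1, prev1), (counts2, prev2)):
        ll += np.sum(np.asarray(counts) * np.log(cell_probs(prev, se1, sp1, se2, sp2)))
    return ll

print(log_lik([35, 5, 7, 53], [12, 4, 6, 78],
              prev1=0.4, prev2=0.15, se1=0.9, sp1=0.95, se2=0.85, sp2=0.9))
```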
Unthank, Michael D.; Newson, Jeremy K.; Williamson, Tanja N.; Nelson, Hugh L.
2012-01-01
Flow- and load-duration curves were constructed from the model outputs of the U.S. Geological Survey's Water Availability Tool for Environmental Resources (WATER) application for streams in Kentucky. The WATER application was designed to access multiple geospatial datasets to generate more than 60 years of statistically based streamflow data for Kentucky. The WATER application enables a user to graphically select a site on a stream and generate an estimated hydrograph and flow-duration curve for the watershed upstream of that point. The flow-duration curves are constructed by calculating the exceedance probability of the modeled daily streamflows. User-defined water-quality criteria and (or) sampling results can be loaded into the WATER application to construct load-duration curves that are based on the modeled streamflow results. Estimates of flow and streamflow statistics were derived from TOPographically Based Hydrological MODEL (TOPMODEL) simulations in the WATER application. A modified TOPMODEL code, SDP-TOPMODEL (Sinkhole Drainage Process-TOPMODEL), was used to simulate daily mean discharges over the period of record for 5 karst and 5 non-karst watersheds in Kentucky in order to verify the calibrated model. A statistical evaluation of the model's verification simulations showed that calibration criteria, established by previous WATER application reports, were met, thus ensuring the model's ability to provide acceptably accurate estimates of discharge at gaged and ungaged sites throughout Kentucky. The flow-duration intervals are expressed as a percentage, with zero corresponding to the highest stream discharge in the streamflow record. Load-duration curves are constructed by applying the loading equation (Load = Flow * Water-quality criterion) at each flow interval.
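The duration-curve construction reduces to sorting and an exceedance-probability formula; the Python sketch below uses invented daily flows and an invented water-quality criterion (the 5.39 factor approximately converts cfs x mg/L to lb/day).

```python
import numpy as np

daily_flow = np.array([120., 85., 60., 42., 300., 15., 8., 95., 50., 22.])  # cfs
flows = np.sort(daily_flow)[::-1]                                  # descending
exceedance = np.arange(1, flows.size + 1) / (flows.size + 1) * 100  # % of time

criterion = 0.5                       # hypothetical water-quality limit, mg/L
load = flows * criterion * 5.39       # allowable load at each interval, lb/day

for pct, q, l in zip(exceedance, flows, load):
    print(f"{pct:5.1f}%  flow {q:6.1f} cfs  allowable load {l:7.1f} lb/day")
```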
Artificial intelligence based models for stream-flow forecasting: 2000-2015
NASA Astrophysics Data System (ADS)
Yaseen, Zaher Mundher; El-shafie, Ahmed; Jaafar, Othman; Afan, Haitham Abdulmohsin; Sayl, Khamis Naba
2015-11-01
The use of Artificial Intelligence (AI) has increased since the middle of the 20th century, as seen in its application to a wide range of engineering and science problems. The last two decades, for example, have seen a dramatic increase in the development and application of various types of AI approaches for stream-flow forecasting. Generally speaking, AI has exhibited significant progress in forecasting and modeling non-linear hydrological applications and in capturing the noise complexity in the dataset. This paper explores the state-of-the-art application of AI in stream-flow forecasting, focusing on the data-driven character of AI, the advantages of complementary models, as well as the literature and possible future applications in modeling and forecasting stream-flow. The review also identifies the major challenges and opportunities for prospective research, including a new scheme for modeling the inflow, a novel method for preprocessing time series frequency based on Fast Orthogonal Search (FOS) techniques, and Swarm Intelligence (SI) as an optimization approach.
Bishop, Christopher M
2013-02-13
Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications.
Johnson, Brent A
2009-10-01
We consider estimation and variable selection in the partial linear model for censored data. The partial linear model for censored data is a direct extension of the accelerated failure time model, the latter of which is a very important alternative to the proportional hazards model. We extend rank-based lasso-type estimators to a model that may contain nonlinear effects. Variable selection in such a partial linear model has direct application to high-dimensional survival analyses that attempt to adjust for clinical predictors. In the microarray setting, previous methods can adjust for other clinical predictors by assuming that clinical and gene expression data enter the model linearly in the same fashion. Here, we select important variables after adjusting for prognostic clinical variables, but the clinical effects are assumed to be nonlinear. Our estimator is based on stratification and can be extended naturally to account for multiple nonlinear effects. We illustrate the utility of our method through simulation studies and application to the Wisconsin prognostic breast cancer data set.
MATTS - A Step Towards Model-Based Testing
NASA Astrophysics Data System (ADS)
Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.
2016-08-01
In this paper we describe a model-based approach to testing of on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications? To solve the problems addressed above, the software engineering process has to be improved in at least two aspects: 1) software design and 2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time- and resource-consuming activity in the software development process. Generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can also be partially automated. The basic idea behind the presented study was to start from a formal model (e.g. state machines), generate abstract test cases, and convert them to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to on-board software. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.
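The abstract-test-generation step can be illustrated on a toy state machine; the states, events and outputs below are invented, not from the MATTS study.

```python
# Enumerate event sequences that the model accepts and record the expected
# outputs: each (sequence, expected outputs) pair is an abstract test case.
from itertools import product

transitions = {  # (state, event) -> (next state, expected output)
    ("IDLE", "cmd_start"): ("RUNNING", "ack_start"),
    ("RUNNING", "cmd_stop"): ("IDLE", "ack_stop"),
    ("RUNNING", "fault"): ("SAFE", "alarm"),
    ("SAFE", "cmd_reset"): ("IDLE", "ack_reset"),
}

def generate_tests(start="IDLE", depth=3):
    events = sorted({event for (_, event) in transitions})
    tests = []
    for sequence in product(events, repeat=depth):
        state, expected = start, []
        for event in sequence:
            step = transitions.get((state, event))
            if step is None:
                break                      # sequence leaves the model; discard
            state, output = step
            expected.append(output)
        else:
            tests.append((sequence, expected))
    return tests

for sequence, expected in generate_tests():
    print(" -> ".join(sequence), "| expect:", expected)
```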
Biologically-based pharmacokinetic models are being increasingly used in the risk assessment of environmental chemicals. These models are based on biological, mathematical, statistical and engineering principles. Their potential uses in risk assessment include extrapolation betwe...
Requirements analysis, domain knowledge, and design
NASA Technical Reports Server (NTRS)
Potts, Colin
1988-01-01
Two improvements to current requirements analysis practices are suggested: domain modeling, and the systematic application of analysis heuristics. Domain modeling is the representation of relevant application knowledge prior to requirements specification. Artificial intelligence techniques may eventually be applicable for domain modeling. In the short term, however, restricted domain modeling techniques, such as that in JSD, will still be of practical benefit. Analysis heuristics are standard patterns of reasoning about the requirements. They usually generate questions of clarification or issues relating to completeness. Analysis heuristics can be represented and therefore systematically applied in an issue-based framework. This is illustrated by an issue-based analysis of JSD's domain modeling and functional specification heuristics. They are discussed in the context of the preliminary design of simple embedded systems.
Timing fungicide application intervals based on airborne Erysiphe necator concentrations
USDA-ARS?s Scientific Manuscript database
Management of grape powdery mildew (Erysiphe necator) and other polycyclic diseases relies on numerous fungicide applications that follow calendar-based or model-based application intervals, both of which assume that inoculum is always present. Quantitative molecular assays have been previously develope...
Design Of Computer Based Test Using The Unified Modeling Language
NASA Astrophysics Data System (ADS)
Tedyyana, Agus; Danuri; Lidyawati
2017-12-01
Admission selection at Politeknik Negeri Bengkalis, through the interest and talent search (PMDK), the joint admission test for state polytechnics (SB-UMPN) and the independent route (UM-Polbeng), was conducted using a paper-based test (PBT). The paper-based test model has several weaknesses: it wastes paper, questions can leak to the public, and test results can be manipulated. This research aimed to create a Computer-Based Test (CBT) model using the Unified Modeling Language (UML), which consists of use case diagrams, activity diagrams and sequence diagrams. During the design of the application, attention was paid to protecting the test questions before they are shown, through encryption and decryption; the RSA cryptography algorithm was used for this process. The questions drawn from the question bank were then randomized using the Fisher-Yates shuffle method. The network architecture used in the Computer-Based Test application was a client-server model over a Local Area Network (LAN). The result of the design was the Computer-Based Test application for admission selection at Politeknik Negeri Bengkalis.
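The randomization step named above is a plain Fisher-Yates shuffle; a minimal Python sketch (with a hypothetical question bank) follows.

```python
import random

def fisher_yates(items, seed=None):
    """Return a uniformly shuffled copy of items (Fisher-Yates / Knuth shuffle)."""
    rng = random.Random(seed)
    items = list(items)
    for i in range(len(items) - 1, 0, -1):
        j = rng.randint(0, i)              # uniform index in [0, i]
        items[i], items[j] = items[j], items[i]
    return items

question_bank = [f"Q{n}" for n in range(1, 11)]   # hypothetical question IDs
print(fisher_yates(question_bank, seed=42))
```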
Li, Zhengkai; Spaulding, Malcolm; French McCay, Deborah; Crowley, Deborah; Payne, James R
2017-01-15
An oil droplet size model was developed for a variety of turbulent conditions based on non-dimensional analysis of disruptive and restorative forces; it is applicable to oil droplet formation under both surface breaking-wave and subsurface-blowout conditions, with or without dispersant application. This new model was calibrated and successfully validated with droplet size data obtained from controlled laboratory studies of dispersant-treated and non-treated oil in subsea dispersant tank tests and field surveys, including the Deep Spill experimental release and the Deepwater Horizon blowout oil spill. This model is an advancement over prior models, as it explicitly addresses the effects of the dispersed-phase viscosity resulting from dispersant application, and constrains the maximum stable droplet size based on the Rayleigh-Taylor instability that is invoked for a release from a large aperture.
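The disruptive/restorative force balance underlying such models can be illustrated with the classic Hinze (1955) estimate of the maximum stable droplet diameter; this is not the paper's calibrated model (which also handles viscosity and the Rayleigh-Taylor limit), and the input values below are illustrative only.

```python
# d_max = C * (sigma / rho)**(3/5) * eps**(-2/5), the Hinze scale for turbulent
# breakup, with C ~ 0.725. Dispersants lower the interfacial tension sigma,
# which shrinks the stable droplet size.
def hinze_dmax(sigma, rho, eps, c=0.725):
    """sigma: interfacial tension (N/m); rho: water density (kg/m3);
    eps: turbulent dissipation rate (W/kg)."""
    return c * (sigma / rho) ** 0.6 * eps ** -0.4

untreated = hinze_dmax(sigma=0.025, rho=1025.0, eps=1.0)
treated = hinze_dmax(sigma=0.0005, rho=1025.0, eps=1.0)
print(f"d_max untreated ~ {untreated * 1e6:.0f} um, treated ~ {treated * 1e6:.0f} um")
```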
Data Clustering and Evolving Fuzzy Decision Tree for Data Base Classification Problems
NASA Astrophysics Data System (ADS)
Chang, Pei-Chann; Fan, Chin-Yuan; Wang, Yen-Wen
Database classification suffers from two well-known difficulties, i.e., the high dimensionality and non-stationary variations within large historic data. This paper presents a hybrid classification model that integrates a case-based reasoning technique, a Fuzzy Decision Tree (FDT), and Genetic Algorithms (GA) to construct a decision-making system for data classification in various database applications. The model is chiefly based on the idea that the historic database can be transformed into a smaller case base together with a group of fuzzy decision rules. As a result, the model can respond more accurately to the data being classified, through induction over these smaller case-based fuzzy decision trees. Hit rate is applied as a performance measure, and the effectiveness of our proposed model is demonstrated by experimental comparison with other approaches on different database classification applications. The average hit rate of our proposed model is the highest among all.
Application of particle and lattice codes to simulation of hydraulic fracturing
NASA Astrophysics Data System (ADS)
Damjanac, Branko; Detournay, Christine; Cundall, Peter A.
2016-04-01
With the development of unconventional oil and gas reservoirs over the last 15 years, the understanding and capability to model the propagation of hydraulic fractures in inhomogeneous and naturally fractured reservoirs has become very important for the petroleum industry (but also for some other industries, like mining and geothermal). Particle-based models provide advantages over other models and solutions for the simulation of fracturing of rock masses that cannot be assumed to be continuous and homogeneous. It has been demonstrated (Potyondy and Cundall, Int J Rock Mech Min Sci Geomech Abstr 41:1329-1364, 2004) that particle models based on a simple force criterion for fracture propagation match theoretical solutions and scale effects derived using the principles of linear elastic fracture mechanics (LEFM). The challenge is how to apply these models effectively (i.e., with acceptable model sizes and computer run times) to the coupled hydro-mechanical problems of relevant time and length scales for practical field applications (i.e., reservoir scale and hours of injection time). A formulation of a fully coupled hydro-mechanical particle-based model and its application to the simulation of hydraulic treatment of unconventional reservoirs are presented. Model validation by comparison with available analytical asymptotic solutions (penny-shaped crack) and some examples of field application (e.g., interaction with DFN) are also included.
nmsBuilder: Freeware to create subject-specific musculoskeletal models for OpenSim.
Valente, Giordano; Crimi, Gianluigi; Vanella, Nicola; Schileo, Enrico; Taddei, Fulvia
2017-12-01
Musculoskeletal modeling and simulations of movement have been increasingly used in orthopedic and neurological scenarios, with increased attention to subject-specific applications. In general, musculoskeletal modeling applications have been facilitated by the development of dedicated software tools; however, subject-specific studies have also been limited by time-consuming modeling workflows and the highly skilled expertise required. In addition, no reference tools exist to standardize the process of musculoskeletal model creation and make it more efficient. Here we present a freely available software application, nmsBuilder 2.0, to create musculoskeletal models in the file format of OpenSim, a widely used open-source platform for musculoskeletal modeling and simulation. nmsBuilder 2.0 is the result of a major refactoring of a previous implementation that moved a first step toward an efficient workflow for subject-specific model creation. nmsBuilder includes a graphical user interface that provides access to all functionalities, based on a framework for computer-aided medicine written in C++. The operations implemented can be used in a workflow to create OpenSim musculoskeletal models from 3D surfaces. A first step includes data processing to create the supporting objects necessary to create models, e.g. surfaces, anatomical landmarks, reference systems; a second step includes the creation of OpenSim objects, e.g. bodies, joints, muscles, and the corresponding model. We present a case study using nmsBuilder 2.0: the creation of an MRI-based musculoskeletal model of the lower limb. The model included four rigid bodies, five degrees of freedom and 43 musculotendon actuators, and was created from 3D surfaces of the segmented images of a healthy subject through the modeling workflow implemented in the software application. We have presented nmsBuilder 2.0 for the creation of musculoskeletal OpenSim models from image-based data, and made it freely available via nmsbuilder.org. This application provides an efficient workflow for model creation and helps standardize the process. We hope this will help promote personalized applications in musculoskeletal biomechanics, including studies with larger sample sizes, and it might also represent a basis for future developments for specific applications.
Design and implementation of space physics multi-model application integration based on web
NASA Astrophysics Data System (ADS)
Jiang, Wenping; Zou, Ziming
With the development of research on the space environment and space science, building a networked online computing environment for space weather, space environment and space physics models for the Chinese scientific community has become more and more important in recent years. Currently, there are two software modes for a space physics multi-model application integrated system (SPMAIS): C/S and B/S. The C/S mode, which is traditional and stand-alone, demands that a team or workshop from many disciplines and specialties build its own multi-model application integrated system, and requires the client to be deployed in different physical regions when users visit the integrated system. This requirement brings two shortcomings: it reduces the efficiency of researchers who use the models to compute, and it makes accessing the data inconvenient. Therefore, it is necessary to create a shared network environment for resource access that helps users quickly reach the computing resources of space physics models from a terminal, for conducting space science research and forecasting the space environment. The SPMAIS adopts the B/S mode and provides high-performance, first-principles computational models of the space environment, which are used to predict "space weather", to understand space mission data and to further our understanding of the solar system. The main goal of the SPMAIS is to provide an easy and convenient user-driven online model operating environment. Up to now, the SPMAIS contains dozens of space environment models, including the international AP8/AE8, IGRF and T96 models, as well as a solar proton prediction model, a geomagnetic transmission model, and other models developed by Chinese scientists. Another function of the SPMAIS is to integrate space observation data sets, which offer input data for online high-speed model computing. In this paper, the service-oriented architecture (SOA) concept, which divides a system into independent modules according to different business needs, is applied to solve the problem of the physical separation between multiple models. The classic MVC (Model-View-Controller) software design pattern is used to build the architecture of the SPMAIS. JSP+Servlet+JavaBean technology is used to integrate the web application programs of the space physics multi-model system; it resolves the problem of multiple users requesting the same model computing job and effectively balances the computing tasks of each server. In addition, we also completed the following tasks: establishing a standard graphical user interface based on Java Applet application programs; designing the interface between model computing and the visualization of model computing results; realizing three-dimensional network visualization without plug-ins; using Java3D technology to achieve interaction with three-dimensional network scenes; and improving the ability to interact with web pages and dynamic execution capabilities, including rendering three-dimensional graphics and controlling fonts and colors. Through the design and implementation of the web-based SPMAIS, we provide an online computing and application runtime environment for space physics multi-models. Practical application has shown that researchers benefit from our system in space physics research and engineering applications.
NASA Astrophysics Data System (ADS)
Paiva Fonseca, Gabriel; Landry, Guillaume; White, Shane; D'Amours, Michel; Yoriyaz, Hélio; Beaulieu, Luc; Reniers, Brigitte; Verhaegen, Frank
2014-10-01
Accounting for brachytherapy applicator attenuation is part of the recommendations of the recent report of AAPM Task Group 186. To do so, model-based dose calculation algorithms require accurate modelling of the applicator geometry. This can be non-trivial in the case of irregularly shaped applicators, such as the Fletcher Williamson gynaecological applicator, or balloon applicators with possibly irregular shapes employed in accelerated partial breast irradiation (APBI) performed using electronic brachytherapy sources (EBS). While many of these applicators can be modelled using constructive solid geometry (CSG), the latter may be difficult and time-consuming. Alternatively, these complex geometries can be modelled using tessellated geometries such as tetrahedral meshes (mesh geometries, MG). Recent versions of the Monte Carlo (MC) codes Geant4 and MCNP6 allow for the use of MG. The goal of this work was to model a series of applicators relevant to brachytherapy using MG. Applicators designed for 192Ir sources and 50 kV EBS were studied: a shielded vaginal applicator, a shielded Fletcher Williamson applicator and an APBI balloon applicator. All applicators were modelled in Geant4 and MCNP6 using MG and CSG for dose calculations. CSG-derived dose distributions were considered as reference and used to validate the MG models by comparing dose distribution ratios. In general, agreement within 1% was observed for all applicators between MG and CSG, and between codes, when considering volumes inside the 25% isodose surface. When compared to CSG, MG required longer computation times, by a factor of at least 2, for MC simulations using the same code. MCNP6 calculation times were more than ten times shorter than Geant4 in some cases. In conclusion, we presented methods allowing for high-fidelity modelling with results equivalent to CSG. To the best of our knowledge, MG offers the most accurate representation of an irregular APBI balloon applicator.
USDA-ARS?s Scientific Manuscript database
Operational application of a remote sensing-based two-source energy balance model (TSEB) to estimate evapotranspiration (ET) and its components, evaporation (E) and transpiration (T), at a range of space and time scales is very useful for managing water resources in arid and semiarid watersheds. The TSE...
An e-Portfolio-Based Model for the Application and Sharing of College English ESP MOOCs
ERIC Educational Resources Information Center
Chen, Jinshi
2017-01-01
The information-based knowledge sharing of MOOCs not only promotes changes in teaching concepts and the reform of teaching methodology, but also provides a new opportunity for teaching resource integration and sharing between different universities. The present study has constructed an e-Portfolio-based model for the application and sharing…
Polese, Pierluigi; Torre, Manuela Del; Stecchini, Mara Lucia
2018-03-31
The use of predictive modelling tools, which mainly describe the response of microorganisms to a particular set of environmental conditions, may contribute to a better understanding of microbial behaviour in foods. In this paper, a tertiary model, in the form of a readily available and user-friendly web-based application, Praedicere Possumus (PP), is presented with research examples from our laboratories. Through the PP application, users have access to different modules, which apply a set of published models considered reliable for determining the compliance of a food product with EU safety criteria and for optimising processing through the identification of critical control points. The application pivots around a growth/no-growth boundary model, coupled with a growth model, and includes thermal and non-thermal inactivation models. Integrated functionalities, such as the fractional contribution of each inhibitory factor to the growth probability (f) and the time evolution of the growth probability (Pt), have also been included. The PP application is expected to assist the food industry and food safety authorities in their common commitment towards the improvement of food safety.
Information Interaction Study for DER and DMS Interoperability
NASA Astrophysics Data System (ADS)
Liu, Haitao; Lu, Yiming; Lv, Guangxian; Liu, Peng; Chen, Yu; Zhang, Xinhui
The Common Information Model (CIM) is an abstract data model that can be used to represent the major objects in Distribution Management System (DMS) applications. Because the CIM does not model Distributed Energy Resources (DERs), it cannot meet the requirements of DER operation and management for DMS advanced applications. Modeling of DERs was studied from a system point of view, and the article initially proposes an extended CIM information model. By analysing the basic structure of the message interaction between DMS and DER, a bidirectional message mapping method based on data exchange is proposed.
2008-07-01
generation of process partitioning, thread pipelining becomes possible. In this paper we briefly summarize the requirements and trends for FADEC-based ... FADEC environment, presenting a hypothetical realization of an example application. Finally we discuss the application of Time-Triggered ... based control applications of the future. Subject terms: gas turbine, FADEC, multi-core processing technology, distributed control.
NASA Astrophysics Data System (ADS)
Jayaweera, H. M. P. C.; Muhtaroğlu, Ali
2016-11-01
A novel model-based methodology is presented to determine optimal device parameters for a fully integrated ultra-low-voltage DC-DC converter for energy harvesting applications. The proposed model makes it feasible to determine the most efficient number of charge pump stages that fulfils the voltage requirement of the energy harvesting application. The power consumption model of the proposed DC-DC converter enables the analytical derivation of the charge pump efficiency when utilized together with the known behavior of the LC tank oscillator under resonant conditions and the voltage step-up characteristics of the cross-coupled charge pump topology. The model has been verified using a circuit simulator. The system optimized through the established model achieves more than 40% maximum efficiency, yielding a 0.45 V output with a single stage, a 0.75 V output with two stages, and 0.9 V with three stages for 2.5 kΩ, 3.5 kΩ and 5 kΩ loads, respectively, using a 0.2 V input.
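The stage-count trade-off the model resolves can be caricatured in a few lines of Python; the per-stage loss factor below is invented (the paper derives efficiency from the oscillator and charge-pump physics), so the numbers are illustrative only.

```python
# Ideal cross-coupled charge pump: each stage adds roughly Vin to the output;
# a crude multiplicative loss factor stands in for the real loss model.
def output_voltage(v_in, stages, stage_eff=0.95):
    v = v_in
    for _ in range(stages):
        v = (v + v_in) * stage_eff
    return v

v_in, v_target = 0.2, 0.75
for n in range(1, 6):
    v = output_voltage(v_in, n)
    mark = "  <- meets target" if v >= v_target else ""
    print(f"{n} stage(s): Vout ~ {v:.2f} V{mark}")
```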
Karimi, Davood; Ward, Rabab K
2016-10-01
Image models are central to all image processing tasks. The great advancements in digital image processing would not have been made possible without powerful models which, themselves, have evolved over time. In the past decade, "patch-based" models have emerged as one of the most effective models for natural images. Patch-based methods have outperformed other competing methods in many image processing tasks. These developments have come at a time when greater availability of powerful computational resources and growing concerns over the health risks of ionizing radiation encourage research on image processing algorithms for computed tomography (CT). The goal of this paper is to explain the principles of patch-based methods and to review some of their recent applications in CT. We first review the central concepts in patch-based image processing and explain some of the state-of-the-art algorithms, with a focus on aspects that are more relevant to CT. Then, we review some of the recent applications of patch-based methods in CT. Patch-based methods have already transformed the field of image processing, leading to state-of-the-art results in many applications. More recently, several studies have proposed patch-based algorithms for various image processing tasks in CT, from denoising and restoration to iterative reconstruction. Although these studies have reported good results, the true potential of patch-based methods for CT has not yet been fully appreciated. Patch-based methods can play a central role in image reconstruction and processing for CT. They have the potential to lead to substantial improvements in the current state of the art.
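The patch-based principle is easy to demonstrate with a toy non-local means filter: each pixel is replaced by a weighted average of pixels whose surrounding patches look similar. The sketch below is a didactic miniature, not a CT-grade algorithm; parameters and the test image are invented.

```python
import numpy as np

def toy_nlm(img, patch=3, search=3, h=1.0):
    """Toy non-local means: average centers of similar patches."""
    pad = patch // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.zeros_like(img)
    rows, cols = img.shape
    for i in range(rows):
        for j in range(cols):
            ref = padded[i:i + patch, j:j + patch]
            weights = acc = 0.0
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ii = min(max(i + di, 0), rows - 1)
                    jj = min(max(j + dj, 0), cols - 1)
                    cand = padded[ii:ii + patch, jj:jj + patch]
                    w = np.exp(-np.sum((ref - cand) ** 2) / h ** 2)
                    weights += w
                    acc += w * img[ii, jj]
            out[i, j] = acc / weights
    return out

clean = np.eye(16)
noisy = clean + 0.2 * np.random.default_rng(0).normal(size=clean.shape)
denoised = toy_nlm(noisy)
print("MSE noisy:   ", float(np.mean((noisy - clean) ** 2)))
print("MSE denoised:", float(np.mean((denoised - clean) ** 2)))
```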
A novel medical image data-based multi-physics simulation platform for computational life sciences.
Neufeld, Esra; Szczerba, Dominik; Chavannes, Nicolas; Kuster, Niels
2013-04-06
Simulating and modelling complex biological systems in computational life sciences requires specialized software tools that can perform medical image data-based modelling, jointly visualize the data and computational results, and handle large, complex, realistic and often noisy anatomical models. The required novel solvers must provide the power to model the physics, biology and physiology of living tissue within the full complexity of the human anatomy (e.g. neuronal activity, perfusion and ultrasound propagation). A multi-physics simulation platform satisfying these requirements has been developed for applications including device development and optimization, safety assessment, basic research, and treatment planning. This simulation platform consists of detailed, parametrized anatomical models, a segmentation and meshing tool, a wide range of solvers and optimizers, a framework for the rapid development of specialized and parallelized finite element method solvers, a visualization toolkit-based visualization engine, a Python scripting interface for customized applications, a coupling framework, and more. Core components are cross-platform compatible and use open formats. Several examples of applications are presented: hyperthermia cancer treatment planning, tumour growth modelling, evaluating the magneto-haemodynamic effect as a biomarker and physics-based morphing of anatomical models.
Technical Note: Approximate Bayesian parameterization of a process-based tropical forest model
NASA Astrophysics Data System (ADS)
Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.
2014-02-01
Inverse parameter estimation of process-based models is a long-standing problem in many scientific disciplines. A key question for inverse parameter estimation is how to define the metric that quantifies how well model predictions fit to the data. This metric can be expressed by general cost or objective functions, but statistical inversion methods require a particular metric, the probability of observing the data given the model parameters, known as the likelihood. For technical and computational reasons, likelihoods for process-based stochastic models are usually based on general assumptions about variability in the observed data, and not on the stochasticity generated by the model. Only in recent years have new methods become available that allow the generation of likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional Markov chain Monte Carlo (MCMC) sampler, performs well in retrieving known parameter values from virtual inventory data generated by the forest model. We analyze the results of the parameter estimation, examine its sensitivity to the choice and aggregation of model outputs and observed data (summary statistics), and demonstrate the application of this method by fitting the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss how this approach differs from approximate Bayesian computation (ABC), another method commonly used to generate simulation-based likelihood approximations. Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation, can be successfully applied to process-based models of high complexity. The methodology is particularly suitable for heterogeneous and complex data structures and can easily be adjusted to other model types, including most stochastic population and individual-based models. Our study therefore provides a blueprint for a fairly general approach to parameter estimation of stochastic process-based models.
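The core loop of such an approach (fit a parametric density to repeated stochastic simulations at each proposed parameter, then use it as the likelihood inside MCMC) fits in a short sketch; the simulator below is a stand-in, not FORMIND, and the prior is left flat.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_summary(theta, n_rep=50):
    """Stand-in stochastic model returning n_rep simulated summary statistics."""
    return rng.normal(loc=theta ** 2, scale=1.0 + 0.1 * abs(theta), size=n_rep)

def approx_loglik(theta, observed):
    sims = simulate_summary(theta)
    mu, sd = sims.mean(), sims.std(ddof=1)  # parametric (normal) approximation
    return -0.5 * ((observed - mu) / sd) ** 2 - np.log(sd)

observed = 4.1                    # "field" value of the summary statistic
theta, ll, chain = 1.0, -np.inf, []
for _ in range(2000):             # Metropolis sampler with a flat prior
    proposal = theta + rng.normal(0.0, 0.2)
    ll_prop = approx_loglik(proposal, observed)
    if np.log(rng.uniform()) < ll_prop - ll:
        theta, ll = proposal, ll_prop
    chain.append(theta)
print(f"posterior mean of theta ~ {np.mean(chain[500:]):.2f}")  # near sqrt(4.1)
```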
Compact modeling of CRS devices based on ECM cells for memory, logic and neuromorphic applications.
Linn, E; Menzel, S; Ferch, S; Waser, R
2013-09-27
Dynamic physics-based models of resistive switching devices are of great interest for the realization of complex circuits required for memory, logic and neuromorphic applications. Here, we apply such a model of an electrochemical metallization (ECM) cell to complementary resistive switches (CRSs), which are favorable devices to realize ultra-dense passive crossbar arrays. Since a CRS consists of two resistive switching devices, it is straightforward to apply the dynamic ECM model for CRS simulation with MATLAB and SPICE, enabling study of the device behavior in terms of sweep rate and series resistance variations. Furthermore, typical memory access operations as well as basic implication logic operations can be analyzed, revealing requirements for proper spike and level read operations. This basic understanding facilitates applications of massively parallel computing paradigms required for neuromorphic applications.
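To make the CRS idea concrete, here is a heavily simplified, hypothetical behavioural sketch in Python rather than the authors' MATLAB/SPICE implementation of the ECM compact model: two threshold-driven bipolar elements in anti-serial connection share the applied voltage through a series divider, so a positive half-sweep first switches the off element on and then resets the other. All constants are illustrative.

```python
# Behavioural sketch of a complementary resistive switch (CRS) built from two
# bipolar memristive elements; not the published ECM compact-model equations.
import numpy as np

R_ON, R_OFF, V_T, TAU = 1e3, 1e6, 0.4, 1e-3   # ohms, ohms, volts, seconds

def step(x, v, dt):
    """One element: SET toward x=1 for v > V_T, RESET toward x=0 for v < -V_T."""
    if v > V_T:
        x += dt / TAU * (1 - x)
    elif v < -V_T:
        x -= dt / TAU * x
    return min(max(x, 0.0), 1.0)

xa, xb = 1.0, 0.0                  # CRS state '0': element A on, element B off
dt = 1e-6
for t in np.arange(0, 0.01, dt):   # positive half of a 50 Hz sinusoidal sweep
    v = 1.5 * np.sin(2 * np.pi * 50 * t)
    ra = R_ON * xa + R_OFF * (1 - xa)
    rb = R_ON * xb + R_OFF * (1 - xb)
    va, vb = v * ra / (ra + rb), v * rb / (ra + rb)  # series voltage divider
    xa = step(xa, -va, dt)         # element A is the anti-serial (flipped) one
    xb = step(xb, vb, dt)
print(f"states after positive half-sweep: xa={xa:.2f}, xb={xb:.2f}")
```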
Population balance modeling: current status and future prospects.
Ramkrishna, Doraiswami; Singh, Meenesh R
2014-01-01
Population balance modeling is undergoing phenomenal growth in its applications, and this growth is accompanied by multifarious reviews. This review aims to fortify the model's fundamental base, as well as point to a variety of new applications, including modeling of crystal morphology, cell growth and differentiation, gene regulatory processes, and transfer of drug resistance. This is accomplished by presenting the many faces of population balance equations that arise in the foregoing applications.
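For orientation, a generic one-dimensional form of a population balance equation is shown below in LaTeX; this is a standard textbook form, not the review's own notation.

```latex
% Number density f(x,t) over an internal coordinate x (e.g. crystal size),
% growth rate G(x), and a net birth/death term h(x,t) from breakage or
% aggregation:
\frac{\partial f(x,t)}{\partial t}
  + \frac{\partial}{\partial x}\!\left[ G(x)\, f(x,t) \right]
  = h(x,t)
```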
de Carvalho, Elias César Araujo; Batilana, Adelia Portero; Simkins, Julie; Martins, Henrique; Shah, Jatin; Rajgor, Dimple; Shah, Anand; Rockart, Scott; Pietrobon, Ricardo
2010-02-19
Sharing of epidemiological and clinical data sets among researchers is poor at best, to the detriment of science and the community at large. The purpose of this paper is therefore to (1) describe a novel Web application designed to share information on study data sets focusing on epidemiological clinical research in a collaborative environment and (2) create a policy model placing this collaborative environment into the current scientific social context. The Database of Databases application was developed based on feedback from epidemiologists and clinical researchers requiring a Web-based platform that would allow for sharing of information about epidemiological and clinical study data sets in a collaborative environment. This platform should ensure that researchers can modify the information. Model-based predictions of the number of publications and the funding resulting from combinations of different policy implementation strategies (for metadata and data sharing) were generated using System Dynamics modeling. The application allows researchers to easily upload information about clinical study data sets, which is searchable and modifiable by other users in a wiki environment. All modifications are filtered by the database principal investigator in order to maintain quality control. The application has been extensively tested and currently contains 130 clinical study data sets from the United States, Australia, China and Singapore. Model results indicated that any policy implementation would be better than the current strategy, that metadata sharing is better than data sharing, and that combined policies achieve the best results in terms of publications. Based on our empirical observations and the resulting model, the social network environment surrounding the application can help epidemiologists and clinical researchers contribute and search for metadata in a collaborative environment, thus potentially facilitating collaboration efforts among research communities distributed around the globe.
Applicability of central auditory processing disorder models.
Jutras, Benoît; Loubert, Monique; Dupuis, Jean-Luc; Marcoux, Caroline; Dumont, Véronique; Baril, Michèle
2007-12-01
Central auditory processing disorder ([C]APD) is a relatively recent construct that has given rise to 2 theoretical models: the Buffalo Model and the Bellis/Ferre Model. These models describe 4 and 5 (C)APD categories, respectively. The present study examines the applicability of these models to clinical practice. Neither of these models was based on data from peer-reviewed sources. This is a retrospective study that reviewed 178 records of children diagnosed with (C)APD, of which 48 were retained for analysis. More than 80% of the children could be classified into one of the Buffalo Model categories, while more than 90% remained unclassified under the Bellis/Ferre Model. This discrepancy can be explained by the fact that the classification of the Buffalo Model is based primarily on a single central auditory test (Staggered Spondaic Word), whereas the Bellis/Ferre Model classification uses a combination of auditory test results. The 2 models provide a conceptual framework for (C)APD, but they must be further refined to be fully applicable in clinical settings.
NASA Technical Reports Server (NTRS)
Vanlunteren, A.; Stassen, H. G.
1973-01-01
Parameter estimation techniques are discussed with emphasis on unbiased estimates in the presence of noise. A distinction between open and closed loop systems is made. A method is given based on the application of external forcing functions consisting of a sum of sinusoids; this method is thus based on the estimation of Fourier coefficients and is applicable to models with poles and zeros in open and closed loop systems.
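A minimal Python sketch of the sum-of-sinusoids idea follows; the first-order test plant and all constants are assumptions for illustration. The system is excited at several known frequencies at once, and the Fourier coefficients of the response at those frequencies estimate the frequency response, from which model parameters (poles and zeros) can be identified.

```python
# Sketch: excite a plant with a sum of sinusoids and estimate Fourier
# coefficients of the response at the excitation frequencies.
import numpy as np
from scipy.signal import TransferFunction, lsim

fs, T = 100.0, 20.0
t = np.arange(0, T, 1 / fs)
freqs = np.array([0.5, 1.3, 2.7])                   # excitation frequencies (Hz)
u = sum(np.sin(2 * np.pi * f * t) for f in freqs)   # sum-of-sinusoids input

# Stand-in plant: first-order lag 1/(0.5 s + 1), plus measurement noise.
_, y, _ = lsim(TransferFunction([1.0], [0.5, 1.0]), U=u, T=t)
y += np.random.default_rng(1).normal(scale=0.05, size=t.size)

for f in freqs:
    # Since each input sinusoid has unit amplitude, the coefficient magnitude
    # approximates the plant gain at f (T spans whole periods of each f).
    c = 2 / t.size * np.sum(y * np.exp(-2j * np.pi * f * t))
    print(f"{f} Hz: estimated gain = {abs(c):.3f}")
```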
The comprehensive individual field-measurements on non-dietary exposure collected in the Children's-Post-Pesticide-Application-Exposure-Study (CPPAES) were used within MENTOR/SHEDS-Pesticides, a physically based stochastic human exposure and dose model. In this application, howev...
Lin, Z; Gehring, R; Mochel, J P; Lavé, T; Riviere, J E
2016-10-01
This review provides a tutorial for individuals interested in quantitative veterinary pharmacology and toxicology and offers a basis for establishing guidelines for physiologically based pharmacokinetic (PBPK) model development and application in veterinary medicine. This is important as the application of PBPK modeling in veterinary medicine has evolved over the past two decades. PBPK models can be used to predict drug tissue residues and withdrawal times in food-producing animals, to estimate chemical concentrations at the site of action and target organ toxicity to aid risk assessment of environmental contaminants and/or drugs in both domestic animals and wildlife, as well as to help design therapeutic regimens for veterinary drugs. This review provides a comprehensive summary of PBPK modeling principles, model development methodology, and the current applications in veterinary medicine, with a focus on predictions of drug tissue residues and withdrawal times in food-producing animals. The advantages and disadvantages of PBPK modeling compared to other pharmacokinetic modeling approaches (i.e., classical compartmental/noncompartmental modeling, nonlinear mixed-effects modeling, and interspecies allometric scaling) are further presented. The review finally discusses contemporary challenges and our perspectives on model documentation, evaluation criteria, quality improvement, and offers solutions to increase model acceptance and applications in veterinary pharmacology and toxicology. © 2016 John Wiley & Sons Ltd.
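As a minimal illustration of the flow-limited PBPK structure such models share, the Python sketch below integrates one tissue compartment plus central plasma; all parameter values are invented and not drawn from any veterinary application.

```python
# Flow-limited PBPK sketch: one tissue compartment exchanging with plasma,
# with linear clearance from plasma. Parameters are purely illustrative.
import numpy as np
from scipy.integrate import solve_ivp

Q_t, V_p, V_t, P_t, CL = 20.0, 5.0, 10.0, 3.0, 2.0  # L/h, L, L, partition, L/h

def pbpk(t, y):
    c_p, c_t = y                       # plasma and tissue concentrations
    dc_p = (Q_t * (c_t / P_t - c_p) - CL * c_p) / V_p
    dc_t = Q_t * (c_p - c_t / P_t) / V_t
    return [dc_p, dc_t]

sol = solve_ivp(pbpk, (0, 24), [10.0, 0.0])   # 10 units bolus into plasma
print("plasma concentration at 24 h:", sol.y[0, -1])
```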
Control algorithms and applications of the wavefront sensorless adaptive optics
NASA Astrophysics Data System (ADS)
Ma, Liang; Wang, Bin; Zhou, Yuanshen; Yang, Huizhen
2017-10-01
Compared with the conventional adaptive optics (AO) system, the wavefront sensorless (WFSless) AO system does not need to measure and reconstruct the wavefront. It is simpler than conventional AO in system architecture and can be applied under complex conditions. Based on an analysis of the principle and system model of the WFSless AO system, wavefront correction methods for WFSless AO systems were divided into two categories: model-free and model-based control algorithms. A WFSless AO system based on model-free control algorithms commonly treats the performance metric as a function of the control parameters and then uses a control algorithm to improve that metric. The model-based control algorithms include modal control algorithms, nonlinear control algorithms and control algorithms based on geometrical optics. After a brief description of the above typical control algorithms, hybrid methods combining model-free with model-based control algorithms are generalized. Additionally, the characteristics of the various control algorithms are compared and analyzed. We also discuss the extensive applications of WFSless AO systems in free space optical communication (FSO), retinal imaging in the human eye, confocal microscopy, coherent beam combination (CBC) techniques and extended objects.
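The model-free branch can be illustrated with stochastic parallel gradient descent (SPGD), one commonly cited WFSless control algorithm. The Python sketch below is a toy that assumes a quadratic sharpness metric in place of a real focal-plane measurement.

```python
# SPGD sketch: perturb deformable-mirror control values randomly, probe the
# metric on both sides, and step along perturbations that improve it.
import numpy as np

rng = np.random.default_rng(2)
aberration = rng.normal(size=12)        # unknown modal wavefront error

def metric(u):
    """Toy sharpness metric: maximal (zero) when u cancels the aberration."""
    return -np.sum((u + aberration) ** 2)

u, gain, amp = np.zeros(12), 0.5, 0.1
for _ in range(1000):
    delta = amp * rng.choice([-1.0, 1.0], size=12)  # Bernoulli perturbation
    dJ = metric(u + delta) - metric(u - delta)      # two-sided metric probe
    u += gain * dJ * delta                          # SPGD update
print("initial metric:", metric(np.zeros(12)))
print("final metric:  ", metric(u))
```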
Code of Federal Regulations, 2014 CFR
2014-01-01
... using industry standard model (need to add name and location of this open source model) to show... the project is based and applicant's financial model presenting project pro forma statements for the... Standards of Professional Appraisal Practice,” promulgated by the Appraisal Standards Board of the Appraisal...
Code of Federal Regulations, 2012 CFR
2012-01-01
... using industry standard model (need to add name and location of this open source model) to show... the project is based and applicant's financial model presenting project pro forma statements for the... Standards of Professional Appraisal Practice,” promulgated by the Appraisal Standards Board of the Appraisal...
Code of Federal Regulations, 2013 CFR
2013-01-01
... using industry standard model (need to add name and location of this open source model) to show... the project is based and applicant's financial model presenting project pro forma statements for the... Standards of Professional Appraisal Practice,” promulgated by the Appraisal Standards Board of the Appraisal...
NASA Astrophysics Data System (ADS)
Al-garni, Abdullah M.
Urban information systems are economic resources that can benefit decision makers in the planning, development, and management of urban projects and resources. In this research, a conceptual model-based prototype Urban Geographic Information System (UGIS) is developed. The base maps used in developing the system and acquiring visual attributes are obtained from aerial photographs. The system is a multi-purpose parcel-based one that can serve many urban applications such as public utilities, health centres, schools, population estimation, road engineering and maintenance, and many others. A modern region in the capital city of Saudi Arabia is used for the study. The developed model is operational for one urban application (population estimation) and is tested for that particular application. The results showed that the system has a satisfactory accuracy and that it may well be promising for other similar urban applications in countries with similar demographic and social characteristics.
NASA Technical Reports Server (NTRS)
Campbell, William J.; Short, Nicholas M., Jr.; Roelofs, Larry H.; Dorfman, Erik
1991-01-01
A methodology for optimizing organization of data obtained by NASA earth and space missions is discussed. The methodology uses a concept based on semantic data modeling techniques implemented in a hierarchical storage model. The modeling is used to organize objects in mass storage devices, relational database systems, and object-oriented databases. The semantic data modeling at the metadata record level is examined, including the simulation of a knowledge base and semantic metadata storage issues. The semantic data model hierarchy and its application for efficient data storage is addressed, as is the mapping of the application structure to the mass storage.
Usefulness of Neuro-Fuzzy Models' Application for Tobacco Control
NASA Astrophysics Data System (ADS)
Petrovic-Lazarevic, Sonja; Zhang, Jian Ying
2007-12-01
The paper presents neuro-fuzzy model applications appropriate for tobacco control: the fuzzy control model, the Adaptive Network Based Fuzzy Inference System, Evolving Fuzzy Neural Network models, and EVOlving POLicies. We further propose the use of Fuzzy Causal Networks to help tobacco control decision makers develop policies and measure their impact on social regulation.
Identifying influences on model uncertainty: an application using a forest carbon budget model
James E. Smith; Linda S. Heath
2001-01-01
Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...
Applications of agent-based modeling to nutrient movement in Lake Michigan
As part of an ongoing project aiming to provide useful information for nearshore management (harmful algal blooms, nutrient loading), we explore the value of agent-based models in Lake Michigan. Agent-based models follow many individual “agents” moving through a simul...
Models and Frameworks: A Synergistic Association for Developing Component-Based Applications
Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara
2014-01-01
The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858
Business model framework applications in health care: A systematic review.
Fredriksson, Jens Jacob; Mazzocato, Pamela; Muhammed, Rafiq; Savage, Carl
2017-11-01
It has proven to be a challenge for health care organizations to achieve the Triple Aim. In the business literature, business model frameworks have been used to understand how organizations are aligned to achieve their goals. We conducted a systematic literature review with an explanatory synthesis approach to understand how business model frameworks have been applied in health care. We found a large increase in applications of business model frameworks during the last decade. E-health was the most common context of application. We identified six applications of business model frameworks: business model description, financial assessment, classification based on pre-defined typologies, business model analysis, development, and evaluation. Our synthesis suggests that the choice of business model framework and constituent elements should be informed by the intent and context of application. We see a need for harmonization in the choice of elements in order to increase generalizability, simplify application, and help organizations realize the Triple Aim.
Review of Development Survey of Phase Change Material Models in Building Applications
Akeiber, Hussein J.; Wahid, Mazlan A.; Hussen, Hasanen M.; Mohammad, Abdulrahman Th.
2014-01-01
The application of phase change materials (PCMs) in green buildings has been increasing rapidly. PCM applications in green buildings include several development models. This paper briefly surveys the recent research and development activities of PCM technology in building applications. Firstly, a basic description of phase change materials and their principles is provided; the classification and applications of PCMs are also included. Secondly, PCM models in buildings are reviewed and discussed according to the wall, roof, floor, and cooling systems. Finally, conclusions are presented based on the collected data. PMID:25313367
ERIC Educational Resources Information Center
Yadiannur, Mitra; Supahar
2017-01-01
This research aims to determine the feasibility and effectivity of mobile learning based Worked Example in Electric Circuits (WEIEC) application in improving the high school students' electric circuits interpretation ability on Direct Current Circuits materials. The research method used was a combination of Four-D Models and ADDIE model. The…
Jinghao Li; John F. Hunt; Shaoqin Gong; Zhiyong Cai
2014-01-01
As the demand for sustainable materials increases, there are unique challenges and opportunities to develop light-weight green composites materials for a wide range of applications. Thus wood-based composite materials from renewable forests may provide options for some niche applications while helping to protect our environment. In this paper, the wood-based tri-axial...
NASA Astrophysics Data System (ADS)
Coudarcher, Rémi; Duculty, Florent; Serot, Jocelyn; Jurie, Frédéric; Derutin, Jean-Pierre; Dhome, Michel
2005-12-01
SKiPPER is a SKeleton-based Parallel Programming EnviRonment that has been under development since 1996 at the LASMEA laboratory of Blaise Pascal University, France. The main goal of the project was to demonstrate the applicability of skeleton-based parallel programming techniques to the fast prototyping of reactive vision applications. This paper deals with the special features embedded in the latest version of the project: algorithmic skeleton nesting capabilities and a fully dynamic operating model. Through the case study of a complete and realistic image processing application, in which we have pointed out the requirement for skeleton nesting, we present the operating model of this feature. The work described here is one of the few reported experiments showing the application of skeleton nesting facilities to the parallelisation of a realistic application, especially in the area of image processing. The image processing application we have chosen is an appearance-based 3D face-tracking algorithm.
[Study on the automatic parameters identification of water pipe network model].
Jia, Hai-Feng; Zhao, Qi-Feng
2010-01-01
Based on an analysis of problems in the development and application of water pipe network models, automatic identification of model parameters is identified as a key bottleneck for applying such models in water supply enterprises. A methodology for automatic identification of water pipe network model parameters based on GIS and SCADA databases is proposed. The kernel algorithms are then studied: RSA (Regionalized Sensitivity Analysis) is used for automatic recognition of sensitive parameters, and MCS (Monte Carlo Sampling) is used for automatic identification of parameters; a detailed technical route based on RSA and MCS is presented. A module for automatic identification of water pipe network model parameters was developed. Finally, a typical water pipe network was selected as a case, a case study on automatic parameter identification was conducted, and satisfactory results were achieved.
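A minimal Python sketch of the RSA-plus-MCS combination follows, with a one-equation toy "hydraulic model" standing in for a real pipe-network solver; thresholds and coefficients are assumptions. Candidate parameter sets are sampled, split into behavioural and non-behavioural groups by their fit to an observation, and per-parameter sensitivity is scored with a Kolmogorov-Smirnov distance between the two groups.

```python
# Monte Carlo sampling + regionalized sensitivity analysis (RSA) sketch.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(3)

def pipe_model(c):
    """Toy stand-in for a hydraulic solver: head from 3 roughness parameters."""
    return 50 - 0.8 * c[0] - 0.1 * c[1] - 0.05 * c[2] + rng.normal(scale=0.2)

true_c = np.array([10.0, 20.0, 30.0])
obs = pipe_model(true_c)

samples = rng.uniform(5, 35, size=(5000, 3))        # Monte Carlo sampling
errors = np.array([abs(pipe_model(c) - obs) for c in samples])
behavioural = errors < 1.0                          # acceptance threshold

for i in range(3):                                  # RSA: compare the groups
    stat, _ = ks_2samp(samples[behavioural, i], samples[~behavioural, i])
    print(f"parameter {i}: KS distance = {stat:.3f}")
print("behavioural mean:", samples[behavioural].mean(axis=0))
```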
Computational neuroanatomy: ontology-based representation of neural components and connectivity
Rubin, Daniel L; Talos, Ion-Florin; Halle, Michael; Musen, Mark A; Kikinis, Ron
2009-01-01
Background A critical challenge in neuroscience is organizing, managing, and accessing the explosion in neuroscientific knowledge, particularly anatomic knowledge. We believe that explicit knowledge-based approaches to make neuroscientific knowledge computationally accessible will be helpful in tackling this challenge and will enable a variety of applications exploiting this knowledge, such as surgical planning. Results We developed ontology-based models of neuroanatomy to enable symbolic lookup, logical inference and mathematical modeling of neural systems. We built a prototype model of the motor system that integrates descriptive anatomic and qualitative functional neuroanatomical knowledge. In addition to modeling normal neuroanatomy, our approach provides an explicit representation of abnormal neural connectivity in disease states, such as common movement disorders. The ontology-based representation encodes both structural and functional aspects of neuroanatomy. The ontology-based models can be evaluated computationally, enabling development of automated computer reasoning applications. Conclusion Neuroanatomical knowledge can be represented in machine-accessible format using ontologies. Computational neuroanatomical approaches such as described in this work could become a key tool in translational informatics, leading to decision support applications that inform and guide surgical planning and personalized care for neurological disease in the future. PMID:19208191
Integration of Dynamic Models in Range Operations
NASA Technical Reports Server (NTRS)
Bardina, Jorge; Thirumalainambi, Rajkumar
2004-01-01
This work addresses the interaction of various models in real time to create an efficient internet-based decision-making tool for Shuttle launch. The decision-making tool depends on the launch commit criteria coupled with physical models. Dynamic interaction between a wide variety of simulation applications and techniques, embedded algorithms, and data visualizations is needed to exploit the full potential of modeling and simulation. This paper also discusses in depth the details of web-based 3-D graphics and their application to range safety. The advantages of this dynamic model integration are secure accessibility and distribution of real-time information to other NASA centers.
Reuse: A knowledge-based approach
NASA Technical Reports Server (NTRS)
Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui
1992-01-01
This paper describes our research in automating the reuse process through the use of application domain models. Application domain models are explicit formal representations of the application knowledge necessary to understand, specify, and generate application programs. Furthermore, they provide a unified repository for the operational structure, rules, policies, and constraints of a specific application area. In our approach, domain models are expressed in terms of a transaction-based meta-modeling language. This paper has described in detail the creation and maintenance of hierarchical structures. These structures are created through a process that includes reverse engineering of data models with supplementary enhancement from application experts. Source code is also reverse engineered but is not a major source of domain model instantiation at this time. In the second phase of the software synthesis process, program specifications are interactively synthesized from an instantiated domain model. These specifications are currently integrated into a manual programming process but will eventually be used to derive executable code with mechanically assisted transformations. This research is performed within the context of programming-in-the-large types of systems. Although our goals are ambitious, we are implementing the synthesis system in an incremental manner through which we can realize tangible results. The client/server architecture is capable of supporting 16 simultaneous X/Motif users and tens of thousands of attributes and classes. Domain models have been partially synthesized from five different application areas. As additional domain models are synthesized and additional knowledge is gathered, we will inevitably add to and modify our representation. However, our current experience indicates that it will scale and expand to meet our modeling needs.
An Object-Based Requirements Modeling Method.
ERIC Educational Resources Information Center
Cordes, David W.; Carver, Doris L.
1992-01-01
Discusses system modeling and specification as it relates to object-based information systems development and software development. An automated system model based on the objects in the initial requirements document is described, the requirements document translator is explained, and a sample application of the technique is provided. (12…
A Framework for the Specification of the Semantics and the Dynamics of Instructional Applications
ERIC Educational Resources Information Center
Buendia-Garcia, Felix; Diaz, Paloma
2003-01-01
An instructional application consists of a set of resources and activities to implement interacting, interrelated, and structured experiences oriented towards achieving specific educational objectives. The development of computer-based instructional applications has to follow a well defined process, so models for computer-based instructional…
10 CFR 503.34 - Inability to comply with applicable environmental requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... environmental compliance of the facility, including an analysis of its ability to meet applicable standards and... will be based solely on an analysis of the petitioner's capacity to physically achieve applicable... exemption. All such analysis must be based on accepted analytical techniques, such as air quality modeling...
10 CFR 503.34 - Inability to comply with applicable environmental requirements.
Code of Federal Regulations, 2013 CFR
2013-01-01
... environmental compliance of the facility, including an analysis of its ability to meet applicable standards and... will be based solely on an analysis of the petitioner's capacity to physically achieve applicable... exemption. All such analysis must be based on accepted analytical techniques, such as air quality modeling...
NASA Astrophysics Data System (ADS)
Havens, Scott; Marks, Danny; Kormos, Patrick; Hedrick, Andrew
2017-12-01
In the Western US and many mountainous regions of the world, critical water resources and climate conditions are difficult to monitor because the observation network is generally very sparse. The critical resource from the mountain snowpack is water flowing into streams and reservoirs that will provide for irrigation, flood control, power generation, and ecosystem services. Water supply forecasting in a rapidly changing climate has become increasingly difficult because of non-stationary conditions. In response, operational water supply managers have begun to move from statistical techniques towards the use of physically based models. As we begin to transition physically based models from research to operational use, we must address the most difficult and time-consuming aspect of model initiation: the need for robust methods to develop and distribute the input forcing data. In this paper, we present a new open source framework, the Spatial Modeling for Resources Framework (SMRF), which automates and simplifies the common forcing data distribution methods. It is computationally efficient and can be implemented for both research and operational applications. We present an example of how SMRF is able to generate all of the forcing data required to run a physically based snow model at 50-100 m resolution over regions of 1000-7000 km2. The approach has been successfully applied in real time and historical applications for both the Boise River Basin in Idaho, USA and the Tuolumne River Basin in California, USA. These applications use meteorological station measurements and numerical weather prediction model outputs as input. SMRF has significantly streamlined the modeling workflow, decreased model set up time from weeks to days, and made near real-time application of a physically based snow model possible.
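One of the distribution methods such a framework automates can be sketched briefly: inverse-distance-weighted interpolation of station measurements onto a model grid. The Python example below uses invented station locations and temperatures and is not SMRF code.

```python
# Inverse-distance-weighted (IDW) interpolation of station air temperature
# onto a regular grid, a common forcing-data distribution step.
import numpy as np

stations = np.array([[2.0, 3.0], [8.0, 1.0], [5.0, 9.0]])  # x, y (km)
temps = np.array([4.2, 6.8, 1.5])                           # deg C

gx, gy = np.meshgrid(np.linspace(0, 10, 100), np.linspace(0, 10, 100))
grid = np.zeros_like(gx)
wsum = np.zeros_like(gx)
for (sx, sy), t in zip(stations, temps):
    d2 = (gx - sx) ** 2 + (gy - sy) ** 2 + 1e-6   # avoid divide-by-zero
    w = 1.0 / d2                                   # IDW weights (power = 2)
    grid += w * t
    wsum += w
grid /= wsum
print("grid mean temperature:", grid.mean())
```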
Mozumdar, Mohammad; Song, Zhen Yu; Lavagno, Luciano; Sangiovanni-Vincentelli, Alberto L.
2014-01-01
The Model Based Design (MBD) approach is a popular trend to speed up application development of embedded systems; it uses high-level abstractions to capture functional requirements in an executable manner and automates implementation code generation. Wireless Sensor Networks (WSNs) are an emerging and very promising application area for embedded systems. However, there is a lack of tools in this area that would allow an application developer to model a WSN application using high-level abstractions, simulate it mapped to a multi-node scenario for functional analysis, and finally use the refined model to automatically generate code for different WSN platforms. Motivated by this idea, in this paper we present a hybrid simulation framework that not only follows the MBD approach for WSN application development, but also interconnects a simulated sub-network with a physical sub-network and then allows one to co-simulate them, which is also known as Hardware-In-the-Loop (HIL) simulation. PMID:24960083
Overview of Computer-Based Models Applicable to Freight Car Utilization
DOT National Transportation Integrated Search
1977-10-01
This report documents a study performed to identify and analyze twenty-two of the important computer-based models of railroad operations. The models are divided into three categories: network simulations, yard simulations, and network optimizations. ...
Application for managing model-based material properties for simulation-based engineering
Hoffman, Edward L [Alameda, CA
2009-03-03
An application for generating a property set associated with a constitutive model of a material includes a first program module adapted to receive test data associated with the material and to extract loading conditions from the test data. A material model driver is adapted to receive the loading conditions and a property set and operable in response to the loading conditions and the property set to generate a model response for the material. A numerical optimization module is adapted to receive the test data and the model response and operable in response to the test data and the model response to generate the property set.
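The calibration loop described in this abstract can be sketched as follows; the bilinear hardening law and all values are placeholders, not the application's actual constitutive models. A "driver" produces the model response for a candidate property set, and a numerical optimizer adjusts the properties until the response matches the test data.

```python
# Sketch of property-set calibration: model driver + numerical optimization.
import numpy as np
from scipy.optimize import least_squares

strain = np.linspace(0, 0.05, 30)                 # loading conditions

def driver(props, strain):
    """Toy constitutive model: bilinear elastic-plastic stress response."""
    E, sigma_y, H = props
    stress = E * strain
    yielded = stress > sigma_y
    stress[yielded] = sigma_y + H * (strain[yielded] - sigma_y / E)
    return stress

true_props = np.array([200e3, 250.0, 5e3])        # MPa-based units, invented
test_data = driver(true_props, strain) + np.random.default_rng(6).normal(0, 2, 30)

fit = least_squares(lambda p: driver(p, strain) - test_data,
                    x0=[150e3, 200.0, 1e3])       # initial property guess
print("recovered property set:", fit.x)
```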
A cavitation model based on Eulerian stochastic fields
NASA Astrophysics Data System (ADS)
Magagnato, F.; Dumond, J.
2013-12-01
Non-linear phenomena can often be described using probability density functions (pdf) and pdf transport models. Traditionally the simulation of pdf transport requires Monte-Carlo codes based on Lagrangian "particles" or prescribed pdf assumptions including binning techniques. Recently, in the field of combustion, a novel formulation called the stochastic-field method solving pdf transport based on Eulerian fields has been proposed which eliminates the necessity to mix Eulerian and Lagrangian techniques or prescribed pdf assumptions. In the present work, for the first time the stochastic-field method is applied to multi-phase flow and in particular to cavitating flow. To validate the proposed stochastic-field cavitation model, two applications are considered. Firstly, sheet cavitation is simulated in a Venturi-type nozzle. The second application is an innovative fluidic diode which exhibits coolant flashing. Agreement with experimental results is obtained for both applications with a fixed set of model constants. The stochastic-field cavitation model captures the wide range of pdf shapes present at different locations.
Different Manhattan project: automatic statistical model generation
NASA Astrophysics Data System (ADS)
Yap, Chee Keng; Biermann, Henning; Hertzmann, Aaron; Li, Chen; Meyer, Jon; Pao, Hsing-Kuo; Paxia, Salvatore
2002-03-01
We address the automatic generation of large geometric models. This is important in visualization for several reasons. First, many applications need access to large but interesting data models. Second, we often need such data sets with particular characteristics (e.g., urban models, park and recreation landscape). Thus we need the ability to generate models with different parameters. We propose a new approach for generating such models. It is based on a top-down propagation of statistical parameters. We illustrate the method in the generation of a statistical model of Manhattan. But the method is generally applicable in the generation of models of large geographical regions. Our work is related to the literature on generating complex natural scenes (smoke, forests, etc.) based on procedural descriptions. The difference in our approach stems from three characteristics: modeling with statistical parameters, integration of ground truth (actual map data), and a library-based approach for texture mapping.
Semantic Description of Educational Adaptive Hypermedia Based on a Conceptual Model
ERIC Educational Resources Information Center
Papasalouros, Andreas; Retalis, Symeon; Papaspyrou, Nikolaos
2004-01-01
The role of conceptual modeling in Educational Adaptive Hypermedia Applications (EAHA) is especially important. A conceptual model of an educational application depicts the instructional solution that is implemented, containing information about concepts that must be ac-quired by learners, tasks in which learners must be involved and resources…
76 FR 53137 - Bundled Payments for Care Improvement Initiative: Request for Applications
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-25
... (RFA) will test episode-based payment for acute care and associated post-acute care, using both retrospective and prospective bundled payment methods. The RFA requests applications to test models centered around acute care; these models will inform the design of future models, including care improvement for...
NASA Astrophysics Data System (ADS)
Havens, S.; Marks, D. G.; Kormos, P.; Hedrick, A. R.; Johnson, M.; Robertson, M.; Sandusky, M.
2017-12-01
In the Western US, operational water supply managers rely on statistical techniques to forecast the volume of water left to enter the reservoirs. As the climate changes and the demand increases for stored water utilized for irrigation, flood control, power generation, and ecosystem services, water managers have begun to move from statistical techniques towards using physically based models. To assist with the transition, a new open source framework was developed, the Spatial Modeling for Resources Framework (SMRF), to automate and simplify the most common forcing data distribution methods. SMRF is computationally efficient and can be implemented for both research and operational applications. Currently, SMRF is able to generate all of the forcing data required to run physically based snow or hydrologic models at 50-100 m resolution over regions of 500-10,000 km2, and has been successfully applied in real time and historical applications for the Boise River Basin in Idaho, USA, the Tuolumne River Basin and San Joaquin in California, USA, and Reynolds Creek Experimental Watershed in Idaho, USA. These applications use meteorological station measurements and numerical weather prediction model outputs as input data. SMRF has significantly streamlined the modeling workflow, decreased model set up time from weeks to days, and made near real-time application of physics-based snow and hydrologic models possible.
The MVP Model: Overview and Application
ERIC Educational Resources Information Center
Keller, John M.
2017-01-01
This chapter contains an overview of the MVP model that is used as a basis for the other chapters in this issue. It also contains a description of key steps in the ARCS-V design process that is derived from the MVP model and a summary of a design-based research study illustrating the application of the ARCS-V model.
2010-10-07
At a time when evidence-based practice is the predominant nursing model, the authors of this book want to interest academics and practitioners in models that were in vogue in the UK in the 1980s and 1990s.
2010-09-22
The authors set themselves the interesting challenge of reviving the interest of academics and practitioners in nursing models. Such models were in vogue in the UK in the 1980s and 1990s, at a time dominated by the evidence-based practice movement.
A demonstrative model of a lunar base simulation on a personal computer
NASA Technical Reports Server (NTRS)
1985-01-01
The initial demonstration model of a lunar base simulation is described. This initial model was developed at the personal computer level to demonstrate feasibility and technique before proceeding to a larger computer-based model. Lotus Symphony Version 1.1 software was used to build the demonstration model on a personal computer with an MS-DOS operating system. The personal computer-based model determined the applicability of lunar base modeling techniques developed at an LSPI/NASA workshop. In addition, the personal computer-based demonstration model defined a modeling structure that could be employed on a larger, more comprehensive VAX-based lunar base simulation. Refinement of this personal computer model and the development of a VAX-based model are planned in the near future.
A connectionist model for dynamic control
NASA Technical Reports Server (NTRS)
Whitfield, Kevin C.; Goodall, Sharon M.; Reggia, James A.
1989-01-01
The application of a connectionist modeling method known as competition-based spreading activation to a camera tracking task is described. The potential is explored for automation of control and planning applications using connectionist technology. The emphasis is on applications suitable for use in the NASA Space Station and in related space activities. The results are quite general and could be applicable to control systems in general.
Integration of a three-dimensional process-based hydrological model into the Object Modeling System
USDA-ARS?s Scientific Manuscript database
The integration of a spatial process model into an environmental modelling framework can enhance the model’s capabilities. We present the integration of the GEOtop model into the Object Modeling System (OMS) version 3.0 and illustrate its application in a small watershed. GEOtop is a physically base...
Ward-Garrison, Christian; Markstrom, Steven L.; Hay, Lauren E.
2009-01-01
The U.S. Geological Survey Downsizer is a computer application that selects, downloads, verifies, and formats station-based time-series data for environmental-resource models, particularly the Precipitation-Runoff Modeling System. Downsizer implements the client-server software architecture. The client presents a map-based, graphical user interface that is intuitive to modelers; the server provides streamflow and climate time-series data from over 40,000 measurement stations across the United States. This report is the Downsizer user's manual and provides (1) an overview of the software design, (2) installation instructions, (3) a description of the graphical user interface, (4) a description of selected output files, and (5) troubleshooting information.
A method of designing smartphone interface based on the extended user's mental model
NASA Astrophysics Data System (ADS)
Zhao, Wei; Li, Fengmin; Bian, Jiali; Pan, Juchen; Song, Song
2017-01-01
The user's mental model is the core guiding theory of product design, especially for practical products. The essence of a practical product is a tool that users employ to meet their needs, and the most important feature of a tool is usability. The design method based on the user's mental model provides a series of practical and feasible theoretical guidelines for improving the usability of a product according to the user's awareness of things. In this paper, we propose a method of designing smartphone interfaces based on an extended user's mental model derived from further research on user groups. This approach achieves personalized customization of smartphone application interfaces and enhances the efficiency of application use.
Illumination modelling of a mobile device environment for effective use in driving mobile apps
NASA Astrophysics Data System (ADS)
Marhoubi, Asmaa H.; Saravi, Sara; Edirisinghe, Eran A.; Bez, Helmut E.
2015-05-01
The present generation of Ambient Light Sensors (ALSs) on mobile handheld devices suffers from two practical shortcomings: the sensors respond effectively only within a narrow angle of operation, and there is a latency of operation. As a result, mobile applications that operate based on ALS readings can perform sub-optimally, especially in environments with non-uniform illumination; the applications either adapt with unacceptable levels of latency or/and demonstrate a discrete nature of operation. In this paper we propose a framework to predict the ambient illumination of an environment in which a mobile device is present. The predictions are based on an illumination model developed from a small number of readings taken during an application calibration stage. We use a machine-learning-based approach in developing the models. Five different regression models were developed, implemented and compared, based on Polynomial, Gaussian, Sum of Sine, Fourier and Smoothing Spline functions. Approaches to remove noisy data, missing values and outliers were applied prior to the modelling stage to remove their negative effects on modelling. The prediction accuracy of all models was found to be above 0.99 when measured using the R-squared test, with the best performance coming from the Smoothing Spline. In this paper we will discuss the mathematical complexity of each model and investigate how to make compromises in finding the best model.
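The calibration-stage modelling can be illustrated with the simplest of the five families, a polynomial regression; the angle/lux readings below are invented, and a real application would compare several model families as the paper does.

```python
# Fit a polynomial to a few (device angle, ambient lux) calibration readings
# and predict illumination at unmeasured angles.
import numpy as np

angles = np.array([0, 15, 30, 45, 60, 75, 90])        # degrees
lux = np.array([820, 790, 700, 540, 360, 210, 120])   # invented ALS readings

model = np.poly1d(np.polyfit(angles, lux, deg=3))     # cubic regression

query = 50.0
print(f"predicted lux at {query} deg: {model(query):.0f}")
ss_res = np.sum((lux - model(angles)) ** 2)
ss_tot = np.sum((lux - lux.mean()) ** 2)
print("R-squared:", 1 - ss_res / ss_tot)              # goodness of fit
```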
LINKING THE CMAQ AND HYSPLIT MODELING SYSTEM INTERFACE PROGRAM AND EXAMPLE APPLICATION
A new software tool has been developed to link the Eulerian-based Community Multiscale Air Quality (CMAQ) modeling system with the Lagrangian-based HYSPLIT (HYbrid Single-Particle Lagrangian Integrated Trajectory) model. Both models require many of the same hourly meteorological...
Application of LogitBoost Classifier for Traceability Using SNP Chip Data
Kang, Hyunsung; Cho, Seoae; Kim, Heebal; Seo, Kang-Seok
2015-01-01
Consumer attention to food safety has increased rapidly due to animal-related diseases; therefore, it is important to identify their places of origin (POO) for safety purposes. However, only a few studies have addressed this issue and focused on machine learning-based approaches. In the present study, classification analyses were performed using a customized SNP chip for POO prediction. To accomplish this, 4,122 pigs originating from 104 farms were genotyped using the SNP chip. Several factors were considered to establish the best prediction model based on these data. We also assessed the applicability of the suggested model using a kinship coefficient-filtering approach. Our results showed that the LogitBoost-based prediction model outperformed other classifiers in terms of classification performance under most conditions. Specifically, a greater level of accuracy was observed when a higher kinship-based cutoff was employed. These results demonstrated the applicability of a machine learning-based approach using SNP chip data for practical traceability. PMID:26436917
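Since common libraries such as scikit-learn expose gradient boosting but not LogitBoost directly, the following Python sketch writes out the binary LogitBoost loop (Newton weights, working response, regression stumps) on a synthetic stand-in for 0/1/2-coded SNP genotypes; the data and labels are invented.

```python
# Binary LogitBoost (Friedman et al. 2000) with regression stumps.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(4)
X = rng.integers(0, 3, size=(300, 40)).astype(float)   # mock SNP matrix
y = (X[:, 0] + X[:, 1] > 2).astype(float)              # mock origin label

F = np.zeros(len(y))                                   # additive model F(x)
stumps = []                                            # kept to score new data
for _ in range(50):
    p = 1.0 / (1.0 + np.exp(-2.0 * F))                 # current probabilities
    w = np.clip(p * (1 - p), 1e-6, None)               # Newton weights
    z = (y - p) / w                                    # working response
    stump = DecisionTreeRegressor(max_depth=1)
    stump.fit(X, z, sample_weight=w)                   # weighted least squares
    F += 0.5 * stump.predict(X)
    stumps.append(stump)

pred = (F > 0).astype(float)
print("training accuracy:", (pred == y).mean())
```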
Image analysis and modeling in medical image computing. Recent developments and advances.
Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T
2012-01-01
Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body. Hence, model-based image computing methods are important tools to improve medical diagnostics and patient treatment in future.
SU-E-T-366: Clinical Implementation of MR-Guided Vaginal Cylinder Brachytherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owrangi, A; Jolly, S; Balter, J
2014-06-01
Purpose: To evaluate the accuracy of MR-based vaginal brachytherapy source localization using an in-house MR-visible marker versus the alignment of an applicator model to MR images. Methods: Three consecutive patients undergoing vaginal HDR brachytherapy with a plastic cylinder were scanned with both CT and MRI (including T1- and T2-weighted images). An MR-visible source localization marker, consisting of a sealed thin catheter filled with either water (for T2 contrast) or Gd-doped water (for T1 contrast), was assembled shortly before scanning. Clinically, the applicator channel was digitized on CT with an x-ray marker. To evaluate the efficacy of MR-based applicator reconstruction, each MR image volume was aligned locally to the CT images based on the region containing the cylinder. Applicator digitization was performed on the MR images using (1) the MR-visible marker and (2) alignment of an applicator surface model from Varian's Brachytherapy Planning software to the MRI images. Resulting source positions were compared with the original CT digitization. Results: Although the source path was visualized by the MR marker, the applicator tip proved difficult to identify due to challenges in achieving a watertight seal. This resulted in observed displacements of the catheter tip, at times >1 cm. Deviations between the central source positions identified via aligning the applicator surface model to MR and using the x-ray marker on CT ranged from 0.07-0.19 cm and 0.07-0.20 cm on T1-weighted and T2-weighted images, respectively. Conclusion: Based on the current study, aligning the applicator model to MRI provides a practical approach to perform MR-based brachytherapy planning. Further study is needed to produce catheters with reliably and reproducibly identifiable tips. Attempts are being made to improve catheter seals, as well as to increase the viscosity of the contrast material to decrease fluid mobility inside the catheter.
Common IED exploitation target set ontology
NASA Astrophysics Data System (ADS)
Russomanno, David J.; Qualls, Joseph; Wowczuk, Zenovy; Franken, Paul; Robinson, William
2010-04-01
The Common IED Exploitation Target Set (CIEDETS) ontology provides a comprehensive semantic data model for capturing knowledge about sensors, platforms, missions, environments, and other aspects of systems under test. The ontology also includes representative IEDs; modeled as explosives, camouflage, concealment objects, and other background objects, which comprise an overall threat scene. The ontology is represented using the Web Ontology Language and the SPARQL Protocol and RDF Query Language, which ensures portability of the acquired knowledge base across applications. The resulting knowledge base is a component of the CIEDETS application, which is intended to support the end user sensor test and evaluation community. CIEDETS associates a system under test to a subset of cataloged threats based on the probability that the system will detect the threat. The associations between systems under test, threats, and the detection probabilities are established based on a hybrid reasoning strategy, which applies a combination of heuristics and simplified modeling techniques. Besides supporting the CIEDETS application, which is focused on efficient and consistent system testing, the ontology can be leveraged in a myriad of other applications, including serving as a knowledge source for mission planning tools.
Applicability of common stomatal conductance models in maize under varying soil moisture conditions.
Wang, Qiuling; He, Qijin; Zhou, Guangsheng
2018-07-01
In the context of climate warming, the varying soil moisture caused by changing precipitation patterns will affect the applicability of stomatal conductance models, thereby affecting the simulation accuracy of carbon-nitrogen-water cycles in ecosystems. We studied the applicability of four common stomatal conductance models, the Jarvis, Ball-Woodrow-Berry (BWB), Ball-Berry-Leuning (BBL) and unified stomatal optimization (USO) models, based on summer maize leaf gas exchange data from a soil moisture consecutive-decrease manipulation experiment. The results showed that under varying soil moisture conditions the USO model performed best, followed by the BBL and BWB models, while the Jarvis model performed worst. The effects of soil moisture altered the relative performance of the models. Introducing a water response function improved the performance of the Jarvis, BWB and USO models, decreasing the normalized root mean square error (NRMSE) by 15.7%, 16.6% and 3.9%, respectively; however, its effect on the BBL model was negative, increasing the NRMSE by 5.3%. Based on the 95% confidence limits, the Jarvis, BWB, BBL and USO models were applicable within different ranges of soil relative water content (55%-65%, 56%-67%, 37%-79% and 37%-95%, respectively). Moreover, introducing a water response function improved the applicability of the Jarvis and BWB models. The USO model performed best with or without the water response function and was applicable under varying soil moisture conditions. Our results provide a basis for selecting appropriate stomatal conductance models under drought conditions. Copyright © 2018 Elsevier B.V. All rights reserved.
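Two of the four formulations compared in the study can be written down compactly; the coefficient values in the Python sketch below are illustrative, not the fitted maize values.

```python
# Ball-Woodrow-Berry (BWB) and unified stomatal optimization (USO, Medlyn
# et al. form) stomatal conductance models with illustrative coefficients.
import numpy as np

def gs_bwb(A, rh, cs, g0=0.01, g1=9.0):
    """BWB: gs = g0 + g1 * A * RH / Cs."""
    return g0 + g1 * A * rh / cs

def gs_uso(A, vpd, cs, g0=0.01, g1=3.0):
    """USO: gs = g0 + 1.6 * (1 + g1 / sqrt(D)) * A / Cs."""
    return g0 + 1.6 * (1 + g1 / np.sqrt(vpd)) * A / cs

A, rh, vpd, cs = 15.0, 0.65, 1.2, 400.0   # umol m-2 s-1, -, kPa, umol mol-1
print("BWB gs:", gs_bwb(A, rh, cs))
print("USO gs:", gs_uso(A, vpd, cs))
```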
Systems modeling and simulation applications for critical care medicine
2012-01-01
Critical care delivery is a complex, expensive, error prone, medical specialty and remains the focal point of major improvement efforts in healthcare delivery. Various modeling and simulation techniques offer unique opportunities to better understand the interactions between clinical physiology and care delivery. The novel insights gained from the systems perspective can then be used to develop and test new treatment strategies and make critical care delivery more efficient and effective. However, modeling and simulation applications in critical care remain underutilized. This article provides an overview of major computer-based simulation techniques as applied to critical care medicine. We provide three application examples of different simulation techniques, including a) pathophysiological model of acute lung injury, b) process modeling of critical care delivery, and c) an agent-based model to study interaction between pathophysiology and healthcare delivery. Finally, we identify certain challenges to, and opportunities for, future research in the area. PMID:22703718
Statistical Analysis of Q-matrix Based Diagnostic Classification Models
Chen, Yunxiao; Liu, Jingchen; Xu, Gongjun; Ying, Zhiliang
2014-01-01
Diagnostic classification models have recently gained prominence in educational assessment, psychiatric evaluation, and many other disciplines. Central to the model specification is the so-called Q-matrix that provides a qualitative specification of the item-attribute relationship. In this paper, we develop theories on the identifiability for the Q-matrix under the DINA and the DINO models. We further propose an estimation procedure for the Q-matrix through the regularized maximum likelihood. The applicability of this procedure is not limited to the DINA or the DINO model and it can be applied to essentially all Q-matrix based diagnostic classification models. Simulation studies are conducted to illustrate its performance. Furthermore, two case studies are presented. The first case is a data set on fraction subtraction (educational application) and the second case is a subsample of the National Epidemiological Survey on Alcohol and Related Conditions concerning the social anxiety disorder (psychiatric application). PMID:26294801
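The DINA model's core mechanics are simple enough to show directly: the Q-matrix lists which attributes each item requires, the ideal response indicates full mastery of those attributes, and slip/guess parameters turn that into a response probability. The numbers below are illustrative.

```python
# DINA model sketch: Q-matrix, ideal response eta, and P(correct).
import numpy as np

Q = np.array([[1, 0, 1],            # item 1 requires attributes 1 and 3
              [0, 1, 0],            # item 2 requires attribute 2
              [1, 1, 0]])           # item 3 requires attributes 1 and 2
alpha = np.array([1, 0, 1])         # examinee attribute profile

eta = np.all(alpha >= Q, axis=1).astype(int)   # 1 iff all required mastered
s, g = 0.1, 0.2                                # slip and guess parameters
p_correct = (1 - s) ** eta * g ** (1 - eta)    # DINA response probability
print("eta:", eta, "P(correct):", p_correct)
```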
Aerospace Applications of Weibull and Monte Carlo Simulation with Importance Sampling
NASA Technical Reports Server (NTRS)
Bavuso, Salvatore J.
1998-01-01
Recent developments in reliability modeling and computer technology have made it practical to use the Weibull time to failure distribution to model the system reliability of complex fault-tolerant computer-based systems. These system models are becoming increasingly popular in space systems applications as a result of mounting data that support the decreasing Weibull failure distribution and the expectation of increased system reliability. This presentation introduces the new reliability modeling developments and demonstrates their application to a novel space system application. The application is a proposed guidance, navigation, and control (GN&C) system for use in a long duration manned spacecraft for a possible Mars mission. Comparisons to the constant failure rate model are presented and the ramifications of doing so are discussed.
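The two ingredients named in the title combine naturally: a Weibull time-to-failure law (here with shape parameter below one, matching the decreasing failure rate mentioned) and Monte Carlo estimation of a small failure probability, made efficient with a simple importance-sampling tilt. All numbers in this Python sketch are invented.

```python
# Weibull reliability with Monte Carlo importance sampling.
import numpy as np

rng = np.random.default_rng(5)
beta, eta = 0.8, 1e5        # shape < 1 (decreasing hazard), scale in hours
T = 500.0                   # mission time of interest

exact = 1 - np.exp(-(T / eta) ** beta)          # P(failure before T)

# Proposal: same shape, much smaller scale, so early failures are common;
# reweight each sample by the density ratio to keep the estimate unbiased.
eta_q, n = 2e3, 100_000
x = eta_q * rng.weibull(beta, size=n)

def pdf(t, b, e):
    """Weibull density with shape b and scale e."""
    return (b / e) * (t / e) ** (b - 1) * np.exp(-(t / e) ** b)

w = pdf(x, beta, eta) / pdf(x, beta, eta_q)     # importance weights
est = np.mean((x < T) * w)
print(f"exact = {exact:.5f}, IS estimate = {est:.5f}")
```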
Usability evaluation of mobile applications; where do we stand?
NASA Astrophysics Data System (ADS)
Zahra, Fatima; Hussain, Azham; Mohd, Haslina
2017-10-01
The range and availability of mobile applications are expanding rapidly. With the increased processing power available on portable devices, developers are embracing smartphones in their extensive and diverse practices and expanding the range of services offered. However, usability testing and evaluation of mobile applications have not yet reached the level of accuracy achieved for web-based applications, and existing usability models do not adequately capture the complexities of interacting with applications on a mobile platform. This study therefore presents a review of existing usability models for mobile applications. These models are in their infancy, but with time and further research they may eventually be adopted. Moreover, different categories of mobile apps (medical, entertainment, education) possess different functional and non-functional requirements, so customized models are required for diverse mobile applications.
Application of simulation models for the optimization of business processes
NASA Astrophysics Data System (ADS)
Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří
2016-06-01
The paper deals with the application of modeling and simulation tools to the optimization of business processes, in particular to the optimization of signal flow in a security company. Simul8 was selected as the modeling tool; it supports process modeling based on discrete-event simulation and enables the creation of visual models of production and distribution processes.
Transdiagnostic Theory and Application of Family-Based Treatment for Youth with Eating Disorders
ERIC Educational Resources Information Center
Loeb, Katharine L.; Lock, James; Greif, Rebecca; le Grange, Daniel
2012-01-01
This paper describes the transdiagnostic theory and application of family-based treatment (FBT) for children and adolescents with eating disorders. We review the fundamentals of FBT, a transdiagnostic theoretical model of FBT and the literature supporting its clinical application, adaptations across developmental stages and the diagnostic spectrum…
NASA Astrophysics Data System (ADS)
McLaughlin, B. D.; Pawloski, A. W.
2017-12-01
NASA ESDIS has been moving a variety of data ingest, distribution, and science data processing applications into a cloud environment over the last 2 years. As expected, there have been a number of challenges in migrating primarily on-premises applications into a cloud-based environment, related to architecture and to taking advantage of cloud-based services. What was not expected was a number of issues that went beyond purely technical application re-architecture. From surprising network policy limitations, billing challenges in a government-based cost model, and obtaining certificates in a NASA-security-compliant manner, to working with multiple applications in a shared and resource-constrained AWS account, these have been the relevant challenges in taking advantage of a cloud model. And most surprising of all… well, you'll just have to wait and see the "gotcha" that caught our entire team off guard!
Collaborative development of predictive toxicology applications.
Hardy, Barry; Douglas, Nicki; Helma, Christoph; Rautenberg, Micha; Jeliazkova, Nina; Jeliazkov, Vedrin; Nikolova, Ivelina; Benigni, Romualdo; Tcheremenskaia, Olga; Kramer, Stefan; Girschick, Tobias; Buchwald, Fabian; Wicker, Joerg; Karwath, Andreas; Gütlein, Martin; Maunz, Andreas; Sarimveis, Haralambos; Melagraki, Georgia; Afantitis, Antreas; Sopasakis, Pantelis; Gallagher, David; Poroikov, Vladimir; Filimonov, Dmitry; Zakharov, Alexey; Lagunin, Alexey; Gloriozova, Tatyana; Novikov, Sergey; Skvortsova, Natalia; Druzhilovsky, Dmitry; Chawla, Sunil; Ghosh, Indira; Ray, Surajit; Patel, Hitesh; Escher, Sylvia
2010-08-31
OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way.
Generalized Ordinary Differential Equation Models.
Miao, Hongyu; Wu, Hulin; Xue, Hongqi
2014-10-01
Existing estimation methods for ordinary differential equation (ODE) models are not applicable to discrete data. The generalized ODE (GODE) model is therefore proposed and investigated for the first time. We develop the likelihood-based parameter estimation and inference methods for GODE models. We propose robust computing algorithms and rigorously investigate the asymptotic properties of the proposed estimator by considering both measurement errors and numerical errors in solving ODEs. The simulation study and application of our methods to an influenza viral dynamics study suggest that the proposed methods have a superior performance in terms of accuracy over the existing ODE model estimation approach and the extended smoothing-based (ESB) method.
Working covariance model selection for generalized estimating equations.
Carey, Vincent J; Wang, You-Gan
2011-11-20
We investigate methods for data-based selection of working covariance models in the analysis of correlated data with generalized estimating equations. We study two selection criteria: Gaussian pseudolikelihood and a geodesic distance based on discrepancy between model-sensitive and model-robust regression parameter covariance estimators. The Gaussian pseudolikelihood is found in simulation to be reasonably sensitive for several response distributions and noncanonical mean-variance relations for longitudinal data. Application is also made to a clinical dataset. Assessment of adequacy of both correlation and variance models for longitudinal data should be routine in applications, and we describe open-source software supporting this practice.
Nano-QSPR Modelling of Carbon-Based Nanomaterials Properties.
Salahinejad, Maryam
2015-01-01
Evaluation of the chemical and physical properties of nanomaterials is of critical importance in a broad variety of nanotechnology research. There is increasing interest in computational methods capable of predicting the properties of new and modified nanomaterials in the absence of time-consuming and costly experimental studies. Quantitative Structure-Property Relationship (QSPR) approaches, known in this setting as nano-QSPR, are progressive tools for modelling and predicting many physicochemical properties of nanomaterials. This review provides insight into the concepts, challenges and applications of QSPR modelling of carbon-based nanomaterials. First, we provide a general overview of QSPR implications, focusing on the difficulties and limitations at each step of QSPR modelling of nanomaterials. We then present the most significant achievements of QSPR methods in modelling the properties of carbon-based nanomaterials and their recent applications in generating predictive models. The review specifically addresses QSPR modelling of the physicochemical properties of carbon-based nanomaterials including fullerenes, single-walled carbon nanotubes (SWNT), multi-walled carbon nanotubes (MWNT) and graphene.
ERIC Educational Resources Information Center
Fulmer, Gavin W.; Liang, Ling L.
2013-01-01
This study tested a student survey to detect differences in instruction between teachers in a modeling-based science program and comparison group teachers. The Instructional Activities Survey measured teachers' frequency of modeling, inquiry, and lecture instruction. Factor analysis and Rasch modeling identified three subscales, Modeling and…
The Application of FIA-based Data to Wildlife Habitat Modeling: A Comparative Study
Thomas C. Edwards, Jr.; Gretchen G. Moisen; Tracey S. Frescino; Randall J. Schultz
2005-01-01
We evaluated the capability of two types of models, one based on spatially explicit variables derived from FIA data and one using so-called traditional habitat evaluation methods, for predicting the presence of cavity-nesting bird habitat in Fishlake National Forest, Utah. Both models performed equally well in measures of predictive accuracy, with the FIA-based model...
Reduced Order Models Based on Linear and Nonlinear Aerodynamic Impulse Responses
NASA Technical Reports Server (NTRS)
Silva, Walter A.
1999-01-01
This paper discusses a method for the identification and application of reduced-order models based on linear and nonlinear aerodynamic impulse responses. The Volterra theory of nonlinear systems and an appropriate kernel identification technique are described. Insight into the nature of kernels is provided by applying the method to the nonlinear Riccati equation in a non-aerodynamic application. The method is then applied to a nonlinear aerodynamic model of an RAE 2822 supercritical airfoil undergoing plunge motions using the CFL3D Navier-Stokes flow solver with the Spalart-Allmaras turbulence model. Results demonstrate the computational efficiency of the technique.
Markov models of genome segmentation
NASA Astrophysics Data System (ADS)
Thakur, Vivek; Azad, Rajeev K.; Ramaswamy, Ram
2007-01-01
We introduce Markov models for the segmentation of symbolic sequences, extending a segmentation procedure based on the Jensen-Shannon divergence introduced earlier. Higher-order Markov models are more sensitive to the details of local patterns and, in application to genome analysis, this makes it possible to segment a sequence at positions that are biologically meaningful. We show the advantage of higher-order Markov-model-based segmentation procedures in detecting compositional inhomogeneity in chimeric DNA sequences constructed from the genomes of diverse species; in application to the E. coli K12 genome, boundaries of genomic islands, cryptic prophages, and horizontally acquired regions are accurately identified.
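A minimal sketch of the order-0 segmentation criterion underlying this family of methods (the paper's contribution is its extension to higher-order Markov models) could look as follows; the function names and the single-cut search are illustrative only:

```python
import numpy as np
from collections import Counter

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def composition(seq, alphabet="ACGT"):
    """Symbol frequencies of a sequence over a fixed alphabet."""
    counts = Counter(seq)
    return np.array([counts[a] / len(seq) for a in alphabet])

def js_divergence(seq, i):
    """Weighted Jensen-Shannon divergence between seq[:i] and seq[i:]."""
    w1, w2 = i / len(seq), (len(seq) - i) / len(seq)
    p, q = composition(seq[:i]), composition(seq[i:])
    return entropy(w1 * p + w2 * q) - w1 * entropy(p) - w2 * entropy(q)

def best_cut(seq, margin=50):
    """Single segmentation step: the cut point maximizing the divergence."""
    scores = {i: js_divergence(seq, i) for i in range(margin, len(seq) - margin)}
    return max(scores, key=scores.get)
```

Applying `best_cut` recursively to the resulting halves, with a significance threshold as the stopping rule, yields the classic segmentation procedure that the authors extend.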
System equivalent model mixing
NASA Astrophysics Data System (ADS)
Klaassen, Steven W. B.; van der Seijs, Maarten V.; de Klerk, Dennis
2018-05-01
This paper introduces SEMM: a method based on Frequency Based Substructuring (FBS) techniques that enables the construction of hybrid dynamic models. With System Equivalent Model Mixing (SEMM), frequency-based models, of either numerical or experimental nature, can be mixed to form a hybrid model. This model follows the dynamic behaviour of a predefined weighted master model. A large variety of applications can be thought of, such as the DoF-space expansion of relatively small experimental models using numerical models, or the blending of different models in the frequency spectrum. SEMM is outlined, both mathematically and conceptually, based on a notation commonly used in FBS. A critical physical interpretation of the theory is provided next, along with a comparison to similar techniques, namely DoF expansion techniques. SEMM's concept is further illustrated by means of a numerical example. It will become apparent that the basic method of SEMM has some shortcomings which warrant a few extensions. One of the main applications is then tested in a practical case, performed on a validated benchmark structure, which emphasizes the practicality of the method.
Administrator Training and Development: Conceptual Model.
ERIC Educational Resources Information Center
Boardman, Gerald R.
A conceptual model for an individualized training program for school administrators integrates processes, characteristics, and tasks through theory training and application. Based on an application of contingency theory, it provides a system matching up administrative candidates' needs in three areas (administrative process, administrative…
Version Control in Project-Based Learning
ERIC Educational Resources Information Center
Milentijevic, Ivan; Ciric, Vladimir; Vojinovic, Oliver
2008-01-01
This paper deals with the development of a generalized model for version control systems application as a support in a range of project-based learning methods. The model is given as UML sequence diagram and described in detail. The proposed model encompasses a wide range of different project-based learning approaches by assigning a supervisory…
Update: Advancement of Contact Dynamics Modeling for Human Spaceflight Simulation Applications
NASA Technical Reports Server (NTRS)
Brain, Thomas A.; Kovel, Erik B.; MacLean, John R.; Quiocho, Leslie J.
2017-01-01
Pong is a new software tool developed at the NASA Johnson Space Center that advances interference-based geometric contact dynamics based on 3D graphics models. The Pong software consists of three parts: a set of scripts to extract geometric data from 3D graphics models, a contact dynamics engine that provides collision detection and force calculations based on the extracted geometric data, and a set of scripts for visualizing the dynamics response with the 3D graphics models. The contact dynamics engine can be linked with an external multibody dynamics engine to provide an integrated multibody contact dynamics simulation. This paper provides a detailed overview of Pong, including the overall approach and modeling capabilities, ranging from force generation with contact primitives and friction to computational performance. Two specific Pong-based examples of International Space Station applications are discussed, and the related verification and validation using this new tool are also addressed.
Systems Engineering Model and Training Application for Desktop Environment
NASA Technical Reports Server (NTRS)
May, Jeffrey T.
2010-01-01
This work provides a graphical-user-interface-based simulator for desktop training, operations and procedure development, and system reference. The simulator allows engineers to train and further understand the dynamics of their system from their local desktops, and to evaluate the system at a pace and skill level matched to their competency and from a perspective matched to their needs. It requires no special resources to execute and should generally be available for use. The interface is based on the concept of presenting the model of the system in the ways that best suit the user's application or training needs. The three levels of views are the Component View, the System View (overall system), and the Console View (monitor). These views are portals into a single model, so changing the model from one view, or from a model-manager Graphical User Interface, is reflected in all other views.
A six-parameter Iwan model and its application
NASA Astrophysics Data System (ADS)
Li, Yikun; Hao, Zhiming
2016-02-01
The Iwan model is a practical tool for describing the constitutive behavior of joints. In this paper, a six-parameter Iwan model based on a truncated power-law distribution with two Dirac delta functions is proposed, which gives a more comprehensive description of joints than previous Iwan models. Its analytical expressions, including the backbone curve, unloading curves and energy dissipation, are deduced. Parameter identification procedures and the discretization method are also provided. A model application based on Segalman et al.'s experimental work on bolted joints is carried out. Simulation effects of different numbers of Jenkins elements are discussed. The results indicate that the six-parameter Iwan model can be used to accurately reproduce the experimental phenomena of joints.
NASA Astrophysics Data System (ADS)
Aneri, Parikh; Sumathy, S.
2017-11-01
Cloud computing provides services over the internet, supplying application resources and data to users on demand. Cloud computing is based on a consumer-provider model: the provider supplies resources that consumers access in order to build applications according to their demand. A cloud data center is a pool of shared resources for cloud users to access. Virtualization is the heart of the cloud computing model; it provides virtual machines configured per application, and applications are free to choose their own configuration. On the one hand there is a huge number of resources; on the other hand, a huge number of requests must be served effectively. Therefore, the resource allocation policy and the scheduling policy play very important roles in allocating and managing resources in this cloud computing model. This paper proposes a load balancing policy based on the Hungarian algorithm. The Hungarian algorithm provides a dynamic load balancing policy complemented by a monitor component, which helps to increase cloud resource utilization by tracking the algorithm's state and adapting it over time. CloudSim, used in this proposal, is an extensible toolkit that simulates the cloud computing environment.
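As an illustration of the core assignment step (a sketch under assumed inputs, not the paper's implementation), the Hungarian algorithm can be applied to a batch of requests and virtual machines via SciPy, with the monitor component responsible for refreshing the cost estimates between scheduling rounds:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical inputs: task lengths (instructions) and VM speeds (MIPS).
task_length = np.array([4000.0, 9000.0, 2500.0, 7000.0])
vm_speed = np.array([1000.0, 2500.0, 500.0, 2000.0])

# cost[i, j] = estimated completion time of task i on VM j.
cost = task_length[:, None] / vm_speed[None, :]

# Hungarian algorithm: minimum-cost one-to-one assignment.
tasks, vms = linear_sum_assignment(cost)
for t, v in zip(tasks, vms):
    print(f"task {t} -> VM {v}, est. {cost[t, v]:.2f} s")
print(f"total estimated time: {cost[tasks, vms].sum():.2f} s")
```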
NASA Astrophysics Data System (ADS)
Othman, M. F.; Kurniawan, R.; Schramm, D.; Ariffin, A. K.
2018-05-01
Modeling a cable in a multibody dynamics simulation tool such that its length, mass and stiffness vary dynamically is a challenging task. Simulation of cable-driven parallel robots (CDPR), for instance, requires a cable model that can dynamically change in length for every desired pose of the platform. Thus, in this paper, a detailed procedure for modeling and simulating a dynamic cable model in Dymola is proposed. The approach is also applicable to other Modelica simulation environments. The cable is modeled using standard mechanical elements such as mass, spring, damper and joint elements. The parameters of the cable model are based on the manufacturer's factsheet and on experimental results. Its dynamic capability is tested by applying it to a complete planar CDPR model whose parameters are based on a prototype named CABLAR, developed at the Chair of Mechatronics, University of Duisburg-Essen. The prototype demonstrates an application of CDPR as a goods storage and retrieval machine. The performance of the cable model during the simulation is analyzed and discussed.
von Eye, Alexander; Mun, Eun Young; Bogat, G Anne
2008-03-01
This article reviews the premises of configural frequency analysis (CFA), including methods of choosing significance tests and base models, as well as protecting alpha, and discusses why CFA is a useful approach when conducting longitudinal person-oriented research. CFA operates at the manifest variable level. Longitudinal CFA seeks to identify those temporal patterns that stand out as more frequent (CFA types) or less frequent (CFA antitypes) than expected with reference to a base model. A base model that has been used frequently in CFA applications, prediction CFA, and a new base model, auto-association CFA, are discussed for analysis of cross-classifications of longitudinal data. The former base model takes the associations among predictors and among criteria into account. The latter takes the auto-associations among repeatedly observed variables into account. Application examples of each are given using data from a longitudinal study of domestic violence. It is demonstrated that CFA results are not redundant with results from log-linear modeling or multinomial regression and that, of these approaches, CFA shows particular utility when conducting person-oriented research.
Language Model Applications to Spelling with Brain-Computer Interfaces
Mora-Cortes, Anderson; Manyakov, Nikolay V.; Chumerin, Nikolay; Van Hulle, Marc M.
2014-01-01
Within the Ambient Assisted Living (AAL) community, Brain-Computer Interfaces (BCIs) have raised great hopes as they provide alternative communication means for persons with disabilities, bypassing the need for speech and other motor activities. Although significant advancements have been realized in the last decade, applications of language models (e.g., word prediction, completion) have only recently started to appear in BCI systems. The main goal of this article is to review the language model applications that supplement non-invasive BCI-based communication systems by discussing their potential and limitations, and to discern future trends. First, a brief overview of the most prominent BCI spelling systems is given, followed by an in-depth discussion of the language models applied to them. These language models are classified according to their functionality in the context of BCI-based spelling: the static/dynamic nature of the user interface, the use of error correction and predictive spelling, and the potential to improve classification performance by using language models. To conclude, the review offers an overview of the advantages and challenges of implementing language models in BCI-based communication systems, particularly in conjunction with other AAL technologies. PMID:24675760
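To make the idea concrete, the simplest such language model is a character n-gram prior that biases the speller's letter decision; the following sketch (illustrative names and smoothing, not drawn from any specific BCI system) shows how such a prior would be produced:

```python
from collections import defaultdict

def train_char_trigrams(corpus):
    """Count trigram continuations: 2-char context -> next-char counts."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(corpus) - 2):
        counts[corpus[i:i + 2]][corpus[i + 2]] += 1
    return counts

def letter_prior(counts, history, alphabet, alpha=1.0):
    """Laplace-smoothed P(next letter | last two letters typed)."""
    ctx = counts.get(history[-2:], {})
    total = sum(ctx.values()) + alpha * len(alphabet)
    return {a: (ctx.get(a, 0) + alpha) / total for a in alphabet}

counts = train_char_trigrams("the quick brown fox jumps over the lazy dog ")
prior = letter_prior(counts, "th", "abcdefghijklmnopqrstuvwxyz ")
# A speller would multiply this prior with the EEG classifier's per-letter
# evidence and select the posterior-maximizing symbol.
```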
Modelling metaldehyde in catchments: a River Thames case-study.
Lu, Q; Whitehead, P G; Bussi, G; Futter, M N; Nizzetto, L
2017-04-19
The application of metaldehyde to agricultural catchment areas to control slugs and snails has caused severe problems for drinking water supply in recent years. In the River Thames catchment, metaldehyde has been detected at levels well above the EU and UK drinking water standards of 0.1 μg/L at many sites across the catchment between 2008 and 2015. Metaldehyde is applied in autumn and winter, leading to its increased concentrations in surface waters. It is shown that a process-based hydro-biogeochemical transport model (INCA-contaminants) can be used to simulate metaldehyde transport in catchments from areas of application to the aquatic environment. Simulations indicate that high concentrations in the river system are a direct consequence of excessive application rates. A simple application control strategy for metaldehyde in the Thames catchment based on model results is presented.
DOT National Transportation Integrated Search
2018-01-01
This report explores the application of a discrete computational model for predicting the fracture behavior of asphalt mixtures at low temperatures based on the results of simple laboratory experiments. In this discrete element model, coarse aggregat...
Towards a Food Safety Knowledge Base Applicable in Crisis Situations and Beyond
Falenski, Alexander; Weiser, Armin A.; Thöns, Christian; Appel, Bernd; Käsbohrer, Annemarie; Filter, Matthias
2015-01-01
In the case of contamination in the food chain, fast action is required in order to reduce the number of affected people. In such situations, being able to predict the fate of agents in foods would help risk assessors and decision makers in assessing the potential effects of a specific contamination event and thus enable them to deduce the appropriate mitigation measures. One efficient strategy supporting this is the use of model-based simulations. However, application in crisis situations requires ready-to-use and easy-to-adapt models to be available from so-called food safety knowledge bases. Here, we illustrate this concept and its benefits by applying the modular open source software tools PMM-Lab and FoodProcess-Lab. As a fictitious sample scenario, an intentional ricin contamination at a beef salami production facility was modelled. Predictive models describing the inactivation of ricin were reviewed, relevant models were implemented with PMM-Lab, and simulations on residual toxin amounts in the final product were performed with FoodProcess-Lab. Due to the generic and modular modelling concept implemented in these tools, they can be applied to simulate virtually any food safety contamination scenario. Apart from the application in crisis situations, the food safety knowledge base concept will also be useful in food quality and safety investigations. PMID:26247028
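Predictive inactivation models of the kind collected in such knowledge bases are often log-linear in time; a generic first-order form (shown here only as an illustration of the model class, not as the specific ricin model used in the study) is:

```latex
\log_{10}\frac{N(t)}{N_0} = -\frac{t}{D_T}, \qquad
\log_{10}\frac{D_T}{D_{T_{\text{ref}}}} = \frac{T_{\text{ref}} - T}{z}
```

where $D_T$ is the decimal reduction time at temperature $T$ and $z$ is the temperature increase that reduces $D_T$ tenfold.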
GTM-Based QSAR Models and Their Applicability Domains.
Gaspar, H A; Baskin, I I; Marcou, G; Horvath, D; Varnek, A
2015-06-01
In this paper we demonstrate that Generative Topographic Mapping (GTM), a machine learning method traditionally used for data visualisation, can be efficiently applied to QSAR modelling using probability distribution functions (PDF) computed in the latent 2-dimensional space. Several different scenarios of activity assessment were considered: (i) the "activity landscape" approach based on direct use of the PDF, (ii) QSAR models built on GTM-generated descriptors derived from the PDF, and (iii) the k-Nearest Neighbours approach in the 2D latent space. Benchmarking calculations were performed on five different datasets: stability constants of metal cations Ca(2+), Gd(3+) and Lu(3+) complexes with organic ligands in water, aqueous solubility, and activity of thrombin inhibitors. It has been shown that the performance of GTM-based regression models is similar to that obtained with some popular machine-learning methods (random forest, k-NN, M5P regression tree and PLS) and ISIDA fragment descriptors. By comparing GTM activity landscapes built on predicted and on experimental activities, we may visually assess the model's performance and identify the areas in chemical space corresponding to reliable predictions. The applicability domain used in this work is based on data likelihood. Its application significantly improved the model performance for 4 out of 5 datasets.
Liu, Dong-jun; Li, Li
2015-01-01
For the issue of haze-fog, PM2.5 is the main influencing factor of haze-fog pollution in China. The trend of PM2.5 concentration was first analyzed qualitatively based on mathematical models and simulation. The comprehensive forecasting model (CFM) was then developed based on combination forecasting ideas: the Autoregressive Integrated Moving Average (ARIMA) model, Artificial Neural Networks (ANNs) and the Exponential Smoothing Method (ESM) were used to predict the time series of PM2.5 concentration, and their results were combined using weights from the Entropy Weighting Method. The trend of PM2.5 concentration in Guangzhou, China was quantitatively forecasted with the comprehensive forecasting model, the results were compared with those of the three single models, and PM2.5 concentrations for the next ten days were predicted. The comprehensive forecasting model balanced the deviations of the single prediction methods and had better applicability, opening a new prediction method for the air quality forecasting field. PMID:26110332
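A minimal sketch of the combination step is given below; it assumes the Entropy Weighting Method is applied to a matrix of historical model scores, which is one plausible reading of the abstract (the paper's exact construction may differ), and all numbers are placeholders:

```python
import numpy as np

def entropy_weights(scores):
    """Entropy Weighting Method: columns with more-divergent (less uniform)
    score profiles receive larger weights."""
    m = scores.shape[0]
    p = scores / scores.sum(axis=0, keepdims=True)
    e = -(p * np.log(p + 1e-12)).sum(axis=0) / np.log(m)  # per-column entropy
    d = 1.0 - e                                           # degree of divergence
    return d / d.sum()

# Placeholder accuracy scores of ARIMA / ANN / ESM over 30 past days.
rng = np.random.default_rng(0)
scores = rng.uniform(0.5, 1.0, size=(30, 3))
w = entropy_weights(scores)

forecasts = np.array([41.2, 38.7, 44.0])  # hypothetical next-day PM2.5, ug/m3
combined = float(w @ forecasts)           # comprehensive (CFM-style) forecast
```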
Models for Microbial Fuel Cells: A critical review
NASA Astrophysics Data System (ADS)
Xia, Chengshuo; Zhang, Daxing; Pedrycz, Witold; Zhu, Yingmin; Guo, Yongxian
2018-01-01
Microbial fuel cells (MFCs) have been widely viewed as one of the most promising alternative sources of renewable energy. Recognizing the need for efficient development methods grounded in multidisciplinary research is crucial for the optimization of MFCs. Modeling of MFCs is an effective way not only to gain a thorough understanding of the effects of operating conditions on power generation performance, but is also of essential interest to the successful implementation of MFCs. MFC models encompass the underlying reaction processes and limiting factors of the MFC, and come in various forms, such as mathematical equations or equivalent circuits. Different modeling focuses and approaches have emerged. In this study, we present the state of the art of MFC modeling and review past modeling methods. Models and modeling methods are elaborated according to a classification into mechanism-based and application-based models. The mechanisms, advantages, drawbacks, and application fields of the different models are illustrated as well. We provide a complete and comprehensive exposition of the different models for MFCs and offer guidance for further promoting the performance of MFCs.
Introduction to Information Visualization (InfoVis) Techniques for Model-Based Systems Engineering
NASA Technical Reports Server (NTRS)
Sindiy, Oleg; Litomisky, Krystof; Davidoff, Scott; Dekens, Frank
2013-01-01
This paper presents insights that apply across numerous system modeling languages and representation standards. The insights are drawn from best practices of Information Visualization as applied to aerospace applications.
RINGMesh: A programming library for developing mesh-based geomodeling applications
NASA Astrophysics Data System (ADS)
Pellerin, Jeanne; Botella, Arnaud; Bonneau, François; Mazuyer, Antoine; Chauvin, Benjamin; Lévy, Bruno; Caumon, Guillaume
2017-07-01
RINGMesh is a C++ open-source programming library for manipulating discretized geological models. It is designed to ease the development of applications and workflows that use discretized 3D models; it is neither a geomodeler nor a meshing software. RINGMesh implements functionalities to read discretized surface-based or volumetric structural models and to check their validity. The models can then be exported in various file formats. RINGMesh provides data structures to represent geological structural models, defined by their discretized boundary surfaces and/or by discretized volumes. A programming interface allows the development of new geomodeling methods and the integration of external software. The goal of RINGMesh is to help researchers focus on the implementation of their specific method rather than on tedious tasks common to many applications. The documented code is open-source and distributed under the modified BSD license. It is available at https://www.ring-team.org/index.php/software/ringmesh.
A genetic algorithm-based job scheduling model for big data analytics.
Lu, Qinghua; Li, Shanshan; Zhang, Weishan; Zhang, Lei
Big data analytics (BDA) applications are a new category of software applications that process large amounts of data using scalable parallel processing infrastructure to obtain hidden value. Hadoop is the most mature open-source big data analytics framework; it implements the MapReduce programming model to process big data with MapReduce jobs. Big data analytics jobs are often continuous and not mutually separated. Existing work mainly focuses on executing jobs in sequence, which is often inefficient and consumes much energy. In this paper, we propose a genetic algorithm-based job scheduling model for big data analytics applications to improve the efficiency of big data analytics. To implement the job scheduling model, we leverage an estimation module to predict the performance of clusters when executing analytics jobs. We have evaluated the proposed job scheduling model in terms of feasibility and accuracy.
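A toy version of the chromosome/fitness design such a scheduler might use is sketched below (job costs, the makespan fitness and the GA settings are illustrative assumptions; the paper derives its costs from an estimation module instead):

```python
import random

def makespan(assignment, job_cost, n_nodes):
    """Fitness: completion time of the busiest node for a job->node mapping."""
    load = [0.0] * n_nodes
    for job, node in enumerate(assignment):
        load[node] += job_cost[job]
    return max(load)

def ga_schedule(job_cost, n_nodes, pop_size=40, generations=200, p_mut=0.2):
    n = len(job_cost)
    pop = [[random.randrange(n_nodes) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda a: makespan(a, job_cost, n_nodes))
        parents = pop[: pop_size // 2]            # elitist truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)          # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < p_mut:           # mutation: reassign one job
                child[random.randrange(n)] = random.randrange(n_nodes)
            children.append(child)
        pop = parents + children
    return min(pop, key=lambda a: makespan(a, job_cost, n_nodes))

best = ga_schedule(job_cost=[5, 3, 8, 2, 7, 4, 6], n_nodes=3)
```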
González, Martín Maximino León
2009-10-01
To analyze the determinant-based health strategic planning model implemented in the municipality of Campo Bom, Rio Grande do Sul State, an observational, qualitative study was conducted, comprising documental analysis as well as an evaluation of new process technologies in local health administration. The study analyzes the methodological coherence and applicability of this model, based on a review of the elaborated plans. The plans presented in the Campo Bom case show the possibility of integrating and applying, at the local level, a health strategic planning model oriented to new health concepts, considering elements of different theoretical developments that enable responses to the most common local needs and situations. Evolutionary stages of health planning were identified, and integrative elements of the model and limitations of its application were analyzed, pointing to the need to support further study and development in the field.
Chuang, Huan-Ming; Lin, Chien-Ku; Chen, Da-Ren; Chen, You-Shyang
2013-01-01
Ecological degradation is an escalating global threat. Increasingly, people are expressing awareness of, and assigning priority to, the environmental problems surrounding them, and environmental protection issues are highlighted. An appropriate information technology tool, the increasingly popular social network system (virtual community, VC), facilitates public education and engagement and can be applied effectively to existing problems. In particular, the involvement behavior underlying VC member engagement is an interesting topic. Member engagement processes comprise interrelated sub-processes that reflect an interactive experience within VCs as well as the value co-creation model. Addressing top-focused ecotourism VCs, this study applies a hybrid expert-based ISM model and DEMATEL model, based on multi-criteria decision making tools, to investigate the complex, multidimensional and dynamic nature of member engagement. Our research findings provide insightful managerial implications and suggest that the viral marketing of ecotourism protection is of concern to practitioners and academicians alike.
NASA Astrophysics Data System (ADS)
Le, A.; Pricope, N. G.
2015-12-01
Projections indicate that increasing population density, food production, and urbanization in conjunction with changing climate conditions will place stress on water resource availability. As a result, a holistic understanding of current and future water resource distribution is necessary for creating strategies to identify the most sustainable means of accessing this resource. Currently, most water resource management strategies rely on the application of global climate predictions to physically based hydrologic models to understand potential changes in water availability. However, the need to focus on understanding community-level social behaviors that determine individual water usage is becoming increasingly evident, as predictions derived only from hydrologic models cannot accurately represent the coevolution of basin hydrology and human water and land usage. Models that are better equipped to represent the complexity and heterogeneity of human systems and satellite-derived products in place of or in conjunction with historic data significantly improve preexisting hydrologic model accuracy and application outcomes. We used a novel agent-based sociotechnical model that combines the Soil and Water Assessment Tool (SWAT) and Agent Analyst and applied it in the Nzoia Basin, an area in western Kenya that is becoming rapidly urbanized and industrialized. Informed by a combination of satellite-derived products and over 150 household surveys, the combined sociotechnical model provided unique insight into how populations self-organize and make decisions based on water availability. In addition, the model depicted how population organization and current management alter water availability currently and in the future.
Application of nonlinear adaptive motion washout to transport ground-handling simulation
NASA Technical Reports Server (NTRS)
Parrish, R. V.; Martin, D. J., Jr.
1983-01-01
The application of a nonlinear coordinated adaptive motion washout to the transport ground-handling environment is documented. Additions to both the aircraft math model and the motion washout system are discussed. The additions to the simulated-aircraft math model provided improved modeling fidelity for braking and reverse-thrust application, and the additions to the motion-base washout system allowed transition from the desired flight parameters to the less restrictive ground parameters of the washout.
Fraser, Grant; Rohde, Ken; Silburn, Mark
2017-08-01
Dissolved inorganic nitrogen (DIN) movement from Australian sugarcane farms is believed to be a major cause of crown-of-thorns starfish outbreaks which have reduced the Great Barrier Reef coral cover by ~21% (1985-2012). We develop a daily model of DIN concentration in runoff based on >200 field monitored runoff events. Runoff DIN concentrations were related to nitrogen fertiliser application rates and decreased after application with time and cumulative rainfall. Runoff after liquid fertiliser applications had higher initial DIN concentrations, though these concentrations diminished more rapidly in comparison to granular fertiliser applications. The model was validated using an independent field dataset and provided reasonable estimates of runoff DIN concentrations based on a number of modelling efficiency score results. The runoff DIN concentration model was combined with a water balance cropping model to investigate temporal aspects of sugarcane fertiliser management. Nitrogen fertiliser application in December (start of wet season) had the highest risk of DIN movement, and this was further exacerbated in years with a climate forecast for 'wet' seasonal conditions. The potential utility of a climate forecasting system to predict forthcoming wet months and hence DIN loss risk is demonstrated. Earlier fertiliser application or reducing fertiliser application rates in seasons with a wet climate forecast may markedly reduce runoff DIN loads; however, it is recommended that these findings be tested at a broader scale.
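Although the fitted coefficients are not given in the abstract, the dependence it describes (initial concentration set by the nitrogen application rate, decaying with days since application and with cumulative rainfall) is consistent with an exponential-decay form such as the following, which should be read as an assumed illustration rather than the authors' fitted equation:

```latex
C_{\text{DIN}}(t) = C_0(N_{\text{rate}})\, e^{-k_t\, t \,-\, k_r\, R(t)}
```

where $t$ is days since fertiliser application, $R(t)$ is cumulative rainfall since application, and $k_t$, $k_r$ are decay coefficients (with $C_0$ higher, but decay faster, for liquid than for granular fertiliser).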
Chen, Y; Mao, J; Lin, J; Yu, H; Peters, S; Shebley, M
2016-01-01
This subteam under the Drug Metabolism Leadership Group (Innovation and Quality Consortium) investigated the quantitative role of circulating inhibitory metabolites in drug–drug interactions using physiologically based pharmacokinetic (PBPK) modeling. Three drugs with major circulating inhibitory metabolites (amiodarone, gemfibrozil, and sertraline) were systematically evaluated in addition to the literature review of recent examples. The application of PBPK modeling in drug interactions by inhibitory parent–metabolite pairs is described and guidance on strategic application is provided. PMID:27642087
Description and availability of the SMARTS spectral model for photovoltaic applications
NASA Astrophysics Data System (ADS)
Myers, Daryl R.; Gueymard, Christian A.
2004-11-01
The limited spectral response range of photovoltaic (PV) devices requires that device performance be characterized with respect to widely varying terrestrial solar spectra. The FORTRAN code "Simple Model of the Atmospheric Radiative Transfer of Sunshine" (SMARTS) was developed for various clear-sky solar renewable energy applications. The model is partly based on parameterizations of transmittance functions in the MODTRAN/LOWTRAN band model family of radiative transfer codes. SMARTS computes spectra with a resolution of 0.5 nanometers (nm) below 400 nm, 1.0 nm from 400 nm to 1700 nm, and 5 nm from 1700 nm to 4000 nm. Fewer than 20 input parameters are required to compute spectral irradiance distributions, including spectral direct-beam, total, and diffuse hemispherical radiation, and up to 30 other spectral parameters. A spreadsheet-based graphical user interface can be used to simplify the construction of input files for the model. The model is the basis for new terrestrial reference spectra developed by the American Society for Testing and Materials (ASTM) for photovoltaic and materials degradation applications. We describe the model's accuracy, functionality, and the availability of source and executable code. Applications to PV rating and efficiency and the combined effects of spectral selectivity and varying atmospheric conditions are briefly discussed.
Russell, Solomon; Distefano, Joseph J
2006-07-01
W(3)MAMCAT is a new web-based and interactive system for building and quantifying the parameters or parameter ranges of n-compartment mammillary and catenary model structures, with input and output in the first compartment, from unstructured multiexponential (sum-of-n-exponentials) models. It handles unidentifiable as well as identifiable models and, as such, provides finite parameter interval solutions for unidentifiable models, whereas direct parameter search programs typically do not. It also tutorially develops the theory of model distinguishability for same order mammillary versus catenary models, as did its desktop application predecessor MAMCAT+. This includes expert system analysis for distinguishing mammillary from catenary structures, given input and output in similarly numbered compartments. W(3)MAMCAT provides for universal deployment via the internet and enhanced application error checking. It uses supported Microsoft technologies to form an extensible application framework for maintaining a stable and easily updatable application. Most important, anybody, anywhere, is welcome to access it using Internet Explorer 6.0 over the internet for their teaching or research needs. It is available on the Biocybernetics Laboratory website at UCLA: www.biocyb.cs.ucla.edu.
Application of Fused Deposition Modelling (FDM) Method of 3D Printing in Drug Delivery.
Long, Jingjunjiao; Gholizadeh, Hamideh; Lu, Jun; Bunt, Craig; Seyfoddin, Ali
2017-01-01
Three-dimensional (3D) printing is an emerging manufacturing technology for biomedical and pharmaceutical applications. Fused deposition modelling (FDM) is a low-cost extrusion-based 3D printing technique that deposits materials layer by layer to create solid geometries. This review article aims to provide an overview of FDM-based 3D printing applications in developing new drug delivery systems. The principal methodology, suitable polymers and important parameters of FDM technology, and its applications in the fabrication of personalised tablets and drug delivery devices, are discussed. FDM-based 3D printing is a novel and versatile manufacturing technique for creating customised drug delivery devices that contain accurate doses of medicine(s) and provide controlled drug release profiles.
Knowledge sifters in MDA technologies
NASA Astrophysics Data System (ADS)
Kravchenko, Yuri; Kursitys, Ilona; Bova, Victoria
2018-05-01
The article considers a new approach to the efficient management of information processes on the basis of object models. With the help of special design tools, a generic, application-independent model is created first, and the program is then implemented in a specific development environment. The development process is thus based entirely on a model that must contain all the information necessary for programming. The presence of a detailed model enables the automatic creation of the typical parts of the application whose development is amenable to automation.
Scalco, Andrea; Ceschi, Andrea; Sartori, Riccardo
2018-01-01
It is likely that computer simulations will assume a greater role in the near future in investigating and understanding reality (Rand & Rust, 2011). In particular, agent-based models (ABMs) represent a method of investigating social phenomena that blends the knowledge of the social sciences with the advantages of virtual simulations. Within this context, the development of algorithms able to recreate the reasoning engine of autonomous virtual agents is one of the most fragile aspects, and it is crucial to ground such models in well-supported psychological theoretical frameworks. For this reason, the present work discusses the application of the theory of planned behavior (TPB; Ajzen, 1991) in the context of agent-based modeling: it is argued that this framework might be more helpful than others in developing a valid representation of human behavior in computer simulations. Accordingly, the current contribution considers issues related to the application of the TPB model inside computer simulations and suggests potential solutions, with the hope of helping to shorten the distance between the fields of psychology and computer science.
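As a sketch of how the TPB maps onto an agent's reasoning engine (the weights, threshold and stochastic control term are hypothetical modeling choices, not prescriptions from the theory or the paper):

```python
from dataclasses import dataclass
import random

@dataclass
class TPBAgent:
    attitude: float          # evaluation of the behavior, scaled to [0, 1]
    subjective_norm: float   # perceived social pressure, [0, 1]
    pbc: float               # perceived behavioral control, [0, 1]
    weights: tuple = (0.4, 0.3, 0.3)   # hypothetical importance weights

    def intention(self) -> float:
        w1, w2, w3 = self.weights
        return w1 * self.attitude + w2 * self.subjective_norm + w3 * self.pbc

    def acts(self, threshold: float = 0.5) -> bool:
        # PBC also moderates whether intention is converted into behavior.
        return self.intention() > threshold and random.random() < self.pbc

agents = [TPBAgent(random.random(), random.random(), random.random())
          for _ in range(1000)]
adoption_rate = sum(a.acts() for a in agents) / len(agents)
```

In a full ABM, the subjective-norm component would be updated each step from the behavior of neighboring agents, which is what couples the individual TPB engines into a social simulation.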
Protein docking by the interface structure similarity: how much structure is needed?
Sinha, Rohita; Kundrotas, Petras J; Vakser, Ilya A
2012-01-01
The increasing availability of co-crystallized protein-protein complexes provides an opportunity to use template-based modeling for protein-protein docking. Structure alignment techniques are useful in the detection of remote target-template similarities, and the size of the structure involved in the alignment is important for success in modeling. This paper describes a systematic large-scale study to find the optimal definition/size of the interfaces for structure-alignment-based docking applications. The results showed that structural areas corresponding to cutoff values <12 Å across the interface inadequately represent structural details of the interfaces. As the cutoff increased beyond 12 Å, the success rate for the benchmark set of 99 protein complexes did not increase significantly for higher-accuracy models and decreased for lower-accuracy models. The 12 Å cutoff was optimal in our interface-alignment-based docking and is likely the best choice for large-scale (e.g., entire-genome) applications to protein interaction networks. The results provide guidelines for docking approaches, including high-throughput applications to modeled structures.
Advances in the Application of Decision Theory to Test-Based Decision Making.
ERIC Educational Resources Information Center
van der Linden, Wim J.
This paper reviews recent research in the Netherlands on the application of decision theory to test-based decision making about personnel selection and student placement. The review is based on an earlier model proposed for the classification of decision problems, and emphasizes an empirical Bayesian framework. Classification decisions with…
The development and application of physiologically based pharmacokinetic (PBPK) models in chemical toxicology have grown steadily since their emergence in the 1980s. However, critical evaluation of PBPK models to support public health decision-making across federal agencies has t...
Assessing the applicability of template-based protein docking in the twilight zone.
Negroni, Jacopo; Mosca, Roberto; Aloy, Patrick
2014-09-02
The structural modeling of protein interactions in the absence of close homologous templates is a challenging task. Recently, template-based docking methods have emerged that exploit local structural similarities to help ab-initio protocols provide reliable 3D models for protein interactions. In this work, we critically assess the performance of template-based docking in the twilight zone. Our results show that, while it is possible to find templates for nearly all known interactions, the quality of the obtained models is rather limited. We can increase the precision of the models at the expense of coverage, but doing so drastically reduces the potential applicability of the method, as illustrated by the whole-interactome modeling of nine organisms. Template-based docking is likely to play an important role in the structural characterization of the interaction space, but we still need to improve the repertoire of structural templates onto which we can reliably model protein complexes.
NASA Astrophysics Data System (ADS)
Li, Yangdong; Han, Zhen; Liao, Zhongping
2009-10-01
Spatiality, temporality, legality, accuracy and continuality are characteristic of cadastral information, and the cadastral management demands that the cadastral data should be accurate, integrated and updated timely. It's a good idea to build an effective GIS management system to manage the cadastral data which are characterized by spatiality and temporality. Because no sound spatio-temporal data models have been adopted, however, the spatio-temporal characteristics of cadastral data are not well expressed in the existing cadastral management systems. An event-version-based spatio-temporal modeling approach is first proposed from the angle of event and version. Then with the help of it, an event-version-based spatio-temporal cadastral data model is built to represent spatio-temporal cadastral data. At last, the previous model is used in the design and implementation of a spatio-temporal cadastral management system. The result of the application of the system shows that the event-version-based spatio-temporal data model is very suitable for the representation and organization of cadastral data.
Model-based video segmentation for vision-augmented interactive games
NASA Astrophysics Data System (ADS)
Liu, Lurng-Kuo
2000-04-01
This paper presents an architecture and algorithms for model based video object segmentation and its applications to vision augmented interactive game. We are especially interested in real time low cost vision based applications that can be implemented in software in a PC. We use different models for background and a player object. The object segmentation algorithm is performed in two different levels: pixel level and object level. At pixel level, the segmentation algorithm is formulated as a maximizing a posteriori probability (MAP) problem. The statistical likelihood of each pixel is calculated and used in the MAP problem. Object level segmentation is used to improve segmentation quality by utilizing the information about the spatial and temporal extent of the object. The concept of an active region, which is defined based on motion histogram and trajectory prediction, is introduced to indicate the possibility of a video object region for both background and foreground modeling. It also reduces the overall computation complexity. In contrast with other applications, the proposed video object segmentation system is able to create background and foreground models on the fly even without introductory background frames. Furthermore, we apply different rate of self-tuning on the scene model so that the system can adapt to the environment when there is a scene change. We applied the proposed video object segmentation algorithms to several prototype virtual interactive games. In our prototype vision augmented interactive games, a player can immerse himself/herself inside a game and can virtually interact with other animated characters in a real time manner without being constrained by helmets, gloves, special sensing devices, or background environment. The potential applications of the proposed algorithms including human computer gesture interface and object based video coding such as MPEG-4 video coding.
NASA Astrophysics Data System (ADS)
Cipriani, L.; Fantini, F.; Bertacchi, S.
2014-06-01
Image-based modelling tools based on SfM algorithms gained great popularity since several software houses provided applications able to achieve 3D textured models easily and automatically. The aim of this paper is to point out the importance of controlling models parameterization process, considering that automatic solutions included in these modelling tools can produce poor results in terms of texture utilization. In order to achieve a better quality of textured models from image-based modelling applications, this research presents a series of practical strategies aimed at providing a better balance between geometric resolution of models from passive sensors and their corresponding (u,v) map reference systems. This aspect is essential for the achievement of a high-quality 3D representation, since "apparent colour" is a fundamental aspect in the field of Cultural Heritage documentation. Complex meshes without native parameterization have to be "flatten" or "unwrapped" in the (u,v) parameter space, with the main objective to be mapped with a single image. This result can be obtained by using two different strategies: the former automatic and faster, while the latter manual and time-consuming. Reverse modelling applications provide automatic solutions based on splitting the models by means of different algorithms, that produce a sort of "atlas" of the original model in the parameter space, in many instances not adequate and negatively affecting the overall quality of representation. Using in synergy different solutions, ranging from semantic aware modelling techniques to quad-dominant meshes achieved using retopology tools, it is possible to obtain a complete control of the parameterization process.
Integration agent-based models and GIS as a virtual urban dynamic laboratory
NASA Astrophysics Data System (ADS)
Chen, Peng; Liu, Miaolong
2007-06-01
Based on the Agent-based Model and spatial data model, a tight-coupling integrating method of GIS and Agent-based Model (ABM) is to be discussed in this paper. The use of object-orientation for both spatial data and spatial process models facilitates their integration, which can allow exploration and explanation of spatial-temporal phenomena such as urban dynamic. In order to better understand how tight coupling might proceed and to evaluate the possible functional and efficiency gains from such a tight coupling, the agent-based model and spatial data model are discussed, and then the relationships affecting spatial data model and agent-based process models interaction. After that, a realistic crowd flow simulation experiment is presented. Using some tools provided by general GIS systems and a few specific programming languages, a new software system integrating GIS and MAS as a virtual laboratory applicable for simulating pedestrian flows in a crowd activity centre has been developed successfully. Under the environment supported by the software system, as an applicable case, a dynamic evolution process of the pedestrian's flows (dispersed process for the spectators) in a crowds' activity center - The Shanghai Stadium has been simulated successfully. At the end of the paper, some new research problems have been pointed out for the future.
The Missing Stakeholder Group: Why Patients Should be Involved in Health Economic Modelling.
van Voorn, George A K; Vemer, Pepijn; Hamerlijnck, Dominique; Ramos, Isaac Corro; Teunissen, Geertruida J; Al, Maiwenn; Feenstra, Talitha L
2016-04-01
Evaluations of healthcare interventions, e.g. new drugs or other new treatment strategies, commonly include a cost-effectiveness analysis (CEA) that is based on the application of health economic (HE) models. As end users, patients are important stakeholders regarding the outcomes of CEAs, yet their knowledge of HE model development and application, or their involvement therein, is absent. This paper considers possible benefits and risks of patient involvement in HE model development and application for modellers and patients. An exploratory review of the literature has been performed on stakeholder-involved modelling in various disciplines. In addition, Dutch patient experts have been interviewed about their experience in, and opinion about, the application of HE models. Patients have little to no knowledge of HE models and are seldom involved in HE model development and application. Benefits of becoming involved would include a greater understanding and possible acceptance by patients of HE model application, improved model validation, and a more direct infusion of patient expertise. Risks would include patient bias and increased costs of modelling. Patient involvement in HE modelling seems to carry several benefits as well as risks. We claim that the benefits may outweigh the risks and that patients should become involved.
Constraint-Based Local Search for Constrained Optimum Paths Problems
NASA Astrophysics Data System (ADS)
Pham, Quang Dung; Deville, Yves; van Hentenryck, Pascal
Constrained Optimum Path (COP) problems arise in many real-life applications and are ubiquitous in communication networks. They have been traditionally approached by dedicated algorithms, which are often hard to extend with side constraints and to apply widely. This paper proposes a constraint-based local search (CBLS) framework for COP applications, bringing the compositionality, reuse, and extensibility at the core of CBLS and CP systems. The modeling contribution is the ability to express compositional models for various COP applications at a high level of abstraction, while cleanly separating the model and the search procedure. The main technical contribution is a connected neighborhood based on rooted spanning trees to find high-quality solutions to COP problems. The framework, implemented in COMET, is applied to Resource Constrained Shortest Path (RCSP) problems (with and without side constraints) and to the edge-disjoint paths problem (EDP). Computational results show the potential significance of the approach.
Modeling method of time sequence model based grey system theory and application proceedings
NASA Astrophysics Data System (ADS)
Wei, Xuexia; Luo, Yaling; Zhang, Shiqiang
2015-12-01
This article gives a modeling method of grey system GM(1,1) model based on reusing information and the grey system theory. This method not only extremely enhances the fitting and predicting accuracy of GM(1,1) model, but also maintains the conventional routes' merit of simple computation. By this way, we have given one syphilis trend forecast method based on reusing information and the grey system GM(1,1) model.
Pharmacokinetic modeling in aquatic animals. 1. Models and concepts
Barron, M.G.; Stehly, Guy R.; Hayton, W.L.
1990-01-01
While clinical and toxicological applications of pharmacokinetics have continued to evolve both conceptually and experimentally, pharmacokinetics modeling in aquatic animals has not progressed accordingly. In this paper we present methods and concepts of pharmacokinetic modeling in aquatic animals using multicompartmental, clearance-based, non-compartmental and physiologically-based pharmacokinetic models. These models should be considered as alternatives to traditional approaches, which assume that the animal acts as a single homogeneous compartment based on apparent monoexponential elimination.
NASA Astrophysics Data System (ADS)
Gendreau, Audrey
Efficient self-organizing virtual clusterheads that supervise data collection based on their wireless connectivity, risk, and overhead costs, are an important element of Wireless Sensor Networks (WSNs). This function is especially critical during deployment when system resources are allocated to a subsequent application. In the presented research, a model used to deploy intrusion detection capability on a Local Area Network (LAN), in the literature, was extended to develop a role-based hierarchical agent deployment algorithm for a WSN. The resulting model took into consideration the monitoring capability, risk, deployment distribution cost, and monitoring cost associated with each node. Changing the original LAN methodology approach to model a cluster-based sensor network depended on the ability to duplicate a specific parameter that represented the monitoring capability. Furthermore, other parameters derived from a LAN can elevate costs and risk of deployment, as well as jeopardize the success of an application on a WSN. A key component of the approach presented in this research was to reduce the costs when established clusterheads in the network were found to be capable of hosting additional detection agents. In addition, another cost savings component of the study addressed the reduction of vulnerabilities associated with deployment of agents to high volume nodes. The effectiveness of the presented method was validated by comparing it against a type of a power-based scheme that used each node's remaining energy as the deployment value. While available energy is directly related to the model used in the presented method, the study deliberately sought out nodes that were identified with having superior monitoring capability, cost less to create and sustain, and are at low-risk of an attack. This work investigated improving the efficiency of an intrusion detection system (IDS) by using the proposed model to deploy monitoring agents after a temperature sensing application had established the network traffic flow to the sink. The same scenario was repeated using a power-based IDS to compare it against the proposed model. To identify a clusterhead's ability to host monitoring agents after the temperature sensing application terminated, the deployed IDS utilized the communication history and other network factors in order to rank the nodes. Similarly, using the node's communication history, the deployed power-based IDS ranked nodes based on their remaining power. For each individual scenario, and after the IDS application was deployed, the temperature sensing application was run for a second time. This time, to monitor the temperature sensing agents as the data flowed towards the sink, the network traffic was rerouted through the new intrusion detection clusterheads. Consequently, if the clusterheads were shared, the re-routing step was not preformed. Experimental results in this research demonstrated the effectiveness of applying a robust deployment metric to improve upon the energy efficiency of a deployed application in a multi-application WSN. It was found that in the scenarios with the intrusion detection application that utilized the proposed model resulted in more remaining energy than in the scenarios that implemented the power-based IDS. The algorithm especially had a positive impact on the small, dense, and more homogeneous networks. This finding was reinforced by the smaller percentage of new clusterheads that was selected. 
Essentially, the energy cost of the route to the sink was reduced because the network traffic was rerouted through fewer new clusterheads. Additionally, it was found that the intrusion detection topology that used the proposed approach formed smaller and more connected sets of clusterheads than the power-based IDS. As a consequence, this proposed approach essentially achieved the research objective for enhancing energy use in a multi-application WSN.
NASA Astrophysics Data System (ADS)
Kock, B. E.
2008-12-01
The increased availability and understanding of agent-based modeling technology and techniques provides a unique opportunity for water resources modelers, allowing them to go beyond traditional behavioral approaches from neoclassical economics, and add rich cognition to social-hydrological models. Agent-based models provide for an individual focus, and the easier and more realistic incorporation of learning, memory and other mechanisms for increased cognitive sophistication. We are in an age of global change impacting complex water resources systems, and social responses are increasingly recognized as fundamentally adaptive and emergent. In consideration of this, water resources models and modelers need to better address social dynamics in a manner beyond the capabilities of neoclassical economics theory and practice. However, going beyond the unitary curve requires unique levels of engagement with stakeholders, both to elicit the richer knowledge necessary for structuring and parameterizing agent-based models, but also to make sure such models are appropriately used. With the aim of encouraging epistemological and methodological convergence in the agent-based modeling of water resources, we have developed a water resources-specific cognitive model and an associated collaborative modeling process. Our cognitive model emphasizes efficiency in architecture and operation, and capacity to adapt to different application contexts. We describe a current application of this cognitive model and modeling process in the Arkansas Basin of Colorado. In particular, we highlight the potential benefits of, and challenges to, using more sophisticated cognitive models in agent-based water resources models.
Whelan, M J; Davenport, E J; Smith, B G
2007-05-15
A screening model of pesticide leaching loss is described which forms part of a multi-criteria risk-based indicator system called PRoMPT (Pesticide Risk Management and Profiling Tool). The leaching model evaluates pesticide fate in soil for any application rate and time of application (including multiple applications), for any land-based location in the world. It considers a generic evaluative environment with fixed dimensions and soil properties. The soil profile is conceptualised as a number of discrete layers. Equilibrium partitioning between adsorbed and dissolved chemical (based on the organic carbon-water partition coefficient [K(OC)]) is assumed in each time step, in each layer. Non-leaching losses are described using first order kinetics. Drainage is assumed to be uniform throughout the soil profile but varies temporally. The drainage rate, which can be augmented by evapotranspiration-adjusted irrigation, is derived from long-term mean monthly water balance model calculations performed for 30 arc-minute grid cells across the entire ice-free land surface of the earth. Although, such predictions are approximate, they do capture the seasonality and relative magnitude of drainage and allow the model to be applied anywhere, without the need for extensive data compilation. PRoMPT predictions are shown to be consistent with those made by more sophisticated models (PRZM, PELMO and PEARL) for the FOCUS groundwater scenarios.
Development and application of air quality models at the US ...
Overview of the development and application of air quality models at the U.S. EPA, particularly focused on the development and application of the Community Multiscale Air Quality (CMAQ) model developed within the Computation Exposure Division (CED) of the National Exposure Research Laboratory (NERL). This presentation will provide a simple overview of air quality model development and application geared toward a non-technical student audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
Assessment of municipal solid waste settlement models based on field-scale data analysis.
Bareither, Christopher A; Kwak, Seungbok
2015-08-01
An evaluation of municipal solid waste (MSW) settlement model performance and applicability was conducted based on analysis of two field-scale datasets: (1) Yolo and (2) Deer Track Bioreactor Experiment (DTBE). Twelve MSW settlement models were considered that included a range of compression behavior (i.e., immediate compression, mechanical creep, and biocompression) and range of total (2-22) and optimized (2-7) model parameters. A multi-layer immediate settlement analysis developed for Yolo provides a framework to estimate initial waste thickness and waste thickness at the end-of-immediate compression. Model application to the Yolo test cells (conventional and bioreactor landfills) via least squares optimization yielded high coefficient of determinations for all settlement models (R(2)>0.83). However, empirical models (i.e., power creep, logarithmic, and hyperbolic models) are not recommended for use in MSW settlement modeling due to potential non-representative long-term MSW behavior, limited physical significance of model parameters, and required settlement data for model parameterization. Settlement models that combine mechanical creep and biocompression into a single mathematical function constrain time-dependent settlement to a single process with finite magnitude, which limits model applicability. Overall, all models evaluated that couple multiple compression processes (immediate, creep, and biocompression) provided accurate representations of both Yolo and DTBE datasets. A model presented in Gourc et al. (2010) included the lowest number of total and optimized model parameters and yielded high statistical performance for all model applications (R(2)⩾0.97). Copyright © 2015 Elsevier Ltd. All rights reserved.
MESA: An Interactive Modeling and Simulation Environment for Intelligent Systems Automation
NASA Technical Reports Server (NTRS)
Charest, Leonard
1994-01-01
This report describes MESA, a software environment for creating applications that automate NASA mission opterations. MESA enables intelligent automation by utilizing model-based reasoning techniques developed in the field of Artificial Intelligence. Model-based reasoning techniques are realized in Mesa through native support of causal modeling and discrete event simulation.
Michael A. Larson; Frank R., III Thompson; Joshua J. Millspaugh; William D. Dijak; Stephen R. Shifley
2004-01-01
Methods for habitat modeling based on landscape simulations and population viability modeling based on habitat quality are well developed, but no published study of which we are aware has effectively joined them in a single, comprehensive analysis. We demonstrate the application of a population viability model for ovenbirds (Seiurus aurocapillus)...
Physiologically based kinetic (PBK) models are used widely throughout a number of working sectors, including academia and industry, to provide insight into the dosimetry related to observed adverse health effects in humans and other species. Use of these models has increased over...
Schroeter, Timon Sebastian; Schwaighofer, Anton; Mika, Sebastian; Ter Laak, Antonius; Suelzle, Detlev; Ganzer, Ursula; Heinrich, Nikolaus; Müller, Klaus-Robert
2007-12-01
We investigate the use of different Machine Learning methods to construct models for aqueous solubility. Models are based on about 4000 compounds, including an in-house set of 632 drug discovery molecules of Bayer Schering Pharma. For each method, we also consider an appropriate method to obtain error bars, in order to estimate the domain of applicability (DOA) for each model. Here, we investigate error bars from a Bayesian model (Gaussian Process (GP)), an ensemble based approach (Random Forest), and approaches based on the Mahalanobis distance to training data (for Support Vector Machine and Ridge Regression models). We evaluate all approaches in terms of their prediction accuracy (in cross-validation, and on an external validation set of 536 molecules) and in how far the individual error bars can faithfully represent the actual prediction error.
Schroeter, Timon Sebastian; Schwaighofer, Anton; Mika, Sebastian; Ter Laak, Antonius; Suelzle, Detlev; Ganzer, Ursula; Heinrich, Nikolaus; Müller, Klaus-Robert
2007-09-01
We investigate the use of different Machine Learning methods to construct models for aqueous solubility. Models are based on about 4000 compounds, including an in-house set of 632 drug discovery molecules of Bayer Schering Pharma. For each method, we also consider an appropriate method to obtain error bars, in order to estimate the domain of applicability (DOA) for each model. Here, we investigate error bars from a Bayesian model (Gaussian Process (GP)), an ensemble based approach (Random Forest), and approaches based on the Mahalanobis distance to training data (for Support Vector Machine and Ridge Regression models). We evaluate all approaches in terms of their prediction accuracy (in cross-validation, and on an external validation set of 536 molecules) and in how far the individual error bars can faithfully represent the actual prediction error.
NASA Astrophysics Data System (ADS)
Schroeter, Timon Sebastian; Schwaighofer, Anton; Mika, Sebastian; Ter Laak, Antonius; Suelzle, Detlev; Ganzer, Ursula; Heinrich, Nikolaus; Müller, Klaus-Robert
2007-12-01
We investigate the use of different Machine Learning methods to construct models for aqueous solubility. Models are based on about 4000 compounds, including an in-house set of 632 drug discovery molecules of Bayer Schering Pharma. For each method, we also consider an appropriate method to obtain error bars, in order to estimate the domain of applicability (DOA) for each model. Here, we investigate error bars from a Bayesian model (Gaussian Process (GP)), an ensemble based approach (Random Forest), and approaches based on the Mahalanobis distance to training data (for Support Vector Machine and Ridge Regression models). We evaluate all approaches in terms of their prediction accuracy (in cross-validation, and on an external validation set of 536 molecules) and in how far the individual error bars can faithfully represent the actual prediction error.
NASA Astrophysics Data System (ADS)
Schroeter, Timon Sebastian; Schwaighofer, Anton; Mika, Sebastian; Ter Laak, Antonius; Suelzle, Detlev; Ganzer, Ursula; Heinrich, Nikolaus; Müller, Klaus-Robert
2007-09-01
We investigate the use of different Machine Learning methods to construct models for aqueous solubility. Models are based on about 4000 compounds, including an in-house set of 632 drug discovery molecules of Bayer Schering Pharma. For each method, we also consider an appropriate method to obtain error bars, in order to estimate the domain of applicability (DOA) for each model. Here, we investigate error bars from a Bayesian model (Gaussian Process (GP)), an ensemble based approach (Random Forest), and approaches based on the Mahalanobis distance to training data (for Support Vector Machine and Ridge Regression models). We evaluate all approaches in terms of their prediction accuracy (in cross-validation, and on an external validation set of 536 molecules) and in how far the individual error bars can faithfully represent the actual prediction error.
ERIC Educational Resources Information Center
Thompson, Kate; Reimann, Peter
2010-01-01
A classification system that was developed for the use of agent-based models was applied to strategies used by school-aged students to interrogate an agent-based model and a system dynamics model. These were compared, and relationships between learning outcomes and the strategies used were also analysed. It was found that the classification system…
NASA Astrophysics Data System (ADS)
Cheng, K.; Guo, L. M.; Wang, Y. K.; Zafar, M. T.
2017-11-01
In order to select effective samples in the large number of data of PV power generation years and improve the accuracy of PV power generation forecasting model, this paper studies the application of clustering analysis in this field and establishes forecasting model based on neural network. Based on three different types of weather on sunny, cloudy and rainy days, this research screens samples of historical data by the clustering analysis method. After screening, it establishes BP neural network prediction models using screened data as training data. Then, compare the six types of photovoltaic power generation prediction models before and after the data screening. Results show that the prediction model combining with clustering analysis and BP neural networks is an effective method to improve the precision of photovoltaic power generation.
Forest management applications of Landsat data in a geographic information system
NASA Technical Reports Server (NTRS)
Maw, K. D.; Brass, J. A.
1982-01-01
The utility of land-cover data resulting from Landsat MSS classification can be greatly enhanced by use in combination with ancillary data. A demonstration forest management applications data base was constructed for Santa Cruz County, California, to demonstrate geographic information system applications of classified Landsat data. The data base contained detailed soils, digital terrain, land ownership, jurisdictional boundaries, fire events, and generalized land-use data, all registered to a UTM grid base. Applications models were developed from problems typical of fire management and reforestation planning.
NASA Astrophysics Data System (ADS)
Seoud, Ahmed; Kim, Juhwan; Ma, Yuansheng; Jayaram, Srividya; Hong, Le; Chae, Gyu-Yeol; Lee, Jeong-Woo; Park, Dae-Jin; Yune, Hyoung-Soon; Oh, Se-Young; Park, Chan-Ha
2018-03-01
Sub-resolution assist feature (SRAF) insertion techniques have been effectively used for a long time now to increase process latitude in the lithography patterning process. Rule-based SRAF and model-based SRAF are complementary solutions, and each has its own benefits, depending on the objectives of applications and the criticality of the impact on manufacturing yield, efficiency, and productivity. Rule-based SRAF provides superior geometric output consistency and faster runtime performance, but the associated recipe development time can be of concern. Model-based SRAF provides better coverage for more complicated pattern structures in terms of shapes and sizes, with considerably less time required for recipe development, although consistency and performance may be impacted. In this paper, we introduce a new model-assisted template extraction (MATE) SRAF solution, which employs decision tree learning in a model-based solution to provide the benefits of both rule-based and model-based SRAF insertion approaches. The MATE solution is designed to automate the creation of rules/templates for SRAF insertion, and is based on the SRAF placement predicted by model-based solutions. The MATE SRAF recipe provides optimum lithographic quality in relation to various manufacturing aspects in a very short time, compared to traditional methods of rule optimization. Experiments were done using memory device pattern layouts to compare the MATE solution to existing model-based SRAF and pixelated SRAF approaches, based on lithographic process window quality, runtime performance, and geometric output consistency.
An efficient and scalable deformable model for virtual reality-based medical applications.
Choi, Kup-Sze; Sun, Hanqiu; Heng, Pheng-Ann
2004-09-01
Modeling of tissue deformation is of great importance to virtual reality (VR)-based medical simulations. Considerable effort has been dedicated to the development of interactively deformable virtual tissues. In this paper, an efficient and scalable deformable model is presented for virtual-reality-based medical applications. It considers deformation as a localized force transmittal process which is governed by algorithms based on breadth-first search (BFS). The computational speed is scalable to facilitate real-time interaction by adjusting the penetration depth. Simulated annealing (SA) algorithms are developed to optimize the model parameters by using the reference data generated with the linear static finite element method (FEM). The mechanical behavior and timing performance of the model have been evaluated. The model has been applied to simulate the typical behavior of living tissues and anisotropic materials. Integration with a haptic device has also been achieved on a generic personal computer (PC) platform. The proposed technique provides a feasible solution for VR-based medical simulations and has the potential for multi-user collaborative work in virtual environment.
An enhanced lumped element electrical model of a double barrier memristive device
NASA Astrophysics Data System (ADS)
Solan, Enver; Dirkmann, Sven; Hansen, Mirko; Schroeder, Dietmar; Kohlstedt, Hermann; Ziegler, Martin; Mussenbrock, Thomas; Ochs, Karlheinz
2017-05-01
The massive parallel approach of neuromorphic circuits leads to effective methods for solving complex problems. It has turned out that resistive switching devices with a continuous resistance range are potential candidates for such applications. These devices are memristive systems—nonlinear resistors with memory. They are fabricated in nanotechnology and hence parameter spread during fabrication may aggravate reproducible analyses. This issue makes simulation models of memristive devices worthwhile. Kinetic Monte-Carlo simulations based on a distributed model of the device can be used to understand the underlying physical and chemical phenomena. However, such simulations are very time-consuming and neither convenient for investigations of whole circuits nor for real-time applications, e.g. emulation purposes. Instead, a concentrated model of the device can be used for both fast simulations and real-time applications, respectively. We introduce an enhanced electrical model of a valence change mechanism (VCM) based double barrier memristive device (DBMD) with a continuous resistance range. This device consists of an ultra-thin memristive layer sandwiched between a tunnel barrier and a Schottky-contact. The introduced model leads to very fast simulations by using usual circuit simulation tools while maintaining physically meaningful parameters. Kinetic Monte-Carlo simulations based on a distributed model and experimental data have been utilized as references to verify the concentrated model.
An access control model with high security for distributed workflow and real-time application
NASA Astrophysics Data System (ADS)
Han, Ruo-Fei; Wang, Hou-Xiang
2007-11-01
The traditional mandatory access control policy (MAC) is regarded as a policy with strict regulation and poor flexibility. The security policy of MAC is so compelling that few information systems would adopt it at the cost of facility, except some particular cases with high security requirement as military or government application. However, with the increasing requirement for flexibility, even some access control systems in military application have switched to role-based access control (RBAC) which is well known as flexible. Though RBAC can meet the demands for flexibility but it is weak in dynamic authorization and consequently can not fit well in the workflow management systems. The task-role-based access control (T-RBAC) is then introduced to solve the problem. It combines both the advantages of RBAC and task-based access control (TBAC) which uses task to manage permissions dynamically. To satisfy the requirement of system which is distributed, well defined with workflow process and critically for time accuracy, this paper will analyze the spirit of MAC, introduce it into the improved T&RBAC model which is based on T-RBAC. At last, a conceptual task-role-based access control model with high security for distributed workflow and real-time application (A_T&RBAC) is built, and its performance is simply analyzed.
Thoracic respiratory motion estimation from MRI using a statistical model and a 2-D image navigator.
King, A P; Buerger, C; Tsoumpas, C; Marsden, P K; Schaeffter, T
2012-01-01
Respiratory motion models have potential application for estimating and correcting the effects of motion in a wide range of applications, for example in PET-MR imaging. Given that motion cycles caused by breathing are only approximately repeatable, an important quality of such models is their ability to capture and estimate the intra- and inter-cycle variability of the motion. In this paper we propose and describe a technique for free-form nonrigid respiratory motion correction in the thorax. Our model is based on a principal component analysis of the motion states encountered during different breathing patterns, and is formed from motion estimates made from dynamic 3-D MRI data. We apply our model using a data-driven technique based on a 2-D MRI image navigator. Unlike most previously reported work in the literature, our approach is able to capture both intra- and inter-cycle motion variability. In addition, the 2-D image navigator can be used to estimate how applicable the current motion model is, and hence report when more imaging data is required to update the model. We also use the motion model to decide on the best positioning for the image navigator. We validate our approach using MRI data acquired from 10 volunteers and demonstrate improvements of up to 40.5% over other reported motion modelling approaches, which corresponds to 61% of the overall respiratory motion present. Finally we demonstrate one potential application of our technique: MRI-based motion correction of real-time PET data for simultaneous PET-MRI acquisition. Copyright © 2011 Elsevier B.V. All rights reserved.
Calibration Modeling Methodology to Optimize Performance for Low Range Applications
NASA Technical Reports Server (NTRS)
McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.
2010-01-01
Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty, expressed as a percent of full-scale. However in some applications, we seek to obtain enhanced performance at the low range, therefore expressing the accuracy as a percent of reading should be considered as a modeling strategy. For example, it is common to desire to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent of reading accuracy requirement, which has broad application in all types of transducer applications where low range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System that employs seven strain-gage based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
Kovács, R; Miháltz, P; Csikor, Zs
2007-01-01
The application of an ASM1-based mathematical model for the modeling of autothermal thermophilic aerobic digestion is demonstrated. Based on former experimental results the original ASM1 was extended by the activation of facultative thermophiles from the feed sludge and a new component, the thermophilic biomass was introduced. The resulting model was calibrated in the temperature range of 20-60 degrees C. The temperature dependence of the growth and decay rates in the model is given in terms of the slightly modified Arrhenius and Topiwala-Sinclair equations. The capabilities of the calibrated model in realistic ATAD scenarios are demonstrated with a focus on autothermal properties of ATAD systems at different conditions.
A Field-Based Curriculum Model for Earth Science Teacher-Preparation Programs.
ERIC Educational Resources Information Center
Dubois, David D.
1979-01-01
This study proposed a model set of cognitive-behavioral objectives for field-based teacher education programs for earth science teachers. It describes field experience integration into teacher education programs. The model is also applicable for evaluation of earth science teacher education programs. (RE)
Bromochloromethane (BCM) is a volatile organic compound and a by-product of disinfection of water by chlorination. Physiologically based pharmacokinetic (PBPK) models are used in risk assessment applications and a PBPK model for BCM, Updated with F-344 specific input parameters,...
HexSim - A general purpose framework for spatially-explicit, individual-based modeling
HexSim is a framework for constructing spatially-explicit, individual-based computer models designed for simulating terrestrial wildlife population dynamics and interactions. HexSim is useful for a broad set of modeling applications. This talk will focus on a subset of those ap...
NASA Astrophysics Data System (ADS)
Ulfa, Andi Maria; Sugiyarto, Kristian H.; Ikhsan, Jaslin
2017-05-01
Poor achievement of students' performance on Chemistry may result from unfavourable learning processes. Therefore, innovation on learning process must be created. Regarding fast development of mobile technology, learning process cannot ignore the crucial role of the technology. This research and development (R&D) studies was done to develop android based application and to study the effect of its integration in Learning together (LT) into the improvement of students' learning creativity and cognitive achievement. The development of the application was carried out by adapting Borg & Gall and Dick & Carey model. The developed-product was reviewed by chemist, learning media practitioners, peer reviewers, and educators. After the revision based on the reviews, the application was used in the LT model on the topic of Stoichiometry in a senior high school. The instruments were questionnaires to get comments and suggestion from the reviewers about the application, and the another questionnaire was to collect the data of learning creativity. Another instrument used was a set of test by which data of students' achievement was collected. The results showed that the use of the mobile based application on Learning Together can bring about significant improvement of students' performance including creativity and cognitive achievement.
General Purpose Data-Driven Online System Health Monitoring with Applications to Space Operations
NASA Technical Reports Server (NTRS)
Iverson, David L.; Spirkovska, Lilly; Schwabacher, Mark
2010-01-01
Modern space transportation and ground support system designs are becoming increasingly sophisticated and complex. Determining the health state of these systems using traditional parameter limit checking, or model-based or rule-based methods is becoming more difficult as the number of sensors and component interactions grows. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults, failures, or precursors of significant failures. The Inductive Monitoring System (IMS) is a general purpose, data-driven system health monitoring software tool that has been successfully applied to several aerospace applications and is under evaluation for anomaly detection in vehicle and ground equipment for next generation launch systems. After an introduction to IMS application development, we discuss these NASA online monitoring applications, including the integration of IMS with complementary model-based and rule-based methods. Although the examples presented in this paper are from space operations applications, IMS is a general-purpose health-monitoring tool that is also applicable to power generation and transmission system monitoring.
Linhart, S. Mike; Nania, Jon F.; Christiansen, Daniel E.; Hutchinson, Kasey J.; Sanders, Curtis L.; Archfield, Stacey A.
2013-01-01
A variety of individuals from water resource managers to recreational users need streamflow information for planning and decisionmaking at locations where there are no streamgages. To address this problem, two statistically based methods, the Flow Duration Curve Transfer method and the Flow Anywhere method, were developed for statewide application and the two physically based models, the Precipitation Runoff Modeling-System and the Soil and Water Assessment Tool, were only developed for application for the Cedar River Basin. Observed and estimated streamflows for the two methods and models were compared for goodness of fit at 13 streamgages modeled in the Cedar River Basin by using the Nash-Sutcliffe and the percent-bias efficiency values. Based on median and mean Nash-Sutcliffe values for the 13 streamgages the Precipitation Runoff Modeling-System and Soil and Water Assessment Tool models appear to have performed similarly and better than Flow Duration Curve Transfer and Flow Anywhere methods. Based on median and mean percent bias values, the Soil and Water Assessment Tool model appears to have generally overestimated daily mean streamflows, whereas the Precipitation Runoff Modeling-System model and statistical methods appear to have underestimated daily mean streamflows. The Flow Duration Curve Transfer method produced the lowest median and mean percent bias values and appears to perform better than the other models.
From impedance theory to needle electrode guidance in tissue
NASA Astrophysics Data System (ADS)
Kalvøy, Håvard; Høyum, Per; Grimnes, Sverre; Martinsen, Ørjan G.
2010-04-01
Fast access to blood vessels or other tissues/organs can be crucial in clinical or acute medical treatment. We have developed a method for needle guidance for use in different types of applications. The feasibility of an automatic application for fast access to blood vessels during acute cardiac arrest, based on this method, has been evaluated. Suited electrode setups were found by development of needle electrode models used in simulation and sensitivity analyses. In vitro measurements were done both to determine the fundamental properties of the electrodes for use in the models and to confirm the simulation results. Development of algorithms for tissue characterization and differentiation was based on in vivo impedance measurement in porcine models and confirmed in human tissue in vivo. Feasibility was proven by application prototyping and impedance data presented as invasive Electrical Impedance Tomography (iEIT). Our conclusion is that this method can be utilized in a wide range of clinical applications.
Probabilistic arithmetic automata and their applications.
Marschall, Tobias; Herms, Inke; Kaltenbach, Hans-Michael; Rahmann, Sven
2012-01-01
We present a comprehensive review on probabilistic arithmetic automata (PAAs), a general model to describe chains of operations whose operands depend on chance, along with two algorithms to numerically compute the distribution of the results of such probabilistic calculations. PAAs provide a unifying framework to approach many problems arising in computational biology and elsewhere. We present five different applications, namely 1) pattern matching statistics on random texts, including the computation of the distribution of occurrence counts, waiting times, and clump sizes under hidden Markov background models; 2) exact analysis of window-based pattern matching algorithms; 3) sensitivity of filtration seeds used to detect candidate sequence alignments; 4) length and mass statistics of peptide fragments resulting from enzymatic cleavage reactions; and 5) read length statistics of 454 and IonTorrent sequencing reads. The diversity of these applications indicates the flexibility and unifying character of the presented framework. While the construction of a PAA depends on the particular application, we single out a frequently applicable construction method: We introduce deterministic arithmetic automata (DAAs) to model deterministic calculations on sequences, and demonstrate how to construct a PAA from a given DAA and a finite-memory random text model. This procedure is used for all five discussed applications and greatly simplifies the construction of PAAs. Implementations are available as part of the MoSDi package. Its application programming interface facilitates the rapid development of new applications based on the PAA framework.
Harbison, K; Kelly, J; Burnell, L; Silva, J
1995-01-01
The Scenario-based Engineering Process (SEP) is a user-focused methodology for large and complex system design. This process supports new application development from requirements analysis with domain models to component selection, design and modification, implementation, integration, and archival placement. It is built upon object-oriented methodologies, domain modeling strategies, and scenario-based techniques to provide an analysis process for mapping application requirements to available components. We are using SEP in the health care applications that we are developing. The process has already achieved success in the manufacturing and military domains and is being adopted by many organizations. SEP should prove viable in any domain containing scenarios that can be decomposed into tasks.
2017-03-30
experimental evaluations for hosting DDDAS-like applications in public cloud infrastructures . Finally, we report on ongoing work towards using the DDDAS...developed and their experimental evaluations for hosting DDDAS-like applications in public cloud infrastructures . Finally, we report on ongoing work towards...Dynamic resource management, model learning, simulation-based optimizations, cloud infrastructures for DDDAS applications. I. INTRODUCTION Critical cyber
a Modeling Method of Fluttering Leaves Based on Point Cloud
NASA Astrophysics Data System (ADS)
Tang, J.; Wang, Y.; Zhao, Y.; Hao, W.; Ning, X.; Lv, K.; Shi, Z.; Zhao, M.
2017-09-01
Leaves falling gently or fluttering are common phenomenon in nature scenes. The authenticity of leaves falling plays an important part in the dynamic modeling of natural scenes. The leaves falling model has a widely applications in the field of animation and virtual reality. We propose a novel modeling method of fluttering leaves based on point cloud in this paper. According to the shape, the weight of leaves and the wind speed, three basic trajectories of leaves falling are defined, which are the rotation falling, the roll falling and the screw roll falling. At the same time, a parallel algorithm based on OpenMP is implemented to satisfy the needs of real-time in practical applications. Experimental results demonstrate that the proposed method is amenable to the incorporation of a variety of desirable effects.
Model Data Interoperability for the United States Integrated Ocean Observing System (IOOS)
NASA Astrophysics Data System (ADS)
Signell, Richard P.
2010-05-01
Model data interoperability for the United States Integrated Ocean Observing System (IOOS) was initiated with a focused one year project. The problem was that there were many regional and national providers of oceanographic model data; each had unique file conventions, distribution techniques and analysis tools that made it difficult to compare model results and observational data. To solve this problem, a distributed system was built utilizing a customized middleware layer and a common data model. This allowed each model data provider to keep their existing model and data files unchanged, yet deliver model data via web services in a common form. With standards-based applications that used these web services, end users then had a common way to access data from any of the models. These applications included: (1) a 2D mapping and animation using a web browser application, (2) an advanced 3D visualization and animation using a desktop application, and (3) a toolkit for a common scientific analysis environment. Due to the flexibility and low impact of the approach on providers, rapid progress was made. The system was implemented in all eleven US IOOS regions and at the NOAA National Coastal Data Development Center, allowing common delivery of regional and national oceanographic model forecast and archived results that cover all US waters. The system, based heavily on software technology from the NSF-sponsored Unidata Program Center, is applicable to any structured gridded data, not just oceanographic model data. There is a clear pathway to expand the system to include unstructured grid (e.g. triangular grid) data.
Integrating 3D geological information with a national physically-based hydrological modelling system
NASA Astrophysics Data System (ADS)
Lewis, Elizabeth; Parkin, Geoff; Kessler, Holger; Whiteman, Mark
2016-04-01
Robust numerical models are an essential tool for informing flood and water management and policy around the world. Physically-based hydrological models have traditionally not been used for such applications due to prohibitively large data, time and computational resource requirements. Given recent advances in computing power and data availability, a robust, physically-based hydrological modelling system for Great Britain using the SHETRAN model and national datasets has been created. Such a model has several advantages over less complex systems. Firstly, compared with conceptual models, a national physically-based model is more readily applicable to ungauged catchments, in which hydrological predictions are also required. Secondly, the results of a physically-based system may be more robust under changing conditions such as climate and land cover, as physical processes and relationships are explicitly accounted for. Finally, a fully integrated surface and subsurface model such as SHETRAN offers a wider range of applications compared with simpler schemes, such as assessments of groundwater resources, sediment and nutrient transport and flooding from multiple sources. As such, SHETRAN provides a robust means of simulating numerous terrestrial system processes which will add physical realism when coupled to the JULES land surface model. 306 catchments spanning Great Britain have been modelled using this system. The standard configuration of this system performs satisfactorily (NSE > 0.5) for 72% of catchments and well (NSE > 0.7) for 48%. Many of the remaining 28% of catchments that performed relatively poorly (NSE < 0.5) are located in the chalk in the south east of England. As such, the British Geological Survey 3D geology model for Great Britain (GB3D) has been incorporated, for the first time in any hydrological model, to pave the way for improvements to be made to simulations of catchments with important groundwater regimes. This coupling has involved development of software to allow for easy incorporation of geological information into SHETRAN for any model setup. The addition of more realistic subsurface representation following this approach is shown to greatly improve model performance in areas dominated by groundwater processes. The resulting modelling system has great potential to be used as a resource at national, regional and local scales in an array of different applications, including climate change impact assessments, land cover change studies and integrated assessments of groundwater and surface water resources.
NASA Astrophysics Data System (ADS)
Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.
2014-12-01
The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org), is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
NASA Astrophysics Data System (ADS)
Barnes, Brian C.; Leiter, Kenneth W.; Becker, Richard; Knap, Jaroslaw; Brennan, John K.
2017-07-01
We describe the development, accuracy, and efficiency of an automation package for molecular simulation, the large-scale atomic/molecular massively parallel simulator (LAMMPS) integrated materials engine (LIME). Heuristics and algorithms employed for equation of state (EOS) calculation using a particle-based model of a molecular crystal, hexahydro-1,3,5-trinitro-s-triazine (RDX), are described in detail. The simulation method for the particle-based model is energy-conserving dissipative particle dynamics, but the techniques used in LIME are generally applicable to molecular dynamics simulations with a variety of particle-based models. The newly created tool set is tested through use of its EOS data in plate impact and Taylor anvil impact continuum simulations of solid RDX. The coarse-grain model results from LIME provide an approach to bridge the scales from atomistic simulations to continuum simulations.
An OpenACC-Based Unified Programming Model for Multi-accelerator Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Jungwon; Lee, Seyong; Vetter, Jeffrey S
2015-01-01
This paper proposes a novel SPMD programming model of OpenACC. Our model integrates the different granularities of parallelism from vector-level parallelism to node-level parallelism into a single, unified model based on OpenACC. It allows programmers to write programs for multiple accelerators using a uniform programming model whether they are in shared or distributed memory systems. We implement a prototype of our model and evaluate its performance with a GPU-based supercomputer using three benchmark applications.
Application of an IRT Polytomous Model for Measuring Health Related Quality of Life
ERIC Educational Resources Information Center
Tejada, Antonio J. Rojas; Rojas, Oscar M. Lozano
2005-01-01
Background: The Item Response Theory (IRT) has advantages for measuring Health Related Quality of Life (HRQOL) as opposed to the Classical Tests Theory (CTT). Objectives: To present the results of the application of a polytomous model based on IRT, specifically, the Rating Scale Model (RSM), to measure HRQOL with the EORTC QLQ-C30. Methods: 103…
First-order fire effects models for land management: Overview and issues
Elizabeth D. Reinhardt; Matthew B. Dickinson
2010-01-01
We give an overview of the science application process at work in supporting fire management. First-order fire effects models, such as those discussed in accompanying papers, are the building blocks of software systems designed for application to landscapes over time scales from days to centuries. Fire effects may be modeled using empirical, rule-based, or process...
ERIC Educational Resources Information Center
Fu, Jianbin
2016-01-01
The multidimensional item response theory (MIRT) models with covariates proposed by Haberman and implemented in the "mirt" program provide a flexible way to analyze data based on item response theory. In this report, we discuss applications of the MIRT models with covariates to longitudinal test data to measure skill differences at the…
CCl4 is a common environmental contaminant in water and Superfund sites, and a model liver toxicant. One application of PBPK models used in risk assessment is simulation of internal dose for the metric involved with toxicity, particularly for different routes of exposure. Time-co...
SPRAYTRAN 1.0 User’s Guide: A GIS-Based Atmospheric Spray Droplet Dispersion Modeling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allwine, K Jerry; Rutz, Frederick C.; Droppo, James G.
SPRAY TRANsport (SPRAYTRAN) is a comprehensive dispersion modeling system that is used to simulate the offsite drift of pesticides from spray applications. SPRAYTRAN functions as a console application within Environmental Systems Research Institute's ArcMap Geographic Information System (Version 9.x) and integrates the widely used, U.S. Environmental Protection Agency (EPA)-approved CALifornia PUFF (CALPUFF) dispersion model and model components to simulate longer-range transport and diffusion in variable terrain and spatially/temporally varying meteorological (e.g., wind) fields. Area sources, which are used to define spray blocks in SPRAYTRAN, are initialized using output files generated from a separate aerial-spray-application model called AGDISP (AGricultural DISPersal). The AGDISP model is used for estimating the amount of pesticide deposited to the spray block based on spraying characteristics (e.g., pesticide type, spray nozzles, and aircraft type) and then simulating the near-field (less than 300-m) drift from a single pesticide application. The fraction of pesticide remaining airborne from the AGDISP near-field simulation is then used by SPRAYTRAN for simulating longer-range (greater than 300 m) drift and deposition of the pesticide.
On domain modelling of the service system with its application to enterprise information systems
NASA Astrophysics Data System (ADS)
Wang, J. W.; Wang, H. F.; Ding, J. L.; Furuta, K.; Kanno, T.; Ip, W. H.; Zhang, W. J.
2016-01-01
Information systems are a kind of service system and they are present throughout every element of a modern industrial and business system, much like blood in our body. Types of information systems are heterogeneous because of extreme uncertainty in changes in modern industrial and business systems. To effectively manage information systems, modelling of the work domain (or domain) of information systems is necessary. In this paper, a domain modelling framework for the service system is proposed and its application to the enterprise information system is outlined. The framework is defined based on application of a general domain modelling tool called function-context-behaviour-principle-state-structure (FCBPSS). The FCBPSS is based on a set of core concepts, namely: function, context, behaviour, principle, state and structure and system decomposition. Different from many other applications of FCBPSS in systems engineering, the FCBPSS is applied to both infrastructure and substance systems, which is novel and effective for modelling service systems, including enterprise information systems. It is to be noted that domain modelling of systems (e.g. enterprise information systems) is key to integrating heterogeneous systems and to coping with unanticipated situations facing such systems.
Kaserer, Teresa; Beck, Katharina R; Akram, Muhammad; Odermatt, Alex; Schuster, Daniela
2015-12-19
Computational methods are well-established tools in the drug discovery process and can be employed for a variety of tasks. Common applications include lead identification and scaffold hopping, as well as lead optimization by structure-activity relationship analysis and selectivity profiling. In addition, compound-target interactions associated with potentially harmful effects can be identified and investigated. This review focuses on pharmacophore-based virtual screening campaigns specifically addressing the target class of hydroxysteroid dehydrogenases. Many members of this enzyme family are associated with specific pathological conditions, and pharmacological modulation of their activity may represent promising therapeutic strategies. On the other hand, unintended interference with their biological functions, e.g., upon inhibition by xenobiotics, can disrupt steroid hormone-mediated effects, thereby contributing to the development and progression of major diseases. Besides a general introduction to pharmacophore modeling and pharmacophore-based virtual screening, exemplary case studies from the field of short-chain dehydrogenase/reductase (SDR) research are presented. These success stories highlight the suitability of pharmacophore modeling for the various application fields and suggest its application also in future studies.
Chang, Hsien-Yen; Weiner, Jonathan P
2010-01-18
Diagnosis-based risk adjustment is becoming an important issue globally as a result of its implications for payment, high-risk predictive modelling and provider performance assessment. The Taiwanese National Health Insurance (NHI) programme provides universal coverage and maintains a single national computerized claims database, which enables the application of diagnosis-based risk adjustment. However, research regarding risk adjustment is limited. This study aims to examine the performance of the Adjusted Clinical Group (ACG) case-mix system using claims-based diagnosis information from the Taiwanese NHI programme. A random sample of NHI enrollees was selected. Those continuously enrolled in 2002 were included for concurrent analyses (n = 173,234), while those in both 2002 and 2003 were included for prospective analyses (n = 164,562). Health status measures derived from 2002 diagnoses were used to explain the 2002 and 2003 health expenditure. A multivariate linear regression model was adopted after comparing the performance of seven different statistical models. Split-validation was performed in order to avoid overfitting. The performance measures were adjusted R2 and mean absolute prediction error of five types of expenditure at individual level, and predictive ratio of total expenditure at group level. The more comprehensive models performed better when used for explaining resource utilization. Adjusted R2 of total expenditure in concurrent/prospective analyses were 4.2%/4.4% in the demographic model, 15%/10% in the ACGs or ADGs (Aggregated Diagnosis Group) model, and 40%/22% in the models containing EDCs (Expanded Diagnosis Cluster). When predicting expenditure for groups based on expenditure quintiles, all models underpredicted the highest expenditure group and overpredicted the four other groups. For groups based on morbidity burden, the ACGs model had the best performance overall. Given the widespread availability of claims data and the superior explanatory power of claims-based risk adjustment models over demographics-only models, Taiwan's government should consider using claims-based models for policy-relevant applications. The performance of the ACG case-mix system in Taiwan was comparable to that found in other countries. This suggested that the ACG system could be applied to Taiwan's NHI even though it was originally developed in the USA. Many of the findings in this paper are likely to be relevant to other diagnosis-based risk adjustment methodologies.
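The performance measures named above are straightforward to compute. A minimal sketch with synthetic claims-style data; the covariates, cost model, and sample sizes are invented stand-ins, not the NHI data or the ACG grouper:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
age = rng.integers(18, 90, n)
female = rng.integers(0, 2, n)
morbidity = rng.poisson(1.5, n)                 # toy stand-in for ACG/ADG/EDC markers
cost = 200 * morbidity + 5 * age + rng.gamma(2.0, 300.0, n)

# Split-validation to avoid overfitting, as in the study.
X = np.column_stack([age, female, morbidity])
X_tr, X_te, y_tr, y_te = train_test_split(X, cost, test_size=0.5, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
pred = model.predict(X_te)

k = X_te.shape[1]
r2 = model.score(X_te, y_te)
adj_r2 = 1 - (1 - r2) * (len(y_te) - 1) / (len(y_te) - k - 1)
mape = np.mean(np.abs(y_te - pred))             # mean absolute prediction error
# Predictive ratio per expenditure quintile: predicted / actual (1.0 = unbiased).
quintile = np.digitize(y_te, np.quantile(y_te, [0.2, 0.4, 0.6, 0.8]))
for q in range(5):
    m = quintile == q
    print(f"quintile {q + 1}: predictive ratio = {pred[m].sum() / y_te[m].sum():.2f}")
print(f"adjusted R^2 = {adj_r2:.3f}, mean absolute prediction error = {mape:.0f}")
```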
Current Standardization and Cooperative Efforts Related to Industrial Information Infrastructures.
1993-05-01
Data Management Systems: components used to store, manage, and retrieve data; data management includes knowledge bases, database management... Application Development Tools and Methods: X/Open and POSIX APIs, Integrated Design Support System (IDS), Knowledge-Based Systems (KBS), Application... IDEF1x, Yourdon, Jackson System Design (JSD), Knowledge-Based Systems (KBSs), Structured Systems Development (SSD), Semantic Unification Meta-Model
Kim, Kwang-Yon; Shin, Seong Eun; No, Kyoung Tai
2015-01-01
Objectives For successful adoption of legislation controlling registration and assessment of chemical substances, it is important to obtain sufficient toxicological experimental evidence and other related information. It is also essential to obtain a sufficient number of predicted risk and toxicity results. Particularly, methods used in predicting toxicities of chemical substances during acquisition of required data ultimately become an economic method for future dealings with new substances. Although the need for such methods is gradually increasing, the required information about reliability and applicability range has not been systematically provided. Methods There are various representative environmental and human toxicity models based on quantitative structure-activity relationships (QSAR). Here, we secured 10 representative QSAR-based prediction models, and their documentation, that can make predictions about substances that are expected to be regulated. We used models that predict and confirm usability of the information expected to be collected and submitted according to the legislation. After collecting and evaluating each predictive model and relevant data, we prepared methods quantifying the scientific validity and reliability, which are essential conditions for using predictive models. Results We calculated predicted values for the models. Furthermore, we deduced and compared adequacies of the models using the Alternative non-testing method assessed for Registration, Evaluation, Authorization, and Restriction of Chemicals Substances scoring system, and deduced the applicability domains for each model. Additionally, we calculated and compared inclusion rates of substances expected to be regulated, to confirm the applicability. Conclusions We evaluated and compared the data, adequacy, and applicability of our selected QSAR-based toxicity prediction models, and included them in a database. Based on this data, we aimed to construct a system that can be used with predicted toxicity results. Furthermore, by presenting the suitability of individual predicted results, we aimed to provide a foundation that could be used in actual assessments and regulations. PMID:26206368
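A common way to quantify the applicability domain mentioned above is the leverage of a query compound relative to the model's training descriptors, with the usual warning threshold h* = 3(p+1)/n. A minimal sketch of that approach; the descriptor values are random toy data, not the reviewed models:

```python
import numpy as np

def leverages(X_train, X_query):
    """Leverage-based applicability domain, a common choice for QSAR models:
    h_i = x_i (X'X)^-1 x_i'. Queries with h_i above the warning leverage
    h* = 3(p+1)/n are flagged as outside the training domain."""
    XtX_inv = np.linalg.pinv(X_train.T @ X_train)
    # diag(X_query @ XtX_inv @ X_query.T) without forming the full matrix
    return np.einsum('ij,jk,ik->i', X_query, XtX_inv, X_query)

rng = np.random.default_rng(1)
X_train = rng.normal(size=(200, 5))             # toy molecular descriptors
X_query = rng.normal(scale=2.0, size=(10, 5))   # deliberately more spread out
h_star = 3 * (X_train.shape[1] + 1) / X_train.shape[0]
inside = leverages(X_train, X_query) <= h_star
print(f"h* = {h_star:.3f}, queries inside domain: {inside.sum()}/{len(inside)}")
```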
Medical applications of model-based dynamic thermography
NASA Astrophysics Data System (ADS)
Nowakowski, Antoni; Kaczmarek, Mariusz; Ruminski, Jacek; Hryciuk, Marcin; Renkielska, Alicja; Grudzinski, Jacek; Siebert, Janusz; Jagielak, Dariusz; Rogowski, Jan; Roszak, Krzysztof; Stojek, Wojciech
2001-03-01
The proposal to use active thermography in medical diagnostics is promising in some applications concerning investigation of directly accessible parts of the human body. The combination of dynamic thermograms with thermal models of investigated structures offers the attractive possibility of reconstructing internal structure based on the different thermal properties of biological tissues. Measurements of temperature distribution synchronized with external light excitation allow registration of dynamic changes of local temperature dependent on heat exchange conditions. Preliminary results of active thermography applications in medicine are discussed. For skin and under-skin tissues an equivalent thermal model may be determined. For the assumed model, its effective parameters may be reconstructed based on the results of transient thermal processes. For known thermal diffusivity and conductivity of specific tissues, the local thickness of a two- or three-layer structure may be calculated. Results of some medical cases, as well as reference data from in vivo studies on animals, are presented. The method was also applied to evaluate the state of the human heart during open-chest cardio-surgical interventions. Reference studies of evoked heart infarct in pigs are reported, too. We see the proposed technique, new to medical applications, as a promising diagnostic tool. It is a fully non-invasive, clean, handy, fast and affordable method giving not only a qualitative view of investigated surfaces but also an objective quantitative measurement result, accurate enough for many applications including fast screening of affected tissues.
Multi-scale Modeling and Analysis of Nano-RFID Systems on HPC Setup
NASA Astrophysics Data System (ADS)
Pathak, Rohit; Joshi, Satyadhar
In this paper we have worked on some of the complex modeling aspects, such as multi-scale modeling and MATLAB Sugar-based modeling, and have shown the complexities involved in the analysis of Nano RFID (Radio Frequency Identification) systems. We have demonstrated the modeling and simulation, along with some novel ideas and library development, for Nano RFID. Multi-scale modeling plays a very important role in nanotech-enabled devices, whose properties sometimes cannot be explained by abstraction-level theories. Reliability and packaging still remain among the major hindrances to practical implementation of Nano RFID-based devices, and modeling and simulation will play a very important role in addressing them. CNTs are the future low-power material that will replace CMOS, and their integration with CMOS and MEMS circuitry will play an important role in realizing the true power of Nano RFID systems. RFID based on innovations in nanotechnology has been shown. MEMS modeling of the antenna and sensors, and their integration in the circuitry, has been shown. Incorporating this, we can design a Nano RFID that can be used in areas like human implantation and complex banking applications. We have proposed modeling of RFID using the concept of multi-scale modeling to accurately predict its properties. We also give the modeling of recently proposed MEMS devices that can see possible application in RFID, and cover the applications and advantages of Nano RFID in various areas. RF MEMS has matured and its devices are being successfully commercialized, but taking it to the limits of the nano domain and integrating it with single-chip RFID needs a novel approach, which is being proposed. We have modeled a MEMS-based transponder and shown the distribution for multi-scale modeling for Nano RFID.
An Open Source Simulation Model for Soil and Sediment Bioturbation
Schiffers, Katja; Teal, Lorna Rachel; Travis, Justin Mark John; Solan, Martin
2011-01-01
Bioturbation is one of the most widespread forms of ecological engineering and has significant implications for the structure and functioning of ecosystems, yet our understanding of the processes involved in biotic mixing remains incomplete. One reason is that, despite their value and utility, most mathematical models currently applied to bioturbation data tend to neglect aspects of the natural complexity of bioturbation in favour of mathematical simplicity. At the same time, the abstract nature of these approaches limits the application of such models to a limited range of users. Here, we contend that a movement towards process-based modelling can improve both the representation of the mechanistic basis of bioturbation and the intuitiveness of modelling approaches. In support of this initiative, we present an open source modelling framework that explicitly simulates particle displacement and a worked example to facilitate application and further development. The framework combines the advantages of rule-based lattice models with the application of parameterisable probability density functions to generate mixing on the lattice. Model parameters can be fitted by experimental data and describe particle displacement at the spatial and temporal scales at which bioturbation data is routinely collected. By using the same model structure across species, but generating species-specific parameters, a generic understanding of species-specific bioturbation behaviour can be achieved. An application to a case study and comparison with a commonly used model attest the predictive power of the approach. PMID:22162997
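The core of such a rule-based lattice model is small: tracer particles occupy depth cells and are displaced by draws from a parameterisable displacement distribution. A minimal sketch in that spirit, not the authors' framework; the move probability and displacement sigma are invented placeholders for the species-specific fitted parameters:

```python
import numpy as np

rng = np.random.default_rng(42)
depth_cells, n_particles, n_steps = 50, 500, 100
cell_size_cm = 0.5
position = np.zeros(n_particles, dtype=int)     # tracer particles start at the surface

# Species-specific displacement PDF: per time step, a particle moves with
# probability p_move, by a signed number of cells drawn from a discretized
# normal distribution. In practice these parameters would be fitted to
# observed tracer profiles.
p_move, sigma = 0.3, 2.0
for _ in range(n_steps):
    moving = rng.random(n_particles) < p_move
    step = np.rint(rng.normal(0.0, sigma, moving.sum())).astype(int)
    position[moving] = np.clip(position[moving] + step, 0, depth_cells - 1)

profile, _ = np.histogram(position, bins=np.arange(depth_cells + 1))
print("tracer fraction in top 5 cm:", profile[:int(5 / cell_size_cm)].sum() / n_particles)
```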
Methodology and application of combined watershed and ground-water models in Kansas
Sophocleous, M.; Perkins, S.P.
2000-01-01
Increased irrigation in Kansas and other regions during the last several decades has caused serious water depletion, making the development of comprehensive strategies and tools to resolve such problems increasingly important. This paper makes the case for an intermediate complexity, quasi-distributed, comprehensive, large-watershed model, which falls between the fully distributed, physically based hydrological modeling system of the type of the SHE model and the lumped, conceptual rainfall-runoff modeling system of the type of the Stanford watershed model. This is achieved by integrating the quasi-distributed watershed model SWAT with the fully-distributed ground-water model MODFLOW. The advantage of this approach is the appreciably smaller input data requirements and the use of readily available data (compared to the fully distributed, physically based models), the statistical handling of watershed heterogeneities by employing the hydrologic-response-unit concept, and the significantly increased flexibility in handling stream-aquifer interactions, distributed well withdrawals, and multiple land uses. The mechanics of integrating the component watershed and ground-water models are outlined, and three real-world management applications of the integrated model from Kansas are briefly presented. Three different aspects of the integrated model are emphasized: (1) management applications of a Decision Support System for the integrated model (Rattlesnake Creek subbasin); (2) alternative conceptual models of spatial heterogeneity related to the presence or absence of an underlying aquifer with shallow or deep water table (Lower Republican River basin); and (3) the general nature of the integrated model linkage by employing a watershed simulator other than SWAT (Wet Walnut Creek basin). These applications demonstrate the practicality and versatility of this relatively simple and conceptually clear approach, making public acceptance of the integrated watershed modeling system much easier. This approach also enhances model calibration and thus the reliability of model results. (C) 2000 Elsevier Science B.V.
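The integration mechanics reduce to a daily exchange loop: the watershed model passes deep percolation (recharge) to the ground-water model, which returns stream-aquifer exchange (baseflow) to the watershed's streamflow. A minimal sketch of that cycle with toy stand-in functions; swat_step and modflow_step are hypothetical placeholders, not the actual SWAT/MODFLOW interfaces:

```python
def swat_step(day, baseflow):
    """Hypothetical stand-in for one daily SWAT step over all response units."""
    rain = max(0.0, 5.0 - day % 7)                 # toy forcing, mm/day
    recharge = 0.2 * rain                          # deep percolation to the aquifer
    runoff = 0.3 * rain                            # surface and lateral flow
    return recharge, runoff

def modflow_step(recharge, pumping):
    """Hypothetical stand-in for one MODFLOW stress period."""
    head = 100.0 + recharge - pumping              # toy lumped water balance
    baseflow = max(0.0, 0.05 * (head - 99.0))      # stream-aquifer exchange
    return head, baseflow

baseflow, pumping = 0.0, 0.5
for day in range(30):
    recharge, runoff = swat_step(day, baseflow)    # watershed -> recharge
    head, baseflow = modflow_step(recharge, pumping)  # aquifer -> baseflow
    streamflow = runoff + baseflow                 # combined at the gauge
print("day-30 streamflow (mm/day):", round(streamflow, 3))
```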
Hyper-Book: A Formal Model for Electronic Books.
ERIC Educational Resources Information Center
Catenazzi, Nadia; Sommaruga, Lorenzo
1994-01-01
Presents a model for electronic books based on the paper book metaphor. Discussion includes how the book evolves under the effects of its functional components; the use and impact of the model for organizing and presenting electronic documents in the context of electronic publishing; and the possible applications of a system based on the model.…
USDA-ARS?s Scientific Manuscript database
AgroEcoSystem-Watershed (AgES-W) is a modular, Java-based spatially distributed model which implements hydrologic/water quality simulation components. The AgES-W model was previously evaluated for streamflow and recently has been enhanced with the addition of nitrogen (N) and sediment modeling compo...
Flexibility Support for Homecare Applications Based on Models and Multi-Agent Technology
Armentia, Aintzane; Gangoiti, Unai; Priego, Rafael; Estévez, Elisabet; Marcos, Marga
2015-01-01
In developed countries, public health systems are under pressure due to the increasing percentage of the population over 65. In this context, homecare based on ambient intelligence technology seems to be a suitable solution to allow elderly people to continue to enjoy the comforts of home and help optimize medical resources. Thus, current technological developments make it possible to build complex homecare applications that demand, among other things, flexibility mechanisms for being able to evolve as the context does (adaptability), as well as for avoiding service disruptions in the case of node failure (availability). The solution proposed in this paper copes with these flexibility requirements through the whole life-cycle of the target applications: from the design phase to runtime. The proposed domain modeling approach allows medical staff to design customized applications, taking into account the adaptability needs. It also guides software developers during system implementation. The application execution is managed by a multi-agent based middleware, making it possible to meet adaptation requirements, assuring at the same time the availability of the system even for stateful applications. PMID:26694416
The Automated Geospatial Watershed Assessment (AGWA) tool is a desktop application that uses widely available standardized spatial datasets to derive inputs for multi-scale hydrologic models (Miller et al., 2007). The required data sets include topography (DEM data), soils, clima...
DOT National Transportation Integrated Search
2015-12-01
This report presents the methodology and results of the independent evaluation of safety applications for passenger vehicles in the 2012-2013 Safety Pilot Model Deployment, part of the United States Department of Transportation's Intelligent Transp...
Applications of Diagnostic Classification Models: A Literature Review and Critical Commentary
ERIC Educational Resources Information Center
Sessoms, John; Henson, Robert A.
2018-01-01
Diagnostic classification models (DCMs) classify examinees based on the skills they have mastered given their test performance. This classification enables targeted feedback that can inform remedial instruction. Unfortunately, applications of DCMs have been criticized (e.g., no validity support). Generally, these evaluations have been brief and…
CCBuilder 2.0: Powerful and accessible coiled-coil modeling.
Wood, Christopher W; Woolfson, Derek N
2018-01-01
The increased availability of user-friendly and accessible computational tools for biomolecular modeling would expand the reach and application of biomolecular engineering and design. For protein modeling, one key challenge is to reduce the complexities of 3D protein folds to sets of parametric equations that nonetheless capture the salient features of these structures accurately. At present, this is possible for a subset of proteins, namely, repeat proteins. The α-helical coiled coil provides one such example, which represents ≈ 3-5% of all known protein-encoding regions of DNA. Coiled coils are bundles of α helices that can be described by a small set of structural parameters. Here we describe how this parametric description can be implemented in an easy-to-use web application, called CCBuilder 2.0, for modeling and optimizing both α-helical coiled coils and polyproline-based collagen triple helices. This has many applications from providing models to aid molecular replacement for X-ray crystallography, in silico model building and engineering of natural and designed protein assemblies, and through to the creation of completely de novo "dark matter" protein structures. CCBuilder 2.0 is available as a web-based application, the code for which is open-source and can be downloaded freely. http://coiledcoils.chm.bris.ac.uk/ccbuilder2. We have created CCBuilder 2.0, an easy to use web-based application that can model structures for a whole class of proteins, the α-helical coiled coil, which is estimated to account for 3-5% of all proteins in nature. CCBuilder 2.0 will be of use to a large number of protein scientists engaged in fundamental studies, such as protein structure determination, through to more-applied research including designing and engineering novel proteins that have potential applications in biotechnology. © 2017 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.
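The parametric description in question is the Crick parameterization: a minor alpha-helix wound around a superhelical axis, fixed by two radii, two angular frequencies, a pitch angle, and a rise. A minimal sketch of generating approximate C-alpha coordinates from typical (not fitted) parameter values; this is an illustration of the idea, not CCBuilder's actual code:

```python
import numpy as np

def crick_backbone(n_res, r0=5.0, r1=2.26, w0=-4.0, w1=102.857, alpha=-12.0, rise=1.51):
    """Approximate C-alpha trace of one coiled-coil helix from the Crick
    parameterization: a minor helix (radius r1, ~102.9 deg per residue for a
    heptad repeat) wound around a superhelical axis (radius r0, w0 deg per
    residue, pitch angle alpha). Angles in degrees, distances in Angstroms;
    values are typical textbook numbers, not optimized parameters."""
    t = np.arange(n_res)
    w0r, w1r, a = np.radians(w0) * t, np.radians(w1) * t, np.radians(alpha)
    x = r0 * np.cos(w0r) + r1 * (np.cos(w0r) * np.cos(w1r) - np.cos(a) * np.sin(w0r) * np.sin(w1r))
    y = r0 * np.sin(w0r) + r1 * (np.sin(w0r) * np.cos(w1r) + np.cos(a) * np.cos(w0r) * np.sin(w1r))
    z = rise * t - r1 * np.sin(a) * np.sin(w1r)
    return np.column_stack([x, y, z])

coords = crick_backbone(28)                     # four heptads
d = np.linalg.norm(np.diff(coords, axis=0), axis=1)
print("C-alpha spacing (A), should be roughly 3.8: min %.2f, max %.2f" % (d.min(), d.max()))
```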
Development of solution-gated graphene transistor model for biosensors
NASA Astrophysics Data System (ADS)
Karimi, Hediyeh; Yusof, Rubiyah; Rahmani, Rasoul; Hosseinpour, Hoda; Ahmadi, Mohammad T.
2014-02-01
The distinctive properties of graphene, characterized by its high carrier mobility and biocompatibility, have stimulated extreme scientific interest as a promising nanomaterial for future nanoelectronic applications. In particular, graphene-based transistors have been developed rapidly and are considered as an option for DNA sensing applications. Recent findings in the field of DNA biosensors have led to a renewed interest in the identification of genetic risk factors associated with complex human diseases for diagnosis of cancers or hereditary diseases. In this paper, an analytical model of graphene-based solution-gated field effect transistors (SGFET) is proposed to constitute an important step towards development of DNA biosensors with high sensitivity and selectivity. Inspired by this fact, a novel strategy for a DNA sensor model with the capability of single-nucleotide polymorphism detection is proposed and extensively explained. First of all, the graphene-based DNA sensor model is optimized using a particle swarm optimization algorithm. Based on the sensing mechanism of DNA sensors, detective parameters (Ids and Vgmin) are suggested to facilitate the decision making process. Finally, the behaviour of the graphene-based SGFET is predicted in the presence of single-nucleotide polymorphism with an accuracy of more than 98%, which guarantees the reliability of the optimized model for any application of the graphene-based DNA sensor. It is expected to achieve the rapid, quick and economical detection of DNA hybridization, which could speed up the realization of the next generation of the homecare sensor system.
Computational and mathematical methods in brain atlasing.
Nowinski, Wieslaw L
2017-12-01
Brain atlases have a wide range of use from education to research to clinical applications. Mathematical methods as well as computational methods and tools play a major role in the process of brain atlas building and developing atlas-based applications. Computational methods and tools cover three areas: dedicated editors for brain model creation, brain navigators supporting multiple platforms, and atlas-assisted specific applications. Mathematical methods in atlas building and developing atlas-aided applications deal with problems in image segmentation, geometric body modelling, physical modelling, atlas-to-scan registration, visualisation, interaction and virtual reality. Here I overview computational and mathematical methods in atlas building and developing atlas-assisted applications, and share my contribution to and experience in this field.
Machine learning models for lipophilicity and their domain of applicability.
Schroeter, Timon; Schwaighofer, Anton; Mika, Sebastian; Laak, Antonius Ter; Suelzle, Detlev; Ganzer, Ursula; Heinrich, Nikolaus; Müller, Klaus-Robert
2007-01-01
Unfavorable lipophilicity and water solubility cause many drug failures; therefore these properties have to be taken into account early on in lead discovery. Commercial tools for predicting lipophilicity usually have been trained on small and neutral molecules, and are thus often unable to accurately predict in-house data. Using a modern Bayesian machine learning algorithm--a Gaussian process model--this study constructs a log D7 model based on 14,556 drug discovery compounds of Bayer Schering Pharma. Performance is compared with support vector machines, decision trees, ridge regression, and four commercial tools. In a blind test on 7013 new measurements from the last months (including compounds from new projects) 81% were predicted correctly within 1 log unit, compared to only 44% achieved by commercial software. Additional evaluations using public data are presented. We consider error bars for each method (model based error bars, ensemble based, and distance based approaches), and investigate how well they quantify the domain of applicability of each model.
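The model-based error bars of a Gaussian process come for free as the predictive standard deviation, which widens away from the training data and so doubles as a domain-of-applicability flag. A minimal sketch with synthetic descriptors; the data and kernel choice are toy illustrations, not the Bayer Schering model:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))                              # toy molecular descriptors
y = X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=300)   # toy "log D" response

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True).fit(X, y)

# Predictive standard deviation acts as a per-compound error bar: it grows for
# queries far from the training data, flagging the limits of applicability.
X_near = rng.normal(size=(5, 8))
X_far = rng.normal(loc=6.0, size=(5, 8))
_, sd_near = gpr.predict(X_near, return_std=True)
_, sd_far = gpr.predict(X_far, return_std=True)
print("mean sd near training data:", sd_near.mean().round(2), "| far away:", sd_far.mean().round(2))
```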
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, Prashant; Bansod, Baban K.S.
2015-02-15
Groundwater vulnerability maps are useful for decision making in land use planning and water resource management. This paper reviews the various groundwater vulnerability assessment models developed across the world. Each model has been evaluated in terms of its pros and cons and the environmental conditions of its application. The paper further discusses the validation techniques used for the vulnerability maps generated by the various models. Implicit challenges associated with the development of the groundwater vulnerability assessment models have also been identified, with scientific consideration of the parameter relations and their selection. - Highlights: • Various index-based groundwater vulnerability assessment models have been discussed. • A comparative analysis of the models and their applicability in different hydrogeological settings has been discussed. • Research problems of underlying vulnerability assessment models are also reported in this review paper.
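Most of the index-based models such reviews cover share one mechanic: a cell's vulnerability score is a fixed-weight sum of rated hydrogeologic factors. A minimal sketch in the style of DRASTIC, using its standard factor weights; the cell ratings below are invented:

```python
# DRASTIC-style index-and-overlay vulnerability score: each grid cell's score
# is a fixed-weight sum of hydrogeologic factors rated on a 1-10 scale.
# Weights are the standard DRASTIC values; higher index = more vulnerable.
weights = {"depth_to_water": 5, "recharge": 4, "aquifer_media": 3,
           "soil_media": 2, "topography": 1, "vadose_zone": 5, "conductivity": 3}

def vulnerability_index(ratings):
    """ratings: factor name -> rating (1-10) for one grid cell."""
    return sum(weights[f] * ratings[f] for f in weights)

cell = {"depth_to_water": 9, "recharge": 6, "aquifer_media": 8,
        "soil_media": 5, "topography": 10, "vadose_zone": 8, "conductivity": 6}
print("DRASTIC-style index:", vulnerability_index(cell))
```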
Optimizing Tsunami Forecast Model Accuracy
NASA Astrophysics Data System (ADS)
Whitmore, P.; Nyland, D. L.; Huang, P. Y.
2015-12-01
Recent tsunamis provide a means to determine the accuracy that can be expected of real-time tsunami forecast models. Forecast accuracy using two different tsunami forecast models are compared for seven events since 2006 based on both real-time application and optimized, after-the-fact "forecasts". Lessons learned by comparing the forecast accuracy determined during an event to modified applications of the models after-the-fact provide improved methods for real-time forecasting for future events. Variables such as source definition, data assimilation, and model scaling factors are examined to optimize forecast accuracy. Forecast accuracy is also compared for direct forward modeling based on earthquake source parameters versus accuracy obtained by assimilating sea level data into the forecast model. Results show that including assimilated sea level data into the models increases accuracy by approximately 15% for the events examined.
NASA Technical Reports Server (NTRS)
Liu, Donhang
2014-01-01
This presentation includes a summary of NEPP-funded deliverables for the Base-Metal Electrodes (BMEs) capacitor task, development of a general reliability model for BME capacitors, and a summary and future work.
Role of Imaging Spectrometer Data for Model-based Cross-calibration of Imaging Sensors
NASA Technical Reports Server (NTRS)
Thome, Kurtis John
2014-01-01
Site characterization benefits from imaging spectrometry to determine the spectral bi-directional reflectance of a well-understood surface. The presentation covers cross-calibration approaches, uncertainties, the role of imaging spectrometry, model-based site characterization, and application to product validation.
Agent-Based vs. Equation-based Epidemiological Models:A Model Selection Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sukumar, Sreenivas R; Nutaro, James J
This paper is motivated by the need to design model validation strategies for epidemiological disease-spread models. We consider both agent-based and equation-based models of pandemic disease spread and study the nuances and complexities one has to consider from the perspective of model validation. For this purpose, we instantiate an equation-based model and an agent-based model of the 1918 Spanish flu, and we leverage data published in the literature for our case study. We present our observations from the perspective of each implementation and discuss the application of model-selection criteria to compare the risk in choosing one modeling paradigm over another. We conclude with a discussion of our experience and document future ideas for a model validation framework.
GRace: a MATLAB-based application for fitting the discrimination-association model.
Stefanutti, Luca; Vianello, Michelangelo; Anselmi, Pasquale; Robusto, Egidio
2014-10-28
The Implicit Association Test (IAT) is a computerized two-choice discrimination task in which stimuli have to be categorized as belonging to target categories or attribute categories by pressing, as quickly and accurately as possible, one of two response keys. The discrimination association model has been recently proposed for the analysis of reaction time and accuracy of an individual respondent to the IAT. The model disentangles the influences of three qualitatively different components on the responses to the IAT: stimuli discrimination, automatic association, and termination criterion. The article presents General Race (GRace), a MATLAB-based application for fitting the discrimination association model to IAT data. GRace has been developed for Windows as a standalone application. It is user-friendly and does not require any programming experience. The use of GRace is illustrated on the data of a Coca Cola-Pepsi Cola IAT, and the results of the analysis are interpreted and discussed.
NASA Astrophysics Data System (ADS)
Missif, Lial Raja; Kadhum, Mohammad M.
2017-09-01
Wireless Sensor Network (WSN) has been widely used for monitoring, where sensors are deployed to operate independently to sense abnormal phenomena. Most of the proposed environmental monitoring systems are designed based on a predetermined sensing range, which does not reflect the sensor reliability, event characteristics, and environment conditions. Measuring the capability of a sensor node to accurately detect an event within a sensing field is of great importance for monitoring applications. This paper presents an efficient mechanism for event detection based on a probabilistic sensing model. Different models are presented theoretically in this paper to examine their adaptability and applicability to real environment applications. The numerical results of the experimental evaluation have shown that the probabilistic sensing model provides accurate observation and detectability of an event, and it can be utilized for different environment scenarios.
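A common form of such a probabilistic sensing model is Elfes' model, used here as a representative example rather than the paper's exact formulation: detection is certain close to the node, impossible beyond an outer radius, and decays exponentially in between. The parameters below are illustrative, not calibrated:

```python
import math

def detection_probability(d, r=10.0, re=3.0, lam=0.5, beta=1.0):
    """Elfes-style probabilistic sensing: P = 1 for d <= r - re, P = 0 for
    d >= r + re, and exp(-lam * (d - (r - re))**beta) in the uncertain band.
    r is the nominal range and re the width of the uncertainty region."""
    if d <= r - re:
        return 1.0
    if d >= r + re:
        return 0.0
    return math.exp(-lam * (d - (r - re)) ** beta)

for d in (5.0, 8.0, 10.0, 12.0, 14.0):
    print(f"distance {d:4.1f} m -> P(detect) = {detection_probability(d):.2f}")
```

With several nodes covering the same point, the joint detection probability is typically fused as 1 minus the product of the individual miss probabilities.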
A Review of Diagnostic Techniques for ISHM Applications
NASA Technical Reports Server (NTRS)
Patterson-Hine, Ann; Biswas, Gautam; Aaseng, Gordon; Narasimhan, Sriam; Pattipati, Krishna
2005-01-01
System diagnosis is an integral part of any Integrated System Health Management application. Diagnostic applications make use of system information from the design phase, such as safety and mission assurance analysis, failure modes and effects analysis, hazards analysis, functional models, fault propagation models, and testability analysis. In modern process control and equipment monitoring systems, topological and analytic models of the nominal system, derived from design documents, are also employed for fault isolation and identification. Depending on the complexity of the monitored signals from the physical system, diagnostic applications may involve straightforward trending and feature extraction techniques to retrieve the parameters of importance from the sensor streams. They also may involve very complex analysis routines, such as signal processing, learning or classification methods, to derive the parameters of importance to diagnosis. The process that is used to diagnose anomalous conditions from monitored system signals varies widely across the different approaches to system diagnosis. Rule-based expert systems, case-based reasoning systems, model-based reasoning systems, learning systems, and probabilistic reasoning systems are examples of the many diverse approaches to diagnostic reasoning. Many engineering disciplines have specific approaches to modeling, monitoring and diagnosing anomalous conditions. Therefore, there is no "one-size-fits-all" approach to building diagnostic and health monitoring capabilities for a system. For instance, the conventional approaches to diagnosing failures in rotorcraft applications are very different from those used in communications systems. Further, online and offline automated diagnostic applications are integrated into an operations framework with flight crews, flight controllers and maintenance teams. While the emphasis of this paper is automation of health management functions, striking the correct balance between automated and human-performed tasks is a vital concern.
Bannwarth, M A; Grovermann, C; Schreinemachers, P; Ingwersen, J; Lamers, M; Berger, T; Streck, T
2016-01-01
Pesticide application rates are high and increasing in upland agricultural systems in Thailand producing vegetables, fruits and ornamental crops, leading to the pollution of stream water with pesticide residues. The objective of this study was to determine the maximum per hectare application rates of two widely used pesticides that would achieve non-hazardous pesticide concentrations in the stream water and to evaluate how farm household incomes would be affected if farmers complied with these restricted application rates. For this purpose we perform an integrated modeling approach of a hydrological solute transport model (the Soil and Water Assessment Tool, SWAT) and an agent-based farm decision model (Mathematical Programming-based Multi-Agent Systems, MPMAS). SWAT was used to simulate the pesticide fate and behavior. The model was calibrated to a 77 km(2) watershed in northern Thailand. The results show that to stay under a pre-defined eco-toxicological threshold, the current average application of chlorothalonil (0.80 kg/ha) and cypermethrin (0.53 kg/ha) would have to be reduced by 80% and 99%, respectively. The income effect of such reductions was simulated using MPMAS. The results suggest that if farm households complied with the application thresholds then their income would reduce by 17.3% in the case of chlorothalonil and by 38.3% in the case of cypermethrin. Less drastic income effects can be expected if methods of integrated pest management were more widely available. The novelty of this study is to combine two models from distinctive disciplines to evaluate pesticide reduction scenarios based on real-world data from a single study site. Copyright © 2014 Elsevier Ltd. All rights reserved.
A generic biokinetic model for noble gases with application to radon.
Leggett, Rich; Marsh, James; Gregoratto, Demetrio; Blanchardon, Eric
2013-06-01
To facilitate the estimation of radiation doses from intake of radionuclides, the International Commission on Radiological Protection (ICRP) publishes dose coefficients (dose per unit intake) based on reference biokinetic and dosimetric models. The ICRP generally has not provided biokinetic models or dose coefficients for intake of noble gases, but plans to provide such information for (222)Rn and other important radioisotopes of noble gases in a forthcoming series of reports on occupational intake of radionuclides (OIR). This paper proposes a generic biokinetic model framework for noble gases and develops parameter values for radon. The framework is tailored to applications in radiation protection and is consistent with a physiologically based biokinetic modelling scheme adopted for the OIR series. Parameter values for a noble gas are based largely on a blood flow model and physical laws governing transfer of a non-reactive and soluble gas between materials. Model predictions for radon are shown to be consistent with results of controlled studies of its biokinetics in human subjects.
Ben-David, Avishai; Embury, Janon F; Davidson, Charles E
2006-09-10
A comprehensive analytical radiative transfer model for isothermal aerosols and vapors for passive infrared remote sensing applications (ground-based and airborne sensors) has been developed. The theoretical model illustrates the qualitative difference between an aerosol cloud and a chemical vapor cloud. The model is based on two and two/four stream approximations and includes thermal emission-absorption by the aerosols; scattering of diffused sky radiances incident from all sides on the aerosols (downwelling, upwelling, left, and right); and scattering of aerosol thermal emission. The model uses moderate resolution transmittance ambient atmospheric radiances as boundary conditions and provides analytical expressions for the information on the aerosol cloud that is contained in remote sensing measurements by using thermal contrasts between the aerosols and diffused sky radiances. Simulated measurements of a ground-based sensor viewing Bacillus subtilis var. niger bioaerosols and kaolin aerosols are given and discussed to illustrate the differences between a vapor-only model (i.e., only emission-absorption effects) and a complete model that adds aerosol scattering effects.
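The qualitative difference the model exposes is easiest to see in the vapor-only limit, where the layer radiance is just attenuated background plus thermal emission, L = L_bg*exp(-tau) + (1 - exp(-tau))*B(T); aerosol scattering adds the extra stream terms. A minimal sketch of that vapor-only case; the wavenumber, temperatures, and optical depth are illustrative:

```python
import numpy as np

def planck(wavenumber_cm1, T):
    """Planck spectral radiance in W/(m^2 sr cm^-1) at a given wavenumber."""
    c1, c2 = 1.191042e-8, 1.4387769     # first/second radiation constants (cm units)
    v = wavenumber_cm1
    return c1 * v**3 / np.expm1(c2 * v / T)

def vapor_layer_radiance(tau, T_cloud, L_background, wavenumber_cm1=1000.0):
    """Emission-absorption (vapor-only) model: the cloud attenuates the
    background radiance and adds its own thermal emission. The scattering
    terms that distinguish aerosols from vapors in the full model are omitted."""
    trans = np.exp(-tau)
    return L_background * trans + (1.0 - trans) * planck(wavenumber_cm1, T_cloud)

# Thermal contrast drives detectability: a cloud at the sky temperature vanishes.
L_sky = planck(1000.0, 280.0)
print("cloud at 300 K:", vapor_layer_radiance(0.5, 300.0, L_sky))
print("cloud at 280 K (no contrast):", vapor_layer_radiance(0.5, 280.0, L_sky))
```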
A physiologically based model for tramadol pharmacokinetics in horses.
Abbiati, Roberto Andrea; Cagnardi, Petra; Ravasio, Giuliano; Villa, Roberto; Manca, Davide
2017-09-21
This work proposes an application of a minimal complexity physiologically based pharmacokinetic model to predict tramadol concentration vs time profiles in horses. Tramadol is an opioid analgesic also used for veterinary treatments. Researchers and medical doctors can profit from the application of mathematical models as supporting tools to optimize the pharmacological treatment of animal species. The proposed model is based on physiology but adopts the minimal compartmental architecture necessary to describe the experimental data. The model features a system of ordinary differential equations, where most of the model parameters are either assigned or individualized for a given horse, using literature data and correlations. Conversely, residual parameters, whose value is unknown, are regressed exploiting experimental data. The model proved capable of simulating pharmacokinetic profiles with accuracy. In addition, it provides further insights on un-observable tramadol data, as for instance tramadol concentration in the liver or hepatic metabolism and renal excretion extent. Copyright © 2017 Elsevier Ltd. All rights reserved.
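The structure described, a physiologically motivated but minimal system of ODEs with some parameters assigned and the rest regressed, can be sketched with a two-compartment toy model. The rate constants below are invented, not the fitted horse parameters from the study:

```python
import numpy as np
from scipy.integrate import solve_ivp

def pk_rhs(t, y, k_pt=0.8, k_tp=0.5, k_met=0.3, k_ren=0.1):
    """Plasma exchanges drug with a lumped tissue pool; hepatic metabolism
    (k_met) and renal excretion (k_ren) are first-order losses from plasma."""
    plasma, tissue = y
    d_plasma = k_tp * tissue - (k_pt + k_met + k_ren) * plasma
    d_tissue = k_pt * plasma - k_tp * tissue
    return [d_plasma, d_tissue]

# IV bolus giving an initial plasma concentration of 2 mg/L, simulated 12 h.
sol = solve_ivp(pk_rhs, (0.0, 12.0), [2.0, 0.0], dense_output=True)
t = np.linspace(0.0, 12.0, 7)
print("plasma concentration (mg/L):", sol.sol(t)[0].round(3))
```

Parameters such as the clearance terms would be individualized from physiology where possible and the residual ones regressed against the observed concentration-time data, as the abstract describes.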
Gnansounou, Edgard; Raman, Jegannathan Kenthorai
2018-04-24
Among the renewables, biofuels based on non-food crops and wastelands are essential for the transport sector to achieve a country's climate mitigation targets. With the growing interest in biorefineries, setting policy requirements for other coproducts along with biofuels is necessary to improve the product portfolio of a biorefinery, increase the perception of bioproducts by consumers and push the technology forward. In this context, Claiming-Based allocation models were used in a comparative life cycle assessment of multiple products from a wheat straw biorefinery and a vetiver biorefinery. The vetiver biorefinery shows promising greenhouse gas emission savings (181-213%) compared to the common crop-based lignocellulose (wheat straw) biorefinery. Claiming-Based Allocation models help to find the affordable allocation limit (0-80%) among the coproducts in order to achieve the individual prospective policy targets. Such models show promising application in multiproduct life cycle assessment studies where appropriate allocation among coproducts is challenging and individual product emissions are subject to policy targets. Copyright © 2018 Elsevier Ltd. All rights reserved.
Information of Complex Systems and Applications in Agent Based Modeling.
Bao, Lei; Fritchman, Joseph C
2018-04-18
Information about a system's internal interactions is important to modeling the system's dynamics. This study examines the finer categories of the information definition and explores the features of a type of local information that describes the internal interactions of a system. Based on the results, a dual-space agent and information modeling framework (AIM) is developed by explicitly distinguishing an information space from the material space. The two spaces can evolve both independently and interactively. The dual-space framework can provide new analytic methods for agent based models (ABMs). Three examples are presented including money distribution, individual's economic evolution, and artificial stock market. The results are analyzed in the dual-space, which more clearly shows the interactions and evolutions within and between the information and material spaces. The outcomes demonstrate the wide-ranging applicability of using the dual-space AIMs to model and analyze a broad range of interactive and intelligent systems.
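A minimal sketch of the money-distribution example mentioned above: agents in the material space exchange a fixed amount per random pairwise trade, while each agent's running estimate of mean wealth, updated only through its own encounters, stands in for the information space. This is a toy illustration of the dual-space idea, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(7)
n, steps, delta = 500, 50_000, 1.0
money = np.full(n, 100.0)      # material space: each agent's wealth
belief = np.full(n, 100.0)     # information space: each agent's estimate of mean wealth

for _ in range(steps):
    i, j = rng.integers(n, size=2)
    if i != j and money[i] >= delta:
        money[i] -= delta      # pairwise exchange of a fixed amount
        money[j] += delta
        # Each party updates its local information only from what it observed.
        belief[i] = 0.9 * belief[i] + 0.1 * money[j]
        belief[j] = 0.9 * belief[j] + 0.1 * money[i]

print("wealth: mean", money.mean().round(1), "std", money.std().round(1))
print("belief about mean wealth: std across agents", belief.std().round(1))
```

The two spaces evolve interactively (beliefs are driven by trades) yet can be analyzed separately, which is the point of the dual-space bookkeeping.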
Sohl, Terry L.; Sayler, Kristi L.; Drummond, Mark A.; Loveland, Thomas R.
2007-01-01
A wide variety of ecological applications require spatially explicit, historic, current, and projected land use and land cover data. The U.S. Land Cover Trends project is analyzing contemporary (1973–2000) land-cover change in the conterminous United States. The newly developed FORE-SCE model used Land Cover Trends data and theoretical, statistical, and deterministic modeling techniques to project future land cover change through 2020 for multiple plausible scenarios. Projected proportions of future land use were initially developed, and then sited on the lands with the highest potential for supporting that land use and land cover using a statistically based stochastic allocation procedure. Three scenarios of 2020 land cover were mapped for the western Great Plains in the US. The model provided realistic, high-resolution, scenario-based land-cover products suitable for multiple applications, including studies of climate and weather variability, carbon dynamics, and regional hydrology.
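The allocation step is essentially weighted sampling: a scenario fixes the proportion of new land in a class, and cells receive it stochastically with probability increasing in their modeled suitability rather than by a hard rank cutoff. A minimal sketch in that spirit, not the FORE-SCE code; the suitability surface and target fraction are invented:

```python
import numpy as np

rng = np.random.default_rng(3)
n_cells = 10_000
suitability = rng.beta(2.0, 5.0, n_cells)   # e.g. from a statistical model of drivers
target_fraction = 0.12                      # scenario's prescribed proportion of new land

n_new = int(target_fraction * n_cells)
p = suitability / suitability.sum()
# Weighted draw without replacement: high-suitability cells are favored,
# but low-suitability cells still have some chance of conversion.
chosen = rng.choice(n_cells, size=n_new, replace=False, p=p)
converted = np.zeros(n_cells, dtype=bool)
converted[chosen] = True
print("mean suitability of converted cells:", suitability[chosen].mean().round(3),
      "vs overall:", suitability.mean().round(3))
```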
Kianmehr, Keivan; Alhajj, Reda
2008-09-01
In this study, we aim at building a classification framework, namely the CARSVM model, which integrates association rule mining and support vector machine (SVM). The goal is to benefit from the advantages of both, the discriminative knowledge represented by class association rules and the classification power of the SVM algorithm, to construct an efficient and accurate classifier model that improves the interpretability problem of SVM as a traditional machine learning technique and overcomes the efficiency issues of associative classification algorithms. In our proposed framework, instead of using the original training set, a set of rule-based feature vectors, which are generated based on the discriminative ability of class association rules over the training samples, is presented to the learning component of the SVM algorithm. We show that rule-based feature vectors present a high-quality source of discrimination knowledge that can substantially impact the prediction power of SVM and associative classification techniques. They provide users with more convenience in terms of understandability and interpretability as well. We have used four datasets from the UCI ML repository to evaluate the performance of the developed system in comparison with five well-known existing classification methods. Because of the importance and popularity of gene expression analysis as a real-world application of the classification model, we present an extension of CARSVM combined with feature selection to be applied to gene expression data. Then, we describe how this combination will provide biologists with an efficient and understandable classifier model. The reported test results and their biological interpretation demonstrate the applicability, efficiency and effectiveness of the proposed model. From the results, it can be concluded that a considerable increase in classification accuracy can be obtained when the rule-based feature vectors are integrated in the learning process of the SVM algorithm. In the context of applicability, according to the results obtained from gene expression analysis, we can conclude that the CARSVM system can be utilized in a variety of real-world applications with some adjustments.
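The CARSVM idea is easy to sketch: class association rules (hand-written below; in the paper they are mined from the training set) are turned into binary "rule fires / does not fire" feature vectors, which are then fed to an SVM instead of the raw attributes. The rules and data here are illustrative, not the UCI experiments:

```python
import numpy as np
from sklearn.svm import SVC

# Toy class association rules over three numeric attributes.
rules = [
    lambda x: x[0] > 0.5 and x[1] < 0.3,    # rule 1, associated with class 1
    lambda x: x[2] > 0.7,                   # rule 2, associated with class 1
    lambda x: x[0] < 0.2,                   # rule 3, associated with class 0
]

def rule_features(X):
    """Map each sample to a binary vector of which rules fire on it."""
    return np.array([[float(r(x)) for r in rules] for x in X])

rng = np.random.default_rng(5)
X = rng.random((200, 3))
y = (((X[:, 0] > 0.5) & (X[:, 1] < 0.3)) | (X[:, 2] > 0.7)).astype(int)

clf = SVC(kernel='linear').fit(rule_features(X[:150]), y[:150])
print("held-out accuracy:", clf.score(rule_features(X[150:]), y[150:]).round(2))
```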
Formalism Challenges of the Cougaar Model Driven Architecture
NASA Technical Reports Server (NTRS)
Bohner, Shawn A.; George, Boby; Gracanin, Denis; Hinchey, Michael G.
2004-01-01
The Cognitive Agent Architecture (Cougaar) is one of the most sophisticated distributed agent architectures developed today. As part of its research and evolution, Cougaar is being studied for application to large, logistics-based applications for the Department of Defense (DoD). Anticipating future complex applications of Cougaar, we are investigating the Model Driven Architecture (MDA) approach to understand how effective it would be for increasing productivity in Cougaar-based development efforts. Recognizing the sophistication of the Cougaar development environment and the limitations of transformation technologies for agents, we have systematically developed an approach that combines component assembly in the large and transformation in the small. This paper describes some of the key elements that went into the Cougaar Model Driven Architecture approach and the characteristics that drove the approach.
A numerical identifiability test for state-space models--application to optimal experimental design.
Hidalgo, M E; Ayesa, E
2001-01-01
This paper describes a mathematical tool for identifiability analysis, easily applicable to high-order non-linear systems modelled in state-space and implementable in simulators with a time-discrete approach. This procedure also permits a rigorous analysis of the expected estimation errors (average and maximum) in calibration experiments. The methodology is based on the recursive numerical evaluation of the information matrix during the simulation of a calibration experiment and on the setting-up of a group of information parameters based on geometric interpretations of this matrix. As an example of the utility of the proposed test, the paper presents its application to an optimal experimental design of ASM Model No. 1 calibration, in order to estimate the maximum specific growth rate µH and the concentration of heterotrophic biomass XBH.
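The core computation, recursive accumulation of the information matrix over a simulated calibration experiment, can be sketched as follows. The toy growth model, parameter values, and noise level are illustrative assumptions, not the ASM No. 1 setup.

```python
# Hedged sketch: accumulate the Fisher information matrix recursively
# over a simulated calibration experiment, using finite-difference
# output sensitivities. Model and noise level are illustrative only.
import numpy as np

def simulate(theta, n_steps, dt=0.1):
    """Toy time-discrete state-space model with Monod-like growth."""
    mu_max, K = theta
    x, s, ys = 1.0, 10.0, []
    for _ in range(n_steps):
        mu = mu_max * s / (K + s)
        x += dt * mu * x          # biomass growth
        s -= dt * mu * x          # substrate consumption (toy update)
        ys.append(x)              # measured output: biomass
    return np.array(ys)

def information_matrix(theta, n_steps, sigma=0.05, eps=1e-6):
    y0 = simulate(theta, n_steps)
    S = np.zeros((n_steps, len(theta)))
    for i in range(len(theta)):
        tp = np.array(theta, float); tp[i] += eps
        S[:, i] = (simulate(tp, n_steps) - y0) / eps
    J = np.zeros((len(theta), len(theta)))
    for k in range(n_steps):      # recursive accumulation, step by step
        J += np.outer(S[k], S[k]) / sigma**2
    return J

J = information_matrix([0.5, 2.0], n_steps=100)
# Expected estimation errors (Cramer-Rao bound): sqrt of diag of J^-1.
print(np.sqrt(np.diag(np.linalg.inv(J))))
```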
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balanin, A. L.; Boyarinov, V. F.; Glushkov, E. S.
The application of experimental information on measured axial distributions of fission reaction rates to the development of 3D numerical models of the ASTRA critical facility, taking into account the azimuthal asymmetry of the assembly simulating an HTGR with an annular core, is substantiated. Owing to the presence of the bottom reflector and the absence of the top reflector, the application of 2D models based on experimentally determined buckling is impossible for calculation of critical assemblies of the ASTRA facility; therefore, an alternative approach based on the application of the extrapolated assembly height is proposed. This approach is exemplified by the numerical analysis of experiments on measurement of the efficiency of control rod mockups of the control and protection system (CPS).
Oliveri, Paolo; López, M Isabel; Casolino, M Chiara; Ruisánchez, Itziar; Callao, M Pilar; Medini, Luca; Lanteri, Silvia
2014-12-03
A new class-modeling method, referred to as partial least squares density modeling (PLS-DM), is presented. The method is based on partial least squares (PLS), using a distance-based sample density measurement as the response variable. Potential function probability density is subsequently calculated on the PLS scores and used, jointly with residual Q statistics, to develop efficient class models. The influence of adjustable model parameters on the resulting performance has been critically studied by means of cross-validation and application of the Pareto optimality criterion. The method has been applied to verify the authenticity of olives in brine from the cultivar Taggiasca, based on near-infrared (NIR) spectra recorded on homogenized solid samples. Two independent test sets were used for model validation. The final optimal model was characterized by high efficiency and a balanced trade-off between sensitivity and specificity, when compared with well-established class-modeling methods such as soft independent modeling of class analogy (SIMCA) and unequal dispersed classes (UNEQ). Copyright © 2014 Elsevier B.V. All rights reserved.
Discussion on accuracy degree evaluation of accident velocity reconstruction model
NASA Astrophysics Data System (ADS)
Zou, Tiefang; Dai, Yingbiao; Cai, Ming; Liu, Jike
In order to investigate the applicability of accident velocity reconstruction models in different cases, a method for evaluating the accuracy of such models is given. Based on the theoretical and calculated pre-crash velocities, an accuracy evaluation formula is obtained. Using a numerical simulation case, the accuracy and applicability of two accident velocity reconstruction models are analyzed; the results show that this method is feasible in practice.
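The formula itself is not reproduced in the abstract; one plausible form of such an accuracy-degree measure (our notation and construction, an assumption rather than the paper's equation) compares the calculated and theoretical pre-crash velocities:

```latex
% Assumed form of an accuracy-degree measure, not taken from the paper:
\[
\eta \;=\; \left(1 - \frac{\lvert v_{\mathrm{calc}} - v_{\mathrm{theory}}\rvert}{v_{\mathrm{theory}}}\right)\times 100\% ,
\]
```

with a model judged applicable to a class of cases when the accuracy degree stays above a chosen threshold in numerical simulation.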
Karnon, Jonathan; Stahl, James; Brennan, Alan; Caro, J Jaime; Mar, Javier; Möller, Jörgen
2012-01-01
Discrete event simulation (DES) is a form of computer-based modeling that provides an intuitive and flexible approach to representing complex systems. It has been used in a wide range of health care applications. Most early applications involved analyses of systems with constrained resources, where the general aim was to improve the organization of delivered services. More recently, DES has increasingly been applied to evaluate specific technologies in the context of health technology assessment. The aim of this article was to provide consensus-based guidelines on the application of DES in a health care setting, covering the range of issues to which DES can be applied. The article works through the different stages of the modeling process: structural development, parameter estimation, model implementation, model analysis, and representation and reporting. For each stage, a brief description is provided, followed by consideration of issues that are of particular relevance to the application of DES in a health care setting. Each section contains a number of best practice recommendations that were iterated among the authors, as well as among the wider modeling task force. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Chen, Yunjin; Pock, Thomas
2017-06-01
Image restoration is a long-standing problem in low-level computer vision with many interesting applications. We describe a flexible learning framework based on the concept of nonlinear reaction diffusion models for various image restoration problems. By embodying recent improvements in nonlinear diffusion models, we propose a dynamic nonlinear reaction diffusion model with time-dependent parameters (i.e., linear filters and influence functions). In contrast to previous nonlinear diffusion models, all the parameters, including the filters and the influence functions, are simultaneously learned from training data through a loss-based approach. We call this approach TNRD (Trainable Nonlinear Reaction Diffusion). The TNRD approach is applicable to a variety of image restoration tasks by incorporating an appropriate reaction force. We demonstrate its capabilities with three representative applications: Gaussian image denoising, single-image super-resolution, and JPEG deblocking. Experiments show that our trained nonlinear diffusion models benefit greatly from the training of the parameters and achieve the best reported performance on common test datasets for the tested applications. Our trained models preserve the structural simplicity of diffusion models and take only a small number of diffusion steps, and thus are highly efficient. Moreover, they are also well-suited for parallel computation on GPUs, which makes the inference procedure extremely fast.
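A single TNRD-style diffusion step can be sketched as below; the learned filters and influence functions of the actual trained models are replaced by fixed placeholders, so this only illustrates the structure of the update, not the reported performance.

```python
# Illustrative single step of a trainable nonlinear diffusion model in
# the spirit of TNRD (not the authors' trained model): fixed placeholder
# filters and influence functions stand in for learned parameters.
import numpy as np
from scipy.ndimage import convolve

def diffusion_step(x, f, filters, lam=0.1, step=0.05):
    """One reaction-diffusion update: diffusion term plus data fidelity."""
    update = np.zeros_like(x)
    for k in filters:
        u = convolve(x, k, mode="reflect")
        phi = u / (1.0 + u**2)            # placeholder influence function
        k_rot = k[::-1, ::-1]             # adjoint of the convolution
        update += convolve(phi, k_rot, mode="reflect")
    reaction = lam * (x - f)              # fidelity to the noisy input f
    return x - step * (update + reaction)

rng = np.random.default_rng(0)
clean = np.ones((32, 32)); clean[8:24, 8:24] = 2.0
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
filters = [np.array([[0, 0, 0], [-1, 1, 0], [0, 0, 0]], float),
           np.array([[0, -1, 0], [0, 1, 0], [0, 0, 0]], float)]
x = noisy.copy()
for _ in range(8):                        # TNRD uses only a few steps
    x = diffusion_step(x, noisy, filters)
```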
Wei, Gaofeng; Tang, Gang; Fu, Zengliang; Sun, Qiuming; Tian, Feng
2010-10-01
The China Mechanical Virtual Human (CMVH) is a human musculoskeletal biomechanical simulation platform based on the China Visible Human slice images, and it has significant practical application value. This paper introduces the construction method of the CMVH 3D models. A simulation system solution based on Creator/Vega is then put forward to handle the complex and voluminous data of the 3D models. Finally, combined with MFC technology, the CMVH simulation system is developed and a running simulation scene is presented. This paper provides a new way to apply CMVH in virtual reality.
Design-based modeling of magnetically actuated soft diaphragm materials
NASA Astrophysics Data System (ADS)
Jayaneththi, V. R.; Aw, K. C.; McDaid, A. J.
2018-04-01
Magnetic polymer composites (MPC) have shown promise for emerging biomedical applications such as lab-on-a-chip and implantable drug delivery. These soft material actuators are capable of fast response, large deformation and wireless actuation. Existing MPC modeling approaches are computationally expensive and unsuitable for rapid design prototyping and real-time control applications. This paper proposes a macro-scale 1-DOF model capable of predicting force and displacement of an MPC diaphragm actuator. Model validation confirmed both blocked force and displacement can be accurately predicted in a variety of working conditions i.e. different magnetic field strengths, static/dynamic fields, and gap distances. The contribution of this work includes a comprehensive experimental investigation of a macro-scale diaphragm actuator; the derivation and validation of a new phenomenological model to describe MPC actuation; and insights into the proposed model’s design-based functionality i.e. scalability and generalizability in terms of magnetic filler concentration and diaphragm diameter. Due to the lumped element modeling approach, the proposed model can also be adapted to alternative actuator configurations, and thus presents a useful tool for design, control and simulation of novel MPC applications.
A web GIS based integrated flood assessment modeling tool for coastal urban watersheds
NASA Astrophysics Data System (ADS)
Kulkarni, A. T.; Mohanty, J.; Eldho, T. I.; Rao, E. P.; Mohan, B. K.
2014-03-01
Urban flooding has become an increasingly important issue in many parts of the world. In this study, an integrated flood assessment model (IFAM) is presented for coastal urban flood simulation. A web-based GIS framework has been adopted to organize the spatial datasets for the study area and to run the model within this framework. The integrated flood model consists of a mass-balance-based 1-D overland flow model, a 1-D finite-element channel flow model based on the diffusion wave approximation, and a quasi-2-D raster flood inundation model based on the continuity equation. The model code is written in MATLAB and the application is integrated within a web GIS server product, viz. Web Gram Server™ (WGS), developed at IIT Bombay using Java, JSP and jQuery technologies. Its user interface is developed using OpenLayers, and the attribute data are stored in the open-source MySQL DBMS. The model is integrated within WGS and is called via JavaScript. The application has been demonstrated for two coastal urban watersheds of Navi Mumbai, India. Simulated flood extents for the extreme rainfall event of 26 July 2005 in the two urban watersheds of Navi Mumbai city are presented and discussed. The study demonstrates the effectiveness of the flood simulation tool in a web GIS environment in facilitating data access and visualization of GIS datasets and simulation results.
NASA Technical Reports Server (NTRS)
Stephan, Amy; Erikson, Carol A.
1991-01-01
As an initial attempt to introduce expert system technology into an onboard environment, a model-based diagnostic system using the TRW MARPLE software tool was integrated with prototype flight hardware and its corresponding control software. Because this experiment was designed primarily to test the effectiveness of the model-based reasoning technique used, the expert system ran on a separate hardware platform, and interactions between the control software and the model-based diagnostics were limited. While this project met its objective of showing that model-based reasoning can effectively isolate failures in flight hardware, it also identified the need for an integrated development path for expert system and control software for onboard applications. In developing expert systems that are ready for flight, developers must evaluate artificial intelligence techniques to determine whether they offer a real advantage onboard, identify which diagnostic functions should be performed by the expert system and which are better left to the procedural software, and work closely with both the hardware and software teams from the beginning of a project to produce a well-designed and thoroughly integrated application.
Abstracting event-based control models for high autonomy systems
NASA Technical Reports Server (NTRS)
Luh, Cheng-Jye; Zeigler, Bernard P.
1993-01-01
A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.
Model Based Analysis and Test Generation for Flight Software
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep
2009-01-01
We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.
Modeling languages for biochemical network simulation: reaction vs equation based approaches.
Wiechert, Wolfgang; Noack, Stephan; Elsheikh, Atya
2010-01-01
Biochemical network modeling and simulation is an essential task in any systems biology project. The systems biology markup language (SBML) was established as a standardized model exchange language for mechanistic models. A specific strength of SBML is that numerous tools for formulating, processing, simulating and analyzing models are freely available. Interestingly, in the field of multidisciplinary simulation, the problem of model exchange between different simulation tools occurred much earlier. Several general modeling languages, such as Modelica, were developed in the 1990s. Modelica enables an equation-based, modular specification of arbitrary hierarchical differential-algebraic equation models. Moreover, libraries for special application domains can be rapidly developed. This contribution compares the reaction-based approach of SBML with the equation-based approach of Modelica and explains the specific strengths of both tools. Several biological examples illustrating essential SBML and Modelica concepts are given. The chosen criteria for tool comparison are flexibility of constraint specification, different modeling flavors, and hierarchical, modular and multidisciplinary modeling. Additionally, support for spatially distributed systems, event handling and network analysis features is discussed. As a major result, it is shown that the choice of modeling tool has a strong impact on the expressivity of the specified models but also strongly depends on the requirements of the application context.
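The contrast between the two styles can be illustrated in a few lines: a reaction-based specification (the SBML view) is mechanically translated into the equation-based ODE form (the Modelica view). Species, stoichiometries, and rate constants below are invented for the example.

```python
# Tiny illustration of the two modeling styles compared above: a
# reaction-based specification translated into equation-based ODEs.
import numpy as np
from scipy.integrate import solve_ivp

# Reaction-based view: list of (reactants, products, rate constant).
reactions = [({"S": 1, "E": 1}, {"ES": 1}, 0.1),   # S + E -> ES
             ({"ES": 1}, {"E": 1, "P": 1}, 0.05)]  # ES -> E + P
species = ["S", "E", "ES", "P"]

def rhs(t, y):
    """Equation-based view: ODEs assembled from the reaction list."""
    c = dict(zip(species, y))
    dydt = dict.fromkeys(species, 0.0)
    for reactants, products, k in reactions:
        flux = k * np.prod([c[s] ** n for s, n in reactants.items()])
        for s, n in reactants.items():
            dydt[s] -= n * flux
        for s, n in products.items():
            dydt[s] += n * flux
    return [dydt[s] for s in species]

sol = solve_ivp(rhs, (0, 100), [10.0, 1.0, 0.0, 0.0])
```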
NASA Astrophysics Data System (ADS)
Park, Jihoon; Yang, Guang; Satija, Addy; Scheidt, Céline; Caers, Jef
2016-12-01
Sensitivity analysis plays an important role in geoscientific computer experiments, whether for forecasting, data assimilation or model calibration. In this paper we focus on an extension of a method of regionalized sensitivity analysis (RSA) to applications typical in the Earth Sciences. Such applications involve the building of large complex spatial models, the application of computationally extensive forward modeling codes and the integration of heterogeneous sources of model uncertainty. The aim of this paper is to be practical: 1) provide a Matlab code, 2) provide novel visualization methods to aid users in gaining a better understanding of the sensitivity, 3) provide a method based on kernel principal component analysis (KPCA) and self-organizing maps (SOM) to account for the spatial uncertainty typical in Earth Science applications, and 4) provide an illustration on a real field case where the above-mentioned complexities present themselves. We present methods that extend the original RSA method in several ways. First, we present the calculation of conditional effects, defined as the sensitivity of a parameter given a level of another parameter. Second, we show how this conditional effect can be used to choose nominal values or ranges to fix insensitive parameters while minimally affecting uncertainty in the response. Third, we develop a method based on KPCA and SOM to assign a rank to spatial models in order to calculate the sensitivity to spatial variability in the models. A large oil/gas reservoir case is used as an illustration of these ideas.
Social Learning among Organic Farmers and the Application of the Communities of Practice Framework
ERIC Educational Resources Information Center
Morgan, Selyf Lloyd
2011-01-01
The paper examines social learning processes among organic farmers and explores the application of the Community of Practice (CoP) model in this context. The analysis employed utilises an approach based on the CoP model, and considers how, or whether, this approach may be useful to understand social learning among farmers. The CoP model is applied…
Zhou, Xiangmin; Zhang, Nan; Sha, Desong; Shen, Yunhe; Tamma, Kumar K; Sweet, Robert
2009-01-01
The inability to render realistic soft-tissue behavior in real time has remained a barrier to the face and content aspects of validity for many virtual reality surgical training systems. Biophysically based models are suitable not only for training purposes but also for patient-specific clinical applications, physiological modeling and surgical planning. Among the existing approaches to modeling soft tissue for virtual reality surgical simulation, the computer graphics-based approach lacks predictive capability; the mass-spring model (MSM) based approach lacks biophysically realistic soft-tissue dynamic behavior; and finite element method (FEM) approaches fail to meet the real-time requirement. The present development stems from the first law of thermodynamics: for a space-discrete dynamic system, it directly formulates the space-discrete but time-continuous governing equation with an embedded material constitutive relation, resulting in a discrete mechanics framework that possesses a unique balance between computational effort and physically realistic soft-tissue dynamic behavior. We describe the development of this discrete mechanics framework with focused attention on a virtual laparoscopic nephrectomy application.
Wiggins, Paul A
2015-07-21
This article describes the application of a change-point algorithm to the analysis of stochastic signals in biological systems whose underlying state dynamics consist of transitions between discrete states. Applications of this analysis include molecular-motor stepping, fluorophore bleaching, electrophysiology, particle and cell tracking, detection of copy number variation by sequencing, tethered-particle motion, etc. We present a unified approach to the analysis of processes whose noise can be modeled by Gaussian, Wiener, or Ornstein-Uhlenbeck processes. To fit the model, we exploit explicit, closed-form algebraic expressions for maximum-likelihood estimators of model parameters and estimated information loss of the generalized noise model, which can be computed extremely efficiently. We implement change-point detection using the frequentist information criterion (which, to our knowledge, is a new information criterion). The frequentist information criterion specifies a single, information-based statistical test that is free from ad hoc parameters and requires no prior probability distribution. We demonstrate this information-based approach in the analysis of simulated and experimental tethered-particle-motion data. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
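A stripped-down version of the single change-point case for Gaussian noise with a mean shift, using the closed-form maximum-likelihood cost, might look as follows; the fixed penalty constant is a placeholder for the paper's frequentist information criterion, not its actual form.

```python
# Minimal sketch of change-point detection for Gaussian noise with a
# mean shift, using the closed-form maximum-likelihood cost. The
# penalty constant below is a placeholder, not the paper's frequentist
# information criterion.
import numpy as np

def best_split(y):
    """Return the split index minimizing the two-segment Gaussian cost."""
    n = len(y)
    best_k, best_cost = None, np.var(y) * n          # one-segment cost
    for k in range(2, n - 2):
        cost = np.var(y[:k]) * k + np.var(y[k:]) * (n - k)
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k, best_cost

rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(0, 1, 200), rng.normal(1.5, 1, 200)])
k, cost = best_split(y)
penalty = 2 * np.log(len(y))                         # placeholder penalty
accept = (np.var(y) * len(y) - cost) > penalty       # keep the change point?
print(k, accept)
```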
Kaganov, A Sh; Kir'yanov, P A
2015-01-01
The objective of the present publication is to discuss the possibility of applying cybernetic modeling methods to overcome the apparent discrepancy between two kinds of speech records, viz. initial ones (e.g. obtained in the course of special investigation activities) and the voice prints obtained from the persons subjected to criminalistic examination. The paper is based on literature sources and the materials of original criminalistic examinations performed by the authors.
ERIC Educational Resources Information Center
Technology & Learning, 2005
2005-01-01
In recent years, the widespread availability of networks and the flexibility of Web browsers have shifted the industry from a client-server model to a Web-based one. In the client-server model of computing, clients run applications locally, with the servers managing storage, printing functions, and network traffic. Because every client is…
Smith, York R.; Ray, Rupashree S.; Carlson, Krista; Sarma, Biplab; Misra, Mano
2013-01-01
Metal oxide nanotubes have become a widely investigated material, more specifically, self-organized titania nanotube arrays synthesized by electrochemical anodization. As a highly investigated material with a wide gamut of applications, the majority of published literature focuses on the solar-based applications of this material. The scope of this review summarizes some of the recent advances made using metal oxide nanotube arrays formed via anodization in solar-based applications. A general methodology for theoretical modeling of titania surfaces in solar applications is also presented. PMID:28811415
Model-based vision for space applications
NASA Technical Reports Server (NTRS)
Chaconas, Karen; Nashman, Marilyn; Lumia, Ronald
1992-01-01
This paper describes a method for tracking moving image features by combining spatial and temporal edge information with model-based feature information. The algorithm updates the two-dimensional position of object features by correlating predicted model features with current image data. The results of the correlation process are used to compute an updated model. The algorithm makes use of a high temporal sampling rate with respect to spatial changes of the image features and operates in a real-time multiprocessing environment. Preliminary results demonstrate successful tracking for image feature velocities between 1.1 and 4.5 pixels per image frame. This work has applications in docking, assembly, retrieval of floating objects and a host of other space-related tasks.
NASA Technical Reports Server (NTRS)
Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael
1992-01-01
Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.
Technology-Based Content through Virtual and Physical Modeling: A National Research Study
ERIC Educational Resources Information Center
Ernst, Jeremy V.; Clark, Aaron C.
2009-01-01
Visualization is becoming more prevalent as an application in science, engineering, and technology related professions. The analysis of static and dynamic graphical visualization provides data solutions and understandings that go beyond traditional forms of communication. The study of technology-based content and the application of conceptual…
NASA Astrophysics Data System (ADS)
Athaudage, Chandranath R. N.; Bradley, Alan B.; Lech, Margaret
2003-12-01
A dynamic programming-based optimization strategy for a temporal decomposition (TD) model of speech and its application to low-rate speech coding in storage and broadcasting is presented. In previous work with the spectral stability-based event localizing (SBEL) TD algorithm, the event localization was performed based on a spectral stability criterion. Although this approach gave reasonably good results, there was no assurance on the optimality of the event locations. In the present work, we have optimized the event localizing task using a dynamic programming-based optimization strategy. Simulation results show that an improved TD model accuracy can be achieved. A methodology of incorporating the optimized TD algorithm within the standard MELP speech coder for the efficient compression of speech spectral information is also presented. The performance evaluation results revealed that the proposed speech coding scheme achieves 50%-60% compression of speech spectral information with negligible degradation in the decoded speech quality.
Specification and Design of a Fault Recovery Model for the Reliable Multicast Protocol
NASA Technical Reports Server (NTRS)
Montgomery, Todd; Callahan, John R.; Whetten, Brian
1996-01-01
The Reliable Multicast Protocol (RMP) provides a unique, group-based model for distributed programs that need to handle reconfiguration events at the application layer. This model, called membership views, provides an abstraction in which events such as site failures, network partitions, and normal join-leave events are viewed as group reformations. RMP provides access to this model through an application programming interface (API) that notifies an application when a group is reformed as the result of some event. RMP provides applications with reliable delivery of messages using an underlying IP Multicast medium to other group members in a distributed environment, even in the case of reformations. A distributed application can use various Quality of Service (QoS) levels provided by RMP to tolerate group reformations. This paper explores the implementation details of the mechanisms in RMP that provide distributed applications with membership view information and fault recovery capabilities.
Stochastic-field cavitation model
NASA Astrophysics Data System (ADS)
Dumond, J.; Magagnato, F.; Class, A.
2013-07-01
Nonlinear phenomena can often be well described using probability density functions (pdf) and pdf transport models. Traditionally, the simulation of pdf transport requires Monte-Carlo codes based on Lagrangian "particles" or prescribed pdf assumptions including binning techniques. Recently, in the field of combustion, a novel formulation called the stochastic-field method solving pdf transport based on Eulerian fields has been proposed which eliminates the necessity to mix Eulerian and Lagrangian techniques or prescribed pdf assumptions. In the present work, for the first time the stochastic-field method is applied to multi-phase flow and, in particular, to cavitating flow. To validate the proposed stochastic-field cavitation model, two applications are considered. First, sheet cavitation is simulated in a Venturi-type nozzle. The second application is an innovative fluidic diode which exhibits coolant flashing. Agreement with experimental results is obtained for both applications with a fixed set of model constants. The stochastic-field cavitation model captures the wide range of pdf shapes present at different locations.
Skog, Alexander; Peyre, Sarah E; Pozner, Charles N; Thorndike, Mary; Hicks, Gloria; Dellaripa, Paul F
2012-01-01
The situational leadership model suggests that an effective leader adapts leadership style depending on the followers' level of competency. We assessed the applicability and reliability of the situational leadership model when observing residents in simulated hospital floor-based scenarios. Resident teams engaged in simulated clinical scenarios. Video recordings were divided into clips based on Emergency Severity Index v4 acuity scores. Situational leadership styles were identified in the clips by two physicians. Interrater reliability was determined through descriptive statistical data analysis. There were 114 participants recorded in 20 sessions, and 109 clips were reviewed and scored. There was a high level of interrater reliability (weighted kappa r = .81), supporting the situational leadership model's applicability to medical teams. A suggestive correlation was found between the frequency of changes in leadership style and the ability to effectively lead a medical team. The situational leadership model represents a unique tool to assess medical leadership performance in the context of acuity changes.
Aircraft applications of fault detection and isolation techniques
NASA Astrophysics Data System (ADS)
Marcos Esteban, Andres
In this thesis the problems of fault detection & isolation and fault-tolerant systems are studied from the perspective of LTI frequency-domain, model-based techniques. Emphasis is placed on the applicability of these LTI techniques to nonlinear models, especially aerospace systems. Two applications of H∞ LTI fault diagnosis are given using an open-loop (no controller) design approach: one for the longitudinal motion of a Boeing 747-100/200 aircraft, the other for a turbofan jet engine. An algorithm formalizing a robust identification approach based on model validation ideas is also given and applied to the same jet engine. A general linear fractional transformation formulation is given in terms of the Youla and dual Youla parameterizations for the integrated (control and diagnosis filter) approach. This formulation provides better insight into the trade-off between the control and diagnosis objectives. It also provides the basic groundwork towards the development of nested schemes for the integrated approach. These nested structures allow iterative improvements of the control/filter Youla parameters based on successive identification of the system uncertainty (as given by the dual Youla parameter). The thesis concludes with an application of H∞ LTI techniques to the integrated design for the longitudinal motion of the Boeing 747-100/200 model.
A markup language for electrocardiogram data acquisition and analysis (ecgML)
Wang, Haiying; Azuaje, Francisco; Jung, Benjamin; Black, Norman
2003-01-01
Background: The storage and distribution of electrocardiogram data is based on different formats. There is a need to promote the development of standards for their exchange and analysis. Such models should be platform-, system- and application-independent, flexible and open to every member of the scientific community. Methods: A minimum set of information for the representation and storage of electrocardiogram signals has been synthesised from existing recommendations. This specification is encoded into an XML vocabulary. The model may aid in the flexible exchange and analysis of electrocardiogram information. Results: Based on the advantages of XML technologies, ecgML is able to present a system-, application- and format-independent solution for the representation and exchange of electrocardiogram data. The distinction between the proposal developed by the U.S. Food and Drug Administration and the ecgML model is given. A series of tools, which aim to facilitate ecgML-based applications, are presented. Conclusions: The model proposed here can facilitate the generation of a data format that opens ways for better and clearer interpretation by both humans and machines. Its structured and transparent organisation will allow researchers to expand and test its capabilities in different application domains. The specification and programs for this protocol are publicly available. PMID:12735790
Application of IFT and SPSA to servo system control.
Rădac, Mircea-Bogdan; Precup, Radu-Emil; Petriu, Emil M; Preitl, Stefan
2011-12-01
This paper treats the application of two data-based model-free gradient-based stochastic optimization techniques, i.e., iterative feedback tuning (IFT) and simultaneous perturbation stochastic approximation (SPSA), to servo system control. The representative case of controlled processes modeled by second-order systems with an integral component is discussed. New IFT and SPSA algorithms are suggested to tune the parameters of the state feedback controllers with an integrator in the linear-quadratic-Gaussian (LQG) problem formulation. An implementation case study concerning the LQG-based design of an angular position controller for a direct current servo system laboratory equipment is included to highlight the pros and cons of IFT and SPSA from an application's point of view. The comparison of IFT and SPSA algorithms is focused on an insight into their implementation.
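For reference, the basic SPSA iteration (in Spall's standard form, which this line of work builds on) can be sketched as follows; the quadratic cost is a toy placeholder rather than a closed-loop servo experiment.

```python
# Hedged sketch of the basic SPSA iteration as used for data-based
# controller tuning; the cost function is a toy placeholder.
import numpy as np

def spsa(cost, theta, n_iter=200, a=0.1, c=0.1, alpha=0.602, gamma=0.101):
    rng = np.random.default_rng(0)
    for k in range(n_iter):
        ak = a / (k + 1) ** alpha          # decaying step size
        ck = c / (k + 1) ** gamma          # decaying perturbation size
        delta = rng.choice([-1.0, 1.0], size=theta.shape)  # Bernoulli +/-1
        # Two cost evaluations give a full (simultaneous) gradient estimate.
        g = (cost(theta + ck * delta) - cost(theta - ck * delta)) \
            / (2 * ck) * (1.0 / delta)
        theta = theta - ak * g
    return theta

cost = lambda th: (th[0] - 1.0) ** 2 + 2.0 * (th[1] + 0.5) ** 2
print(spsa(cost, np.array([0.0, 0.0])))    # converges near (1.0, -0.5)
```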
Panchal, Mitesh B; Upadhyay, Sanjay H
2014-09-01
The unprecedented dynamic characteristics of nanoelectromechanical systems make them suitable for nanoscale mass sensing applications. Owing to their superior biocompatibility, boron nitride nanotubes (BNNTs) are being increasingly used for such applications. In this study, the feasibility of a single-walled BNNT (SWBNNT)-based biosensor has been explored. A molecular structural mechanics-based finite element (FE) modelling approach has been used to analyse the dynamic behaviour of SWBNNT-based biosensors. The application of SWBNNT-based sensing to zeptogram-level masses is reported. The effects of nanotube size, in terms of length, and of different chiral atomic structures of the SWBNNT are analysed in a sensitivity analysis. The vibrational behaviour of the SWBNNT is analysed for higher-order modes of vibration to identify the intermediate landing position of a biological object of zeptogram scale. The present molecular structural mechanics-based FE modelling approach is found to be very effective in incorporating different chiralities of the atomic structures. Different boundary conditions can also be simulated using the present approach to analyse the dynamic behaviour of the SWBNNT-based mass sensor. The presented study explores the potential of the SWBNNT as a nanobiosensor capable of zeptogram-level mass sensing.
The AgESGUI geospatial simulation system for environmental model application and evaluation
USDA-ARS?s Scientific Manuscript database
Practical decision making in spatially-distributed environmental assessment and management is increasingly being based on environmental process-based models linked to geographical information systems (GIS). Furthermore, powerful computers and Internet-accessible assessment tools are providing much g...
Application of Consider Covariance to the Extended Kalman Filter
NASA Technical Reports Server (NTRS)
Lundberg, John B.
1996-01-01
The extended Kalman filter (EKF) is the basis for many applications of filtering theory to real-time problems where estimates of the state of a dynamical system are to be computed based upon some set of observations. The form of the EKF may vary somewhat from one application to another, but the fundamental principles are typically unchanged among these various applications. As is the case in many filtering applications, models of the dynamical system (differential equations describing the state variables) and models of the relationship between the observations and the state variables are created. These models typically employ a set of constants whose values are established by means of theory or experimental procedure. Since the estimates of the state are formed assuming that the models are perfect, any modeling errors will affect the accuracy of the computed estimates. Note that the modeling errors may be errors of commission (errors in terms included in the model) or omission (errors in terms excluded from the model). Consequently, it becomes imperative when evaluating the performance of real-time filters to evaluate the effect of modeling errors on the estimates of the state.
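A generic predict/update cycle of the EKF, the object whose sensitivity to modeling errors is analyzed above, can be sketched as below; the 1D dynamics, measurement model, and noise covariances are illustrative assumptions.

```python
# Minimal EKF sketch for a generic discrete-time system, to make the
# roles of the dynamics and measurement models concrete. The model
# functions and noise levels are illustrative assumptions.
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One predict/update cycle of the extended Kalman filter."""
    # Predict: propagate state and covariance through the dynamics model.
    x_pred = f(x)
    F_k = F(x)
    P_pred = F_k @ P @ F_k.T + Q
    # Update: correct with the measurement via the observation model.
    H_k = H(x_pred)
    S = H_k @ P_pred @ H_k.T + R
    K = P_pred @ H_k.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H_k) @ P_pred
    return x_new, P_new

# Toy 1D nonlinear example; errors in f or h would bias the estimate.
f = lambda x: np.array([x[0] + 0.1 * np.sin(x[0])])
F = lambda x: np.array([[1 + 0.1 * np.cos(x[0])]])
h = lambda x: np.array([x[0] ** 2])
H = lambda x: np.array([[2 * x[0]]])
x, P = np.array([1.0]), np.eye(1)
x, P = ekf_step(x, P, np.array([1.2]), f, F, h, H,
                0.01 * np.eye(1), 0.1 * np.eye(1))
```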
RuleMonkey: software for stochastic simulation of rule-based models
2010-01-01
Background: The system-level dynamics of many molecular interactions, particularly protein-protein interactions, can be conveniently represented using reaction rules, which can be specified using model-specification languages, such as the BioNetGen language (BNGL). A set of rules implicitly defines a (bio)chemical reaction network. The reaction network implied by a set of rules is often very large, and as a result, generation of the network implied by rules tends to be computationally expensive. Moreover, the cost of many commonly used methods for simulating network dynamics is a function of network size. Together these factors have limited application of the rule-based modeling approach. Recently, several methods for simulating rule-based models have been developed that avoid the expensive step of network generation. The cost of these "network-free" simulation methods is independent of the number of reactions implied by rules. Software implementing such methods is now needed for the simulation and analysis of rule-based models of biochemical systems. Results: Here, we present a software tool called RuleMonkey, which implements a network-free method for simulation of rule-based models that is similar to Gillespie's method. The method is suitable for rule-based models that can be encoded in BNGL, including models with rules that have global application conditions, such as rules for intramolecular association reactions. In addition, the method is rejection free, unlike other network-free methods that introduce null events, i.e., steps in the simulation procedure that do not change the state of the reaction system being simulated. We verify that RuleMonkey produces correct simulation results, and we compare its performance against DYNSTOC, another BNGL-compliant tool for network-free simulation of rule-based models. We also compare RuleMonkey against problem-specific codes implementing network-free simulation methods. Conclusions: RuleMonkey enables the simulation of rule-based models for which the underlying reaction networks are large. It is typically faster than DYNSTOC for benchmark problems that we have examined. RuleMonkey is freely available as a stand-alone application http://public.tgen.org/rulemonkey. It is also available as a simulation engine within GetBonNie, a web-based environment for building, analyzing and sharing rule-based models. PMID:20673321
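For contrast with the network-free approach, a minimal Gillespie direct-method simulator over an explicitly enumerated network looks like the sketch below; RuleMonkey performs this kind of stochastic simulation while avoiding the network enumeration step. The two-reaction system and rates are placeholders.

```python
# Sketch of the Gillespie direct method on a fixed, enumerated reaction
# network. Species and rate constants are arbitrary placeholders.
import numpy as np

def gillespie(x, stoich, rates, t_end):
    """x: species counts; stoich: state-change vectors; rates: propensity fns."""
    rng = np.random.default_rng(0)
    t, traj = 0.0, [(0.0, x.copy())]
    while t < t_end:
        a = np.array([r(x) for r in rates])
        a0 = a.sum()
        if a0 <= 0:
            break
        t += rng.exponential(1.0 / a0)           # time to the next reaction
        j = rng.choice(len(rates), p=a / a0)     # which reaction fires
        x = x + stoich[j]
        traj.append((t, x.copy()))
    return traj

# A + B -> AB (k1 = 0.01), AB -> A + B (k2 = 0.1)
x0 = np.array([100, 100, 0])
stoich = [np.array([-1, -1, 1]), np.array([1, 1, -1])]
rates = [lambda x: 0.01 * x[0] * x[1], lambda x: 0.1 * x[2]]
traj = gillespie(x0, stoich, rates, t_end=5.0)
```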
Cluster-based analysis of multi-model climate ensembles
NASA Astrophysics Data System (ADS)
Hyde, Richard; Hossaini, Ryan; Leeson, Amber A.
2018-06-01
Clustering - the automated grouping of similar data - can provide powerful and unique insight into large and complex data sets, in a fast and computationally efficient manner. While clustering has been used in a variety of fields (from medical image processing to economics), its application within atmospheric science has been fairly limited to date, and the potential benefits of applying advanced clustering techniques to climate data (both model output and observations) have yet to be fully realised. In this paper, we explore the specific application of clustering to a multi-model climate ensemble. We hypothesise that clustering techniques can provide (a) a flexible, data-driven method of testing model-observation agreement and (b) a mechanism with which to identify model development priorities. We focus our analysis on chemistry-climate model (CCM) output of tropospheric ozone - an important greenhouse gas - from the recent Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP). Tropospheric column ozone from the ACCMIP ensemble was clustered using the Data Density based Clustering (DDC) algorithm. We find that a multi-model mean (MMM) calculated using members of the most-populous cluster identified at each location offers a reduction of up to ˜20% in the global absolute mean bias between the MMM and an observed satellite-based tropospheric ozone climatology, with respect to a simple, all-model MMM. On a spatial basis, the bias is reduced at ˜62% of all locations, with the largest bias reductions occurring in the Northern Hemisphere, where ozone concentrations are relatively large. However, the bias is unchanged at 9% of all locations and increases at 29%, particularly in the Southern Hemisphere. The latter demonstrates that although cluster-based subsampling acts to remove outlier model data, such data may in fact be closer to observed values in some locations. We further demonstrate that clustering can provide a viable and useful framework in which to assess and visualise model spread, offering insight into geographical areas of agreement among models and a measure of diversity across an ensemble. Finally, we discuss caveats of the clustering techniques and note that while we have focused on tropospheric ozone, the principles underlying the cluster-based MMMs are applicable to other prognostic variables from climate models.
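The cluster-then-average idea at a single grid location can be sketched as follows, with k-means standing in for the paper's DDC algorithm (a deliberate simplification): only the members of the most-populous cluster contribute to the multi-model mean.

```python
# Sketch of the cluster-based multi-model mean at one grid location,
# using k-means in place of the DDC algorithm described above.
import numpy as np
from sklearn.cluster import KMeans

def cluster_mmm(member_values, n_clusters=2):
    """member_values: one variable from each ensemble member."""
    v = np.asarray(member_values).reshape(-1, 1)
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=0).fit_predict(v)
    biggest = np.bincount(labels).argmax()       # most-populous cluster
    return v[labels == biggest].mean()

# Toy ensemble of tropospheric column ozone values with two outliers.
ozone = [31.0, 32.5, 30.8, 31.9, 45.0, 44.2]
print(cluster_mmm(ozone), np.mean(ozone))        # cluster MMM vs plain MMM
```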
NASA Astrophysics Data System (ADS)
Sundaramoorthy, Kumaravel
2017-02-01
Hybrid energy system (HES) based electricity generation has become an attractive solution for rural electrification nowadays. Economically feasible and technically reliable HESs are solidly based on an optimisation stage. This article discusses an optimal unit-sizing model with the objective of minimising the total cost of the HES. Three typical rural sites from the southern part of India have been selected for the application of the developed optimisation methodology. Feasibility studies and sensitivity analysis of the optimal HES are discussed elaborately in this article. A comparison has been carried out with the Hybrid Optimization Model for Electric Renewables (HOMER) for the three sites. The optimal HES is found to have a lower total net present cost and cost of energy compared with the existing method.
Application and study of land-reclaim based on Arc/Info
NASA Astrophysics Data System (ADS)
Zhao, Jun; Zhang, Ruiju; Wang, Zhian; Li, Shiyong
2005-10-01
This paper first puts forward evaluation models for land reclamation, derived from the theory of fuzzy associative memory neural networks and corresponding supplemental CASE tools; based on these models the mode of land reclamation can be determined, and the elements of land reclamation are then displayed and synthesized visually and virtually by virtue of the Arc/Info software. In the process of land reclamation, it is particularly important to build the land-reclamation model and to map the distribution of soil elements, so that rational and feasible schemes can be adopted to guide the reclamation project. This paper takes the fourth mining area of East Beach as an example and puts the model into practice. Based on the Arc/Info software, the application of land reclamation is studied and good results are achieved.
Introduction: Special issue on advances in topobathymetric mapping, models, and applications
Gesch, Dean B.; Brock, John C.; Parrish, Christopher E.; Rogers, Jeffrey N.; Wright, C. Wayne
2016-01-01
Detailed knowledge of near-shore topography and bathymetry is required for many geospatial data applications in the coastal environment. New data sources and processing methods are facilitating development of seamless, regional-scale topobathymetric digital elevation models. These elevation models integrate disparate multi-sensor, multi-temporal topographic and bathymetric datasets to provide a coherent base layer for coastal science applications such as wetlands mapping and monitoring, sea-level rise assessment, benthic habitat mapping, erosion monitoring, and storm impact assessment. The focus of this special issue is on recent advances in the source data, data processing and integration methods, and applications of topobathymetric datasets.
Remote sensing sensors and applications in environmental resources mapping and modeling
Melesse, Assefa M.; Weng, Qihao; Thenkabail, Prasad S.; Senay, Gabriel B.
2007-01-01
The history of remote sensing and the development of different sensors for environmental and natural resources mapping and data acquisition are reviewed and reported. Application examples in urban studies, hydrological modeling (such as land-cover and floodplain mapping), fractional vegetation cover and impervious surface area mapping, surface energy flux, and micro-topography correlation studies are discussed. The review also discusses the use of remotely sensed rainfall and potential evapotranspiration for estimating the crop water requirement satisfaction index, which provides early warning information for growers. The review is not an exhaustive treatment of remote sensing techniques but rather a summary of some important applications in environmental studies and modeling.
The Development and Application of the Explanatory Model of School Dysfunctions
ERIC Educational Resources Information Center
Bergman, Manfred Max; Bergman, Zinette; Gravett, Sarah
2011-01-01
This article develops the Explanatory Model of School Dysfunctions based on 80 essays of school principals and their representatives in Gauteng. It reveals the degree and kinds of school dysfunctions, as well as their interconnectedness with actors, networks, and domains. The model provides a basis for theory-based analyses of specific…
A Model for E-Education: Extended Teaching Spaces and Extended Learning Spaces
ERIC Educational Resources Information Center
Jung, Insung; Latchem, Colin
2011-01-01
The paper proposes a model for e-education in instruction, training, initiation and induction based upon the concept of extended teaching spaces involving execution, facilitation and liberation, and extended learning spaces used for acquisition, application and construction cemented by dialogue and reflection. The proposed model is based upon…
Traffic model for advanced satellite designs and experiments for ISDN services
NASA Technical Reports Server (NTRS)
Pepin, Gerard R.; Hager, E. Paul
1991-01-01
The database structure and fields for categorizing and storing Integrated Services Digital Network (ISDN) user characteristics are outlined. This traffic model database will be used to exercise models of the ISDN Advanced Communication Satellite to determine design parameters and performance for the NASA Satellite Communications Applications Research (SCAR) Program.
The theory and programming of statistical tests for evaluating the Real-Time Air-Quality Model (RAM) using the Regional Air Pollution Study (RAPS) data base are fully documented in four report volumes. Moreover, the tests are generally applicable to other model evaluation problem...
Dinh, Thanh; Kim, Younghan; Lee, Hyukjoon
2017-03-01
This paper presents a location-based interactive model of Internet of Things (IoT) and cloud integration (IoT-cloud) for mobile cloud computing applications, in comparison with the periodic sensing model. In the latter, sensing collections are performed without awareness of sensing demands. Sensors are required to report their sensing data periodically regardless of whether or not there are demands for their sensing services. This leads to unnecessary energy loss due to redundant transmission. In the proposed model, IoT-cloud provides sensing services on demand based on interest and location of mobile users. By taking advantages of the cloud as a coordinator, sensing scheduling of sensors is controlled by the cloud, which knows when and where mobile users request for sensing services. Therefore, when there is no demand, sensors are put into an inactive mode to save energy. Through extensive analysis and experimental results, we show that the location-based model achieves a significant improvement in terms of network lifetime compared to the periodic model.
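The scheduling idea reduces to a simple rule: activate only the sensors within the region of a current user demand and keep the rest inactive. A toy sketch follows; the coordinates, radius, and data structures are placeholders, not the paper's protocol.

```python
# Toy sketch of on-demand, location-based sensing scheduling: the cloud
# activates only sensors near a mobile user's request location.
import math

def schedule(sensors, requests, radius):
    """Return the set of sensor ids the cloud switches to active mode."""
    active = set()
    for rx, ry in requests:                        # user demand locations
        for sid, (sx, sy) in sensors.items():
            if math.hypot(sx - rx, sy - ry) <= radius:
                active.add(sid)
    return active                                   # the rest stay asleep

sensors = {1: (0, 0), 2: (5, 5), 3: (9, 1), 4: (2, 8)}
requests = [(1, 1)]                                 # one user's interest area
print(schedule(sensors, requests, radius=3.0))      # -> {1}
```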
Model based rib-cage unfolding for trauma CT
NASA Astrophysics Data System (ADS)
von Berg, Jens; Klinder, Tobias; Lorenz, Cristian
2018-03-01
A CT rib-cage unfolding method is proposed that does not require determining rib centerlines but instead determines the visceral cavity surface by model-based segmentation. Image intensities are sampled across this surface, which is flattened using a model-based 3D thin-plate-spline registration. An average rib centerline model projected onto this surface serves as a reference system for the registration. The flattening registration is designed so that ribs similar to the centerline model are mapped onto parallel lines, preserving their relative length. Ribs deviating from this model accordingly appear as deviations from straight parallel ribs in the unfolded view. As the mapping is continuous, details in the intercostal space and those adjacent to the ribs are also rendered well. The most beneficial application area is trauma CT, where fast detection of rib fractures is a crucial task. Specifically in trauma, automatic rib centerline detection may not be guaranteed due to fractures and dislocations. Application by visual assessment on the large public LIDC database of lung CT proved the general feasibility of this early work.
Evaluation of Applicability of a Flare Trigger Model Based on a Comparison of Geometric Structures
NASA Astrophysics Data System (ADS)
Bamba, Yumi; Kusano, Kanya
2018-03-01
The triggering mechanism(s) and critical condition(s) of solar flares are still not completely clarified, although various studies have attempted to elucidate them. We have also proposed a theoretical flare-trigger model based on MHD simulations in which two types of small-scale bipole fields, the so-called opposite-polarity (OP) and reversed-shear (RS) fields, can trigger flares. In this study, we evaluated the applicability of our flare-trigger model to observations of 32 flares observed by the Solar Dynamics Observatory, focusing on geometrical structures. We classified the events into six types, including the OP and RS types, based on the photospheric magnetic field configuration, the presence of precursor brightenings, and the shape of the initial flare ribbons. As a result, we found that approximately 30% of the flares were consistent with our flare-trigger model, and that RS-type triggered flares outnumber OP-type ones. None of the sampled events contradict our flare model, though we cannot clearly determine the trigger mechanism of the remaining 70% of the flares in this study. We carefully investigated the applicability of our flare-trigger model and the possibility that other models explain the other 70% of the events. Consequently, we concluded that our flare-trigger model proposes important conditions for flare triggering.
Modeling of salt and pH gradient elution in ion-exchange chromatography.
Schmidt, Michael; Hafner, Mathias; Frech, Christian
2014-01-01
The separation of proteins by internally and externally generated pH gradients in chromatofocusing on ion-exchange columns is a well-established analytical method with a large number of applications. In this work, a stoichiometric displacement model was used to describe the retention behavior of lysozyme on SP Sepharose FF and a monoclonal antibody on Fractogel SO3 (S) in linear salt and pH gradient elution. The pH dependence of the binding charge B in the linear gradient elution model is introduced using a protein net charge model, while the pH dependence of the equilibrium constant is based on a thermodynamic approach. The model parameter and pH dependences are calculated from linear salt gradient elutions at different pH values as well as from linear pH gradient elutions at different fixed salt concentrations. The application of the model for the well-characterized protein lysozyme resulted in almost identical model parameters based on either linear salt or pH gradient elution data. For the antibody, only the approach based on linear pH gradients is feasible because of the limited pH range useful for salt gradient elution. The application of the model for the separation of an acid variant of the antibody from the major monomeric form is discussed. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
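For readers unfamiliar with the model, the common textbook form of the stoichiometric displacement relation (notation assumed here, not copied from the paper) links the isocratic retention factor k to the salt counter-ion concentration c_s through the binding charge B:

```latex
% Common form of the stoichiometric displacement model (assumed notation):
\[
\log k \;=\; \log K_e \;-\; B\,\log c_s ,
\]
```

where K_e is the equilibrium constant; in the paper's extension, B becomes pH-dependent through a protein net charge model and K_e becomes pH-dependent through a thermodynamic approach.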
iCrowd: agent-based behavior modeling and crowd simulator
NASA Astrophysics Data System (ADS)
Kountouriotis, Vassilios I.; Paterakis, Manolis; Thomopoulos, Stelios C. A.
2016-05-01
Initially designed in the context of the TASS (Total Airport Security System) FP-7 project, the Crowd Simulation platform developed by the Integrated Systems Lab of the Institute of Informatics and Telecommunications at N.C.S.R. Demokritos, has evolved into a complete domain-independent agent-based behavior simulator with an emphasis on crowd behavior and building evacuation simulation. Under continuous development, it reflects an effort to implement a modern, multithreaded, data-oriented simulation engine employing latest state-of-the-art programming technologies and paradigms. It is based on an extensible architecture that separates core services from the individual layers of agent behavior, offering a concrete simulation kernel designed for high-performance and stability. Its primary goal is to deliver an abstract platform to facilitate implementation of several Agent-Based Simulation solutions with applicability in several domains of knowledge, such as: (i) Crowd behavior simulation during [in/out] door evacuation. (ii) Non-Player Character AI for Game-oriented applications and Gamification activities. (iii) Vessel traffic modeling and simulation for Maritime Security and Surveillance applications. (iv) Urban and Highway Traffic and Transportation Simulations. (v) Social Behavior Simulation and Modeling.
NASA Astrophysics Data System (ADS)
Ćwikła, G.; Gwiazda, A.; Banaś, W.; Monica, Z.; Foit, K.
2017-08-01
The article presents a study of selected methods of complex system description that can support the Manufacturing Information Acquisition System (MIAS) methodology, which describes how to design a data acquisition system for collecting and processing real-time data on the functioning of a production system, as needed for company management. MIAS can enable conversion into a Cyber-Physical Production System. MIAS gathers and pre-processes data on the state of the production system, including e.g. the realisation of production orders and the state of machines, materials and human resources. A systematised approach and model-based development are proposed to improve the quality of the design of MIAS-based complex systems supporting data acquisition in various types of companies. Graphical specification can be the baseline for any model-based development in the specified areas. The possibility of applying SysML and BPMN, both described here as UML-based languages representing different approaches to modelling the requirements, architecture and implementation of the data acquisition system, as tools supporting the description of the required features of MIAS, was considered.
Model-based design of experiments for cellular processes.
Chakrabarty, Ankush; Buzzard, Gregery T; Rundell, Ann E
2013-01-01
Model-based design of experiments (MBDOE) assists in the planning of highly effective and efficient experiments. Although the foundations of this field are well-established, the application of these techniques to understand cellular processes is a fertile and rapidly advancing area as the community seeks to understand ever more complex cellular processes and systems. This review discusses the MBDOE paradigm along with applications and challenges within the context of cellular processes and systems. It also provides a brief tutorial on Fisher information matrix (FIM)-based and Bayesian experiment design methods along with an overview of existing software packages and computational advances that support MBDOE application and adoption within the Systems Biology community. As cell-based products and biologics progress into the commercial sector, it is anticipated that MBDOE will become an essential practice for design, quality control, and production. Copyright © 2013 Wiley Periodicals, Inc.
ME science as mobile learning based on virtual reality
NASA Astrophysics Data System (ADS)
Fradika, H. D.; Surjono, H. D.
2018-04-01
This article describes ME Science (Mobile Education Science), a mobile learning application for nuclear physics (Fisika Inti). ME Science is a product of research and development (R&D) carried out using the Alessi and Trollip model, which consists of three stages: (a) planning, which includes analysis of problems, goals, needs, and the idea of the product under development; (b) designing, which includes collecting materials, designing the material content, creating the storyboard, and evaluating and reviewing the product; (c) developing, which includes development of the product, alpha testing, revision, validation, beta testing, and evaluation. The article covers ME Science only up to the product development stage. The development produced a virtual-reality-based mobile learning application that runs on Android smartphones. The application contains a brief description of the learning material, quizzes, a video summary of the material, and learning material based on virtual reality.
40 CFR 1036.620 - Alternate CO2 standards based on model year 2011 compression-ignition engines.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Alternate CO2 standards based on model... the following criteria: (1) It must have been certified to all applicable emission standards in model... set and model year in which you certify engines to the standards of this section. You may not bank any...
40 CFR 1036.620 - Alternate CO2 standards based on model year 2011 compression-ignition engines.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Alternate CO2 standards based on model... the following criteria: (1) It must have been certified to all applicable emission standards in model... set and model year in which you certify engines to the standards of this section. You may not bank any...
40 CFR 1036.620 - Alternate CO2 standards based on model year 2011 compression-ignition engines.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Alternate CO2 standards based on model... the following criteria: (1) It must have been certified to all applicable emission standards in model... set and model year in which you certify engines to the standards of this section. You may not bank any...
Theory, development, and applicability of the surface water hydrologic model CASC2D
NASA Astrophysics Data System (ADS)
Downer, Charles W.; Ogden, Fred L.; Martin, William D.; Harmon, Russell S.
2002-02-01
Numerical tests indicate that Hortonian runoff mechanisms benefit from scaling effects that non-Hortonian runoff mechanisms do not share. This potentially makes Hortonian watersheds more amenable to physically based modelling provided that the physically based model employed properly accounts for rainfall distribution and initial soil moisture conditions, to which these types of model are highly sensitive. The distributed Hortonian runoff model CASC2D has been developed and tested for the US Army over the past decade. The purpose of the model is to provide the Army with superior predictions of runoff and stream-flow compared with the standard lumped parameter model HEC-1. The model is also to be used to help minimize negative effects on the landscape caused by US armed forces training activities. Development of the CASC2D model is complete and the model has been tested and applied at several locations. These applications indicate that the model can realistically reproduce hydrographs when properly applied. These applications also indicate that there may be many situations where the model is inadequate. Because of this, the Army is pursuing development of a new model, GSSHA, that will provide improved numerical stability and incorporate additional stream-flow-producing mechanisms and improved hydraulics.
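As a minimal illustration of the Hortonian (infiltration-excess) runoff mechanism this abstract centers on, the sketch below generates runoff whenever rainfall intensity exceeds a Green-Ampt infiltration capacity. The parameter values are invented, and the snippet is an illustration of the mechanism, not a representation of CASC2D's actual numerics.

```python
def green_ampt_rate(F, Ks, psi, dtheta):
    """Green-Ampt potential infiltration rate f = Ks * (1 + psi*dtheta / F)."""
    return Ks * (1.0 + psi * dtheta / max(F, 1e-9))

def hortonian_runoff(rain, dt=1.0, Ks=5.0, psi=110.0, dtheta=0.3):
    """Infiltration-excess (Hortonian) runoff: rainfall beyond the Green-Ampt
    infiltration capacity becomes surface runoff.
    rain: intensities [mm/h]; dt [h]; Ks [mm/h]; psi [mm]; dtheta [-]."""
    F, runoff = 1e-6, []
    for r in rain:
        f = min(r, green_ampt_rate(F, Ks, psi, dtheta))  # actual infiltration rate
        F += f * dt                                      # cumulative infiltration
        runoff.append(r - f)                             # infiltration excess
    return runoff

print(hortonian_runoff([2, 10, 30, 20, 5]))  # runoff appears once capacity is exceeded
```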
Choi, Young Joon; Constantino, Jason; Vedula, Vijay; Trayanova, Natalia; Mittal, Rajat
2015-01-01
A methodology for the simulation of heart function that combines an MRI-based model of cardiac electromechanics (CE) with a Navier–Stokes-based hemodynamics model is presented. The CE model consists of two coupled components that simulate the electrical and the mechanical functions of the heart. Accurate representations of ventricular geometry and fiber orientations are constructed from the structural magnetic resonance and the diffusion tensor MR images, respectively. The deformation of the ventricle obtained from the electromechanical model serves as input to the hemodynamics model in this one-way coupled approach via imposed kinematic wall velocity boundary conditions and, at the same time, governs the blood flow into and out of the ventricular volume. The time-dependent endocardial surfaces are registered using a diffeomorphic mapping algorithm, while the intraventricular blood flow patterns are simulated using a sharp-interface immersed boundary method-based flow solver. The utility of the combined heart-function model is demonstrated by comparing the hemodynamic characteristics of a normal canine heart beating in sinus rhythm against those of a dyssynchronously beating failing heart. We also discuss the potential of coupled CE and hemodynamics models for various clinical applications. PMID:26442254
An automation simulation testbed
NASA Technical Reports Server (NTRS)
Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.; Mutammara, Atheel
1988-01-01
The work being done in porting ROBOSIM (a graphical simulation system developed jointly by NASA-MSFC and Vanderbilt University) to the HP350SRX graphics workstation is described. New ROBOSIM features, such as collision detection and new kinematics simulation methods, are also discussed. Based on the experience of the work on ROBOSIM, a new graphics structural modeling environment is suggested, intended to be part of a new knowledge-based multiple-aspect modeling testbed. The knowledge-based modeling methodologies and tools already available are described. Three case studies in the area of Space Station automation are also reported. First, a geometrical structural model of the station is presented; this model was developed using the ROBOSIM package. Next, the possible application areas of an integrated modeling environment in the testing of different Space Station operations are discussed. One of these possible application areas is the modeling of the Environmental Control and Life Support System (ECLSS), one of the most complex subsystems of the station. Using the multiple-aspect modeling methodology, a fault propagation model of this system is being built and is described.
A Component-based Programming Model for Composite, Distributed Applications
NASA Technical Reports Server (NTRS)
Eidson, Thomas M.; Bushnell, Dennis M. (Technical Monitor)
2001-01-01
The nature of scientific programming is evolving toward larger, composite applications that are composed of smaller element applications. These composite applications are increasingly targeted at distributed, heterogeneous networks of computers, and they are most likely programmed by a group of developers. Software component technology and computational frameworks are being proposed and developed to meet the programming requirements of these new applications. Historically, programming systems have had a hard time gaining acceptance in the scientific programming community. In this paper, a programming model is outlined that attempts to organize software component concepts and fundamental programming entities into programming abstractions that will be better understood by application developers. The programming model is designed to support computational frameworks that manage many of the tedious programming details while still allowing sufficient programmer control to design an accurate, high-performance application.
Nowinski, Wieslaw L; Thirunavuukarasuu, Arumugam; Ananthasubramaniam, Anand; Chua, Beng Choon; Qian, Guoyu; Nowinska, Natalia G; Marchenko, Yevgen; Volkau, Ihar
2009-10-01
Preparation of tests and assessment of students by the instructor are time consuming. We address these two tasks in neuroanatomy education by employing a digital media application with a three-dimensional (3D), interactive, fully segmented, and labeled brain atlas. The anatomical and vascular models in the atlas are linked to Terminologia Anatomica. Because the cerebral models are fully segmented and labeled, our approach enables automatic and random atlas-derived generation of questions to test the location and naming of cerebral structures. This is done in four steps: test individualization by the instructor, test taking by the students at their convenience, automatic student assessment by the application, and communication of the individual assessments to the instructor. A computer-based application with an interactive 3D atlas and a preliminary mobile-based application were developed to realize this approach. The application works in two test modes: instructor and student. In the instructor mode, the instructor customizes the test by setting the scope of testing and the student performance criteria, which takes a few seconds. In the student mode, the student is tested and automatically assessed. Self-testing is also feasible at any time and pace. Our approach is automatic with respect to both test generation and student assessment. It is also objective, rapid, and customizable. We believe that this approach is novel from the computer-based, mobile-based, and atlas-assisted standpoints.
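The automatic, atlas-derived question generation described here can be pictured with a small sketch: given a dictionary of segmented, labeled structures, a naming question is drawn at random. The structure IDs, names, and function names below are hypothetical placeholders, not the application's API.

```python
import random

# Hypothetical subset of labeled atlas structures (IDs and names are illustrative,
# not taken from the actual atlas records).
ATLAS = {101: "hippocampus", 102: "thalamus", 103: "putamen", 104: "caudate nucleus"}

def make_naming_question(atlas, n_choices=4, rng=random):
    """Pick a structure at random and build a multiple-choice naming question,
    mirroring the automatic, atlas-derived test generation described above."""
    sid = rng.choice(list(atlas))
    distractors = rng.sample([n for k, n in atlas.items() if k != sid], n_choices - 1)
    choices = distractors + [atlas[sid]]
    rng.shuffle(choices)
    return {"highlighted_structure": sid, "choices": choices, "answer": atlas[sid]}

q = make_naming_question(ATLAS)
print(q["choices"], "->", q["answer"])
```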
From design to manufacturing of asymmetric teeth gears using computer application
NASA Astrophysics Data System (ADS)
Suciu, F.; Dascalescu, A.; Ungureanu, M.
2017-05-01
Asymmetric cylindrical gears, with involute tooth profiles having different base circle diameters, are nonstandard gears used with the aim of obtaining better functional parameters for the active profile. One would expect that manufacturing these gears becomes possible only after designing and producing specific tools. The paper presents how computer-aided design and applications developed in MATLAB, used to obtain the geometrical parameters and at the same time to calculate functional parameters such as stresses and displacements, transmission error and gear efficiency, together with 2D models generated with AUTOLISP applications, are used for computer-aided manufacturing of asymmetric gears with standard tools. Thus the specific tools, considered one of the disadvantages of these gears, are not necessary, and the expected supplementary costs are implicitly avoided. The calculation algorithm established for the asymmetric gear design application uses the "direct design" of spur gears. This method offers the possibility of determining first the parameters of the gears, followed by the determination of the asymmetric gear rack's parameters based on those of the gears. Using the original design method and computer applications, the geometrical parameters and the 2D and 3D models of the asymmetric gears have been determined, and on the basis of these models asymmetric gears have been manufactured on a CNC machine tool.
Moss, Darren Michael; Marzolini, Catia; Rajoli, Rajith K R; Siccardi, Marco
2015-01-01
The pharmacokinetic properties of anti-infective drugs are a determinant of treatment success. Pathogen replication is inhibited if adequate drug levels are achieved in target sites, whereas excessive drug concentrations linked to toxicity are to be avoided. Anti-infective distribution can be predicted by integrating in vitro drug properties and mathematical descriptions of human anatomy in physiologically based pharmacokinetic models. This method reduces the need for animal and human studies and is used increasingly in drug development and in the simulation of clinical scenarios such as drug-drug interactions, dose optimization, novel formulations, and pharmacokinetics in special populations. We have assessed the relevance of physiologically based pharmacokinetic modeling in the anti-infective research field, giving an overview of the mechanisms involved in model design, and have suggested strategies for future applications of physiologically based pharmacokinetic models. Physiologically based pharmacokinetic modeling provides a powerful tool in anti-infective optimization, and there is now no doubt that both industry and regulatory bodies have recognized the importance of this technology. It should be acknowledged, however, that major challenges remain to be addressed and that information detailing disease group physiology and anti-infective pharmacodynamics is required if a personalized medicine approach is to be achieved.
ERIC Educational Resources Information Center
Downey, Thomas E.
Continuous quality improvement (CQI) models, which were first applied in business, are critical to making new technology-based learning paradigms and flexible learning environments a reality. The following are among the factors that have facilitated CQI's application in education: increased operating costs; increased competition from private…
WEPP Model applications for evaluations of best management practices
D. C. Flanagan; W. J. Elliott; J. R. Frankenberger; C. Huang
2010-01-01
The Water Erosion Prediction Project (WEPP) model is a process-based erosion prediction technology for application to small watersheds and hillslope profiles, under agricultural, forested, rangeland, and other land management conditions. Developed by the United States Department of Agriculture (USDA) over the past 25 years, WEPP simulates many of the physical processes...
Constituents Make the Difference: Improving the Value of Rehabilitation Research.
ERIC Educational Resources Information Center
Menz, Fredrick E.
The participatory research model used by the Rehabilitation Research and Training Center at the University of Wisconsin-Stout is discussed, with a focus on the value added to the research process and relevance of research applications when research is rehabilitation-need based and the research-to-applications process model is used. Information is…
YADBrowser: A Browser for Web-Based Educational Applications
ERIC Educational Resources Information Center
Zaldivar, Vicente Arturo Romero; Arandia, Jon Ander Elorriaga; Brito, Mateo Lezcano
2005-01-01
In this article, the main characteristics of the educational browser YADBrowser are described. One of the main objectives of this project is to define new languages and object models which facilitate the creation of educational applications for the Internet. The fundamental characteristics of the object model of the browser are also described.…
Dynamic Emulation Modelling (DEMo) of large physically-based environmental models
NASA Astrophysics Data System (ADS)
Galelli, S.; Castelletti, A.
2012-12-01
In environmental modelling, large, spatially distributed, physically based models are widely adopted to describe the dynamics of physical, social and economic processes. Such an accurate process characterization comes, however, at a price: the computational requirements of these models are considerably high and prevent their use in any problem requiring hundreds or thousands of model runs to be solved satisfactorily. Typical examples include optimal planning and management, data assimilation, inverse modelling and sensitivity analysis. An effective approach to overcome this limitation is to perform a top-down reduction of the physically based model by identifying a simplified, computationally efficient emulator, constructed from and then used in place of the original model in highly resource-demanding tasks. The underlying idea is that not all the process details in the original model are equally important and relevant to the dynamics of the outputs of interest for the type of problem considered. Emulation modelling has been successfully applied in many environmental applications; however, most of the literature considers non-dynamic emulators (e.g. metamodels, response surfaces and surrogate models), where the original dynamical model is reduced to a static map between the input and the output of interest. In this study we focus on Dynamic Emulation Modelling (DEMo), a methodological approach that preserves the dynamic nature of the original physically based model, with consequent advantages in a wide variety of problem areas. In particular, we propose a new data-driven DEMo approach that combines the many advantages of data-driven modelling in representing complex, non-linear relationships with the state-space representation typical of process-based models, which is both particularly effective in some applications (e.g. optimal management and data assimilation) and facilitates the ex-post physical interpretation of the emulator structure, thus enhancing the credibility of the model to stakeholders and decision-makers. Numerical results from the application of the approach to the reduction of 3D coupled hydrodynamic-ecological models in several real-world case studies, including Marina Reservoir (Singapore) and Googong Reservoir (Australia), are illustrated.
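To make the contrast with static emulators concrete, the sketch below fits a minimal dynamic (state-space) emulator x_{t+1} ≈ A x_t + B u_t to snapshots from a toy "original model" by least squares. The DEMo methodology is far more general; the matrices and data here are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
T, nx, nu = 200, 3, 1
U = rng.normal(size=(T, nu))                           # forcing inputs
A_true = np.array([[0.9, 0.05, 0.0], [0.0, 0.8, 0.1], [0.0, 0.0, 0.7]])
B_true = np.array([[0.5], [0.0], [0.2]])

# Generate state snapshots from the toy "original model"
X = np.zeros((T + 1, nx))
for t in range(T):
    X[t + 1] = A_true @ X[t] + B_true @ U[t]

# Identify the dynamic emulator by least squares on [x_t, u_t] -> x_{t+1}
Z = np.hstack([X[:-1], U])
Theta, *_ = np.linalg.lstsq(Z, X[1:], rcond=None)
A_hat, B_hat = Theta[:nx].T, Theta[nx:].T
print(np.round(A_hat, 2))   # recovers A_true on this noise-free toy problem
```

Unlike a static response surface, the identified (A_hat, B_hat) pair can be stepped forward in time, which is what makes such emulators usable inside management optimization or data assimilation loops.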
Simulation Assisted Risk Assessment: Blast Overpressure Modeling
NASA Technical Reports Server (NTRS)
Lawrence, Scott L.; Gee, Ken; Mathias, Donovan; Olsen, Michael
2006-01-01
A probabilistic risk assessment (PRA) approach has been developed and applied to the risk analysis of capsule abort during ascent. The PRA is used to assist in the identification of modeling and simulation applications that can significantly impact the understanding of crew risk during this potentially dangerous maneuver. The PRA approach is also being used to identify the appropriate level of fidelity for the modeling of those critical failure modes. The Apollo launch escape system (LES) was chosen as a test problem for application of this approach. Failure modes that have been modeled and/or simulated to date include explosive overpressure-based failure, explosive fragment-based failure, land landing failures (range limits exceeded either near launch or Mode III trajectories ending on the African continent), capsule-booster re-contact during separation, and failure due to plume-induced instability. These failure modes have been investigated using analysis tools in a variety of technical disciplines at various levels of fidelity. The current paper focuses on the development and application of a blast overpressure model for the prediction of structural failure due to overpressure, including the application of high-fidelity analysis to predict near-field and headwinds effects.
NASA Technical Reports Server (NTRS)
Langtry, R. B.; Menter, F. R.; Likki, S. R.; Suzen, Y. B.; Huang, P. G.; Volker, S.
2006-01-01
A new correlation-based transition model has been developed, which is built strictly on local variables. As a result, the transition model is compatible with modern computational fluid dynamics (CFD) methods using unstructured grids and massive parallel execution. The model is based on two transport equations, one for the intermittency and one for the transition onset criteria in terms of momentum thickness Reynolds number. The proposed transport equations do not attempt to model the physics of the transition process (unlike, e.g., turbulence models), but form a framework for the implementation of correlation-based models into general-purpose CFD methods.
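Schematically, the two transport equations have the generic form below (written in standard intermittency-transport notation; the production and destruction terms P_γ, E_γ, P_θt and the constants σ_f, σ_θt are defined by the published correlations and are not reproduced here):

```latex
% Schematic form only; source terms and constants follow the published model.
\frac{\partial(\rho\gamma)}{\partial t}
 + \frac{\partial(\rho U_j \gamma)}{\partial x_j}
 = P_\gamma - E_\gamma
 + \frac{\partial}{\partial x_j}\left[\left(\mu + \frac{\mu_t}{\sigma_f}\right)
   \frac{\partial\gamma}{\partial x_j}\right]

\frac{\partial(\rho\widetilde{Re}_{\theta t})}{\partial t}
 + \frac{\partial(\rho U_j \widetilde{Re}_{\theta t})}{\partial x_j}
 = P_{\theta t}
 + \frac{\partial}{\partial x_j}\left[\sigma_{\theta t}\,(\mu + \mu_t)\,
   \frac{\partial \widetilde{Re}_{\theta t}}{\partial x_j}\right]
```

Because both equations involve only local variables and gradients, they can be solved cell-by-cell on unstructured, domain-decomposed grids, which is the compatibility property the abstract emphasizes.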
Data Intensive Systems (DIS) Benchmark Performance Summary
2003-08-01
models assumed by today's conventional architectures. Such applications include model-based Automatic Target Recognition (ATR), synthetic aperture radar (SAR) codes, large-scale dynamic databases/battlefield integration, dynamic sensor-based processing, high-speed cryptanalysis, high-speed distributed interactive and data-intensive simulations, and data-oriented problems characterized by pointer-based and other highly irregular data structures
A modeling framework was developed that can be applied in conjunction with field based monitoring efforts (e.g., through effects-based monitoring programs) to link chemically-induced alterations in molecular and biochemical endpoints to adverse outcomes in whole organisms and pop...
NASA Technical Reports Server (NTRS)
Celaya, Jose; Kulkarni, Chetan; Biswas, Gautam; Saha, Sankalita; Goebel, Kai
2011-01-01
A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical degradation model. Electrolytic capacitors are used in several applications ranging from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability, and given their criticality in electronics subsystems they are a good candidate for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. We also present experimental results of an accelerated aging test under electrical stresses. The data obtained in this test form the basis for a remaining life prediction algorithm in which a model of the degradation process is suggested. This preliminary remaining life prediction algorithm serves as a demonstration of how prognostics methodologies could be used for electrolytic capacitors. In addition, the use of degradation progression data from accelerated aging provides an avenue for validation of applications of the Kalman filter based prognostics methods typically used for remaining useful life predictions in other applications.
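A minimal sketch of the Kalman-filter-plus-empirical-degradation idea follows: a two-state filter tracks a degradation level and its drift rate from noisy measurements, and the remaining useful life (RUL) is extrapolated to a failure threshold. The state model, noise covariances, threshold, and synthetic data are all illustrative assumptions, not the paper's model.

```python
import numpy as np

dt, threshold = 1.0, 1.0                       # time step; failure threshold (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])          # state transition: [degradation, drift rate]
H = np.array([[1.0, 0.0]])                     # only the degradation level is measured
Q = np.diag([1e-5, 1e-6])                      # process noise (assumed)
R = np.array([[4e-4]])                         # measurement noise (assumed)
x, P = np.array([[0.0], [0.005]]), np.eye(2)

rng = np.random.default_rng(1)
for k in range(100):
    z = 0.008 * (k + 1) + rng.normal(0.0, 0.02)   # synthetic degradation measurement
    x, P = F @ x, F @ P @ F.T + Q                 # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    x = x + K @ (np.array([[z]]) - H @ x)         # update with the new measurement
    P = (np.eye(2) - K @ H) @ P

deg, rate = x[0, 0], x[1, 0]
rul = (threshold - deg) / rate if rate > 0 else float("inf")
print(f"degradation={deg:.3f}, rate={rate:.4f}, RUL ~ {rul:.0f} steps")
```

In practice the degradation variable would be a health indicator such as normalized ESR growth or capacitance loss, with the model structure fitted to the accelerated-aging data described in the abstract.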
NASA Technical Reports Server (NTRS)
Celaya, Jose R.; Kulkarni, Chetan S.; Biswas, Gautam; Goebel, Kai
2012-01-01
A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical degradation model. Electrolytic capacitors are used in several applications ranging from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability, and given their criticality in electronics subsystems they are a good candidate for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. We also present experimental results of an accelerated aging test under electrical stresses. The data obtained in this test form the basis for a remaining life prediction algorithm in which a model of the degradation process is suggested. This preliminary remaining life prediction algorithm serves as a demonstration of how prognostics methodologies could be used for electrolytic capacitors. In addition, the use of degradation progression data from accelerated aging provides an avenue for validation of applications of the Kalman filter based prognostics methods typically used for remaining useful life predictions in other applications.
Underwater 3D Surface Measurement Using Fringe Projection Based Scanning Devices
Bräuer-Burchardt, Christian; Heinze, Matthias; Schmidt, Ingo; Kühmstedt, Peter; Notni, Gunther
2015-01-01
In this work we show the principle of optical 3D surface measurements based on the fringe projection technique for underwater applications. The challenges of underwater use of this technique are shown and discussed in comparison with the classical application. We describe an extended camera model which takes refraction effects into account as well as a proposal of an effective, low-effort calibration procedure for underwater optical stereo scanners. This calibration technique combines a classical air calibration based on the pinhole model with ray-based modeling and requires only a few underwater recordings of an object of known length and a planar surface. We demonstrate a new underwater 3D scanning device based on the fringe projection technique. It has a weight of about 10 kg and the maximal water depth for application of the scanner is 40 m. It covers an underwater measurement volume of 250 mm × 200 mm × 120 mm. The surface of the measurement objects is captured with a lateral resolution of 150 μm in a third of a second. Calibration evaluation results are presented and examples of first underwater measurements are given. PMID:26703624
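The refraction effects that the extended camera model must account for can be sketched with a single Snell refraction of a pinhole ray at a flat housing port. The geometry, the refractive indices, and the decision to ignore the glass layer are simplifying assumptions for illustration only.

```python
import numpy as np

def refract(d, n, n1, n2):
    """Snell refraction of direction d at a surface with normal n (n opposes d)."""
    d, n = d / np.linalg.norm(d), n / np.linalg.norm(n)
    cos_i = -np.dot(n, d)
    r = n1 / n2
    k = 1.0 - r * r * (1.0 - cos_i * cos_i)
    if k < 0:
        raise ValueError("total internal reflection")
    return r * d + (r * cos_i - np.sqrt(k)) * n

ray_air = np.array([0.3, 0.0, 1.0])           # pinhole ray leaving the camera
normal = np.array([0.0, 0.0, -1.0])           # flat port facing the camera
ray_water = refract(ray_air, normal, n1=1.0, n2=1.33)  # air -> water (glass ignored)
print(ray_water / np.linalg.norm(ray_water))  # ray bends toward the port normal
```

A ray-based calibration, as described above, effectively absorbs this bending (and the port geometry) into per-ray corrections estimated from a few underwater recordings.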
Sensor-Based Optimization Model for Air Quality Improvement in Home IoT
Kim, Jonghyuk
2018-01-01
We introduce current home Internet of Things (IoT) technology and present research on its various forms and applications in real life. In addition, we describe IoT marketing strategies as well as specific modeling techniques for improving air quality, a key home IoT service. To this end, we summarize the latest research on sensor-based home IoT, studies on indoor air quality, and technical studies on random data generation. In addition, we develop an air quality improvement model that can be readily applied to the market by acquiring initial analytical data and building infrastructures using spectrum/density analysis and the natural cubic spline method. Accordingly, we generate related data based on user behavioral values. We integrate the logic into the existing home IoT system to enable users to easily access the system through the Web or mobile applications. We expect that the present introduction of a practical marketing application method will contribute to enhancing the expansion of the home IoT market. PMID:29570684
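The natural-cubic-spline step mentioned in the abstract can be sketched as follows; the timestamps and PM2.5-like readings are made up, and only the natural boundary condition reflects the method named in the text.

```python
import numpy as np
from scipy.interpolate import CubicSpline

t = np.array([0, 10, 20, 30, 40, 50], dtype=float)      # minutes (synthetic)
pm25 = np.array([35.0, 42.0, 58.0, 51.0, 44.0, 39.0])   # sensor readings (synthetic)

spline = CubicSpline(t, pm25, bc_type="natural")        # natural cubic spline
t_dense = np.linspace(t[0], t[-1], 11)
print(np.round(spline(t_dense), 1))                     # densified series for the model
```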
Sensor-Based Optimization Model for Air Quality Improvement in Home IoT.
Kim, Jonghyuk; Hwangbo, Hyunwoo
2018-03-23
We introduce current home Internet of Things (IoT) technology and present research on its various forms and applications in real life. In addition, we describe IoT marketing strategies as well as specific modeling techniques for improving air quality, a key home IoT service. To this end, we summarize the latest research on sensor-based home IoT, studies on indoor air quality, and technical studies on random data generation. In addition, we develop an air quality improvement model that can be readily applied to the market by acquiring initial analytical data and building infrastructures using spectrum/density analysis and the natural cubic spline method. Accordingly, we generate related data based on user behavioral values. We integrate the logic into the existing home IoT system to enable users to easily access the system through the Web or mobile applications. We expect that the present introduction of a practical marketing application method will contribute to enhancing the expansion of the home IoT market.
NASA Astrophysics Data System (ADS)
Shugart, Herman H.; Wang, Bin; Fischer, Rico; Ma, Jianyong; Fang, Jing; Yan, Xiaodong; Huth, Andreas; Armstrong, Amanda H.
2018-03-01
Individual-based models (IBMs) of complex systems emerged in the 1960s and early 1970s, across diverse disciplines from astronomy to zoology. Ecological IBMs arose with seemingly independent origins out of the tradition of understanding the dynamics of ecosystems through a 'bottom-up' accounting of the interactions of the parts. Individual trees are principal among the parts of forests. Because these models are computationally demanding, they have prospered as the power of digital computers has increased exponentially over the decades following the 1970s. This review focuses on a class of forest IBMs called gap models. Gap models simulate the changes in forests by simulating the birth, growth and death of each individual tree on a small plot of land. The summation of these plots comprises a forest (or a set of sample plots on a forested landscape or region). Other, more aggregated forest IBMs have been used in global applications, including cohort-based models, ecosystem demography models, etc. Gap models have been used to provide the parameters for these bulk models. Currently, gap models have grown from local-scale to continental-scale and even global-scale applications to assess the potential consequences of climate change on natural forests. Modifications to the models have enabled simulation of disturbances including fire, insect outbreak and harvest. Our objective in this review is to provide the reader with an overview of the history, motivation and applications, including theoretical applications, of these models. In a time of concern over global changes, gap models are essential tools for understanding forest responses to climate change, modified disturbance regimes and other change agents. Development of forest surveys to provide the starting points for simulations, and better estimates of the behavior of the diversity of tree species in response to the environment, remain continuing needs for these and other IBMs.
Modern proposal of methodology for retrieval of characteristic synthetic rainfall hyetographs
NASA Astrophysics Data System (ADS)
Licznar, Paweł; Burszta-Adamiak, Ewa; Łomotowski, Janusz; Stańczyk, Justyna
2017-11-01
The modern engineering practice of designing and modelling complex drainage systems is based on hydrodynamic modelling and has a probabilistic character. Its practical application requires a change in the rainfall models accepted at the input. Previously used artificial rainfall models of simplified form, e.g. block precipitation or Euler type II design storms, are no longer sufficient. There is an evident and urgent need to clarify the methodology for standardized rainfall hyetographs that take into consideration the specifics of local storm rainfall temporal dynamics. The aim of the paper is to present a proposal for an innovative methodology for determining standardized rainfall hyetographs, based on statistical processing of collections of actual local precipitation records. The proposed methodology is based on the classification of standardized rainfall hyetographs with the use of cluster analysis. Its application is presented on the example of selected rain gauges located in Poland. The synthetic rainfall hyetographs obtained as the final result may be used for hydrodynamic modelling of sewerage systems, including probabilistic determination of the necessary capacity of retention reservoirs.
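A minimal sketch of the proposed classification idea: convert each rainfall event into a dimensionless cumulative hyetograph on a common grid and cluster the resulting curves. The synthetic events, the grid size, and the use of k-means (one common choice of cluster analysis) are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
events = [rng.gamma(shape=rng.uniform(1, 5), size=20) for _ in range(30)]  # synthetic

def standardize(ev, n=10):
    """Dimensionless cumulative mass curve resampled onto a common grid."""
    cum = np.cumsum(ev) / np.sum(ev)
    return np.interp(np.linspace(0, 1, n), np.linspace(0, 1, len(cum)), cum)

X = np.array([standardize(ev) for ev in events])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels)   # cluster centroids would serve as the characteristic hyetographs
```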
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kandler A; Schimpe, Michael; von Kuepach, Markus Edler
For reliable lifetime predictions of lithium-ion batteries, models for cell degradation are required. A comprehensive semi-empirical model based on a reduced set of internal cell parameters and physically justified degradation functions for the capacity loss is developed and presented for a commercial lithium iron phosphate/graphite cell. One calendar and several cycle aging effects are modeled separately. Emphasis is placed on the varying degradation at different temperatures. Degradation mechanisms for cycle aging at high and low temperatures, as well as the increased cycling degradation at high state of charge, are calculated separately. For parameterization, a lifetime test study is conducted, including storage and cycle tests. Additionally, the model is validated against a dynamic current profile based on real-world application in a stationary energy storage system, revealing its accuracy. The model error for the cell capacity loss in the application-based tests is below 1% of the original cell capacity at the end of testing.
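The structure of such a semi-empirical model can be sketched as one calendar term plus one cycling term, each with an Arrhenius-type temperature factor. The functional forms, all coefficients, and the negative activation energy used to mimic stronger low-temperature cycling aging are illustrative stand-ins, not the published parameterization.

```python
import numpy as np

R_GAS = 8.314  # J/(mol K)

def arrhenius(T_K, Ea, T_ref=298.15):
    """Arrhenius acceleration factor relative to a reference temperature."""
    return np.exp(-Ea / R_GAS * (1.0 / T_K - 1.0 / T_ref))

def capacity_loss(t_days, fec, T_K, soc):
    """Relative capacity loss = calendar term + cycling term (illustrative forms).
    t_days: storage time; fec: full equivalent cycles; soc: state of charge (0-1)."""
    cal = 4e-4 * arrhenius(T_K, 40e3) * (1.0 + 0.5 * soc) * np.sqrt(t_days)
    # negative activation energy mimics cycling aging that worsens at low temperature
    cyc = 6e-4 * arrhenius(T_K, -30e3) * np.sqrt(fec)
    return cal + cyc

for T in (283.15, 298.15, 313.15):
    print(f"T={T - 273.15:.0f} C: loss={capacity_loss(365, 500, T, soc=0.8):.3%}")
```

Separating the terms this way is what allows calendar and cycle effects, and their opposite temperature trends, to be parameterized from independent storage and cycling tests.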
Data-Flow Based Model Analysis
NASA Technical Reports Server (NTRS)
Saad, Christian; Bauer, Bernhard
2010-01-01
The concept of (meta) modeling combines an intuitive way of formalizing the structure of an application domain with a high expressiveness that makes it suitable for a wide variety of use cases, and has therefore become an integral part of many areas in computer science. While the definition of modeling languages through the use of meta models, e.g. in the Unified Modeling Language (UML), is a well-understood process, their validation and the extraction of behavioral information is still a challenge. In this paper we present a novel approach for dynamic model analysis along with several fields of application. Examining the propagation of information along the edges and nodes of the model graph makes it possible to extend and simplify the definition of semantic constraints in comparison with the capabilities offered by, e.g., the Object Constraint Language. Performing a flow-based analysis also enables the simulation of dynamic behavior, thus providing an "abstract interpretation"-like analysis method for the modeling domain.
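The flow-based analysis described here can be pictured as a fixed-point propagation over the model graph. The tiny graph and the "reachable labels" analysis below are invented for illustration; real analyses would propagate richer lattice values.

```python
# Minimal fixed-point data-flow propagation over a model graph: each node's
# analysis value is recomputed from its predecessors until nothing changes.
edges = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
preds = {n: [m for m in edges if n in edges[m]] for n in edges}

values = {n: {n} for n in edges}     # initial analysis value: each node's own label
changed = True
while changed:                       # iterate to a fixed point
    changed = False
    for n in edges:
        new = set(values[n])
        for p in preds[n]:
            new |= values[p]         # merge information flowing along edges
        if new != values[n]:
            values[n], changed = new, True

print(values)   # D accumulates information from all of its ancestors
```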
Physiologically-based kinetic modelling in risk assessment
The European Union Reference Laboratory for Alternatives to Animal Testing (EURL ECVAM) hosted a two-day workshop with an aim to discuss the role and application of Physiologically Based Kinetic (PBK) models in regulatory decision making. The EURL ECVAM strategy document on Toxic...
Physiologically based pharmacokinetic (PBPK) modeling considering methylated trivalent arsenicals
PBPK modeling provides a quantitative biologically-based framework to integrate diverse types of information for application to risk analysis. For example, genetic polymorphisms in arsenic metabolizing enzymes (AS3MT) can lead to differences in target tissue dosimetry for key tri...
A Primer for Agent-Based Simulation and Modeling in Transportation Applications
DOT National Transportation Integrated Search
2013-11-01
Agent-based modeling and simulation (ABMS) methods have been applied in a spectrum of research domains. This primer focuses on ABMS in the transportation interdisciplinary domain, describes the basic concepts of ABMS and the recent progress of ABMS i...
Agent-Based Computational Modeling to Examine How Individual Cell Morphology Affects Dosimetry
Cell-based models utilizing high-content screening (HCS) data have applications for predictive toxicology. Evaluating concentration-dependent effects on cell fate and state response is a fundamental utilization of HCS data.Although HCS assays may capture quantitative readouts at ...
MaMR: High-performance MapReduce programming model for material cloud applications
NASA Astrophysics Data System (ADS)
Jing, Weipeng; Tong, Danyu; Wang, Yangang; Wang, Jingyuan; Liu, Yaqiu; Zhao, Peng
2017-02-01
With increasing data sizes in materials science, existing programming models no longer satisfy the application requirements. MapReduce is a programming model that enables the easy development of scalable parallel applications to process big data on cloud computing systems. However, this model does not directly support the processing of multiple related datasets, and its processing performance does not reflect the advantages of cloud computing. To enhance the capability of workflow applications in material data processing, we defined a programming model for material cloud applications, called MaMR, that supports multiple different Map and Reduce functions running concurrently, based on a hybrid shared-memory BSP model. An optimized data sharing strategy to supply the shared data to the different Map and Reduce stages was also designed. We added a new merge phase to MapReduce that can efficiently merge data from the map and reduce modules. Experiments showed that the model and framework deliver effective performance improvements compared with previous work.
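To picture a MapReduce flow extended with a merge phase, the toy sketch below runs two map/reduce pairs over related material datasets and then merges their per-key results. The function names and data are illustrative and do not reflect the MaMR API.

```python
from collections import defaultdict
from itertools import chain

def map_reduce(records, mapper, reducer):
    """Tiny in-process MapReduce: map each record to (key, value) pairs,
    group by key, then reduce each group."""
    groups = defaultdict(list)
    for key, value in chain.from_iterable(mapper(r) for r in records):
        groups[key].append(value)
    return {k: reducer(k, vs) for k, vs in groups.items()}

# Two related material datasets processed by different Map/Reduce functions
energies = [("Fe", 1.2), ("Cu", 0.8), ("Fe", 1.4)]   # (element, property)
counts = ["Fe", "Cu", "Fe", "Fe"]                    # occurrence records

avg = map_reduce(energies, lambda r: [r], lambda k, vs: sum(vs) / len(vs))
num = map_reduce(counts, lambda r: [(r, 1)], lambda k, vs: sum(vs))

# Merge phase: combine the per-key outputs of the two concurrent jobs
merged = {k: (avg.get(k), num.get(k)) for k in set(avg) | set(num)}
print(merged)
```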
A large-grain mapping approach for multiprocessor systems through data flow model. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Kim, Hwa-Soo
1991-01-01
A large-grain mapping method for numerically oriented applications onto multiprocessor systems is presented. The method is based on the large-grain data flow representation of the input application, and it assumes a general interconnection topology of the multiprocessor system. The large-grain data flow model was used because such a representation best exhibits the inherent parallelism in many important applications; e.g., CFD models based on partial differential equations can be represented very effectively in large-grain data flow format. A generalized interconnection topology of the multiprocessor architecture is considered, including such architectural issues as interprocessor communication cost, with the aim of identifying the 'best match' between the application and the multiprocessor structure. The objective is to minimize the total execution time of the input algorithm running on the target system. The mapping strategy consists of the following: (1) large-grain data flow graph generation from the input application using compilation techniques; (2) data flow graph partitioning into basic computation blocks; and (3) physical mapping onto the target multiprocessor using a priority allocation scheme for the computation blocks.
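Step (3) of the strategy can be sketched as priority-based list scheduling of computation blocks onto processors. The DAG, costs, and priorities below are invented, the priority order is assumed to respect the dependencies (a topological order), and a real mapping would also weigh interprocessor communication costs.

```python
import heapq

blocks = {"A": 4, "B": 3, "C": 2, "D": 6}      # block -> compute cost (illustrative)
deps = {"C": ["A", "B"], "D": ["A"]}           # block -> prerequisite blocks
priority = {"A": 3, "B": 2, "D": 1, "C": 0}    # higher priority is scheduled first

def schedule(blocks, deps, priority, n_proc=2):
    finish, placement = {}, {}
    procs = [(0.0, p) for p in range(n_proc)]  # (next free time, processor id)
    heapq.heapify(procs)
    for b in sorted(blocks, key=lambda b: -priority[b]):
        ready = max((finish[d] for d in deps.get(b, [])), default=0.0)
        t_free, p = heapq.heappop(procs)       # earliest-available processor
        start = max(t_free, ready)
        finish[b] = start + blocks[b]
        placement[b] = (p, start, finish[b])
        heapq.heappush(procs, (finish[b], p))
    return placement

for b, (p, s, f) in schedule(blocks, deps, priority).items():
    print(f"block {b}: proc {p}, start {s}, finish {f}")
```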
ERIC Educational Resources Information Center
Mbaziira, Alex Vincent
2017-01-01
Cybercriminals are increasingly using Internet-based text messaging applications to exploit their victims. Incidents of deceptive cybercrime in text-based communication are increasing and include fraud, scams, as well as favorable and unfavorable fake reviews. In this work, we use a text-based deception detection approach to train models for…
The web system for operative description of air quality in the city
NASA Astrophysics Data System (ADS)
Barth, A. A.; Starchenko, A. V.; Fazliev, A. Z.
2009-04-01
Development and implementation of an information-computational system (ICS) is described. The system is oriented towards collective use of computational facilities to determine air quality on the basis of a photochemical model. The ICS has been implemented on top of the middleware of the ATMOS web portal [1, 2]. The data and calculation layer of this ICS includes:
- A mathematical model of pollution transport based on transport differential equations, describing propagation, scattering and chemical transformation of pollutants in the atmosphere [3]. The model may use averaged data values for the city or forecast results obtained with the CHASER model [4].
- An atmospheric boundary layer model (ABLM) [3], used for operative numerical prediction of the meteorological parameters (wind speed and direction, air humidity and temperature) required by the pollution transport model. The model may assimilate meteorological measurements (including land-based observations and remote sensing of the vertical structure of the atmosphere) or use weather forecasts obtained with the semi-Lagrangian model [5].
- Applications for data manipulation: downloading parameters of the atmospheric surface layer and remote-sensing data on the vertical structure of the atmosphere from the web sites http://meteo.infospace.ru and http://weather.uwyo.edu; uploading these data into the ICS database; and transforming the uploaded data into the internal data format of the system.
At present this ICS is part of the "Climate" web site located in the ATMOS portal [6]. The database is based on data schemes supporting the calculations in the ICS workflow. The applications that manipulate the data work in automatic regime. The workflow oriented on computation of physical parameters contains: an application for calculating geostrophic wind components on the basis of the Ekman equations; applications for solving the equations of the ABL and pollution transport models; and an application for representing calculation results in tabular and graphical forms. The "Cyberia" cluster [7] located at Tomsk State University is used for computation of the pollution transport equations.
References: [1] Gordov E.P., Lykosov V.N., Fazliev A.Z. Web portal on environmental sciences "ATMOS" // Advances in Geosciences, 2006, v. 8, p. 33-38. [2] ATMOS web portal, http://atmos.iao.ru/middleware/. [3] Belikov D.A., Starchenko A.V. Numerical investigation of secondary air pollution formation near an industrial center // Computational Technologies, 2005, v. 10, special issue (Proceedings of CITES 2005, Tomsk, 13-23 March 2005), part 2, p. 99-105. [4] Sudo K., Takahashi M., Kurokawa J., Akimoto H. CHASER: A global chemical model of the troposphere. Model description // J. Geophys. Res., 2002, v. 107(D17), p. 4339. [5] Tolstykh M.A., Fadeev R.Y. Semi-Lagrangian variable-resolution weather prediction model and its further development // Computational Technologies, 2006, v. 11, special issue, p. 176-184. [6] ATMOS "Climate" web site, http://climate.atmos.math.tsu.ru/. [7] Tomsk State University interregional computational center, http://skif.tsu.ru.
Mikell, Justin K; Klopp, Ann H; Price, Michael; Mourtada, Firas
2013-01-01
We sought to commission a gynecologic shielded colpostat analytic model provided in a treatment planning system (TPS) library, and we retrospectively report the dosimetric impact of this applicator model in a cohort of patients. A commercial TPS with a grid-based Boltzmann solver (GBBS) was commissioned for (192)Ir high-dose-rate (HDR) brachytherapy for cervical cancer with stainless steel-shielded colpostats. The colpostat analytic model was verified against a radiograph and vendor schematics. MCNPX v2.6 Monte Carlo simulations were performed to compare dose distributions around the applicator in water with the TPS GBBS dose predictions. The dosimetric impact was assessed retrospectively over 24 cervical cancer patients' HDR plans. Applicator (TPS ID #AL13122005) shield dimensions were within 0.4 mm of the independent verification of the shield dimensions. GBBS profiles in planes bisecting the cap around the applicator agreed with Monte Carlo simulations within 2% at most locations; differing screw representations resulted in differences of up to 9%. For the retrospective study, the GBBS doses differed from TG-43 as follows (mean ± standard deviation [min, max]): International Commission on Radiation Units [ICRU]rectum (-8.4 ± 2.5% [-14.1, -4.1%]), ICRUbladder (-7.2 ± 3.6% [-15.7, -2.1%]), D2cc-rectum (-6.2 ± 2.6% [-11.9, -0.8%]), D2cc-sigmoid (-5.6 ± 2.6% [-9.3, -2.0%]), and D2cc-bladder (-3.4 ± 1.9% [-7.2, -1.1%]). As brachytherapy TPSs implement advanced model-based dose calculations, the analytic applicator models stored in TPSs should be independently validated before clinical use. For this cohort, clinically meaningful differences (>5%) from TG-43 were observed. Accurate dosimetric modeling of shielded applicators may help to refine organ toxicity studies. Copyright © 2013 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
10 CFR 609.6 - Submission of Applications.
Code of Federal Regulations, 2011 CFR
2011-01-01
... checklist for the equity and debt to the extent available; (16) Applicant's business plan on which the project is based and Applicant's financial model presenting project pro forma statements for the proposed... backing, together with business and financial interests of controlling or commonly controlled...
Roy, Kunal; Mitra, Indrani
2011-07-01
Quantitative structure-activity relationships (QSARs) have important applications in drug discovery research, environmental fate modeling, property prediction, and related areas. Validation has been recognized as a very important step in QSAR model development. Because one of the important objectives of QSAR modeling is to predict the activity/property/toxicity of new chemicals falling within the domain of applicability of the developed models, and because QSARs are being used for regulatory decisions, checking the reliability of the models and the confidence of their predictions is a very important aspect, which can be judged during the validation process. One prime application of a statistically significant QSAR model is virtual screening for molecules with improved potency based on the pharmacophoric features and the descriptors appearing in the QSAR model. Validated QSAR models may also be utilized for the design of focused libraries, which may subsequently be screened for the selection of hits. The present review focuses on various metrics used for the validation of predictive QSAR models, together with an overview of the application of QSAR models in the fields of virtual screening and focused library design for diverse series of compounds, with citation of some recent examples.
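One of the standard internal-validation metrics discussed in such reviews is the leave-one-out cross-validated q², sketched below on random placeholder data; the q² > 0.5 acceptability rule of thumb quoted in the comment is a common convention, not a result of this paper.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(7)
X = rng.normal(size=(25, 3))                                  # molecular descriptors
y = X @ np.array([1.0, -0.5, 0.3]) + rng.normal(0, 0.2, 25)   # activities (synthetic)

press = 0.0                                   # predictive residual sum of squares
for train, test in LeaveOneOut().split(X):
    model = LinearRegression().fit(X[train], y[train])
    press += (y[test][0] - model.predict(X[test])[0]) ** 2

q2 = 1.0 - press / np.sum((y - y.mean()) ** 2)
print(f"q2 (LOO) = {q2:.3f}")   # q2 > 0.5 is often quoted as a minimum for acceptability
```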
[Employment and urban growth; an application of Czamanski's model to the Mexican case].
Verduzco Chavez, B
1991-01-01
The author applies the 1964 model developed by Stanislaw Czamanski, based on theories of urban growth and industrial localization, to the analysis of urban growth in Mexico. "The advantages of this model in its application as a support instrument in the process of urban planning when the information available is incomplete are...discussed...." Census data for 44 cities in Mexico are used. (SUMMARY IN ENG) excerpt
A Comparative Analysis of Computer End-User Support in the Air Force and Civilian Organizations
1991-12-01
This explanation implies a further stratification of end users based on the specific tasks they perform, a new model of application combinations, and a ... its support efforts to meet the needs of its end-user clientele more closely. [Figure 14: Test Model of Applications]
Roybal, H; Baxendale, S J; Gupta, M
1999-01-01
Activity-based costing and the theory of constraints have been applied successfully in many manufacturing organizations. Recently, those concepts have been applied in service organizations. This article describes the application of activity-based costing and the theory of constraints in a managed care mental health and substance abuse organization. One of the unique aspects of this particular application was the integration of activity-based costing and the theory of constraints to guide process improvement efforts. The article describes the activity-based costing model and the application of the focusing steps of the theory of constraints, with an emphasis on the unused capacities of activities in the organization.
Design and Application of an Ontology for Component-Based Modeling of Water Systems
NASA Astrophysics Data System (ADS)
Elag, M.; Goodall, J. L.
2012-12-01
Many Earth system modeling frameworks have adopted an approach of componentizing models so that a large model can be assembled by linking a set of smaller model components. These model components can then be more easily reused, extended, and maintained by a large group of model developers and end users. While there has been a notable increase in component-based model frameworks in the Earth sciences in recent years, there has been less work on creating framework-agnostic metadata and ontologies for model components. Well defined model component metadata is needed, however, to facilitate sharing, reuse, and interoperability both within and across Earth system modeling frameworks. To address this need, we have designed an ontology for the water resources community named the Water Resources Component (WRC) ontology in order to advance the application of component-based modeling frameworks across water related disciplines. Here we present the design of the WRC ontology and demonstrate its application for integration of model components used in watershed management. First we show how the watershed modeling system Soil and Water Assessment Tool (SWAT) can be decomposed into a set of hydrological and ecological components that adopt the Open Modeling Interface (OpenMI) standard. Then we show how the components can be used to estimate nitrogen losses from land to surface water for the Baltimore Ecosystem study area. Results of this work are (i) a demonstration of how the WRC ontology advances the conceptual integration between components of water related disciplines by handling the semantic and syntactic heterogeneity present when describing components from different disciplines and (ii) an investigation of a methodology by which large models can be decomposed into a set of model components that can be well described by populating metadata according to the WRC ontology.
Web-based three-dimensional geo-referenced visualization
NASA Astrophysics Data System (ADS)
Lin, Hui; Gong, Jianhua; Wang, Freeman
1999-12-01
This paper addresses several approaches to implementing web-based, three-dimensional (3-D), geo-referenced visualization. The discussion focuses on the relationship between multi-dimensional data sets and applications, as well as the thick/thin client and heavy/light server structure. Two models of data sets are addressed in this paper. One is the use of traditional 3-D data formats such as 3D Studio Max, Open Inventor 2.0, Vis5D and OBJ. The other is modelling with a web-based language such as VRML. Also, traditional languages such as C and C++, as well as web-based programming tools such as Java, Java3D and ActiveX, can be used for developing applications. The strengths and weaknesses of each approach are elaborated. Four practical solutions, using VRML and Java, Java and Java3D, VRML and ActiveX, and Java wrapper classes (Java and C/C++), to develop applications for web-based, real-time interactive and explorative visualization are presented.
NASA Astrophysics Data System (ADS)
Straka, Mika J.; Caldarelli, Guido; Squartini, Tiziano; Saracco, Fabio
2018-04-01
Bipartite networks provide an insightful representation of many systems, ranging from mutualistic networks of species interactions to investment networks in finance. The analyses of their topological structures have revealed the ubiquitous presence of properties which seem to characterize many—apparently different—systems. Nestedness, for example, has been observed in biological plant-pollinator as well as in country-product exportation networks. Due to the interdisciplinary character of complex networks, tools developed in one field, for example ecology, can greatly enrich other areas of research, such as economy and finance, and vice versa. With this in mind, we briefly review several entropy-based bipartite null models that have been recently proposed and discuss their application to real-world systems. The focus on these models is motivated by the fact that they show three very desirable features: analytical character, general applicability, and versatility. In this respect, entropy-based methods have been proven to perform satisfactorily both in providing benchmarks for testing evidence-based null hypotheses and in reconstructing unknown network configurations from partial information. Furthermore, entropy-based models have been successfully employed to analyze ecological as well as economic systems. As an example, the application of entropy-based null models has detected early-warning signals, both in economic and financial systems, of the 2007-2008 world crisis. Moreover, they have revealed a statistically-significant export specialization phenomenon of country export baskets in international trade, a result that seems to reconcile Ricardo's hypothesis in classical economics with recent findings on the (empirical) diversification of industrial production at the national level. Finally, these null models have shown that the information contained in the nestedness is already accounted for by the degree sequence of the corresponding graphs.
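The shared construction behind these entropy-based null models can be summarized schematically: maximize Shannon entropy subject to constraints C_i(G) (e.g., degree sequences), which yields an exponential random-graph family. This is the generic form only; the bipartite-specific derivation is in the cited literature.

```latex
P(G) = \frac{e^{-H(G,\vec{\theta})}}{Z(\vec{\theta})}, \qquad
H(G,\vec{\theta}) = \sum_i \theta_i\, C_i(G), \qquad
Z(\vec{\theta}) = \sum_{G'} e^{-H(G',\vec{\theta})}
```

The Lagrange multipliers $\theta_i$ are fixed by requiring the ensemble averages $\langle C_i \rangle$ to match the observed values, which is what makes these benchmarks analytical rather than purely numerical.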
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bacon, Diana Holford; Locke II, Randall A.; Keating, Elizabeth
The National Risk Assessment Partnership (NRAP) has developed a suite of tools to assess and manage risk at CO2 sequestration sites (1). The NRAP tool suite includes the Aquifer Impact Model (AIM), based on reduced order models (ROMs) developed using site-specific data from two aquifers (alluvium and carbonate). The models accept aquifer parameters as a range of variable inputs so that they may have broader applicability. Guidelines have been developed for determining the aquifer types for which the ROMs should be applicable. This paper considers the applicability of the aquifer models in AIM to predicting the impact of CO2 or brine leakage were it to occur at the Illinois Basin Decatur Project (IBDP). Based on the results of the sensitivity analysis, the hydraulic parameters and the leakage source term magnitude are more sensitive than clay fraction or cation exchange capacity. Sand permeability was the only hydraulic parameter measured at the IBDP site. More information on the other hydraulic parameters, such as sand fraction and sand/clay correlation lengths, could reduce uncertainty in risk estimates. Some non-adjustable parameters, such as the initial pH and TDS and the pH no-impact threshold, are significantly different for the ROM than for the observations at the IBDP site. The reduced order model could be made more useful to a wider range of sites if the initial conditions and no-impact threshold values were adjustable parameters.
Schäuble, Sascha; Stavrum, Anne-Kristin; Bockwoldt, Mathias; Puntervoll, Pål; Heiland, Ines
2017-06-24
Systems Biology Markup Language (SBML) is the standard model representation and description language in systems biology. Enriching and analysing systems biology models by integrating the multitude of available data increases the predictive power of these models. This may be a daunting task, which commonly requires bioinformatic competence and scripting. We present SBMLmod, a Python-based web application and service that automates the integration of high-throughput data into SBML models. Subsequent steady-state analysis is readily accessible via the web service COPASIWS. We illustrate the utility of SBMLmod by integrating gene expression data from different healthy tissues as well as from a cancer dataset into a previously published model of mammalian tryptophan metabolism. SBMLmod is a user-friendly platform for model modification and simulation. The web application is available at http://sbmlmod.uit.no , whereas the WSDL definition file for the web service is accessible via http://sbmlmod.uit.no/SBMLmod.wsdl . Furthermore, the entire package can be downloaded from https://github.com/MolecularBioinformatics/sbml-mod-ws .
Note: Model-based identification method of a cable-driven wearable device for arm rehabilitation
NASA Astrophysics Data System (ADS)
Cui, Xiang; Chen, Weihai; Zhang, Jianbin; Wang, Jianhua
2015-09-01
Cable-driven exoskeletons use active cables to actuate the system and are worn by subjects to provide motion assistance. However, this kind of wearable device usually contains uncertain kinematic parameters. In this paper, a model-based identification method is proposed for a cable-driven arm exoskeleton to estimate its uncertainties. The identification method is based on the linearized error model derived from the kinematics of the exoskeleton. An experiment has been conducted to demonstrate the feasibility of the proposed model-based method in practical application.
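The linearized-error-model idea can be sketched generically (this toy 2-link arm is illustrative, not the paper's device): measured end-point positions are compared with nominal-model predictions, and the first-order expansion p_meas - p(q; L0) ~= J(q) * dL, with J the Jacobian of position with respect to the uncertain parameters, yields a least-squares estimate of the parameter errors.

    # Sketch: least-squares identification from a linearized kinematic error model.
    import numpy as np

    def fk(q, L):                       # forward kinematics of a planar 2-link arm
        return np.array([L[0]*np.cos(q[0]) + L[1]*np.cos(q[0]+q[1]),
                         L[0]*np.sin(q[0]) + L[1]*np.sin(q[0]+q[1])])

    def J(q):                           # Jacobian w.r.t. the two link lengths
        return np.array([[np.cos(q[0]), np.cos(q[0]+q[1])],
                         [np.sin(q[0]), np.sin(q[0]+q[1])]])

    L_true, L0 = np.array([0.31, 0.24]), np.array([0.30, 0.25])  # actual vs nominal
    rng = np.random.default_rng(1)
    qs = rng.uniform(-np.pi/2, np.pi/2, size=(20, 2))            # sampled poses

    A = np.vstack([J(q) for q in qs])                            # stacked Jacobians
    b = np.hstack([fk(q, L_true) - fk(q, L0) for q in qs])       # stacked errors
    dL, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(L0 + dL)                      # recovered link lengths, ~= L_true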
Image-based 3D reconstruction and virtual environmental walk-through
NASA Astrophysics Data System (ADS)
Sun, Jifeng; Fang, Lixiong; Luo, Ying
2001-09-01
We present a 3D reconstruction method that combines geometry-based modeling, image-based modeling, and rendering techniques. The first component is an interactive geometry modeling method that recovers the basic geometry of the photographed scene. The second component is a model-based stereo algorithm. We discuss the image-processing problems and algorithms of walking through a virtual space, then design and implement a high-performance multi-threaded wandering algorithm. The applications range from architectural planning and archaeological reconstruction to virtual environments and cinematic special effects.
Practical Application of Model-based Programming and State-based Architecture to Space Missions
NASA Technical Reports Server (NTRS)
Horvath, Gregory; Ingham, Michel; Chung, Seung; Martin, Oliver; Williams, Brian
2006-01-01
A viewgraph presentation on developing models, from a systems engineering perspective, that accomplish mission objectives and manage the health of the system is shown. The topics include: 1) Overview; 2) Motivation; 3) Objective/Vision; 4) Approach; 5) Background: The Mission Data System; 6) Background: State-based Control Architecture System; 7) Background: State Analysis; 8) Overview of State Analysis; 9) Background: MDS Software Frameworks; 10) Background: Model-based Programming; 11) Background: Titan Model-based Executive; 12) Model-based Execution Architecture; 13) Compatibility Analysis of MDS and Titan Architectures; 14) Integrating Model-based Programming and Execution into the Architecture; 15) State Analysis and Modeling; 16) IMU Subsystem State Effects Diagram; 17) Titan Subsystem Model: IMU Health; 18) Integrating Model-based Programming and Execution into the Software IMU; 19) Testing Program; 20) Computationally Tractable State Estimation & Fault Diagnosis; 21) Diagnostic Algorithm Performance; 22) Integration and Test Issues; 23) Demonstrated Benefits; and 24) Next Steps
2009-11-12
Cloud computing types, categorized by type of capability and by access: Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), and Software-as-a-Service (SaaS). SaaS offers application-specific capabilities (e.g., a service that provides customer management) and is a model of software deployment in which a provider licenses an application to customers for use as a service.
Evaluation of theoretical and empirical water vapor sorption isotherm models for soils
NASA Astrophysics Data System (ADS)
Arthur, Emmanuel; Tuller, Markus; Moldrup, Per; de Jonge, Lis W.
2016-01-01
The mathematical characterization of water vapor sorption isotherms of soils is crucial for modeling processes such as volatilization of pesticides and diffusive and convective water vapor transport. Although numerous physically based and empirical models were previously proposed to describe sorption isotherms of building materials, food, and other industrial products, knowledge about the applicability of these functions for soils is noticeably lacking. We present an evaluation of nine models for characterizing adsorption/desorption isotherms for a water activity range from 0.03 to 0.93, based on measured data of 207 soils with widely varying textures, organic carbon contents, and clay mineralogy. In addition, the potential applicability of the models for prediction of sorption isotherms from known clay content was investigated. While, in general, all investigated models described the measured adsorption and desorption isotherms reasonably well, distinct differences were observed between physical and empirical models, as well as differences due to the differing degrees of freedom of the model equations. There were also considerable differences in model performance for adsorption and desorption data. While regression analysis relating model parameters and clay content and subsequent model application for prediction of measured isotherms showed promise for the majority of investigated soils, for soils with distinct kaolinitic and smectitic clay mineralogy the predicted isotherms did not closely match the measurements.
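As a worked illustration of fitting one isotherm function to sorption data, the sketch below fits the GAB (Guggenheim-Anderson-de Boer) equation, a commonly used sorption model (this abstract does not list the paper's nine models, so GAB is an assumption here), to synthetic data with scipy; parameter values are illustrative only.

    # Sketch: fit the GAB isotherm w(aw) = wm*C*k*aw / ((1-k*aw)*(1-k*aw+C*k*aw)).
    import numpy as np
    from scipy.optimize import curve_fit

    def gab(aw, wm, C, k):
        """Gravimetric water content as a function of water activity aw."""
        return wm * C * k * aw / ((1 - k*aw) * (1 - k*aw + C*k*aw))

    aw = np.linspace(0.03, 0.93, 25)            # the paper's water activity range
    rng = np.random.default_rng(3)
    w_obs = gab(aw, 0.05, 10.0, 0.75) + rng.normal(0, 1e-4, aw.size)

    popt, _ = curve_fit(gab, aw, w_obs, p0=[0.04, 5.0, 0.7])
    print(dict(zip(["wm", "C", "k"], popt)))    # recovered monolayer capacity etc.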
OpenMP parallelization of a gridded SWAT (SWATG)
NASA Astrophysics Data System (ADS)
Zhang, Ying; Hou, Jinliang; Cao, Yongpan; Gu, Juan; Huang, Chunlin
2017-12-01
Large-scale, long-term, and high-spatial-resolution simulation is a common issue in environmental modeling. A gridded Hydrologic Response Unit (HRU)-based Soil and Water Assessment Tool (SWATG) that integrates a grid modeling scheme with different spatial representations also presents such problems. The time-consuming computation limits applications of very-high-resolution, large-scale watershed modeling. The OpenMP (Open Multi-Processing) parallel application interface is integrated with SWATG (called SWATGP) to accelerate grid modeling at the HRU level. Such a parallel implementation takes better advantage of the computational power of a shared-memory computer system. We conducted two experiments at multiple temporal and spatial scales of hydrological modeling using SWATG and SWATGP on a high-end server. At 500-m resolution, SWATGP was found to be up to nine times faster than SWATG in modeling over a roughly 2000 km2 watershed using one CPU with a 15-thread configuration. The study results demonstrate that parallel models save considerable time relative to traditional sequential simulation runs. Parallel computation of environmental models is beneficial for model applications, especially at large spatial and temporal scales and at high resolutions. The proposed SWATGP model is thus a promising tool for large-scale and high-resolution water resources research and management, in addition to offering data fusion and model coupling ability.
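OpenMP itself is a compiler-directive API for C/C++/Fortran; purely as a language-neutral illustration of the pattern SWATGP exploits (per-HRU computations within a time step are independent, so they can be dispatched across the cores of a shared-memory machine), here is a Python sketch in which hru_step is a hypothetical stand-in for one HRU's hydrologic work.

    # Sketch of the parallel-over-HRUs pattern (an analogue, not OpenMP code).
    from concurrent.futures import ProcessPoolExecutor
    import math

    def hru_step(hru_id: int) -> float:
        # stand-in for one HRU's hydrologic computations in a time step
        return sum(math.sqrt(i + hru_id) for i in range(100_000))

    if __name__ == "__main__":
        hrus = range(5_000)
        with ProcessPoolExecutor(max_workers=15) as pool:  # cf. the 15-thread run
            runoff = list(pool.map(hru_step, hrus, chunksize=64))
        print(len(runoff))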
NASA Astrophysics Data System (ADS)
Amezquita-Brooks, Luis; Liceaga-Castro, Eduardo; Gonzalez-Sanchez, Mario; Garcia-Salazar, Octavio; Martinez-Vazquez, Daniel
2017-11-01
Applications based on quad-rotor vehicles (QRVs) are becoming increasingly widespread. Many of these applications require accurate mathematical representations for control design, simulation, and estimation. However, there is no consensus on a standardized model for these purposes. In this article, a review of the most common elements included in QRV models reported in the literature is presented. This survey shows that some elements are recurrent for typical non-aerobatic QRV applications, in particular for control design and high-performance simulation. By synthesising the common features of the reviewed models, a standard generic model (SGM) is proposed. The SGM is cast as a typical state-space model without memory-less transformations, a structure which is useful for simulation and controller design. The survey also shows that many QRV applications use simplified representations, which may be considered simplifications of the SGM proposed here. In order to assess the effectiveness of the simplified models, a comprehensive comparison based on digital simulations is presented. With this comparison, it is possible to determine the accuracy of each model under particular operating ranges. Such information is useful for the selection of a model according to a particular application. In addition to the models found in the literature, a novel simplified model is derived in this article. The main characteristics of this model are that its inner dynamics are linear, it has low complexity, and it has a high level of accuracy in all the studied operating ranges, a characteristic found only in more complex representations. To complement the article, the main elements of the SGM are evaluated with the aid of experimental data, and the computational complexity of all surveyed models is briefly analysed. Finally, the article presents a discussion on how the structural characteristics of the models are useful to suggest particular QRV control structures.
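A minimal simplified quadrotor model of the kind such surveys cover (not the proposed SGM itself) can be sketched as rigid-body translation driven by body-z thrust, Euler equations for the body rates, and the common small-angle simplification that Euler-angle rates equal body rates; all parameter values below are illustrative.

    # Sketch: simplified quadrotor state-space model with Euler integration.
    import numpy as np

    m, g = 1.0, 9.81
    J = np.diag([8e-3, 8e-3, 1.4e-2])          # inertia matrix (kg m^2), illustrative

    def step(s, u, dt):
        """s = [pos(3), vel(3), roll/pitch/yaw(3), body rates(3)]; u = [T, tau(3)]."""
        pos, vel, ang, w = s[:3], s[3:6], s[6:9], s[9:12]
        T, tau = u[0], u[1:]
        phi, th, psi = ang
        # inertial direction of the body-z axis (Z-Y-X Euler convention)
        ez = np.array([np.cos(phi)*np.sin(th)*np.cos(psi) + np.sin(phi)*np.sin(psi),
                       np.cos(phi)*np.sin(th)*np.sin(psi) - np.sin(phi)*np.cos(psi),
                       np.cos(phi)*np.cos(th)])
        acc = (T/m)*ez - np.array([0.0, 0.0, g])
        w_dot = np.linalg.solve(J, tau - np.cross(w, J @ w))
        return np.concatenate([pos + vel*dt, vel + acc*dt,
                               ang + w*dt,     # small-angle kinematic simplification
                               w + w_dot*dt])

    s = np.zeros(12)
    for _ in range(1000):                      # hover: thrust balances weight
        s = step(s, np.array([m*g, 0.0, 0.0, 0.0]), 1e-3)
    print(s[:3])                               # position stays near the origin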
XAL Application Framework and Bricks GUI Builder
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pelaia II, Tom
2007-01-01
The XAL [1] Application Framework is a framework for rapidly developing document-based Java applications with a common look and feel, along with many built-in user interface behaviors. The Bricks GUI builder consists of a modern application and framework for rapidly building user interfaces in support of true Model-View-Controller (MVC) compliant Java applications. Bricks and the XAL Application Framework allow developers to rapidly create quality applications.
Skinner, Harvey A; Maley, Oonagh; Norman, Cameron D
2006-10-01
Health education and health promotion have a tradition of using information and communication technology (ICT). In recent years, the rapid growth of the Internet has created innovative opportunities for Web-based health education and behavior change applications, termed eHealth promotion. However, many eHealth promotion applications are developed without an explicit model to guide the design, evaluation, and ongoing improvement of the program. The spiral technology action research (STAR) model was developed to address this need. The model comprises five cycles (listen, plan, do, study, act) that weave together technological development, community involvement, and continuous improvement. The model is illustrated by a case study describing the development of the Smoking Zine (www.SmokingZine.org), a youth smoking prevention and cessation Web site.
Competency-Based Curriculum Development: A Pragmatic Approach
ERIC Educational Resources Information Center
Broski, David; And Others
1977-01-01
Examines the concept of competency-based education, describes an experience-based model for its development, and discusses some empirically derived rules-of-thumb for its application in allied health. (HD)
Tech Prep SCANS Lesson Development. Region 10 Tech Prep.
ERIC Educational Resources Information Center
Region 10 Tech Prep Consortium, Bloomington, IN.
This document contains 50 applications-based lessons developed during the 1993-94 school year as part of the Indiana Region 10 Tech Prep Project. The lessons were developed by 91 secondary and postsecondary educators and are modeled around the SCANS (Secretary's Commission on Achieving Necessary Skills) competencies. The applications-based lessons…
Development of a real time activity monitoring Android application utilizing SmartStep.
Hegde, Nagaraj; Melanson, Edward; Sazonov, Edward
2016-08-01
Footwear-based activity monitoring systems are becoming popular in academic research as well as consumer industry segments. In our previous work, we presented the developmental aspects of an insole-based activity and gait monitoring system, SmartStep, which is a socially acceptable, fully wireless, and versatile insole. The present work describes the development of an Android application that captures the SmartStep data wirelessly over Bluetooth Low Energy (BLE), computes features on the received data, runs activity classification algorithms, and provides real-time feedback. The development of the activity classification methods was based on data from a human study involving four participants. Participants were asked to perform activities of sitting, standing, walking, and cycling while they wore the SmartStep insole system. Multinomial Logistic Discrimination (MLD) was utilized in the development of the machine learning model for activity prediction. The resulting classification model was implemented on an Android smartphone. The Android application was benchmarked for power consumption and CPU loading. Leave-one-out cross validation resulted in an average accuracy of 96.9% during the model training phase. The Android application for real-time activity classification was tested on a human subject wearing SmartStep, resulting in a testing accuracy of 95.4%.
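The classification stage can be sketched with synthetic features standing in for the SmartStep data (the feature values, group labels, and class separations below are illustrative, not the study's data): a multinomial logistic model distinguishing four activities, evaluated with grouped cross-validation in the spirit of leave-one-out.

    # Sketch: multinomial logistic discrimination for 4-class activity recognition.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score, GroupKFold

    rng = np.random.default_rng(0)
    centers = rng.normal(0, 3, size=(4, 6))              # 4 activities, 6 features
    X = np.vstack([c + rng.normal(0, 1, (200, 6)) for c in centers])
    y = np.repeat(np.arange(4), 200)                     # sit/stand/walk/cycle labels
    groups = np.tile(np.arange(4), 200)                  # pseudo recording sessions

    clf = LogisticRegression(multi_class="multinomial", max_iter=1000)
    scores = cross_val_score(clf, X, y, groups=groups, cv=GroupKFold(n_splits=4))
    print(scores.mean())                                 # held-out accuracy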
V and V Efforts of Auroral Precipitation Models: Preliminary Results
NASA Technical Reports Server (NTRS)
Zheng, Yihua; Kuznetsova, Masha; Rastaetter, Lutz; Hesse, Michael
2011-01-01
Auroral precipitation models have been valuable both in terms of space weather applications and space science research. Yet very limited testing has been performed regarding model performance. A variety of auroral models are available, including empirical models that are parameterized by geomagnetic indices or upstream solar wind conditions, nowcasting models that are based on satellite observations, and those derived from physics-based, coupled global models. In this presentation, we will show our preliminary results regarding V&V efforts for some of these models.
An Application of Artificial Intelligence to the Implementation of Electronic Commerce
NASA Astrophysics Data System (ADS)
Srivastava, Anoop Kumar
In this paper, we present an application of Artificial Intelligence (AI) to the implementation of Electronic Commerce. We provide a framework based on multiple autonomous agents. Our agent-based architecture leads to the flexible design of a spectrum of multiagent systems (MAS) by distributing computation and by providing a unified interface to data and programs. Autonomous agents are intelligent and provide autonomy, simplicity of communication and computation, and well-developed semantics. The steps of design and implementation are discussed in depth; the structure of the Electronic Marketplace, an ontology, the agent model, and the interaction patterns between agents are given. We have developed mechanisms for coordination between agents using a language called the Virtual Enterprise Modeling Language (VEML). VEML is an integration of Java and the Knowledge Query and Manipulation Language (KQML). VEML provides application programmers with the potential to develop different kinds of MAS globally, based on their requirements and applications. We have implemented a multi-autonomous-agent system called the VE System. We demonstrate the efficacy of our system by discussing experimental results and its salient features.
Arar, Nedal; Knight, Sara J; Modell, Stephen M; Issa, Amalia M
2011-03-01
The main mission of the Genomic Applications in Practice and Prevention Network™ is to advance collaborative efforts involving partners from across the public health sector to realize the promise of genomics in healthcare and disease prevention. We introduce a new framework that supports the Genomic Applications in Practice and Prevention Network mission and leverages the characteristics of the complex adaptive systems approach. We call this framework the Genome-based Knowledge Management in Cycles model (G-KNOMIC). G-KNOMIC proposes that the collaborative work of multidisciplinary teams utilizing genome-based applications will enhance the translation of evidence-based genomic findings by creating ongoing knowledge management cycles. Each cycle consists of knowledge synthesis, knowledge evaluation, knowledge implementation, and knowledge utilization. Our framework acknowledges that all the elements in the knowledge translation process are interconnected and continuously changing. It also recognizes the importance of feedback loops and the ability of teams to self-organize within a dynamic system. We demonstrate how this framework can be used to improve the adoption of genomic technologies into practice using two case studies of genomic uptake.
Applicability of western chemical dietary exposure models to the Chinese population.
Zhao, Shizhen; Price, Oliver; Liu, Zhengtao; Jones, Kevin C; Sweetman, Andrew J
2015-07-01
A range of exposure models, which have been developed in Europe and North America, are playing an increasingly important role in priority setting and the risk assessment of chemicals. However, the applicability of these tools, which are based on Western dietary exposure pathways, for estimating chemical exposure of the Chinese population, in support of the development of a risk-based environment and exposure assessment, is unclear. Three frequently used modelling tools, EUSES, RAIDAR and ACC-HUMANsteady, have been evaluated in terms of human dietary exposure estimation by application to a range of chemicals with different physicochemical properties under both model default and Chinese dietary scenarios. Hence, the modelling approaches were assessed by considering dietary pattern differences only. The predicted dietary exposure pathways were compared under both scenarios using a range of hypothetical and current emerging contaminants. Although the differences across models are greater than those between dietary scenarios, model predictions indicated that dietary preference can have a significant impact on human exposure, with the relatively high consumption of vegetables and cereals resulting in higher exposure via plant-based foodstuffs under Chinese consumption patterns compared to Western diets. The selected models demonstrated a good ability to identify key dietary exposure pathways, which can be used for screening purposes and an evaluative risk assessment. However, some model adaptations will be required to cover a number of important Chinese exposure pathways, such as freshwater farmed fish, grains, and pork.
ERIC Educational Resources Information Center
Choi, Jeong Hoon; Meisenheimer, Jessica M.; McCart, Amy B.; Sailor, Wayne
2017-01-01
The present investigation examines the schoolwide applications model (SAM) as a potentially effective school reform model for increasing equity-based inclusive education practices while enhancing student reading and math achievement for all students. A 3-year quasi-experimental comparison group analysis using latent growth modeling (LGM) was used…
Douglas Allen; William Dietrich; Peter Baker; Frank Ligon; Bruce Orr
2007-01-01
We describe a mechanistically based stream model, BasinTemp, which assumes that direct shortwave radiation, moderated by riparian and topographic shading, controls stream temperatures during the hottest part of the year. The model was developed to support a temperature TMDL for the South Fork Eel basin in Northern California and couples a GIS and a 1-D energy balance...
Steen Magnussen; Ronald E. McRoberts; Erkki O. Tomppo
2009-01-01
New model-based estimators of the uncertainty of pixel-level and areal k-nearest neighbour (knn) predictions of attribute Y from remotely-sensed ancillary data X are presented. Non-parametric functions predict Y from scalar 'Single Index Model' transformations of X. Variance functions generated...
The theory and programming of statistical tests for evaluating the Real-Time Air-Quality Model (RAM) using the Regional Air Pollution Study (RAPS) data base are fully documented in four volumes. Moreover, the tests are generally applicable to other model evaluation problems. Volu...
KINEROS2-AGWA: Model Use, Calibration, and Validation
NASA Technical Reports Server (NTRS)
Goodrich, D. C.; Burns, I. S.; Unkrich, C. L.; Semmens, D. J.; Guertin, D. P.; Hernandez, M.; Yatheendradas, S.; Kennedy, J. R.; Levick, L. R.
2013-01-01
KINEROS (KINematic runoff and EROSion) originated in the 1960s as a distributed event-based model that conceptualizes a watershed as a cascade of overland flow model elements that flow into trapezoidal channel model elements. KINEROS was one of the first widely available watershed models that interactively coupled a finite difference approximation of the kinematic overland flow equations to a physically based infiltration model. Development and improvement of KINEROS continued from the 1960s on a variety of projects for a range of purposes, which has resulted in a suite of KINEROS-based modeling tools. This article focuses on KINEROS2 (K2), a spatially distributed, event-based watershed rainfall-runoff and erosion model, and the companion ArcGIS-based Automated Geospatial Watershed Assessment (AGWA) tool. AGWA automates the time-consuming tasks of watershed delineation into distributed model elements and initial parameterization of these elements using commonly available, national GIS data layers. A variety of approaches have been used to calibrate and validate K2 successfully across a relatively broad range of applications (e.g., urbanization, pre- and post-fire, hillslope erosion, erosion from roads, runoff and recharge, and manure transport). The case studies presented in this article (1) compare lumped to stepwise calibration and validation of runoff and sediment at plot, hillslope, and small watershed scales; and (2) demonstrate an uncalibrated application to address relative change in watershed response to wildfire.
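The kinematic overland-flow core of such models can be illustrated with a toy explicit upwind finite-difference scheme (this is illustrative, not KINEROS code): dh/dt + dq/dx = r - f with q = alpha*h**m, here with constant rainfall r and a constant infiltration rate f standing in for KINEROS's physically based infiltration model.

    # Sketch: 1-D kinematic-wave overland flow on a plane, explicit upwind scheme.
    import numpy as np

    L, n = 100.0, 200                 # plane length (m), number of cells
    dx, dt = L/n, 0.05                # grid spacing (m), time step (s)
    alpha, m = 2.0, 5.0/3.0           # kinematic-wave parameters (illustrative)
    r, f = 50/3.6e6, 20/3.6e6         # rainfall, infiltration (mm/h -> m/s)

    h = np.zeros(n)                   # flow depth in each cell
    for _ in range(int(1800/dt)):     # 30 minutes of simulated time
        q = alpha * h**m
        dqdx = np.diff(np.concatenate(([0.0], q))) / dx   # zero inflow upstream
        h = np.maximum(h + dt*(r - f - dqdx), 0.0)

    print(f"outlet discharge: {alpha*h[-1]**m:.2e} m^2/s per unit width")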
Probabilistic inversion with graph cuts: Application to the Boise Hydrogeophysical Research Site
NASA Astrophysics Data System (ADS)
Pirot, Guillaume; Linde, Niklas; Mariethoz, Grégoire; Bradford, John H.
2017-02-01
Inversion methods that build on multiple-point statistics tools offer the possibility to obtain model realizations that are not only in agreement with field data, but also with conceptual geological models that are represented by training images. A recent inversion approach based on patch-based geostatistical resimulation using graph cuts outperforms state-of-the-art multiple-point statistics methods when applied to synthetic inversion examples featuring continuous and discontinuous property fields. Applications of multiple-point statistics tools to field data are challenging due to inevitable discrepancies between actual subsurface structure and the assumptions made in deriving the training image. We introduce several amendments to the original graph cut inversion algorithm and present a first-ever field application by addressing porosity estimation at the Boise Hydrogeophysical Research Site, Boise, Idaho. We consider both a classical multi-Gaussian and an outcrop-based prior model (training image) that are in agreement with available porosity data. When conditioning to available crosshole ground-penetrating radar data using Markov chain Monte Carlo, we find that the posterior realizations overall honor both the characteristics of the prior models and the geophysical data. The porosity field is inverted jointly with the measurement error and the petrophysical parameters that link dielectric permittivity to porosity. Even though the multi-Gaussian prior model leads to posterior realizations with higher likelihoods, the outcrop-based prior model shows better convergence. In addition, it offers geologically more realistic posterior realizations and it better preserves the full porosity range of the prior.
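The sampling idea, stripped of the graph-cut resimulation machinery, can be shown in a toy "extended Metropolis" sketch (all quantities below are synthetic stand-ins): proposals perturb part of the model by redrawing it from the prior, and acceptance uses only the data-likelihood ratio, so accepted realizations remain prior-consistent.

    # Toy sketch of prior-sampling MCMC with likelihood-ratio acceptance.
    import numpy as np

    rng = np.random.default_rng(7)
    m_true = rng.normal(0, 1, 50)                 # "porosity" field, prior N(0,1)
    d_obs = m_true + rng.normal(0, 0.2, 50)       # noisy "geophysical" data

    def loglik(m):
        return -0.5 * np.sum((d_obs - m)**2) / 0.2**2

    m = rng.normal(0, 1, 50)                      # start from a prior draw
    ll = loglik(m)
    for _ in range(20000):
        prop = m.copy()
        idx = rng.integers(0, 50, size=5)         # resimulate a small patch...
        prop[idx] = rng.normal(0, 1, 5)           # ...from the prior
        ll_prop = loglik(prop)
        if np.log(rng.random()) < ll_prop - ll:   # likelihood-ratio acceptance
            m, ll = prop, ll_prop
    print(np.corrcoef(m, m_true)[0, 1])           # posterior draw tracks the truth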
Meteorological Processes Affecting Air Quality – Research and Model Development Needs
Meteorology modeling is an important component of air quality modeling systems that defines the physical and dynamical environment for atmospheric chemistry. The meteorology models used for air quality applications are based on numerical weather prediction models that were devel...
NASA Astrophysics Data System (ADS)
Quartel, Dick; Steen, Maarten W. A.; Lankhorst, Marc M.
2012-05-01
This article describes an architecture-based approach to IT valuation. This approach offers organisations an instrument to valuate their application and project portfolios and to make well-balanced decisions about IT investments. The value of a software application is assessed in terms of its contribution to a selection of business goals. Based on such assessments, the value of different applications can be compared, and requirements for innovation, development, maintenance and phasing out can be identified. IT projects are proposed to realise the requirements. The value of each project is assessed in terms of the value it adds to one or more applications. This value can be obtained by relating the 'as-is' application portfolio to the 'to-be' portfolio that is being proposed by the project portfolio. In this way, projects can be ranked according to their added value, given a certain selection of business goals. The approach uses ArchiMate to model the relationship between software applications, business processes, services and products. In addition, two language extensions are used to model the relationship of these elements to business goals and requirements and to projects and project portfolios. The approach is illustrated using the portfolio method of Bedell and has been implemented in BiZZdesign Architect.
Brief introductory guide to agent-based modeling and an illustration from urban health research.
Auchincloss, Amy H; Garcia, Leandro Martin Totaro
2015-11-01
There is growing interest among urban health researchers in addressing complex problems using conceptual and computational models from the field of complex systems. Agent-based modeling (ABM) is one computational modeling tool that has received a lot of interest. However, many researchers remain unfamiliar with developing and carrying out an ABM, hindering the understanding and application of it. This paper first presents a brief introductory guide to carrying out a simple agent-based model. Then, the method is illustrated by discussing a previously developed agent-based model, which explored inequalities in diet in the context of urban residential segregation.
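A simple agent-based model in the spirit of such guides can be written in a few lines; the Schelling-style residential dynamics below (agents relocate when too few neighbors share their group) is a generic illustration, not the diet model from the paper.

    # Minimal ABM sketch: Schelling-style relocation on a grid.
    import numpy as np

    rng = np.random.default_rng(0)
    grid = rng.choice([0, 1, 2], size=(50, 50), p=[0.2, 0.4, 0.4])  # 0 = empty cell

    def unhappy(g, i, j, threshold=0.5):
        block = g[max(i-1, 0):i+2, max(j-1, 0):j+2]
        same = np.sum(block == g[i, j]) - 1          # like neighbors (minus self)
        occupied = np.sum(block != 0) - 1            # occupied neighbors
        return occupied > 0 and same / occupied < threshold

    for _ in range(30):                              # simulation rounds
        movers = [(i, j) for i in range(50) for j in range(50)
                  if grid[i, j] != 0 and unhappy(grid, i, j)]
        empties = list(zip(*np.where(grid == 0)))
        rng.shuffle(movers)
        for (i, j), (ei, ej) in zip(movers, rng.permutation(empties)):
            grid[ei, ej], grid[i, j] = grid[i, j], 0  # relocate to an empty cell
    print(len(movers))                               # movers shrink as clusters emerge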
Verification of Autonomous Systems for Space Applications
NASA Technical Reports Server (NTRS)
Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.
2006-01-01
Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of the behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.
Study on Dissemination Patterns in Location-Aware Gossiping Networks
NASA Astrophysics Data System (ADS)
Kami, Nobuharu; Baba, Teruyuki; Yoshikawa, Takashi; Morikawa, Hiroyuki
We study the properties of information dissemination over location-aware gossiping networks leveraging location-based real-time communication applications. Gossiping is a promising method for quickly disseminating messages in a large-scale system, but in its application to information dissemination for location-aware applications, it is important to consider the network topology and patterns of spatial dissemination over the network in order to achieve effective delivery of messages to potentially interested users. To this end, we propose a continuous-space network model extended from Kleinberg's small-world model applicable to actual location-based applications. Analytical and simulation-based study shows that the proposed network achieves high dissemination efficiency resulting from geographically neutral dissemination patterns as well as selective dissemination to proximate users. We have designed a highly scalable location management method capable of promptly updating the network topology in response to node movement and have implemented a distributed simulator to perform dynamic target pursuit experiments as one example of applications that are the most sensitive to message forwarding delay. The experimental results show that the proposed network surpasses other types of networks in pursuit efficiency and achieves the desirable dissemination patterns.
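A toy version of gossiping over a location-aware topology can be sketched as follows (an illustration, not the authors' model): nodes on a ring add one long-range contact chosen with probability proportional to d**(-alpha), Kleinberg-style, and a rumor then spreads by each informed node forwarding to one random contact per round.

    # Sketch: rumor spreading on a ring with distance-biased long-range links.
    import numpy as np

    rng = np.random.default_rng(0)
    n, alpha = 500, 1.0
    contacts = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}  # ring neighbours
    for i in range(n):
        dist = np.minimum((np.arange(n) - i) % n, (i - np.arange(n)) % n).astype(float)
        dist[i] = 1.0                        # placeholder to avoid division by zero
        p = dist ** -alpha
        p[i] = 0.0                           # no self-link
        contacts[i].append(int(rng.choice(n, p=p / p.sum())))    # long-range contact

    informed, rounds = {0}, 0
    while len(informed) < n and rounds < 200:
        for i in list(informed):             # each informed node pushes to one contact
            informed.add(contacts[i][rng.integers(len(contacts[i]))])
        rounds += 1
    print(rounds, len(informed))             # full coverage after relatively few rounds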
NASA Astrophysics Data System (ADS)
Shiri, Jalal
2018-06-01
Among different reference evapotranspiration (ETo) modeling approaches, mass transfer-based methods have been less studied. These approaches utilize temperature and wind speed records. On the other hand, the empirical equations proposed in this context generally produce weak simulations, except when a local calibration is used to improve their performance. This can be a crucial drawback for those equations when local data for the calibration procedure are scarce. Application of heuristic methods can therefore be considered as a substitute for improving the performance accuracy of the mass transfer-based approaches. However, given that wind speed records usually have higher variation magnitudes than the other meteorological parameters, coupling a wavelet transform with the heuristic models becomes necessary. In the present paper, a coupled wavelet-random forest (WRF) methodology is proposed for the first time to improve the performance accuracy of the mass transfer-based ETo estimation approaches, using cross-validation data management scenarios at both local and cross-station scales. The obtained results revealed that the new coupled WRF model (with minimum scatter index values of 0.150 and 0.192 for local and external applications, respectively) improved the performance accuracy of the single RF models as well as the empirical equations to a great extent.
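The coupling pattern can be sketched with synthetic series (not the station data): each input is decomposed into wavelet sub-series with PyWavelets, and the sub-series become the features of a random forest regressor.

    # Sketch: wavelet sub-series as random-forest features (synthetic ETo example).
    import numpy as np
    import pywt
    from sklearn.ensemble import RandomForestRegressor

    def wavelet_features(x, wavelet="db4", level=2):
        coeffs = pywt.wavedec(x, wavelet, level=level)
        comps = []
        for i in range(len(coeffs)):         # reconstruct one sub-series per level
            keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
            comps.append(pywt.waverec(keep, wavelet)[: len(x)])
        return np.column_stack(comps)

    rng = np.random.default_rng(0)
    t = np.arange(2000)
    temp = 15 + 10*np.sin(2*np.pi*t/365) + rng.normal(0, 1, t.size)
    wind = np.abs(2 + rng.normal(0, 1.5, t.size))            # high-variability input
    eto = 0.3*temp + 0.8*wind + rng.normal(0, 0.5, t.size)   # synthetic target

    X = np.hstack([wavelet_features(temp), wavelet_features(wind)])
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X[:1500], eto[:1500])
    print(model.score(X[1500:], eto[1500:]))                 # out-of-sample R^2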
NASA Astrophysics Data System (ADS)
Nemirsky, Kristofer Kevin
In this thesis, the history and evolution of rotor aircraft with simulated annealing-based PID application are reviewed, and quadcopter dynamics are presented. The dynamics of a quadcopter are then modeled, analyzed, and linearized. A cascaded-loop architecture with PID controllers is used to stabilize the plant dynamics, which is improved upon through the application of simulated annealing (SA). A Simulink model was developed to test the controllers and verify the functionality of the proposed control system design. In addition, the data that the Simulink model provided were compared with flight data to demonstrate the validity of the derived dynamics as a proper mathematical model representing the true dynamics of the quadcopter system. Then, an SA-based global optimization procedure was applied to obtain optimized PID parameters. It was observed that the gains tuned by the SA algorithm produced a better performing PID controller than the original manually tuned one. Next, we investigated the uncertain dynamics of the quadcopter setup. After adding uncertainty to the gyroscopic effects associated with pitch-and-roll rate dynamics, the controllers were shown to be robust against the added uncertainty. A discussion follows to summarize the SA-based PID controller design and performance outcomes. Lastly, future work on SA application to multi-input-multi-output (MIMO) systems is briefly discussed.
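The SA-tuning loop can be illustrated on a toy second-order plant (not the quadcopter model; plant parameters, perturbation size, and cooling rate below are illustrative): simulated annealing perturbs (Kp, Ki, Kd), scores each candidate by the integral of squared step-response error, and accepts worse gains with probability exp(-delta/T) to escape local minima.

    # Sketch: simulated-annealing PID tuning on a toy underdamped plant.
    import math, random

    def step_response_cost(gains, dt=0.01, T_end=5.0):
        Kp, Ki, Kd = gains
        x = v = integ = cost = 0.0
        e_prev = 1.0
        for _ in range(int(T_end/dt)):
            e = 1.0 - x                               # unit step reference
            integ += e*dt
            u = Kp*e + Ki*integ + Kd*(e - e_prev)/dt
            e_prev = e
            v += dt*(u - 2*0.2*2.0*v - 2.0**2 * x)    # wn = 2 rad/s, zeta = 0.2
            x += dt*v
            cost += e*e*dt                            # integral of squared error
        return cost

    random.seed(1)
    gains = [1.0, 0.5, 0.1]
    cost, temp = step_response_cost(gains), 1.0
    for _ in range(2000):
        cand = [max(0.0, g + random.gauss(0, 0.2)) for g in gains]
        c = step_response_cost(cand)
        if c < cost or random.random() < math.exp((cost - c)/temp):
            gains, cost = cand, c                     # accept (possibly worse) gains
        temp *= 0.998                                 # geometric cooling schedule
    print(gains, cost)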
Petascale Simulation Initiative Tech Base: FY2007 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
May, J; Chen, R; Jefferson, D
The Petascale Simulation Initiative began as an LDRD project in the middle of Fiscal Year 2004. The goal of the project was to develop techniques to allow large-scale scientific simulation applications to better exploit the massive parallelism that will come with computers running at petaflops per second. One of the major products of this work was the design and prototype implementation of a programming model and a runtime system that lets applications extend data-parallel applications to use task parallelism. By adopting task parallelism, applications can use processing resources more flexibly, exploit multiple forms of parallelism, and support more sophisticated multiscale and multiphysics models. Our programming model was originally called the Symponents Architecture but is now known as Cooperative Parallelism, and the runtime software that supports it is called Coop. (However, we sometimes refer to the programming model as Coop for brevity.) We have documented the programming model and runtime system in a submitted conference paper [1]. This report focuses on the specific accomplishments of the Cooperative Parallelism project (as we now call it) under Tech Base funding in FY2007. Development and implementation of the model under LDRD funding alone proceeded to the point of demonstrating a large-scale materials modeling application using Coop on more than 1300 processors by the end of FY2006. Beginning in FY2007, the project received funding from both LDRD and the Computation Directorate Tech Base program. Later in the year, after the three-year term of the LDRD funding ended, the ASC program supported the project with additional funds. The goal of the Tech Base effort was to bring Coop from a prototype to a production-ready system that a variety of LLNL users could work with. Specifically, the major tasks that we planned for the project were: (1) Port SARS [former name of the Coop runtime system] to another LLNL platform, probably Thunder or Peloton (depending on when Peloton becomes available); (2) Improve SARS's robustness and ease-of-use, and develop user documentation; and (3) Work with LLNL code teams to help them determine how Symponents could benefit their applications. The original funding request was $296,000 for the year, and we eventually received $252,000. The remainder of this report describes our efforts and accomplishments for each of the goals listed above.
Theory of quantized systems: formal basis for DEVS/HLA distributed simulation environment
NASA Astrophysics Data System (ADS)
Zeigler, Bernard P.; Lee, J. S.
1998-08-01
In the context of a DARPA ASTT project, we are developing an HLA-compliant distributed simulation environment based on the DEVS formalism. This environment will provide a user-friendly, high-level tool-set for developing interoperable discrete and continuous simulation models. One application is the study of contract-based predictive filtering. This paper presents a new approach to predictive filtering based on a process called 'quantization' to reduce state update transmission. Quantization, which generates state updates only at quantum level crossings, abstracts a sender model into a DEVS representation. This affords an alternative, efficient approach to embedding continuous models within distributed discrete event simulations. Applications of quantization to message traffic reduction are discussed. The theory has been validated by DEVSJAVA simulations of test cases. It will be subject to further testing in actual distributed simulations using the DEVS/HLA modeling and simulation environment.
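The message-reduction effect of quantization is easy to demonstrate in a toy setting (illustrative only, not the DEVS/HLA code): integrating dx/dt = -x + sin(t) densely, a receiver is sent an update only when the state crosses a quantum boundary rather than at every time step.

    # Toy illustration: updates at quantum crossings vs. per-step updates.
    import math

    dt, q = 0.001, 0.05                 # fine integration step, quantum size
    x, t = 1.0, 0.0
    last_level = round(x / q)
    updates, steps = 0, 0
    while t < 20.0:
        x += dt * (-x + math.sin(t))    # dense internal integration
        t += dt
        steps += 1
        level = round(x / q)
        if level != last_level:         # quantum crossing -> emit a state update
            last_level = level
            updates += 1
    print(steps, updates)               # 20000 steps vs. far fewer emitted updates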
Information driving force and its application in agent-based modeling
NASA Astrophysics Data System (ADS)
Chen, Ting-Ting; Zheng, Bo; Li, Yan; Jiang, Xiong-Fei
2018-04-01
Exploring the scientific impact of online big data has attracted much attention from researchers in different fields in recent years. Complex financial systems are typical open systems profoundly influenced by external information. Based on large-scale data from the public media and stock markets, we first define an information driving force and analyze how it affects the complex financial system. The information driving force is observed to be asymmetric in the bull and bear market states. As an application, we then propose an agent-based model driven by the information driving force. Notably, all the key parameters are determined from the empirical analysis rather than from statistical fitting of the simulation results. With our model, both the stationary properties and non-stationary dynamic behaviors are simulated. Considering the mean-field effect of the external information, we also propose a few-body model to simulate the financial market in the laboratory.
Accuracy assessment for a multi-parameter optical calliper in on line automotive applications
NASA Astrophysics Data System (ADS)
D'Emilia, G.; Di Gasbarro, D.; Gaspari, A.; Natale, E.
2017-08-01
In this work, a methodological approach based on the evaluation of measurement uncertainty is applied to an experimental test case from the automotive sector. The uncertainty model for different measurement procedures of a high-accuracy optical gauge is discussed in order to identify the best measuring performance of the system for on-line applications, as measurement requirements become more stringent. In particular, with reference to the industrial production and control strategies of high-performing turbochargers, two uncertainty models to be used with the optical calliper are proposed, discussed, and compared. The models are based on an integrated approach between measurement methods and production best practices to emphasize their mutual coherence. The paper shows the possible advantages deriving from measurement uncertainty modelling, in order to keep the uncertainty propagation under control for all the indirect measurements useful for production statistical control, on which further improvements can be based.
NASA Astrophysics Data System (ADS)
McMillan, Mitchell; Hu, Zhiyong
2017-10-01
Streambank erosion is a major source of fluvial sediment, but few large-scale, spatially distributed models exist to quantify streambank erosion rates. We introduce a spatially distributed model for streambank erosion applicable to sinuous, single-thread channels. We argue that such a model can adequately characterize streambank erosion rates, measured at the outsides of bends over a 2-year time period, throughout a large region. The model is based on the widely used excess-velocity equation and comprises three components: a physics-based hydrodynamic model, a large-scale 1-dimensional model of average monthly discharge, and an empirical bank erodibility parameterization. The hydrodynamic submodel requires inputs of channel centerline, slope, width, depth, friction factor, and a scour factor A; the large-scale watershed submodel utilizes watershed-averaged monthly outputs of the Noah-2.8 land surface model; bank erodibility is based on tree cover and bank height as proxies for root density. The model was calibrated with erosion rates measured in sand-bed streams throughout the northern Gulf of Mexico coastal plain. The calibrated model outperforms a purely empirical model, as well as a model based only on excess velocity, illustrating the utility of combining a physics-based hydrodynamic model with an empirical bank erodibility relationship. The model could be improved by incorporating spatial variability in channel roughness and the hydrodynamic scour factor, which are here assumed constant. A reach-scale application of the model is illustrated on ∼1 km of a medium-sized, mixed forest-pasture stream, where the model identifies streambank erosion hotspots on forested and non-forested bends.
New V and V Tools for Diagnostic Modeling Environment (DME)
NASA Technical Reports Server (NTRS)
Pecheur, Charles; Nelson, Stacy; Merriam, Marshall (Technical Monitor)
2002-01-01
The purpose of this report is to provide correctness and reliability criteria for verification and validation (V&V) of Second Generation Reusable Launch Vehicle (RLV) Diagnostic Modeling Environment, describe current NASA Ames Research Center tools for V&V of Model Based Reasoning systems, and discuss the applicability of Advanced V&V to DME. This report is divided into the following three sections: (1) correctness and reliability criteria; (2) tools for V&V of Model Based Reasoning; and (3) advanced V&V applicable to DME. The Executive Summary includes an overview of the main points from each section. Supporting details, diagrams, figures, and other information are included in subsequent sections. A glossary, acronym list, appendices, and references are included at the end of this report.
NASA Astrophysics Data System (ADS)
Voldoire, Aurore; Decharme, Bertrand; Pianezze, Joris; Lebeaupin Brossier, Cindy; Sevault, Florence; Seyfried, Léo; Garnier, Valérie; Bielli, Soline; Valcke, Sophie; Alias, Antoinette; Accensi, Mickael; Ardhuin, Fabrice; Bouin, Marie-Noëlle; Ducrocq, Véronique; Faroux, Stéphanie; Giordani, Hervé; Léger, Fabien; Marsaleix, Patrick; Rainaud, Romain; Redelsperger, Jean-Luc; Richard, Evelyne; Riette, Sébastien
2017-11-01
This study presents the principles of the new coupling interface based on the SURFEX multi-surface model and the OASIS3-MCT coupler. As SURFEX can be plugged into several atmospheric models, it can be used in a wide range of applications, from global and regional coupled climate systems to high-resolution numerical weather prediction systems or very fine-scale models dedicated to process studies. The objective of this development is to build and share a common structure for the atmosphere-surface coupling of all these applications, involving on the one hand atmospheric models and on the other hand ocean, ice, hydrology, and wave models. The numerical and physical principles of SURFEX interface between the different component models are described, and the different coupled systems in which the SURFEX OASIS3-MCT-based coupling interface is already implemented are presented.
Genetic demographic networks: Mathematical model and applications.
Kimmel, Marek; Wojdyła, Tomasz
2016-10-01
Recent improvement in the quality of genetic data obtained from extinct human populations and their ancestors encourages searching for answers to basic questions regarding human population history. The most common and successful are model-based approaches, in which genetic data are compared to data obtained from an assumed demography model. Using such an approach, it is possible to either validate or adjust the assumed demography. Model fit to data can be obtained based on reverse-time coalescent simulations or forward-time simulations. In this paper we introduce a computational method, based on a mathematical equation, that allows obtaining joint distributions of pairs of individuals under a specified demography model, each of them characterized by a genetic variant at a chosen locus. The two individuals are randomly sampled from either the same or two different populations. The model assumes three types of demographic events (split, merge, and migration). Populations evolve according to the time-continuous Moran model with drift and Markov-process mutation. This latter process is described by the Lyapunov-type equation introduced by O'Brien and generalized in our previous works. Application of this equation constitutes an original contribution. In the results section of the paper we present sample applications of our model to both simulated and literature-based demographies. Among others, we include a study of the Slavs-Balts-Finns genetic relationship, in which we model splits and migrations between the Balts and Slavs. We also include another example that involves the migration rates between farmers and hunter-gatherers, based on modern and ancient DNA samples. This latter process was previously studied using coalescent simulations. Our results are in general agreement with the previous method, which provides validation of our approach. Although our model is not an alternative to simulation methods in the practical sense, it provides an algorithm to compute pairwise distributions of alleles, in the case of haploid non-recombining loci such as mitochondrial and Y-chromosome loci in humans.
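For readers unfamiliar with the underlying process, a minimal forward simulation of a Moran model with mutation in a single population (illustrative only, not the paper's equation-based method) looks as follows; at each event one uniformly chosen individual reproduces, the offspring may mutate, and one uniformly chosen individual dies.

    # Sketch: two-allele Moran model with symmetric mutation, one population.
    import numpy as np

    rng = np.random.default_rng(0)
    N, u, steps = 200, 1e-3, 200_000    # population size, mutation prob, events
    k = N // 2                          # current count of allele "A"

    freqs = []
    for _ in range(steps):
        birth_A = rng.random() < k / N              # parent chosen uniformly
        if rng.random() < u:                        # mutation flips the offspring
            birth_A = not birth_A
        death_A = rng.random() < k / N              # uniform death
        k += int(birth_A) - int(death_A)
        k = min(max(k, 0), N)
        freqs.append(k / N)
    print(np.mean(freqs), np.std(freqs))            # drift-mutation balance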
A basis for solid modeling of gear teeth with application in design and manufacture
NASA Technical Reports Server (NTRS)
Huston, Ronald L.; Mavriplis, Dimitrios; Oswald, Fred B.; Liu, Yung Sheng
1992-01-01
A new approach to modeling gear tooth surfaces is discussed. A computer graphics solid modeling procedure is used to simulate the tooth fabrication process. This procedure is based on the principles of differential geometry that pertain to envelopes of curves and surfaces. The procedure is illustrated with the modeling of spur, helical, bevel, spiral bevel, and hypoid gear teeth. Applications in design and manufacturing are discussed. Extensions to nonstandard tooth forms, to cams, and to rolling element bearings are proposed.
A Basis for Solid Modeling of Gear Teeth with Application in Design and Manufacture
NASA Technical Reports Server (NTRS)
Huston, Ronald L.; Mavriplis, Dimitrios; Oswald, Fred B.; Liu, Yung Sheng
1994-01-01
This paper discusses a new approach to modeling gear tooth surfaces. A computer graphics solid modeling procedure is used to simulate the tooth fabrication processes. This procedure is based on the principles of differential geometry that pertain to envelopes of curves and surfaces. The procedure is illustrated with the modeling of spur, helical, bevel, spiral bevel, and hypoid gear teeth. Applications in design and manufacturing are discussed. Extensions to nonstandard tooth forms, to cams, and to rolling element bearings are proposed.
Kim, Young Kwan; Kameo, Yoshitaka; Tanaka, Sakae; Adachi, Taiji
2017-10-01
To understand Wolff's law, bone adaptation by remodeling at the cellular and tissue levels has been discussed extensively through experimental and simulation studies. For the clinical application of a bone remodeling simulation, it is significant to establish a macroscopic model that incorporates clarified microscopic mechanisms. In this study, we proposed novel macroscopic models based on the microscopic mechanism of osteocytic mechanosensing, in which the flow of fluid in the lacuno-canalicular porosity generated by fluid pressure gradients plays an important role, and theoretically evaluated the proposed models, taking biological rationales of bone adaptation into account. The proposed models were categorized into two groups according to whether the remodeling equilibrium state was defined globally or locally, i.e., the global or local uniformity models. Each remodeling stimulus in the proposed models was quantitatively evaluated through image-based finite element analyses of a swine cancellous bone, according to two introduced criteria associated with the trabecular volume and orientation at remodeling equilibrium based on biological rationales. The evaluation suggested that nonuniformity of the mean stress gradient in the local uniformity model, one of the proposed stimuli, has high validity. Furthermore, the adaptive potential of each stimulus was discussed based on spatial distribution of a remodeling stimulus on the trabecular surface. The theoretical consideration of a remodeling stimulus based on biological rationales of bone adaptation would contribute to the establishment of a clinically applicable and reliable simulation model of bone remodeling.
USDA-ARS?s Scientific Manuscript database
The development and application of cropping system simulation models for cotton production has a long and rich history, beginning in the southeast United States in the 1960's and now expanded to major cotton production regions globally. This paper briefly reviews the history of cotton simulation mo...
Cam Design Projects in an Advanced CAD Course for Mechanical Engineers
ERIC Educational Resources Information Center
Ault, H. K.
2009-01-01
The objective of this paper is to present applications of solid modeling aimed at modeling of complex geometries such as splines and blended surfaces in advanced CAD courses. These projects, in CAD-based Mechanical Engineering courses, are focused on the use of the CAD system to solve design problems for applications in machine design, namely the…
On the combined gradient-stochastic plasticity model: Application to Mo-micropillar compression
DOE Office of Scientific and Technical Information (OSTI.GOV)
Konstantinidis, A. A., E-mail: akonsta@civil.auth.gr; Zhang, X., E-mail: zhangxu26@126.com; Aifantis, E. C., E-mail: mom@mom.gen.auth.gr
2015-02-17
A formulation for addressing heterogeneous material deformation is proposed. It is based on the use of a stochasticity-enhanced gradient plasticity model implemented through a cellular automaton. The specific application is on Mo-micropillar compression, for which the irregularities of the strain bursts observed have been experimentally measured and theoretically interpreted through Tsallis' q-statistics.
Internal Model-Based Robust Tracking Control Design for the MEMS Electromagnetic Micromirror.
Tan, Jiazheng; Sun, Weijie; Yeow, John T W
2017-05-26
The micromirror based on micro-electro-mechanical systems (MEMS) technology is widely employed in different areas, such as scanning, imaging, and optical switching. This paper studies the MEMS electromagnetic micromirror for scanning or imaging applications. In these application scenarios, the micromirror is required to track a commanded sinusoidal signal, which can theoretically be cast as an output regulation problem. In this paper, based on the internal model principle, the output regulation problem is solved by designing a robust controller that is able to force the micromirror to track the command signal accurately. The proposed controller relies little on the accuracy of the model. Further, the proposed controller is implemented, and its effectiveness is examined by experiments. The experimental results demonstrate that the performance of the proposed controller is satisfying.
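The internal model principle itself can be shown on a toy continuous-time loop (a first-order plant, not the micromirror dynamics; the gain and frequency are illustrative): because the controller embeds the reference's poles at s = +/- j*w, the tracking error for a sinusoid at that frequency decays toward zero.

    # Toy sketch: internal-model (resonant) controller tracking a sinusoid.
    import numpy as np

    w, g, dt = 2*np.pi, 50.0, 1e-4          # reference frequency, gain, Euler step
    xp = x1 = x2 = 0.0                      # plant state, internal-model states
    errs = []
    for k in range(int(20.0/dt)):
        r = np.sin(w * k * dt)              # command sinusoid
        e = r - xp
        # controller C(s) = g*s/(s^2 + w^2): an oscillator driven by the error
        x1 += dt * x2
        x2 += dt * (-w*w*x1 + e)
        u = g * x2
        xp += dt * (-xp + u)                # plant P(s) = 1/(s+1)
        errs.append(abs(e))
    print(max(errs[:20000]), max(errs[-20000:]))   # error shrinks toward zero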
Three novel approaches to structural identifiability analysis in mixed-effects models.
Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D
2016-05-06
Structural identifiability is a concept that considers whether the structure of a model, together with a set of input-output relations, uniquely determines the model parameters. In the mathematical modelling of biological systems, structural identifiability is an important concept since biological interpretations are typically made from the parameter estimates. For a system defined by ordinary differential equations, several methods have been developed to analyse whether the model is structurally identifiable or not. Another well-used modelling framework, which is particularly useful when the experimental data are sparsely sampled and the population variance is of interest, is mixed-effects modelling. However, established identifiability analysis techniques for ordinary differential equations are not directly applicable to such models. In this paper, we present and apply three different methods that can be used to study structural identifiability in mixed-effects models. The first method, called the repeated measurement approach, is based on applying a set of previously established statistical theorems. The second method, called the augmented system approach, is based on augmenting the mixed-effects model to an extended state-space form. The third method, called the Laplace transform mixed-effects extension, is based on considering the moment invariants of the system's transfer function as functions of random variables. To illustrate, compare, and contrast the application of the three methods, they are applied to a set of mixed-effects models. Three structural identifiability analysis methods applicable to mixed-effects models have been presented in this paper. As method development of structural identifiability techniques for mixed-effects models has been given very little attention, despite mixed-effects models being widely used, the methods presented in this paper provide a way of handling structural identifiability in mixed-effects models that was previously not possible.
Development of Integrated Modular Avionics Application Based on Simulink and XtratuM
NASA Astrophysics Data System (ADS)
Fons-Albert, Borja; Usach-Molina, Hector; Vila-Carbo, Joan; Crespo-Lorente, Alfons
2013-08-01
This paper presents an integrated approach for designing avionics applications that meets this application domain's requirements for software development and execution. Software design follows the model-based design process and is performed in Simulink. This approach allows easy and quick testbench development and helps satisfy DO-178B requirements through the use of proper tools. The software execution platform is based on XtratuM, a minimal bare-metal hypervisor designed in our research group. XtratuM provides support for IMA-SP (Integrated Modular Avionics for Space) architectures. This approach allows code generated from a Simulink model to be executed on top of Lithos as a XtratuM partition. Lithos is an ARINC-653-compliant RTOS for XtratuM. The paper concentrates on how to smoothly port Simulink designs to XtratuM, solving problems such as application partitioning, automatic code generation, real-time tasking, and interfacing. This process is illustrated with an autopilot design test using a flight simulator.
A new region-edge based level set model with applications to image segmentation
NASA Astrophysics Data System (ADS)
Zhi, Xuhao; Shen, Hong-Bin
2018-04-01
The level set model has advantages in handling complex shapes and topological changes, and is widely used in image processing tasks. Image segmentation oriented level set models can be grouped into region-based models and edge-based models, both of which have merits and drawbacks. Region-based level set models rely on fitting the color intensity of separated regions, but are not sensitive to edge information. Edge-based level set models evolve by fitting local gradient information, but are easily affected by noise. We propose a region-edge based level set model, which incorporates saliency information into the energy function and fuses color intensity with local gradient information. The evolution of the proposed model is implemented by a hierarchical two-stage protocol, and the experimental results show flexible initialization, robust evolution and precise segmentation.
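The abstract does not state the exact functional, but a fused region-edge energy of the general kind described, with a saliency weight $S(x)$ modulating the region-fitting terms, might take a Chan-Vese-plus-geodesic form (a generic template, not the authors' published energy):

$$E(\phi) = \mu \int_\Omega g\big(|\nabla I|\big)\,\big|\nabla H(\phi)\big|\,dx + \lambda_1 \int_\Omega S(x)\,|I - c_1|^2\,H(\phi)\,dx + \lambda_2 \int_\Omega S(x)\,|I - c_2|^2\,\big(1 - H(\phi)\big)\,dx$$

where $H$ is the Heaviside function applied to the level set $\phi$, $c_1, c_2$ are the mean intensities of the two regions, and $g$ is an edge-stopping function of the image gradient.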
A Novel Grid SINS/DVL Integrated Navigation Algorithm for Marine Application
Kang, Yingyao; Zhao, Lin; Cheng, Jianhua; Fan, Xiaoliang
2018-01-01
Integrated navigation algorithms under the grid frame have been proposed based on the Kalman filter (KF) to solve the problem of navigation in some special regions. However, in the existing study of grid strapdown inertial navigation system (SINS)/Doppler velocity log (DVL) integrated navigation algorithms, the Earth models of the filter dynamic model and the SINS mechanization are not unified. Besides, traditional integrated systems with the KF based correction scheme are susceptible to measurement errors, which would decrease the accuracy and robustness of the system. In this paper, an adaptive robust Kalman filter (ARKF) based hybrid-correction grid SINS/DVL integrated navigation algorithm is designed with the unified reference ellipsoid Earth model to improve the navigation accuracy in middle-high latitude regions for marine application. Firstly, to unify the Earth models, the mechanization of grid SINS is introduced and the error equations are derived based on the same reference ellipsoid Earth model. Then, a more accurate grid SINS/DVL filter model is designed according to the new error equations. Finally, a hybrid-correction scheme based on the ARKF is proposed to resist the effect of measurement errors. Simulation and experiment results show that, compared with the traditional algorithms, the proposed navigation algorithm can effectively improve the navigation performance in middle-high latitude regions by the unified Earth models and the ARKF based hybrid-correction scheme. PMID:29373549
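A minimal sketch of the kind of adaptive robust measurement update described above, in which the measurement-noise covariance is inflated when the innovation fails a chi-square consistency test (the structure, gate value and scaling are illustrative assumptions, not the paper's exact filter):

```python
import numpy as np

def arkf_update(x, P, z, H, R, chi2_gate=3.84):
    """Kalman update that inflates R when the innovation fails a chi-square test."""
    v = z - H @ x                               # innovation
    S = H @ P @ H.T + R
    d2 = float(v @ np.linalg.solve(S, v))       # normalized innovation squared
    if d2 > chi2_gate:                          # suspected measurement fault:
        R = R * (d2 / chi2_gate)                # inflate noise -> smaller gain
        S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    x = x + K @ v
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

De-weighting rather than discarding the measurement keeps the filter observable while limiting the influence of DVL outliers, which is the robustness property the abstract attributes to the hybrid-correction scheme.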
Extending Geographic Weights of Evidence Models for Use in Location Based Services
ERIC Educational Resources Information Center
Sonwalkar, Mukul Dinkar
2012-01-01
This dissertation addresses the use and modeling of spatio-temporal data for the purposes of providing applications for location based services. One of the major issues in dealing with spatio-temporal data for location based services is the availability and sparseness of such data. Other than the hardware costs associated with collecting movement…
Atmospheric, climatic and environmental research
NASA Technical Reports Server (NTRS)
Broecker, Wallace S.; Gornitz, Vivien M.
1992-01-01
Work performed on the three tasks during the report period is summarized. The climate and atmospheric modeling studies included work on climate model development and applications, paleoclimate studies, climate change applications, and SAGE II. Climate applications of Earth and planetary observations included studies on cloud climatology and planetary studies. Studies on the chemistry of the Earth and the environment are briefly described. Publications based on the above research are listed; two of these papers are included in the appendices.
Aspects of intelligent electronic device based switchgear control training model application
NASA Astrophysics Data System (ADS)
Bogdanov, Dimitar; Popov, Ivaylo
2018-02-01
The design of protection and control equipment for electrical power applications has advanced extensively over the last several decades. Modern technologies offer a wide range of multifunctional, flexible applications, making the protection and control of facilities more sophisticated. At the same time, this technological advance imposes the need for simulators, training models and tutorial laboratory equipment for adequate training of students and field specialists.
Remote Sensing Sensors and Applications in Environmental Resources Mapping and Modelling
Melesse, Assefa M.; Weng, Qihao; S.Thenkabail, Prasad; Senay, Gabriel B.
2007-01-01
The history of remote sensing and the development of different sensors for environmental and natural resources mapping and data acquisition are reviewed and reported. Application examples in urban studies, hydrological modeling (such as land-cover and floodplain mapping), fractional vegetation cover and impervious surface area mapping, and surface energy flux and micro-topography correlation studies are discussed. The review also discusses the use of remotely sensed rainfall and potential evapotranspiration for estimating a crop water requirement satisfaction index, which provides early-warning information for growers. The review is not an exhaustive treatment of remote sensing techniques but rather a summary of some important applications in environmental studies and modeling. PMID:28903290
A Bridging Opportunities Work-frame to develop mobile applications for clinical decision making
van Rooij, Tibor; Rix, Serena; Moore, James B; Marsh, Sharon
2015-01-01
Background: Mobile applications (apps) providing clinical decision support (CDS) may show the greatest promise when created by and for frontline clinicians. Our aim was to create a generic model enabling healthcare providers to direct the development of CDS apps. Methods: We combined Change Management with a three-tier information technology architecture to stimulate CDS app development. Results: A Bridging Opportunities Work-frame model was developed. A test case was used to successfully develop an app. Conclusion: Healthcare providers can re-use this globally applicable model to actively create and manage regional decision support applications to translate evidence-based medicine in the use of emerging medication or novel treatment regimens. PMID:28031883
Home Energy Scoring Tools (website) and Application Programming Interfaces, APIs (aka HEScore)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mills, Evan; Bourassa, Norm; Rainer, Leo
2016-04-22
A web-based residential energy rating tool with APIs, hosted on the LBNL website. Provides customized estimates of residential energy use and energy bills based on building description information provided by the user. Energy use is estimated using engineering models developed at LBNL. Space heating and cooling use is based on the DOE-2.1E building simulation model. Other end uses (water heating, appliances, lighting, and miscellaneous equipment) are based on engineering models developed by LBNL.
Rate-Based Model Predictive Control of Turbofan Engine Clearance
NASA Technical Reports Server (NTRS)
DeCastro, Jonathan A.
2006-01-01
An innovative model predictive control strategy is developed for control of nonlinear aircraft propulsion systems and sub-systems. At the heart of the controller is a rate-based linear parameter-varying model that propagates the state derivatives across the prediction horizon, extending prediction fidelity to transient regimes where conventional models begin to lose validity. The new control law is applied to a demanding active clearance control application, where the objectives are to tightly regulate blade tip clearances and also anticipate and avoid detrimental blade-shroud rub occurrences by optimally maintaining a predefined minimum clearance. Simulation results verify that the rate-based controller is capable of satisfying the objectives during realistic flight scenarios where both a conventional Jacobian-based model predictive control law and an unconstrained linear-quadratic optimal controller are incapable of doing so. The controller is evaluated using a variety of different actuators, illustrating the efficacy and versatility of the control approach. It is concluded that the new strategy has promise for this and other nonlinear aerospace applications that place high importance on the attainment of control objectives during transient regimes.
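A minimal sketch of the rate-based prediction idea: the model propagates state increments rather than absolute states, and output predictions are accumulated from those increments across the horizon, which keeps predictions usable in transients where an absolute linearization would drift. The matrices, horizon and penalty below are illustrative stand-ins, not the engine model:

```python
import numpy as np

A = np.array([[0.95, 0.10], [0.00, 0.90]])   # increment (velocity-form) dynamics
B = np.array([[0.0], [0.1]])
C = np.array([[1.0, 0.0]])
N = 20                                        # prediction horizon

def predict(dx0, y0, du_seq):
    """Propagate increments: dx_{i+1} = A dx_i + B du_i, y_{i+1} = y_i + C dx_{i+1}."""
    dx, y, ys = dx0.copy(), float(y0), []
    for du in du_seq:
        dx = A @ dx + B @ np.atleast_1d(du)
        y = y + float(C @ dx)
        ys.append(y)
    return np.array(ys)

# Unconstrained receding-horizon solution by linear least squares: build the
# linear map du -> predicted y column-by-column with unit impulses, then solve.
dx0, y0, y_ref = np.array([0.1, 0.0]), 0.0, np.ones(N)
y_free = predict(dx0, y0, np.zeros(N))                 # response with no action
G = np.column_stack([predict(dx0, y0, np.eye(N)[j]) - y_free for j in range(N)])
rho = 0.1                                              # penalty on increments
du = np.linalg.solve(G.T @ G + rho * np.eye(N), G.T @ (y_ref - y_free))
print("first increment to apply:", du[0])              # receding-horizon step
```

The constrained clearance problem in the paper adds rub-avoidance constraints on top of this kind of prediction; the sketch only shows the rate-based propagation itself.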
Application of postured human model for SAR measurements
NASA Astrophysics Data System (ADS)
Vuchkovikj, M.; Munteanu, I.; Weiland, T.
2013-07-01
In the last two decades, the increasing number of electronic devices used in day-to-day life has led to a growing interest in the study of electromagnetic field interaction with biological tissues. The design of medical devices and wireless communication devices such as mobile phones benefits greatly from bio-electromagnetic simulations in which digital human models are used. The digital human models currently available have an upright position, which limits research in realistic scenarios where postured human bodies must be considered. For this reason, a software application called "BodyFlex for CST STUDIO SUITE" was developed. In its current version, this application can deform the voxel-based human model named HUGO (Dipp GmbH, 2010) to generate common postures that people adopt in normal life, ensuring the continuity of tissues and conserving mass to an acceptable level. This paper describes an enhancement of the "BodyFlex" application relating to movements of the forearm and the wrist of a digital human model. One electromagnetic application in which the forearm and wrist movement of a voxel-based human model is significant is the measurement of the specific absorption rate (SAR) when the model is exposed to a radio-frequency electromagnetic field produced by a mobile phone. Current SAR measurements of the exposure from mobile phones are performed with the SAM (Specific Anthropomorphic Mannequin) phantom, which is filled with a dispersive but homogeneous material. We are interested in what happens to the SAR values if a realistic inhomogeneous human model is used. To this aim, two human models, a homogeneous and an inhomogeneous one, are used in two simulation scenarios in order to examine the differences in the resulting SAR values.
Euler-Lagrange CFD modelling of unconfined gas mixing in anaerobic digestion.
Dapelo, Davide; Alberini, Federico; Bridgeman, John
2015-11-15
A novel Euler-Lagrangian (EL) computational fluid dynamics (CFD) finite volume-based model to simulate the gas mixing of sludge for anaerobic digestion is developed and described. Fluid motion is driven by momentum transfer from bubbles to liquid. Model validation is undertaken by assessing the flow field in a lab-scale model with particle image velocimetry (PIV). Conclusions are drawn about the upscaling and applicability of the model to full-scale problems, and recommendations are given for optimum application. Copyright © 2015 Elsevier Ltd. All rights reserved.
2015-06-01
Fragments recovered from the report: refining the very coarse architectural model proposed in Section 2.4 into something that might be implemented (Figure 11 shows the model created); interoperability through common data models, with many of the pieces either in place or currently being developed; however, SEA still needs a core ... of knowledge derived through the scientific method; in NATO, S&T is addressed using different business models, namely a collaborative business model.
Damage evaluation by a guided wave-hidden Markov model based method
NASA Astrophysics Data System (ADS)
Mei, Hanfei; Yuan, Shenfang; Qiu, Lei; Zhang, Jinjin
2016-02-01
Guided wave based structural health monitoring has shown great potential in aerospace applications. However, one of the key challenges of practical engineering applications is the accurate interpretation of the guided wave signals under time-varying environmental and operational conditions. This paper presents a guided wave-hidden Markov model based method to improve the damage evaluation reliability of real aircraft structures under time-varying conditions. In the proposed approach, an HMM based unweighted moving average trend estimation method, which can capture the trend of damage propagation from the posterior probability obtained by HMM modeling is used to achieve a probabilistic evaluation of the structural damage. To validate the developed method, experiments are performed on a hole-edge crack specimen under fatigue loading condition and a real aircraft wing spar under changing structural boundary conditions. Experimental results show the advantage of the proposed method.
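A minimal sketch of the two-step idea on synthetic data: fit an HMM to a damage-index sequence, then smooth the posterior probability of the "damaged" state with an unweighted moving average to expose the propagation trend. The hmmlearn dependency and all parameters are assumptions for illustration, not the paper's implementation:

```python
import numpy as np
from hmmlearn import hmm   # assumed dependency; any HMM library would do

rng = np.random.default_rng(1)
di = np.concatenate([rng.normal(0.2, 0.05, 200),     # healthy-ish damage index
                     rng.normal(0.8, 0.10, 200)])    # damaged-ish damage index
X = di.reshape(-1, 1)

# Step 1: fit a two-state HMM and take the posterior state probabilities.
model = hmm.GaussianHMM(n_components=2, n_iter=100, random_state=0).fit(X)
post = model.predict_proba(X)                        # P(state | observations)
damaged = int(np.argmax(model.means_))               # state with larger mean

# Step 2: unweighted moving average of the damaged-state posterior = trend.
w = 25
trend = np.convolve(post[:, damaged], np.ones(w) / w, mode="valid")
print(trend[:3], trend[-3:])                         # low early, high late
```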
An Action-Based Fine-Grained Access Control Mechanism for Structured Documents and Its Application
Su, Mang; Li, Fenghua; Tang, Zhi; Yu, Yinyan; Zhou, Bo
2014-01-01
This paper presents an action-based fine-grained access control mechanism for structured documents. Firstly, we define a describing model for structured documents and analyze the application scenarios. The describing model supports permission management on chapters, pages, sections, words, and pictures of structured documents. Secondly, based on the action-based access control (ABAC) model, we propose a fine-grained control protocol for structured documents by introducing temporal state and environmental state. The protocol, covering the stages from document creation through permission specification to usage control, is specified using the Z-notation. Finally, we give the implementation of our mechanism and compare it with existing methods. The results show that our mechanism provides a better solution for fine-grained access control of structured documents in complicated networks. Moreover, it is more flexible and practical. PMID:25136651
Evidence-based dentistry: a model for clinical practice.
Faggion, Clóvis M; Tu, Yu-Kang
2007-06-01
Making decisions in dentistry should be based on the best evidence available. The objective of this study was to demonstrate a practical procedure and model that clinicians can use to apply the results of well-conducted studies to patient care by critically appraising the evidence with checklists and letter-grade scales. To demonstrate application of this model for critically appraising the quality of research evidence, a hypothetical case involving an adult male with chronic periodontitis is used as an example. To determine the best clinical approach for this patient, a four-step, evidence-based model is demonstrated, consisting of the following: definition of a research question using the PICO format, search and selection of relevant literature, critical appraisal of identified research reports using checklists, and the application of evidence. In this model, the quality of research evidence was assessed quantitatively based on different levels of quality that are assigned letter grades of A, B, and C by evaluating the studies against the QUOROM (Quality of Reporting Meta-Analyses) and CONSORT (Consolidated Standards of Reporting Trials) checklists in a tabular format. For this hypothetical periodontics case, application of the model identified the best available evidence for clinical decision making, i.e., one randomized controlled trial and one systematic review of randomized controlled trials. Both studies gave similar answers to the research question. The use of a letter-grade scale allowed an objective analysis of the quality of evidence. A checklist-driven model that assesses and applies evidence to dental practice may substantially improve dentists' decision-making skills.
Landguth, Erin L; Bearlin, Andrew; Day, Casey; Dunham, Jason B.
2016-01-01
1. Combining landscape demographic and genetic models offers powerful methods for addressing eco-evolutionary questions. 2. Using two illustrative examples, we present Cost–Distance Meta-POPulation, a program to simulate changes in neutral and/or selection-driven genotypes through time as a function of individual-based movement, complex spatial population dynamics, and multiple and changing landscape drivers. 3. Cost–Distance Meta-POPulation provides a novel tool for questions in landscape genetics by incorporating population viability analysis, while linking directly to conservation applications.
Dataset for petroleum based stock markets and GAUSS codes for SAMEM.
Khalifa, Ahmed A A; Bertuccelli, Pietro; Otranto, Edoardo
2017-02-01
This article includes a unique balanced daily (Monday, Tuesday and Wednesday) data set for oil and natural gas volatility and for the stock markets of the oil-rich economies Saudi Arabia, Qatar, Kuwait, Abu Dhabi, Dubai, Bahrain and Oman, spanning Oct. 18, 2006-July 30, 2015. Additionally, we have included unique GAUSS codes for estimating the spillover asymmetric multiplicative error model (SAMEM) with application to petroleum-based stock markets. The data, the model and the codes have many applications in business and social science.
NASA Astrophysics Data System (ADS)
Irwan; Gustientiedina; Sunarti; Desnelita, Yenny
2017-12-01
The purpose of this study is to design a counseling application model for a decision-making and consultation system. The application serves as an alternative guidance and individual career-development tool for students, covering career knowledge, planning and alternative options, and uses an expert, knowledge- and rule-based engine to provide solutions for students' career decisions. This research produces a counseling application model that supplies important information about student career development and facilitates individual development, connecting each student's plan with a career according to talent, interest, ability, knowledge, personality and other supporting factors. The application model can be used as a faster and more flexible information tool for student guidance and counseling, and can thus help students make selections and decisions appropriate to their intended careers.
Rapid Prototyping in Technology Education.
ERIC Educational Resources Information Center
Flowers, Jim; Moniz, Matt
2002-01-01
Describes how technology education majors are using a high-tech model builder, called a fused deposition modeling machine, to develop their models directly from computer-based designs without any machining. Gives examples of applications in technology education. (JOW)
Sun, Wenchao; Ishidaira, Hiroshi; Bastola, Satish; Yu, Jingshan
2015-05-01
A lack of observation data for calibration constrains the application of hydrological models to estimating daily streamflow time series. Recent improvements in remote sensing enable detection of river water-surface width from satellite observations, making it possible to track streamflow from space. In this study, a method for calibrating hydrological models using river width derived from remote sensing is demonstrated through application to the ungauged Irrawaddy Basin in Myanmar. Generalized likelihood uncertainty estimation (GLUE) is selected as a tool for automatic calibration and uncertainty analysis. Of 50,000 randomly generated parameter sets, 997 are identified as behavioral, based on comparing model simulation with satellite observations. The uncertainty band of the streamflow simulation can span most of the 10-year average monthly observed streamflow for moderate and high flow conditions. The Nash-Sutcliffe efficiency is 95.7% for the simulated streamflow at the 50% quantile. These results indicate that application to the target basin is generally successful. Beyond evaluating the method in a basin lacking streamflow data, difficulties and possible solutions for real-world applications are addressed to promote future use of the proposed method in more ungauged basins. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
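A minimal GLUE sketch on a synthetic stand-in model (the study's likelihood compares simulated river width against satellite observations; here an NSE-style score against synthetic data is used, and the model, sample size and threshold are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def model(theta, x):
    # toy rainfall-runoff stand-in: y = theta0 * x ** theta1
    return theta[0] * x ** theta[1]

x = np.linspace(1, 10, 50)
obs = 2.0 * x ** 0.8 + rng.normal(0, 0.5, x.size)      # synthetic observations

# Monte Carlo sampling of the parameter space (the study used 50,000 sets).
samples = rng.uniform([0.1, 0.1], [5.0, 2.0], size=(5000, 2))
nse = np.array([1 - np.sum((model(th, x) - obs) ** 2) /
                    np.sum((obs - obs.mean()) ** 2) for th in samples])

behavioural = samples[nse > 0.7]                       # acceptance threshold
print(len(behavioural), "behavioural parameter sets")

# Uncertainty band from the behavioural ensemble (the full method weights the
# quantiles by likelihood; unweighted here for brevity).
sims = np.array([model(th, x) for th in behavioural])
band = np.quantile(sims, [0.05, 0.5, 0.95], axis=0)
```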
Cui, Yiqian; Shi, Junyou; Wang, Zili
2015-11-01
Quantum Neural Network (QNN) models have attracted great attention because they introduce a new manner of neural computing based on quantum entanglement. However, the existing QNN models are mainly based on real quantum operations, and the potential of quantum entanglement is not fully exploited. In this paper, we propose a novel quantum neuron model called the Complex Quantum Neuron (CQN) that realizes deep quantum entanglement. A novel hybrid network model, Complex Rotation Quantum Dynamic Neural Networks (CRQDNN), is also proposed based on the CQN. CRQDNN is a three-layer model with both CQNs and classical neurons. An infinite impulse response (IIR) filter is embedded in the network to provide the memory needed to process time-series inputs. The Levenberg-Marquardt (LM) algorithm is used for fast parameter learning. The network model is developed to conduct time-series predictions. Two application studies are presented: chaotic time-series prediction and electronics remaining useful life (RUL) prediction. Copyright © 2015 Elsevier Ltd. All rights reserved.
BioNetSim: a Petri net-based modeling tool for simulations of biochemical processes.
Gao, Junhui; Li, Li; Wu, Xiaolin; Wei, Dong-Qing
2012-03-01
BioNetSim, Petri net-based software for modeling and simulating biochemical processes, is developed; its design and implementation are presented in this paper, including logic construction, real-time access to KEGG (Kyoto Encyclopedia of Genes and Genomes), and the BioModel database. Furthermore, glycolysis is simulated as an example of its application. BioNetSim is a helpful tool for researchers to download data, model biological networks, and simulate complicated biochemical processes. Gene regulatory networks, metabolic pathways, signaling pathways, and kinetics of cell interaction are all available in BioNetSim, which makes modeling more efficient and effective. Like other Petri net-based software, BioNetSim performs well in graphical application and mathematical construction. Moreover, it shows several powerful advantages: (1) it creates models in a database; (2) it provides real-time access to KEGG and BioModel and transfers the data to Petri nets; (3) it provides qualitative analysis, such as computation of constants; (4) it generates graphs for tracing the concentration of every molecule during the simulation process.
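A minimal token-game sketch of the Petri-net semantics underlying such tools (BioNetSim adds KEGG access, kinetics, databases and a GUI on top; the single reaction below is only an illustration):

```python
import random

# Places hold tokens (molecule counts); transitions consume and produce them.
places = {"glucose": 5, "ATP": 2, "G6P": 0, "ADP": 0}
transitions = {
    # transition name: (tokens consumed, tokens produced)
    "hexokinase": ({"glucose": 1, "ATP": 1}, {"G6P": 1, "ADP": 1}),
}

def enabled(pre):
    return all(places[p] >= n for p, n in pre.items())

def fire(name):
    pre, post = transitions[name]
    for p, n in pre.items():
        places[p] -= n
    for p, n in post.items():
        places[p] += n

# Run the token game until no transition is enabled.
while any(enabled(pre) for pre, _ in transitions.values()):
    ready = [t for t, (pre, _) in transitions.items() if enabled(pre)]
    fire(random.choice(ready))

print(places)   # {'glucose': 3, 'ATP': 0, 'G6P': 2, 'ADP': 2}
```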
Geomorphically based predictive mapping of soil thickness in upland watersheds
NASA Astrophysics Data System (ADS)
Pelletier, Jon D.; Rasmussen, Craig
2009-09-01
The hydrologic response of upland watersheds is strongly controlled by soil (regolith) thickness. Despite the need to quantify soil thickness for input into hydrologic models, there is currently no widely used, geomorphically based method for doing so. In this paper we describe and illustrate a new method for predictive mapping of soil thicknesses using high-resolution topographic data, numerical modeling, and field-based calibration. The model framework works directly with input digital elevation model data to predict soil thicknesses assuming a long-term balance between soil production and erosion. Erosion rates in the model are quantified using one of three geomorphically based sediment transport models: nonlinear slope-dependent transport, nonlinear area- and slope-dependent transport, and nonlinear depth- and slope-dependent transport. The model balances soil production and erosion locally to predict a family of solutions corresponding to a range of values of two unconstrained model parameters. A small number of field-based soil thickness measurements can then be used to calibrate the local value of those unconstrained parameters, thereby constraining which solution is applicable at a particular study site. As an illustration, the model is used to predictively map soil thicknesses in two small, ˜0.1 km2, drainage basins in the Marshall Gulch watershed, a semiarid drainage basin in the Santa Catalina Mountains of Pima County, Arizona. Field observations and calibration data indicate that the nonlinear depth- and slope-dependent sediment transport model is the most appropriate transport model for this site. The resulting framework provides a generally applicable, geomorphically based tool for predictive mapping of soil thickness using high-resolution topographic data sets.
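A single-pixel sketch of the production-erosion balance at the core of the method, assuming the common exponential soil-production law P = P0 exp(-h/h0) (the paper solves this balance over a full DEM with slope-, area- and depth-dependent transport; parameter values here are illustrative):

```python
import numpy as np

P0, h0 = 8e-5, 0.5        # max soil production rate (m/yr), decay depth (m)

def steady_thickness(erosion_rate):
    """Solve P0 * exp(-h/h0) = E for soil thickness h (local steady state)."""
    E = np.asarray(erosion_rate, dtype=float)
    h = h0 * np.log(P0 / E)
    # Bedrock emerges wherever erosion outpaces the maximum production rate.
    return np.where(E >= P0, 0.0, h)

print(steady_thickness([1e-5, 5e-5, 2e-4]))   # thick soil, thin soil, bare rock
```

In the full model the erosion rate E at each pixel comes from one of the three sediment-transport laws, and the two unconstrained parameters are then calibrated against field soil-depth measurements.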
A dynamic subgrid scale model for Large Eddy Simulations based on the Mori-Zwanzig formalism
NASA Astrophysics Data System (ADS)
Parish, Eric J.; Duraisamy, Karthik
2017-11-01
The development of reduced models for complex multiscale problems remains one of the principal challenges in computational physics. The optimal prediction framework of Chorin et al. [1], which is a reformulation of the Mori-Zwanzig (M-Z) formalism of non-equilibrium statistical mechanics, provides a framework for the development of mathematically-derived reduced models of dynamical systems. Several promising models have emerged from the optimal prediction community and have found application in molecular dynamics and turbulent flows. In this work, a new M-Z-based closure model that addresses some of the deficiencies of existing methods is developed. The model is constructed by exploiting similarities between two levels of coarse-graining via the Germano identity of fluid mechanics and by assuming that memory effects have a finite temporal support. The appeal of the proposed model, which will be referred to as the 'dynamic-MZ-τ' model, is that it is parameter-free and has a structural form imposed by the mathematics of the coarse-graining process (rather than the phenomenological assumptions made by the modeler, such as in classical subgrid scale models). To promote the applicability of M-Z models in general, two procedures are presented to compute the resulting model form, helping to bypass the tedious error-prone algebra that has proven to be a hindrance to the construction of M-Z-based models for complex dynamical systems. While the new formulation is applicable to the solution of general partial differential equations, demonstrations are presented in the context of Large Eddy Simulation closures for the Burgers equation, decaying homogeneous turbulence, and turbulent channel flow. The performance of the model and validity of the underlying assumptions are investigated in detail.
Adaptive Long-Term Monitoring at Environmental Restoration Sites (ER-0629)
2009-05-01
Recovered from the report front matter and body: the figure list includes Figure 2-1 (General Flowchart of Software Application), Figure 2-2 (Overview of the Genetic Algorithm Approach) and Figure 2-3; the components (including Model Builder) are highlighted on Figure 2-1, a general flowchart illustrating the application of the software; modeling for each monitoring event (e.g., contaminant mass based on interpolation) is provided by Model Builder.
ERIC Educational Resources Information Center
Saavedra, Pedro; Kuchak, JoAnn
An error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications was developed, based on interviews conducted with a quality control sample of 1,791 students during 1978-1979. The model was designed to identify corrective methods appropriate for different types of…
An Interdisciplinary Model for Teaching Evolutionary Ecology.
ERIC Educational Resources Information Center
Coletta, John
1992-01-01
Describes a general systems evolutionary model and demonstrates how a previously established ecological model is a function of its past development based on the evolution of the rock, nutrient, and water cycles. Discusses the applications of the model in environmental education. (MDH)
Mulugeta, Lealem; Drach, Andrew; Erdemir, Ahmet; Hunt, C. A.; Horner, Marc; Ku, Joy P.; Myers Jr., Jerry G.; Vadigepalli, Rajanikanth; Lytton, William W.
2018-01-01
Modeling and simulation in computational neuroscience is currently a research enterprise to better understand neural systems. It is not yet directly applicable to the problems of patients with brain disease. To be used for clinical applications, there must not only be considerable progress in the field but also a concerted effort to use best practices in order to demonstrate model credibility to regulatory bodies, to clinics and hospitals, to doctors, and to patients. In doing this for neuroscience, we can learn lessons from long-standing practices in other areas of simulation (aircraft, computer chips), from software engineering, and from other biomedical disciplines. In this manuscript, we introduce some basic concepts that will be important in the development of credible clinical neuroscience models: reproducibility and replicability; verification and validation; model configuration; and procedures and processes for credible mechanistic multiscale modeling. We also discuss how garnering strong community involvement can promote model credibility. Finally, in addition to direct usage with patients, we note the potential for simulation usage in the area of Simulation-Based Medical Education, an area which to date has been primarily reliant on physical models (mannequins) and scenario-based simulations rather than on numerical simulations. PMID:29713272
The application of muscle wrapping to voxel-based finite element models of skeletal structures.
Liu, Jia; Shi, Junfen; Fitton, Laura C; Phillips, Roger; O'Higgins, Paul; Fagan, Michael J
2012-01-01
Finite element analysis (FEA) is now used routinely to interpret skeletal form in terms of function in both medical and biological applications. To produce accurate predictions from FEA models, it is essential that the loading due to muscle action be applied in a physiologically reasonable manner. However, it is common for muscle forces to be represented as simple force vectors applied at a few nodes on the model's surface. It is certainly rare for any wrapping of the muscles to be considered, and yet wrapping not only alters the directions of muscle forces but also applies an additional compressive load from the muscle belly directly to the underlying bone surface. This paper presents a method for applying muscle wrapping to high-resolution voxel-based finite element (FE) models. Such voxel-based models have a number of advantages over standard (geometry-based) FE models, but the increased resolution with which the load can be distributed over a model's surface is particularly advantageous, reflecting more closely how muscle fibre attachments are distributed. In this paper, the development, application and validation of a muscle wrapping method are illustrated using a simple cylinder. The algorithm: (1) calculates the shortest path over the surface of a bone given the points of origin and ultimate attachment of the muscle fibres; (2) fits a Non-Uniform Rational B-Spline (NURBS) curve to the shortest path and calculates its tangent and normal vectors and curvatures so that normal and tangential components of the muscle force can be calculated and applied along the fibre; and (3) automatically distributes the loads between adjacent fibres to cover the bone surface with a fully distributed muscle force, as is observed in vivo. Finally, we present a practical application of this approach to the wrapping of the temporalis muscle around the cranium of a macaque skull.
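A sketch of step (1), the shortest path across a bone surface, treating the triangle edges of a toy mesh as a weighted graph (the published method then fits a NURBS to the path and decomposes the fibre force; the mesh geometry and node indices here are illustrative):

```python
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import dijkstra

# Toy triangulated strip standing in for a bone surface mesh.
verts = np.array([[i * 0.5, (i % 2) * 0.3, 0.0] for i in range(10)])
faces = np.array([[i, i + 1, i + 2] for i in range(8)])

# Every unique triangle edge becomes a graph edge weighted by its length.
edges = {(int(f[a]), int(f[b])) for f in faces for a, b in ((0, 1), (1, 2), (0, 2))}
rows = np.array([e[0] for e in edges])
cols = np.array([e[1] for e in edges])
w = np.linalg.norm(verts[rows] - verts[cols], axis=1)
graph = coo_matrix((w, (rows, cols)), shape=(len(verts), len(verts)))

origin, attach = 0, 9                      # fibre origin and attachment nodes
dist, pred = dijkstra(graph, directed=False, indices=origin,
                      return_predecessors=True)

path, node = [], attach                    # walk predecessors back to origin
while node != -9999:                       # scipy's "no predecessor" sentinel
    path.append(int(node))
    node = pred[node]
print("graph geodesic:", path[::-1], "length:", dist[attach])
```

An edge-graph geodesic is only an approximation to the true surface geodesic, but it is a common starting point before curve fitting smooths the path.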
Carbon tetrachloride (CCl4) and trichloroethylene (TCE) are hepatotoxic volatile organic compounds (VOCs) and environmental contaminants. Previous physiologically based pharmacokinetic (PBPK) models describe the kinetics of individual chemical disposition and metabolic clearance fo...
Bromochloromethane (BCM) is a volatile compound and a by-product of the disinfection of water by chlorination. Physiologically based pharmacokinetic (PBPK) models are used in risk assessment applications. An updated PBPK model for BCM is generated and applied to hypothesis testing c...
2014-10-14
Fragments recovered from the report documentation page: ... constraint that excluded essentially all condensed-phase and reactive chemical applications. By developing both inversion-based and projection-based strategies to enable ...
Design and application of BIM based digital sand table for construction management
NASA Astrophysics Data System (ADS)
Fuquan, JI; Jianqiang, LI; Weijia, LIU
2018-05-01
This paper explores the design and application of a BIM-based digital sand table for construction management. Given the demands and features of construction management planning for bridge and tunnel engineering, the key functional features of a digital sand table should include three-dimensional GIS, model navigation, virtual simulation, information layers, and data exchange. These involve the technologies of 3D visualization and 4D virtual simulation of BIM, breakdown structures for the BIM model and project data, multi-dimensional information layers, and multi-source data acquisition and interaction. Overall, the digital sand table is an integrated visual and virtual engineering-information terminal under a unified data standard system. Applications include visualization of construction schemes, virtual simulation of construction schedules, and construction monitoring. Finally, the applicability of several basic software packages to the digital sand table is analyzed.
NASA Astrophysics Data System (ADS)
Mazzetti, P.; Nativi, S.; Verlato, M.; Angelini, V.
2009-04-01
In the context of the EU co-funded project CYCLOPS (http://www.cyclops-project.eu) the problem of designing an advanced e-Infrastructure for Civil Protection (CP) applications has been addressed. As a preliminary step, studies of European CP systems and operational applications were performed in order to define their specific system requirements. At a higher level it was verified that CP applications are usually conceived to map CP business processes involving different levels of processing, including data access, data processing, and output visualization. At their core they usually run one or more Earth Science models for information extraction. The traditional approach based on the development of monolithic applications has limitations in flexibility (e.g. the possibility of running the same models with different input data sources, or different models with the same data sources) and scalability (e.g. launching several runs for different scenarios, or implementing more accurate and computing-demanding models). Flexibility can be addressed by adopting a modular design based on an SOA and standard services and models, such as OWS and ISO for geospatial services. Distributed computing and storage solutions can improve scalability. Based on these considerations, an architectural framework has been defined. It is made of a Web Service layer providing advanced services for CP applications (e.g. standard geospatial data sharing and processing services) working on the underlying Grid platform. This framework has been tested through the development of prototypes as proof-of-concept. These theoretical studies and proofs-of-concept demonstrated that although Grid and geospatial technologies can provide significant benefits to CP applications in terms of scalability and flexibility, current platforms are designed around requirements different from those of CP. In particular, CP applications have strict requirements in terms of: a) real-time capabilities, privileging time-of-response over accuracy; b) security services to support complex data policies and trust relationships; c) interoperability with existing or planned infrastructures (e.g. e-Government, INSPIRE-compliant, etc.). These requirements are in fact the main reason why CP applications differ from Earth Science applications. Therefore further research is required to design and implement an advanced e-Infrastructure satisfying these specific requirements. In particular, five themes requiring further research were identified: Grid Infrastructure Enhancement, Advanced Middleware for CP Applications, Security and Data Policies, CP Applications Enablement, and Interoperability. For each theme several research topics were proposed and detailed. They are targeted at solving specific problems for the implementation of an effective operational European e-Infrastructure for CP applications.
Quantitative Modeling of Cerenkov Light Production Efficiency from Medical Radionuclides
Beattie, Bradley J.; Thorek, Daniel L. J.; Schmidtlein, Charles R.; Pentlow, Keith S.; Humm, John L.; Hielscher, Andreas H.
2012-01-01
There has been recent and growing interest in applying Cerenkov radiation (CR) for biological applications. Knowledge of the production efficiency and other characteristics of the CR produced by various radionuclides would help in assessing the feasibility of proposed applications and guide the choice of radionuclides. To generate this information we developed models of CR production efficiency based on the Frank-Tamm equation and models of CR distribution based on Monte-Carlo simulations of photon and β particle transport. All models were validated against direct measurements using multiple radionuclides and then applied to a number of radionuclides commonly used in biomedical applications. We show that two radionuclides, Ac-225 and In-111, which have been reported to produce CR in water, do not in fact produce CR directly. We also propose a simple means of using this information to calibrate high-sensitivity luminescence imaging systems and show evidence suggesting that this calibration may be more accurate than methods in routine current use. PMID:22363636
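For reference, the Frank-Tamm relation on which such production-efficiency models are based gives the energy radiated per unit path length and frequency, together with its photon-count form between two wavelengths ($\beta = v/c$, $n$ the refractive index, $\alpha$ the fine-structure constant, $z$ the particle charge number):

$$\frac{d^2E}{dx\,d\omega} = \frac{q^2}{4\pi}\,\mu(\omega)\,\omega\left(1 - \frac{c^2}{v^2 n^2(\omega)}\right) \quad \text{for } \beta\, n(\omega) > 1, \qquad \frac{dN}{dx} = 2\pi\alpha z^2 \int_{\lambda_1}^{\lambda_2} \left(1 - \frac{1}{\beta^2 n^2(\lambda)}\right)\frac{d\lambda}{\lambda^2}.$$

The threshold condition $\beta n > 1$ is exactly why emissions from some radionuclides produce little or no Cerenkov light directly in water.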
Design Virtual Reality Scene Roam for Tour Animations Base on VRML and Java
NASA Astrophysics Data System (ADS)
Cao, Zaihui; Hu, Zhongyan
Virtual reality has been involved in a wide range of academic and commercial applications. It can give users a natural feeling of the environment by creating realistic virtual worlds. Implementing a virtual tour through a model of a tourist area on the web has become fashionable. In this paper, we present a web-based application that allows a user to walk through, see, and interact with a fully three-dimensional model of the tourist area. Issues regarding navigation and disorientation are addressed, and we suggest a combination of a metro map and an intuitive navigation system. Finally we present a prototype which implements our ideas. The application of VR techniques brings the visualization and animation of three-dimensional models to landscape analysis. The use of the VRML format makes it possible to obtain views of the 3D model and to explore it in real time. This is an important goal for the spatial information sciences.
An e-learning application on electrochemotherapy
Corovic, Selma; Bester, Janez; Miklavcic, Damijan
2009-01-01
Background: Electrochemotherapy is an effective approach in local tumour treatment employing locally applied high-voltage electric pulses in combination with chemotherapeutic drugs. In planning and performing electrochemotherapy a multidisciplinary expertise is required and collaboration, knowledge and experience exchange among the experts from different scientific fields such as medicine, biology and biomedical engineering is needed. The objective of this study was to develop an e-learning application in order to provide the educational content on electrochemotherapy and its underlying principles and to support collaboration, knowledge and experience exchange among the experts involved in the research and clinics. Methods: The educational content on electrochemotherapy and cell and tissue electroporation was based on previously published studies from molecular dynamics, lipid bilayers, single cell level and simplified tissue models to complex biological tissues and research and clinical results of electrochemotherapy treatment. We used computer graphics such as model-based visualization (i.e. 3D numerical modelling using finite element method) and 3D computer animations and graphical illustrations to facilitate the representation of complex biological and physical aspects in electrochemotherapy. The e-learning application is integrated into an interactive e-learning environment developed at our institution, enabling collaboration and knowledge exchange among the users. We evaluated the designed e-learning application at the International Scientific workshop and postgraduate course (Electroporation Based Technologies and Treatments). The evaluation was carried out by testing the pedagogical efficiency of the presented educational content and by performing the usability study of the application. Results: The e-learning content presents three different levels of knowledge on cell and tissue electroporation. In the first part of the e-learning application we explain basic principles of the electroporation process. The second part provides educational content about the importance of modelling and visualization of the local electric field in electroporation-based treatments. In the third part we developed an interactive module for visualization of local electric field distribution in 3D tissue models of cutaneous tumors for different parameters such as voltage applied, distance between electrodes, electrode dimension and shape, tissue geometry and electric conductivity. The pedagogical efficiency assessment showed that the participants improved their level of knowledge. The results of usability evaluation revealed that participants found the application simple to learn, use and navigate. The participants also found the information provided by the application easy to understand. Conclusion: The e-learning application we present in this article provides educational material on electrochemotherapy and its underlying principles such as cell and tissue electroporation. The e-learning application is developed to provide an interactive educational content in order to simulate the "hands-on" learning approach about the parameters being important for successful therapy. The e-learning application together with the interactive e-learning environment is available to the users to provide collaborative and flexible learning in order to facilitate knowledge exchange among the experts from different scientific fields that are involved in electrochemotherapy.
The modular structure of the application allows for upgrade with new educational content collected from the clinics and research, and can be easily adapted to serve as a collaborative e-learning tool also in other electroporation-based treatments such as gene electrotransfer, gene vaccination, irreversible tissue ablation and transdermal gene and drug delivery. The presented e-learning application provides an easy and rapid approach for information, knowledge and experience exchange among the experts from different scientific fields, which can facilitate development and optimisation of electroporation-based treatments. PMID:19843322
The Osseus platform: a prototype for advanced web-based distributed simulation
NASA Astrophysics Data System (ADS)
Franceschini, Derrick; Riecken, Mark
2016-05-01
Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services for nonexpert users to connect simulations, targeting the time and skillset needed to successfully connect disparate systems. The Osseus platform presents a web services interface to allow simulation applications to exchange data using modern techniques efficiently over Local or Wide Area Networks. Further, it provides Service Oriented Architecture capabilities such that finer granularity components such as individual models can contribute to simulation with minimal effort.
Structure-based discovery and binding site analysis of histamine receptor ligands.
Kiss, Róbert; Keserű, György M
2016-12-01
The application of structure-based drug discovery in histamine receptor projects was previously hampered by the lack of experimental structures. The publication of the first X-ray structure of the histamine H1 receptor has been followed by several successful virtual screens and binding site analysis studies of H1-antihistamines. This structure together with several other recently solved aminergic G-protein coupled receptors (GPCRs) enabled the development of more realistic homology models for H2, H3 and H4 receptors. Areas covered: In this paper, the authors review the development of histamine receptor models and their application in drug discovery. Expert opinion: In the authors' opinion, the application of atomistic histamine receptor models has played a significant role in understanding key ligand-receptor interactions as well as in the discovery of novel chemical starting points. The recently solved H1 receptor structure is a major milestone in structure-based drug discovery; however, our analysis also demonstrates that for building H3 and H4 receptor homology models, other GPCRs may be more suitable as templates. For these receptors, the authors envisage that the development of higher quality homology models will significantly contribute to the discovery and optimization of novel H3 and H4 ligands.
Completing and Adapting Models of Biological Processes
NASA Technical Reports Server (NTRS)
Margaria, Tiziana; Hinchey, Michael G.; Raffelt, Harald; Rash, James L.; Rouff, Christopher A.; Steffen, Bernhard
2006-01-01
We present a learning-based method for model completion and adaptation, which is based on the combination of two approaches: 1) R2D2C, a technique for mechanically transforming system requirements via provably equivalent models to running code, and 2) automata learning-based model extrapolation. The intended impact of this new combination is to make model completion and adaptation accessible to experts of the field, like biologists or engineers. The principle is briefly illustrated by generating models of biological procedures concerning gene activities in the production of proteins, although the main application is going to concern autonomic systems for space exploration.
FacetModeller: Software for manual creation, manipulation and analysis of 3D surface-based models
NASA Astrophysics Data System (ADS)
Lelièvre, Peter G.; Carter-McAuslan, Angela E.; Dunham, Michael W.; Jones, Drew J.; Nalepa, Mariella; Squires, Chelsea L.; Tycholiz, Cassandra J.; Vallée, Marc A.; Farquharson, Colin G.
2018-01-01
The creation of 3D models is commonplace in many disciplines. Models are often built from a collection of tessellated surfaces. To apply numerical methods to such models it is often necessary to generate a mesh of space-filling elements that conforms to the model surfaces. While there are meshing algorithms that can do so, they place restrictive requirements on the surface-based models that are rarely met by existing 3D model building software. Hence, we have developed a Java application named FacetModeller, designed for efficient manual creation, modification and analysis of 3D surface-based models destined for use in numerical modelling.
Watershed modeling at the Savannah River Site.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vache, Kellie
2015-04-29
The overall goal of the work was the development of a watershed-scale model of hydrological function for application to the US Department of Energy's (DOE) Savannah River Site (SRS). The primary outcome is a grid-based hydrological modeling system that captures near-surface runoff as well as groundwater recharge and contributions of groundwater to streams. The model includes a physically based algorithm to capture both evaporation and transpiration from forestland.
Investigating market efficiency through a forecasting model based on differential equations
NASA Astrophysics Data System (ADS)
de Resende, Charlene C.; Pereira, Adriano C. M.; Cardoso, Rodrigo T. N.; de Magalhães, A. R. Bosco
2017-05-01
A new differential-equation-based model for stock price trend forecasting is proposed as a tool to investigate efficiency in an emerging market. Its predictive power was shown to be statistically higher than that of a completely random model, signaling the presence of arbitrage opportunities. Conditions under which accuracy is enhanced are investigated, and the application of the model as part of a trading strategy is discussed.
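The abstract does not state the model's equations; one generic differential-equation trend model of this kind is a mean-reverting damped oscillator, p'' + b p' + k (p - p_eq) = 0, integrated here with explicit Euler. The coefficients are arbitrary placeholders that would in practice be fitted to market data, and the sketch is not the authors' model.

```python
# Illustrative sketch (not the authors' model): forecast a price trend with
# a mean-reverting damped-oscillator ODE, p'' + b*p' + k*(p - p_eq) = 0,
# integrated by explicit Euler. All coefficients are placeholders.

def forecast(p0, v0, p_eq=100.0, b=0.5, k=0.8, dt=0.1, steps=50):
    p, v = p0, v0
    path = [p]
    for _ in range(steps):
        a = -b * v - k * (p - p_eq)   # acceleration prescribed by the ODE
        v += a * dt
        p += v * dt
        path.append(p)
    return path

print(forecast(p0=105.0, v0=0.0)[-1])  # the price relaxes toward p_eq = 100
```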
Biomimetic three-dimensional tissue models for advanced high-throughput drug screening
Nam, Ki-Hwan; Smith, Alec S.T.; Lone, Saifullah; Kwon, Sunghoon; Kim, Deok-Ho
2015-01-01
Most current drug screening assays used to identify new drug candidates are 2D cell-based systems, even though such in vitro assays do not adequately recreate the in vivo complexity of 3D tissues. Inadequate representation of the human tissue environment during a preclinical test can result in inaccurate predictions of compound effects on overall tissue functionality. Screening for compound efficacy by focusing on a single pathway or protein target, coupled with difficulties in maintaining long-term 2D monolayers, can exacerbate these issues when such simplistic model systems are used for physiological drug screening applications. Numerous studies have shown that cell responses to drugs in 3D culture reproduce in vivo tissue functionality better than those in 2D, which highlights the advantages of using 3D-based models for preclinical drug screens. In this review, we discuss the development of microengineered 3D tissue models that accurately mimic the physiological properties of native tissue samples, and highlight the advantages of using such 3D micro-tissue models over conventional cell-based assays for future drug screening applications. We also discuss biomimetic 3D environments based on engineered tissues as potential preclinical models for the development of more predictive drug screening assays for specific disease models. PMID:25385716
High-Dimensional Sparse Factor Modeling: Applications in Gene Expression Genomics
Carvalho, Carlos M.; Chang, Jeffrey; Lucas, Joseph E.; Nevins, Joseph R.; Wang, Quanli; West, Mike
2010-01-01
We describe studies in molecular profiling and biological pathway analysis that use sparse latent factor and regression models for microarray gene expression data. We discuss breast cancer applications and key aspects of the modeling and computational methodology. Our case studies aim to investigate and characterize heterogeneity of structure related to specific oncogenic pathways, as well as links between aggregate patterns in gene expression profiles and clinical biomarkers. Based on the metaphor of statistically derived “factors” as representing biological “subpathway” structure, we explore the decomposition of fitted sparse factor models into pathway subcomponents and investigate how these components overlay multiple aspects of known biological activity. Our methodology is based on sparsity modeling of multivariate regression, ANOVA, and latent factor models, as well as a class of models that combines all these components. Hierarchical sparsity priors address questions of dimension reduction and multiple comparisons, as well as scalability of the methodology. The models include practically relevant non-Gaussian/nonparametric components for latent structure, capturing the often quite complex non-Gaussianity in multivariate expression patterns. Model search and fitting are addressed through stochastic simulation and evolutionary stochastic search methods, exemplified in the oncogenic pathway studies. Supplementary material provides more details of the applications, as well as examples of the use of freely available software tools for implementing the methodology. PMID:21218139
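For orientation, a generic sparse latent factor model of the kind the authors build on can be written as below; the spike-and-slab prior is the standard device for inducing sparsity in the loadings, though the paper's full hierarchical specification is richer than this sketch.

```latex
% Generic sparse latent factor model (a sketch of the model class, not the
% paper's full hierarchy). Sample i has expression vector x_i (length p):
\begin{align*}
  x_i &= B f_i + \epsilon_i, \qquad
  f_i \sim N(0, I_k), \qquad
  \epsilon_i \sim N(0, \Psi),\\
  \beta_{gj} &\sim (1 - \pi_{gj})\,\delta_0 + \pi_{gj}\,N(0, \tau_j),
\end{align*}
% B is the p x k loadings matrix; the spike-and-slab prior on each loading
% beta_{gj} sets most loadings exactly to zero, so each factor loads on a
% small "subpathway" of genes.
```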