NASA Technical Reports Server (NTRS)
Hornberger, G. M.; Rastetter, E. B.
1982-01-01
A literature review of the use of sensitivity analyses in modelling nonlinear, ill-defined systems, such as ecological interactions, is presented. Discussions of previous work and a proposed scheme for generalized sensitivity analysis applicable to ill-defined systems are included. This scheme considers classes of mathematical models, problem-defining behavior, analysis procedures (especially the use of Monte Carlo methods), sensitivity ranking of parameters, and extension to control system design.
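The Monte Carlo element of such a generalized sensitivity analysis is often implemented by sampling parameters, classifying each run as exhibiting or not exhibiting the problem-defining behavior, and ranking parameters by how strongly the parameter distributions of the two classes differ. A minimal sketch with a toy one-state ecological model and an assumed behavior criterion (not the authors' scheme verbatim):

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

def simulate(k_growth, k_loss, t=np.linspace(0.0, 10.0, 101)):
    """Toy ecological model: logistic growth with first-order loss (illustrative only)."""
    x = np.empty_like(t)
    x[0] = 0.1
    dt = t[1] - t[0]
    for i in range(1, t.size):
        dx = k_growth * x[i - 1] * (1.0 - x[i - 1]) - k_loss * x[i - 1]
        x[i] = x[i - 1] + dt * dx
    return x

# Monte Carlo sampling of the uncertain parameters over broad prior ranges.
n = 2000
k_growth = rng.uniform(0.1, 2.0, n)
k_loss = rng.uniform(0.0, 1.0, n)

# Problem-defining behaviour: final biomass settles between 0.2 and 0.8.
behaviour = np.array([0.2 <= simulate(a, b)[-1] <= 0.8 for a, b in zip(k_growth, k_loss)])

# Sensitivity ranking: compare parameter distributions under behaviour vs. non-behaviour
# runs (Kolmogorov-Smirnov distance); larger separation means a more influential parameter.
for name, samples in [("k_growth", k_growth), ("k_loss", k_loss)]:
    res = ks_2samp(samples[behaviour], samples[~behaviour])
    print(f"{name}: KS distance = {res.statistic:.3f} (p = {res.pvalue:.3g})")
```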
Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems.
Williams, Richard A; Timmis, Jon; Qwarnstrom, Eva E
2016-01-01
Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model) that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model.
Defining a Model for Mitochondrial Function in mESC Differentiation
Differentiating embryonic stem cells (ESCs) undergo mitochondrial maturation leading to a switch from a system dependent upon glycolysis to a re...
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Basham, Bryan D.
1989-01-01
CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.
General Training System; GENTRAS. Final Report.
ERIC Educational Resources Information Center
International Business Machines Corp., Gaithersburg, MD. Federal Systems Div.
GENTRAS (General Training System) is a computer-based training model for the Marine Corps which makes use of a systems approach. The model defines the skill levels applicable for career growth and classifies and defines the training needed for this growth. It also provides a training cost subsystem which will provide a more efficient means of…
Conceptual Model of Quantities, Units, Dimensions, and Values
NASA Technical Reports Server (NTRS)
Rouquette, Nicolas F.; DeKoenig, Hans-Peter; Burkhart, Roger; Espinoza, Huascar
2011-01-01
JPL collaborated with experts from industry and other organizations to develop a conceptual model of quantities, units, dimensions, and values based on the current work of the ISO 80000 committee revising the International System of Units & Quantities based on the International Vocabulary of Metrology (VIM). By providing support for ISO 80000 in SysML via the International Vocabulary of Metrology (VIM), this conceptual model provides, for the first time, a standard-based approach for addressing issues of unit coherence and dimensional analysis in the practice of systems engineering with SysML-based tools. This conceptual model provides support for two kinds of analyses specified in the International Vocabulary of Metrology (VIM): coherence of units as well as of systems of units, and dimension analysis of systems of quantities. To provide a solid and stable foundation, the model for defining quantities, units, dimensions, and values in SysML is explicitly based on the concepts defined in VIM. At the same time, the model library is designed in such a way that extensions to the ISQ (International System of Quantities) and SI Units (Système International d'Unités) can be represented, as well as any alternative systems of quantities and units. The model library can be used to support SysML user models in various ways. A simple approach is to define and document libraries of reusable systems of units and quantities for reuse across multiple projects, and to link units and quantity kinds from these libraries to Unit and QuantityKind stereotypes defined in SysML user models.
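The dimensional-analysis side of such a model can be illustrated by carrying each quantity kind as a vector of exponents over the ISQ base dimensions and checking that derived units stay coherent. A small sketch of the idea (not the SysML model library itself):

```python
from dataclasses import dataclass

# Exponents over the seven ISQ base dimensions: L, M, T, I, Theta, N, J.
BASE = ("L", "M", "T", "I", "Theta", "N", "J")

@dataclass(frozen=True)
class Dimension:
    exponents: tuple

    def __mul__(self, other):
        return Dimension(tuple(a + b for a, b in zip(self.exponents, other.exponents)))

    def __truediv__(self, other):
        return Dimension(tuple(a - b for a, b in zip(self.exponents, other.exponents)))

def dim(**powers):
    return Dimension(tuple(powers.get(b, 0) for b in BASE))

length, mass, time = dim(L=1), dim(M=1), dim(T=1)
force = mass * length / (time * time)   # dimension of the newton
energy = force * length                 # dimension of the joule
power = energy / time                   # dimension of the watt

# Coherence check: a derived unit is coherent if its dimension matches the
# dimension of the quantity kind it is bound to.
assert power == dim(M=1, L=2, T=-3), "watt is not coherent with the power quantity kind"
print("power dimension:", dict(zip(BASE, power.exponents)))
```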
Exposure Related Dose Estimating Model
ERDEM is a physiologically based pharmacokinetic (PBPK) modeling system consisting of a general model and an associated front end. An actual model is defined when the user prepares an input command file. Such a command file defines the chemicals, compartments and processes that...
Using SysML to model complex systems for security.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cano, Lester Arturo
2010-08-01
As security systems integrate more information technology, their design has tended to become more complex. Some of the most difficult issues in designing Complex Security Systems (CSS) are capturing requirements, defining hardware interfaces, defining software interfaces, and integrating technologies such as radio systems, Voice over IP systems, and situational awareness systems.
Interactive computer aided technology, evolution in the design/manufacturing process
NASA Technical Reports Server (NTRS)
English, C. H.
1975-01-01
A powerful computer-operated three-dimensional graphic system and associated auxiliary computer equipment used in advanced design, production design, and manufacturing is described. This system has made these activities more productive than older, more conventional methods of designing and building aerospace vehicles. With this graphic system, designers are now able to define parts using a wide variety of geometric entities and to define parts as fully surfaced three-dimensional models as well as "wire-frame" models. Once a part is geometrically defined, the designer can take section cuts of the surfaced model and automatically determine all of the section properties of the planar cut, light-pen detect all of the surface patches and automatically determine the volume and weight of the part. Further, the designs are defined mathematically to a degree of accuracy never before achievable.
Chang'E-3 data pre-processing system based on scientific workflow
NASA Astrophysics Data System (ADS)
Tan, Xu; Liu, Jianjun; Wang, Yuanyuan; Yan, Wei; Zhang, Xiaoxia; Li, Chunlai
2016-04-01
The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, a Chang'E-3 data pre-processing system (CEDPS) based on scientific workflow is proposed to make scientists more flexible and productive by automating data-driven processing. The system should allow the planning, conduct, and control of the data processing procedure with the following capabilities: describing a data processing task, which includes 1) defining input and output data, 2) defining the data relationships, 3) defining the sequence of tasks, 4) defining the communication between tasks, 5) defining mathematical formulas, and 6) defining the relationship between tasks and data; and executing tasks automatically. Accordingly, describing a task is the key to whether the system is flexible. We design a workflow designer, a visual environment for capturing processes as workflows, and discuss its three-level model: 1) the data relationships are established through a product tree; 2) the process model is constructed based on a directed acyclic graph (DAG), where a set of workflow constructs, including Sequence, Loop, Merge, and Fork, can be composed with one another; 3) to reduce the modeling complexity of the mathematical formulas using a DAG, semantic modeling based on MathML is adopted. On top of that, we present how the CE3 data are processed with CEDPS.
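The DAG-based process model in step 2 can be illustrated by executing task nodes in a topological order of their dependencies. A minimal sketch with hypothetical CE3 pre-processing task names (not the CEDPS implementation):

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical pre-processing steps; each entry reads "task: set of prerequisite tasks".
dag = {
    "radiometric_correction": {"read_raw_frames"},
    "geometric_correction":   {"radiometric_correction"},
    "mosaic":                 {"geometric_correction"},
    "archive_product":        {"mosaic"},
    "read_raw_frames":        set(),
}

def run(task):
    # Placeholder for the actual processing step bound to this workflow node.
    print(f"running {task}")

# Execute tasks in an order consistent with the directed acyclic graph.
for task in TopologicalSorter(dag).static_order():
    run(task)
```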
NASA Technical Reports Server (NTRS)
Palusinski, O. A.; Allgyer, T. T.; Mosher, R. A.; Bier, M.; Saville, D. A.
1981-01-01
A mathematical model of isoelectric focusing at the steady state has been developed for an M-component system of electrochemically defined ampholytes. The model is formulated from fundamental principles describing the components' chemical equilibria, mass transfer resulting from diffusion and electromigration, and electroneutrality. The model consists of ordinary differential equations coupled with a system of algebraic equations. The model is implemented on a digital computer using FORTRAN-based simulation software. Computer simulation data are presented for several two-component systems showing the effects of varying the isoelectric points and dissociation constants of the constituents.
A reference model for space data system interconnection services
NASA Astrophysics Data System (ADS)
Pietras, John; Theis, Gerhard
1993-03-01
The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous in the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).
NASA Glenn Wind Tunnel Model Systems Criteria
NASA Technical Reports Server (NTRS)
Soeder, Ronald H.; Roeder, James W.; Stark, David E.; Linne, Alan A.
2004-01-01
This report describes criteria for the design, analysis, quality assurance, and documentation of models that are to be tested in the wind tunnel facilities at the NASA Glenn Research Center. This report presents two methods for computing model allowable stresses on the basis of the yield stress or ultimate stress, and it defines project procedures to test models in the NASA Glenn aeropropulsion facilities. Both customer-furnished and in-house model systems are discussed. The functions of the facility personnel and customers are defined. The format for the pretest meetings, safety permit process, and model reviews are outlined. The format for the model systems report (a requirement for each model that is to be tested at NASA Glenn) is described, the engineers responsible for developing the model systems report are listed, and the timetable for its delivery to the project engineer is given.
Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach
NASA Astrophysics Data System (ADS)
Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.
Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.
Designing Class Methods from Dataflow Diagrams
NASA Astrophysics Data System (ADS)
Shoval, Peretz; Kabeli-Shani, Judith
A method for designing the class methods of an information system is described. The method is part of FOOM - Functional and Object-Oriented Methodology. In the analysis phase of FOOM, two models defining the users' requirements are created: a conceptual data model - an initial class diagram; and a functional model - hierarchical OO-DFDs (object-oriented dataflow diagrams). Based on these models, a well-defined process of methods design is applied. First, the OO-DFDs are converted into transactions, i.e., system processes that support user tasks. The components and the process logic of each transaction are described in detail, using pseudocode. Then, each transaction is decomposed, according to well-defined rules, into class methods of various types: basic methods, application-specific methods and main transaction (control) methods. Each method is attached to a proper class; messages between methods express the process logic of each transaction. The methods are defined using pseudocode or message charts.
Digital data processing system dynamic loading analysis
NASA Technical Reports Server (NTRS)
Lagas, J. J.; Peterka, J. J.; Tucker, A. E.
1976-01-01
Simulation and analysis of the Space Shuttle Orbiter Digital Data Processing System (DDPS) are reported. The mated flight and postseparation flight phases of the space shuttle's approach and landing test configuration were modeled utilizing the Information Management System Interpretative Model (IMSIM) in a computerized simulation modeling of the ALT hardware, software, and workload. System requirements simulated for the ALT configuration were defined. Sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and the sensitivity analyses, a test design is described for adapting, parameterizing, and executing the IMSIM. Varying load and stress conditions for the model execution are given. The analyses of the computer simulation runs were documented as results, conclusions, and recommendations for DDPS improvements.
Space shuttle orbiter digital data processing system timing sensitivity analysis OFT ascent phase
NASA Technical Reports Server (NTRS)
Lagas, J. J.; Peterka, J. J.; Becker, D. A.
1977-01-01
Dynamic loads were investigated to provide simulation and analysis of the space shuttle orbiter digital data processing system (DDPS). Segments of the ascent test (OFT) configuration were modeled utilizing the information management system interpretive model (IMSIM) in a computerized simulation modeling of the OFT hardware and software workload. System requirements for simulation of the OFT configuration were defined, and sensitivity analyses determined areas of potential data flow problems in DDPS operation. Based on the defined system requirements and these sensitivity analyses, a test design was developed for adapting, parameterizing, and executing IMSIM, using varying load and stress conditions for model execution. Analyses of the computer simulation runs are documented, including results, conclusions, and recommendations for DDPS improvements.
NASA Astrophysics Data System (ADS)
Lu, Meilian; Yang, Dong; Zhou, Xing
2013-03-01
Based on an analysis of the requirements for conversation history storage in the CPM (Converged IP Messaging) system, a multi-view storage model and access methods for conversation history are proposed. The storage model separates logical views from physical storage and divides the storage into a system-managed region and a user-managed region. It simultaneously supports conversation views, system pre-defined views, and user-defined views of storage. The rationality and feasibility of the multi-view presentation, the physical storage model, and the access methods are validated through the implemented prototype. This demonstrates that the proposal has good scalability, which will help to optimize the physical data storage structure and improve storage performance.
Design, fabrication and test of a trace contaminant control system
NASA Technical Reports Server (NTRS)
1975-01-01
A trace contaminant control system was designed, fabricated, and evaluated to determine suitability of the system concept to future manned spacecraft. Two different models were considered. The load model initially required by the contract was based on the Space Station Prototype (SSP) general specifications SVSK HS4655, reflecting a change from a 9 man crew to a 6 man crew of the model developed in previous phases of this effort. Trade studies and a system preliminary design were accomplished based on this contaminant load, including computer analyses to define the optimum system configuration in terms of component arrangements, flow rates and component sizing. At the completion of the preliminary design effort a revised contaminant load model was developed for the SSP. Additional analyses were then conducted to define the impact of this new contaminant load model on the system configuration. A full scale foam-core mock-up with the appropriate SSP system interfaces was also fabricated.
2012-12-01
... system be implemented. In this study, we created a mathematical model to simulate accumulated savings under the proposed defined ... retirement system. ... lumbering recovery, it has reemerged as a potential austerity measure within the U.S. government. B. METHODOLOGY: We created a mathematical model of ...
Space Generic Open Avionics Architecture (SGOAA) standard specification
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1994-01-01
This standard establishes the Space Generic Open Avionics Architecture (SGOAA). The SGOAA includes a generic functional model, a processing structural model, and an architecture interface model. This standard defines the requirements for applying these models to the development of spacecraft core avionics systems. The purpose of this standard is to provide an umbrella set of requirements for applying the generic architecture models to the design of a specific avionics hardware/software processing system. This standard defines a generic set of system interface points to facilitate identification of critical services and interfaces. It establishes the requirement for applying appropriate low-level detailed implementation standards to those interface points. The generic core avionics functions and processing structural models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.
The Role of Intelligent Agents in Advanced Information Systems
NASA Technical Reports Server (NTRS)
Kerschberg, Larry
1999-01-01
In this presentation we review current ongoing research within George Mason University's (GMU) Center for Information Systems Integration and Evolution (CISE). We define characteristics of advanced information systems, discuss a family of agents for such systems, and show how GMU's domain modeling tools and techniques can be used to define a product-line architecture for configuring NASA missions. These concepts can be used to define advanced engineering environments such as those envisioned for NASA's new initiative for intelligent design and synthesis environments.
Using a System Model for Irrigation Management
NASA Astrophysics Data System (ADS)
de Souza, Leonardo; de Miranda, Eu; Sánchez-Román, Rodrigo; Orellana-González, Alba
2014-05-01
When using Systems Thinking, the variables involved in any process have dynamic behavior, governed by nonstatic relationships with the environment. This paper presents a system dynamics model developed to be used as an irrigation management tool. The model involves several parameters related to irrigation, such as soil characteristics, climate data, and crop physiological parameters. The water available to plants in the soil is defined as a stock in the model, and this soil water content defines the right moment to irrigate and the water depth to be applied. Crop water consumption reduces the soil water content; it is given by the potential evapotranspiration (ET), which acts as an outflow from the stock (soil water content). ET can be estimated by three methods: a) the FAO Penman-Monteith method (ETPM), b) the Hargreaves-Samani method (ETHS), based on air temperature data, and c) the Class A pan method (ETTCA). To validate the model, data from the states of Ceará and Minas Gerais, Brazil, were used, and the crop was bean. Keywords: System Dynamics, soil moisture content, agricultural water balance, irrigation scheduling.
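Of the three ET options, the Hargreaves-Samani method is the simplest to sketch, since it needs only air temperature and extraterrestrial radiation; below, Ra is assumed to be given already as equivalent evaporation in mm/day (a sketch of the standard 1985 equation, not the authors' model code):

```python
import math

def et0_hargreaves_samani(t_min, t_max, ra_mm_per_day):
    """Reference evapotranspiration (mm/day) from the Hargreaves-Samani (1985) equation."""
    t_mean = 0.5 * (t_min + t_max)
    return 0.0023 * ra_mm_per_day * (t_mean + 17.8) * math.sqrt(t_max - t_min)

# Example day: 20-32 degC, Ra of about 16 mm/day equivalent evaporation.
print(f"ET0 = {et0_hargreaves_samani(20.0, 32.0, 16.0):.2f} mm/day")
```

In a stock-and-flow formulation, this daily ET0 (scaled by a crop coefficient) would be the outflow drawn from the soil-water stock at each time step.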
A Structural Model Decomposition Framework for Hybrid Systems Diagnosis
NASA Technical Reports Server (NTRS)
Daigle, Matthew; Bregon, Anibal; Roychoudhury, Indranil
2015-01-01
Nowadays, a large number of practical systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete modes of behavior, each defined by a set of continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task very challenging. In this work, we present a new modeling and diagnosis framework for hybrid systems. Models are composed from sets of user-defined components using a compositional modeling approach. Submodels for residual generation are then generated for a given mode, and reconfigured efficiently when the mode changes. Efficient reconfiguration is established by exploiting causality information within the hybrid system models. The submodels can then be used for fault diagnosis based on residual generation and analysis. We demonstrate the efficient causality reassignment, submodel reconfiguration, and residual generation for fault diagnosis using an electrical circuit case study.
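The residual generation step can be illustrated with a mode-dependent predictor: the residual is the difference between a measured output and the output predicted by the submodel of the hypothesized mode, and a fault (or wrong mode hypothesis) is flagged when it exceeds a threshold. A minimal sketch with a hypothetical switched RC circuit, not the paper's case study or framework:

```python
# Hypothetical hybrid model: an RC circuit whose resistance depends on a discrete switch mode.
R = {"switch_open": 100.0, "switch_closed": 10.0}
C, DT, THRESHOLD = 1e-3, 1e-3, 0.05

def predict_voltage(v, u, mode):
    """One Euler step of the continuous submodel selected by the current mode."""
    return v + DT * (u - v) / (R[mode] * C)

def residual(v_measured_next, v, u, mode):
    """Difference between the measured next output and the mode-dependent prediction."""
    return v_measured_next - predict_voltage(v, u, mode)

# Nominal behaviour evaluated with the consistent mode hypothesis ...
r_ok = residual(predict_voltage(1.0, 5.0, "switch_closed"), 1.0, 5.0, "switch_closed")
# ... versus the same measurement evaluated with an inconsistent mode hypothesis.
r_bad = residual(predict_voltage(1.0, 5.0, "switch_closed"), 1.0, 5.0, "switch_open")

for name, r in [("consistent mode", r_ok), ("inconsistent mode", r_bad)]:
    print(f"{name}: residual = {r:+.4f}, flagged = {abs(r) > THRESHOLD}")
```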
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rizzo, Davinia B.; Blackburn, Mark R.
As systems become more complex, systems engineers rely on experts to inform decisions. There are few experts and limited data in many complex new technologies. This challenges systems engineers as they strive to plan activities such as qualification in an environment where technical constraints are coupled with the traditional cost, risk, and schedule constraints. Bayesian network (BN) models provide a framework to aid systems engineers in planning qualification efforts with complex constraints by harnessing expert knowledge and incorporating technical factors. By quantifying causal factors, a BN model can provide data about the risk of implementing a decision supplemented with information on driving factors. This allows a systems engineer to make informed decisions and examine "what-if" scenarios. This paper discusses a novel process developed to define a BN model structure based primarily on expert knowledge supplemented with extremely limited data (25 data sets or less). The model was developed to aid qualification decisions—specifically to predict the suitability of six degrees of freedom (6DOF) vibration testing for qualification. The process defined the model structure with expert knowledge in an unbiased manner. Finally, validation during the process execution and of the model provided evidence the process may be an effective tool in harnessing expert knowledge for a BN model.
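As an illustration of how expert-elicited probabilities propagate in such a model, here is a tiny discrete Bayesian network evaluated by direct enumeration; the nodes, probabilities, and their link to 6DOF test suitability are hypothetical, not the authors' model:

```python
from itertools import product

# Hypothetical two-parent network: fixture quality and test complexity drive
# the probability that 6DOF vibration testing is suitable for qualification.
p_fixture_good = 0.7
p_complexity_high = 0.4
# Expert-elicited CPT: P(suitable | fixture_good, complexity_high)
p_suitable = {
    (True,  True):  0.6, (True,  False): 0.9,
    (False, True):  0.2, (False, False): 0.5,
}

def joint(fixture, complexity, suitable):
    """Joint probability of one full assignment of the three variables."""
    p = p_fixture_good if fixture else 1 - p_fixture_good
    p *= p_complexity_high if complexity else 1 - p_complexity_high
    p_s = p_suitable[(fixture, complexity)]
    return p * (p_s if suitable else 1 - p_s)

# Posterior P(fixture_good | suitable = True), enumerating over the hidden variable.
num = sum(joint(True, c, True) for c in (True, False))
den = sum(joint(f, c, True) for f, c in product((True, False), repeat=2))
print(f"P(fixture_good | test judged suitable) = {num / den:.3f}")
```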
META-GLARE: a shell for CIG systems.
Bottrighi, Alessio; Rubrichi, Stefania; Terenziani, Paolo
2015-01-01
In the last twenty years, many different approaches to dealing with Computer-Interpretable clinical Guidelines (CIGs) have been developed, each one proposing its own representation formalism (mostly based on the Task-Network Model) and execution engine. We propose META-GLARE, a shell for easily defining new CIG systems. Using META-GLARE, CIG system designers can easily define their own systems (basically by defining their representation language) with minimal programming effort. META-GLARE is thus a flexible and powerful vehicle for research about CIGs, since it supports easy and fast prototyping of new CIG systems.
Content-Addressable Memory Storage by Neural Networks: A General Model and Global Liapunov Method,
1988-03-01
... point exists. Liapunov functions were also described for Volterra-Lotka systems whose off-diagonal terms are relatively small (Kilmer, 1972 ... masking field, bidirectional associative memory, Volterra-Lotka, Gilpin-Ayala, and Eigen-Schuster models. The Cohen-Grossberg model thus defines a general ...
Arthur, J.K.; Taylor, R.E.
1986-01-01
As part of the Gulf Coast Regional Aquifer System Analysis (GC RASA) study, data from 184 geophysical well logs were used to define the geohydrologic framework of the Mississippi embayment aquifer system in Mississippi for flow model simulation. Five major aquifers of Eocene and Paleocene age were defined within this aquifer system in Mississippi. A computer data storage system was established to assimilate the information obtained from the geophysical logs. Computer programs were developed to manipulate the data to construct geologic sections and structure maps. Data from the storage system will be input to a five-layer, three-dimensional, finite-difference digital computer model that is used to simulate the flow dynamics in the five major aquifers of the Mississippi embayment aquifer system.
Modelling safety of multistate systems with ageing components
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kołowrocki, Krzysztof; Soszyńska-Budny, Joanna
An innovative approach to safety analysis of multistate ageing systems is presented. Basic notions of ageing multistate systems safety analysis are introduced. The system components and the system multistate safety functions are defined. The mean values and variances of the multistate systems' lifetimes in the safety state subsets and the mean values of their lifetimes in the particular safety states are defined. The multistate system risk function and the moment of exceeding by the system the critical safety state are introduced. Applications of the proposed multistate system safety models to the evaluation and prediction of the safety characteristics of the consecutive "m out of n: F" system are presented as well.
A Model of Workflow Composition for Emergency Management
NASA Astrophysics Data System (ADS)
Xin, Chen; Bin-ge, Cui; Feng, Zhang; Xue-hui, Xu; Shan-shan, Fu
Commonly used workflow technology is not flexible enough in dealing with concurrent emergency situations. The paper proposes a novel model for defining emergency plans, in which workflow segments appear as a constituent part. A formal abstraction, which contains four operations, is defined to compose workflow segments under constraint rules. The software system for business process resource construction and composition is implemented and integrated into the Emergency Plan Management Application System.
A parsimonious land data assimilation system for the SMAP/GPM satellite era
USDA-ARS?s Scientific Manuscript database
Land data assimilation systems typically require complex parameterizations in order to: define required observation operators, quantify observing/forecasting errors and calibrate a land surface assimilation model. These parameters are commonly defined in an arbitrary manner and, if poorly specified,...
Models for the indices of thermal comfort
Adrian, Streinu-Cercel; Sergiu, Costoiu; Maria, Mârza; Anca, Streinu-Cercel; Monica, Mârza
2008-01-01
The current paper proposes the analysis and extended formulation required for establishing decisions in the management of the national medical system from the point of view of quality and efficiency, covering: conceiving models for the indices of thermal comfort, defining the predicted mean vote (on the thermal sensation scale) "PMV", defining the metabolism "M", defining heat transfer between the human body and the environment, defining the predicted percent of dissatisfied people "PPD", and defining all indices of thermal comfort. PMID:20108461
LINEAR - DERIVATION AND DEFINITION OF A LINEAR AIRCRAFT MODEL
NASA Technical Reports Server (NTRS)
Duke, E. L.
1994-01-01
The Derivation and Definition of a Linear Model program, LINEAR, provides the user with a powerful and flexible tool for the linearization of aircraft aerodynamic models. LINEAR was developed to provide a standard, documented, and verified tool to derive linear models for aircraft stability analysis and control law design. Linear system models define the aircraft system in the neighborhood of an analysis point and are determined by the linearization of the nonlinear equations defining vehicle dynamics and sensors. LINEAR numerically determines a linear system model using nonlinear equations of motion and a user supplied linear or nonlinear aerodynamic model. The nonlinear equations of motion used are six-degree-of-freedom equations with stationary atmosphere and flat, nonrotating earth assumptions. LINEAR is capable of extracting both linearized engine effects, such as net thrust, torque, and gyroscopic effects and including these effects in the linear system model. The point at which this linear model is defined is determined either by completely specifying the state and control variables, or by specifying an analysis point on a trajectory and directing the program to determine the control variables and the remaining state variables. The system model determined by LINEAR consists of matrices for both the state and observation equations. The program has been designed to provide easy selection of state, control, and observation variables to be used in a particular model. Thus, the order of the system model is completely under user control. Further, the program provides the flexibility of allowing alternate formulations of both the state and observation equations. Data describing the aircraft and the test case is input to the program through a terminal or formatted data files. All data can be modified interactively from case to case. The aerodynamic model can be defined in two ways: a set of nondimensional stability and control derivatives for the flight point of interest, or a full non-linear aerodynamic model as used in simulations. LINEAR is written in FORTRAN and has been implemented on a DEC VAX computer operating under VMS with a virtual memory requirement of approximately 296K of 8 bit bytes. Both an interactive and batch version are included. LINEAR was developed in 1988.
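The central operation LINEAR performs, numerically determining a linear system model about an analysis point, amounts to computing Jacobians of the nonlinear dynamics. A minimal sketch using central differences on a toy two-state model (not the FORTRAN program or its aircraft equations of motion):

```python
import numpy as np

def f(x, u):
    """Toy nonlinear short-period-like dynamics: x = [alpha, q], u = [elevator]."""
    alpha, q = x
    return np.array([q - 0.8 * np.sin(alpha) + 0.1 * u[0],
                     -2.0 * alpha - 0.5 * q + 4.0 * u[0]])

def linearize(f, x0, u0, eps=1e-6):
    """Central-difference Jacobians A = df/dx, B = df/du at the analysis point (x0, u0)."""
    n, m = len(x0), len(u0)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m)
        du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

# Linear model x_dot ~= A (x - x0) + B (u - u0) in the neighborhood of the analysis point.
A, B = linearize(f, np.array([0.05, 0.0]), np.array([0.0]))
print("A =\n", A, "\nB =\n", B)
```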
Object-oriented analysis and design of a health care management information system.
Krol, M; Reich, D L
1999-04-01
We have created a prototype for a universal object-oriented model of a health care system compatible with the object-oriented approach used in version 3.0 of the HL7 standard for communication messages. A set of three models has been developed: (1) the Object Model describes the hierarchical structure of objects in a system--their identity, relationships, attributes, and operations; (2) the Dynamic Model represents the sequence of operations in time as a collection of state diagrams for object classes in the system; and (3) the Functional Diagram represents the transformation of data within a system by means of data flow diagrams. Within these models, we have defined major object classes of health care participants and their subclasses, associations, attributes and operators, states, and behavioral scenarios. We have also defined the major processes and subprocesses. The top-down design approach allows use, reuse, and cloning of standard components.
Pulsar timing and general relativity
NASA Technical Reports Server (NTRS)
Backer, D. C.; Hellings, R. W.
1986-01-01
Techniques are described for accounting for relativistic effects in the analysis of pulsar signals. Design features of instrumentation used to achieve millisecond accuracy in the signal measurements are discussed. The accuracy of the data permits modeling the pulsar physical characteristics from the natural glitches in the emissions. Relativistic corrections are defined for adjusting for differences between the pulsar motion in its spacetime coordinate system relative to the terrestrial coordinate system, the earth's motion, and the gravitational potentials of solar system bodies. Modifications of the model to allow for a binary pulsar system are outlined, including treatment of the system as a point mass. Finally, a quadrupole model is presented for gravitational radiation and techniques are defined for using pulsars in the search for gravitational waves.
Using A Model-Based Systems Engineering Approach For Exploration Medical System Development
NASA Technical Reports Server (NTRS)
Hanson, A.; Mindock, J.; McGuire, K.; Reilly, J.; Cerro, J.; Othon, W.; Rubin, D.; Urbina, M.; Canga, M.
2017-01-01
NASA's Human Research Program's Exploration Medical Capabilities (ExMC) element is defining the medical system needs for exploration class missions. ExMC's Systems Engineering (SE) team will play a critical role in successful design and implementation of the medical system into exploration vehicles. The team's mission is to "Define, develop, validate, and manage the technical system design needed to implement exploration medical capabilities for Mars and test the design in a progression of proving grounds." Development of the medical system is being conducted in parallel with exploration mission architecture and vehicle design development. Successful implementation of the medical system in this environment will require a robust systems engineering approach to enable technical communication across communities to create a common mental model of the emergent engineering and medical systems. Model-Based Systems Engineering (MBSE) improves shared understanding of system needs and constraints between stakeholders and offers a common language for analysis. The ExMC SE team is using MBSE techniques to define operational needs, decompose requirements and architecture, and identify medical capabilities needed to support human exploration. Systems Modeling Language (SysML) is the specific language the SE team is utilizing, within an MBSE approach, to model the medical system functional needs, requirements, and architecture. Modeling methods are being developed through the practice of MBSE within the team, and tools are being selected to support meta-data exchange as integration points to other system models are identified. Use of MBSE is supporting the development of relationships across disciplines and NASA Centers to build trust and enable teamwork, enhance visibility of team goals, foster a culture of unbiased learning and serving, and be responsive to customer needs. The MBSE approach to medical system design offers a paradigm shift toward greater integration between vehicle and the medical system and directly supports the transition of Earth-reliant ISS operations to the Earth-independent operations envisioned for Mars. Here, we describe the methods and approach to building this integrated model.
Automatic programming of simulation models
NASA Technical Reports Server (NTRS)
Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.
1990-01-01
The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on the application of an element of rapid prototyping, or automatic programming, to assist the modeler in defining the problem specification. Then, once the problem specification has been defined, an automatic code generator is used to write the simulation code. The following two domains were selected for evaluating the concepts of software engineering for discrete event simulation: the manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS) system; (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.
Defining the pharmaceutical system to support proactive drug safety.
Lewis, Vicki R; Hernandez, Angelica; Meadors, Margaret
2013-02-01
The military, aviation, nuclear, and transportation industries have transformed their safety records by using a systems approach to safety and risk mitigation. This article creates a preliminary model of the U.S. pharmaceutical system using available literature including academic publications, policies, and guidelines established by regulatory bodies and drug industry trade publications. Drawing from the current literature, the goals, roles, and individualized processes of pharmaceutical subsystems will be defined. Defining the pharmaceutical system provides a vehicle to assess and address known problems within the system, and provides a means to conduct proactive risk analyses, which would create significant pharmaceutical safety advancement.
Models for discrete-time self-similar vector processes with application to network traffic
NASA Astrophysics Data System (ADS)
Lee, Seungsin; Rao, Raghuveer M.; Narasimha, Rajesh
2003-07-01
The paper defines self-similarity for vector processes by employing the discrete-time continuous-dilation operation which has successfully been used previously by the authors to define 1-D discrete-time stochastic self-similar processes. To define self-similarity of vector processes, it is required to consider the cross-correlation functions between different 1-D processes as well as the autocorrelation function of each constituent 1-D process in it. System models to synthesize self-similar vector processes are constructed based on the definition. With these systems, it is possible to generate self-similar vector processes from white noise inputs. An important aspect of the proposed models is that they can be used to synthesize various types of self-similar vector processes by choosing proper parameters. Additionally, the paper presents evidence of vector self-similarity in two-channel wireless LAN data and applies the aforementioned systems to simulate the corresponding network traffic traces.
From the experience of development of composite materials with desired properties
NASA Astrophysics Data System (ADS)
Garkina, I. A.; Danilov, A. M.
2017-04-01
Drawing on experience in the development of composite materials with desired properties, an algorithm for the synthesis of construction materials is given on the basis of their representation as a complex system. The possibility of creating a composite and implementing the original technical task is determined at the cognitive modeling stage. On the basis of the developed cognitive map, hierarchical structures of quality criteria are defined; from these, the corresponding block diagrams of the system are specified for each identified large-scale level. On the basis of solving single-criterion optimization problems, and using the optimum values found, the multi-criteria problem is formalized and solved (the optimum organization and properties of the system are determined). The emphasis is on methodological aspects of mathematical modeling (construction of generalized and partial models to optimize the properties and structure of materials, including those based on the concept of systemic homeostasis).
Theory of constraints for publicly funded health systems.
Sadat, Somayeh; Carter, Michael W; Golden, Brian
2013-03-01
Originally developed in the context of publicly traded for-profit companies, theory of constraints (TOC) improves system performance through leveraging the constraint(s). While the theory seems to be a natural fit for resource-constrained publicly funded health systems, there is a lack of literature addressing the modifications required to adopt TOC and define the goal and performance measures. This paper develops a system dynamics representation of the classical TOC's system-wide goal and performance measures for publicly traded for-profit companies, which forms the basis for developing a similar model for publicly funded health systems. The model is then expanded to include some of the factors that affect system performance, providing a framework to apply TOC's process of ongoing improvement in publicly funded health systems. Future research is required to more accurately define the factors affecting system performance and populate the model with evidence-based estimates for various parameters in order to use the model to guide TOC's process of ongoing improvement.
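For reference, classical TOC for for-profit firms scores a system by throughput (T), inventory/investment (I), and operating expense (OE), with the constraint setting system throughput. The sketch below uses illustrative numbers and a for-profit-style "revenue per unit", which is precisely the element the paper argues must be redefined for publicly funded health systems:

```python
# Capacities (units/week) of sequential stages; the smallest one is the constraint.
stages = {"admission": 120, "diagnostics": 80, "treatment": 95}

def system_performance(stages, revenue_per_unit, operating_expense):
    throughput_units = min(stages.values())            # the constraint limits the whole system
    throughput = throughput_units * revenue_per_unit    # classical TOC throughput (T)
    return throughput, throughput - operating_expense   # (T, net result = T - OE)

before = system_performance(stages, revenue_per_unit=50.0, operating_expense=3500.0)
# Elevate the constraint (e.g., add diagnostic capacity) and re-evaluate the measures.
stages["diagnostics"] = 100
after = system_performance(stages, revenue_per_unit=50.0, operating_expense=3800.0)

print(f"before: T = {before[0]:.0f}, T - OE = {before[1]:.0f}")
print(f"after:  T = {after[0]:.0f}, T - OE = {after[1]:.0f}")
```

Note that after the diagnostics stage is elevated, the constraint moves to treatment; in the paper's adaptation the monetary throughput term would be replaced by a publicly funded system's goal measure.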
Cannon, Robert C; Gleeson, Padraig; Crook, Sharon; Ganapathy, Gautham; Marin, Boris; Piasini, Eugenio; Silver, R Angus
2014-01-01
Computational models are increasingly important for studying complex neurophysiological systems. As scientific tools, it is essential that such models can be reproduced and critically evaluated by a range of scientists. However, published models are currently implemented using a diverse set of modeling approaches, simulation tools, and computer languages making them inaccessible and difficult to reproduce. Models also typically contain concepts that are tightly linked to domain-specific simulators, or depend on knowledge that is described exclusively in text-based documentation. To address these issues we have developed a compact, hierarchical, XML-based language called LEMS (Low Entropy Model Specification), that can define the structure and dynamics of a wide range of biological models in a fully machine readable format. We describe how LEMS underpins the latest version of NeuroML and show that this framework can define models of ion channels, synapses, neurons and networks. Unit handling, often a source of error when reusing models, is built into the core of the language by specifying physical quantities in models in terms of the base dimensions. We show how LEMS, together with the open source Java and Python based libraries we have developed, facilitates the generation of scripts for multiple neuronal simulators and provides a route for simulator free code generation. We establish that LEMS can be used to define models from systems biology and map them to neuroscience-domain specific simulators, enabling models to be shared between these traditionally separate disciplines. LEMS and NeuroML 2 provide a new, comprehensive framework for defining computational models of neuronal and other biological systems in a machine readable format, making them more reproducible and increasing the transparency and accessibility of their underlying structure and properties.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arion is a library and tool set that enables researchers to holistically define test system models. Defining a complex system for testing an algorithm or control requires expertise across multiple domains. Simulating a complex system requires the integration of multiple simulators and test hardware, each with their own specification languages and concepts, which demands an extensive set of knowledge and capabilities. Arion was developed to alleviate this challenge. Arion is a set of Java libraries that abstracts the concepts of the supported simulators into a cohesive model language, allowing users to build models to their needed level of fidelity and expertise. Arion is also a software tool that translates the user's model back into the specification languages of the simulators and test hardware needed for execution.
Meta II: Multi-Model Language Suite for Cyber Physical Systems
2013-03-01
AVM META) projects have developed tools for designing cyber physical (CPS) (or mechatronic) systems. These systems are increasingly complex, take much ... Exemplified by modern amphibious and ground military ... and parametric interface of Simulink models and defines associations with CyPhy components and component interfaces. 2. Embedded Systems Modeling
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-26
... some days and this does not appear to be an error in the modeling system." [Footnote 2: Commenter referenced ...] modeling with a readily available modeling system (since construction of a complete modeling system from ... from WildEarth Guardians. Comment No. 1--The commenter stated that EPA inappropriately defined the term ...
NASA Astrophysics Data System (ADS)
Gülerce, Zeynep; Buğra Soyman, Kadir; Güner, Barış; Kaymakci, Nuretdin
2017-12-01
This contribution provides an updated planar seismic source characterization (SSC) model to be used in the probabilistic seismic hazard assessment (PSHA) for Istanbul. It defines planar rupture systems for the four main segments of the North Anatolian fault zone (NAFZ) that are critical for the PSHA of Istanbul: segments covering the rupture zones of the 1999 Kocaeli and Düzce earthquakes, central Marmara, and Ganos/Saros segments. In each rupture system, the source geometry is defined in terms of fault length, fault width, fault plane attitude, and segmentation points. Activity rates and the magnitude recurrence models for each rupture system are established by considering geological and geodetic constraints and are tested based on the observed seismicity that is associated with the rupture system. Uncertainty in the SSC model parameters (e.g., b value, maximum magnitude, slip rate, weights of the rupture scenarios) is considered, whereas the uncertainty in the fault geometry is not included in the logic tree. To acknowledge the effect of earthquakes that are not associated with the defined rupture systems on the hazard, a background zone is introduced and the seismicity rates in the background zone are calculated using smoothed-seismicity approach. The state-of-the-art SSC model presented here is the first fully documented and ready-to-use fault-based SSC model developed for the PSHA of Istanbul.
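Magnitude recurrence for a rupture system of the kind described here is often represented with a truncated exponential (Gutenberg-Richter) model; the sketch below computes annual exceedance rates from a b-value, an activity rate at the minimum magnitude, and Mmax, using illustrative parameters rather than the paper's values:

```python
import numpy as np

def truncated_gr_rate(m, rate_mmin, b, m_min, m_max):
    """Annual rate of earthquakes with magnitude >= m under the truncated exponential model."""
    beta = b * np.log(10.0)
    num = np.exp(-beta * (m - m_min)) - np.exp(-beta * (m_max - m_min))
    den = 1.0 - np.exp(-beta * (m_max - m_min))
    return np.where(m > m_max, 0.0, rate_mmin * num / den)

# Illustrative rupture-system parameters (not the study's values).
mags = np.arange(5.0, 7.75, 0.25)
rates = truncated_gr_rate(mags, rate_mmin=0.2, b=1.0, m_min=5.0, m_max=7.5)
for m, r in zip(mags, rates):
    extra = f" (return period ~ {1.0 / r:.0f} yr)" if r > 0 else ""
    print(f"M >= {m:.2f}: {r:.4f} /yr{extra}")
```

In a fault-based SSC model, rate_mmin would itself be constrained by the geologically or geodetically derived slip rate of the rupture system rather than chosen freely.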
Review of Soil Models and Their Implementation in Multibody System Algorithms
2012-02-01
... models for use with ABAQUS. The constitutive models of the user-defined materials can be programmed in the user subroutine UMAT. Many user-defined ... mechanical characteristics of mildly or moderately expansive unsaturated soils. As originally proposed by Alonso, utilizing a critical state framework ... review of some of these programs is presented. ABAQUS is a popular FE analysis program that contains a wide variety of material models and ...
Guo, Xiufang; Das, Mainak; Rumsey, John; Gonzalez, Mercedes; Stancescu, Maria; Hickman, James
2010-12-01
To date, the coculture of motoneurons (MNs) and skeletal muscle in a defined in vitro system has only been described in one study and that was between rat MNs and rat skeletal muscle. No in vitro studies have demonstrated human MN to rat muscle synapse formation, although numerous studies have attempted to implant human stem cells into rat models to determine if they could be of therapeutic use in disease or spinal injury models, although with little evidence of neuromuscular junction (NMJ) formation. In this report, MNs differentiated from human spinal cord stem cells, together with rat skeletal myotubes, were used to build a coculture system to demonstrate that NMJ formation between human MNs and rat skeletal muscles is possible. The culture was characterized by morphology, immunocytochemistry, and electrophysiology, while NMJ formation was demonstrated by immunocytochemistry and videography. This defined system provides a highly controlled reproducible model for studying the formation, regulation, maintenance, and repair of NMJs. The in vitro coculture system developed here will be an important model system to study NMJ development, the physiological and functional mechanism of synaptic transmission, and NMJ- or synapse-related disorders such as amyotrophic lateral sclerosis, as well as for drug screening and therapy design.
A general method for radio spectrum efficiency defining
NASA Astrophysics Data System (ADS)
Ramadanovic, Ljubomir M.
1986-08-01
A general method for radio spectrum efficiency defining is proposed. Although simple it can be applied to various radio services. The concept of spectral elements, as information carriers, is introduced to enable the organization of larger spectral spaces - radio network models - characteristic for a particular radio network. The method is applied to some radio network models, concerning cellular radio telephone systems and digital radio relay systems, to verify its unified approach capability. All discussed radio services operate continuously.
Mathematical Methods of System Analysis in Construction Materials
NASA Astrophysics Data System (ADS)
Garkina, Irina; Danilov, Alexander
2017-10-01
System attributes of construction materials are defined: the complexity of the object, the integrity of the set of elements, the existence of essential, stable relations between elements that define the integrative properties of the system, the existence of structure, etc. On the basis of cognitive modelling of materials as complex systems (intensive and extensive properties; the operating parameters) and construction of the cognitive map, a hierarchical modular structure of quality criteria is built; it serves as a basis for preparing the specification for developing the material (the required organization and properties). Proceeding from the modern paradigm of materials development (a model for the statement of problems and their solutions), the levels and modules in the structure of the material are specified. Using the principles of system analysis, this allows the technological process to be considered as a complex system consisting of elements at the chosen level of specification, from the atomic level up to a separate process. Each element of the system, depending on the objective, is considered as a separate system with more detailed levels of decomposition. First, semantic and qualitative analyses of the object are carried out (the research objective, the decomposition levels, the separate elements, and the relations between them are identified). Then the available knowledge is formalized in the form of mathematical models (structural identification), and the relations between input and output parameters are determined (parametric identification). Hierarchical structures of quality criteria are built for each selected level, and on these the corresponding hierarchical structures of the system (the material) are built. Regularities of structure formation and of the formation of properties are considered, generally at levels from the micro- to the macrostructure. The mathematical model of the material is represented as a set of models corresponding to the individual criteria, by which the separate modules and their levels (the mathematical description and the solution algorithm) are defined. Adequacy is established as the agreement of modelling results with experimental data; it is determined by the level of knowledge of the process and the validity of the accepted assumptions. The global criterion of material quality is considered as a set of individual criteria (properties). Synthesis of the material is carried out on the basis of single-criterion optimization for each of the chosen individual criteria, and the results of the single-criterion optimizations are used in multi-criteria optimization. Methods for developing materials as single-purpose or multi-purpose (including contradictory) systems are indicated. A scheme for the synthesis of composite materials as complex systems is developed. This system approach was used effectively in the synthesis of composite materials with special properties.
Development of a Water Recovery System Resource Tracking Model
NASA Technical Reports Server (NTRS)
Chambliss, Joe; Stambaugh, Imelda; Sarguishm, Miriam; Shull, Sarah; Moore, Michael
2014-01-01
A simulation model has been developed to track water resources in an exploration vehicle using regenerative life support (RLS) systems. The model integrates the functions of all the vehicle components that affect the processing and recovery of water during simulated missions. The approach used in developing the model results in the Resource Tracking Model (RTM) being part of a complete vehicle simulation that can be used in real-time mission studies. Performance data for the variety of components in the RTM is focused on water processing and has been defined based on the most recent information available for the technology of each component. This paper describes the process of defining the RLS system to be modeled, the way the modeling environment was selected, and how the model has been implemented. Results showing how the variety of RLS components exchange water are provided in a set of test cases.
The development of a classification system for maternity models of care.
Donnolley, Natasha; Butler-Henderson, Kerryn; Chapman, Michael; Sullivan, Elizabeth
2016-08-01
A lack of standard terminology or means to identify and define models of maternity care in Australia has prevented accurate evaluations of outcomes for mothers and babies in different models of maternity care. As part of the Commonwealth-funded National Maternity Data Development Project, a classification system was developed utilising a data set specification that defines characteristics of models of maternity care. The Maternity Care Classification System or MaCCS was developed using a participatory action research design that built upon the published and grey literature. The study identified the characteristics that differentiate models of care and classifies models into eleven different Major Model Categories. The MaCCS will enable individual health services, local health districts (networks), jurisdictional and national health authorities to make better informed decisions for planning, policy development and delivery of maternity services in Australia. © The Author(s) 2016.
Conceptual models of information processing
NASA Technical Reports Server (NTRS)
Stewart, L. J.
1983-01-01
The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.
Static shape control for flexible structures
NASA Technical Reports Server (NTRS)
Rodriguez, G.; Scheid, R. E., Jr.
1986-01-01
An integrated methodology is described for defining static shape control laws for large flexible structures. The techniques include modeling, identifying and estimating the control laws of distributed systems characterized in terms of infinite dimensional state and parameter spaces. The models are expressed as interconnected elliptic partial differential equations governing a range of static loads, with the capability of analyzing electromagnetic fields around antenna systems. A second-order analysis is carried out for statistical errors, and model parameters are determined by maximizing an appropriately defined likelihood functional which adjusts the model to observational data. The parameter estimates are derived from the conditional mean of the observational data, resulting in a least squares superposition of shape functions obtained from the structural model.
NASA Technical Reports Server (NTRS)
Watson, Michael D.; Kelley, Gary W.
2012-01-01
The Department of Defense (DoD) defined System Operational Effectiveness (SOE) model provides an exceptional framework for an affordable approach to the development and operation of space launch vehicles and their supporting infrastructure. The SOE model provides a focal point from which to direct and measure technical effectiveness and process efficiencies of space launch vehicles. The application of the SOE model to a space launch vehicle's development and operation effort leads to very specific approaches and measures that require consideration during the design phase. This paper provides a mapping of the SOE model to the development of space launch vehicles for human exploration by addressing the SOE model key points of measurement including System Performance, System Availability, Technical Effectiveness, Process Efficiency, System Effectiveness, Life Cycle Cost, and Affordable Operational Effectiveness. In addition, the application of the SOE model to the launch vehicle development process is defined providing the unique aspects of space launch vehicle production and operations in lieu of the traditional broader SOE context that examines large quantities of fielded systems. The tailoring and application of the SOE model to space launch vehicles provides some key insights into the operational design drivers, capability phasing, and operational support systems.
Maximally Expressive Modeling of Operations Tasks
NASA Technical Reports Server (NTRS)
Jaap, John; Richardson, Lea; Davis, Elizabeth
2002-01-01
Planning and scheduling systems organize "tasks" into a timeline or schedule. The tasks are defined within the scheduling system in logical containers called models. The dictionary might define a model of this type as "a system of things and relations satisfying a set of rules that, when applied to the things and relations, produce certainty about the tasks that are being modeled." One challenging domain for a planning and scheduling system is the operation of on-board experiments for the International Space Station. In these experiments, the equipment used is among the most complex hardware ever developed, the information sought is at the cutting edge of scientific endeavor, and the procedures are intricate and exacting. Scheduling is made more difficult by a scarcity of station resources. The models to be fed into the scheduler must describe both the complexity of the experiments and procedures (to ensure a valid schedule) and the flexibilities of the procedures and the equipment (to effectively utilize available resources). Clearly, scheduling International Space Station experiment operations calls for a "maximally expressive" modeling schema.
Goal-Function Tree Modeling for Systems Engineering and Fault Management
NASA Technical Reports Server (NTRS)
Johnson, Stephen B.; Breckenridge, Jonathan T.
2013-01-01
The draft NASA Fault Management (FM) Handbook (2012) states that Fault Management (FM) is a "part of systems engineering", and that it "demands a system-level perspective" (NASA-HDBK-1002, 7). What, exactly, is the relationship between systems engineering and FM? To NASA, systems engineering (SE) is "the art and science of developing an operable system capable of meeting requirements within often opposed constraints" (NASA/SP-2007-6105, 3). Systems engineering starts with the elucidation and development of requirements, which set the goals that the system is to achieve. To achieve these goals, the systems engineer typically defines functions, and the functions in turn are the basis for design trades to determine the best means to perform the functions. System Health Management (SHM), by contrast, defines "the capabilities of a system that preserve the system's ability to function as intended" (Johnson et al., 2011, 3). Fault Management, in turn, is the operational subset of SHM, which detects current or future failures, and takes operational measures to prevent or respond to these failures. Failure, in turn, is the "unacceptable performance of intended function." (Johnson 2011, 605) Thus the relationship of SE to FM is that SE defines the functions and the design to perform those functions to meet system goals and requirements, while FM detects the inability to perform those functions and takes action. SHM and FM are in essence "the dark side" of SE. For every function to be performed (SE), there is the possibility that it is not successfully performed (SHM); FM defines the means to operationally detect and respond to this lack of success. We can also describe this in terms of goals: for every goal to be achieved, there is the possibility that it is not achieved; FM defines the means to operationally detect and respond to this inability to achieve the goal. This brief description of the relationships between SE, SHM, and FM provides hints toward a modeling approach that provides formal connectivity between the nominal (SE) and off-nominal (SHM and FM) aspects of functions and designs. This paper describes a formal modeling approach to the initial phases of the development process that integrates the nominal and off-nominal perspectives in a model that unites SE goals and functions with the failure to achieve goals and functions (SHM/FM).
A model of cloud application assignments in software-defined storages
NASA Astrophysics Data System (ADS)
Bolodurina, Irina P.; Parfenov, Denis I.; Polezhaev, Petr N.; Shukhman, Alexander E.
2017-01-01
The aim of this study is to analyze the structure and mechanisms of interaction of typical cloud applications and to suggest approaches to optimizing their placement in storage systems. In this paper, we describe a generalized model of cloud applications including three basic layers: a model of the application, a model of the service, and a model of the resource. The distinctive feature of the suggested model is that it analyzes cloud resources both from the user's point of view and from the point of view of a software-defined infrastructure of the virtual data center (DC). The innovative character of this model lies in describing, at the same time, the placement of application data and the state of the virtual environment, taking into account the network topology. The model of software-defined storage has been developed as a submodel within the resource model. This model allows implementing the algorithm for controlling cloud application assignments in software-defined storages. Experimental research showed that the algorithm decreases cloud application response time and increases performance in processing user requests. The use of software-defined data storages allows a decrease in the number of physical storage devices, which demonstrates the efficiency of our algorithm.
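The abstract does not spell out the assignment algorithm itself; the following sketch illustrates one plausible greedy placement of application data volumes onto software-defined storage pools that weighs estimated response time against pool utilization. All pool names, capacities, and latency figures are invented.

```python
# Hypothetical greedy assignment of application data volumes to storage pools
# in a software-defined storage infrastructure; all numbers are illustrative.
nodes = {
    "ssd-pool-1": {"capacity_gb": 500, "base_latency_ms": 0.4, "used_gb": 0},
    "ssd-pool-2": {"capacity_gb": 500, "base_latency_ms": 0.5, "used_gb": 0},
    "hdd-pool-1": {"capacity_gb": 4000, "base_latency_ms": 4.0, "used_gb": 0},
}

# (volume name, size in GB, access intensity used to weight latency)
volumes = [("app-db", 120, 1.0), ("app-logs", 300, 0.2), ("app-cache", 40, 0.8)]

def cost(node, size_gb, intensity):
    info = nodes[node]
    load = (info["used_gb"] + size_gb) / info["capacity_gb"]
    # Estimated response time grows with base latency and with pool utilization.
    return intensity * info["base_latency_ms"] * (1.0 + load)

assignment = {}
for name, size_gb, intensity in sorted(volumes, key=lambda v: -v[1] * v[2]):
    feasible = [n for n, i in nodes.items() if i["used_gb"] + size_gb <= i["capacity_gb"]]
    best = min(feasible, key=lambda n: cost(n, size_gb, intensity))
    nodes[best]["used_gb"] += size_gb
    assignment[name] = best

print(assignment)
```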
Synthesis and Control of Flexible Systems with Component-Level Uncertainties
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Lim, Kyong B.
2009-01-01
An efficient and computationally robust method for synthesis of component dynamics is developed. The method defines the interface forces/moments as feasible vectors in transformed coordinates to ensure that connectivity requirements of the combined structure are met. The synthesized system is then defined in a transformed set of feasible coordinates. The simplicity of form is exploited to effectively deal with modeling parametric and non-parametric uncertainties at the substructure level. Uncertainty models of reasonable size and complexity are synthesized for the combined structure from those in the substructure models. In particular, we address frequency and damping uncertainties at the component level. The approach first considers the robustness of synthesized flexible systems. It is then extended to deal with non-synthesized dynamic models with component-level uncertainties by projecting uncertainties to the system level. A numerical example is given to demonstrate the feasibility of the proposed approach.
NASA Technical Reports Server (NTRS)
Gettman, Chang-Ching L.; Adams, Neil; Bedrossian, Nazareth; Valavani, Lena
1993-01-01
This paper demonstrates an approach to nonlinear control system design that uses linearization by state feedback to allow faster maneuvering of payloads by the Shuttle Remote Manipulator System (SRMS). A nonlinear feedback law is defined to cancel the nonlinear plant dynamics so that a linear controller can be designed for the SRMS. First, a nonlinear design model was generated via SIMULINK. This design model included nonlinear arm dynamics derived from the Lagrangian approach, a linearized servo model, and a linearized gearbox model. The current SRMS position hold controller was implemented on this system. Next, a trajectory was defined using a rigid body kinematics SRMS tool, KRMS. The maneuver was simulated. Finally, higher bandwidth controllers were developed. Results of the new controllers were compared with the existing SRMS automatic control modes for the Space Station Freedom Mission Build 4 Payload extended on the SRMS.
Generation of animation sequences of three dimensional models
NASA Technical Reports Server (NTRS)
Poi, Sharon (Inventor); Bell, Brad N. (Inventor)
1990-01-01
The invention is directed toward a method and apparatus for generating an animated sequence through the movement of three-dimensional graphical models. A plurality of pre-defined graphical models are stored and manipulated in response to interactive commands or by means of a pre-defined command file. The models may be combined as part of a hierarchical structure to represent physical systems without the need to create a separate model which represents the combined system. System motion is simulated through the introduction of translation, rotation and scaling parameters upon a model within the system. The motion is then transmitted down through the system hierarchy of models in accordance with hierarchical definitions and joint movement limitations. The present invention also calls for a method of editing hierarchical structure in response to interactive commands or a command file such that a model may be included, deleted, copied or moved within multiple system model hierarchies. The present invention also calls for the definition of multiple viewpoints or cameras which may exist as part of a system hierarchy or as an independent camera. The simulated movement of the models and systems is graphically displayed on a monitor and a frame is recorded by means of a video controller. Multiple movement and hierarchy manipulations are then recorded as a sequence of frames which may be played back as an animation sequence on a video cassette recorder.
ERIC Educational Resources Information Center
Nee, John G.; Kare, Audhut P.
1987-01-01
Explores several concepts in computer assisted design/computer assisted manufacturing (CAD/CAM). Defines, evaluates, reviews and compares advanced computer-aided geometric modeling and analysis techniques. Presents the results of a survey to establish the capabilities of minicomputer based-systems with the CAD/CAM packages evaluated. (CW)
A distributed snow-evolution modeling system (SnowModel)
Glen E. Liston; Kelly Elder
2006-01-01
SnowModel is a spatially distributed snow-evolution modeling system designed for application in landscapes, climates, and conditions where snow occurs. It is an aggregation of four submodels: MicroMet defines meteorological forcing conditions, EnBal calculates surface energy exchanges, SnowPack simulates snow depth and water-equivalent evolution, and SnowTran-3D...
Regenerative fuel cell energy storage system for a low earth orbit space station
NASA Technical Reports Server (NTRS)
Martin, R. E.; Garow, J.; Michaels, K. B.
1988-01-01
A study was conducted to define characteristics of a Regenerative Fuel Cell System (RFCS) for low earth orbit Space Station missions. The RFCSs were defined and characterized based on both an alkaline electrolyte fuel cell integrated with an alkaline electrolyte water electrolyzer and an alkaline electrolyte fuel cell integrated with an acid solid polymer electrolyte (SPE) water electrolyzer. The study defined the operating characteristics of the systems, including system weight, volume, and efficiency. A maintenance philosophy was defined, and the implications of system reliability requirements and modularization were determined. Finally, an Engineering Model System (EMS) was defined, and a program to develop and demonstrate the EMS was identified, together with the pacing technology items that should be developed in parallel with the EMS. The specific weight of an optimized RFCS operating at 140 F was defined as a function of system efficiency for a range of module sizes. An EMS operating at a nominal temperature of 180 F and capable of delivering 10 kW at an overall efficiency of 55.4 percent is described. A program to develop the EMS is described, including a technology development effort for the pacing technology items.
From complexity to reality: providing useful frameworks for defining systems of care.
Levison-Johnson, Jody; Wenz-Gross, Melodie
2010-02-01
Because systems of care are not uniform across communities, there is a need to better document the process of system development, define the complexity, and describe the development of the structures, processes, and relationships within communities engaged in system transformation. By doing so, we begin to identify the necessary and sufficient components that, at minimum, move us from usual care within a naturally occurring system to a true system of care. Further, by documenting and measuring the degree to which key components are operating, we may be able to identify the most successful strategies in creating system reform. The theory of change and logic model offer a useful framework for communities to begin the adaptive work necessary to effect true transformation. Using the experience of two system of care communities, this new definition and the utility of a theory of change and logic model framework for defining local system transformation efforts will be discussed. Implications for the field, including the need to further examine the natural progression of systems change and to create quantifiable measures of transformation, will be raised as new challenges for the evolving system of care movement.
Transactions in domain-specific information systems
NASA Astrophysics Data System (ADS)
Zacek, Jaroslav
2017-07-01
A substantial number of current information system (IS) implementations are based on the transaction approach. In addition, most of the implementations are domain-specific (e.g. accounting IS, resource planning IS). Therefore, we have to have a generic transaction model to build and verify domain-specific IS. The paper proposes a new transaction model for domain-specific ontologies. This model is based on a value-oriented business process modelling technique. The transaction model is formalized by Petri Net theory. The first part of the paper presents common business processes and analyses related to business process modeling. The second part defines the transaction model delimited by the REA enterprise ontology paradigm and introduces the states of the generic transaction model. The generic model proposal is defined and visualized by a Petri Net modelling tool. The third part shows an application of the generic transaction model. The last part of the paper concludes the results and discusses the practical usability of the generic transaction model.
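As a rough illustration of the Petri-net formalization mentioned above, the sketch below encodes a generic transaction as a small place/transition net; the state names (requested, committed, fulfilled, settled) are hypothetical stand-ins and do not reproduce the paper's REA-based model.

```python
# Minimal place/transition net illustrating a generic transaction life cycle.
# The state names are hypothetical; they do not reproduce the paper's model.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

net = PetriNet({"requested": 1})
net.add_transition("commit",  ["requested"], ["committed"])
net.add_transition("fulfill", ["committed"], ["fulfilled"])
net.add_transition("settle",  ["fulfilled"], ["settled"])

for t in ("commit", "fulfill", "settle"):
    net.fire(t)
print(net.marking)   # one token ends up in 'settled'
```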
High-level PC-based laser system modeling
NASA Astrophysics Data System (ADS)
Taylor, Michael S.
1991-05-01
Since the inception of the Strategic Defense Initiative (SDI) there have been a multitude of comparison studies done in an attempt to evaluate the effectiveness and relative sizes of complementary, and sometimes competitive, laser weapon systems. It became more and more apparent that what the systems analyst needed was not only a fast but also a cost-effective way to perform high-level trade studies. In the present investigation, a general procedure is presented for the development of PC-based algorithmic systems models for laser systems. This procedure points out all of the major issues that should be addressed in the design and development of such a model. Issues addressed include defining the problem to be modeled, defining a strategy for development, and finally, effective use of the model once developed. Being a general procedure, it will allow a systems analyst to develop a model to meet specific needs. To illustrate this method of model development, a description of the Strategic Defense Simulation - Design To (SDS-DT) model developed and used by Science Applications International Corporation (SAIC) is presented. SDS-DT is a menu-driven, fast executing, PC-based program that can be used to either calculate performance, weight, volume, and cost values for a particular design or, alternatively, to run parametrics on particular system parameters to perhaps optimize a design.
The Value of SysML Modeling During System Operations: A Case Study
NASA Technical Reports Server (NTRS)
Dutenhoffer, Chelsea; Tirona, Joseph
2013-01-01
System models are often touted as engineering tools that promote better understanding of systems, but these models are typically created during system design. The Ground Data System (GDS) team for the Dawn spacecraft took on a case study to see if benefits could be achieved by starting a model of a system already in operations. This paper focuses on the four steps the team undertook in modeling the Dawn GDS: defining a model structure, populating model elements, verifying that the model represented reality, and using the model to answer system-level questions and simplify day-to-day tasks. Throughout this paper the team outlines our thought processes and the system insights the model provided.
The value of SysML modeling during system operations: A case study
NASA Astrophysics Data System (ADS)
Dutenhoffer, C.; Tirona, J.
System models are often touted as engineering tools that promote better understanding of systems, but these models are typically created during system design. The Ground Data System (GDS) team for the Dawn spacecraft took on a case study to see if benefits could be achieved by starting a model of a system already in operations. This paper focuses on the four steps the team undertook in modeling the Dawn GDS: defining a model structure, populating model elements, verifying that the model represented reality, and using the model to answer system-level questions and simplify day-to-day tasks. Throughout this paper the team outlines our thought processes and the system insights the model provided.
Active control of large space structures: An introduction and overview
NASA Technical Reports Server (NTRS)
Doane, G. B., III; Tollison, D. K.; Waites, H. B.
1985-01-01
An overview of the large space structure (LSS) control system design problem is presented. The LSS is defined as a class of system, and LSS modeling techniques are discussed. Model truncation, control system objectives, current control law design techniques, and particular problem areas are discussed.
NASA Technical Reports Server (NTRS)
Albus, James S.; Mccain, Harry G.; Lumia, Ronald
1989-01-01
The document describes the NASA Standard Reference Model (NASREM) architecture for the Space Station Telerobot Control System. It defines the functional requirements and high-level specifications of the control system for the NASA Space Station Flight Telerobot Servicer, serving as the document for the functional specification, and as a guideline for the development of the control system architecture, of the IOC Flight Telerobot Servicer. The NASREM telerobot control system architecture defines a set of standard modules and interfaces which facilitates software design, development, validation, and test, and makes possible the integration of telerobotics software from a wide variety of sources. Standard interfaces also provide the software hooks necessary to incrementally upgrade future Flight Telerobot Systems as new capabilities develop in computer science, robotics, and autonomous system control.
Evolution of Geometric Sensitivity Derivatives from Computer Aided Design Models
NASA Technical Reports Server (NTRS)
Jones, William T.; Lazzara, David; Haimes, Robert
2010-01-01
The generation of design parameter sensitivity derivatives is required for gradient-based optimization. Such sensitivity derivatives are elusive at best when working with geometry defined within the solid modeling context of Computer-Aided Design (CAD) systems. Solid modeling CAD systems are often proprietary and always complex, thereby necessitating ad hoc procedures to infer parameter sensitivity. A new perspective is presented that makes direct use of the hierarchical associativity of CAD features to trace their evolution and thereby track design parameter sensitivity. In contrast to ad hoc methods, this method provides a more concise procedure following the model design intent and determining the sensitivity of CAD geometry directly to its respective defining parameters.
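The idea of tracing sensitivities through the hierarchical associativity of CAD features can be pictured as chain-rule differentiation along the feature history. The toy sketch below does this for two invented features (a rectangular pad followed by a scale feature); the dimensions and feature names are purely illustrative and not part of the paper's method or any CAD system's API.

```python
# Illustrative chain-rule propagation of a design-parameter sensitivity through
# a tiny two-feature "tree": a rectangular pad followed by a scale feature.
def pad_volume(width, height, depth):
    return width * height * depth

def d_pad_volume_d_width(width, height, depth):
    return height * depth                      # analytic partial of the pad feature

def scale_volume(volume, scale):
    return volume * scale**3

def d_scale_volume_d_volume(scale):
    return scale**3                            # analytic partial of the scale feature

width, height, depth, scale = 2.0, 1.0, 0.5, 1.2
v_pad = pad_volume(width, height, depth)
v_final = scale_volume(v_pad, scale)

# Sensitivity of the final geometry to the defining parameter 'width', obtained
# by chaining the per-feature partials along the feature history.
dv_final_d_width = d_scale_volume_d_volume(scale) * d_pad_volume_d_width(width, height, depth)
print(v_final, dv_final_d_width)
```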
Current state of the mass storage system reference model
NASA Technical Reports Server (NTRS)
Coyne, Robert
1993-01-01
IEEE SSSWG was chartered in May 1990 to abstract the hardware and software components of existing and emerging storage systems and to define the software interfaces between these components. The immediate goal is the decomposition of a storage system into interoperable functional modules which vendors can offer as separate commercial products. The ultimate goal is to develop interoperable standards which define the software interfaces, and in the distributed case, the associated protocols to each of the architectural modules in the model. The topics are presented in viewgraph form and include the following: IEEE SSSWG organization; IEEE SSSWG subcommittees & chairs; IEEE standards activity board; layered view of the reference model; layered access to storage services; IEEE SSSWG emphasis; and features for MSSRM version 5.
Defining, Assessing, and Promoting E-Learning Success: An Information Systems Perspective
ERIC Educational Resources Information Center
Holsapple, Clyde W.; Lee-Post, Anita
2006-01-01
This research advances the understanding of how to define, evaluate, and promote e-learning success from an information systems perspective. It introduces the E-Learning Success Model, which posits that the overall success of an e-learning initiative depends on the attainment of success at each of the three stages of e-learning systems…
NASA Astrophysics Data System (ADS)
Pasqualini, D.; Witkowski, M.
2005-12-01
The Critical Infrastructure Protection / Decision Support System (CIP/DSS) project, supported by the Science and Technology Office, has been developing a risk-informed Decision Support System that provides insights for making critical infrastructure protection decisions. The system considers seventeen different Department of Homeland Security defined Critical Infrastructures (potable water system, telecommunications, public health, economics, etc.) and their primary interdependencies. These infrastructures have been modeled in a single model called the CIP/DSS Metropolitan Model. The modeling approach used is system dynamics modeling, which combines control theory and nonlinear dynamics theory, is defined by a set of coupled differential equations, and seeks to explain how the structure of a given system determines its behavior. In this poster we present a system dynamics model for one of the seventeen critical infrastructures, a generic metropolitan potable water system (MPWS). The goals are threefold: 1) to gain a better understanding of the MPWS infrastructure; 2) to identify improvements that would help protect the MPWS; and 3) to understand the consequences, interdependencies, and impacts when perturbations occur to the system. The model represents raw water sources, the metropolitan water treatment process, storage of treated water, damage and repair to the MPWS, distribution of water, and end-user demand, but does not explicitly represent the detailed network topology of an actual MPWS. The MPWS model depends upon inputs from the metropolitan population, energy, telecommunication, public health, and transportation models, as well as the national water and transportation models. We present modeling results and a sensitivity analysis indicating critical choke points and negative and positive feedback loops in the system. A general scenario is also analyzed in which the potable water system responds to a generic disruption.
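System dynamics models of this kind reduce to coupled rate equations over stocks and flows. A minimal sketch with made-up capacities, rates, and a single treatment-plant outage is shown below; it does not reproduce the CIP/DSS Metropolitan Model.

```python
# Minimal stock-and-flow sketch of a metropolitan potable water system.
# Capacities, rates, and the disruption window are invented for illustration.
dt, horizon = 1.0, 72            # hours
storage = 80.0                   # treated-water storage stock (million gallons)
storage_cap = 120.0
treatment_rate = 10.0            # treatment plant output flow (MG/hour)
demand_rate = 9.0                # end-user demand flow (MG/hour)

history = []
for hour in range(int(horizon / dt)):
    plant_output = 0.0 if 24 <= hour < 36 else treatment_rate   # generic plant outage
    delivered = min(demand_rate, storage / dt + plant_output)   # limited by available water
    storage = min(storage_cap, storage + (plant_output - delivered) * dt)
    history.append((hour, round(storage, 1), round(delivered, 1)))

shortfall_hours = sum(1 for _, _, d in history if d < demand_rate)
unmet = sum(demand_rate - d for _, _, d in history if d < demand_rate)
print("hours with delivery shortfall:", shortfall_hours)
print("total unmet demand (MG):", round(unmet, 1))
```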
Control and modeling of a CELSS (Controlled Ecological Life Support System)
NASA Technical Reports Server (NTRS)
Auslander, D. M.; Spear, R. C.; Babcock, P. S.; Nadel, M.
1983-01-01
Research topics that arise from the conceptualization of control for closed life support systems (life support systems in which all or most of the mass is recycled) are discussed. Modeling and control of uncertain and poorly defined systems, resource allocation in closed life support systems, and control structures for systems with delay and closure are emphasized.
Business model design for a wearable biofeedback system.
Hidefjäll, Patrik; Titkova, Dina
2015-01-01
Wearable sensor technologies used to track daily activities have become successful in the consumer market. In order for wearable sensor technology to offer added value in the more challenging areas of stress-rehab care and occupational health, stress-related biofeedback parameters need to be monitored and more elaborate business models are needed. To identify probable success factors for a wearable biofeedback system (Affective Health) in the two mentioned market segments in a Swedish setting, we conducted literature studies and interviews with relevant representatives. Data were collected and used first to describe the two market segments and then to define likely feasible business model designs, according to the Business Model Canvas framework. Needs of stakeholders were identified as inputs to business model design. Value propositions, a key building block of a business model, were defined for each segment. The value proposition for occupational health was defined as "A tool that can both identify employees at risk of stress-related disorders and reinforce healthy sustainable behavior" and for healthcare as: "Providing therapists with objective data about the patient's emotional state and motivating patients to better engage in the treatment process".
Modeling a Longitudinal Relational Research Data Systems
ERIC Educational Resources Information Center
Olsen, Michelle D. Hunt
2010-01-01
A study was conducted to propose a research-based model for a longitudinal data research system that addressed recommendations from a synthesis of literature related to: (1) needs reported by the U.S. Department of Education, (2) the twelve mandatory elements that define federally approved state longitudinal data systems (SLDS), (3) the…
Urine sampling and collection system
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Mangialardi, J. K.; Reinhardt, C. G.
1971-01-01
This specification defines the performance and design requirements for the urine sampling and collection system engineering model and establishes requirements for its design, development, and test. The model shall provide conceptual verification of a system applicable to manned space flight which will automatically provide for collection, volume sensing, and sampling of urine.
A proposed application programming interface for a physical volume repository
NASA Technical Reports Server (NTRS)
Jones, Merritt; Williams, Joel; Wrenn, Richard
1996-01-01
The IEEE Storage System Standards Working Group (SSSWG) has developed the Reference Model for Open Storage Systems Interconnection, Mass Storage System Reference Model Version 5. This document provides the framework for a series of standards for application and user interfaces to open storage systems. More recently, the SSSWG has been developing Application Programming Interfaces (APIs) for the individual components defined by the model. The API for the Physical Volume Repository is the most fully developed, but work is being done on APIs for the Physical Volume Library and for the Mover also. The SSSWG meets every other month, and meetings are open to all interested parties. The Physical Volume Repository (PVR) is responsible for managing the storage of removable media cartridges and for mounting and dismounting these cartridges onto drives. This document describes a model which defines a Physical Volume Repository, and gives a brief summary of the Application Programming Interface (API) which the IEEE Storage Systems Standards Working Group (SSSWG) is proposing as the standard interface for the PVR.
Models and techniques for evaluating the effectiveness of aircraft computing systems
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1977-01-01
Models, measures and techniques were developed for evaluating the effectiveness of aircraft computing systems. The concept of effectiveness involves aspects of system performance, reliability and worth. Specifically, a detailed development of a model hierarchy at the mission, functional task, and computational task levels was carried out. An appropriate class of stochastic models was investigated which served as bottom-level models in the hierarchical scheme. A unified measure of effectiveness called 'performability' was defined and formulated.
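Performability is usually formalized as an accomplishment or reward measure accumulated over a mission and evaluated against a stochastic model of the computing system. The sketch below computes an expected-reward flavour of such a measure for an invented three-state Markov model; the states, transition probabilities, and reward levels are illustrative only and are not Meyer's original formulation.

```python
# Illustrative expected-reward computation over a small Markov model of an
# aircraft computing system (states, probabilities, and rewards are invented).
states = ["full", "degraded", "failed"]
# One-step transition probabilities per mission phase (row state -> next state).
P = {
    "full":     {"full": 0.995, "degraded": 0.004, "failed": 0.001},
    "degraded": {"full": 0.0,   "degraded": 0.990, "failed": 0.010},
    "failed":   {"full": 0.0,   "degraded": 0.0,   "failed": 1.0},
}
reward = {"full": 1.0, "degraded": 0.6, "failed": 0.0}   # performance level per phase

dist = {"full": 1.0, "degraded": 0.0, "failed": 0.0}     # initial state distribution
phases = 100
expected_performability = 0.0
for _ in range(phases):
    expected_performability += sum(dist[s] * reward[s] for s in states)
    dist = {t: sum(dist[s] * P[s][t] for s in states) for t in states}

print("expected accumulated performance over the mission:",
      round(expected_performability, 2), "out of", phases)
```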
OWL references in ORM conceptual modelling
NASA Astrophysics Data System (ADS)
Matula, Jiri; Belunek, Roman; Hunka, Frantisek
2017-07-01
Object Role Modelling (ORM) is a fact-based methodology for conceptual modelling. The aim of the paper is to emphasize its close connection to OWL documents and their possible mutual cooperation. The definition of entities or domain values is an indispensable part of the conceptual schema design procedure defined by the ORM methodology. Many of these entities are already defined in OWL documents. Therefore, it is not necessary to declare entities again; instead, it is possible to utilize references from OWL documents during the modelling of information systems.
Hospital information system: reusability, designing, modelling, recommendations for implementing.
Huet, B
1998-01-01
The aims of this paper are to specify some essential conditions for building reuse models for hospital information systems (HIS) and to present an application for hospital clinical laboratories. Reusability is a general trend in software; however, reuse can involve a greater or lesser part of the design, classes, and programs, so a project involving reusability must be precisely defined. The introduction reviews trends in software, the stakes of reuse models for HIS, and the special use case constituted by a HIS. The main three parts of this paper are: 1) Designing a reuse model (which objects are common to several information systems?); 2) A reuse model for hospital clinical laboratories (a genspec object model is presented for all laboratories: biochemistry, bacteriology, parasitology, pharmacology, ...); 3) Recommendations for generating plug-compatible software components (a reuse model can be implemented as a framework; concrete factors that increase reusability are presented). In conclusion, reusability is a subtle exercise for which the project must be carefully defined beforehand.
NASA Technical Reports Server (NTRS)
Cohen, Gerald C. (Inventor); McMann, Catherine M. (Inventor)
1991-01-01
An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
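A sketch of the aggregation idea: low-level reliability models are stored once and combined according to an architecture description. Here the stored models are simple constant-failure-rate reliabilities combined in series or parallel; the component names, failure rates, and architecture are invented and this is not the patented generator itself.

```python
# Illustrative aggregation of stored low-level reliability models according to
# a system architecture description (names and failure rates are invented).
import math

# Library of low-level models: constant failure rates (per hour).
component_models = {"cpu": 1e-5, "bus": 5e-6, "sensor": 2e-5}

def component_reliability(name, t_hours):
    return math.exp(-component_models[name] * t_hours)

def aggregate(arch, t_hours):
    """arch is a nested description: ('series' | 'parallel', [items]),
    where an item is either a component name or another nested description."""
    kind, items = arch
    rs = [component_reliability(i, t_hours) if isinstance(i, str) else aggregate(i, t_hours)
          for i in items]
    if kind == "series":
        out = 1.0
        for r in rs:
            out *= r            # all items must survive
        return out
    out = 1.0
    for r in rs:
        out *= (1.0 - r)        # parallel: at least one redundant branch survives
    return 1.0 - out

# Architecture description: two redundant CPUs, in series with a bus and a sensor.
architecture = ("series", [("parallel", ["cpu", "cpu"]), "bus", "sensor"])
print(round(aggregate(architecture, t_hours=1000.0), 6))
```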
Systems and context modeling approach to requirements analysis
NASA Astrophysics Data System (ADS)
Ahuja, Amrit; Muralikrishna, G.; Patwari, Puneet; Subhrojyoti, C.; Swaminathan, N.; Vin, Harrick
2014-08-01
Ensuring completeness and correctness of the requirements for a complex system such as the SKA is challenging. Current system engineering practice includes developing a stakeholder needs definition, a concept of operations, and defining system requirements in terms of use cases and requirements statements. We present a method that enhances this current practice into a collection of system models with mutual consistency relationships. These include stakeholder goals, needs definition and system-of-interest models, together with a context model that participates in the consistency relationships among these models. We illustrate this approach by using it to analyze the SKA system requirements.
Flight Dynamic Model Exchange using XML
NASA Technical Reports Server (NTRS)
Jackson, E. Bruce; Hildreth, Bruce L.
2002-01-01
The AIAA Modeling and Simulation Technical Committee has worked for several years to develop a standard by which the information needed to develop physics-based models of aircraft can be specified. The purpose of this standard is to provide a well-defined set of information, definitions, data tables and axis systems so that cooperating organizations can transfer a model from one simulation facility to another with maximum efficiency. This paper proposes using an application of the eXtensible Markup Language (XML) to implement the AIAA simulation standard. The motivation and justification for using a standard such as XML is discussed. Necessary data elements to be supported are outlined. An example of an aerodynamic model as an XML file is given. This example includes definition of independent and dependent variables for function tables, definition of key variables used to define the model, and axis systems used. The final steps necessary for implementation of the standard are presented. Software to take an XML-defined model and import/export it to/from a given simulation facility is discussed, but not demonstrated. That would be the next step in final implementation of standards for physics-based aircraft dynamic models.
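To make the exchange format tangible, the sketch below writes and reads back a tiny aerodynamic function table with Python's standard ElementTree. The element and attribute names are invented for illustration and are not the actual tags of the AIAA standard.

```python
# Illustrative only: a tiny aerodynamic model serialized as XML and read back.
# Element and attribute names are invented, not the AIAA standard's actual tags.
import xml.etree.ElementTree as ET

model = ET.Element("flightDynamicModel", name="exampleAero", axisSystem="body")
ET.SubElement(model, "variable", name="alpha_deg", role="independent")
ET.SubElement(model, "variable", name="CL", role="dependent")

table = ET.SubElement(model, "functionTable", dependent="CL", independent="alpha_deg")
for alpha, cl in [(0.0, 0.2), (4.0, 0.55), (8.0, 0.9)]:
    ET.SubElement(table, "point", alpha_deg=str(alpha), CL=str(cl))

xml_text = ET.tostring(model, encoding="unicode")   # what one facility would export

# A receiving simulation facility would parse the same file back into tables.
parsed = ET.fromstring(xml_text)
points = [(float(p.get("alpha_deg")), float(p.get("CL")))
          for p in parsed.find("functionTable")]
print(points)
```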
Short-term integrated forecasting system : 1993 model documentation report
DOT National Transportation Integrated Search
1993-12-01
The purpose of this report is to define the Short-Term Integrated Forecasting System (STIFS) and describe its basic properties. The Energy Information Administration (EIA) of the U.S. Energy Department (DOE) developed the STIFS model to generate shor...
Survivability Modeling & Simulation(Aircraft Survivability, Fall 2009)
2009-01-01
Projects.” The Human Effectiveness Directorate is responsible for providing injury assessments for most modern Air Force ejection systems, for... developing ejection test mannequins, and for continuing to define human injury limits and criteria. The directorate maintains a man-rated horizontal... Using numerous models and testing, the directorate can define ejection/impact injury criteria for aircraft equipment to prevent personnel injuries
1988 Revisions to the 1978 National Fire-Danger Rating System
Robert E. Burgan
1988-01-01
The 1978 National Fire-Danger Rating System does not work well in the humid environment of the Eastern United States. System modifications to correct problems and their operational impact on System users are described. A new set of 20 fuel models is defined and compared graphically with the 1978 fuel models. Technical documentation of System changes is provided.
Making the Invisible Visible: A Model for Delivery Systems in Adult Education
ERIC Educational Resources Information Center
Alex, Jennifer L.; Miller, Elizabeth A.; Platt, R. Eric; Rachal, John R.; Gammill, Deidra M.
2007-01-01
Delivery systems are not well defined in adult education. Therefore, this article reviews the multiple components that overlap to affect the adult learner and uses them to create a model for a comprehensive delivery system in adult education with these individual components as sub-systems that are interrelated and inter-locked. These components…
NASA Astrophysics Data System (ADS)
Leakeas, Charles L.; Capehart, Shay R.; Bartell, Richard J.; Cusumano, Salvatore J.; Whiteley, Matthew R.
2011-06-01
Laser weapon systems comprised of tiled subapertures are rapidly emerging in importance in the directed energy community. Performance models of these laser weapon systems have been developed from numerical simulations of a high fidelity wave-optics code called WaveTrain which is developed by MZA Associates. System characteristics such as mutual coherence, differential jitter, and beam quality rms wavefront error are defined for a focused beam on the target. Engagement scenarios are defined for various platform and target altitudes, speeds, headings, and slant ranges along with the natural wind speed and heading. Inputs to the performance model include platform and target height and velocities, Fried coherence length, Rytov number, isoplanatic angle, thermal blooming distortion number, Greenwood and Tyler frequencies, and atmospheric transmission. The performance model fit is based on power-in-the-bucket (PIB) values against the PIB from the simulation results for the vacuum diffraction-limited spot size as the bucket. The goal is to develop robust performance models for aperture phase error, turbulence, and thermal blooming effects in tiled subaperture systems.
Security Engineering FY17 Systems Aware Cybersecurity
2017-12-07
Figure 4: A hierarchical controls model that defines the expected service of a UAV. Each level is... defined by a generic control structure. Inadequate control at each level can cause an adversarial action to degrade the expected service and produce a... and can completely violate the system's expected service by escalating their privileges, either by using the attack vectors presented individually or
Modelling and analysis of workflow for lean supply chains
NASA Astrophysics Data System (ADS)
Ma, Jinping; Wang, Kanliang; Xu, Lida
2011-11-01
Cross-organisational workflow systems are a component of enterprise information systems which support collaborative business process among organisations in supply chain. Currently, the majority of workflow systems is developed in perspectives of information modelling without considering actual requirements of supply chain management. In this article, we focus on the modelling and analysis of the cross-organisational workflow systems in the context of lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of cross-organisation workflow net according to the idea of LSC and then discusses the standardisation of collaborating business process between organisations in the context of LSC. Second, the concept of labelled time Petri nets (LTPNs) is defined through combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is also defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) is then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNS and a verifying approach to the soundness of LTWNs and CLTWNs. Finally, this article illustrates how to use the proposed method by a simple example. The purpose of this research is to establish a formal method of modelling and analysis of workflow systems for LSC. This study initiates a new perspective of research on cross-organisational workflow management and promotes operation management of LSC in real world settings.
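As a loose illustration of the timed Petri-net machinery the article builds on, the sketch below fires transitions of a tiny workflow-net fragment only within their allowed time windows. The activity names, organisation labels, and time bounds are invented, and the sketch omits the OR-silent construction and the soundness analysis.

```python
# Illustrative timed workflow-net fragment: a transition may fire only within
# its [earliest, latest] delay after becoming enabled. Names and times invented.
class TimedWorkflowNet:
    def __init__(self, marking):
        self.marking = dict(marking)
        self.transitions = {}   # name -> (inputs, outputs, (t_min, t_max), label)

    def add(self, name, inputs, outputs, window, label):
        self.transitions[name] = (inputs, outputs, window, label)

    def fire(self, name, delay):
        inputs, outputs, (t_min, t_max), label = self.transitions[name]
        if not all(self.marking.get(p, 0) > 0 for p in inputs):
            raise RuntimeError(f"{name}: not enabled")
        if not (t_min <= delay <= t_max):
            raise RuntimeError(f"{name}: delay {delay} outside [{t_min}, {t_max}]")
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1
        return label

net = TimedWorkflowNet({"order_received": 1})
net.add("check_stock", ["order_received"], ["stock_checked"], (0, 4), "supplier")
net.add("ship",        ["stock_checked"],  ["order_shipped"], (1, 24), "supplier")

print(net.fire("check_stock", delay=2))   # fires within its window
print(net.fire("ship", delay=8))
print(net.marking)
```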
1981-03-01
...identifiability is imposed; and the system designer now has a tool to evaluate how well the model describes the system. The algorithm is verified by checking its... In analyzing a system, the design engineer uses a mathematical model. The model, by its very definition, represents the system. It... number of G (see Eq. (23)) can give the designer a good indication of just how well the model defined by Eqs. (1) through (3) describes the system
Free-form geometric modeling by integrating parametric and implicit PDEs.
Du, Haixia; Qin, Hong
2007-01-01
Parametric PDE techniques, which use partial differential equations (PDEs) defined over a 2D or 3D parametric domain to model graphical objects and processes, can unify geometric attributes and functional constraints of the models. PDEs can also model implicit shapes defined by level sets of scalar intensity fields. In this paper, we present an approach that integrates parametric and implicit trivariate PDEs to define geometric solid models containing both geometric information and intensity distribution subject to flexible boundary conditions. The integrated formulation of second-order or fourth-order elliptic PDEs permits designers to manipulate PDE objects of complex geometry and/or arbitrary topology through direct sculpting and free-form modeling. We developed a PDE-based geometric modeling system for shape design and manipulation of PDE objects. The integration of implicit PDEs with parametric geometry offers more general and arbitrary shape blending and free-form modeling for objects with intensity attributes than pure geometric models.
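The simplest instance of the parametric-PDE idea is a height or intensity field obtained by solving an elliptic PDE over a parametric domain subject to designer-chosen boundary conditions. The sketch below relaxes Laplace's equation on a small grid; it is a toy second-order case, not the fourth-order trivariate system described in the paper.

```python
# Toy elliptic-PDE "shape": solve Laplace's equation u_xx + u_yy = 0 on a grid
# by Jacobi relaxation, with designer-chosen boundary values as the constraint.
n = 21
u = [[0.0] * n for _ in range(n)]

# Boundary conditions play the role of the designer's boundary curves.
for i in range(n):
    u[0][i] = 1.0              # one edge raised to height 1
    u[-1][i] = 0.0
    u[i][0] = u[i][-1] = 0.5

for _ in range(500):           # Jacobi iterations over interior points only
    new = [row[:] for row in u]
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            new[i][j] = 0.25 * (u[i - 1][j] + u[i + 1][j] + u[i][j - 1] + u[i][j + 1])
    u = new

print("height at the centre of the patch:", round(u[n // 2][n // 2], 3))
```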
Functional structure and dynamics of the human nervous system
NASA Technical Reports Server (NTRS)
Lawrence, J. A.
1981-01-01
The status of an effort to define the directions needed to take in extending pilot models is reported. These models are needed to perform closed-loop (man-in-the-loop) feedback flight control system designs and to develop cockpit display requirements. The approach taken is to develop a hypothetical working model of the human nervous system by reviewing the current literature in neurology and psychology and to develop a computer model of this hypothetical working model.
A Comprehensive Planning Model and Delivery System for Leadership Training Programs.
ERIC Educational Resources Information Center
Janosik, Steven M.; Sina, Julie A.
1988-01-01
Presents an eight-step planning model that operationally defines a comprehensive delivery systems approach to campuswide leadership training. Lists four goals of the model: to increase efficiency of leadership training through shared resources, to decrease costs, to provide quality control, and to increase impact of programming effort by creating…
Managing Analysis Models in the Design Process
NASA Technical Reports Server (NTRS)
Briggs, Clark
2006-01-01
Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.
Tello-Leal, Edgar; Chiotti, Omar; Villarreal, Pablo David
2012-12-01
The paper presents a methodology that follows a top-down approach based on a Model-Driven Architecture for integrating and coordinating healthcare services through cross-organizational processes, enabling organizations to provide high-quality healthcare services and continuous process improvements. The methodology provides a modeling language that enables organizations to conceptualize an integration agreement and to identify and design cross-organizational process models. These models are used for the automatic generation of: the private view of the processes each organization should perform to fulfill its role in cross-organizational processes, and Colored Petri Net specifications to implement these processes. A multi-agent system platform provides agents able to interpret Colored Petri Nets to enable communication between the Healthcare Information Systems for executing the cross-organizational processes. Clinical documents are defined using the HL7 Clinical Document Architecture. This methodology guarantees that important requirements for healthcare service integration and coordination are fulfilled: interoperability between heterogeneous Healthcare Information Systems; the ability to cope with changes in cross-organizational processes; alignment between the integrated healthcare service solution defined at the organizational level and the solution defined at the technological level; and the distributed execution of cross-organizational processes while keeping the organizations' autonomy.
Identification of propulsion systems
NASA Technical Reports Server (NTRS)
Merrill, Walter; Guo, Ten-Huei; Duyar, Ahmet
1991-01-01
This paper presents a tutorial on the use of model identification techniques for the identification of propulsion system models. These models are important for control design, simulation, parameter estimation, and fault detection. Propulsion system identification is defined in the context of the classical description of identification as a four step process that is unique because of special considerations of data and error sources. Propulsion system models are described along with the dependence of system operation on the environment. Propulsion system simulation approaches are discussed as well as approaches to propulsion system identification with examples for both air breathing and rocket systems.
Reference coordinate systems: An update. Supplement 11
NASA Technical Reports Server (NTRS)
Mueller, Ivan I.
1988-01-01
A common requirement for all geodetic investigations is a well-defined coordinate system attached to the earth in some prescribed way, as well as a well-defined inertial coordinate system in which the motions of the terrestrial frame can be monitored. The paper deals with the problems encountered when establishing such coordinate systems and the transformations between them. In addition, problems related to the modeling of the deformable earth are discussed. This paper is an updated version of the earlier work, Reference Coordinate Systems for Earth Dynamics: A Preview, by the author.
Remaining lifetime modeling using State-of-Health estimation
NASA Astrophysics Data System (ADS)
Beganovic, Nejra; Söffker, Dirk
2017-08-01
Technical systems and system components undergo gradual degradation over time. Continuous degradation occurring in a system is reflected in decreased system reliability and unavoidably leads to system failure. Therefore, continuous evaluation of State-of-Health (SoH) is indispensable to provide at least the predefined lifetime of the system specified by the manufacturer or, even better, to extend that lifetime. However, a precondition for lifetime extension is accurate estimation of SoH as well as the estimation and prediction of Remaining Useful Lifetime (RUL). For this purpose, lifetime models describing the relation between system/component degradation and consumed lifetime have to be established. In this contribution, the modeling and the selection of suitable lifetime models from a database based on current SoH conditions are discussed. The main contribution of this paper is the development of new modeling strategies capable of describing complex relations between measurable system variables, related system degradation, and RUL. Two approaches, with their accompanying advantages and disadvantages, are introduced and compared. Both approaches are capable of modeling stochastic aging processes of a system by simultaneous adaptation of RUL models to the current SoH. The first approach requires a priori knowledge about aging processes in the system and accurate estimation of SoH. The estimation of SoH here is conditioned by tracking the actual damage accumulated in the system, so that particular model parameters are defined according to a priori known assumptions about the system's aging. Prediction accuracy in this case is highly dependent on accurate estimation of SoH but involves a high number of degrees of freedom. The second approach does not require a priori knowledge about the system's aging, as particular model parameters are defined by a multi-objective optimization procedure. The prediction accuracy of this model does not depend strongly on the estimated SoH, and the model has fewer degrees of freedom. Both approaches rely on previously developed lifetime models, each of them corresponding to a predefined SoH. In the first approach, model selection is aided by a state-machine-based algorithm; in the second approach, model selection is conditioned by tracking the exceedance of predefined thresholds. The approach is applied to data generated from tribological systems. By calculating the Root Squared Error (RSE), Mean Squared Error (MSE), and Absolute Error (ABE), the accuracy of the proposed models/approaches is discussed along with the related advantages and disadvantages. Verification of the approach is done using cross-fold validation, exchanging training and test data. It can be stated that the newly introduced approach based on data-driven parametric models can be easily established, providing detailed information about remaining useful/consumed lifetime, valid for systems with constant load but stochastically occurring damage.
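A much-reduced sketch of the second, threshold-driven approach: a library of pre-fitted RUL models, each valid for a band of a measurable degradation feature, and a selector that switches models when the feature crosses a threshold. The feature values, thresholds, and model coefficients below are invented.

```python
# Illustrative threshold-based selection among pre-fitted lifetime models.
# Feature values, thresholds, and model coefficients are invented examples.
lifetime_models = [
    # (upper threshold of the degradation feature, linear RUL model a - b * feature)
    (0.3, lambda f: 1000.0 - 800.0 * f),    # "healthy" model
    (0.7, lambda f: 900.0 - 1000.0 * f),    # "worn" model
    (1.0, lambda f: 600.0 - 600.0 * f),     # "near end of life" model
]

def predict_rul(feature):
    # Select the model whose SoH band contains the current feature value.
    for threshold, model in lifetime_models:
        if feature <= threshold:
            return max(model(feature), 0.0)
    return 0.0

for f in (0.1, 0.5, 0.9):
    print(f"degradation feature {f:.1f} -> predicted RUL {predict_rul(f):.0f} h")
```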
McDonald, Richard; Nelson, Jonathan; Kinzel, Paul; Conaway, Jeffrey S.
2006-01-01
The Multi-Dimensional Surface-Water Modeling System (MD_SWMS) is a Graphical User Interface for surface-water flow and sediment-transport models. The capabilities of MD_SWMS for developing models include: importing raw topography and other ancillary data; building the numerical grid and defining initial and boundary conditions; running simulations; visualizing results; and comparing results with measured data.
Automated Gun Laying System for Self-Propelled Artillery Weapons.
1980-05-30
model designed specifically to the requirements of a test bed system. The system configuration and characteristics were specified through a series of... proposed by the contractor was further defined, utilizing the M109 component information provided by the COTR. Math models were developed to predict system... data. The model used for the TB-I program did not have the capability for a remote reset function; hence it was necessary to instruct the crew (loader
User Modeling in Adaptive Hypermedia Educational Systems
ERIC Educational Resources Information Center
Martins, Antonio Constantino; Faria, Luiz; Vaz de Carvalho, Carlos; Carrapatoso, Eurico
2008-01-01
This document is a survey of the research area of User Modeling (UM) for the specific field of Adaptive Learning. The aims of this document are: To define what a User Model is; To present existing and well-known User Models; To analyze the existing standards related to UM; To compare existing systems. In the scientific area of User Modeling…
ERIC Educational Resources Information Center
Spaulding, Trent Joseph
2011-01-01
The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…
Nanoscale Transport Optimization
2008-12-04
could be argued that the advantage of using ABAQUS for this modeling construct has more to do with its ability to impose a user-defined subroutine that... finite element analysis. This is accomplished by employing a user-defined subroutine for fluid properties at the interface within the finite element... package ABAQUS. Model Components: As noted above, the governing equation for the material system is given as...
Three-dimensional magnetic induction model of an octagonal edge-defined film-fed growth system
NASA Astrophysics Data System (ADS)
Rajendran, S.; Holmes, K.; Menna, A.
1994-03-01
Silicon wafers for the photovoltaic industry are produced by growing thin octagonal tubes by the edge-defined film-fed growth (EFG) process. The thermal origin of the wafer thickness variations was studied with a three-dimensional (3D) magnetic induction model. The implementation of the computer code and the significance of the computed results for improving the thickness uniformity are discussed.
NASA Astrophysics Data System (ADS)
Wray, Richard B.
1991-12-01
A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships are defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for the dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.
NASA Technical Reports Server (NTRS)
Wray, Richard B.
1991-01-01
A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships are defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for the dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.
NASA Astrophysics Data System (ADS)
Arabi, Ehsan; Gruenwald, Benjamin C.; Yucelen, Tansel; Nguyen, Nhan T.
2018-05-01
Research in adaptive control algorithms for safety-critical applications is primarily motivated by the fact that these algorithms have the capability to suppress the effects of adverse conditions resulting from exogenous disturbances, imperfect dynamical system modelling, degraded modes of operation, and changes in system dynamics. Although government and industry agree on the potential of these algorithms in providing safety and reducing vehicle development costs, a major issue is the inability to achieve a-priori, user-defined performance guarantees with adaptive control algorithms. In this paper, a new model reference adaptive control architecture for uncertain dynamical systems is presented to address disturbance rejection and uncertainty suppression. The proposed framework is predicated on a set-theoretic adaptive controller construction using generalised restricted potential functions. The key feature of this framework allows the system error between the state of an uncertain dynamical system and the state of a reference model, which captures a desired closed-loop system performance, to remain below an a-priori, user-defined worst-case performance bound, and hence, it has the capability to enforce strict performance guarantees. Examples are provided to demonstrate the efficacy of the proposed set-theoretic model reference adaptive control architecture.
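The set-theoretic architecture itself is not reproduced here; the sketch below is a generic first-order model reference adaptive control loop (standard Lyapunov-based adaptation, with invented gains and plant parameters) in which a user-defined worst-case error bound is only monitored rather than enforced, purely to make the notions of reference model, system error, and performance bound concrete:

```python
# Generic MRAC sketch for a scalar uncertain plant (illustrative only; the
# paper's set-theoretic architecture uses generalised restricted potential
# functions to *enforce* the bound, whereas here eps is merely monitored).
import numpy as np

a_true, b = 1.0, 1.0          # unknown plant: x' = a_true*x + b*u (sign of b known)
a_m, b_m = 2.0, 2.0           # reference model: xm' = -a_m*xm + b_m*r
gamma, eps = 10.0, 0.2        # adaptation gain, user-defined error bound
dt, T = 1e-3, 10.0

x = xm = kx = kr = 0.0
worst_error = 0.0
for k in range(int(T / dt)):
    r = 1.0 if (k * dt) % 4 < 2 else -1.0        # square-wave reference command
    e = x - xm                                   # system error
    u = kx * x + kr * r                          # adaptive control law
    kx += dt * (-gamma * e * x)                  # Lyapunov-based adaptation
    kr += dt * (-gamma * e * r)                  # (sign(b) = +1 assumed known)
    x  += dt * (a_true * x + b * u)              # Euler integration of plant
    xm += dt * (-a_m * xm + b_m * r)             # and of reference model
    worst_error = max(worst_error, abs(e))

print(f"worst-case |e| = {worst_error:.3f} (user-defined bound eps = {eps})")
```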
Embedded CLIPS for SDI BM/C3 simulation and analysis
NASA Technical Reports Server (NTRS)
Gossage, Brett; Nanney, Van
1990-01-01
Nichols Research Corporation is developing the BM/C3 Requirements Analysis Tool (BRAT) for the U.S. Army Strategic Defense Command. BRAT uses embedded CLIPS/Ada to model the decision making processes used by the human commander of a defense system. Embedding CLIPS/Ada in BRAT allows the user to explore the role of the human in Command and Control (C2) and the use of expert systems for automated C2. BRAT models assert facts about the current state of the system, the simulated scenario, and threat information into CLIPS/Ada. A user-defined rule set describes the decision criteria for the commander. We have extended CLIPS/Ada with user-defined functions that allow the firing of a rule to invoke a system action such as weapons release or a change in strategy. The use of embedded CLIPS/Ada will provide a powerful modeling tool for our customer at minimal cost.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Godwin, Aaron
The scope will be limited to analyzing the effect of the EFC within the system and how one improperly installed coupling affects the rest of the HPFL system. The discussion will include normal operations, impaired flow, and service interruptions. Normal operations are defined as two-way flow to buildings. Impaired operations are defined as a building that only has one-way flow being provided to the building. Service interruptions will be when a building does not have water available to it. The project will look at the following aspects of the reliability of the HPFL system: mean time to failure (MTTF) of EFCs, mean time between failures (MTBF), series system models, and parallel system models. These calculations will then be used to discuss the reliability of the system when one of the couplings fails and to compare the reliability of two-way feeds versus one-way feeds.
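The series and parallel system relations mentioned above are standard textbook formulas; the sketch below applies them with an invented EFC failure rate (not a value from the report):

```python
# Standard series/parallel reliability formulas for independent components
# (textbook relations; the failure rate used here is hypothetical).
import numpy as np

def reliability(failure_rate, t):
    """Exponential-lifetime component: R(t) = exp(-lambda*t), MTTF = 1/lambda."""
    return np.exp(-failure_rate * t)

def series(reliabilities):
    """A series system works only if every element works."""
    return np.prod(reliabilities)

def parallel(reliabilities):
    """A parallel (redundant) system fails only if every element fails."""
    return 1.0 - np.prod([1.0 - r for r in reliabilities])

lam_efc = 1.0e-5            # assumed EFC failure rate per hour (illustrative)
t = 8760.0                  # one year of operation, in hours
r_efc = reliability(lam_efc, t)
print("two-way feed (parallel paths):", parallel([r_efc, r_efc]))
print("one-way feed (single path)   :", series([r_efc]))
```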
Enhancements to an Agriculture-land Modeling System - FEST-C and Its Applications
The Fertilizer Emission Scenario Tool for CMAQ (FEST-C) system was originally developed to simulate daily fertilizer application information using the Environmental Policy Integrated Climate (EPIC) model across any defined conterminous United States (U.S.) CMAQ domain and gr...
Safety Analysis of FMS/CTAS Interactions During Aircraft Arrivals
NASA Technical Reports Server (NTRS)
Leveson, Nancy G.
1998-01-01
This grant funded research on human-computer interaction design and analysis techniques, using future ATC environments as a testbed. The basic approach was to model the nominal behavior of both the automated and human procedures and then to apply safety analysis techniques to these models. Our previous modeling language, RSML, had been used to specify the system requirements for TCAS II for the FAA. Using the lessons learned from this experience, we designed a new modeling language that (among other things) incorporates features to assist in designing less error-prone human-computer interactions and interfaces and in detecting potential HCI problems, such as mode confusion. The new language, SpecTRM-RL, uses "intent" abstractions, based on Rasmussen's abstraction hierarchy, and includes both informal (English and graphical) specifications and formal, executable models for specifying various aspects of the system. One of the goals for our language was to highlight the system modes and mode changes to assist in identifying the potential for mode confusion. Three published papers resulted from this research. The first builds on the work of Degani on mode confusion to identify aspects of the system design that could lead to potential hazards. We defined and modeled modes differently than Degani and also defined design criteria for SpecTRM-RL models. Our design criteria include the Degani criteria but extend them to include more potential problems. In a second paper, Leveson and Palmer showed how the criteria for indirect mode transitions could be applied to a mode confusion problem found in several ASRS reports for the MD-88. In addition, we defined a visual task modeling language that can be used by system designers to model human-computer interaction. The visual models can be translated into SpecTRM-RL models, and then the SpecTRM-RL suite of analysis tools can be used to perform formal and informal safety analyses on the task model in isolation or integrated with the rest of the modeled system. We had hoped to be able to apply these modeling languages and analysis tools to a TAP air/ground trajectory negotiation scenario, but the development of the tools took more time than we anticipated.
Solving quantum optimal control problems using Clebsch variables and Lin constraints
NASA Astrophysics Data System (ADS)
Delgado-Téllez, M.; Ibort, A.; Rodríguez de la Peña, T.
2018-01-01
Clebsch variables (and Lin constraints) are applied to the study of a class of optimal control problems for affine-controlled quantum systems. The optimal control problem will be modelled with controls defined on an auxiliary space where the dynamical group of the system acts freely. The reciprocity between both theories, the classical theory defined by the objective functional and the quantum system, is established by using a suitable version of Lagrange's multipliers theorem and a geometrical interpretation of the constraints of the system as defining a subspace of horizontal curves in an associated bundle. It is shown how the solutions of the variational problem defined by the objective functional determine solutions of the quantum problem. This yields a new way of obtaining explicit solutions for a family of optimal control problems for affine-controlled quantum systems (finite or infinite dimensional). One of its main advantages is that the use of Clebsch variables allows such solutions to be computed from solutions of invariant problems that can often be computed explicitly. This procedure can be presented as an algorithm that can be applied to a large class of systems. Finally, some simple examples, spin control, a simple quantum Hamiltonian with an 'Elroy beanie' type classical model and a controlled one-dimensional quantum harmonic oscillator, illustrating the main features of the theory, will be discussed.
ERIC Educational Resources Information Center
Longenecker, Herbert E., Jr.; Yarbrough, David M.; Feinstein, David L.
2010-01-01
IS2002 has become a well defined standard for information systems curricula. The Data Management Association (DAMA 2006) curriculum framework defines a body of knowledge that points to a skill set that can enhance IS2002. While data management professionals are highly skilled individuals requiring as much as a decade of relevant experience before…
Swanson, Larry W.; Bota, Mihail
2010-01-01
The nervous system is a biological computer integrating the body's reflex and voluntary environmental interactions (behavior) with a relatively constant internal state (homeostasis)—promoting survival of the individual and species. The wiring diagram of the nervous system's structural connectivity provides an obligatory foundational model for understanding functional localization at molecular, cellular, systems, and behavioral organization levels. This paper provides a high-level, downwardly extendible, conceptual framework—like a compass and map—for describing and exploring in neuroinformatics systems (such as our Brain Architecture Knowledge Management System) the structural architecture of the nervous system's basic wiring diagram. For this, the Foundational Model of Connectivity's universe of discourse is the structural architecture of nervous system connectivity in all animals at all resolutions, and the model includes two key elements—a set of basic principles and an internally consistent set of concepts (defined vocabulary of standard terms)—arranged in an explicitly defined schema (set of relationships between concepts) allowing automatic inferences. In addition, rules and procedures for creating and modifying the foundational model are considered. Controlled vocabularies with broad community support typically are managed by standing committees of experts that create and refine boundary conditions, and a set of rules that are available on the Web. PMID:21078980
Using a simulation assistant in modeling manufacturing systems
NASA Technical Reports Server (NTRS)
Schroer, Bernard J.; Tseng, Fan T.; Zhang, S. X.; Wolfsberger, John W.
1988-01-01
Numerous simulation languages exist for modeling discrete event processes, and many have now been ported to microcomputers. Graphic and animation capabilities were added to many of these languages to help users build models and evaluate the simulation results. With all these languages and added features, the user is still plagued with learning the simulation language. Furthermore, the time to construct and then to validate the simulation model is always greater than originally anticipated. One approach to minimize the time requirement is to use pre-defined macros that describe various common processes or operations in a system. The development of a simulation assistant for modeling discrete event manufacturing processes is presented. A simulation assistant is defined as an interactive intelligent software tool that assists the modeler in writing a simulation program by translating the modeler's symbolic description of the problem and then automatically generating the corresponding simulation code. The simulation assistant is discussed with emphasis on an overview of the simulation assistant, the elements of the assistant, and the five manufacturing simulation generators. A typical manufacturing system will be modeled using the simulation assistant and the advantages and disadvantages discussed.
Meteorological Processes Affecting Air Quality – Research and Model Development Needs
Meteorology modeling is an important component of air quality modeling systems that defines the physical and dynamical environment for atmospheric chemistry. The meteorology models used for air quality applications are based on numerical weather prediction models that were devel...
Pedersen, Rune
2017-01-01
This is a project proposal derived from the need to redefine the governance of ICT in healthcare towards regional and national standardization of patient pathways. The focus is on a two-level approach for governing EPR systems in which clinicians model structured variables and patient pathways. The overall goal is a patient-centric EPR portfolio. This paper defines and highlights the need for establishing the socio-technical architect role necessary to obtain the capabilities of a modern structured EPR system, since clinicians alone are not able to mediate between the technical and the clinical.
Urine sampling and collection system optimization and testing
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Geating, J. A.; Koesterer, M. G.
1975-01-01
A Urine Sampling and Collection System (USCS) engineering model was developed to provide for the automatic collection, volume sensing and sampling of urine from each micturition. The purpose of the engineering model was to demonstrate verification of the system concept. The objective of the optimization and testing program was to update the engineering model, to provide additional performance features and to conduct system testing to determine operational problems. Optimization tasks were defined as modifications to minimize system fluid residual and addition of thermoelectric cooling.
From conceptual modeling to a map
NASA Astrophysics Data System (ADS)
Gotlib, Dariusz; Olszewski, Robert
2018-05-01
Nowadays almost every map is a component of an information system. The design and production of maps requires the use of specific rules for modeling information systems: conceptual, application and data modelling. While analyzing the various stages of cartographic modeling, the authors ask at what stage of this process a map comes into being. Can we say that the "life of the map" begins even before someone defines its form of presentation? This question is particularly important at a time when the number of new geoinformation products is increasing exponentially. In analyzing the theory of cartography and the relations of the discipline to other fields of knowledge, an attempt has been made to define a few properties of cartographic modeling which distinguish the process from other methods of spatial modeling. Assuming that the map is a model of reality (created in the process of cartographic modeling supported by domain-modeling), the article proposes an analogy between the process of cartographic modeling and the scheme of conceptual modeling presented in the ISO 19101 standard.
Method and system for analyzing and classifying electronic information
McGaffey, Robert W.; Bell, Michael Allen; Kortman, Peter J.; Wilson, Charles H.
2003-04-29
A data analysis and classification system that reads the electronic information, analyzes the electronic information according to a user-defined set of logical rules, and returns a classification result. The data analysis and classification system may accept any form of computer-readable electronic information. The system creates a hash table wherein each entry of the hash table contains a concept corresponding to a word or phrase which the system has previously encountered. The system creates an object model based on the user-defined logical associations, used for reviewing each concept contained in the electronic information in order to determine whether the electronic information is classified. The data analysis and classification system extracts each concept in turn from the electronic information, locates it in the hash table, and propagates it through the object model. In the event that the system can not find the electronic information token in the hash table, that token is added to a missing terms list. If any rule is satisfied during propagation of the concept through the object model, the electronic information is classified.
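A toy sketch of the described flow, with an invented concept table and rule (not the system's actual data structures), might look like this:

```python
# Hypothetical sketch of the described flow (names, rules, and tokenisation
# are invented): tokens are looked up in a concept hash table, unknown tokens
# go to a missing-terms list, and the text is classified if any rule fires.
concept_table = {              # hash table: word/phrase -> concept
    "launch code": "SENSITIVE",
    "budget": "FINANCIAL",
}
rules = [                      # user-defined logical rules over concepts
    lambda concepts: "SENSITIVE" in concepts,
]

def classify(text):
    concepts, missing_terms = set(), []
    for token in text.lower().split(","):        # simplistic tokenisation
        token = token.strip()
        if token in concept_table:
            concepts.add(concept_table[token])   # propagate into the object model
        else:
            missing_terms.append(token)          # record unknown terms
    classified = any(rule(concepts) for rule in rules)
    return classified, missing_terms

print(classify("budget, launch code, picnic"))   # (True, ['picnic'])
```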
Scoping Planning Agents With Shared Models
NASA Technical Reports Server (NTRS)
Bedrax-Weiss, Tania; Frank, Jeremy D.; Jonsson, Ari K.; McGann, Conor
2003-01-01
In this paper we provide a formal framework to define the scope of planning agents based on a single declarative model. Having multiple agents sharing a single model provides numerous advantages that lead to reduced development costs and increased reliability of the system. We formally define planning in terms of extensions of an initial partial plan, and a set of flaws that make the plan unacceptable. A Flaw Filter (FF) allows us to identify those flaws relevant to an agent. Flaw filters motivate the Plan Identification Function (PIF), which specifies when an agent is ready to hand control to another agent for further work. PIFs define a set of plan extensions that can be generated from a model and a plan request. FFs and PIFs can be used to define the scope of agents without changing the model. We describe an implementation of PIFs and FFs within the context of EUROPA, a constraint-based planning architecture, and show how it can be used to easily design many different agents.
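A minimal sketch of how a Flaw Filter and a Plan Identification Function could be expressed (hypothetical types, not the EUROPA API):

```python
# Hypothetical sketch: a flaw filter selects the flaws an agent is scoped to
# resolve, and a plan-identification function (PIF) decides when the partial
# plan is ready to hand over to another agent. Types are invented.
from dataclasses import dataclass, field

@dataclass
class Flaw:
    kind: str            # e.g. "open-goal", "unbound-variable", "threat"
    detail: str = ""

@dataclass
class PartialPlan:
    steps: list = field(default_factory=list)
    flaws: list = field(default_factory=list)     # list[Flaw]

def flaw_filter(plan: PartialPlan, handled_kinds: set) -> list:
    """Return only the flaws relevant to this agent."""
    return [f for f in plan.flaws if f.kind in handled_kinds]

def plan_identification(plan: PartialPlan, handled_kinds: set) -> bool:
    """PIF: the plan is ready to hand off when no relevant flaw remains."""
    return len(flaw_filter(plan, handled_kinds)) == 0
```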
A Control Concept for Large Flexible Spacecraft Using Order Reduction Techniques
NASA Technical Reports Server (NTRS)
Thieme, G.; Roth, H.
1985-01-01
Results found during the investigation of control problems of large flexible spacecraft are given. A triple plate configuration of such a spacecraft is defined and studied. The model is defined by modal data derived from finite element modeling. The order reduction method applied is briefly described. An attitude control concept with low and high authority control has been developed to design an attitude controller for the reduced model. The stability and response of the original system together with the reduced controller are analyzed.
AutoRoute Rapid Flood Inundation Model
2013-03-01
Res. 33(2): 309-319. U.S. Army Engineer Hydrologic Engineering Center. 2010. "HEC-RAS: River Analysis System, User's Manual, Version 4.1." Davis...cross-section data does not exist. As such, the AutoRoute model is not meant to be as accurate as models such as HEC-RAS (U.S. Army Engineer...such as HEC-RAS assume that the defined low point of cross sections must be connected. However, in this approach the channel is assumed to be defined
Defining and reconstructing clinical processes based on IHE and BPMN 2.0.
Strasser, Melanie; Pfeifer, Franz; Helm, Emmanuel; Schuler, Andreas; Altmann, Josef
2011-01-01
This paper describes the current status and the results of our process management system for defining and reconstructing clinical care processes, which makes it possible to compare, analyze and evaluate clinical processes and, further, to identify high-cost tasks or stays. The system is founded on IHE, which guarantees standardized interfaces and interoperability between clinical information systems. At the heart of the system is BPMN, a modeling notation and specification language, which allows the definition and execution of clinical processes. The system provides functionality to define clinical core processes independent of the healthcare information system and to execute the processes in a workflow engine. Furthermore, the reconstruction of clinical processes is done by evaluating an IHE audit log database, which records patient movements within a health care facility. The main goal of the system is to assist hospital operators and clinical process managers in detecting discrepancies between defined and actual clinical processes, as well as in identifying the main causes of high medical costs. Beyond that, the system can potentially contribute to reconstructing and improving clinical processes and to enhancing cost control and patient care quality.
Exploratory Model Analysis of the Space Based Infrared System (SBIRS) Low Global Scheduler Problem
1999-12-01
solution. The non- linear least squares model is defined as Y = f{e,t) where: 0 =M-element parameter vector Y =N-element vector of all data t...NAVAL POSTGRADUATE SCHOOL Monterey, California THESIS EXPLORATORY MODEL ANALYSIS OF THE SPACE BASED INFRARED SYSTEM (SBIRS) LOW GLOBAL SCHEDULER...December 1999 3. REPORT TYPE AND DATES COVERED Master’s Thesis 4. TITLE AND SUBTITLE EXPLORATORY MODEL ANALYSIS OF THE SPACE BASED INFRARED SYSTEM
Particle Model for Work, Heat, and the Energy of a Thermodynamic System
ERIC Educational Resources Information Center
DeVoe, Howard
2007-01-01
A model of a thermodynamic system is described in which particles (representing atoms) interact with one another, the surroundings, and the earth's gravitational field according to the principles of classical mechanics. The system's energy "E" and internal energy "U" are defined. The importance is emphasized of the dependence of energy and work on…
Beyond the Systems Approach to Family Therapy: An Ecological Perspective.
ERIC Educational Resources Information Center
Coleman, Paul R.; Griffith, Mariellen
A brief review of systems theory provides a rationale for an underlying theoretical model within which systems theory can be more completely understood. The essence of the model is that persons are the major unit of study because the available means of satisfying "basic needs" define and shape interaction patterns in the family as in other human…
NASA Technical Reports Server (NTRS)
Jung, Jaewoo; D'Souza, Sarah N.; Johnson, Marcus A.; Ishihara, Abraham K.; Modi, Hemil C.; Nikaido, Ben; Hasseeb, Hashmatullah
2016-01-01
In anticipation of a rapid increase in the number of civil Unmanned Aircraft System (UAS) operations, NASA is researching prototype technologies for a UAS Traffic Management (UTM) system that will investigate airspace integration requirements for enabling safe, efficient low-altitude operations. One aspect a UTM system must consider is the correlation between UAS operations (such as vehicles, operation areas and durations), UAS performance requirements, and the risk to people and property in the operational area. This paper investigates the potential application of the International Civil Aviation Organization's (ICAO) Required Navigation Performance (RNP) concept to relate operational risk with trajectory conformance requirements. The approach is to first define a method to quantify operational risk and then define the RNP level requirement as a function of the operational risk. Greater operational risk corresponds to a more accurate RNP level, or smaller tolerable Total System Error (TSE). Data from 19 small UAS flights are used to develop and validate a formula that defines this relationship. An approach to assessing UAS-RNP conformance capability using vehicle modeling and wind field simulation is developed to investigate how this formula may be applied in a future UTM system. The results indicate the modeled vehicle's flight path is robust to the simulated wind variation, and it can meet RNP level requirements calculated by the formula. The results also indicate how vehicle-modeling fidelity may be improved to adequately verify the assessed RNP level.
2017-06-01
The Naval Postgraduate School has developed a competency model for the systems engineering profession and is implementing a tool to support high...stakes human resource functions for the U.S. Army. A systems engineering career competency model (SECCM), recently developed by the Navy and verified by...the Office of Personnel Management (OPM), defines the critical competencies for successful performance as a systems engineer at each general schedule
NASA Astrophysics Data System (ADS)
Noffke, Benjamin W.
Carbon materials have the potential to replace some precious metals in renewable energy applications. These materials are particularly attractive because of the elemental abundance and relatively low nuclear mass of carbon, implying economically feasible and lightweight materials. Targeted design of carbon materials is hindered by the lack of fundamental understanding that is required to tailor their properties for the desired application. However, most available synthetic methods to create carbon materials involve harsh conditions that limit the control of the resulting structure. Without a well-defined structure, the system is too complex and fundamental studies cannot be definitive. This work seeks to gain fundamental understanding through the development and application of efficient computational models for these systems, in conjunction with experiments performed on soluble, well-defined graphene nanostructures prepared by our group using a bottom-up synthetic approach. Theory is used to determine mechanistic details for well-defined carbon systems in applications of catalysis and electrochemical transformations. The resulting computational models do well to explain previous observations of carbon materials and provide suggestions for future directions. However, as the system size of the nanostructures gets larger, the computational cost can become prohibitive. To reduce the computational scaling of quantum chemical calculations, a new fragmentation scheme has been developed that addresses the challenges of fragmenting conjugated molecules. By selecting fragments that retain important structural characteristics in graphene, a more efficient method is achieved. The new method paves the way for an automated, systematic fragmentation scheme of graphene molecules.
Challenges in Requirements Engineering: A Research Agenda for Conceptual Modeling
NASA Astrophysics Data System (ADS)
March, Salvatore T.; Allen, Gove N.
Domains for which information systems are developed deal primarily with social constructions—conceptual objects and attributes created by human intentions and for human purposes. Information systems play an active role in these domains. They document the creation of new conceptual objects, record and ascribe values to their attributes, initiate actions within the domain, track activities performed, and infer conclusions based on the application of rules that govern how the domain is affected when socially-defined and identified causal events occur. Emerging applications of information technologies evaluate such business rules, learn from experience, and adapt to changes in the domain. Conceptual modeling grammars aimed at representing their system requirements must include conceptual objects, socially-defined events, and the rules pertaining to them. We identify challenges to conceptual modeling research and pose an ontology of the artificial as a step toward meeting them.
Data Base Design Using Entity-Relationship Models.
ERIC Educational Resources Information Center
Davis, Kathi Hogshead
1983-01-01
The entity-relationship (ER) approach to database design is defined, and a specific example of an ER model (personnel-payroll) is examined. The requirements for converting ER models into specific database management systems are discussed. (Author/MSE)
Extension of Alvis compiler front-end
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wypych, Michał; Szpyrka, Marcin; Matyasik, Piotr, E-mail: mwypych@agh.edu.pl, E-mail: mszpyrka@agh.edu.pl, E-mail: ptm@agh.edu.pl
2015-12-31
Alvis is a formal modelling language that enables verification of distributed concurrent systems. An Alvis model's semantics finds expression in an LTS graph (labelled transition system). Execution of any language statement is expressed as a transition between formally defined states of such a model. An LTS graph is generated using a middle-stage Haskell representation of an Alvis model. Moreover, Haskell is used as a part of the Alvis language and is used to define parameters' types and operations on them. Thanks to the compiler's modular construction, many aspects of the compilation of an Alvis model may be modified. Providing new plugins for the Alvis Compiler that support languages like Java or C makes it possible to use these languages as a part of Alvis instead of Haskell. The paper presents the compiler's internal model and describes how the default specification language can be altered by new plugins.
NASA Astrophysics Data System (ADS)
Najafi, M. N.; Dashti-Naserabadi, H.
2018-03-01
In many situations we are interested in the propagation of energy in some portions of a three-dimensional system with dilute long-range links. In this paper, a sandpile model is defined on the three-dimensional small-world network with real dissipative boundaries and the energy propagation is studied in three dimensions as well as the two-dimensional cross-sections. Two types of cross-sections are defined in the system, one in the bulk and another in the system boundary. The motivation of this is to make clear how the statistics of the avalanches in the bulk cross-section tend to the statistics of the dissipative avalanches, defined in the boundaries, as the concentration of long-range links (α) increases. This trend is numerically shown to be a power law in a manner described in the paper. Two regimes of α are considered in this work. For sufficiently small values of α the dominant behavior of the system is just like that of the regular BTW, whereas for the intermediate values the behavior is nontrivial with some exponents that are reported in the paper. It is shown that the spatial extent up to which the statistics is similar to the regular BTW model scales with α just like the dissipative BTW model with the dissipation factor (mass in the corresponding ghost model) m² ~ α for the three-dimensional system as well as its two-dimensional cross-sections.
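For readers unfamiliar with the BTW model referenced above, the following is a minimal regular two-dimensional sandpile sketch (the paper studies a three-dimensional small-world variant with long-range links and dissipative boundaries, which is not reproduced here):

```python
# Minimal regular 2-D BTW sandpile sketch: grains are added at random sites,
# sites at or above the threshold topple to their four neighbours, and grains
# falling off the lattice dissipate at the open boundary.
import numpy as np

rng = np.random.default_rng(0)
L, z_c = 32, 4                       # lattice size, toppling threshold
h = np.zeros((L, L), dtype=int)      # sand heights

def add_grain_and_relax(h):
    i, j = rng.integers(L, size=2)
    h[i, j] += 1
    avalanche = 0                    # avalanche size = number of topplings
    while True:
        over = np.argwhere(h >= z_c)
        if over.size == 0:
            return avalanche
        for i, j in over:
            h[i, j] -= z_c
            avalanche += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < L and 0 <= nj < L:   # off-lattice grains dissipate
                    h[ni, nj] += 1

sizes = [add_grain_and_relax(h) for _ in range(20000)]
print("mean avalanche size:", np.mean(sizes))
```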
Crisis Intervention and the Military Family: A Model Installation Program.
1988-03-07
abuse to less serious problems like budget management and good parenting techniques. Family dysfunction is defined in two categories: high- and low-intensity conflict. The...system activated to deal with the problem through the Family Advocacy Case Management Team. It is an excellent system that protects, helps and
Reliability Modeling of Double Beam Bridge Crane
NASA Astrophysics Data System (ADS)
Han, Zhu; Tong, Yifei; Luan, Jiahui; Xiangdong, Li
2018-05-01
This paper briefly describes the structure of the double beam bridge crane and defines its basic parameters. According to the structure and system division of the double beam bridge crane, the reliability architecture of the double beam bridge crane system is proposed, and the reliability mathematical model is constructed.
Goal Structuring Notation in a Radiation Hardening Assurance Case for COTS-Based Spacecraft
NASA Technical Reports Server (NTRS)
Witulski, A.; Austin, R.; Evans, J.; Mahadevan, N.; Karsai, G.; Sierawski, B.; LaBel, K.; Reed, R.
2016-01-01
The attached presentation summarizes how mission assurance is supported by model-based representations of spacecraft systems that can define sub-system functionality, interfacing, and reliability parameters, and it details a new paradigm for assurance: a model-centric rather than document-centric process.
NASA Technical Reports Server (NTRS)
Adams, Douglas S.; Wu, Shih-Chin
2006-01-01
The MARSIS antenna booms are constructed using lenticular hinges between straight boom segments in a novel design which allows the booms to be extremely lightweight while retaining a high stiffness and well defined structural properties once they are deployed. Lenticular hinges are elegant in form but are complicated to model as they deploy dynamically and require highly specialized nonlinear techniques founded on carefully measured mechanical properties. Results from component level testing were incorporated into a highly specialized ADAMS model which employed an automated damping algorithm to account for the discontinuous boom lengths formed during the deployment. Additional models with more limited capabilities were also developed in both DADS and ABAQUS to verify the ADAMS model computations and to help better define the numerical behavior of the models at the component and system levels. A careful comparison is made between the ADAMS and DADS models in a series of progressive steps in order to verify their numerical results. Different trade studies considered in the model development are outlined to demonstrate a suitable level of model fidelity. Some model sensitivities to various parameters are explored using subscale and full system models. Finally, some full system DADS models are exercised to illustrate the limitations of traditional modeling techniques for variable geometry systems which were overcome in the ADAMS model.
A solution to the surface intersection problem. [Boolean functions in geometric modeling
NASA Technical Reports Server (NTRS)
Timer, H. G.
1977-01-01
An application-independent geometric model within a data base framework should support the use of Boolean operators which allow the user to construct a complex model by appropriately combining a series of simple models. The use of these operators leads to the concept of implicitly and explicitly defined surfaces. With an explicitly defined model, the surface area may be computed by simply summing the surface areas of the bounding surfaces. For an implicitly defined model, the surface area computation must deal with active and inactive regions. Because the surface intersection problem involves four unknowns and its solution is a space curve, the parametric coordinates of each surface must be determined as a function of the arc length. Various subproblems involved in the general intersection problem are discussed, and the mathematical basis for their solution is presented along with a program written in FORTRAN IV for implementation on the IBM 370 TSO system.
Anderman, E.R.; Hill, M.C.
2000-01-01
This report documents the Hydrogeologic-Unit Flow (HUF) Package for the groundwater modeling computer program MODFLOW-2000. The HUF Package is an alternative internal flow package that allows the vertical geometry of the system hydrogeology to be defined explicitly within the model using hydrogeologic units that can be different than the definition of the model layers. The HUF Package works with all the processes of MODFLOW-2000. For the Ground-Water Flow Process, the HUF Package calculates effective hydraulic properties for the model layers based on the hydraulic properties of the hydrogeologic units, which are defined by the user using parameters. The hydraulic properties are used to calculate the conductance coefficients and other terms needed to solve the ground-water flow equation. The sensitivity of the model to the parameters defined within the HUF Package input file can be calculated using the Sensitivity Process, using observations defined with the Observation Process. Optimal values of the parameters can be estimated by using the Parameter-Estimation Process. The HUF Package is nearly identical to the Layer-Property Flow (LPF) Package, the major difference being the definition of the vertical geometry of the system hydrogeology. Use of the HUF Package is illustrated in two test cases, which also serve to verify the performance of the package by showing that the Parameter-Estimation Process produces the true parameter values when exact observations are used.
From guideline modeling to guideline execution: defining guideline-based decision-support services.
Tu, S. W.; Musen, M. A.
2000-01-01
We describe our task-based approach to defining the guideline-based decision-support services that the EON system provides. We categorize uses of guidelines in patient-specific decision support into a set of generic tasks--making of decisions, specification of work to be performed, interpretation of data, setting of goals, and issuance of alerts and reminders--that can be solved using various techniques. Our model includes constructs required for representing the knowledge used by these techniques. These constructs form a toolkit from which developers can select modeling solutions for guideline tasks. Based on the tasks and the guideline model, we define a guideline-execution architecture and a model of interactions between a decision-support server and clients that invoke services provided by the server. These services use generic interfaces derived from guideline tasks and their associated modeling constructs. We describe two implementations of these decision-support services and discuss how this work can be generalized. We argue that a well-defined specification of guideline-based decision-support services will facilitate sharing of tools that implement computable clinical guidelines. PMID:11080007
Animal Models of Human Granulocyte Diseases
Schäffer, Alejandro A.; Klein, Christoph
2012-01-01
In vivo animal models have proven very useful to understand basic biological pathways of the immune system, a prerequisite for the development of innovate therapies. This manuscript addresses currently available models for defined human monogenetic defects of neutrophil granulocytes, including murine, zebrafish and larger mammalian species. Strengths and weaknesses of each system are summarized, and clinical investigators may thus be inspired to develop further lines of research to improve diagnosis and therapy by use of the appropriate animal model system. PMID:23351993
NASA Technical Reports Server (NTRS)
1974-01-01
User models, defined as any explicit process or procedure used to transform information extracted from remotely sensed data into a form useful as a resource management information input, are discussed. The role of the user models as information, technological, and operations interfaces between the TERSSE and the resource managers is emphasized. It is recommended that guidelines and management strategies be developed for a systems approach to user model development.
Federation of UML models for cyber physical use cases
DOE Office of Scientific and Technical Information (OSTI.GOV)
This method employs the concept of federation, which is defined as the use of existing models that represent aspects of a system in specific domains (such as physical and cyber security domains) and the building of interfaces to link all of the domain models. Federation seeks to build on existing bodies of work. Some examples include the Common Information Models (CIM) maintained by the International Electrotechnical Commission Technical Committee 57 (IEC TC 57) for the electric power industry. Another relevant model is the CIM maintained by the Distributed Management Task Force (DMTF); this CIM defines a representation of the managed elements in an Information Technology (IT) environment. The power system is an example of a cyber-physical system, where the cyber systems, consisting of computing infrastructure such as networks and devices, play a critical role in the operation of the underlying physical electricity delivery system. Measurements from remote field devices are relayed to control centers through computer networks, and the data is processed to determine suitable control actions. Control decisions are then relayed back to field devices. It has been observed that threat actors may be able to successfully compromise this cyber layer in order to impact power system operation. Therefore, future control center applications must be wary of potentially compromised measurements coming from field devices. In order to ensure the integrity of the field measurements, these applications could make use of compromise indicators from alternate sources of information such as cyber security. Thus, modern control applications may require access to data from sources that are not defined in the local information model. In such cases, software application interfaces will require integration of data objects from cross-domain data models. When incorporating or federating different domains, it is important to have subject matter experts work together, recognizing that not everyone has the same knowledge, responsibilities, focus, or skill set.
Spectrophotovoltaic orbital power generation, phase 2
NASA Technical Reports Server (NTRS)
Lo, S. K.; Stoltzman, D.; Knowles, G.; Lin, R.
1981-01-01
A subscale model of the spectral splitting concentrator system with 10" aperture is defined and designed. The model is basically a scaled down version of Phase 1 design with an effective concentration ratio up to 1000:1. The system performance is predicted to be 21.5% for the 2 cell GaAs/Si system, and 20% for Si/GaAs at AM2 using realistic component efficiencies. Component cost of the model is projected in the $50K range. Component and system test plans are also detailed.
Specification and simulation of behavior of the Continuous Infusion Insulin Pump system.
Babamir, Seyed Morteza; Dehkordi, Mehdi Borhani
2014-01-01
The Continuous Infusion Insulin Pump (CIIP) system is responsible for monitoring a diabetic's blood sugar. In this paper, we aim to specify and simulate the CIIP software behavior. To this end, we first (1) present a model of the CIIP system behavior in response to the behavior of its environment (the diabetic) and (2) formally define the safety requirements of the system environment (the diabetic) in the Z formal modeling language. Such requirements should be satisfied by the CIIP software. Finally, we program the model and the requirements.
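As an illustration only, one plausible safety requirement of the kind the paper formalises in Z could be checked at run time as follows (the thresholds and dosing rule are invented, not taken from the paper):

```python
# Hypothetical safety-requirement check for an insulin pump controller.
# The safe band, the invariant, and the dosing rule are illustrative
# assumptions; the paper specifies its requirements formally in Z.
SAFE_MIN, SAFE_MAX = 4.0, 10.0      # assumed safe blood-glucose band (mmol/L)

def infusion_allowed(glucose_mmol_l, reservoir_units):
    """Invariant: never infuse when glucose is at or below the lower safe
    bound or when the reservoir is empty."""
    return glucose_mmol_l > SAFE_MIN and reservoir_units > 0.0

def controller_step(glucose_mmol_l, reservoir_units):
    if not infusion_allowed(glucose_mmol_l, reservoir_units):
        return 0.0                                   # withhold insulin
    # naive proportional dose above the upper bound (illustrative only)
    return max(0.0, 0.1 * (glucose_mmol_l - SAFE_MAX))

print(controller_step(12.0, 50.0))   # small dose
print(controller_step(3.5, 50.0))    # 0.0: safety invariant blocks infusion
```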
Modelling of Biometric Identification System with Given Parameters Using Colored Petri Nets
NASA Astrophysics Data System (ADS)
Petrosyan, G.; Ter-Vardanyan, L.; Gaboutchian, A.
2017-05-01
Biometric identification systems use given parameters and are modelled here on the basis of Colored Petri Nets, a modelling language developed for systems in which communication, synchronization and distributed resources play an important role. Colored Petri Nets combine the strengths of Classical Petri Nets with the power of a high-level programming language. Coloured Petri Nets have both formal, intuitive and graphical presentations. A graphical CPN model consists of a set of interacting modules which include a network of places, transitions and arcs. The mathematical representation has a well-defined syntax and semantics, and defines the system's behavioural properties. One of the best known features used in biometrics is the human fingerprint pattern. During the last decade other human features have become of interest, such as iris-based or face recognition. The objective of this paper is to introduce the fundamental concepts of Petri Nets in relation to tooth shape analysis. Biometric identification systems function in two phases: a data enrollment phase and an identification phase. During the data enrollment phase, images of teeth are added to the database. Each record contains enrollment data as a noisy version of the biometric data corresponding to the individual. During the identification phase an unknown individual is observed again and compared to the enrollment data in the database, and the system then identifies the individual. The purpose of modeling a biometric identification system by means of Petri Nets is to reveal the following aspects of the functioning model: the efficiency of the model, the behavior of the model, mistakes and accidents in the model, and the feasibility of simplifying the model or substituting separate components with more effective ones without interfering with system functioning. The results of modeling and evaluating the biometric identification system are presented and discussed.
NASA Technical Reports Server (NTRS)
Bole, Brian; Goebel, Kai; Vachtsevanos, George
2012-01-01
This paper introduces a novel Markov process formulation of stochastic fault growth modeling, in order to facilitate the development and analysis of prognostics-based control adaptation. A metric representing the relative deviation between the nominal output of a system and the net output that is actually enacted by an implemented prognostics-based control routine will be used to define the action space of the formulated Markov process. The state space of the Markov process will be defined in terms of an abstracted metric representing the relative health remaining in each of the system's components. The proposed formulation of component fault dynamics will conveniently relate feasible system output performance modifications to predictions of future component health deterioration.
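A toy version of such a formulation, with invented state and action spaces and transition probabilities, might look like the following (the paper's actual abstractions differ):

```python
# Illustrative Markov fault-growth sketch: the state is a discretised component
# health level, the action is the relative output deviation demanded of the
# component, and higher demand raises the probability of further degradation.
# All numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
health_states = np.linspace(1.0, 0.0, 11)        # 1.0 = healthy ... 0.0 = failed

def transition(state_idx, action):
    """action in [0, 1]: relative deviation from nominal output; larger
    deviation implies a higher probability of degrading one state."""
    if state_idx == len(health_states) - 1:
        return state_idx                          # failed state is absorbing
    p_degrade = 0.02 + 0.2 * action
    return state_idx + (1 if rng.random() < p_degrade else 0)

def simulate(policy, horizon=500):
    s = 0
    for _ in range(horizon):
        s = transition(s, policy(health_states[s]))
        if health_states[s] == 0.0:
            break
    return health_states[s]

aggressive = lambda h: 1.0                        # always demand nominal output
prognostic = lambda h: h                          # back off as health declines
print("health after horizon (aggressive):", simulate(aggressive))
print("health after horizon (prognostic):", simulate(prognostic))
```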
Modeling stock price dynamics by continuum percolation system and relevant complex systems analysis
NASA Astrophysics Data System (ADS)
Xiao, Di; Wang, Jun
2012-10-01
The continuum percolation system is developed to model a random stock price process in this work. Recent empirical research has demonstrated various statistical features of stock price changes; a financial model aiming at understanding price fluctuations needs to define a mechanism for the formation of the price, in an attempt to reproduce and explain this set of empirical facts. The continuum percolation model is usually referred to as a random coverage process or a Boolean model; the local interaction or influence among traders is constructed by the continuum percolation, and a cluster of continuum percolation is applied to define the cluster of traders sharing the same opinion about the market. We investigate and analyze the statistical behaviors of normalized returns of the price model by some analysis methods, including power-law tail distribution analysis, chaotic behavior analysis and Zipf analysis. Moreover, we consider the daily returns of the Shanghai Stock Exchange Composite Index from January 1997 to July 2011, and comparisons of return behaviors between the actual data and the simulation data are exhibited.
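The return analysis itself can be illustrated independently of the percolation price model; the sketch below computes normalized log returns from a synthetic price series and estimates a power-law tail exponent with a Hill estimator (the series stands in for the model output or the SSE data, neither of which is used here):

```python
# Sketch of the return analysis only: normalized log returns and a Hill
# estimate of the tail index. The heavy-tailed price series is synthetic.
import numpy as np

rng = np.random.default_rng(2)
prices = 100 * np.exp(np.cumsum(rng.standard_t(df=3, size=5000) * 0.01))

returns = np.diff(np.log(prices))                        # log returns
norm_ret = (returns - returns.mean()) / returns.std()    # normalized returns

def hill_tail_exponent(x, k=200):
    """Hill estimator of the tail index from the k largest |x| values."""
    s = np.sort(np.abs(x))
    threshold = s[-(k + 1)]
    return k / np.sum(np.log(s[-k:] / threshold))

print("estimated tail exponent:", hill_tail_exponent(norm_ret))
```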
Peer-to-peer model for the area coverage and cooperative control of mobile sensor networks
NASA Astrophysics Data System (ADS)
Tan, Jindong; Xi, Ning
2004-09-01
This paper presents a novel model and distributed algorithms for the cooperation and redeployment of mobile sensor networks. A mobile sensor network is composed of a collection of wirelessly connected mobile robots equipped with a variety of sensors. In such a sensor network, each mobile node has sensing, computation, communication, and locomotion capabilities. The locomotion ability enhances the autonomous deployment of the system. The system can be rapidly deployed in hostile environments, inaccessible terrain, or disaster relief operations. The mobile sensor network is essentially a cooperative multiple robot system. This paper first presents a peer-to-peer model to define the relationship between neighboring communicating robots. Delaunay Triangulation and Voronoi diagrams are used to define the geometrical relationship between sensor nodes. This distributed model allows formal analysis for the fusion of spatio-temporal sensory information of the network. Based on the distributed model, this paper discusses a fault tolerant algorithm for autonomous self-deployment of the mobile robots. The algorithm considers the environment constraints, the presence of obstacles and the nonholonomic constraints of the robots. The distributed algorithm enables the system to reconfigure itself such that the area covered by the system can be enlarged. Simulation results have shown the effectiveness of the distributed model and deployment algorithms.
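The geometric neighbour relations underlying the peer-to-peer model can be computed directly with scipy.spatial, as in the following sketch on random node positions (this is not the paper's deployment algorithm):

```python
# Delaunay/Voronoi neighbour relations for a set of mobile sensor nodes,
# computed with scipy.spatial on random positions (illustrative only).
import numpy as np
from scipy.spatial import Delaunay, Voronoi

rng = np.random.default_rng(3)
nodes = rng.uniform(0.0, 100.0, size=(30, 2))    # 30 mobile sensor nodes

tri = Delaunay(nodes)      # Delaunay triangulation: candidate communicating neighbours
vor = Voronoi(nodes)       # Voronoi cells: region each node is responsible for covering

# neighbours of node 0 according to the Delaunay triangulation
indptr, indices = tri.vertex_neighbor_vertices
print("neighbours of node 0:", indices[indptr[0]:indptr[1]])
print("Voronoi region (vertex indices) of node 0:",
      vor.regions[vor.point_region[0]])
```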
A service-oriented data access control model
NASA Astrophysics Data System (ADS)
Meng, Wei; Li, Fengmin; Pan, Juchen; Song, Song; Bian, Jiali
2017-01-01
The development of mobile computing, cloud computing and distributed computing meets growing individual service needs. Faced with complex application systems, ensuring real-time, dynamic, and fine-grained data access control is an urgent problem. By analyzing common data access control models and building on the mandatory access control model, the paper proposes a service-oriented access control model. By regarding system services as subjects and database data as objects, the model defines access levels and access identification for subjects and objects, and ensures that system services access databases securely.
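The core check of such a mandatory, service-oriented model can be sketched as follows (levels, service names, and tables are invented for illustration):

```python
# Minimal sketch of the mandatory-access-control core of a service-oriented
# model: a service (subject) may read data (object) only if its access level
# dominates the object's level. All labels and names are invented.
LEVELS = {"public": 0, "internal": 1, "confidential": 2, "secret": 3}

subjects = {"report-service": "internal", "billing-service": "confidential"}
objects = {"usage-table": "internal", "salary-table": "confidential"}

def may_access(service, table):
    """'No read up': the subject level must be >= the object level."""
    return LEVELS[subjects[service]] >= LEVELS[objects[table]]

print(may_access("report-service", "salary-table"))    # False
print(may_access("billing-service", "usage-table"))    # True
```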
Exploring the Components of Dynamic Modeling Techniques
ERIC Educational Resources Information Center
Turnitsa, Charles Daniel
2012-01-01
Upon defining the terms modeling and simulation, it becomes apparent that there is a wide variety of different models, using different techniques, appropriate for different levels of representation for any one system to be modeled. Selecting an appropriate conceptual modeling technique from those available is an open question for the practitioner.…
Lemke, Heinz U; Berliner, Leonard
2011-05-01
Appropriate use of information and communication technology (ICT) and mechatronic (MT) systems is viewed by many experts as a means to improve workflow and quality of care in the operating room (OR). This will require a suitable information technology (IT) infrastructure, as well as communication and interface standards, such as specialized extensions of DICOM, to allow data interchange between surgical system components in the OR. A design of such an infrastructure, sometimes referred to as surgical PACS, but better defined as a Therapy Imaging and Model Management System (TIMMS), will be introduced in this article. A TIMMS should support the essential functions that enable and advance image guided therapy, and in the future, a more comprehensive form of patient-model guided therapy. Within this concept, the "image-centric world view" of the classical PACS technology is complemented by an IT "model-centric world view". Such a view is founded in the special patient modelling needs of an increasing number of modern surgical interventions as compared to the imaging intensive working mode of diagnostic radiology, for which PACS was originally conceptualised and developed. The modelling aspects refer to both patient information and workflow modelling. Standards for creating and integrating information about patients, equipment, and procedures are vitally needed when planning for an efficient OR. The DICOM Working Group 24 (WG-24) has been established to develop DICOM objects and services related to image and model guided surgery. To determine these standards, it is important to define step-by-step surgical workflow practices and create interventional workflow models per procedures or per variable cases. As the boundaries between radiation therapy, surgery and interventional radiology are becoming less well-defined, precise patient models will become the greatest common denominator for all therapeutic disciplines. In addition to imaging, the focus of WG-24 is to serve the therapeutic disciplines by enabling modelling technology to be based on standards. Copyright © 2011. Published by Elsevier Ireland Ltd.
CRT--Cascade Routing Tool to define and visualize flow paths for grid-based watershed models
Henson, Wesley R.; Medina, Rose L.; Mayers, C. Justin; Niswonger, Richard G.; Regan, R.S.
2013-01-01
The U.S. Geological Survey Cascade Routing Tool (CRT) is a computer application for watershed models that include the coupled Groundwater and Surface-water FLOW model, GSFLOW, and the Precipitation-Runoff Modeling System (PRMS). CRT generates output to define cascading surface and shallow subsurface flow paths for grid-based model domains. CRT requires a land-surface elevation for each hydrologic response unit (HRU) of the model grid; these elevations can be derived from a Digital Elevation Model raster data set of the area containing the model domain. Additionally, a list is required of the HRUs containing streams, swales, lakes, and other cascade termination features along with indices that uniquely define these features. Cascade flow paths are determined from the altitudes of each HRU. Cascade paths can cross any of the four faces of an HRU to a stream or to a lake within or adjacent to an HRU. Cascades can terminate at a stream, lake, or HRU that has been designated as a watershed outflow location.
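The basic cascade idea, routing each HRU across one of its four faces to its lowest neighbour until a stream or lake cell is reached, can be sketched as follows (CRT's actual input format and algorithm are more involved; the elevations and terminal cell here are invented):

```python
# Sketch of the basic cascade idea only: each HRU cascades across one of its
# four faces to its lowest neighbour, terminating at a designated stream or
# lake cell. Elevations and the terminal cell are hypothetical.
import numpy as np

elev = np.array([[10.0,  9.5,  9.0],
                 [ 9.8,  9.2,  8.5],
                 [ 9.4,  8.8,  8.0]])          # assumed HRU land-surface elevations
terminal = {(2, 2)}                            # e.g. a stream cell

def downslope_neighbour(i, j):
    """Return the lowest of the four face-adjacent HRUs, or None at a terminus."""
    if (i, j) in terminal:
        return None
    candidates = [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                  if 0 <= i + di < elev.shape[0] and 0 <= j + dj < elev.shape[1]]
    nxt = min(candidates, key=lambda c: elev[c])
    return nxt if elev[nxt] < elev[i, j] else None   # no cascade out of a pit

cell = (0, 0)
path = [cell]
while (cell := downslope_neighbour(*cell)) is not None:
    path.append(cell)
print("cascade path:", path)                   # ends at the stream cell (2, 2)
```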
A data types profile suitable for use with ISO EN 13606.
Sun, Shanghua; Austin, Tony; Kalra, Dipak
2012-12-01
ISO EN 13606 is a five part International Standard specifying how Electronic Healthcare Record (EHR) information should be communicated between different EHR systems and repositories. Part 1 of the standard defines an information model for representing the EHR information itself, including the representation of types of data value. A later International Standard, ISO 21090:2010, defines a comprehensive set of models for data types needed by all health IT systems. This latter standard is vast, and duplicates some of the functions already handled by ISO EN 13606 part 1. A profile (sub-set) of ISO 21090 would therefore be expected to provide EHR system vendors with a more specially tailored set of data types to implement and avoid the risk of providing more than one modelling option for representing the information properties. This paper describes the process and design decisions made for developing a data types profile for EHR interoperability.
Modeling Finite-Time Failure Probabilities in Risk Analysis Applications.
Dimitrova, Dimitrina S; Kaishev, Vladimir K; Zhao, Shouqi
2015-10-01
In this article, we introduce a framework for analyzing the risk of systems failure based on estimating the failure probability. The latter is defined as the probability that a certain risk process, characterizing the operations of a system, reaches a possibly time-dependent critical risk level within a finite-time interval. Under general assumptions, we define two dually connected models for the risk process and derive explicit expressions for the failure probability and also the joint probability of the time of the occurrence of failure and the excess of the risk process over the risk level. We illustrate how these probabilistic models and results can be successfully applied in several important areas of risk analysis, among which are systems reliability, inventory management, flood control via dam management, infectious disease spread, and financial insolvency. Numerical illustrations are also presented. © 2015 Society for Risk Analysis.
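A generic Monte Carlo illustration of a finite-time failure probability, with an invented risk process and critical level (the paper derives explicit, non-simulation expressions for its two model classes), is:

```python
# Monte Carlo estimate of P( R(t) >= b(t) for some t <= T ), i.e. the
# probability that the risk process crosses a time-dependent critical level
# within a finite horizon. The process dynamics and level are invented.
import numpy as np

rng = np.random.default_rng(4)
T, dt, n_paths = 10.0, 0.02, 20000
n_steps = int(T / dt)
drift, sigma = 0.3, 1.0                      # assumed risk-process parameters

def critical_level(t):
    return 5.0 + 0.5 * t                     # time-dependent critical risk level

r = np.zeros(n_paths)                        # risk process, one entry per path
failed = np.zeros(n_paths, dtype=bool)
for k in range(1, n_steps + 1):
    r += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    failed |= r >= critical_level(k * dt)    # record first passage over the level

print("estimated finite-time failure probability:", failed.mean())
```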
A cloud-based approach for interoperable electronic health records (EHRs).
Bahga, Arshdeep; Madisetti, Vijay K
2013-09-01
We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). Lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system - cloud health information systems technology architecture (CHISTAR) that achieves semantic interoperability through the use of a generic design methodology which uses a reference model that defines a general purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach that comprises of loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.
Study on queueing behavior in pedestrian evacuation by extended cellular automata model
NASA Astrophysics Data System (ADS)
Hu, Jun; You, Lei; Zhang, Hong; Wei, Juan; Guo, Yangyong
2018-01-01
This paper proposes a pedestrian evacuation model for effective simulation of evacuation efficiency based on extended cellular automata. In the model, a pedestrian's momentary transition probability to a target position is defined in terms of the floor field and queueing time, and the critical time is defined as the waiting-time threshold in a queue. Queueing time and critical time are derived using fractional Brownian motion through analysis of pedestrian arrival characteristics. Simulations using the platform and actual evacuations were conducted to study the relationships among system evacuation time, average system velocity, pedestrian density, flow rate, and critical time. The results demonstrate that at low pedestrian density, evacuation efficiency can be improved through adoption of the shortest-route strategy, and critical time has an inverse relationship with average system velocity. Conversely, at higher pedestrian densities, it is better to adopt the shortest-queueing-time strategy, and critical time is inversely related to flow rate.
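The abstract does not give the exact functional form; the sketch below assumes a common floor-field-style weighting in which a candidate cell's attractiveness decays with its static floor-field value and expected queueing time, purely to show how a momentary transition probability could be normalised.

```python
import numpy as np

def transition_probabilities(floor_field, queue_time, k_s=2.0, k_q=1.0):
    """Normalised transition probabilities over candidate target cells.

    floor_field : static floor-field values (e.g. distance-to-exit) of the
                  candidate cells around a pedestrian.
    queue_time  : expected queueing times at those cells.
    k_s, k_q    : assumed sensitivity parameters.
    """
    weights = np.exp(-k_s * np.asarray(floor_field) - k_q * np.asarray(queue_time))
    return weights / weights.sum()

# example: three candidate cells
print(transition_probabilities([1.0, 2.0, 3.0], [0.0, 4.0, 1.0]))
```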
Precision agricultural systems: a model of integrative science and technology
USDA-ARS?s Scientific Manuscript database
In the world of science research, long gone are the days when investigations are done in isolation. More often than not, science funding starts with one or more well-defined challenges or problems, judged by society as high-priority and needing immediate attention. As such, problems are not defined...
Health economics, equity, and efficiency: are we almost there?
Ferraz, Marcos Bosi
2015-01-01
Health care is a highly complex, dynamic, and creative sector of the economy. While health economics has to continue its efforts to improve its methods and tools to better inform decisions, the application needs to be aligned with the insights and models of other social sciences disciplines. Decisions may be guided by four concept models based on ethical and distributive justice: libertarian, communitarian, egalitarian, and utilitarian. The societal agreement on one model or a defined mix of models is critical to avoid inequity and unfair decisions in a public and/or private insurance-based health care system. The excess use of methods and tools without fully defining the basic goals and philosophical principles of the health care system and without evaluating the fitness of these measures to reaching these goals may not contribute to an efficient improvement of population health.
MASCARET: creating virtual learning environments from system modelling
NASA Astrophysics Data System (ADS)
Querrec, Ronan; Vallejo, Paola; Buche, Cédric
2013-03-01
The design process for a Virtual Learning Environment (VLE) such as that put forward in the SIFORAS project (SImulation FOR training and ASsistance) means that system specifications can be differentiated from pedagogical specifications. System specifications can also be obtained directly from the specialists' expertise; that is to say, directly from Product Lifecycle Management (PLM) tools. To do this, the system model needs to be considered as a piece of VLE data. In this paper we present MASCARET, a meta-model which can be used to represent such system models. In order to ensure that the meta-model is capable of describing, representing and simulating such systems, MASCARET is based on SysML, a standard defined by the OMG.
Probabilistic computer model of optimal runway turnoffs
NASA Technical Reports Server (NTRS)
Schoen, M. L.; Preston, O. W.; Summers, L. G.; Nelson, B. A.; Vanderlinden, L.; Mcreynolds, M. C.
1985-01-01
Landing delays are currently a problem at major air carrier airports and many forecasters agree that airport congestion will get worse by the end of the century. It is anticipated that some types of delays can be reduced by an efficient optimal runway exit system allowing the increased approach volumes necessary at congested airports. A computerized Probabilistic Runway Turnoff Model which locates exits and defines path geometry for a selected maximum occupancy time appropriate for each TERPS aircraft category is defined. The model includes an algorithm for lateral ride comfort limits.
Finite State Models of Manned Systems: Validation, Simplification, and Extension.
1979-11-01
…model, a time set is needed. A time set is some set T together with a binary relation defined on T which linearly orders the set. If "model time" is … discrete, so is T; continuous time is represented by a set corresponding to a subset of the non-negative real numbers. In the following discussion, time … defined as sequences, over time, of input and output values. The notion of sequences or trajectories is formalized as: A^T = {x | x: T → A}, B^T = {y | y: T → B} …
The existence of negative absolute temperatures in Axelrod’s social influence model
NASA Astrophysics Data System (ADS)
Villegas-Febres, J. C.; Olivares-Rivas, W.
2008-06-01
We introduce the concept of temperature as an order parameter in the standard Axelrod social influence model. It is defined as the relation between suitably defined entropy and energy functions, T = (∂E/∂S). We show that at the critical point, where the order/disorder transition occurs, this absolute temperature changes sign. At this point, which corresponds to the homogeneous/heterogeneous culture transition, the entropy of the system shows a maximum. We discuss the relationship between the temperature and other properties of the model in terms of cultural traits.
An interactive graphics system to facilitate finite element structural analysis
NASA Technical Reports Server (NTRS)
Burk, R. C.; Held, F. H.
1973-01-01
The characteristics of an interactive graphics systems to facilitate the finite element method of structural analysis are described. The finite element model analysis consists of three phases: (1) preprocessing (model generation), (2) problem solution, and (3) postprocessing (interpretation of results). The advantages of interactive graphics to finite element structural analysis are defined.
Edge-defined film-fed growth of thin silicon sheets
NASA Technical Reports Server (NTRS)
Ettouney, H. M.; Kalejs, J. P.
1984-01-01
Finite element analysis was used on two length scales to understand crystal growth of thin silicon sheets. Thermal-capillary models of entire ribbon growth systems were developed. Microscopic modeling of the morphological structure of melt/solid interfaces beyond the point of linear instability was carried out. The application to the silicon system is discussed.
Nonlinear stability in reaction-diffusion systems via optimal Lyapunov functions
NASA Astrophysics Data System (ADS)
Lombardo, S.; Mulone, G.; Trovato, M.
2008-06-01
We define optimal Lyapunov functions to study nonlinear stability of constant solutions to reaction-diffusion systems. A computable and finite radius of attraction for the initial data is obtained. Applications are given to the well-known Brusselator model and a three-species model for the spatial spread of rabies among foxes.
ERIC Educational Resources Information Center
Gauthier, Benoit; And Others
1997-01-01
Identifies the more representative problem-solving models in environmental education. Suggests the addition of a strategy for defining a problem situation using Soft Systems Methodology to environmental education activities explicitly designed for the development of critical thinking. Contains 45 references. (JRH)
Equilibrator: Modeling Chemical Equilibria with Excel
ERIC Educational Resources Information Center
Vander Griend, Douglas A.
2011-01-01
Equilibrator is a Microsoft Excel program for learning about chemical equilibria through modeling, similar in function to EQS4WIN, which is no longer supported and does not work well with newer Windows operating systems. Similar to EQS4WIN, Equilibrator allows the user to define a system with temperature, initial moles, and then either total…
Mathematical Modeling of Food Supply for Long Term Space Missions Using Advanced Life Support
NASA Technical Reports Server (NTRS)
Cruthirds, John E.
2003-01-01
A habitat for long duration missions which utilizes Advanced Life Support (ALS), the Bioregenerative Planetary Life Support Systems Test Complex (BIO-Plex), is currently being built at JSC. In this system all consumables will be recycled and reused. In support of this effort, a menu is being planned utilizing ALS crops that will meet nutritional and psychological requirements. The need exists in the food system to identify specific physical quantities that define life support systems from an analysis and modeling perspective. Once these quantities are defined, they need to be fed into a mathematical model that takes into consideration other systems in the BIO-Plex. This model, if successful, will be used to understand the impacts of changes in the food system on the other systems and vice versa. The Equivalent System Mass (ESM) metric has been used to describe systems and subsystems, including the food system options, in terms of the single parameter, mass. There is concern that this approach might not adequately address the important issues of food quality and the psychological impact on crew morale of a supply of fresh food items. In fact, the mass of food can also depend on the quality of the food. This summer faculty fellow project will involve creating an appropriate mathematical model for the food plan developed by the Food Processing System for BIO-Plex. The desired outcome of this work will be a quantitative model that can be applied to the various options of supplying food on long-term space missions.
ODEion--a software module for structural identification of ordinary differential equations.
Gennemark, Peter; Wedelin, Dag
2014-02-01
In the systems biology field, algorithms for structural identification of ordinary differential equations (ODEs) have mainly focused on fixed model spaces like S-systems and/or on methods that require sufficiently good data so that derivatives can be accurately estimated. There is therefore a lack of methods and software that can handle more general models and realistic data. We present ODEion, a software module for structural identification of ODEs. Main characteristic features of the software are: • The model space is defined by arbitrary user-defined functions that can be nonlinear in both variables and parameters, such as for example chemical rate reactions. • ODEion implements computationally efficient algorithms that have been shown to efficiently handle sparse and noisy data. It can run a range of realistic problems that previously required a supercomputer. • ODEion is easy to use and provides SBML output. We describe the mathematical problem, the ODEion system itself, and provide several examples of how the system can be used. Available at: http://www.odeidentification.org.
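ODEion itself is a dedicated software module; purely as a hedged toy illustration of the underlying idea, structural identification over a user-defined model space can be mimicked by enumerating candidate right-hand-side terms and scoring each candidate structure against the observed trajectory. All function names and the search strategy below are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

# candidate right-hand-side terms for dx/dt, nonlinear in the state x
CANDIDATES = {
    "linear":   lambda x, p: p * x,
    "decay":    lambda x, p: -p * x,
    "logistic": lambda x, p: p * x * (1.0 - x),
}

def score_structure(name, p, t, x_obs):
    """Sum-of-squares misfit of a single-term candidate ODE structure."""
    rhs = lambda t_, x: CANDIDATES[name](x, p)
    sol = solve_ivp(rhs, (t[0], t[-1]), [x_obs[0]], t_eval=t)
    return np.sum((sol.y[0] - x_obs) ** 2)

# noisy logistic-growth data to identify
t = np.linspace(0, 10, 40)
x_obs = 1.0 / (1.0 + 9.0 * np.exp(-t)) + np.random.default_rng(1).normal(0, 0.01, t.size)

# brute-force search over single-term structures and a coarse parameter grid
best = min(
    ((name, p, score_structure(name, p, t, x_obs))
     for name in CANDIDATES for p in np.linspace(0.1, 2.0, 20)),
    key=lambda s: s[2],
)
print("selected structure:", best[0], "rate ≈", round(best[1], 2))
```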
Measuring neuronal avalanches in disordered systems with absorbing states
NASA Astrophysics Data System (ADS)
Girardi-Schappo, M.; Tragtenberg, M. H. R.
2018-04-01
Power-law-shaped avalanche-size distributions are widely used to probe for critical behavior in many different systems, particularly in neural networks. The definition of avalanche is ambiguous. Usually, theoretical avalanches are defined as the activity between a stimulus and the relaxation to an inactive absorbing state. On the other hand, experimental neuronal avalanches are defined by the activity between consecutive silent states. We claim that the latter definition may be extended to some theoretical models to characterize their power-law avalanches and critical behavior. We study a system in which the separation of driving and relaxation time scales emerges from its structure. We apply both definitions of avalanche to our model. Both yield power-law-distributed avalanches that scale with system size in the critical point as expected. Nevertheless, we find restricted power-law-distributed avalanches outside of the critical region within the experimental procedure, which is not expected by the standard theoretical definition. We remark that these results are dependent on the model details.
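As a hedged sketch of the experimental definition referred to above (activity between consecutive silent states), the helper below extracts avalanche sizes from a binned activity signal; the binning itself is assumed to have been done already.

```python
import numpy as np

def avalanche_sizes(binned_activity):
    """Sizes of avalanches, each defined as the total activity between
    consecutive silent (zero-count) time bins."""
    sizes, current = [], 0
    for count in binned_activity:
        if count > 0:
            current += count          # avalanche continues
        elif current > 0:
            sizes.append(current)     # a silent bin closes the avalanche
            current = 0
    if current > 0:
        sizes.append(current)
    return np.array(sizes)

# example: two avalanches of size 5 and 3
print(avalanche_sizes([0, 2, 3, 0, 0, 1, 2, 0]))
```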
NASA Technical Reports Server (NTRS)
Alexandrov, Natalia (Technical Monitor); Kuby, Michael; Tierney, Sean; Roberts, Tyler; Upchurch, Christopher
2005-01-01
This report reviews six classes of models that are used for studying transportation network topologies. The report is motivated by two main questions. First, what can the "new science" of complex networks (scale-free, small-world networks) contribute to our understanding of transport network structure, compared to more traditional methods? Second, how can geographic information systems (GIS) contribute to studying transport networks? The report defines terms that can be used to classify different kinds of models by their function, composition, mechanism, spatial and temporal dimensions, certainty, linearity, and resolution. Six broad classes of models for analyzing transport network topologies are then explored: GIS; static graph theory; complex networks; mathematical programming; simulation; and agent-based modeling. Each class of models is defined and classified according to the attributes introduced earlier. The paper identifies some typical types of research questions about network structure that have been addressed by each class of model in the literature.
Using the NPSS Environment to Model an Altitude Test Facility
NASA Technical Reports Server (NTRS)
Lavelle, Thomas M.; Owen, Albert K.; Huffman, Brian C.
2013-01-01
An altitude test facility was modeled using Numerical Propulsion System Simulation (NPSS). This altitude test facility model represents the most detailed facility model developed in the NPSS architecture. The current paper demonstrates the use of the NPSS system to define the required operating range of a component for the facility. A significant number of additional component models were easily developed to complete the model. Discussed in this paper are the additional components developed and what was done in the development of these components.
Healthcare waste management: an interpretive structural modeling approach.
Thakur, Vikas; Anbanandam, Ramesh
2016-06-13
Purpose - The World Health Organization identified infectious healthcare waste as a threat to the environment and human health. India's current medical waste management system has limitations, which lead to ineffective and inefficient waste handling practices. Hence, the purpose of this paper is to: first, identify the important barriers that hinder India's healthcare waste management (HCWM) systems; second, classify operational, tactical and strategic issues to discuss the managerial implications at different management levels; and third, place all barriers into four quadrants depending upon their driving and dependence power. Design/methodology/approach - India's HCWM system barriers were identified through the literature, field surveys and brainstorming sessions. Interrelationships among all the barriers were analyzed using interpretive structural modeling (ISM). Fuzzy-Matrice d'Impacts Croisés Multiplication Appliquée à un Classement (MICMAC) analysis was used to classify HCWM barriers into four groups. Findings - In total, 25 HCWM system barriers were identified and placed in 12 different ISM model hierarchy levels. Fuzzy-MICMAC analysis placed eight barriers in the second quadrant, five in the third and 12 in the fourth quadrant to define their relative importance in the ISM model. Research limitations/implications - The study's main limitation is that all the barriers were identified through a field survey and brainstorming sessions conducted only in Uttarakhand, a northern state of India. The problems in implementing HCWM practices may differ with the region; hence, the current study needs to be replicated in different Indian states to define the waste disposal strategies for hospitals. Practical implications - The model will help hospital managers and Pollution Control Boards to plan their resources accordingly and make policies targeting key performance areas. Originality/value - The study is the first attempt to identify India's HCWM system barriers and prioritize them.
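As a hedged illustration of the ISM step described (not the authors' data or exact procedure), a final reachability matrix can be obtained from a binary direct-influence matrix by Boolean transitive closure, and hierarchy levels by iteratively extracting the barriers whose reachability set is contained in their antecedent set.

```python
import numpy as np

def ism_levels(direct):
    """Level partition for Interpretive Structural Modelling.

    direct : square 0/1 matrix, direct[i, j] = 1 if barrier i influences j.
    Returns a list of levels, each a list of barrier indices.
    """
    n = direct.shape[0]
    reach = ((direct + np.eye(n, dtype=int)) > 0)            # reflexive direct influence
    for _ in range(n):                                       # Boolean transitive closure
        reach = reach | ((reach.astype(int) @ reach.astype(int)) > 0)
    remaining, levels = set(range(n)), []
    while remaining:
        level = [i for i in remaining
                 if {j for j in remaining if reach[i, j]}        # reachability set
                 <= {j for j in remaining if reach[j, i]}]       # antecedent set
        levels.append(level)
        remaining -= set(level)
    return levels

# toy example: barrier 0 drives 1, which drives 2
print(ism_levels(np.array([[0, 1, 0], [0, 0, 1], [0, 0, 0]])))
```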
Federating Cyber and Physical Models for Event-Driven Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stephan, Eric G.; Pawlowski, Ronald A.; Sridhar, Siddharth
The purpose of this paper is to describe a novel method to improve the interoperability of electric power system monitoring and control software applications. This method employs the concept of federation, which is defined as the use of existing models that represent aspects of a system in specific domains (such as the physical and cyber security domains) and the building of interfaces to link all of the domain models.
Berry connection in atom-molecule systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui Fucheng; Wu Biao; International Center for Quantum Materials, Peking University, 100871 Beijing
2011-08-15
In the mean-field theory of atom-molecule systems, where bosonic atoms combine to form molecules, there is no usual U(1) symmetry, presenting an apparent hurdle for defining the Berry phase and Berry curvature for these systems. We define a Berry connection for this system, with which the Berry phase and Berry curvature can be naturally computed. We use a three-level atom-molecule system to illustrate our results. In particular, we have computed the mean-field Berry curvature of this system analytically, and compared it to the Berry curvature computed with the second-quantized model of the same system. An excellent agreement is found, indicating the validity of our definition.
NASA Technical Reports Server (NTRS)
1978-01-01
A payload mission model covering 129 launches was examined and compared against the space transportation system shuttle standard orbit inclinations and a shuttle launch site implementation schedule. Based on this examination and comparison, a set of six reference missions was defined in terms of spacecraft weight and velocity requirements to deliver the payload from a 296 km circular Shuttle standard orbit to the spacecraft's planned orbit. Payload characteristics and requirements representative of the model payloads included in the regime bounded by each of the six reference missions were determined. A set of launch cost envelopes was developed and defined, based on the characteristics of existing/planned Shuttle upper stages and expendable launch systems, in terms of launch cost and velocity delivered. These six reference missions were used to define the requirements for the candidate propulsion modes, which were developed and screened to determine the propulsion approaches for conceptual design.
User-Defined Material Model for Progressive Failure Analysis
NASA Technical Reports Server (NTRS)
Knight, Norman F. Jr.; Reeder, James R. (Technical Monitor)
2006-01-01
An overview of different types of composite material system architectures and a brief review of progressive failure material modeling methods used for structural analysis including failure initiation and material degradation are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model (or UMAT) for use with the ABAQUS/Standard nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details and use of the UMAT subroutine are described in the present paper. Parametric studies for composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented.
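The UMAT itself is Fortran code compiled against ABAQUS/Standard; purely as a hedged, language-neutral sketch of the logic described (failure initiation followed by ply discounting), the snippet below checks a maximum-stress criterion against assumed ply strengths and degrades the corresponding stiffness coefficients. All strength and degradation values are invented.

```python
# assumed ply strengths (MPa): fibre tension/compression, transverse tension/compression, shear
XT, XC, YT, YC, S = 2000.0, 1500.0, 60.0, 200.0, 90.0
DEGRADE = 1.0e-3   # ply-discount factor applied to failed moduli

def max_stress_check(sigma):
    """Return which failure modes the ply stress state (s11, s22, s12) triggers."""
    s11, s22, s12 = sigma
    return {
        "fibre":      s11 > XT or s11 < -XC,
        "transverse": s22 > YT or s22 < -YC,
        "shear":      abs(s12) > S,
    }

def discount_moduli(moduli, failed):
    """Degrade E1, E2, G12 for the modes flagged as failed (ply discounting)."""
    E1, E2, G12 = moduli
    if failed["fibre"]:
        E1 *= DEGRADE
    if failed["transverse"]:
        E2 *= DEGRADE
    if failed["shear"]:
        G12 *= DEGRADE
    return E1, E2, G12

flags = max_stress_check((2100.0, 10.0, 20.0))   # fibre-direction overload
print(flags, discount_moduli((150e3, 10e3, 5e3), flags))
```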
Alternative Models for the Co-operative Governance of Teacher Education Programs.
ERIC Educational Resources Information Center
Sagan, Edgar L.; Smith, Barbara G.
This paper reviews and criticizes existing models of governance of teacher education and proposes alternative ones. Chapter I defines three models of governance including a) a bureaucratic model; b) a collaborative model; and c) a systems analysis model which is used to plan new models in the final chapters. Chapter II deals with the current…
Insights into mortality patterns and causes of death through a process point of view model.
Anderson, James J; Li, Ting; Sharrow, David J
2017-02-01
Process point of view (POV) models of mortality, such as the Strehler-Mildvan and stochastic vitality models, represent death in terms of the loss of survival capacity through challenges and dissipation. Drawing on hallmarks of aging, we link these concepts to candidate biological mechanisms through a framework that defines death as challenges to vitality where distal factors defined the age-evolution of vitality and proximal factors define the probability distribution of challenges. To illustrate the process POV, we hypothesize that the immune system is a mortality nexus, characterized by two vitality streams: increasing vitality representing immune system development and immunosenescence representing vitality dissipation. Proximal challenges define three mortality partitions: juvenile and adult extrinsic mortalities and intrinsic adult mortality. Model parameters, generated from Swedish mortality data (1751-2010), exhibit biologically meaningful correspondences to economic, health and cause-of-death patterns. The model characterizes the twentieth century epidemiological transition mainly as a reduction in extrinsic mortality resulting from a shift from high magnitude disease challenges on individuals at all vitality levels to low magnitude stress challenges on low vitality individuals. Of secondary importance, intrinsic mortality was described by a gradual reduction in the rate of loss of vitality presumably resulting from reduction in the rate of immunosenescence. Extensions and limitations of a distal/proximal framework for characterizing more explicit causes of death, e.g. the young adult mortality hump or cancer in old age are discussed.
Insights into mortality patterns and causes of death through a process point of view model
Anderson, James J.; Li, Ting; Sharrow, David J.
2016-01-01
Process point of view models of mortality, such as the Strehler-Mildvan and stochastic vitality models, represent death in terms of the loss of survival capacity through challenges and dissipation. Drawing on hallmarks of aging, we link these concepts to candidate biological mechanisms through a framework that defines death as challenges to vitality where distal factors defined the age-evolution of vitality and proximal factors define the probability distribution of challenges. To illustrate the process point of view, we hypothesize that the immune system is a mortality nexus, characterized by two vitality streams: increasing vitality representing immune system development and immunosenescence representing vitality dissipation. Proximal challenges define three mortality partitions: juvenile and adult extrinsic mortalities and intrinsic adult mortality. Model parameters, generated from Swedish mortality data (1751-2010), exhibit biologically meaningful correspondences to economic, health and cause-of-death patterns. The model characterizes the 20th century epidemiological transition mainly as a reduction in extrinsic mortality resulting from a shift from high magnitude disease challenges on individuals at all vitality levels to low magnitude stress challenges on low vitality individuals. Of secondary importance, intrinsic mortality was described by a gradual reduction in the rate of loss of vitality presumably resulting from reduction in the rate of immunosenescence. Extensions and limitations of a distal/proximal framework for characterizing more explicit causes of death, e.g. the young adult mortality hump or cancer in old age are discussed. PMID:27885527
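As a hedged, highly simplified sketch of the process point-of-view framing (not the fitted Swedish model), survival can be simulated by letting vitality dissipate with age while random challenges arrive, with death occurring when a challenge exceeds the remaining vitality or vitality reaches zero. All parameter values are arbitrary.

```python
import numpy as np

def simulate_ages_at_death(n=10_000, v0=1.0, dissipation=0.012,
                           challenge_rate=0.15, challenge_mean=0.35, seed=0):
    """Ages at death under a toy vitality-dissipation / challenge model."""
    rng = np.random.default_rng(seed)
    ages = np.zeros(n)
    for i in range(n):
        vitality, age = v0, 0.0
        while vitality > 0.0:
            age += rng.exponential(1.0 / challenge_rate)    # waiting time to next challenge
            vitality = v0 - dissipation * age               # distal loss of vitality with age
            if rng.exponential(challenge_mean) > vitality:  # proximal challenge exceeds vitality
                break
        ages[i] = min(age, v0 / dissipation)                # intrinsic death once vitality reaches zero
    return ages

ages = simulate_ages_at_death()
print("median age at death:", round(np.median(ages), 1))
```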
Pathogen Treatment Guidance and Monitoring Approaches for ...
On-site non-potable water reuse is increasingly used to augment water supplies, but traditional fecal indicator approaches for defining and monitoring exposure risks are limited when applied to these decentralized options. This session emphasizes risk-based modeling to define pathogen log-reduction requirements coupled with alternative targets for monitoring enabled by genomic sequencing (i.e., the microbiome of reuse systems). 1. Discuss risk-based modeling to define pathogen log-reduction requirements 2. Review alternative targets for monitoring 3. Gain an understanding of how new tools can help improve successful development of sustainable on-site non-potable water reuse Presented at the Water Wastewater Equipment Treatment & Transport Show.
Mining of Business-Oriented Conversations at a Call Center
NASA Astrophysics Data System (ADS)
Takeuchi, Hironori; Nasukawa, Tetsuya; Watanabe, Hideo
Recently it has become feasible to transcribe textual records from telephone conversations at call centers by using automatic speech recognition. In this research, we extended a text mining system for call summary records and constructed a conversation mining system for the business-oriented conversations at the call center. To acquire useful business insights from the conversational data through the text mining system, it is critical to identify appropriate textual segments and expressions as the viewpoints to focus on. In the analysis of call summary data using a text mining system, some experts defined the viewpoints for the analysis by looking at some sample records and by preparing the dictionaries based on frequent keywords in the sample dataset. However with conversations it is difficult to identify such viewpoints manually and in advance because the target data consists of complete transcripts that are often lengthy and redundant. In this research, we defined a model of the business-oriented conversations and proposed a mining method to identify segments that have impacts on the outcomes of the conversations and can then extract useful expressions in each of these identified segments. In the experiment, we processed the real datasets from a car rental service center and constructed a mining system. With this system, we show the effectiveness of the method based on the defined conversation model.
Hucka, Michael; Bergmann, Frank T.; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M.; Le Novère, Nicolas; Myers, Chris J.; Olivier, Brett G.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Waltemath, Dagmar; Wilkinson, Darren J.
2017-01-01
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528569
Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J
2015-09-04
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org.
Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J
2015-06-01
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 5 of SBML Level 2. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.
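As a hedged sketch of how such a declarative model file is typically consumed (using the python-libsbml bindings, which are a separate package from the specification itself), an SBML document can be read, checked against the validation rules, and its core constructs enumerated roughly as follows.

```python
import libsbml  # python-libsbml bindings, assumed to be installed

doc = libsbml.readSBML("model.xml")          # parse an SBML Level 2 file (illustrative filename)
if doc.getNumErrors() > 0:                   # validation rules from the specification
    doc.printErrors()

model = doc.getModel()
if model is not None:
    species = [model.getSpecies(i).getId() for i in range(model.getNumSpecies())]
    reactions = [model.getReaction(i).getId() for i in range(model.getNumReactions())]
    print("species:  ", species)
    print("reactions:", reactions)
```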
River Devices to Recover Energy with Advanced Materials (River DREAM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
McMahon, Daniel P.
2013-07-03
The purpose of this project is to develop a generator called a Galloping Hydroelectric Energy Extraction Device (GHEED). It uses a galloping prism to convert water flow into linear motion. This motion is converted into electricity via a dielectric elastomer generator (DEG). The galloping mechanism and the DEG are combined to create a system to effectively generate electricity. This project has three research objectives: 1. Oscillator development and design: characterize galloping behavior, evaluate the effect of control surface shape change on oscillator performance, and demonstrate shape change with water flow change. 2. Dielectric elastomer generator (DEG) characterization and modeling: characterize and model the performance of the DEG based on the oscillator design. 3. Galloping Hydroelectric Energy Extraction Device (GHEED) system modeling and integration: create numerical models for construction of a system performance model and define operating capabilities for this approach. Accomplishing these three objectives will result in the creation of a model that can be used to fully define the operating parameters and performance capabilities of a generator based on the GHEED design. This information will be used in the next phase of product development, the creation of an integrated laboratory scale generator to confirm model predictions.
Multi-Modal Transportation System Simulation
DOT National Transportation Integrated Search
1971-01-01
THE PRESENT STATUS OF A LABORATORY BEING DEVELOPED FOR REAL-TIME SIMULATION OF COMMAND AND CONTROL FUNCTIONS IN TRANSPORTATION SYSTEMS IS DISCUSSED. DETAILS ARE GIVEN ON THE SIMULATION MODELS AND ON PROGRAMMING TECHNIQUES USED IN DEFINING AND EVALUAT...
An Open Simulation System Model for Scientific Applications
NASA Technical Reports Server (NTRS)
Williams, Anthony D.
1995-01-01
A model for a generic and open environment for running multi-code or multi-application simulations - called the Open Simulation System Model (OSSM) - is proposed and defined. This model attempts to meet the requirements of complex systems like the Numerical Propulsion Simulator System (NPSS). OSSM places no restrictions on the types of applications that can be integrated at any stage of its evolution. This includes applications of different disciplines, fidelities, etc. An implementation strategy is proposed that starts with a basic prototype and evolves over time to accommodate an increasing number of applications. Potential (standard) software is also identified which may aid in the design and implementation of the system.
Automated extraction of knowledge for model-based diagnostics
NASA Technical Reports Server (NTRS)
Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.
1990-01-01
The concept of accessing computer aided design (CAD) design databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques as well as internal database of component descriptions to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.
Development of guidelines for the definition of the relevant information content in data classes
NASA Technical Reports Server (NTRS)
Schmitt, E.
1973-01-01
The problem of experiment design is defined as an information system consisting of information source, measurement unit, environmental disturbances, data handling and storage, and the mathematical analysis and usage of data. Based on today's concept of effective computability, general guidelines for the definition of the relevant information content in data classes are derived. The lack of a universally applicable information theory and corresponding mathematical or system structure is restricting the solvable problem classes to a small set. It is expected that a new relativity theory of information, generally described by a universal algebra of relations will lead to new mathematical models and system structures capable of modeling any well defined practical problem isomorphic to an equivalence relation at any corresponding level of abstractness.
NASA Astrophysics Data System (ADS)
Legro, J. R.; Abi-Samra, N. C.; Tesche, F. M.
1985-05-01
In addition to the initial transients designated as fast transient high-altitude EMP (HEMP) and intermediate-time EMP, electromagnetic signals are also observed at times from seconds to hundreds of seconds after a high-altitude nuclear burst. This signal was defined by the term magnetohydrodynamic-electromagnetic pulse (MHD-EMP). The MHD-EMP phenomenon was detected in actual weapon tests and predicted from theoretical models. A preliminary research effort to investigate the nature and coupling of the MHD-EMP environments to electric power systems is documented; the construction of approximate system response network models and the development of a unified methodology to assess equipment and system vulnerability are defined. The MHD-EMP environment is compared to a qualitatively similar natural event, the electromagnetic environment produced by geomagnetic storms.
Space Generic Open Avionics Architecture (SGOAA) standard specification
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1993-01-01
The purpose of this standard is to provide an umbrella set of requirements for applying the generic architecture interface model to the design of a specific avionics hardware/software system. This standard defines a generic set of system interface points to facilitate identification of critical interfaces and establishes the requirements for applying appropriate low level detailed implementation standards to those interface points. The generic core avionics system and processing architecture models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.
Aksamija, Goran; Mulabdic, Adi; Rasic, Ismar; Muhovic, Samir; Gavric, Igor
2011-01-01
Polytrauma is defined as an injury in which at least two different organ systems or body regions are affected, with at least one of the injuries being life-threatening. Given the multilevel model of care for polytrauma patients within KCUS, weaknesses in the management of this category of patients are inevitable. The aims were to determine the dynamics of existing procedures in the treatment of polytrauma patients on admission to KCUS and, based on statistical analysis of the applied variables, to determine and define the factors that influence the final outcome of treatment and their mutual relationships, which may help eliminate the flaws in the approach to the problem. The study was based on 263 polytrauma patients. Parametric and non-parametric statistical methods were used. Basic statistics were calculated and, based on the calculated parameters, multiple correlation analysis, image analysis, discriminant analysis and multifactorial analysis were used to achieve the research objectives. From the universe of variables we selected a sample of n = 25 variables, of which the first two are modular, while the others belong to the common measurement space (n = 23) and are defined in this paper as a system of variables describing the methods, procedures and assessments of polytrauma patients. After the multiple correlation analysis, and since the image analysis gave reliable measurement results, we proceeded to the analysis of eigenvalues, that is, to defining the factors from which information was obtained on the shortcomings of the existing model and its correlation with treatment outcome. The study singled out the essential factors that determine the current organizational model of care, which may affect the treatment and improve the outcome of polytrauma patients. This analysis has shown the maximum correlative relationships between these practices and contributed to the development of guidelines defined by the isolated factors.
NASA Astrophysics Data System (ADS)
Macian-Sorribes, Hector; Pulido-Velazquez, Manuel
2013-04-01
Water resources systems are mostly operated using a set of pre-defined rules that usually respond not to an optimal allocation in terms of water use or economic benefit, but to historical and institutional reasons. These operating policies are commonly reproduced as hedging rules, pack rules or zone-based operations, and simulation models can be used to test their performance under a wide range of hydrological and/or socio-economic hypotheses. Despite the high degree of acceptance and testing that these models have achieved, the actual operation of water resources systems rarely follows the pre-defined rules all the time, with the consequent uncertainty in the system performance. Real-world reservoir operation is very complex, affected by input uncertainty (imprecision in forecast inflow, seepage and evaporation losses, etc.), filtered by the reservoir operator's experience and natural risk-aversion, while considering the different physical and legal/institutional constraints in order to meet the different demands and system requirements. The aim of this work is to present a fuzzy logic approach to derive and assess the historical operation of a system. This framework uses a fuzzy rule-based system to reproduce pre-defined rules and also to match as closely as possible the actual decisions made by managers. Once built, the fuzzy rule-based system can be integrated in a water resources management model, making it possible to assess the system performance at the basin scale. The case study of the Mijares basin (eastern Spain) is used to illustrate the method. A reservoir operating curve regulates the two main reservoir releases (operated in a conjunctive way) with the purpose of guaranteeing a high reliability of supply to the traditional irrigation districts with higher priority (more senior demands that funded the reservoir construction). A fuzzy rule-based system has been created to reproduce the operating curve's performance, defining the system state (total water stored in the reservoirs) and the month of the year as inputs, and the demand deliveries as outputs. The developed simulation management model integrates the fuzzy rule-based operation of the two main reservoirs of the basin with the corresponding mass balance equations, the physical or boundary conditions and the water allocation rules among the competing demands. Historical inflow time series are used as inputs to the model simulation, which is trained and validated using historical information on reservoir storage levels and flows in several streams of the Mijares river. This methodology provides a more flexible approach that is closer to the real operating policies. The model is easy to develop and to understand due to its rule-based structure, which mimics the human way of thinking. This can improve cooperation and negotiation between managers, decision-makers and stakeholders. The approach can also be applied to analyze the historical operation of the reservoir (what we have called a reservoir operation "audit").
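As a hedged, minimal sketch of the kind of fuzzy rule-based operating policy described (not the calibrated Mijares model), storage and month can be fuzzified with triangular membership functions and combined by simple rules into a delivery fraction; every breakpoint and rule consequent below is invented for illustration.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def delivery_fraction(storage_frac, month):
    """Fraction of the irrigation demand delivered, from two fuzzy rules."""
    low  = tri(storage_frac, 0.0, 0.0, 0.5)              # storage is LOW
    high = tri(storage_frac, 0.3, 1.0, 1.0)              # storage is HIGH
    dry_season = 1.0 if month in (6, 7, 8, 9) else 0.5   # crisp seasonal weight

    # rule 1: IF storage HIGH THEN deliver the full demand (1.0)
    # rule 2: IF storage LOW  THEN hedge deliveries (0.4), more strongly in the dry season
    w1, w2 = high, low * dry_season
    if w1 + w2 == 0.0:
        return 1.0
    return (w1 * 1.0 + w2 * 0.4) / (w1 + w2)             # weighted-average defuzzification

print(delivery_fraction(0.25, 7))   # low storage in July -> hedged delivery
```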
Generic Educational Knowledge Representation for Adaptive and Cognitive Systems
ERIC Educational Resources Information Center
Caravantes, Arturo; Galan, Ramon
2011-01-01
The interoperability of educational systems, encouraged by the development of specifications, standards and tools related to the Semantic Web is limited to the exchange of information in domain and student models. High system interoperability requires that a common framework be defined that represents the functional essence of educational systems.…
NASA Technical Reports Server (NTRS)
Daly, J. K.; Torian, J. G.
1979-01-01
An overview of studies conducted to establish the requirements for advanced subsystem analytical tools is presented. Modifications are defined for updating current computer programs used to analyze environmental control, life support, and electric power supply systems so that consumables for future advanced spacecraft may be managed.
Guide to the Stand-Damage Model interface management system
George Racin; J. J. Colbert
1995-01-01
This programmer's support document describes the Gypsy Moth Stand-Damage Model interface management system. Management of stand-damage data made it necessary to define structures to store data and provide the mechanisms to manipulate these data. The software provides a user-friendly means to manipulate files, graph and manage outputs, and edit input data. The...
ERIC Educational Resources Information Center
Ledbetter, Michael P.; Hwang, Tony W.; Stovall, Gwendolyn M.; Ellington, Andrew D.
2013-01-01
Evolution is a defining criterion of life and is central to understanding biological systems. However, the timescale of evolutionary shifts in phenotype limits most classroom evolution experiments to simple probability simulations. "In vitro" directed evolution (IVDE) frequently serves as a model system for the study of Darwinian…
Applying knowledge compilation techniques to model-based reasoning
NASA Technical Reports Server (NTRS)
Keller, Richard M.
1991-01-01
Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.
Rule-based graph theory to enable exploration of the space system architecture design space
NASA Astrophysics Data System (ADS)
Arney, Dale Curtis
The primary goal of this research is to improve upon system architecture modeling in order to enable the exploration of design space options. A system architecture is the description of the functional and physical allocation of elements and the relationships, interactions, and interfaces between those elements necessary to satisfy a set of constraints and requirements. The functional allocation defines the functions that each system (element) performs, and the physical allocation defines the systems required to meet those functions. Trading the functionality between systems leads to the architecture-level design space that is available to the system architect. The research presents a methodology that enables the modeling of complex space system architectures using a mathematical framework. To accomplish the goal of improved architecture modeling, the framework meets five goals: technical credibility, adaptability, flexibility, intuitiveness, and exhaustiveness. The framework is technically credible, in that it produces an accurate and complete representation of the system architecture under consideration. The framework is adaptable, in that it provides the ability to create user-specified locations, steady states, and functions. The framework is flexible, in that it allows the user to model system architectures to multiple destinations without changing the underlying framework. The framework is intuitive for user input while still creating a comprehensive mathematical representation that maintains the necessary information to completely model complex system architectures. Finally, the framework is exhaustive, in that it provides the ability to explore the entire system architecture design space. After an extensive search of the literature, graph theory presents a valuable mechanism for representing the flow of information or vehicles within a simple mathematical framework. Graph theory has been used in developing mathematical models of many transportation and network flow problems in the past, where nodes represent physical locations and edges represent the means by which information or vehicles travel between those locations. In space system architecting, expressing the physical locations (low-Earth orbit, low-lunar orbit, etc.) and steady states (interplanetary trajectory) as nodes and the different means of moving between the nodes (propulsive maneuvers, etc.) as edges formulates a mathematical representation of this design space. The selection of a given system architecture using graph theory entails defining the paths that the systems take through the space system architecture graph. A path through the graph is defined as a list of edges that are traversed, which in turn defines functions performed by the system. A structure to compactly represent this information is a matrix, called the system map, in which the column indices are associated with the systems that exist and row indices are associated with the edges, or functions, to which each system has access. Several contributions have been added to the state of the art in space system architecture analysis. The framework adds the capability to rapidly explore the design space without the need to limit trade options or the need for user interaction during the exploration process. The unique mathematical representation of a system architecture, through the use of the adjacency, incidence, and system map matrices, enables automated design space exploration using stochastic optimization processes. 
The innovative rule-based graph traversal algorithm ensures functional feasibility of each system architecture that is analyzed, and the automatic generation of the system hierarchy eliminates the need for the user to manually determine the relationships between systems during or before the design space exploration process. Finally, the rapid evaluation of system architectures for various mission types enables analysis of the system architecture design space for multiple destinations within an evolutionary exploration program. (Abstract shortened by UMI.).
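A hedged toy illustration of the representation described, not the dissertation's implementation: nodes stand for locations or steady states, edges for functions such as propulsive maneuvers, and a small "system map" matrix records which system can perform which function; a choice of one system per edge along a path is then checked for functional feasibility. All node, edge, and system names are invented.

```python
import numpy as np

# nodes: physical locations / steady states
nodes = ["Earth surface", "LEO", "trans-lunar trajectory", "LLO", "lunar surface"]

# edges: functions to perform, as (from, to) pairs of node indices
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]

# adjacency matrix of the architecture graph
adjacency = np.zeros((len(nodes), len(nodes)), dtype=int)
for src, dst in edges:
    adjacency[src, dst] = 1

# system map: rows = edges (functions), columns = systems; 1 = system can perform it
systems = ["launch vehicle", "transfer stage", "lander"]
system_map = np.array([
    [1, 0, 0],   # launch to LEO
    [0, 1, 0],   # trans-lunar injection
    [0, 1, 1],   # lunar orbit insertion (either system could perform it)
    [0, 0, 1],   # descent to the surface
])

# a candidate architecture: choose one system per edge and check feasibility
choice = [0, 1, 1, 2]
feasible = all(system_map[e, s] == 1 for e, s in enumerate(choice))
print("functionally feasible:", feasible)
```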
Models for predicting adverse outcomes can help reduce and focus animal testing with new and existing chemicals. This short "thought starter" describes how quantitative-structure activity relationship and systems biology models can be used to help define toxicity pathways and li...
Goodman, Dan F M; Brette, Romain
2009-09-01
"Brian" is a simulator for spiking neural networks (http://www.briansimulator.org). The focus is on making the writing of simulation code as quick and easy as possible for the user, and on flexibility: new and non-standard models are no more difficult to define than standard ones. This allows scientists to spend more time on the details of their models, and less on their implementation. Neuron models are defined by writing differential equations in standard mathematical notation, facilitating scientific communication. Brian is written in the Python programming language, and uses vector-based computation to allow for efficient simulations. It is particularly useful for neuroscientific modelling at the systems level, and for teaching computational neuroscience.
Security for safety critical space borne systems
NASA Technical Reports Server (NTRS)
Legrand, Sue
1987-01-01
The Space Station contains safety critical computer software components in systems that can affect life and vital property. These components require a multilevel secure system that provides dynamic access control of the data and processes involved. A study is under way to define requirements for a security model providing access control through level B3 of the Orange Book. The model will be prototyped at NASA-Johnson Space Center.
A new method for qualitative simulation of water resources systems: 1. Theory
NASA Astrophysics Data System (ADS)
Camara, A. S.; Pinheiro, M.; Antunes, M. P.; Seixas, M. J.
1987-11-01
A new dynamic modeling methodology, SLIN (Simulação Linguistica), allowing for the analysis of systems defined by linguistic variables, is presented. SLIN applies a set of logical rules avoiding fuzzy theoretic concepts. To make the transition from qualitative to quantitative modes, logical rules are used as well. Extensions of the methodology to simulation-optimization applications and multiexpert system modeling are also discussed.
NASA Astrophysics Data System (ADS)
Zhao, Runchen; Ientilucci, Emmett J.
2017-05-01
Hyperspectral remote sensing systems provide spectral data composed of hundreds of narrow spectral bands. Spectral remote sensing systems can be used to identify targets, for example, without physical interaction. Often it is of interest to characterize the spectral variability of targets or objects. The purpose of this paper is to identify and characterize the LWIR spectral variability of targets based on an improved earth observing statistical performance model, known as the Forecasting and Analysis of Spectroradiometric System Performance (FASSP) model. FASSP contains three basic modules: a scene model, a sensor model and a processing model. Instead of using only the mean surface reflectance as input to the model, FASSP transfers user-defined statistical characteristics of a scene through the image chain (i.e., from source to sensor). The radiative transfer model MODTRAN is used to simulate the radiative transfer based on user-defined atmospheric parameters. To retrieve class emissivity and temperature statistics, or temperature/emissivity separation (TES), a LWIR atmospheric compensation method is necessary. The FASSP model has a method to transform statistics in the visible (i.e., ELM) but currently does not have a LWIR TES algorithm in place. This paper addresses the implementation of such a TES algorithm and its associated transformation of statistics.
A neural model of border-ownership from kinetic occlusion.
Layton, Oliver W; Yazdanbakhsh, Arash
2015-01-01
Camouflaged animals that have very similar textures to their surroundings are difficult to detect when stationary. However, when an animal moves, humans readily see a figure at a different depth than the background. How do humans perceive a figure breaking camouflage, even though the texture of the figure and its background may be statistically identical in luminance? We present a model that demonstrates how the primate visual system performs figure-ground segregation in extreme cases of breaking camouflage based on motion alone. Border-ownership signals develop as an emergent property in model V2 units whose receptive fields are near kinetically defined borders that separate the figure and background. Model simulations support border-ownership as a general mechanism by which the visual system performs figure-ground segregation, regardless of whether figure-ground boundaries are defined by luminance or motion contrast. The gradient of motion- and luminance-related border-ownership signals explains the perceived depth ordering of the foreground and background surfaces. Our model predicts that V2 neurons, which are sensitive to kinetic edges, are selective to border-ownership (magnocellular B cells). A distinct population of model V2 neurons is selective to border-ownership in figures defined by luminance contrast (parvocellular B cells). B cells in model V2 receive feedback from neurons in V4 and MT with larger receptive fields to bias border-ownership signals toward the figure. We predict that neurons in V4 and MT sensitive to kinetically defined figures play a crucial role in determining whether the foreground surface accretes, deletes, or produces a shearing motion with respect to the background. Copyright © 2014 Elsevier Ltd. All rights reserved.
Topological invariant and cotranslational symmetry in strongly interacting multi-magnon systems
NASA Astrophysics Data System (ADS)
Qin, Xizhou; Mei, Feng; Ke, Yongguan; Zhang, Li; Lee, Chaohong
2018-01-01
It is still an outstanding challenge to characterize and understand the topological features of strongly interacting states such as bound states in interacting quantum systems. Here, by introducing a cotranslational symmetry in an interacting multi-particle quantum system, we systematically develop a method to define a Chern invariant, which is a generalization of the well-known Thouless-Kohmoto-Nightingale-den Nijs invariant, for identifying strongly interacting topological states. As an example, we study the topological multi-magnon states in a generalized Heisenberg XXZ model, which can be realized by the currently available experiment techniques of cold atoms (Aidelsburger et al 2013 Phys. Rev. Lett. 111, 185301; Miyake et al 2013 Phys. Rev. Lett. 111, 185302). Through calculating the two-magnon excitation spectrum and the defined Chern number, we explore the emergence of topological edge bound states and give their topological phase diagram. We also analytically derive an effective single-particle Hofstadter superlattice model for a better understanding of the topological bound states. Our results not only provide a new approach to defining a topological invariant for interacting multi-particle systems, but also give insights into the characterization and understanding of strongly interacting topological states.
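The multi-magnon Chern invariant defined in the paper is beyond a short sketch, but as a hedged illustration of the standard single-particle lattice algorithm it generalizes (the Fukui-Hatsugai-Suzuki method), the snippet below computes the Chern number of the lower band of a two-band test Hamiltonian; the Qi-Wu-Zhang model is used only as a stand-in.

```python
import numpy as np

def qwz_hamiltonian(kx, ky, m=1.0):
    """Two-band Qi-Wu-Zhang Hamiltonian, used here only as a test case."""
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]])
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    return np.sin(kx) * sx + np.sin(ky) * sy + (m + np.cos(kx) + np.cos(ky)) * sz

def chern_number(nk=40, m=1.0):
    """Lattice Chern number of the lower band (Fukui-Hatsugai-Suzuki method)."""
    ks = np.linspace(0, 2 * np.pi, nk, endpoint=False)
    u = np.empty((nk, nk, 2), dtype=complex)          # lower-band eigenvector at each k
    for i, kx in enumerate(ks):
        for j, ky in enumerate(ks):
            _, vecs = np.linalg.eigh(qwz_hamiltonian(kx, ky, m))
            u[i, j] = vecs[:, 0]
    total = 0.0
    for i in range(nk):                               # sum plaquette field strengths
        for j in range(nk):
            u1 = np.vdot(u[i, j], u[(i + 1) % nk, j])
            u2 = np.vdot(u[(i + 1) % nk, j], u[(i + 1) % nk, (j + 1) % nk])
            u3 = np.vdot(u[(i + 1) % nk, (j + 1) % nk], u[i, (j + 1) % nk])
            u4 = np.vdot(u[i, (j + 1) % nk], u[i, j])
            total += np.angle(u1 * u2 * u3 * u4)
    return round(total / (2 * np.pi))

print(chern_number(m=1.0))   # magnitude 1 expected for 0 < |m| < 2; sign depends on conventions
```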
Model systems for life processes on Mars
NASA Technical Reports Server (NTRS)
Mitz, M. A.
1974-01-01
In the evolution of life forms, nonphotosynthetic mechanisms are developed. The question remains whether a total life system could evolve which is not dependent upon photosynthesis. The photosynthetic process poses problems when trying to visualize life on other planets. On Mars, the high intensity of light at the surface is a concern, and alternative mechanisms need to be defined and analyzed. In the UV search for alternate mechanisms, several different areas may be identified. These involve activated inorganic compounds in the atmosphere, such as the products of photodissociation of carbon dioxide, and the organic material which may be created by natural phenomena. In addition, a life system based on the pressure of the atmospheric constituents, such as carbon dioxide, is a possibility. These considerations may be important for the understanding of evolutionary processes of life on another planet. Model systems which depend on these alternative mechanisms are defined and related to presently planned and future planetary missions.
Geometric curvature and phase of the Rabi model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mao, Lijun; Huai, Sainan; Guo, Liping
2015-11-15
We study the geometric curvature and phase of the Rabi model. Under the rotating-wave approximation (RWA), we apply the gauge-independent Berry curvature over a surface integral to calculate the Berry phase of the eigenstates for both single- and two-qubit systems, which is found to be identical to that of a spin-1/2 particle in a magnetic field. We extend the idea to define a vacuum-induced geometric curvature when the system starts from an initial state with a pure vacuum bosonic field. The induced geometric phase is related to the average photon number in a period, which is possible to measure in the qubit–cavity system. We also calculate the geometric phase beyond the RWA and find an anomalous sudden change, which implies the breakdown of the adiabatic theorem, and the Berry phases in an adiabatic cyclic evolution are ill-defined near the anti-crossing point in the spectrum.
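For context, the standard surface-integral form of the Berry phase used in such calculations is recalled below; this is the textbook expression, not a result specific to the Rabi model. The phase accumulated along a closed loop C in parameter space equals the flux of the Berry curvature through any surface S bounded by C.

```latex
\gamma = i \oint_{\mathcal{C}} \langle \psi(\mathbf{R}) \vert \nabla_{\mathbf{R}} \psi(\mathbf{R}) \rangle \cdot \mathrm{d}\mathbf{R}
       = \int_{\mathcal{S}} \mathbf{B}(\mathbf{R}) \cdot \mathrm{d}\mathbf{S},
\qquad
\mathbf{B}(\mathbf{R}) = i \, \nabla_{\mathbf{R}} \times \langle \psi(\mathbf{R}) \vert \nabla_{\mathbf{R}} \psi(\mathbf{R}) \rangle .
```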
Le management des projets scientifiques
NASA Astrophysics Data System (ADS)
Perrier, Françoise
2000-12-01
We describe in this paper a new approach to the management of scientific projects. This approach is the result of a long reflection carried out within the MQDP (Methodology and Quality in the Project Development) group of INSU-CNRS, and continued with Guy Serra. Our reflection began with the study of the so-called `North-American Paradigm', which was initially considered the only relevant management model. Through our active participation in several astrophysical projects we realized that this model could not be applied to our laboratories without major modifications. Therefore, step by step, we constructed our own methodology, making the fullest use of the human potential existing in our research field, including its habits and skills. We have also participated in various working groups in industrial and scientific organizations for the benefit of CNRS. The management model presented here is based on a systemic and complex approach. This approach lets us describe the multiple aspects of a scientific project, especially taking into account the human dimension. The project system model includes three major interconnected systems, immersed within an environment that both influences and is influenced by them: the `System to be Realized', which defines the scientific and technical tasks leading to the scientific goals; the `Realizing System', which describes procedures, processes and organization; and the `Actors' System', which implements and drives all the processes. Each one exists only through a series of successive models, elaborated at predefined dates of the project called `key-points'. These systems evolve with time and under often-unpredictable circumstances, and the models have to take this into account. At each key-point, every model is compared to reality and the difference between the predicted and realized tasks is evaluated in order to define the data for the next model. This model can be applied to any kind of project.
Biomathematical modeling of pulsatile hormone secretion: a historical perspective.
Evans, William S; Farhy, Leon S; Johnson, Michael L
2009-01-01
Shortly after the recognition of the profound physiological significance of the pulsatile nature of hormone secretion, computer-based modeling techniques were introduced for the identification and characterization of such pulses. Whereas these earlier approaches defined perturbations in hormone concentration-time series, deconvolution procedures were subsequently employed to separate such pulses into their secretion event and clearance components. Stochastic differential equation modeling was also used to define basal and pulsatile hormone secretion. To assess the regulation of individual components within a hormone network, a method that quantitated approximate entropy within hormone concentration-time series was described. To define relationships within coupled hormone systems, methods including cross-correlation and cross-approximate entropy were utilized. To address some of the inherent limitations of these methods, modeling techniques with which to appraise the strength of feedback signaling between and among hormone-secreting components of a network have been developed. Techniques such as dynamic modeling have been utilized to reconstruct dose-response interactions between hormones within coupled systems. A logical extension of these advances will require the development of mathematical methods with which to approximate endocrine networks exhibiting multiple feedback interactions and subsequently reconstruct their parameters based on experimental data for the purpose of testing regulatory hypotheses and estimating alterations in hormone release control mechanisms.
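As one concrete example of the approximate-entropy statistic mentioned above, the following is a minimal sketch of the standard Pincus ApEn(m, r) calculation for a concentration-time series. The tolerance heuristic and the synthetic series are illustrative assumptions, not parameters from the cited work.

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series, following the
    standard Pincus formulation (self-matches included)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)          # common heuristic tolerance

    def phi(m):
        # Embed the series into overlapping templates of length m.
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates.
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # Fraction of templates within tolerance r of each template.
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# Illustrative use on a synthetic "hormone concentration" series.
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20, 200)) + 0.1 * rng.standard_normal(200)
print(approximate_entropy(series))
```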
Sharif Razavian, Reza; Mehrabi, Naser; McPhee, John
2015-01-01
This paper presents a new model-based method to define muscle synergies. Unlike the conventional factorization approach, which extracts synergies from electromyographic data, the proposed method employs a biomechanical model and formally defines the synergies as the solution of an optimal control problem. As a result, the number of required synergies is directly related to the dimensions of the operational space. The estimated synergies are posture-dependent and correlate well with the results of standard factorization methods. Two examples are used to showcase this method: a two-dimensional forearm model, and a three-dimensional driver arm model. It has been shown here that the synergies need to be task-specific (i.e., they are defined for the specific operational spaces: the elbow angle and the steering wheel angle in the two systems). This functional definition of synergies results in a low-dimensional control space, in which every force in the operational space is accurately created by a unique combination of synergies. As such, there is no need for extra criteria (e.g., minimizing effort) in the process of motion control. This approach is motivated by the need for fast and bio-plausible feedback control of musculoskeletal systems, and can have important implications in engineering, motor control, and biomechanics. PMID:26500530
A queueing theory based model for business continuity in hospitals.
Miniati, R; Cecconi, G; Dori, F; Frosini, F; Iadanza, E; Biffi Gentili, G; Niccolini, F; Gusinu, R
2013-01-01
Clinical activities can be seen as the result of a precisely defined succession of events, in which every phase is characterized by a waiting time that includes the working duration and possible delays. Technology is part of this process. For proper business continuity management, planning the minimum number of devices according to the working load alone is not enough. A risk analysis of the whole process should be carried out in order to define which interventions and extra purchases have to be made. Markov models and reliability engineering approaches can be used to evaluate the possible interventions and to protect the whole system from technology failures. The following paper reports a case study on the application of the proposed integrated model, including a risk analysis approach and a queuing theory model, for defining the proper number of devices essential to guarantee medical activity and to comply with business continuity management requirements in hospitals.
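A minimal sketch of the kind of queueing calculation such a model builds on, assuming a simple M/M/c system: it finds the smallest number of identical devices for which the probability of waiting longer than a target time stays below a threshold. The arrival rate, service rate, and targets are hypothetical, and the paper's integrated model adds a risk-analysis layer not shown here.

```python
import math

def erlang_c(c, a):
    """Probability that an arriving job must wait in an M/M/c queue,
    where a = lambda/mu is the offered load in Erlangs and a/c < 1."""
    s = sum(a**k / math.factorial(k) for k in range(c))
    last = a**c / (math.factorial(c) * (1 - a / c))
    return last / (s + last)

def min_devices(arrival_rate, service_rate, max_wait, target_prob):
    """Smallest number of identical devices c such that
    P(wait > max_wait) <= target_prob in an M/M/c model."""
    a = arrival_rate / service_rate
    c = max(1, math.ceil(a))
    while True:
        if a / c < 1:
            p_wait = erlang_c(c, a)
            # P(W > t) = Erlang-C * exp(-(c*mu - lambda) * t)
            p_exceed = p_wait * math.exp(-(c * service_rate - arrival_rate) * max_wait)
            if p_exceed <= target_prob:
                return c
        c += 1

# Hypothetical numbers: 4 requests/hour, 1.25 services/hour per device,
# at most 5% of requests should wait longer than 0.5 hour.
print(min_devices(arrival_rate=4.0, service_rate=1.25, max_wait=0.5, target_prob=0.05))
```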
Evaluation methodology for query-based scene understanding systems
NASA Astrophysics Data System (ADS)
Huster, Todd P.; Ross, Timothy D.; Culbertson, Jared L.
2015-05-01
In this paper, we are proposing a method for the principled evaluation of scene understanding systems in a query-based framework. We can think of a query-based scene understanding system as a generalization of typical sensor exploitation systems where instead of performing a narrowly defined task (e.g., detect, track, classify, etc.), the system can perform general user-defined tasks specified in a query language. Examples of this type of system have been developed as part of DARPA's Mathematics of Sensing, Exploitation, and Execution (MSEE) program. There is a body of literature on the evaluation of typical sensor exploitation systems, but the open-ended nature of the query interface introduces new aspects to the evaluation problem that have not been widely considered before. In this paper, we state the evaluation problem and propose an approach to efficiently learn about the quality of the system under test. We consider the objective of the evaluation to be to build a performance model of the system under test, and we rely on the principles of Bayesian experiment design to help construct and select optimal queries for learning about the parameters of that model.
World Energy Projection System Plus Model Documentation: Commercial Module
2016-01-01
The Commercial Model of the World Energy Projection System Plus (WEPS+) is an energy demand modeling system of the world commercial end-use sector at a regional level. This report describes the version of the Commercial Model that was used to produce the commercial sector projections published in the International Energy Outlook 2016 (IEO2016). The Commercial Model is one of 13 components of the WEPS+ system. WEPS+ is a modular system, consisting of a number of separate energy models that communicate and work with each other through an integrated system model. The model components are each developed independently, but are designed with well-defined protocols for system communication and interactivity. The WEPS+ modeling system uses a shared database (the “restart” file) that allows all the models to communicate with each other when they are run in sequence over a number of iterations. The overall WEPS+ system uses an iterative solution technique that forces convergence of consumption and supply pressures to solve for an equilibrium price.
Redesign of a health science centre: reflections on co-leadership.
MacTavish, M; Norton, P
1995-01-01
Since 1988, the Sunnybrook Health Science Centre has been proactive in re-designing its system toward decentralized management, the purpose being to further enhance patient care. This process has involved numerous changes, among which were the establishment of three large clinical units. These clinical units are not defined following the historic medical model, but group patients with similar service and care needs. Subsequently, each of the clinical units defined Patient Service Units (PSUs). The hospital has chosen a co-leadership model for the lead management at each of the unit levels. This paper describes the model for clinical units.
Nonequilibrium phase transitions in isotropic Ashkin-Teller model
NASA Astrophysics Data System (ADS)
Akıncı, Ümit
2017-03-01
The dynamic behavior of an isotropic Ashkin-Teller model in the presence of a periodically oscillating magnetic field has been analyzed by means of the mean field approximation. The dynamic equation of motion has been constructed with the help of a Glauber-type stochastic process and solved for a square lattice. After defining the possible dynamical phases of the system, phase diagrams have been given and the behavior of the hysteresis loops has been investigated in detail. The hysteresis loop for the specific order parameter of the isotropic Ashkin-Teller model has been defined, and the characteristics of this loop in different dynamical phases have been given.
Describing different brain computer interface systems through a unique model: a UML implementation.
Quitadamo, Lucia Rita; Marciani, Maria Grazia; Cardarilli, Gian Carlo; Bianchi, Luigi
2008-01-01
All the protocols currently implemented in brain computer interface (BCI) experiments are characterized by different structural and temporal entities. Moreover, due to the lack of a unique descriptive model for BCI systems, there is no standard way to define the structure and the timing of a BCI experimental session among different research groups, and there is also great discordance on the meaning of the most common terms dealing with BCI, such as trial, run and session. The aim of this paper is to provide a unified modeling language (UML) implementation of BCI systems through a unique dynamic model which is able to describe the main protocols defined in the literature (P300, mu-rhythms, SCP, SSVEP, fMRI) and proves to be reasonable and adjustable to different requirements. This model includes a set of definitions of the typical entities encountered in a BCI, diagrams which explain the structural correlations among them, and a detailed description of the timing of a trial. This last point represents an innovation with respect to the models already proposed in the literature. The UML documentation and the possibility of adapting this model to the different BCI systems built to date make it a basis for the implementation of new systems and a means for the unification and dissemination of resources. The model, with all the diagrams and definitions reported in the paper, forms the core of the body language framework, a free set of routines and tools for the implementation, optimization and delivery of cross-platform BCI systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ringler, Todd; Ju, Lili; Gunzburger, Max
2008-11-14
During the next decade and beyond, climate system models will be challenged to resolve scales and processes that are far beyond their current scope. Each climate system component has its prototypical example of an unresolved process that may strongly influence the global climate system, ranging from eddy activity within ocean models, to ice streams within ice sheet models, to surface hydrological processes within land system models, to cloud processes within atmosphere models. These new demands will almost certainly result in the development of multiresolution schemes that are able, at least regionally, to faithfully simulate these fine-scale processes. Spherical centroidal Voronoi tessellations (SCVTs) offer one potential path toward the development of robust, multiresolution climate system model components. SCVTs allow for the generation of high-quality Voronoi diagrams and Delaunay triangulations through the use of an intuitive, user-defined density function. In each of the examples provided, this method results in high-quality meshes where the quality measures are guaranteed to improve as the number of nodes is increased. Real-world examples are developed for the Greenland ice sheet and the North Atlantic ocean. Idealized examples are developed for ocean–ice shelf interaction and for regional atmospheric modeling. In addition to defining, developing, and exhibiting SCVTs, we pair this mesh generation technique with a previously developed finite-volume method. Our numerical example is based on the nonlinear, shallow water equations spanning the entire surface of the sphere. This example is used to elucidate both the potential benefits of this multiresolution method and the challenges ahead.
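As a rough, planar stand-in for the spherical centroidal Voronoi construction described above, the sketch below runs Lloyd-type iterations on points sampled from a user-defined density function over the unit square. The density, counts, and iteration budget are illustrative assumptions; a production SCVT generator would work on the sphere with exact Voronoi geometry.

```python
import numpy as np

def density_cvt(n_generators, density, n_candidates=100_000, n_iter=50, seed=0):
    """Approximate a centroidal Voronoi tessellation of the unit square whose
    resolution follows a user-defined density function, via Lloyd-type
    iterations on points rejection-sampled from that density."""
    rng = np.random.default_rng(seed)

    # Rejection-sample points in [0,1]^2 with acceptance proportional to density.
    cand = rng.random((n_candidates, 2))
    d = density(cand[:, 0], cand[:, 1])
    pts = cand[rng.random(n_candidates) < d / d.max()]

    generators = pts[rng.choice(len(pts), n_generators, replace=False)]
    for _ in range(n_iter):
        # Assign each sample point to its nearest generator (its Voronoi cell)...
        dist = np.linalg.norm(pts[:, None, :] - generators[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        # ...and move each generator to the (density-weighted) centroid of its cell.
        for j in range(n_generators):
            members = pts[labels == j]
            if len(members):
                generators[j] = members.mean(axis=0)
    return generators

# Hypothetical density: refine the mesh near the centre of the domain.
density = lambda x, y: 1.0 + 8.0 * np.exp(-30.0 * ((x - 0.5) ** 2 + (y - 0.5) ** 2))
print(density_cvt(64, density)[:5])
```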
He, Temple; Habib, Salman
2013-09-01
Simple dynamical systems--with a small number of degrees of freedom--can behave in a complex manner due to the presence of chaos. Such systems are most often (idealized) limiting cases of more realistic situations. Isolating a small number of dynamical degrees of freedom in a realistically coupled system generically yields reduced equations with terms that can have a stochastic interpretation. In situations where both noise and chaos can potentially exist, it is not immediately obvious how Lyapunov exponents, key to characterizing chaos, should be properly defined. In this paper, we show how to do this in a class of well-defined noise-driven dynamical systems, derived from an underlying Hamiltonian model.
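A minimal sketch of one common way to estimate a largest Lyapunov exponent in a noise-driven system (Benettin-style renormalisation of a perturbed trajectory, with the same noise realisation applied to both copies so that the exponent reflects sensitivity to the initial condition only). The map, noise amplitude, and step counts are illustrative and are not the class of systems analysed in the paper.

```python
import numpy as np

def largest_lyapunov(f, x0, n_steps=20_000, d0=1e-8, discard=1000, rng=None):
    """Estimate the largest Lyapunov exponent of a (possibly noise-driven)
    map x_{k+1} = f(x_k, xi_k) by evolving a reference and a perturbed
    trajectory and renormalising their separation each step."""
    rng = rng or np.random.default_rng(0)
    x, y = x0, x0 + d0
    log_sum, counted = 0.0, 0
    for k in range(n_steps):
        xi = rng.standard_normal()        # shared noise sample
        x, y = f(x, xi), f(y, xi)
        d = abs(y - x)
        if d == 0.0:
            d = d0
        if k >= discard:
            log_sum += np.log(d / d0)
            counted += 1
        y = x + d0 * (y - x) / d          # renormalise the separation
    return log_sum / counted

# Noise-driven logistic map: the deterministic part is chaotic at r = 4,
# so the estimate should be close to ln(2) ~= 0.69.
f = lambda x, xi: (4.0 * x * (1.0 - x) + 1e-4 * xi) % 1.0
print(largest_lyapunov(f, 0.3))
```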
Ontology for Life-Cycle Modeling of Electrical Distribution Systems: Model View Definition
2013-06-01
Surviving fragments of this report reference building information models (BIM) at the coordinated design stage of building construction, a standard for exchanging BIM data that defines hundreds of classes for common use in software, and the Construction Operations Building information exchange (COBie).
Enhancements to the Branched Lagrangian Transport Modeling System
Jobson, Harvey E.
1997-01-01
The Branched Lagrangian Transport Model (BLTM) has received wide use within the U.S. Geological Survey over the past 10 years. This report documents the enhancements and modifications that have been made to this modeling system since it was first introduced. The programs in the modeling system are arranged into five levels: programs to generate time-series of meteorological data (EQULTMP, SOLAR); programs to process time-series data (INTRP, MRG); programs to build input files for the transport model (BBLTM, BQUAL2E); the model with defined reaction kinetics (BLTM, QUAL2E); and post-processor plotting programs (CTPLT, CXPLT). An example application is presented to illustrate how the modeling system can be used to simulate 10 water-quality constituents in the Chattahoochee River below Atlanta, Georgia.
NASA Astrophysics Data System (ADS)
Hakkarinen, C.; Brown, D.; Callahan, J.; Hankin, S.; de Koningh, M.; Middleton-Link, D.; Wigley, T.
2001-05-01
A Web-based access system to climate model output data sets for intercomparison and analysis has been produced, using the NOAA-PMEL-developed Live Access Server software as host server and Ferret as the data serving and visualization engine. Called ARCAS ("ACACIA Regional Climate-data Access System"), and publicly accessible at http://dataserver.ucar.edu/arcas, the site currently serves climate model outputs from runs of the NCAR Climate System Model for the 21st century, for Business as Usual and Stabilization of Greenhouse Gas Emission scenarios. Users can select, download, and graphically display single variables or comparisons of two variables from either or both of the CSM model runs, averaged for monthly, seasonal, or annual time resolutions. The time length of the averaging period, and the geographical domain for download and display, are fully selectable by the user. A variety of arithmetic operations on the data variables can be computed "on-the-fly", as defined by the user. Expansions of the user-selectable options for defining analysis options, and for accessing other DODS-compatible ("Distributed Ocean Data System"-compatible) data sets, residing at locations other than the NCAR hardware server on which ARCAS operates, are planned for this year.
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
A variety of artificial intelligence techniques which could be used with regard to NASA space applications and robotics were evaluated. The techniques studied were decision tree manipulators, problem solvers, rule based systems, logic programming languages, representation language languages, and expert systems. The overall structure of a robotic simulation tool was defined and a framework for that tool developed. Nonlinear and linearized dynamics equations were formulated for n link manipulator configurations. A framework for the robotic simulation was established which uses validated manipulator component models connected according to a user defined configuration.
Modeling Complex Cross-Systems Software Interfaces Using SysML
NASA Technical Reports Server (NTRS)
Mandutianu, Sanda; Morillo, Ron; Simpson, Kim; Liepack, Otfrid; Bonanne, Kevin
2013-01-01
The complex flight and ground systems for NASA human space exploration are designed, built, operated and managed as separate programs and projects. However, each system relies on one or more of the other systems in order to accomplish specific mission objectives, creating a complex, tightly coupled architecture. Thus, there is a fundamental need to understand how each system interacts with the other. To determine if a model-based system engineering approach could be utilized to assist with understanding the complex system interactions, the NASA Engineering and Safety Center (NESC) sponsored a task to develop an approach for performing cross-system behavior modeling. This paper presents the results of applying Model Based Systems Engineering (MBSE) principles using the System Modeling Language (SysML) to define cross-system behaviors and how they map to cross-system software interfaces documented in system-level Interface Control Documents (ICDs).
SSBRP Communication & Data System Development using the Unified Modeling Language (UML)
NASA Technical Reports Server (NTRS)
Windrem, May; Picinich, Lou; Givens, John J. (Technical Monitor)
1998-01-01
The Unified Modeling Language (UML) is the standard method for specifying, visualizing, and documenting the artifacts of an object-oriented system under development. UML is the unification of the object-oriented methods developed by Grady Booch and James Rumbaugh, and of the Use Case Model developed by Ivar Jacobson. This paper discusses the application of UML by the Communications and Data Systems (CDS) team to model the ground control and command of the Space Station Biological Research Project (SSBRP) User Operations Facility (UOF). UML is used to define the context of the system, the logical static structure, the life history of objects, and the interactions among objects.
Neuro-Linguistic Programming: Enhancing Teacher-Student Communications.
ERIC Educational Resources Information Center
Childers, John H., Jr.
1985-01-01
Defines Neurolinguistic Programming (NLP) and discusses specific dimensions of the model that have applications for classroom teaching. Describes five representational systems individuals use to process information and gives examples of effective and ineffective teacher-student communication for each system. (MCF)
Stochastic blockmodeling of the modules and core of the Caenorhabditis elegans connectome.
Pavlovic, Dragana M; Vértes, Petra E; Bullmore, Edward T; Schafer, William R; Nichols, Thomas E
2014-01-01
Recently, there has been much interest in the community structure or mesoscale organization of complex networks. This structure is characterised either as a set of sparsely inter-connected modules or as a highly connected core with a sparsely connected periphery. However, it is often difficult to disambiguate these two types of mesoscale structure or, indeed, to summarise the full network in terms of the relationships between its mesoscale constituents. Here, we estimate a community structure with a stochastic blockmodel approach, the Erdős-Rényi Mixture Model, and compare it to the much more widely used deterministic methods, such as the Louvain and Spectral algorithms. We used the Caenorhabditis elegans (C. elegans) nervous system (connectome) as a model system in which biological knowledge about each node or neuron can be used to validate the functional relevance of the communities obtained. The deterministic algorithms derived communities with 4-5 modules, defined by sparse inter-connectivity between all modules. In contrast, the stochastic Erdős-Rényi Mixture Model estimated a community with 9 blocks or groups which comprised a similar set of modules but also included a clearly defined core, made of 2 small groups. We show that the "core-in-modules" decomposition of the worm brain network, estimated by the Erdős-Rényi Mixture Model, is more compatible with prior biological knowledge about the C. elegans nervous system than the purely modular decomposition defined deterministically. We also show that the blockmodel can be used both to generate stochastic realisations (simulations) of the biological connectome, and to compress the network into a small number of super-nodes and their connectivity. We expect that the Erdős-Rényi Mixture Model may be useful for investigating the complex community structures in other (nervous) systems.
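For illustration, a minimal sketch of sampling a network from a stochastic blockmodel (Erdős-Rényi mixture) with a hypothetical "core-in-modules" block-probability matrix. The block sizes and probabilities are invented for the example and do not come from the C. elegans analysis.

```python
import numpy as np

def sample_sbm(block_sizes, p, rng=None):
    """Draw an undirected graph from a stochastic blockmodel:
    p[a, b] is the probability of an edge between a node in block a
    and a node in block b."""
    rng = rng or np.random.default_rng(0)
    labels = np.repeat(np.arange(len(block_sizes)), block_sizes)
    n = labels.size
    prob = p[labels[:, None], labels[None, :]]
    upper = np.triu(rng.random((n, n)) < prob, k=1)
    adj = upper | upper.T                      # symmetrise, no self-loops
    return adj.astype(int), labels

# Hypothetical structure: two sparse modules plus a small, densely
# connected core that links to everything.
p = np.array([[0.20, 0.01, 0.30],
              [0.01, 0.20, 0.30],
              [0.30, 0.30, 0.80]])
adj, labels = sample_sbm([40, 40, 10], p)
print(adj.sum() // 2, "edges")
```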
Petersson, K J F; Friberg, L E; Karlsson, M O
2010-10-01
Computer models of biological systems grow more complex as computing power increases. Often these models are defined as differential equations and no analytical solutions exist. Numerical integration is used to approximate the solution; this can be computationally intensive, time-consuming, and can account for a large proportion of the total computer runtime. The performance of different integration methods depends on the mathematical properties of the differential equations system at hand. In this paper we investigate the possibility of runtime gains by calculating parts of, or the whole, differential equations system at given time intervals, outside of the differential equations solver. This approach was tested on nine models defined as differential equations with the goal of reducing runtime while maintaining model fit, based on the objective function value. The software used was NONMEM. In four models the computational runtime was successfully reduced (by 59-96%). The differences in parameter estimates, compared to using only the differential equations solver, were less than 12% for all fixed effects parameters. For the variance parameters, estimates were within 10% for the majority of the parameters. Population and individual predictions were similar and the differences in OFV were between 1 and -14 units. When computational runtime seriously affects the usefulness of a model, we suggest evaluating this approach for repetitive elements of model building and evaluation such as covariate inclusions or bootstraps.
NASA Technical Reports Server (NTRS)
Breckenridge, Jonathan T.; Johnson, Stephen B.
2013-01-01
This paper describes the core framework used to implement a Goal-Function Tree (GFT) based systems engineering process using the Systems Modeling Language. It defines a set of principles that build upon the theoretical approach described in the InfoTech 2013 ISHM paper titled "Goal-Function Tree Modeling for Systems Engineering and Fault Management" presented by Dr. Stephen B. Johnson. The principles in this paper describe how the SysML language is expanded as a baseline in order to: hierarchically describe a system, describe that system functionally within success space, and allocate detection mechanisms to success functions for system protection.
USDA-ARS?s Scientific Manuscript database
Simulation of vertical soil hydrology is a critical component of simulating even more complex soil water dynamics in space and time, including land-atmosphere and subsurface interactions. The AgroEcoSystem (AgES) model is defined here as a single land unit implementation of the full AgES-W (Watershe...
ERIC Educational Resources Information Center
Winkel, Annette; Schwarz, Stephan
By carefully considering the special characteristics of two small African scientific and technical (S&T) information systems for research and development (R&D), this report defines a simple and straightforward model which can be easily implemented in similar situations with a minimum of external support. The model is designed to build up a…
A Model-Driven Approach to e-Course Management
ERIC Educational Resources Information Center
Savic, Goran; Segedinac, Milan; Milenkovic, Dušica; Hrin, Tamara; Segedinac, Mirjana
2018-01-01
This paper presents research on using a model-driven approach to the development and management of electronic courses. We propose a course management system which stores a course model represented as distinct machine-readable components containing domain knowledge of different course aspects. Based on this formally defined platform-independent…
Economic Modeling as a Component of Academic Strategic Planning.
ERIC Educational Resources Information Center
MacKinnon, Joyce; Sothmann, Mark; Johnson, James
2001-01-01
Computer-based economic modeling was used to enable a school of allied health to define outcomes, identify associated costs, develop cost and revenue models, and create a financial planning system. As a strategic planning tool, it assisted realistic budgeting and improved efficiency and effectiveness. (Contains 18 references.) (SK)
A measurement-based performability model for a multiprocessor system
NASA Technical Reports Server (NTRS)
Ilsueh, M. C.; Iyer, Ravi K.; Trivedi, K. S.
1987-01-01
A measurement-based performability model based on real error data collected on a multiprocessor system is described. Model development from the raw error data to the estimation of cumulative reward is described. Both normal and failure behavior of the system are characterized. The measured data show that the holding times in key operational and failure states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different failure types and recovery procedures.
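A minimal sketch of the semi-Markov reward idea described above: given an embedded transition matrix, mean holding times, and a reward rate per state, it computes the long-run expected reward rate. The three-state example and its numbers are hypothetical, not the measured multiprocessor data.

```python
import numpy as np

def steady_reward_rate(P, hold, reward):
    """Long-run expected reward rate of a semi-Markov process, given the
    embedded transition matrix P, the mean holding time in each state, and
    a reward rate per unit time in each state."""
    n = len(hold)
    # Stationary distribution of the embedded chain: pi P = pi, sum(pi) = 1.
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.append(np.zeros(n), 1.0)
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    # The fraction of time spent in each state weights pi by the holding times.
    q = pi * hold / np.dot(pi, hold)
    return float(np.dot(q, reward))

# Hypothetical 3-state example: normal operation, degraded mode, failure.
P = np.array([[0.0, 0.7, 0.3],
              [0.6, 0.0, 0.4],
              [1.0, 0.0, 0.0]])
hold = np.array([100.0, 20.0, 5.0])    # mean holding times
reward = np.array([1.0, 0.5, 0.0])     # service-rate-based reward per state
print(steady_reward_rate(P, hold, reward))
```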
Using constraints and their value for optimization of large ODE systems
Domijan, Mirela; Rand, David A.
2015-01-01
We provide analytical tools to facilitate a rigorous assessment of the quality and value of the fit of a complex model to data. We use this to provide approaches to model fitting, parameter estimation, the design of optimization functions and experimental optimization. This is in the context where multiple constraints are used to select or optimize a large model defined by differential equations. We illustrate the approach using models of circadian clocks and the NF-κB signalling system. PMID:25673300
A Logical Basis In The Layered Computer Vision Systems Model
NASA Astrophysics Data System (ADS)
Tejwani, Y. J.
1986-03-01
In this paper a four-layer computer vision system model is described. The model uses a finite memory scratch pad. In this model planar objects are defined as predicates. Predicates are relations on a k-tuple. The k-tuple consists of primitive points and relationships between primitive points. The relationship between points can be of the direct type or the indirect type. Entities are goals which are satisfied by a set of clauses. The grammar used to construct these clauses is examined.
Minimal time spiking in various ChR2-controlled neuron models.
Renault, Vincent; Thieullen, Michèle; Trélat, Emmanuel
2018-02-01
We use conductance-based neuron models and the mathematical modeling of optogenetics to define controlled neuron models, and we address the minimal-time control of these affine systems for the first spike from equilibrium. We apply tools of geometric optimal control theory to study singular extremals, and we implement a direct method to compute optimal controls. When the system is too large to theoretically investigate the existence of singular optimal controls, we observe numerically the optimal bang-bang controls.
Analysis of the Hexapod Work Space using integration of a CAD/CAE system and the LabVIEW software
NASA Astrophysics Data System (ADS)
Herbuś, K.; Ociepka, P.
2015-11-01
The paper presents the problems related to the integration of a CAD/CAE system with the LabVIEW software. The purpose of the integration is to determine the workspace of a hexapod model based on a mathematical model describing its motion. In the first stage of the integration task, a 3D model for simulating the movements of the hexapod was developed. This phase of the work was done in the “Motion Simulation” module of the CAD/CAE/CAM Siemens NX system. The first step was to define the components of the 3D model in the form of “links”. Individual links were defined according to the role of the corresponding hexapod elements. In the model prepared for motion simulation, links were created for elements such as the electric actuator, top plate, bottom plate, ball-and-socket joint, and Phillips toggle joint. Constraints of the “joint” type (e.g. revolute joint, slider joint, spherical joint) were then defined between the created “link” components, so that the computer simulation corresponds to the operation of a real hexapod. The next stage of work included implementing the mathematical model describing the functioning of the hexapod in the LabVIEW software. At this stage, particular attention was paid to determining procedures for integrating the virtual 3D hexapod model with the results of calculations performed in LabVIEW. The results relate to specific values of the extension of the electric actuators depending on the position of the car on the hexapod. The integration made it possible to determine the safe operating space of a stationary hexapod, taking into consideration the safety of a person in the driving simulator designed for the disabled.
A discrete control model of PLANT
NASA Technical Reports Server (NTRS)
Mitchell, C. M.
1985-01-01
A model of the PLANT system using the discrete control modeling techniques developed by Miller is described. Discrete control models attempt to represent in a mathematical form how a human operator might decompose a complex system into simpler parts and how the control actions and system configuration are coordinated so that acceptable overall system performance is achieved. Basic questions include knowledge representation, information flow, and decision making in complex systems. The structure of the model is a general hierarchical/heterarchical scheme which structurally accounts for coordination and dynamic focus of attention. Mathematically, the discrete control model is defined in terms of a network of finite state systems. Specifically, the discrete control model accounts for how specific control actions are selected from information about the controlled system, the environment, and the context of the situation. The objective is to provide a plausible and empirically testable accounting and, if possible, explanation of control behavior.
NASA Technical Reports Server (NTRS)
Liu, F. C.
1986-01-01
The objective of this investigation is to determine analytically the acceleration produced by crew motion in an orbiting space station and to define design parameters for the suspension system of microgravity experiments. A simple structural model for simulation of the IOC space station is proposed. Mathematical formulation of this model provides engineers with a simple and direct tool for designing an effective suspension system.
Defining the Meaning of a Major Modeling and Simulation Change as Applied to Accreditation
2012-12-12
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core
Hucka, Michael; Bergmann, Frank T.; Hoops, Stefan; Keating, Sarah M.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Wilkinson, Darren J.
2017-01-01
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528564
The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core.
Hucka, Michael; Bergmann, Frank T; Hoops, Stefan; Keating, Sarah M; Sahle, Sven; Schaff, James C; Smith, Lucian P; Wilkinson, Darren J
2015-09-04
Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.
Belcher, Wayne R.; Faunt, Claudia C.; D'Agnese, Frank A.
2002-01-01
The U.S. Geological Survey, in cooperation with the Department of Energy and other Federal, State, and local agencies, is evaluating the hydrogeologic characteristics of the Death Valley regional ground-water flow system. The ground-water flow system covers an area of about 100,000 square kilometers from latitude 35° to 38°15' North and longitude 115° to 118° West, with the flow system proper comprising about 45,000 square kilometers. The Death Valley regional ground-water flow system is one of the larger flow systems within the Southwestern United States and includes in its boundaries the Nevada Test Site, Yucca Mountain, and much of Death Valley. Part of this study includes the construction of a three-dimensional hydrogeologic framework model to serve as the foundation for the development of a steady-state regional ground-water flow model. The digital framework model provides a computer-based description of the geometry and composition of the hydrogeologic units that control regional flow. The framework model of the region was constructed by merging two previous framework models constructed for the Yucca Mountain Project and the Environmental Restoration Program Underground Test Area studies at the Nevada Test Site. The hydrologic characteristics of the region result from a currently arid climate and complex geology. Interbasinal regional ground-water flow occurs through a thick carbonate-rock sequence of Paleozoic age, a locally thick volcanic-rock sequence of Tertiary age, and basin-fill alluvium of Tertiary and Quaternary age. Throughout the system, deep and shallow ground-water flow may be controlled by extensive and pervasive regional and local faults and fractures. The framework model was constructed using data from several sources to define the geometry of the regional hydrogeologic units. These data sources include (1) a 1:250,000-scale hydrogeologic-map compilation of the region; (2) regional-scale geologic cross sections; (3) borehole information; and (4) gridded surfaces from a previous three-dimensional geologic model. In addition, digital elevation model data were used in conjunction with these data to define ground-surface altitudes. These data, properly oriented in three dimensions by using geographic information systems, were combined and gridded to produce the upper surfaces of the hydrogeologic units used in the flow model. The final geometry of the framework model is constructed as a volumetric model by incorporating the intersections of these gridded surfaces and by applying fault truncation rules to structural features from the geologic map and cross sections. The cells defining the geometry of the hydrogeologic framework model can be assigned several attributes such as lithology, hydrogeologic unit, thickness, and top and bottom altitudes.
NASA Technical Reports Server (NTRS)
Evers, Ken H.; Bachert, Robert F.
1987-01-01
The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.
Passive millimeter-wave imaging
NASA Technical Reports Server (NTRS)
Young, Stephen K.; Davidheiser, Roger A.; Hauss, Bruce; Lee, Paul S. C.; Mussetto, Michael; Shoucri, Merit M.; Yujiri, Larry
1993-01-01
Millimeter-wave hardware systems are being developed. Our approach begins with identifying and defining the applications. System requirements are then specified based on mission needs using our end-to-end performance model. The model was benchmarked against existing databases and, where data were deficient, they were acquired via field measurements. The derived system requirements are then validated with the appropriate field measurements using our imaging testbeds and hardware breadboards. The result is a final system that satisfies all the requirements of the target mission.
Toward a Model-Based Approach for Flight System Fault Protection
NASA Technical Reports Server (NTRS)
Day, John; Meakin, Peter; Murray, Alex
2012-01-01
Use SysML/UML to describe the physical structure of the system; this part of the model would be shared with other teams (FS Systems Engineering, Planning & Execution, V&V, Operations, etc.) in an integrated model-based engineering environment. Use the UML Profile mechanism, defining Stereotypes to precisely express the concepts of the FP domain; this extends the UML/SysML languages to contain our FP concepts. Use UML/SysML, along with our profile, to capture FP concepts and relationships in the model. Generate typical FP engineering products (the FMECA, Fault Tree, MRD, V&V Matrices).
NASA Technical Reports Server (NTRS)
Kurien, J.; Nayak, P.; Williams, B.; Koga, Dennis (Technical Monitor)
1998-01-01
MPL is the language with which a modeler describes a system to be diagnosed or controlled by Livingstone. MPL is used to specify what the components of the system are, how they are interconnected, and how they behave both nominally and when failed. Component behavioral models used by Livingstone are described by a set of propositional, well-formed formulas (wffs). An understanding of well-formed formulas, primitive component types specified through defcomponent, and device structure specified by defmodule is essential to an understanding of MPL. This document describes: well-formed formula (wff), the basis for describing the behavior of a component in a system; defvalues, which specifies the domain (legal values) of a variable; defcomponent, which defines the modes, behaviors, and mode transitions for primitive components; defmodule, which defines composite devices consisting of interconnected components; defrelation, a macro mechanism for expanding a complex wff according to the value of an argument; forall, an iteration construct used to expand a wff or relation over a set of arguments; and defsymbol-expansion, a mechanism for naming a collection of symbols (e.g., the names of all valves in the system).
Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B
NASA Technical Reports Server (NTRS)
Yeganefard, Sanaz; Butler, Michael; Rezazadeh, Abdolbaghi
2010-01-01
Recently a set of guidelines, or cookbook, has been developed for modelling and refinement of control problems in Event-B. The Event-B formal method is used for system-level modelling by defining states of a system and events which act on these states. It also supports refinement of models. This cookbook is intended to systematize the process of modelling and refining a control problem system by distinguishing environment, controller and command phenomena. Our main objective in this paper is to investigate and evaluate the usefulness and effectiveness of this cookbook by following it throughout the formal modelling of the cruise control system found in cars. The outcomes include identifying the benefits of the cookbook and giving guidance to its future users.
A general U-block model-based design procedure for nonlinear polynomial control systems
NASA Astrophysics Data System (ADS)
Zhu, Q. M.; Zhao, D. Y.; Zhang, Jianhua
2016-10-01
The proposition of the U-model concept (in terms of 'providing concise and applicable solutions for complex problems') and a corresponding basic U-control design algorithm originated in the first author's PhD thesis. The term U-model first appeared (not rigorously defined) in the first author's other journal paper, which established a framework for using linear polynomial control system design approaches to design nonlinear polynomial control systems (in brief, linear polynomial approaches → nonlinear polynomial plants). This paper represents the next milestone work - using linear state-space approaches to design nonlinear polynomial control systems (in brief, linear state-space approaches → nonlinear polynomial plants). The overall aim of the study is to establish a framework, defined as the U-block model, which provides a generic prototype for using linear state-space-based approaches to design control systems with smooth nonlinear plants/processes described by polynomial models. For analysing the feasibility and effectiveness, the sliding mode control design approach is selected as an exemplary case study. Numerical simulation studies provide a user-friendly step-by-step procedure for the readers/users with interest in their ad hoc applications. This is the first paper to present the U-model-oriented control system design in a formal way and to study the associated properties and theorems. The previous publications, in the main, have been algorithm-based studies and simulation demonstrations. In some sense, this paper can be treated as a landmark for U-model-based research, from the intuitive/heuristic stage to rigorous/formal/comprehensive studies.
On the dynamics of chain systems. [applications in manipulator and human body models
NASA Technical Reports Server (NTRS)
Huston, R. L.; Passerello, C. E.
1974-01-01
A computer-oriented method for obtaining dynamical equations of motion for chain systems is presented. A chain system is defined as an arbitrarily assembled set of rigid bodies such that adjoining bodies have at least one common point and such that closed loops are not formed. The equations of motion are developed through the use of Lagrange's form of d'Alembert's principle. The method and procedure are illustrated with an elementary study of a tripod space manipulator. The method is designed for application with systems such as human body models, chains and cables, and dynamic finite-segment models.
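As a generic reminder of Lagrange's form of d'Alembert's principle for N rigid bodies with generalized coordinates q_r (the partial-velocity form commonly used in multibody dynamics; the paper's specific chain-system bookkeeping may differ in detail):

```latex
\sum_{i=1}^{N} \big( \mathbf{F}_i - m_i \mathbf{a}_i \big) \cdot \frac{\partial \mathbf{v}_i}{\partial \dot{q}_r}
\;+\;
\sum_{i=1}^{N} \big( \mathbf{M}_i - \dot{\mathbf{H}}_i \big) \cdot \frac{\partial \boldsymbol{\omega}_i}{\partial \dot{q}_r}
\;=\; 0 , \qquad r = 1,\dots,n ,
```

where F_i and M_i are the applied force and moment on body i, a_i is the mass-centre acceleration, H_i the angular momentum about the mass centre, and v_i and ω_i the mass-centre velocity and angular velocity of body i.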
NASA Lewis Wind Tunnel Model Systems Criteria
NASA Technical Reports Server (NTRS)
Soeder, Ronald H.; Haller, Henry C.
1994-01-01
This report describes criteria for the design, analysis, quality assurance, and documentation of models or test articles that are to be tested in the aeropropulsion facilities at the NASA Lewis Research Center. The report presents three methods for computing model allowable stresses on the basis of the yield stress or ultimate stress, and it gives quality assurance criteria for models tested in Lewis' aeropropulsion facilities. Both customer-furnished model systems and in-house model systems are discussed. The functions of the facility manager, project engineer, operations engineer, research engineer, and facility electrical engineer are defined. The format for pretest meetings, prerun safety meetings, and the model criteria review is outlined. Then, the format for the model systems report (a requirement for each model that is to be tested at NASA Lewis) is described, the engineers responsible for developing the model systems report are listed, and the timetable for its delivery to the facility manager is given.
Towards a Stakeholder Model for the Co-Production of the Public-Sector Information System
ERIC Educational Resources Information Center
Correia, Zita P.
2005-01-01
Introduction: Proposes a systemic approach to Public Sector Information (PSI), defined as comprising entities in four categories--citizens, businesses, policymakers and administrations. This system also comprises four categories of information--on citizenship, economic and social development, policy and administration. Methods: A selective…
Using Agent Base Models to Optimize Large Scale Network for Large System Inventories
NASA Technical Reports Server (NTRS)
Shameldin, Ramez Ahmed; Bowling, Shannon R.
2010-01-01
The aim of this paper is to use Agent Base Models (ABM) to optimize large-scale network handling capabilities for large system inventories and to implement strategies for reducing capital expenses. The models used in this paper employ either computational algorithms or procedural implementations developed in Matlab to simulate agent-based models in a principal programming language and mathematical theory using clusters; these clusters act as a high-performance computing environment to run the program in parallel. In both cases, a model is defined as a compilation of a set of structures and processes assumed to underlie the behavior of a network system.
A New Method for Conceptual Modelling of Information Systems
NASA Astrophysics Data System (ADS)
Gustas, Remigijus; Gustiene, Prima
Service architecture is not necessarily bound to the technical aspects of information system development. It can be defined by using conceptual models that are independent of any implementation technology. Unfortunately, the conventional information system analysis and design methods cover just a part of required modelling notations for engineering of service architectures. They do not provide effective support to maintain semantic integrity between business processes and data. Service orientation is a paradigm that can be applied for conceptual modelling of information systems. The concept of service is rather well understood in different domains. It can be applied equally well for conceptualization of organizational and technical information system components. This chapter concentrates on analysis of the differences between service-oriented modelling and object-oriented modelling. Service-oriented method is used for semantic integration of information system static and dynamic aspects.
VizieR Online Data Catalog: Simulation data for 50 planetary model systems (Hansen+, 2015)
NASA Astrophysics Data System (ADS)
Hansen, B. M. S.; Murray, N.
2017-11-01
We have used the results (after 10 Myr of evolution) of 50 model realizations of the 20 M_Earth rocky planet systems from Hansen & Murray (2013ApJ...775...53H) to define the initial state of our systems, given in Table A1. We assume all the planets are of terrestrial class, in the sense that they obey the tidal dissipation, and evolve them for 10 Gyr according to our model for tidal+secular evolution. The final configurations are given in Table A2. (2 data files).
Clinical Named Entity Recognition Using Deep Learning Models.
Wu, Yonghui; Jiang, Min; Xu, Jun; Zhi, Degui; Xu, Hua
2017-01-01
Clinical Named Entity Recognition (NER) is a critical natural language processing (NLP) task to extract important concepts (named entities) from clinical narratives. Researchers have extensively investigated machine learning models for clinical NER. Recently, there have been increasing efforts to apply deep learning models to improve the performance of current clinical NER systems. This study examined two popular deep learning architectures, the Convolutional Neural Network (CNN) and the Recurrent Neural Network (RNN), to extract concepts from clinical texts. We compared the two deep neural network architectures with three baseline Conditional Random Fields (CRFs) models and two state-of-the-art clinical NER systems using the i2b2 2010 clinical concept extraction corpus. The evaluation results showed that the RNN model trained with the word embeddings achieved a new state-of-the-art performance (a strict F1 score of 85.94%) for the defined clinical NER task, outperforming the best-reported system that used both manually defined and unsupervised learning features. This study demonstrates the advantage of using deep neural network architectures for clinical concept extraction, including distributed feature representation, automatic feature learning, and long-term dependencies capture. This is one of the first studies to compare the two widely used deep learning models and demonstrate the superior performance of the RNN model for clinical NER.
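As a sketch of the recurrent architecture family evaluated in the study, the following is a minimal bidirectional-LSTM tagger in PyTorch. The vocabulary size, tag count, and hyperparameters are placeholders; the published system additionally used word embeddings trained on clinical text and richer output layers.

```python
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    """Minimal bidirectional-RNN tagger of the kind compared in the study:
    word embeddings -> BiLSTM -> per-token label scores (e.g. BIO tags for
    problems, treatments, and tests). A fuller system would add a CRF layer
    and character-level features."""
    def __init__(self, vocab_size, n_tags, emb_dim=100, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, n_tags)

    def forward(self, token_ids):
        x = self.emb(token_ids)          # (batch, seq, emb_dim)
        h, _ = self.lstm(x)              # (batch, seq, 2*hidden)
        return self.out(h)               # (batch, seq, n_tags)

# Illustrative forward pass on a dummy batch (vocabulary and tag set sizes
# are placeholders, not taken from the i2b2 2010 corpus).
model = BiLSTMTagger(vocab_size=5000, n_tags=7)
scores = model(torch.randint(1, 5000, (2, 12)))
loss = nn.CrossEntropyLoss()(scores.view(-1, 7), torch.randint(0, 7, (24,)))
print(scores.shape, loss.item())
```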
NASA Astrophysics Data System (ADS)
Leuchter, S.; Reinert, F.; Müller, W.
2014-06-01
Procurement and design of system architectures capable of network centric operations demand an assessment scheme in order to compare different alternative realizations. In this contribution an assessment method for system architectures targeted at the C4ISR domain is presented. The method addresses the integration capability of software systems from a complex and distributed software system perspective, focusing on communication, interfaces and software. The aim is to evaluate the capability to integrate a system or its functions within a system-of-systems network. This method uses approaches from software architecture quality assessment and applies them on the system architecture level. It features a specific goal tree of several dimensions that are relevant for enterprise integration. These dimensions have to be weighted against each other and aggregated using methods from normative decision theory in order to reflect the intention of the particular enterprise integration effort. The indicators and measurements for many of the considered quality features rely on a model-based view on systems, networks, and the enterprise. That means it is applicable to system-of-systems specifications based on enterprise architectural frameworks relying on defined meta-models or domain ontologies for defining views and viewpoints. In the defense context we use the NATO Architecture Framework (NAF) to ground the respective system models. The proposed assessment method allows evaluating and comparing competing system designs regarding their future integration potential. It is a contribution to the system-of-systems engineering methodology.
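The weighting and aggregation of goal-tree dimensions described above can be illustrated with a minimal weighted-sum roll-up. The dimension names, weights, and leaf scores below are hypothetical, and the published method may use other aggregation rules from normative decision theory.

# Minimal sketch: roll up leaf scores through a weighted goal tree.
# Each entry is (weight, subtree); a subtree is either a leaf score in [0, 1]
# or a dict of weighted children.
goal_tree = {
    "interoperability": (0.4, {"interfaces": (0.6, 0.8), "protocols": (0.4, 0.5)}),
    "software_quality": (0.3, {"modularity": (0.5, 0.7), "documentation": (0.5, 0.9)}),
    "communication":    (0.3, 0.6),
}

def aggregate(subtree):
    if isinstance(subtree, dict):
        return sum(w * aggregate(child) for w, child in subtree.values())
    return subtree  # leaf score

print(round(aggregate(goal_tree), 3))  # 0.692 for these made-up values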
Methodological approach towards the definition of new storage conditions for inert wastes.
Perrodin, Y; Méhu, J; Grelier-Volatier, L; Charbonnier, P; Baranger, P; Thoraval, L
2002-01-01
In 1997, the French Ministry of Environment launched studies aiming to define a specific regulation concerning inert waste disposal in order to limit the potential impact of such facilities on the environment by fixing minimum requirements. A model (chemical model/hydrodynamic model) was developed to determine dumping conditions. This model was then applied to two defined scenarios (landfill surface, effective rainfalls...) in order to study the sulphate concentrations in the aquifer system immediately downstream from the storage facility. Results allow us to determine under which conditions the sulphate concentrations are compatible with the potentially drinkable character of the groundwater. They more specifically concern the nature of the waste disposed of, the effective rainfalls and the landfill area.
Goodman, Dan F. M.; Brette, Romain
2009-01-01
“Brian” is a simulator for spiking neural networks (http://www.briansimulator.org). The focus is on making the writing of simulation code as quick and easy as possible for the user, and on flexibility: new and non-standard models are no more difficult to define than standard ones. This allows scientists to spend more time on the details of their models, and less on their implementation. Neuron models are defined by writing differential equations in standard mathematical notation, facilitating scientific communication. Brian is written in the Python programming language, and uses vector-based computation to allow for efficient simulations. It is particularly useful for neuroscientific modelling at the systems level, and for teaching computational neuroscience. PMID:20011141
Guide on Data Models in the Selection and Use of Database Management Systems. Final Report.
ERIC Educational Resources Information Center
Gallagher, Leonard J.; Draper, Jesse M.
A tutorial introduction to data models in general is provided, with particular emphasis on the relational and network models defined by the two proposed ANSI (American National Standards Institute) database language standards. Examples based on the network and relational models include specific syntax and semantics, while examples from the other…
Sensor/Response Coordination In A Tactical Self-Protection System
NASA Astrophysics Data System (ADS)
Steinberg, Alan N.
1988-08-01
This paper describes a model for integrating information acquisition functions into a response planner within a tactical self-defense system. This model may be used in defining requirements in such applications for sensor systems and for associated processing and control functions. The goal of information acquisition in a self-defense system is generally not that of achieving the best possible estimate of the threat environment, but rather that of resolving that environment sufficiently to support response decisions. We model the information acquisition problem as that of achieving a partition among possible world states such that the final partition maps into the system's repertoire of possible responses.
Hevesi, Joseph A.; Flint, Alan L.; Flint, Lorraine E.
2003-01-01
This report presents the development and application of the distributed-parameter watershed model, INFILv3, for estimating the temporal and spatial distribution of net infiltration and potential recharge in the Death Valley region, Nevada and California. The estimates of net infiltration quantify the downward drainage of water across the lower boundary of the root zone and are used to indicate potential recharge under variable climate conditions and drainage basin characteristics. Spatial variability in recharge in the Death Valley region likely is high owing to large differences in precipitation, potential evapotranspiration, bedrock permeability, soil thickness, vegetation characteristics, and contributions to recharge along active stream channels. The quantity and spatial distribution of recharge representing the effects of variable climatic conditions and drainage basin characteristics on recharge are needed to reduce uncertainty in modeling ground-water flow. The U.S. Geological Survey, in cooperation with the Department of Energy, developed a regional saturated-zone ground-water flow model of the Death Valley regional ground-water flow system to help evaluate the current hydrogeologic system and the potential effects of natural or human-induced changes. Although previous estimates of recharge have been made for most areas of the Death Valley region, including the area defined by the boundary of the Death Valley regional ground-water flow system, the uncertainty of these estimates is high, and the spatial and temporal variability of the recharge in these basins has not been quantified. To estimate the magnitude and distribution of potential recharge in response to variable climate and spatially varying drainage basin characteristics, the INFILv3 model uses a daily water-balance model of the root zone with a primarily deterministic representation of the processes controlling net infiltration and potential recharge. The daily water balance includes precipitation (as either rain or snow), snow accumulation, sublimation, snowmelt, infiltration into the root zone, evapotranspiration, drainage, water content change throughout the root-zone profile (represented as a 6-layered system), runoff (defined as excess rainfall and snowmelt) and surface water run-on (defined as runoff that is routed downstream), and net infiltration (simulated as drainage from the bottom root-zone layer). Potential evapotranspiration is simulated using an hourly solar radiation model to simulate daily net radiation, and daily evapotranspiration is simulated as an empirical function of root zone water content and potential evapotranspiration. The model uses daily climate records of precipitation and air temperature from a regionally distributed network of 132 climate stations and a spatially distributed representation of drainage basin characteristics defined by topography, geology, soils, and vegetation to simulate daily net infiltration at all locations, including stream channels with intermittent streamflow in response to runoff from rain and snowmelt. The temporal distribution of daily, monthly, and annual net infiltration can be used to evaluate the potential effect of future climatic conditions on potential recharge. The INFILv3 model inputs representing drainage basin characteristics were developed using a geographic information system (GIS) to define a set of spatially distributed input parameters uniquely assigned to each grid cell of the INFILv3 model grid. 
The model grid, which was defined by a digital elevation model (DEM) of the Death Valley region, consists of 1,252,418 model grid cells with a uniform grid cell dimension of 278.5 meters in the north-south and east-west directions. The elevation values from the DEM were used with monthly regression models developed from the daily climate data to estimate the spatial distribution of daily precipitation and air temperature. The elevation values were also used to simulate atmosp
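The daily root-zone water balance outlined above can be sketched for a single grid cell. The layer count, parameter values, and the simple degree-day melt and ET formulations below are assumptions for illustration only; they are not the INFILv3 equations.

# Minimal single-cell daily water-balance sketch (not the INFILv3 formulation).
def daily_step(state, precip_mm, tair_c, pet_mm, melt_coeff=3.0, capacity_mm=150.0):
    """Advance snowpack and a one-layer root zone by one day.

    state: dict with 'snow' and 'soil' storages (mm).
    Returns updated state plus runoff and net infiltration (drainage) in mm.
    """
    rain = precip_mm if tair_c > 0.0 else 0.0
    snowfall = precip_mm - rain
    state["snow"] += snowfall
    melt = min(state["snow"], max(0.0, melt_coeff * tair_c))  # degree-day melt
    state["snow"] -= melt

    supply = rain + melt
    infiltration = min(supply, capacity_mm - state["soil"])
    runoff = supply - infiltration                            # excess rain + snowmelt
    state["soil"] += infiltration

    # ET limited by available water; drainage only above an assumed field capacity.
    et = min(state["soil"], pet_mm * state["soil"] / capacity_mm)
    state["soil"] -= et
    drainage = max(0.0, state["soil"] - 0.6 * capacity_mm)
    state["soil"] -= drainage
    return state, runoff, drainage

state = {"snow": 0.0, "soil": 60.0}
state, q, net_infil = daily_step(state, precip_mm=12.0, tair_c=4.0, pet_mm=3.0)
print(q, net_infil)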
Feature-based component model for design of embedded systems
NASA Astrophysics Data System (ADS)
Zha, Xuan Fang; Sriram, Ram D.
2004-11-01
An embedded system is a hybrid of hardware and software, which combines software's flexibility and hardware's real-time performance. Embedded systems can be considered as assemblies of hardware and software components. An Open Embedded System Model (OESM) is currently being developed at NIST to provide a standard representation and exchange protocol for embedded systems and system-level design, simulation, and testing information. This paper proposes an approach to representing an embedded system feature-based model in OESM, i.e., an Open Embedded System Feature Model (OESFM), addressing models of embedded system artifacts, embedded system components, embedded system features, and embedded system configuration/assembly. The approach provides an object-oriented UML (Unified Modeling Language) representation for the embedded system feature model and defines an extension to the NIST Core Product Model. The model provides a feature-based component framework allowing the designer to develop a virtual embedded system prototype through assembling virtual components. The framework not only provides a formal precise model of the embedded system prototype but also offers the possibility of designing variations of prototypes whose members are derived by changing certain virtual components with different features. A case study example is discussed to illustrate the embedded system model.
Zolkind, Paul; Przybylski, Dariusz; Marjanovic, Nemanja; Nguyen, Lan; Lin, Tianxiang; Johanns, Tanner; Alexandrov, Anton; Zhou, Liye; Allen, Clint T.; Miceli, Alexander P.; Schreiber, Robert D.; Artyomov, Maxim; Dunn, Gavin P.; Uppaluri, Ravindra
2018-01-01
Head and neck squamous cell carcinomas (HNSCC) are an ideal immunotherapy target due to their high mutation burden and frequent infiltration with lymphocytes. Preclinical models to investigate targeted and combination therapies as well as defining biomarkers to guide treatment represent an important need in the field. Immunogenomics approaches have illuminated the role of mutation-derived tumor neoantigens as potential biomarkers of response to checkpoint blockade as well as representing therapeutic vaccines. Here, we aimed to define a platform for checkpoint and other immunotherapy studies using syngeneic HNSCC cell line models (MOC2 and MOC22), and evaluated the association between mutation burden, predicted neoantigen landscape, infiltrating T cell populations and responsiveness of tumors to anti-PD1 therapy. We defined dramatic hematopoietic cell transcriptomic alterations in the MOC22 anti-PD1 responsive model in both tumor and draining lymph nodes. Using a cancer immunogenomics pipeline and validation with ELISPOT and tetramer analysis, we identified the H-2Kb-restricted ICAM1P315L (mICAM1) as a neoantigen in MOC22. Finally, we demonstrated that mICAM1 vaccination was able to protect against MOC22 tumor development defining mICAM1 as a bona fide neoantigen. Together these data define a pre-clinical HNSCC model system that provides a foundation for future investigations into combination and novel therapeutics. PMID:29423108
The Intertwining of Enterprise Strategy and Requirements
NASA Astrophysics Data System (ADS)
Loucopoulos, Pericles; Garfield, Joy
Requirements Engineering techniques need to focus not only on the target technical system, as has traditionally been the case, but also on the interplay between business and system functionality. Whether a business wishes to exploit advances in technology to achieve new strategic objectives or to organise work in innovative ways, the process of Requirements Engineering could and should present opportunities for modelling and evaluating the potential impact that technology can bring about to the enterprise. This chapter discusses a co-designing process that offers opportunities of change to both the business and its underlying technical systems, in a synergistic manner. In these design situations some of the most challenging projects involve multiple stakeholders from different participating organisations, subcontractors, divisions, etc., who may have a diversity of expertise, come from different organisational cultures and often have competing goals. Stakeholders are faced with many different alternative future ‘worlds’, each one demanding a possibly different development strategy. There are acute questions about the potential structure of the new business system and how key variables in this structure could impact on the dynamics of the system. This chapter presents a framework which enables the evaluation of requirements through (a) system dynamics modelling, (b) ontology modelling, (c) scenario modelling and (d) rationale modelling. System dynamics modelling is used to define the behaviour of an enterprise system in terms of four perspectives. Ontology modelling is used to formally define invariant components of the physical and social world within the enterprise domain. Scenario modelling is used to identify critical variables and, by quantitatively analyzing the effects of these variables through simulation, to better understand the dynamic behaviour of the possible future structures. Rationale modelling is used to assist collaborative discussions when considering either ontology models or scenarios for change, developing maps, which chart the assumptions and reasoning behind key decisions during the requirements process.
Play-fairway analysis for geothermal exploration: Examples from the Great Basin, western USA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siler, Drew L; Faulds, James E
2013-10-27
Elevated permeability within fault systems provides pathways for circulation of geothermal fluids. Future geothermal development depends on precise and accurate location of such fluid flow pathways in order to both accurately assess geothermal resource potential and increase drilling success rates. The collocation of geologic characteristics that promote permeability in a given geothermal system defines the geothermal ‘fairway’, the location(s) where upflow zones are probable and where exploration efforts including drilling should be focused. We define the geothermal fairway as the collocation of 1) fault zones that are ideally oriented for slip or dilation under ambient stress conditions, 2) areas with a high spatial density of fault intersections, and 3) lithologies capable of supporting dense interconnected fracture networks. Areas in which these characteristics are concomitant with both elevated temperature and fluids are probable upflow zones where economic-scale, sustainable temperatures and flow rates are most likely to occur. Employing a variety of surface and subsurface data sets, we test this ‘play-fairway’ exploration methodology on two Great Basin geothermal systems, the actively producing Brady’s geothermal system and a ‘greenfield’ geothermal prospect at Astor Pass, NV. These analyses, based on 3D structural and stratigraphic framework models, reveal subsurface characteristics about each system, well beyond the scope of standard exploration methods. At Brady’s, the geothermal fairways we define correlate well with successful production wells and pinpoint several drilling targets for maintaining or expanding production in the field. In addition, hot-dry wells within the Brady’s geothermal field lie outside our defined geothermal fairways. At Astor Pass, our play-fairway analysis provides for a data-based conceptual model of fluid flow within the geothermal system and indicates several targets for exploration drilling.
Goal-Function Tree Modeling for Systems Engineering and Fault Management
NASA Technical Reports Server (NTRS)
Patterson, Jonathan D.; Johnson, Stephen B.
2013-01-01
The draft NASA Fault Management (FM) Handbook (2012) states that Fault Management (FM) is a "part of systems engineering", and that it "demands a system-level perspective" (NASAHDBK-1002, 7). What, exactly, is the relationship between systems engineering and FM? To NASA, systems engineering (SE) is "the art and science of developing an operable system capable of meeting requirements within often opposed constraints" (NASA/SP-2007-6105, 3). Systems engineering starts with the elucidation and development of requirements, which set the goals that the system is to achieve. To achieve these goals, the systems engineer typically defines functions, and the functions in turn are the basis for design trades to determine the best means to perform the functions. System Health Management (SHM), by contrast, defines "the capabilities of a system that preserve the system's ability to function as intended" (Johnson et al., 2011, 3). Fault Management, in turn, is the operational subset of SHM, which detects current or future failures, and takes operational measures to prevent or respond to these failures. Failure, in turn, is the "unacceptable performance of intended function" (Johnson 2011, 605). Thus the relationship of SE to FM is that SE defines the functions and the design to perform those functions to meet system goals and requirements, while FM detects the inability to perform those functions and takes action. SHM and FM are in essence "the dark side" of SE. For every function to be performed (SE), there is the possibility that it is not successfully performed (SHM); FM defines the means to operationally detect and respond to this lack of success. We can also describe this in terms of goals: for every goal to be achieved, there is the possibility that it is not achieved; FM defines the means to operationally detect and respond to this inability to achieve the goal. This brief description of the relationships between SE, SHM, and FM provides hints toward a modeling approach that provides formal connectivity between the nominal (SE) and off-nominal (SHM and FM) aspects of functions and designs. This paper describes a formal modeling approach to the initial phases of the development process that integrates the nominal and off-nominal perspectives in a model that unites SE goals and functions with the failure to achieve those goals and functions (SHM/FM). This methodology and corresponding model, known as a Goal-Function Tree (GFT), provides a means to represent, decompose, and elaborate system goals and functions in a rigorous manner that connects directly to design through use of state variables that translate natural language requirements and goals into logical-physical state language. The state variable-based approach also provides the means to directly connect FM to the design, by specifying the range in which state variables must be controlled to achieve goals, and conversely, the failures that exist if system behavior goes out of range. This in turn allows the systems engineers and SHM/FM engineers to determine which state variables to monitor, and what action(s) to take should the system fail to achieve that goal. In sum, the GFT representation provides a unified approach to early-phase SE and FM development. This representation and methodology has been successfully developed and implemented using Systems Modeling Language (SysML) on the NASA Space Launch System (SLS) Program.
It enabled early design trade studies of failure detection coverage to ensure complete detection coverage of all crew-threatening failures. The representation maps directly both to FM algorithm designs and to failure scenario definitions needed for design analysis and testing. The GFT representation provided the basis for mapping abort triggers into scenarios, both of which were needed for the initial, successful quantitative analyses of abort effectiveness (detection of and response to crew-threatening events).
An adaptable architecture for patient cohort identification from diverse data sources.
Bache, Richard; Miles, Simon; Taweel, Adel
2013-12-01
We define and validate an architecture for systems that identify patient cohorts for clinical trials from multiple heterogeneous data sources. This architecture has an explicit query model capable of supporting temporal reasoning and expressing eligibility criteria independently of the representation of the data used to evaluate them. The architecture has the key feature that queries defined according to the query model are both pre- and post-processed, and this is used to address both structural and semantic heterogeneity. The process of extracting the relevant clinical facts is separated from the process of reasoning about them. A specific instance of the query model is then defined and implemented. We show that the specific instance of the query model has wide applicability. We then describe how it is used to access three diverse data warehouses to determine patient counts. Although the proposed architecture requires greater effort to implement the query model than would be the case for using just SQL and accessing a database management system directly, this effort is justified because it supports both temporal reasoning and heterogeneous data sources. The query model only needs to be implemented once, no matter how many data sources are accessed. Each additional source requires only the implementation of a lightweight adaptor. The architecture has been used to implement a specific query model that can express complex eligibility criteria and access three diverse data warehouses, thus demonstrating the feasibility of this approach in dealing with temporal reasoning and data heterogeneity.
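The separation between a shared query model and lightweight per-source adaptors described above can be sketched as follows. The class names, the example eligibility criterion, and the in-memory "warehouse" are hypothetical and do not reproduce the published query model.

from abc import ABC, abstractmethod
from datetime import date, timedelta

class SourceAdaptor(ABC):
    """Lightweight adaptor: extracts clinical facts in a common form from one source."""
    @abstractmethod
    def facts(self, patient_id, concept):
        """Return a list of (value, observation_date) pairs for a coded concept."""

class WarehouseAAdaptor(SourceAdaptor):
    def __init__(self, rows):
        self._rows = rows  # stand-in for a query against one specific warehouse
    def facts(self, patient_id, concept):
        return [(r["value"], r["when"]) for r in self._rows
                if r["pid"] == patient_id and r["code"] == concept]

def eligible(adaptor, patient_id, concept, threshold, within_days, as_of):
    """Query-model criterion with temporal reasoning, independent of the data source."""
    window_start = as_of - timedelta(days=within_days)
    return any(v >= threshold and window_start <= d <= as_of
               for v, d in adaptor.facts(patient_id, concept))

rows = [{"pid": 1, "code": "HbA1c", "value": 7.9, "when": date(2013, 5, 2)}]
print(eligible(WarehouseAAdaptor(rows), 1, "HbA1c", 7.0, 180, date(2013, 9, 1)))  # True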
Kinematic Repulsions Between Inertial Systems in AN Expanding Inflationary Universe
NASA Astrophysics Data System (ADS)
Savickas, D.
2013-09-01
The cosmological background radiation is observed to be isotropic only within a coordinate system that is at rest relative to its local Hubble drift. This indicates that the Hubble motion describes the recessional motion of an inertial system that is at rest relative to its local Hubble drift. It is shown that when the Hubble parameter is kinematically defined directly in terms of the positions and velocities of mass particles in the universe, it then also defines inertial systems themselves in terms of the distribution and motion of mass particles. It is independent of the velocity of photons because photons always have a speed c relative to the inertial system in which they are located. Therefore the definition of their velocity depends on the definition of the Hubble parameter itself and cannot be used to define H. The derivative of the kinematically defined Hubble parameter with respect to time is shown to always be positive and highly repulsive at the time of the origin of the universe. A model is used which describes a universe that is balanced at the time of its origin so that H approaches zero as the universe expands to infinity.
The Rothermel surface fire spread model and associated developments: A comprehensive explanation
Patricia L. Andrews
2018-01-01
The Rothermel surface fire spread model, with some adjustments by Frank A. Albini in 1976, has been used in fire and fuels management systems since 1972. It is generally used with other models including fireline intensity and flame length. Fuel models are often used to define fuel input parameters. Dynamic fuel models use equations for live fuel curing. Models have...
NASA Technical Reports Server (NTRS)
Kavi, K. M.
1984-01-01
There have been a number of simulation packages developed for the purpose of designing, testing and validating computer systems, digital systems and software systems. Complex analytical tools based on Markov and semi-Markov processes have been designed to estimate the reliability and performance of simulated systems. Petri nets have received wide acceptance for modeling complex and highly parallel computers. In this research data flow models for computer systems are investigated. Data flow models can be used to simulate both software and hardware in a uniform manner. Data flow simulation techniques provide the computer systems designer with a CAD environment which enables highly parallel complex systems to be defined, evaluated at all levels and finally implemented in either hardware or software. Inherent in the data flow concept is the hierarchical handling of complex systems. In this paper we describe how data flow can be used to model computer systems.
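A minimal token-driven data flow simulation of the kind referred to above can be sketched as below: a node fires when every input arc holds a token, which lets hardware and software components be modeled uniformly. The node set and firing functions are illustrative assumptions.

# Minimal data flow firing-rule sketch: a node fires when all its input arcs hold tokens.
from collections import defaultdict

graph = {              # node -> (input arcs, output arcs, function)
    "add": (["a", "b"], ["s"], lambda x, y: x + y),
    "square": (["s"], ["out"], lambda x: x * x),
}
tokens = defaultdict(list)
tokens["a"].append(2)
tokens["b"].append(3)

fired = True
while fired:
    fired = False
    for name, (ins, outs, fn) in graph.items():
        if all(tokens[arc] for arc in ins):            # firing rule
            args = [tokens[arc].pop(0) for arc in ins]
            result = fn(*args)
            for arc in outs:
                tokens[arc].append(result)
            fired = True

print(tokens["out"])  # [25]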
Surface-directed capillary system; theory, experiments and applications.
Bouaidat, Salim; Hansen, Ole; Bruus, Henrik; Berendsen, Christian; Bau-Madsen, Niels Kristian; Thomsen, Peter; Wolff, Anders; Jonsmann, Jacques
2005-08-01
We present a capillary flow system for liquid transport in microsystems. Our simple microfluidic system consists of two planar parallel surfaces, separated by spacers. One of the surfaces is entirely hydrophobic, the other mainly hydrophobic, but with hydrophilic pathways defined on it by photolithographic means. By controlling the wetting properties of the surfaces in this manner, the liquid can be confined to certain areas defined by the hydrophilic pathways. This technique eliminates the need for alignment of the two surfaces. Patterned plasma-polymerized hexafluoropropene constitutes the hydrophobic areas, whereas the untreated glass surface constitutes the hydrophilic pathways. We developed a theoretical model of the capillary flow and obtained analytical solutions which are in good agreement with the experimental results. The capillarity-driven microflow system was also used to pattern and immobilize biological material on planar substrates: well-defined 200 microm wide strips of human cells (HeLa) and fluorescence labelled proteins (fluorescein isothiocyanate-labelled bovine serum albumin, i.e., FITC-BSA) were fabricated using the capillary flow system presented here.
Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hale, Richard Edward; Fugate, David L.; Cetiner, Sacit M.
2015-05-01
The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.
A Chemical Properties Simulator to Support Integrated Environmental Modeling
Users of Integrated Environmental Modeling (IEM) systems are responsible for defining individual chemicals and their properties, a process that is time-consuming at best and overwhelming at worst, especially for new chemicals with new structures. A software tool is needed to allo...
A CONCEPTUAL MODEL FOR MULTI-SCALAR ASSESSMENTS OF ESTUARINE ECOLOGICAL INTEGRITY
A conceptual model was developed that relates an estuarine system's anthropogenic inputs to it's ecological integrity. Ecological integrity is operationally defined as an emergent property of an ecosystem that exists when the structural components are complete and the functional ...
Sustainable, Reliable Mission-Systems Architecture
NASA Technical Reports Server (NTRS)
O'Neil, Graham; Orr, James K.; Watson, Steve
2005-01-01
A mission-systems architecture, based on a highly modular infrastructure utilizing open-standards hardware and software interfaces as the enabling technology, is essential for affordable and sustainable space exploration programs. This mission-systems architecture requires (a) robust communication between heterogeneous systems, (b) high reliability, (c) minimal mission-to-mission reconfiguration, (d) affordable development, system integration, and verification of systems, and (e) minimal sustaining engineering. This paper proposes such an architecture. Lessons learned from the Space Shuttle program and Earthbound complex engineered systems are applied to define the model. Technology projections reaching out 5 years are made to refine model details.
Sustainable, Reliable Mission-Systems Architecture
NASA Technical Reports Server (NTRS)
O'Neil, Graham; Orr, James K.; Watson, Steve
2007-01-01
A mission-systems architecture, based on a highly modular infrastructure utilizing open-standards hardware and software interfaces as the enabling technology, is essential for affordable and sustainable space exploration programs. This mission-systems architecture requires (a) robust communication between heterogeneous systems, (b) high reliability, (c) minimal mission-to-mission reconfiguration, (d) affordable development, system integration, and verification of systems, and (e) minimal sustaining engineering. This paper proposes such an architecture. Lessons learned from the Space Shuttle program and Earthbound complex engineered systems are applied to define the model. Technology projections reaching out 5 years are made to refine model details.
ERIC Educational Resources Information Center
Sobe, Noah W.; Ortegon, Nicole D.
2009-01-01
The purported "liquidity" of knowledge is often posed as one of the defining characteristics of the present "age of globalization." Liquidity describes the present moment as one marked by flows, flexibility and flux, and it also can be invoked to define the here-and-now by suggesting contrasts and departures from earlier historical eras. This…
An approach to define semantics for BPM systems interoperability
NASA Astrophysics Data System (ADS)
Rico, Mariela; Caliusco, María Laura; Chiotti, Omar; Rosa Galli, María
2015-04-01
This article proposes defining semantics for Business Process Management systems interoperability through the ontology of Electronic Business Documents (EBD) used to interchange the information required to perform cross-organizational processes. The semantic model generated allows aligning an enterprise's business processes to support cross-organizational processes by matching the business ontology of each business partner with the EBD ontology. The result is a flexible software architecture that allows dynamically defining cross-organizational business processes by reusing the EBD ontology. For developing the semantic model, a method is presented, which is based on a strategy for discovering entity features whose interpretation depends on the context, and representing them for enriching the ontology. The proposed method complements ontology learning techniques that cannot infer semantic features not represented in data sources. In order to improve the representation of these entity features, the method proposes using widely accepted ontologies for representing time entities and relations, physical quantities, measurement units, official country names, and currencies and funds, among others. When ontology reuse is not possible, the method proposes identifying whether the feature is simple or complex, and defines a strategy to be followed. An empirical validation of the approach has been performed through a case study.
Cognitive Models for Learning to Control Dynamic Systems
2008-09-26
1992 [47] G. F. Franklin and J. D. Powell, Feedback Control of Dynamic Systems, New Jersey: Pearson Prentice Hall, 2006. [48] M. Fishbein and I. Ajzen ... the course of decision making, the valence of an action V_i (i = A or M) is defined as the subjective expected payoff for each action also fluctuates ... research: The role of formal models, IEEE Transactions on Systems, Man, and Cybernetics 16, 1986, pp. 439–449. [54] M. I. Jordan, Constrained
From scenarios to domain models: processes and representations
NASA Astrophysics Data System (ADS)
Haddock, Gail; Harbison, Karan
1994-03-01
The domain specific software architectures (DSSA) community has defined a philosophy for the development of complex systems. This philosophy improves productivity and efficiency by increasing the user's role in the definition of requirements, increasing the systems engineer's role in the reuse of components, and decreasing the software engineer's role to the development of new components and component modifications only. The scenario-based engineering process (SEP), the first instantiation of the DSSA philosophy, has been adopted by the next generation controller project. It is also the chosen methodology of the trauma care information management system project, and the surrogate semi-autonomous vehicle project. SEP uses scenarios from the user to create domain models and define the system's requirements. Domain knowledge is obtained from a variety of sources including experts, documents, and videos. This knowledge is analyzed using three techniques: scenario analysis, task analysis, and object-oriented analysis. Scenario analysis results in formal representations of selected scenarios. Task analysis of the scenario representations results in descriptions of tasks necessary for object-oriented analysis and also subtasks necessary for functional system analysis. Object-oriented analysis of task descriptions produces domain models and system requirements. This paper examines the representations that support the DSSA philosophy, including reference requirements, reference architectures, and domain models. The processes used to create and use the representations are explained through use of the scenario-based engineering process. Selected examples are taken from the next generation controller project.
Four dimensional studies in earth space
NASA Technical Reports Server (NTRS)
Mather, R. S.
1972-01-01
A system of reference that is directly related to observations is proposed for four-dimensional studies in earth space. Global control network and polar wandering are defined. The determination of variations in the earth's gravitational field with time also forms part of such a system. Techniques are outlined for the unique definition of the motion of the geocenter, and the changes in the location of the axis of rotation of an instantaneous earth model, in relation to values at some epoch of reference. The instantaneous system referred to is directly related to a fundamental equation in geodynamics. The reference system defined would provide an unambiguous frame for long-period studies in earth space, provided the scale of the space were specified.
Operationalizing sustainability in urban coastal systems: a system dynamics analysis.
Mavrommati, Georgia; Bithas, Kostas; Panayiotidis, Panayiotis
2013-12-15
We propose a system dynamics approach for Ecologically Sustainable Development (ESD) in urban coastal systems. A systematic analysis based on theoretical considerations, policy analysis and experts' knowledge is followed in order to define the concept of ESD. The principles underlying ESD feed the development of a System Dynamics Model (SDM) that connects the pollutant loads produced by urban systems' socioeconomic activities with the ecological condition of the coastal ecosystem, which is delineated in operational terms through key biological elements defined by the EU Water Framework Directive. The receiving waters of the Athens Metropolitan area, which bear the elements of a typical high-population-density Mediterranean coastal city but which currently also exhibit new dynamics induced by the ongoing financial crisis, are used as an experimental system for testing a system dynamics approach to apply the concept of ESD. Systems thinking is employed to represent the complex relationships among the components of the system. Interconnections and dependencies that determine the potential for achieving ESD are revealed. The proposed system dynamics analysis can help decision makers define paths of development that comply with the principles of ESD. Copyright © 2013 Elsevier Ltd. All rights reserved.
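The stock-and-flow structure of such a System Dynamics Model can be sketched with a simple Euler integration linking a socioeconomic pollutant load to an ecological-condition stock. The equations, variables, and coefficients below are purely illustrative assumptions and are not the published SDM.

# Illustrative two-stock system dynamics sketch (not the published SDM).
def simulate(years=30, dt=0.1):
    population = 4.0e6          # urban population (stock)
    eco_quality = 0.8           # coastal ecological condition index in [0, 1] (stock)
    growth_rate = 0.005         # per year (assumed)
    load_per_capita = 1.0e-7    # pollutant load per person per year (assumed units)
    recovery_rate = 0.05        # natural recovery per year (assumed)
    damage_coeff = 0.08         # impact of load on condition (assumed)

    t = 0.0
    while t < years:
        load = population * load_per_capita
        d_pop = growth_rate * population
        d_eco = recovery_rate * (1.0 - eco_quality) - damage_coeff * load
        population += d_pop * dt
        eco_quality = min(1.0, max(0.0, eco_quality + d_eco * dt))
        t += dt
    return population, eco_quality

print(simulate())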
Simulation model for the Boeing 720B aircraft-flight control system in continuous flight.
DOT National Transportation Integrated Search
1971-08-01
A mathematical model of the Boeing 720B aircraft and autopilot has been derived. The model is representative of the 720B aircraft for continuous flight within a flight envelope defined by a Mach number of .4 at 20,000 feet altitude in a cruise config...
42 CFR § 414.1420 - Other payer advanced APMs.
Code of Federal Regulations, 2010 CFR
2017-10-01
... Merit-Based Incentive Payment System and Alternative Payment Model Incentive § 414.1420 Other payer... payment by the APM Entity to the payer. (2) Medicaid Medical Home Model financial risk standard. For an... APM benchmark, except for episode payment models, for which it is defined as the episode target price...
Upton, J; Murphy, M; Shalloo, L; Groot Koerkamp, P W G; De Boer, I J M
2014-01-01
Our objective was to define and demonstrate a mechanistic model that enables dairy farmers to explore the impact of a technical or managerial innovation on electricity consumption, associated CO2 emissions, and electricity costs. We, therefore, (1) defined a model for electricity consumption on dairy farms (MECD) capable of simulating total electricity consumption along with related CO2 emissions and electricity costs on dairy farms on a monthly basis; (2) validated the MECD using empirical data from 1 yr on commercial spring-calving, grass-based dairy farms with 45, 88, and 195 milking cows; and (3) demonstrated the functionality of the model by applying 2 electricity tariffs to the electricity consumption data and examining the effect on total dairy farm electricity costs. The MECD was developed using a mechanistic modeling approach and required the key inputs of milk production, cow number, and details relating to the milk-cooling system, milking machine system, water-heating system, lighting systems, water pump systems, and the winter housing facilities, as well as details relating to the management of the farm (e.g., season of calving). Model validation showed an overall relative prediction error (RPE) of less than 10% for total electricity consumption. More than 87% of the mean square prediction error of total electricity consumption was accounted for by random variation. The RPE values of the milk-cooling systems, water-heating systems, and milking machine systems were less than 20%. The RPE values for automatic scraper systems, lighting systems, and water pump systems varied from 18 to 113%, indicating poor prediction for these metrics. However, automatic scrapers, lighting, and water pumps made up only 14% of total electricity consumption across all farms, reducing the overall impact of these poor predictions. Demonstration of the model showed that total farm electricity costs increased by between 29 and 38% when moving from a day-and-night tariff to a flat tariff. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
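The relative prediction error (RPE) used for validation above is commonly computed as the root mean square prediction error expressed as a percentage of the observed mean; the sketch below assumes that convention and uses made-up monthly values, not the study's data.

import math

def rpe(observed, predicted):
    """Relative prediction error: RMSPE as a percentage of the observed mean."""
    n = len(observed)
    mspe = sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n
    return 100.0 * math.sqrt(mspe) / (sum(observed) / n)

# Hypothetical monthly electricity consumption (kWh): observed vs. model output.
observed  = [5200, 4900, 5100, 4700, 4300, 4100]
predicted = [5000, 5100, 4900, 4800, 4500, 4000]
print(round(rpe(observed, predicted), 1))  # ~3.7% for this made-up example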
Limited Investigation into Regenerative Braking and Energy Storage for Mass Transit Systems
DOT National Transportation Integrated Search
1978-03-01
This study examines the technical and economic aspects of a regenerative braking/flywheel energy storage subway system. In order to define the analytical models accurately, it was necessary to gather data on the trains, rail network, schedules, and a...
Strategic Marketing for Educational Systems.
ERIC Educational Resources Information Center
Hanson, E. Mark; Henry, Walter
1992-01-01
Private-sector strategic marketing processes can significantly benefit schools desiring to develop public confidence and support and establish guidelines for future development. This article defines a strategic marketing model for school systems and articulates the sequence of related research and operational steps comprising it. Although schools…
Software For Graphical Representation Of A Network
NASA Technical Reports Server (NTRS)
Mcallister, R. William; Mclellan, James P.
1993-01-01
System Visualization Tool (SVT) computer program developed to provide systems engineers with means of graphically representing networks. Generates diagrams illustrating structures and states of networks defined by users. Provides systems engineers with powerful tool simplifying analysis of requirements and testing and maintenance of complex software-controlled systems. Employs visual models supporting analysis of chronological sequences of requirements, simulation data, and related software functions. Applied to pneumatic, hydraulic, and propellant-distribution networks. Used to define and view arbitrary configurations of such major hardware components of system as propellant tanks, valves, propellant lines, and engines. Also graphically displays status of each component. Advantage of SVT: utilizes visual cues to represent configuration of each component within network. Written in Turbo Pascal(R), version 5.0.
A Preliminary Data Model for Orbital Flight Dynamics in Shuttle Mission Control
NASA Technical Reports Server (NTRS)
ONeill, John; Shalin, Valerie L.
2000-01-01
The Orbital Flight Dynamics group in Shuttle Mission Control is investigating new user interfaces in a project called RIOTS [RIOTS 2000]. Traditionally, the individual functions of hardware and software guide the design of displays, which results in an aggregated, if not integrated, interface. The human work system has then been designed and trained to navigate, operate and integrate the processors and displays. The aim of RIOTS is to reduce the cognitive demands on the flight controllers by redesigning the user interface to support the work of the flight controller. This document supports the RIOTS project by defining a preliminary data model for Orbital Flight Dynamics. Section 2 defines an information-centric perspective. An information-centric approach aims to reduce the cognitive workload of the flight controllers by reducing the need for manual integration of information across processors and displays. Section 3 describes the Orbital Flight Dynamics domain. Section 4 defines the preliminary data model for Orbital Flight Dynamics. Section 5 examines the implications of mapping the data model to Orbital Flight Dynamics' current information systems. Two recurring patterns are identified in the Orbital Flight Dynamics work: the iteration/rework cycle and the decision-making/information integration/mirroring role relationship. Section 6 identifies new requirements on Orbital Flight Dynamics work and makes recommendations based on changing the information environment, changing the implementation of the data model, and changing the two recurring patterns.
Consideration of an Applied Model of Public Health Program Infrastructure
Lavinghouze, Rene; Snyder, Kimberly; Rieker, Patricia; Ottoson, Judith
2015-01-01
Systemic infrastructure is key to public health achievements. Individual public health program infrastructure feeds into this larger system. Although program infrastructure is rarely defined, it needs to be operationalized for effective implementation and evaluation. The Ecological Model of Infrastructure (EMI) is one approach to defining program infrastructure. The EMI consists of 5 core (Leadership, Partnerships, State Plans, Engaged Data, and Managed Resources) and 2 supporting (Strategic Understanding and Tactical Action) elements that are enveloped in a program’s context. We conducted a literature search across public health programs to determine support for the EMI. Four of the core elements were consistently addressed, and the other EMI elements were intermittently addressed. The EMI provides an initial and partial model for understanding program infrastructure, but additional work is needed to identify evidence-based indicators of infrastructure elements that can be used to measure success and link infrastructure to public health outcomes, capacity, and sustainability. PMID:23411417
Propulsion System Models for Rotorcraft Conceptual Design
NASA Technical Reports Server (NTRS)
Johnson, Wayne
2014-01-01
The conceptual design code NDARC (NASA Design and Analysis of Rotorcraft) was initially implemented to model conventional rotorcraft propulsion systems, consisting of turboshaft engines burning jet fuel, connected to one or more rotors through a mechanical transmission. The NDARC propulsion system representation has been extended to cover additional propulsion concepts, including electric motors and generators, rotor reaction drive, turbojet and turbofan engines, fuel cells and solar cells, batteries, and fuel (energy) used without weight change. The paper describes these propulsion system components, the architecture of their implementation in NDARC, and the form of the models for performance and weight. Requirements are defined for improved performance and weight models of the new propulsion system components. With these new propulsion models, NDARC can be used to develop environmentally-friendly rotorcraft designs.
Model authoring system for fail safe analysis
NASA Technical Reports Server (NTRS)
Sikora, Scott E.
1990-01-01
The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Utilizing established artificial intelligence and expert system techniques, the circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which then are searched to generate either a graphical fault tree analysis or failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.
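The rule-driven generation of a fault tree from a circuit model, as described above, can be sketched as a backward search over IF-THEN failure rules. The components and rules below are hypothetical and do not come from the Model Authoring System itself.

# Sketch: derive a fault tree by backward-chaining over IF-THEN failure rules.
# Each rule: undesired effect <- OR over causes; a cause with several
# conditions is an AND of those conditions.
rules = {
    "no_output_signal": [["amplifier_failed"], ["power_loss"]],
    "power_loss": [["fuse_open"], ["regulator_failed", "battery_depleted"]],
}

def fault_tree(event, depth=0):
    """Print an indented OR/AND tree rooted at the undesired top event."""
    print("  " * depth + ("OR  " if event in rules else "LEAF ") + event)
    for cause in rules.get(event, []):
        if len(cause) > 1:
            print("  " * (depth + 1) + "AND")
        for condition in cause:
            fault_tree(condition, depth + 2)

fault_tree("no_output_signal")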
TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greenwood, Michael S; Cetiner, Mustafa S; Fugate, David L
Existing development tools for early stage design and scoping of energy systems are often time-consuming to use, proprietary, and do not contain the necessary function to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The Modelica programming language based TRANSFORM tool (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.
International Space Station (ISS) Meteoroid/Orbital Debris Shielding
NASA Technical Reports Server (NTRS)
Christiansen, Eric L.
1999-01-01
Design practices to provide protection for International Space Station (ISS) crew and critical equipment from meteoroid and orbital debris (M/OD) impacts have been developed. Damage modes and failure criteria are defined for each spacecraft system. Hypervelocity impact tests and analyses are used to develop ballistic limit equations (BLEs) for each exposed spacecraft system. BLEs define impact particle sizes that result in threshold failure of a particular spacecraft system as a function of impact velocity, angle and particle density. The BUMPER computer code is used to determine the probability of no penetration (PNP) of the spacecraft shielding based on NASA standard meteoroid/debris models, a spacecraft geometry model, and the BLEs. BUMPER results are used to verify spacecraft shielding requirements. Low-weight, high-performance shielding alternatives have been developed at the NASA Johnson Space Center (JSC) Hypervelocity Impact Technology Facility (HITF) to meet spacecraft protection requirements.
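In generic form, the probability of no penetration for a shielded surface is often evaluated from the expected number of penetrating impacts under a Poisson assumption. The sketch below uses that generic relation together with an invented power-law ballistic limit and a made-up flux value; it is not the BUMPER code and not NASA's BLEs.

import math

def critical_diameter(velocity_kms, angle_deg, k=0.3, p=0.6):
    """Invented power-law ballistic limit: particle diameter (cm) at shield failure."""
    return k * (velocity_kms * math.cos(math.radians(angle_deg))) ** -p

def prob_no_penetration(flux_per_m2_yr, area_m2, years):
    """Poisson PNP given the flux of particles exceeding the critical diameter."""
    expected_penetrations = flux_per_m2_yr * area_m2 * years
    return math.exp(-expected_penetrations)

d_crit = critical_diameter(velocity_kms=9.0, angle_deg=45.0)
# The flux of particles larger than d_crit would come from the meteoroid/debris
# environment models; a made-up value is used here.
print(round(d_crit, 3), round(prob_no_penetration(1.0e-6, 300.0, 10.0), 4))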
Reference equations of motion for automatic rendezvous and capture
NASA Technical Reports Server (NTRS)
Henderson, David M.
1992-01-01
The analysis presented in this paper defines the reference coordinate frames, equations of motion, and control parameters necessary to model the relative motion and attitude of spacecraft in close proximity with another space system during the Automatic Rendezvous and Capture phase of an on-orbit operation. The relative docking port target position vector and the attitude control matrix are defined based upon an arbitrary spacecraft design. These translation and rotation control parameters could be used to drive the error signal input to the vehicle flight control system. Measurements for these control parameters would become the bases for an autopilot or feedback control system (FCS) design for a specific spacecraft.
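The translation and rotation control parameters described above, a docking-port target position vector and an attitude error matrix, can be sketched with elementary rigid-body relations. The frames, rotation matrices, and numbers below are illustrative assumptions, not the paper's derivation.

import numpy as np

def control_errors(r_chaser, r_target_port, R_chaser, R_target):
    """Relative docking-port position in the chaser body frame and the attitude
    error matrix between chaser and target body frames (illustrative only)."""
    pos_error_body = R_chaser.T @ (r_target_port - r_chaser)  # inertial -> chaser body
    attitude_error = R_chaser.T @ R_target                    # identity when aligned
    return pos_error_body, attitude_error

# Hypothetical inertial positions (m) and direction cosine matrices.
r_chaser = np.array([7000e3, 10.0, -5.0])
r_port = np.array([7000e3, 0.0, 0.0])
R_chaser = np.eye(3)
theta = np.radians(2.0)                       # small relative yaw of the target
R_target = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                     [np.sin(theta),  np.cos(theta), 0.0],
                     [0.0, 0.0, 1.0]])
pos_err, att_err = control_errors(r_chaser, r_port, R_chaser, R_target)
print(pos_err, np.degrees(np.arccos((np.trace(att_err) - 1.0) / 2.0)))  # error angle ~2 deg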
An Interpreted Language and System for the Visualization of Unstructured Meshes
NASA Technical Reports Server (NTRS)
Moran, Patrick J.; Gerald-Yamasaki, Michael (Technical Monitor)
1998-01-01
We present an interpreted language and system supporting the visualization of unstructured meshes and the manipulation of shapes defined in terms of mesh subsets. The language features primitives inspired by geometric modeling, mathematical morphology and algebraic topology. The adaptation of the topology ideas to an interpreted environment, along with support for programming constructs such as user function definition, provide a flexible system for analyzing a mesh and for calculating with shapes defined in terms of the mesh. We present results demonstrating some of the capabilities of the language, based on an implementation called the Shape Calculator, for tetrahedral meshes in R^3.
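In its simplest form, calculating with shapes defined as mesh subsets reduces to set algebra on cell identifiers plus topological operators such as a boundary. The tiny tetrahedral mesh and the face-counting boundary operator below are illustrative assumptions, not the Shape Calculator's primitives.

from itertools import combinations
from collections import Counter

# Two tetrahedra sharing a face (hypothetical mesh): cell id -> vertex ids.
mesh = {0: (0, 1, 2, 3), 1: (1, 2, 3, 4)}

def boundary_faces(cell_ids):
    """Faces that belong to exactly one selected tetrahedron (the shape's boundary)."""
    counts = Counter()
    for cid in cell_ids:
        for face in combinations(sorted(mesh[cid]), 3):
            counts[face] += 1
    return {face for face, n in counts.items() if n == 1}

shape_a, shape_b = {0}, {0, 1}
print(len(boundary_faces(shape_a)))          # 4 faces for a single tetrahedron
print(len(boundary_faces(shape_b)))          # 6: the shared face is interior
print(shape_a & shape_b, shape_a | shape_b)  # set algebra on shapes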
Estimating and validating harvesting system production through computer simulation
John E. Baumgras; Curt C. Hassler; Chris B. LeDoux
1993-01-01
A Ground Based Harvesting System Simulation model (GB-SIM) has been developed to estimate stump-to-truck production rates and multiproduct yields for conventional ground-based timber harvesting systems in Appalachian hardwood stands. Simulation results reflect inputs that define harvest site and timber stand attributes, wood utilization options, and key attributes of...
Life support system cost study: Addendum to cost analysis of carbon dioxide concentrators
NASA Technical Reports Server (NTRS)
Yakut, M. M.
1973-01-01
New cost data are presented for the Hydrogen-Depolarized Carbon Dioxide Concentrator (HDC), based on modifying the concentrator to delete the quick disconnect valves and filters included in the system model defined in MDC-G4631. System description, cost data and a comparison between CO2 concentrator costs are presented.
Psychological Sources of Systematic Rejection Among White and Black Adolescents.
ERIC Educational Resources Information Center
Long, Samuel
In this study, individual-oriented and system-oriented models of systemic rejection among white and black adolescents are investigated. Systemic rejection is defined as attitudes of political alienation and political violence justification. Twelve hypotheses were generated and tested using survey data collected in May 1976 from a random sample of…
Internal Quality Assurance Systems: "Tailor Made" or "One Size Fits All" Implementation?
ERIC Educational Resources Information Center
Cardoso, Sónia; Rosa, Maria J.; Videira, Pedro; Amaral, Alberto
2017-01-01
Purpose: This paper aims to look at the characteristics of internal quality assurance (IQA) systems of higher education institutions to understand whether these systems tend to reproduce a given model, externally defined and suggested to institutions, or rather to be shaped by institutions' features and interests. Design/methodology/approach: The…
Application of optimization technique for flood damage modeling in river system
NASA Astrophysics Data System (ADS)
Barman, Sangita Deb; Choudhury, Parthasarathi
2018-04-01
A river system is defined as a network of channels that drains different parts of a basin, uniting downstream to form a common outflow. Applying the various models found in the literature to a river system having multiple upstream inflows is not always straightforward and can involve a lengthy procedure, and with the non-availability of data sets, model calibration and application may become difficult. In the case of a river system, the flow modeling can be simplified to a large extent if the channel network is replaced by an equivalent single channel. In the present work, optimization model formulations based on equivalent flow are proposed, and a mixed-integer-programming-based pre-emptive goal programming model is applied to evaluate flood control alternatives for a real-life river system in India.
NASA Astrophysics Data System (ADS)
Duane, Gregory S.; Grabow, Carsten; Selten, Frank; Ghil, Michael
2017-12-01
The synchronization of loosely coupled chaotic systems has increasingly found applications to large networks of differential equations and to models of continuous media. These applications are at the core of the present Focus Issue. Synchronization between a system and its model, based on limited observations, gives a new perspective on data assimilation. Synchronization among different models of the same system defines a supermodel that can achieve partial consensus among models that otherwise disagree in several respects. Finally, novel methods of time series analysis permit a better description of synchronization in a system that is only observed partially and for a relatively short time. This Focus Issue discusses synchronization in extended systems or in components thereof, with particular attention to data assimilation, supermodeling, and their applications to various areas, from climate modeling to macroeconomics.
Duane, Gregory S; Grabow, Carsten; Selten, Frank; Ghil, Michael
2017-12-01
The synchronization of loosely coupled chaotic systems has increasingly found applications to large networks of differential equations and to models of continuous media. These applications are at the core of the present Focus Issue. Synchronization between a system and its model, based on limited observations, gives a new perspective on data assimilation. Synchronization among different models of the same system defines a supermodel that can achieve partial consensus among models that otherwise disagree in several respects. Finally, novel methods of time series analysis permit a better description of synchronization in a system that is only observed partially and for a relatively short time. This Focus Issue discusses synchronization in extended systems or in components thereof, with particular attention to data assimilation, supermodeling, and their applications to various areas, from climate modeling to macroeconomics.
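One-way ("nudging") coupling of two otherwise identical chaotic systems is the basic mechanism behind the synchronization view of data assimilation mentioned above. The sketch below couples two Lorenz systems through the x-component only; the coupling constant, time step, and initial conditions are arbitrary choices for illustration.

import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def step(truth, model, k=20.0, dt=0.001):
    """Euler step: the model is nudged toward the 'observed' x-component of the truth."""
    coupling = np.array([k * (truth[0] - model[0]), 0.0, 0.0])
    return truth + dt * lorenz(truth), model + dt * (lorenz(model) + coupling)

truth = np.array([1.0, 1.0, 1.0])
model = np.array([8.0, -3.0, 25.0])           # very different initial state
for _ in range(30000):
    truth, model = step(truth, model)
# The state difference should have shrunk by orders of magnitude once synchronized.
print(np.linalg.norm(truth - model))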
Compression and release dynamics of an active matter system of Euglena gracilis
NASA Astrophysics Data System (ADS)
Lam, Amy; Tsang, Alan C. H.; Ouellette, Nicholas; Riedel-Kruse, Ingmar
Active matter, defined as ensembles of self-propelled particles, encompasses a large variety of systems at all scales, from nanoparticles to bird flocks. Though various models and simulations have been created to describe the dynamics of these systems, experimental verification has been difficult to obtain. This is frequently due to the complex interaction rules which govern the particle behavior, in turn making systematic variation of parameters impossible. Here, we propose a model for predicting the system evolution of compression and release of an active system based on experiments and simulations. In particular, we consider ensembles of the unicellular, photo-responsive alga Euglena gracilis under light stimulation. By varying the spatiotemporal light patterns, we are able to finely adjust cell densities and achieve arbitrary non-homogeneous distributions, including compression into high-density aggregates of varying geometries. We observe the formation of depletion zones after the release of the confining stimulus and investigate the effects of the density distribution and particle rotational noise on the depletion. These results have implications for defining state parameters which determine system evolution.
A FINITE-DIFFERENCE, DISCRETE-WAVENUMBER METHOD FOR CALCULATING RADAR TRACES
A hybrid of the finite-difference method and the discrete-wavenumber method is developed to calculate radar traces. The method is based on a three-dimensional model defined in the Cartesian coordinate system; the electromagnetic properties of the model are symmetric with respect ...
A Chemical Properties Simulator to Support Integrated Environmental Modeling (proceeding)
Users of Integrated Environmental Modeling (IEM) systems are responsible for defining individual chemicals and their properties, a process that is time-consuming at best and overwhelming at worst, especially for new chemicals with new structures. A software tool is needed to allo...
Measuring the ROI on Knowledge Management Systems.
ERIC Educational Resources Information Center
Wickhorst, Vickie
2002-01-01
Defines knowledge management and corporate portals and provides a model that can be applied to assessing return on investment (ROI) for a knowledge management solution. Highlights include leveraging knowledge in an organization; assessing the value of human capital; and the Intellectual Capital Performance Measurement Model. (LRW)
A Model for Implementing a Career Education System.
ERIC Educational Resources Information Center
Ryan, T. Antoinette
The model for career education implementation defines three major functions which constitute the essential elements in the implementation process: planning, implementation, and evaluation. Emphasis is placed on the interrelatedness of implementation to both planning and evaluation of career education. The 11 subsystems involved in implementing…
NASA Astrophysics Data System (ADS)
Sato, K. Y.; Tomko, D. L.; Levine, H. G.; Quincy, C. D.; Rayl, N. A.; Sowa, M. B.; Taylor, E. M.; Sun, S. C.; Kundrot, C. E.
2018-02-01
Model organisms are foundational for conducting physiological and systems biology research to define how life responds to the deep space environment. The organisms, areas of research, and Deep Space Gateway capabilities needed will be presented.
Space Generic Open Avionics Architecture (SGOAA) reference model technical guide
NASA Technical Reports Server (NTRS)
Wray, Richard B.; Stovall, John R.
1993-01-01
This report presents a full description of the Space Generic Open Avionics Architecture (SGOAA). The SGOAA consists of a generic system architecture for the entities in spacecraft avionics, a generic processing architecture, and a six class model of interfaces in a hardware/software system. The purpose of the SGOAA is to provide an umbrella set of requirements for applying the generic architecture interface model to the design of specific avionics hardware/software systems. The SGOAA defines a generic set of system interface points to facilitate identification of critical interfaces and establishes the requirements for applying appropriate low level detailed implementation standards to those interface points. The generic core avionics system and processing architecture models provided herein are robustly tailorable to specific system applications and provide a platform upon which the interface model is to be applied.
An expert system for municipal solid waste management simulation analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hsieh, M.C.; Chang, N.B.
1996-12-31
Optimization techniques have usually been used to model complicated metropolitan solid waste management systems in order to search for the best dynamic combination of waste recycling, facility siting, and system operation, and they require sophisticated, well-defined interrelationships in the modeling process. This paper instead applies Concurrent Object-Oriented Simulation (COOS), a new simulation software construction method, to bridge the gap between the physical system and its computer representation. A case study of the Kaohsiung solid waste management system in Taiwan illustrates the analytical methodology of COOS and its implementation in the creation of an expert system.
Contemporary approaches to neural circuit manipulation and mapping: focus on reward and addiction
Saunders, Benjamin T.; Richard, Jocelyn M.; Janak, Patricia H.
2015-01-01
Tying complex psychological processes to precisely defined neural circuits is a major goal of systems and behavioural neuroscience. This is critical for understanding adaptive behaviour, and also how neural systems are altered in states of psychopathology, such as addiction. Efforts to relate psychological processes relevant to addiction to activity within defined neural circuits have been complicated by neural heterogeneity. Recent advances in technology allow for manipulation and mapping of genetically and anatomically defined neurons, which when used in concert with sophisticated behavioural models, have the potential to provide great insight into neural circuit bases of behaviour. Here we discuss contemporary approaches for understanding reward and addiction, with a focus on midbrain dopamine and cortico-striato-pallidal circuits. PMID:26240425
Performability modeling based on real data: A case study
NASA Technical Reports Server (NTRS)
Hsueh, M. C.; Iyer, R. K.; Trivedi, K. S.
1988-01-01
Described is a measurement-based performability model based on error and resource usage data collected on a multiprocessor system. A method for identifying the model structure is introduced and the resulting model is validated against real data. Model development from the collection of raw data to the estimation of the expected reward is described. Both normal and error behavior of the system are characterized. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different types of errors.
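As a rough illustration of the semi-Markov idea described above, the sketch below simulates a three-state model with non-exponential (Weibull) holding times and accumulates a state-dependent reward; the states, transition probabilities, holding-time parameters and reward rates are invented, not the measured values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical semi-Markov model: states, embedded transition probabilities,
# non-exponential (Weibull) holding times, and a reward rate per state.
states = ["normal", "degraded", "error"]
P = np.array([[0.0, 0.7, 0.3],     # embedded chain: row i -> next-state probabilities
              [0.6, 0.0, 0.4],
              [0.9, 0.1, 0.0]])
weibull_shape = {"normal": 1.8, "degraded": 1.3, "error": 0.9}   # shape != 1 => not exponential
weibull_scale = {"normal": 50.0, "degraded": 10.0, "error": 2.0}
reward_rate = {"normal": 1.0, "degraded": 0.5, "error": 0.0}     # service delivered per unit time

def simulate_reward(horizon=1000.0, n_runs=2000):
    """Monte Carlo estimate of the expected accumulated reward over a time horizon."""
    totals = []
    for _ in range(n_runs):
        t, s, reward = 0.0, 0, 0.0
        while t < horizon:
            name = states[s]
            hold = weibull_scale[name] * rng.weibull(weibull_shape[name])
            hold = min(hold, horizon - t)
            reward += reward_rate[name] * hold
            t += hold
            s = rng.choice(len(states), p=P[s])
        totals.append(reward)
    return np.mean(totals)

print("expected reward:", simulate_reward())
```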
Performability modeling based on real data: A case study
NASA Technical Reports Server (NTRS)
Hsueh, M. C.; Iyer, R. K.; Trivedi, K. S.
1987-01-01
Described is a measurement-based performability model based on error and resource usage data collected on a multiprocessor system. A method for identifying the model structure is introduced and the resulting model is validated against real data. Model development from the collection of raw data to the estimation of the expected reward is described. Both normal and error behavior of the system are characterized. The measured data show that the holding times in key operational and error states are not simple exponentials and that a semi-Markov process is necessary to model the system behavior. A reward function, based on the service rate and the error rate in each state, is then defined in order to estimate the performability of the system and to depict the cost of different types of errors.
Investigations of respiratory control systems simulation
NASA Technical Reports Server (NTRS)
Gallagher, R. R.
1973-01-01
The Grodins' respiratory control model was investigated and it was determined that the following modifications were necessary before the model would be adaptable for current research efforts: (1) the controller equation must be modified to allow for integration of the respiratory system model with other physiological systems; (2) the system must be more closely correlated to the salient physiological functionings; (3) the respiratory frequency and the heart rate should be expanded to illustrate other physiological relationships and dependencies; and (4) the model should be adapted to particular individuals through a better defined set of initial parameter values in addition to relating these parameter values to the desired environmental conditions. Several of Milhorn's respiratory control models were also investigated in hopes of using some of their features as modifications for Grodins' model.
Modeling The Frontal Collision In Vehicles And Determining The Degree Of Injury On The Driver
NASA Astrophysics Data System (ADS)
Oţăt, Oana Victoria
2015-09-01
The present research study aims at analysing the kinematic and dynamic behaviour of the vehicle's driver in a frontal collision. Hence, a subsequent objective of the research paper is to establish the degree of injury suffered by the driver. In order to achieve the objectives set, we first had to define the type of dummy placed in the driver's position and then to design the three-element assembly, i.e. the chair-steering wheel-dashboard assembly. Based on this model, the following step focused on the positioning of the dummy, which also included defining the contacts between the components of the dummy and the seat elements. To model behaviour that accurately reflects the driver's movements in a frontal collision, passive safety systems, namely the seatbelt and the frontal airbag, have also been defined and simulated.
NASA Astrophysics Data System (ADS)
Glavev, Victor
2016-12-01
The types of software applications used by public administrations can be divided into three main groups: document management systems, record management systems and business process systems. Each of them generates outputs that can be used as input data by the others. This is the main reason why data must be exchanged between these three groups and why well-defined models should be followed; other reasons are discussed in the paper. Interoperability is a key aspect when those models are implemented, especially when systems from different manufacturers are used in the area of software applications for public authorities. The report includes examples of implemented models for data exchange between software systems deployed in one of the biggest administrations in Bulgaria.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Honrubia-Escribano, A.; Jimenez-Buendia, F.; Molina-Garcia, A.
This paper presents the current status of simplified wind turbine models used for power system stability analysis. This work is based on ongoing work within IEC 61400-27. This international standard, for which a technical committee was convened in October 2009, is focused on defining generic (also known as simplified) simulation models for both wind turbines and wind power plants. The results of the paper provide an improved understanding of the usability of generic models to conduct power system simulations.
2012-01-01
our own work for this discussion. DoD Instruction 5000.61 defines model validation as “the process of determining the degree to which a model and its... determined that RMAT is highly concrete code, potentially leading to redundancies in the code itself and making RMAT more difficult to maintain...system conceptual models valid, and are the data used to support them adequate? (Chapters Two and Three) 2. Are the sources and methods for populating
Statistical physics of the symmetric group.
Williams, Mobolaji
2017-04-01
Ordered chains (such as chains of amino acids) are ubiquitous in biological cells, and these chains perform specific functions contingent on the sequence of their components. Using the existence and general properties of such sequences as a theoretical motivation, we study the statistical physics of systems whose state space is defined by the possible permutations of an ordered list, i.e., the symmetric group, and whose energy is a function of how certain permutations deviate from some chosen correct ordering. Such a nonfactorizable state space is quite different from the state spaces typically considered in statistical physics systems and consequently has novel behavior in systems with interacting and even noninteracting Hamiltonians. Various parameter choices of a mean-field model reveal the system to contain five different physical regimes defined by two transition temperatures, a triple point, and a quadruple point. Finally, we conclude by discussing how the general analysis can be extended to state spaces with more complex combinatorial properties and to other standard questions of statistical mechanics models.
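A minimal sketch of the kind of system described here: Metropolis sampling over permutations where the energy penalises deviation from a chosen correct ordering. The specific energy function, system size and temperatures are arbitrary illustrative choices, not the mean-field model analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def energy(perm):
    """Energy grows with deviation from the 'correct' ordering 0,1,...,N-1
    (a simple noninteracting choice: the sum of site displacements)."""
    return np.sum(np.abs(perm - np.arange(perm.size)))

def metropolis(n=20, beta=0.5, steps=50000):
    """Sample permutations of n elements with weight exp(-beta * energy)."""
    perm = rng.permutation(n)
    e = energy(perm)
    energies = []
    for _ in range(steps):
        i, j = rng.integers(n, size=2)           # propose a random transposition
        perm[i], perm[j] = perm[j], perm[i]
        e_new = energy(perm)
        if e_new <= e or rng.random() < np.exp(-beta * (e_new - e)):
            e = e_new                            # accept the move
        else:
            perm[i], perm[j] = perm[j], perm[i]  # reject: undo the swap
        energies.append(e)
    return np.mean(energies[steps // 2:])        # mean energy after burn-in

for beta in (0.1, 0.5, 2.0):
    print(beta, metropolis(beta=beta))
```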
Statistical physics of the symmetric group
NASA Astrophysics Data System (ADS)
Williams, Mobolaji
2017-04-01
Ordered chains (such as chains of amino acids) are ubiquitous in biological cells, and these chains perform specific functions contingent on the sequence of their components. Using the existence and general properties of such sequences as a theoretical motivation, we study the statistical physics of systems whose state space is defined by the possible permutations of an ordered list, i.e., the symmetric group, and whose energy is a function of how certain permutations deviate from some chosen correct ordering. Such a nonfactorizable state space is quite different from the state spaces typically considered in statistical physics systems and consequently has novel behavior in systems with interacting and even noninteracting Hamiltonians. Various parameter choices of a mean-field model reveal the system to contain five different physical regimes defined by two transition temperatures, a triple point, and a quadruple point. Finally, we conclude by discussing how the general analysis can be extended to state spaces with more complex combinatorial properties and to other standard questions of statistical mechanics models.
Model-Based Fault Diagnosis: Performing Root Cause and Impact Analyses in Real Time
NASA Technical Reports Server (NTRS)
Figueroa, Jorge F.; Walker, Mark G.; Kapadia, Ravi; Morris, Jonathan
2012-01-01
Generic, object-oriented fault models, built according to causal-directed graph theory, have been integrated into an overall software architecture dedicated to monitoring and predicting the health of mission-critical systems. Processing over the generic fault models is triggered by event detection logic that is defined according to the specific functional requirements of the system and its components. Once triggered, the fault models provide an automated way of performing both upstream root cause analysis (RCA) and prediction of downstream effects, or impact analysis. The methodology has been applied to integrated system health management (ISHM) implementations at NASA SSC's Rocket Engine Test Stands (RETS).
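The upstream/downstream traversal described above can be illustrated with a toy causal-directed fault graph; the node names and the graph itself are hypothetical and are not taken from the ISHM implementation.

```python
# Hypothetical causal-directed fault graph: an edge A -> B means "a fault in A
# can cause the observed effect in B". Root cause analysis walks upstream from
# a triggered event; impact analysis walks downstream.
causes = {                      # child: set of possible upstream causes
    "low_thrust": {"valve_stuck", "low_tank_pressure"},
    "low_tank_pressure": {"pressurant_leak", "regulator_fault"},
    "valve_stuck": set(),
    "pressurant_leak": set(),
    "regulator_fault": set(),
}

def invert(graph):
    """Build the downstream (effects) view from the upstream (causes) view."""
    effects = {node: set() for node in graph}
    for child, parents in graph.items():
        for parent in parents:
            effects[parent].add(child)
    return effects

def closure(graph, start):
    """All nodes reachable from `start` (depth-first traversal)."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

# Event detection logic has flagged "low_thrust": list candidate root causes,
# then the downstream impact of one suspected fault.
print("root-cause candidates:", closure(causes, "low_thrust"))
print("impact of regulator_fault:", closure(invert(causes), "regulator_fault"))
```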
Pugliese, F; Albini, E; Serio, O; Apostoli, P
2011-01-01
The 81/2008 Act has defined a model of a health and safety management system that can contribute to preventing occupational health and safety risks. We have developed the structure of a health and safety management system model and the necessary tools for its implementation in health care facilities. The realization of the model is structured in various phases: initial review, safety policy, planning, implementation, monitoring, management review and continuous improvement. Such a model, in continuous evolution, is based on the responsibilities of the different corporate roles and on an accurate analysis of the risks and the applicable norms.
A structural model decomposition framework for systems health management
NASA Astrophysics Data System (ADS)
Roychoudhury, I.; Daigle, M.; Bregon, A.; Pulido, B.
Systems health management (SHM) is an important set of technologies aimed at increasing system safety and reliability by detecting, isolating, and identifying faults; and predicting when the system reaches end of life (EOL), so that appropriate fault mitigation and recovery actions can be taken. Model-based SHM approaches typically make use of global, monolithic system models for online analysis, which results in a loss of scalability and efficiency for large-scale systems. Improvement in scalability and efficiency can be achieved by decomposing the system model into smaller local submodels and operating on these submodels instead. In this paper, the global system model is analyzed offline and structurally decomposed into local submodels. We define a common model decomposition framework for extracting submodels from the global model. This framework is then used to develop algorithms for solving model decomposition problems for the design of three separate SHM technologies, namely, estimation (which is useful for fault detection and identification), fault isolation, and EOL prediction. We solve these model decomposition problems using a three-tank system as a case study.
A Structural Model Decomposition Framework for Systems Health Management
NASA Technical Reports Server (NTRS)
Roychoudhury, Indranil; Daigle, Matthew J.; Bregon, Anibal; Pulido, Belamino
2013-01-01
Systems health management (SHM) is an important set of technologies aimed at increasing system safety and reliability by detecting, isolating, and identifying faults; and predicting when the system reaches end of life (EOL), so that appropriate fault mitigation and recovery actions can be taken. Model-based SHM approaches typically make use of global, monolithic system models for online analysis, which results in a loss of scalability and efficiency for large-scale systems. Improvement in scalability and efficiency can be achieved by decomposing the system model into smaller local submodels and operating on these submodels instead. In this paper, the global system model is analyzed offline and structurally decomposed into local submodels. We define a common model decomposition framework for extracting submodels from the global model. This framework is then used to develop algorithms for solving model decomposition problems for the design of three separate SHM technologies, namely, estimation (which is useful for fault detection and identification), fault isolation, and EOL prediction. We solve these model decomposition problems using a three-tank system as a case study.
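A highly simplified sketch of the three-tank case study and of the decomposition idea: the global model couples all three levels, while a local submodel for one tank replaces its coupling term with a measured value so it can be run independently. The equations and coefficients are linearised placeholders, not the formulation used in the paper.

```python
import numpy as np

# Hypothetical three connected tanks: dh_i/dt depends on inflow, outflow, and
# the flows between neighbouring tanks (linearised for simplicity).
k12, k23, k_out, area = 0.4, 0.3, 0.2, 1.0

def global_model(h, u):
    """Monolithic model of all three tank levels."""
    h1, h2, h3 = h
    q12 = k12 * (h1 - h2)
    q23 = k23 * (h2 - h3)
    return np.array([(u - q12) / area,
                     (q12 - q23) / area,
                     (q23 - k_out * h3) / area])

def submodel_tank1(h1, u, h2_measured):
    """Local submodel for tank 1 only: the coupling to tank 2 is replaced by a
    measured (or estimated) level, so the submodel can run on its own."""
    return (u - k12 * (h1 - h2_measured)) / area

# Compare the global model against the decomposed submodel on the same input.
dt, h, h1_local = 0.1, np.array([2.0, 1.0, 0.5]), 2.0
for _ in range(200):
    h = h + dt * global_model(h, u=0.5)
    h1_local = h1_local + dt * submodel_tank1(h1_local, u=0.5, h2_measured=h[1])
print("global h1:", round(h[0], 3), "submodel h1:", round(h1_local, 3))
```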
Defining Support Requirements During Conceptual Design of Reusable Launch Vehicles
NASA Technical Reports Server (NTRS)
Morris, W. D.; White, N. H.; Davis, W. T.; Ebeling, C. E.
1995-01-01
Current methods for defining the operational support requirements of new systems are data intensive and require significant design information. Methods are being developed to aid in the analysis process of defining support requirements for new launch vehicles during their conceptual design phase that work with the level of information available during this phase. These methods will provide support assessments based on the vehicle design and the operating scenarios. The results can be used both to define expected support requirements for new launch vehicle designs and to help evaluate the benefits of using new technologies. This paper describes the models, their current status, and provides examples of their use.
NASA Astrophysics Data System (ADS)
Caton, R. G.; Colman, J. J.; Parris, R. T.; Nickish, L.; Bullock, G.
2017-12-01
The Air Force Research Laboratory, in collaboration with NorthWest Research Associates, is developing advanced software capabilities for high fidelity simulations of high frequency (HF) sky wave propagation and performance analysis of HF systems. Based on the HiCIRF (High-frequency Channel Impulse Response Function) platform [Nickisch et al., doi:10.1029/2011RS004928], the new Air Force Coverage Analysis Program (AFCAP) provides the modular capabilities necessary for a comprehensive sensitivity study of the large number of variables which define simulations of HF propagation modes. In this paper, we report on an initial exercise of AFCAP to analyze the sensitivities of the tool to various environmental and geophysical parameters. Through examination of the channel scattering function and amplitude-range-Doppler output on two-way propagation paths with injected target signals, we compare simulated returns over a range of geophysical conditions as well as varying definitions for environmental noise, meteor clutter, and sea state models for Bragg backscatter. We also investigate the impacts of including clutter effects due to field-aligned backscatter from small-scale ionization structures at varied levels of severity as defined by the climatological WideBand Model (WBMOD). In the absence of additional user-provided information, AFCAP relies on the International Reference Ionosphere (IRI) model to define the ionospheric state for use in 2D ray tracing algorithms. Because the AFCAP architecture includes the option for insertion of a user-defined gridded ionospheric representation, we compare output from the tool using the IRI and ionospheric definitions from assimilative models such as GPSII (GPS Ionospheric Inversion).
Development of modelling algorithm of technological systems by statistical tests
NASA Astrophysics Data System (ADS)
Shemshura, E. A.; Otrokov, A. V.; Chernyh, V. G.
2018-03-01
The paper tackles the problem of economic assessment of design efficiency for various technological systems at the stage of their operation. The modelling algorithm for a technological system, implemented using statistical tests and taking the reliability index into account, allows estimating the level of technical excellence of the machinery and defining the efficiency of design reliability against its performance. The economic feasibility of its application is determined on the basis of the service quality of a technological system, with further forecasting of the volumes and range of spare parts supply.
Bionic models for identification of biological systems
NASA Astrophysics Data System (ADS)
Gerget, O. M.
2017-01-01
This article proposes a clinical decision support system that processes biomedical data. For this purpose a bionic model has been designed based on neural networks, genetic algorithms and immune systems. The developed system has been tested on data from pregnant women. The paper focuses on an approach that enables the selection of control actions that can minimize the risk of an adverse outcome. The control actions (hyperparameters of a new type) are further used as an additional input signal; their values are defined by a hyperparameter optimization method. Software developed in Python is briefly described.
Hybrid and electric advanced vehicle systems (heavy) simulation
NASA Technical Reports Server (NTRS)
Hammond, R. A.; Mcgehee, R. K.
1981-01-01
A computer program to simulate hybrid and electric advanced vehicle systems (HEAVY) is described. It is intended for use early in the design process: concept evaluation, alternative comparison, preliminary design, control and management strategy development, component sizing, and sensitivity studies. It allows the designer to quickly, conveniently, and economically predict the performance of a proposed drive train. The user defines the system to be simulated using a library of predefined component models that may be connected to represent a wide variety of propulsion systems. The development of three such models is discussed as an example.
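In the spirit of the component-library approach described above, the following sketch assembles a drive train from simple predefined component models and steps it over a constant-speed cycle; the component classes, parameters and road-load formula are invented for illustration and are not the HEAVY models.

```python
# Illustrative component library in the spirit of a drive-train simulator: the
# user assembles predefined component models and steps them over a drive cycle.
class Battery:
    def __init__(self, capacity_wh):
        self.energy_wh = capacity_wh
    def draw(self, power_w, dt_s):
        self.energy_wh -= power_w * dt_s / 3600.0
        return self.energy_wh > 0.0

class Motor:
    def __init__(self, efficiency=0.9):
        self.efficiency = efficiency
    def electrical_power(self, mech_power_w):
        return mech_power_w / self.efficiency

class Vehicle:
    """A propulsion system assembled from library components."""
    def __init__(self, battery, motor, mass_kg):
        self.battery, self.motor, self.mass = battery, motor, mass_kg
    def step(self, speed_mps, accel_mps2, dt_s):
        drag = 0.4 * speed_mps ** 2                      # crude road-load model
        mech_power = (self.mass * accel_mps2 + drag) * speed_mps
        return self.battery.draw(self.motor.electrical_power(max(mech_power, 0.0)), dt_s)

ev = Vehicle(Battery(capacity_wh=40000), Motor(efficiency=0.92), mass_kg=1500)
ok = all(ev.step(speed_mps=15.0, accel_mps2=0.2, dt_s=1.0) for _ in range(3600))
print("battery lasted the cycle:", ok, "remaining Wh:", round(ev.battery.energy_wh))
```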
Maximized Gust Loads of a Closed-Loop, Nonlinear Aeroelastic System Using Nonlinear Systems Theory
NASA Technical Reports Server (NTRS)
Silva, Walter A.
1999-01-01
The problem of computing the maximized gust load for a nonlinear, closed-loop aeroelastic aircraft is discussed. The Volterra theory of nonlinear systems is applied in order to define a linearized system that provides a bound on the response of the nonlinear system of interest. The method is applied to a simplified model of an Airbus A310.
Computer system for definition of the quantitative geometry of musculature from CT images.
Daniel, Matej; Iglic, Ales; Kralj-Iglic, Veronika; Konvicková, Svatava
2005-02-01
A computer system for quantitative determination of musculoskeletal geometry from computed tomography (CT) images has been developed. The system processes series of CT images to obtain a three-dimensional (3D) model of bony structures in which the effective muscle fibres can be interactively defined. The presented computer system has a flexible modular structure and is also suitable for educational purposes.
National Information Exchange Model (NIEM): DoD Adoption and Implications for C2 (Briefing Charts)
2014-06-18
Briefing charts describing how an Information Exchange Specification (IES) provides the build-time description of the data to be exchanged between systems and applications, while an Information Exchange Package (IEP) is the data actually exchanged at runtime between data producers and data consumers.
Generic Sensor Failure Modeling for Cooperative Systems.
Jäger, Georg; Zug, Sebastian; Casimiro, António
2018-03-20
The advent of cooperative systems entails a dynamic composition of their components. As this contrasts with current, statically composed systems, new approaches for maintaining their safety are required. In that endeavor, we propose an integration step that evaluates the failure model of shared information in relation to an application's fault tolerance and thereby promises maintainability of such a system's safety. However, it also poses new requirements on failure models, which are not fulfilled by state-of-the-art approaches. Consequently, this work presents a mathematically defined generic failure model as well as a processing chain for automatically extracting such failure models from empirical data. By examining data of a Sharp GP2D12 distance sensor, we show that the generic failure model not only fulfills the predefined requirements, but also models failure characteristics appropriately when compared to traditional techniques.
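The extraction of a failure model from empirical data can be illustrated with a much simpler construction than the one proposed in the paper: bin the sensor's operating range and characterise the error (bias and spread) per bin. The calibration data below are simulated stand-ins, not real GP2D12 measurements.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in for empirical calibration data of a distance sensor:
# pairs of (true distance, sensor reading) with an error that grows with range.
true_d = rng.uniform(0.1, 0.8, size=5000)                      # metres
readings = true_d + rng.normal(0.0, 0.005 + 0.03 * true_d)

def empirical_failure_model(true_values, measured, n_bins=10):
    """Bin the operating range and report bias and spread of the error per bin,
    a simple distance-dependent failure characterisation."""
    edges = np.linspace(true_values.min(), true_values.max(), n_bins + 1)
    model = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (true_values >= lo) & (true_values < hi)
        err = measured[mask] - true_values[mask]
        model.append({"range": (round(lo, 2), round(hi, 2)),
                      "bias": float(err.mean()),
                      "sigma": float(err.std())})
    return model

for row in empirical_failure_model(true_d, readings):
    print(row)
```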
Use of Satellite Remote Sensing to Improve Coastal Hypoxia Prediction
We describe the use of Giovanni satellite remote sensing products in the development and testing of a new modeling system that represents the processes leading to hypoxia (defined as water O2 concentration < 63 mmol m-3) on the Louisiana continental shelf (LCS). The modeling ...
A FINITE-DIFFERENCE, DISCRETE-WAVENUMBER METHOD FOR CALCULATING RADAR TRACES
A hybrid of the finite-difference method and the discrete-wavenumber method is developed to calculate radar traces. The method is based on a three-dimensional model defined in the Cartesian coordinate system; the electromagnetic properties of the model are symmetric with respect...
Extended Relation Metadata for SCORM-Based Learning Content Management Systems
ERIC Educational Resources Information Center
Lu, Eric Jui-Lin; Horng, Gwoboa; Yu, Chia-Ssu; Chou, Ling-Ying
2010-01-01
To increase the interoperability and reusability of learning objects, Advanced Distributed Learning Initiative developed a model called Content Aggregation Model (CAM) to describe learning objects and express relationships between learning objects. However, the suggested relations defined in the CAM can only describe structure-oriented…
Equivalent Viscous Damping Methodologies Applied on VEGA Launch Vehicle Numerical Model
NASA Astrophysics Data System (ADS)
Bartoccini, D.; Di Trapani, C.; Fransen, S.
2014-06-01
Part of the mission analysis of a spacecraft is the so-called launcher-satellite coupled loads analysis, which aims at computing the dynamic environment of the satellite and of the launch vehicle for the most severe load cases in flight. Evidently the damping of the coupled system must be defined with care so as not to overestimate or underestimate the loads derived for the spacecraft. In this paper the application of several EqVD (Equivalent Viscous Damping) methodologies to Craig-Bampton (CB) systems is investigated. Based on the structural damping defined for the various materials in the parent FE-models of the CB-components, EqVD matrices can be computed according to different methodologies. The effect of these methodologies on the numerical reconstruction of the VEGA launch vehicle dynamic environment is presented.
Global stability and exact solution of an arbitrary-solute nonlinear cellular mass transport system.
Benson, James D
2014-12-01
The prediction of the cellular state as a function of extracellular concentrations and temperatures has been of interest to physiologists for nearly a century. One of the most widely used models in the field is one where mass flux is linearly proportional to the concentration difference across the membrane. These fluxes define a nonlinear differential equation system for the intracellular state, which, when coupled with appropriate initial conditions, defines the intracellular state as a function of the extracellular concentrations of both permeating and nonpermeating solutes. Here we take advantage of a reparametrization scheme to extend existing stability results to a more general setting and to develop analytical solutions to this model for an arbitrary number of extracellular solutes. Copyright © 2014 Elsevier Inc. All rights reserved.
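A minimal sketch of the kind of flux model described above, in which water and permeating-solute fluxes are driven by concentration differences across the membrane; the permeabilities, extracellular concentrations and normalisation are illustrative assumptions, not the paper's parametrisation or its analytical solution.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative flux model: water and solute fluxes are proportional to
# concentration differences across the membrane (coefficients are arbitrary).
Lp, Ps = 0.1, 0.05            # water and solute permeabilities (arbitrary units)
M_ext_n, M_ext_s = 0.6, 0.3   # extracellular nonpermeating / permeating osmolality

def rhs(t, y):
    w, s = y                   # intracellular water volume and moles of permeating solute
    m_in = (s + 0.3) / w       # intracellular osmolality (0.3 = fixed nonpermeating content)
    dw = -Lp * ((M_ext_n + M_ext_s) - m_in)   # water leaves when the outside is more concentrated
    ds = Ps * (M_ext_s - s / w)               # permeating solute follows its own gradient
    return [dw, ds]

sol = solve_ivp(rhs, t_span=(0.0, 50.0), y0=[1.0, 0.0], t_eval=np.linspace(0, 50, 6))
for t, w, s in zip(sol.t, sol.y[0], sol.y[1]):
    print(f"t={t:5.1f}  water volume={w:.3f}  permeating solute={s:.3f}")
```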
Development of a rotorcraft/propulsion dynamics interface analysis, volume 2
NASA Technical Reports Server (NTRS)
Hull, R.
1982-01-01
A study was conducted to establish a coupled rotor/propulsion analysis that would be applicable to a wide range of rotorcraft systems. The effort included the following tasks: (1) developing a model structure suitable for simulating a wide range of rotorcraft configurations; (2) defining a methodology for parameterizing the model structure to represent a particular rotorcraft; (3) constructing a nonlinear coupled rotor/propulsion model as a test case to use in analyzing coupled system dynamics; and (4) attempting to develop a mostly linear coupled model derived from the complete nonlinear simulations. Documentation of the computer models developed is presented.
NASA Technical Reports Server (NTRS)
Knezovich, F. M.
1976-01-01
A modular structured system of computer programs is presented utilizing earth and ocean dynamical data keyed to finitely defined parameters. The model is an assemblage of mathematical algorithms with an inherent capability of maturation with progressive improvements in observational data frequencies, accuracies and scopes. The EOM in its present state is a first-order approach to a geophysical model of the earth's dynamics.
Investigation of traveler acceptance factors in short haul air carrier operations
NASA Technical Reports Server (NTRS)
Kuhlthau, A. R.; Jacobson, I. D.
1972-01-01
The development of a mathematical model for human reaction to variables involved in transportation systems is discussed. The techniques, activities, and results related to defining certain specific inputs to the model are presented. A general schematic diagram of the problem solution is developed. The application of the model to short haul air carrier operations is examined.
NASA Astrophysics Data System (ADS)
Montero, J. T.; Lintz, H. E.; Sharp, D.
2013-12-01
Do emergent properties that result from models of complex systems match emergent properties from real systems? This question targets a type of uncertainty that we argue requires more attention in system modeling and validation efforts. We define an 'emergent property' to be an attribute or behavior of a modeled or real system that can be surprising or unpredictable and result from complex interactions among the components of a system. For example, thresholds are common across diverse systems and scales and can represent emergent system behavior that is difficult to predict. Thresholds or other types of emergent system behavior can be characterized by their geometry in state space (where state space is the space containing the set of all states of a dynamic system). One way to expedite our growing mechanistic understanding of how emergent properties emerge from complex systems is to compare the geometry of surfaces in state space between real and modeled systems. Here, we present an index (threshold strength) that can quantify a geometric attribute of a surface in state space. We operationally define threshold strength as how strongly a surface in state space resembles a step or an abrupt transition between two system states. First, we validated the index for application in greater than three dimensions of state space using simulated data. Then, we demonstrated application of the index in measuring geometric state space uncertainty between a real system and a deterministic, modeled system. In particular, we looked at geometric state space uncertainty between climate behavior in the 20th century and modeled climate behavior simulated by global climate models (GCMs) in the Coupled Model Intercomparison Project phase 5 (CMIP5). Surfaces from the climate models came from running the models over the same domain as the real data. We also created response surfaces from real climate data based on an empirical model that produces a geometric surface of predicted values in state space. We used a kernel regression method designed to capture the geometry of the real data pattern without imposing shape assumptions a priori on the data; this kernel regression method is known as Non-parametric Multiplicative Regression (NPMR). We found that quantifying and comparing a geometric attribute in more than three dimensions of state space can discern whether the emergent nature of complex interactions in modeled systems matches that of real systems. Further, this method has potentially wider application in contexts where searching for abrupt change or 'action' in any hyperspace is desired.
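As an illustration of what a step-likeness measure can look like, the sketch below scores a one-dimensional response curve by the fraction of its total range traversed within the steepest window of the predictor; this is a simplified stand-in, not the threshold strength index defined by the authors, which operates on surfaces in higher-dimensional state space.

```python
import numpy as np

def threshold_strength(x, y, window=0.1):
    """Illustrative step-likeness index for a 1-D response curve: the fraction
    of the total response range traversed within the steepest window of the
    predictor. Near 1.0 ~ a step; near `window` ~ a straight line."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    span = y.max() - y.min()
    best = 0.0
    for lo in x:
        mask = (x >= lo) & (x <= lo + window * (x.max() - x.min()))
        if mask.sum() > 1:
            best = max(best, y[mask].max() - y[mask].min())
    return best / span

x = np.linspace(0, 1, 400)
step_like = 1.0 / (1.0 + np.exp(-80 * (x - 0.5)))   # abrupt transition between two states
gradual = x                                         # smooth, linear response
print("step-like surface:", round(threshold_strength(x, step_like), 2))
print("gradual surface:  ", round(threshold_strength(x, gradual), 2))
```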
Semantic interoperability--HL7 Version 3 compared to advanced architecture standards.
Blobel, B G M E; Engel, K; Pharow, P
2006-01-01
To meet the challenge for high quality and efficient care, highly specialized and distributed healthcare establishments have to communicate and co-operate in a semantically interoperable way. Information and communication technology must be open, flexible, scalable, knowledge-based and service-oriented as well as secure and safe. For enabling semantic interoperability, a unified process for defining and implementing the architecture, i.e. structure and functions of the cooperating systems' components, as well as the approach for knowledge representation, i.e. the used information and its interpretation, algorithms, etc. have to be defined in a harmonized way. Deploying the Generic Component Model, systems and their components, underlying concepts and applied constraints must be formally modeled, strictly separating platform-independent from platform-specific models. As HL7 Version 3 claims to represent the most successful standard for semantic interoperability, HL7 has been analyzed regarding the requirements for model-driven, service-oriented design of semantic interoperable information systems, thereby moving from a communication to an architecture paradigm. The approach is compared with advanced architectural approaches for information systems such as OMG's CORBA 3 or EHR systems such as GEHR/openEHR and CEN EN 13606 Electronic Health Record Communication. HL7 Version 3 is maturing towards an architectural approach for semantic interoperability. Despite current differences, there is a close collaboration between the teams involved guaranteeing a convergence between competing approaches.
Landslide risk mitigation by means of early warning systems
NASA Astrophysics Data System (ADS)
Calvello, Michele
2017-04-01
Among the many options available to mitigate landslide risk, early warning systems may be used where, in specific circumstances, the risk to life increases above tolerable levels. A coherent framework to classify and analyse landslide early warning systems (LEWS) is herein presented. Once the objectives of an early warning strategy are defined depending on the scale of analysis and the type of landslides to address, the process of designing and managing a LEWS should synergically employ technical and social skills. A classification scheme for the main components of LEWSs is proposed for weather-induced landslides. The scheme is based on a clear distinction among: i) the landslide model, i.e. a functional relationship between weather characteristics and landslide events considering the geotechnical, geomorphological and hydro-geological characterization of the area as well as an adequate monitoring strategy; ii) the warning model, i.e. the landslide model plus procedures to define the warning events and to issue the warnings; iii) the warning system, i.e. the warning model plus warning dissemination procedures, communication and education tools, strategies for community involvement and emergency plans. Each component of a LEWS is related to a number of actors involved with their deployment, operational activities and management. For instance, communication and education, community involvement and emergency plans are all significantly influenced by people's risk perception and by operational aspects system managers need to address in cooperation with scientists.
Modelling invasion for a habitat generalist and a specialist plant species
Evangelista, P.H.; Kumar, S.; Stohlgren, T.J.; Jarnevich, C.S.; Crall, A.W.; Norman, J. B.; Barnett, D.T.
2008-01-01
Predicting suitable habitat and the potential distribution of invasive species is a high priority for resource managers and systems ecologists. Most models are designed to identify habitat characteristics that define the ecological niche of a species, with little consideration of individual species' traits. We tested five commonly used modelling methods on two invasive plant species, the habitat generalist Bromus tectorum and the habitat specialist Tamarix chinensis, to compare model performances, evaluate predictability, and relate results to distribution traits associated with each species. Most of the tested models performed similarly for each species; however, the generalist species proved to be more difficult to predict than the specialist species. The highest area under the receiver-operating characteristic curve values with independent validation data sets of B. tectorum and T. chinensis were 0.503 and 0.885, respectively. Similarly, a confusion matrix for B. tectorum had the highest overall accuracy of 55%, while the overall accuracy for T. chinensis was 85%. Models for the generalist species had varying performances, poor evaluations, and inconsistent results. This may be a result of a generalist's capability to persist in a wide range of environmental conditions that are not easily defined by the data, independent variables or model design. Models for the specialist species had consistently strong performances, high evaluations, and similar results among different model applications. This is likely a consequence of the specialist's requirement for explicit environmental resources and ecological barriers that are easily defined by predictive models. Although defining new invaders as generalist or specialist species can be challenging, model performances and evaluations may provide valuable information on a species' potential invasiveness.
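The two headline metrics reported above (area under the ROC curve and overall accuracy from a confusion matrix) can be computed as in the short sketch below; the presence/absence data and predicted suitabilities are synthetic, chosen only to mimic a well-separated "specialist" case and a poorly separated "generalist" case.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(3)

# Synthetic presence/absence data and model-predicted suitabilities.
presence = rng.integers(0, 2, size=500)
specialist_pred = np.clip(presence * 0.7 + rng.normal(0.2, 0.15, 500), 0, 1)
generalist_pred = np.clip(presence * 0.1 + rng.normal(0.45, 0.2, 500), 0, 1)

for name, pred in [("specialist", specialist_pred), ("generalist", generalist_pred)]:
    auc = roc_auc_score(presence, pred)                 # threshold-independent skill
    cm = confusion_matrix(presence, pred >= 0.5)        # 0.5 presence threshold
    accuracy = cm.trace() / cm.sum()
    print(f"{name}: AUC={auc:.2f}, overall accuracy={accuracy:.2%}")
```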
User's guide to the Reliability Estimation System Testbed (REST)
NASA Technical Reports Server (NTRS)
Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam
1992-01-01
The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
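The modularization idea, in which component reliability models are built separately and then combined into a total system reliability, can be sketched as follows; the series/parallel combinators and the failure rates are illustrative and unrelated to RML's actual message-passing semantics.

```python
# Illustrative modular reliability calculation: each module exposes its own
# reliability as a function of mission time, and the system model combines
# modules in series (all must work) or in parallel (any one suffices).
import math

def exponential_module(failure_rate):
    return lambda t: math.exp(-failure_rate * t)

def series(*modules):
    return lambda t: math.prod(m(t) for m in modules)

def parallel(*modules):
    return lambda t: 1.0 - math.prod(1.0 - m(t) for m in modules)

# A processor pair in parallel, in series with a single sensor and a bus.
processor = exponential_module(1e-4)   # failures per hour (hypothetical rates)
sensor = exponential_module(5e-5)
bus = exponential_module(1e-5)
system = series(parallel(processor, processor), sensor, bus)

for hours in (10, 100, 1000):
    print(f"R(system, {hours} h) = {system(hours):.6f}")
```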
A Goal Seeking Strategy for Constructing Systems from Alternative Components
NASA Technical Reports Server (NTRS)
Valentine, Mark E.
1999-01-01
This paper describes a methodology to efficiently construct feasible systems and then modify them to meet successive goals by selecting from alternative components, a problem recognized to be NP-complete. The methodology provides a means to catalog and model alternative components. The presented system modeling structure is robust enough to model a wide variety of systems and provides a means to compare and evaluate alternative systems. These models act as input to a methodology for selecting alternative components to construct feasible systems and modify them to meet design goals and objectives. The presented algorithm's ability to find a restricted solution, as defined by a unique set of requirements, is demonstrated against an exhaustive search of a sample of proposed shuttle modifications. The utility of the algorithm is further demonstrated by comparing its results with results from three NASA shuttle evolution studies, using their value systems and assumptions.
Space station automation study. Volume 2: Technical report. Autonomous systems and assembly
NASA Technical Reports Server (NTRS)
1984-01-01
The application of automation to space station functions is discussed. A summary is given of the evolutionary functions associated with long range missions and objectives. Mission tasks and requirements are defined. Space station sub-systems, mission models, assembly, and construction are discussed.
An Evaporative Cooling Model for Teaching Applied Psychrometrics
ERIC Educational Resources Information Center
Johnson, Donald M.
2004-01-01
Evaporative cooling systems are commonly used in controlled environment plant and animal production. These cooling systems operate based on well defined psychrometric principles. However, students often experience considerable difficulty in learning these principles when they are taught in an abstract, verbal manner. This article describes an…
A generic model to simulate air-borne diseases as a function of crop architecture.
Casadebaig, Pierre; Quesnel, Gauthier; Langlais, Michel; Faivre, Robert
2012-01-01
In a context of pesticide use reduction, alternatives to chemical-based crop protection strategies are needed to control diseases. Crop and plant architectures can be viewed as levers to control disease outbreaks by affecting microclimate within the canopy or pathogen transmission between plants. Modeling and simulation is a key approach to help analyze the behaviour of such systems where direct observations are difficult and tedious. Modeling permits the joining of concepts from ecophysiology and epidemiology to define structures and functions generic enough to describe a wide range of epidemiological dynamics. Additionally, this conception should minimize computing time by both limiting the complexity and providing an efficient software implementation. In this paper, our aim was to present a model that suited these constraints so it could first be used as a research and teaching tool to promote discussions about epidemic management in cropping systems. The system was modelled as a combination of individual hosts (population of plants or organs) and infectious agents (pathogens) whose contacts are restricted through a network of connections. The system dynamics were described at an individual scale. Additional attention was given to the identification of generic properties of host-pathogen systems to widen the model's applicability domain. Two specific pathosystems with contrasted crop architectures were considered: ascochyta blight on pea (homogeneously layered canopy) and potato late blight (lattice of individualized plants). The model behavior was assessed by simulation and sensitivity analysis and these results were discussed against the model's ability to discriminate between the defined types of epidemics. Crop traits related to disease avoidance resulting in a low exposure, a slow dispersal or a de-synchronization of plant and pathogen cycles were shown to strongly impact the disease severity at the crop scale.
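A toy version of the host-network idea described above: individual plants are nodes, infection spreads only along the connections, and the lattice layout loosely echoes the "individualized plants" architecture. The transmission and recovery probabilities are arbitrary and this is not the model implemented in the paper.

```python
import random

random.seed(4)

# Minimal host-pathogen simulation on a contact network: hosts are nodes, and
# infection spreads only along connections, so the network (crop architecture)
# restricts pathogen transmission between plants.
def make_lattice(rows, cols):
    """4-neighbour lattice of individual plants (e.g. a potato-like layout)."""
    edges = {}
    for r in range(rows):
        for c in range(cols):
            node = r * cols + c
            nbrs = []
            if r > 0: nbrs.append((r - 1) * cols + c)
            if r < rows - 1: nbrs.append((r + 1) * cols + c)
            if c > 0: nbrs.append(node - 1)
            if c < cols - 1: nbrs.append(node + 1)
            edges[node] = nbrs
    return edges

def simulate(edges, p_transmit=0.2, p_recover=0.1, steps=60):
    status = {n: "S" for n in edges}
    status[0] = "I"                                  # initial infection in one corner
    for _ in range(steps):
        new = dict(status)
        for node, state in status.items():
            if state == "I":
                if random.random() < p_recover:
                    new[node] = "R"
                for nbr in edges[node]:
                    if status[nbr] == "S" and random.random() < p_transmit:
                        new[nbr] = "I"
        status = new
    return sum(1 for s in status.values() if s != "S")

edges = make_lattice(20, 20)
print("plants ever infected:", simulate(edges), "of", len(edges))
```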
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moriarty, M.P.
1993-01-15
The heat transport subsystem for a liquid metal cooled thermionic space nuclear power system was modelled using algorithms developed in support of previous nuclear power system study programs, which date back to the SNAP-10A flight system. The model was used to define the optimum dimensions of the various components in the heat transport subsystem subjected to the constraints of minimizing mass and achieving a launchable package that did not require radiator deployment. The resulting design provides for the safe and reliable cooling of the nuclear reactor in a proven lightweight design.
NASA Astrophysics Data System (ADS)
Moriarty, Michael P.
1993-01-01
The heat transport subsystem for a liquid metal cooled thermionic space nuclear power system was modelled using algorithms developed in support of previous nuclear power system study programs, which date back to the SNAP-10A flight system. The model was used to define the optimum dimensions of the various components in the heat transport subsystem subjected to the constraints of minimizing mass and achieving a launchable package that did not require radiator deployment. The resulting design provides for the safe and reliable cooling of the nuclear reactor in a proven lightweight design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elia, Valerio; Gnoni, Maria Grazia, E-mail: mariagrazia.gnoni@unisalento.it; Tornese, Fabiana
Highlights: • Pay-As-You-Throw (PAYT) schemes are becoming widespread in several countries. • Economic, organizational and technological issues have to be integrated in an efficient PAYT model design. • Efficiency refers to a PAYT system which supports high citizen participation rates as well as economic sustainability. • Different steps and constraints have to be evaluated from collection services to type technologies. • A holistic approach is discussed to support PAYT systems diffusion. - Abstract: Pay-As-You-Throw (PAYT) strategies are becoming widely applied in solid waste management systems; the main purpose is to support a more sustainable – from economic, environmental and social points of view – management of waste flows. Adopting PAYT charging models increases the complexity level of the waste management service as new organizational issues have to be evaluated compared to flat charging models. In addition, innovative technological solutions could also be adopted to increase the overall efficiency of the service. Unit pricing, user identification and waste measurement represent the three most important processes to be defined in a PAYT system. The paper proposes a holistic framework to support an effective design and management process. The framework defines the most critical processes and effective organizational and technological solutions for supporting waste managers as well as researchers.
An adaptable architecture for patient cohort identification from diverse data sources
Bache, Richard; Miles, Simon; Taweel, Adel
2013-01-01
Objective We define and validate an architecture for systems that identify patient cohorts for clinical trials from multiple heterogeneous data sources. This architecture has an explicit query model capable of supporting temporal reasoning and expressing eligibility criteria independently of the representation of the data used to evaluate them. Method The architecture has the key feature that queries defined according to the query model are both pre- and post-processed, and this is used to address both structural and semantic heterogeneity. The process of extracting the relevant clinical facts is separated from the process of reasoning about them. A specific instance of the query model is then defined and implemented. Results We show that the specific instance of the query model has wide applicability. We then describe how it is used to access three diverse data warehouses to determine patient counts. Discussion Although the proposed architecture requires greater effort to implement the query model than would be the case for using just SQL and accessing a database management system directly, this effort is justified because it supports both temporal reasoning and heterogeneous data sources. The query model only needs to be implemented once no matter how many data sources are accessed. Each additional source requires only the implementation of a lightweight adaptor. Conclusions The architecture has been used to implement a specific query model that can express complex eligibility criteria and access three diverse data warehouses, thus demonstrating the feasibility of this approach in dealing with temporal reasoning and data heterogeneity. PMID:24064442
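The lightweight-adaptor pattern described in the discussion can be sketched as follows: criteria are written once against a canonical fact representation, and each source supplies only a small mapping onto it. The class names, fields and the single temporal criterion are hypothetical, not the architecture's actual query model.

```python
from dataclasses import dataclass
from datetime import date
from typing import Iterable, Optional

@dataclass
class Fact:
    """Canonical clinical fact that the query model reasons over."""
    patient_id: str
    code: str
    when: date
    value: Optional[float] = None

class WarehouseAAdaptor:
    """Maps one hypothetical source's rows onto canonical Facts."""
    def __init__(self, rows):
        self.rows = rows
    def facts(self) -> Iterable[Fact]:
        for r in self.rows:
            yield Fact(r["pid"], r["icd"], date.fromisoformat(r["ts"]), r.get("val"))

def eligible(facts: Iterable[Fact], code: str, after: date) -> set:
    """A simple temporal criterion: patients with `code` recorded on or after `after`."""
    return {f.patient_id for f in facts if f.code == code and f.when >= after}

rows = [{"pid": "p1", "icd": "E11", "ts": "2012-05-01"},
        {"pid": "p2", "icd": "E11", "ts": "2009-01-10"}]
print(eligible(WarehouseAAdaptor(rows).facts(), code="E11", after=date(2010, 1, 1)))
```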
Saraiva, Renata M; Bezerra, João; Perkusich, Mirko; Almeida, Hyggo; Siebra, Clauirton
2015-01-01
Recently there has been an increasing interest in applying information technology to support the diagnosis of diseases such as cancer. In this paper, we present a hybrid approach using case-based reasoning (CBR) and rule-based reasoning (RBR) to support cancer diagnosis. We used symptoms, signs, and personal information from patients as inputs to our model. To form specialized diagnoses, we used rules to define the input factors' importance according to the patient's characteristics. The model's output presents the probability of the patient having a type of cancer. To carry out this research, we had the approval of the ethics committee at Napoleão Laureano Hospital, in João Pessoa, Brazil. To define our model's cases, we collected real patient data at Napoleão Laureano Hospital. To define our model's rules and weights, we researched specialized literature and interviewed health professionals. To validate our model, we used K-fold cross validation with the data collected at Napoleão Laureano Hospital. The results showed that our approach is an effective CBR system to diagnose cancer.
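A much-reduced sketch of the hybrid idea: rules derived from the patient's characteristics set the feature weights, and a weighted similarity then retrieves the closest stored case. The features, weights, scales and case base below are invented for illustration; they are not the factors or rules collected at Napoleão Laureano Hospital.

```python
import math

# Invented case base: each stored case pairs patient factors with a confirmed outcome.
CASE_BASE = [
    {"age": 68, "smoker": 1, "weight_loss": 1, "cough_weeks": 10, "diagnosis": "cancer"},
    {"age": 45, "smoker": 0, "weight_loss": 0, "cough_weeks": 2,  "diagnosis": "no cancer"},
    {"age": 72, "smoker": 1, "weight_loss": 0, "cough_weeks": 6,  "diagnosis": "no cancer"},
]
SCALES = {"age": 50.0, "cough_weeks": 12.0}   # rough normalisation of numeric features

def rule_based_weights(patient):
    """RBR step: the importance of each factor depends on the patient's profile."""
    w = {"age": 1.0, "smoker": 1.0, "weight_loss": 1.0, "cough_weeks": 1.0}
    if patient["age"] > 60:
        w["smoker"] = 2.0          # smoking history weighs more in older patients
    if patient["weight_loss"]:
        w["cough_weeks"] = 1.5
    return w

def similarity(patient, case, weights):
    """CBR step: weighted similarity between the new patient and a stored case."""
    d = sum(w * ((patient[k] - case[k]) / SCALES.get(k, 1.0)) ** 2 for k, w in weights.items())
    return 1.0 / (1.0 + math.sqrt(d))

patient = {"age": 70, "smoker": 1, "weight_loss": 1, "cough_weeks": 8}
weights = rule_based_weights(patient)
best = max(CASE_BASE, key=lambda c: similarity(patient, c, weights))
print("most similar case suggests:", best["diagnosis"])
```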
TRIM.FaTE is a spatially explicit, compartmental mass balance model that describes the movement and transformation of pollutants over time, through a user-defined, bounded system that includes both biotic and abiotic compartments.
3D Model of the Tuscarora Geothermal Area
Faulds, James E.
2013-12-31
The Tuscarora geothermal system sits within a ~15 km wide left-step in a major west-dipping range-bounding normal fault system. The step-over is defined by the Independence Mountains fault zone and the Bull Run Mountains fault zone, which overlap along strike. Strain is transferred between these major fault segments via an array of northerly striking normal faults with offsets of 10s to 100s of meters and strike lengths of less than 5 km. These faults within the step-over are one to two orders of magnitude smaller than the range-bounding fault zones between which they reside. Faults within the broad step define an anticlinal accommodation zone wherein east-dipping faults mainly occupy the western half of the accommodation zone and west-dipping faults lie in the eastern half of the accommodation zone. The 3D model of Tuscarora encompasses 70 small-offset normal faults that define the accommodation zone and a portion of the Independence Mountains fault zone, which dips beneath the geothermal field. The geothermal system resides in the axial part of the accommodation zone, straddling the two fault dip domains. The Tuscarora 3D geologic model consists of 10 stratigraphic units. Unconsolidated Quaternary alluvium has eroded down into bedrock units; the youngest and stratigraphically highest bedrock units are middle Miocene rhyolite and dacite flows regionally correlated with the Jarbidge Rhyolite and modeled with a uniform cumulative thickness of ~350 m. Underlying these lava flows are Eocene volcanic rocks of the Big Cottonwood Canyon caldera. These units are modeled as intracaldera deposits, including domes, flows, and thick ash deposits that change in thickness and locally pinch out. The Paleozoic basement consists of metasedimentary and metavolcanic rocks, dominated by argillite, siltstone, limestone, quartzite, and metabasalt of the Schoonover and Snow Canyon Formations. Paleozoic formations are lumped into a single basement unit in the model. Fault blocks in the eastern portion of the model are tilted 5-30 degrees toward the Independence Mountains fault zone. Fault blocks in the western portion of the model are tilted toward steeply east-dipping normal faults. These opposing fault block dips define a shallow extensional anticline. Geothermal production is from 4 closely spaced wells that exploit a west-dipping, NNE-striking fault zone near the axial part of the accommodation zone.
The University Münster Model Surgery System for Orthognathic Surgery. Part II -- KD-MMS.
Ehmer, Ulrike; Joos, Ulrich; Ziebura, Thomas; Flieger, Stefanie; Wiechmann, Dirk
2013-01-04
Model surgery is an integral part of the planning procedure in orthognathic surgery. Most concepts comprise cutting the dental cast off its socket. The standardized spacer plates of the KD-MMS provide for a non-destructive, reversible and reproducible means of maxillary and/or mandibular plaster cast separation. In the course of development of the system, various articulator types were evaluated with regard to their capability to provide a means of realizing the concepts comprising the KD-MMS. Special attention was dedicated to the ability to perform three-dimensional displacements without cutting of plaster casts. Various utilities were developed to facilitate maxillary displacement in accordance with the planning. Objectives of this development comprised the ability to implement the values established in the course of two-dimensional ceph planning. The KD-MMS system comprises a set of hardware components as well as a defined procedure. Essential hardware components are red spacer and blue mounting plates. The blue mounting plates replace the standard yellow SAM mounting elements. The red spacers provide for a defined leeway of 8 mm for three-dimensional movements. The non-destructive approach of the KD-MMS makes it possible to conduct different model surgeries with the same plaster casts as well as to restore the initial, pre-surgical situation at any time. Thereby, surgical protocol generation and gnathologic splint construction are facilitated. The KD-MMS hardware components in conjunction with the defined procedures are capable of increasing the efficiency and accuracy of model surgery and splint construction. In cases where different surgical approaches need to be evaluated in the course of model surgery, a significant reduction of chair time may be achieved.
Global Change adaptation in water resources management: the Water Change project.
Pouget, Laurent; Escaler, Isabel; Guiu, Roger; Mc Ennis, Suzy; Versini, Pierre-Antoine
2012-12-01
In recent years, water resources management has been facing new challenges due to increasing changes and their associated uncertainties, such as changes in climate, water demand or land use, which can be grouped under the term Global Change. The Water Change project (LIFE+ funding) developed a methodology and a tool to assess the Global Change impacts on water resources, thus helping river basin agencies and water companies in their long-term planning and in the definition of adaptation measures. The main result of the project was the creation of a step-by-step methodology to assess Global Change impacts and define strategies of adaptation. This methodology was tested in the Llobregat river basin (Spain) with the objective of being applicable to any water system. It includes several steps such as setting up the problem with a DPSIR framework, developing Global Change scenarios, running river basin models and performing a cost-benefit analysis to define optimal strategies of adaptation. This methodology was supported by the creation of a flexible modelling system, which can link a wide range of models, such as hydrological, water quality, and water management models. The tool allows users to integrate their own models into the system, which can then exchange information automatically. This makes it possible to simulate the interactions among multiple components of the water cycle and to quickly run a large number of Global Change scenarios. The outcomes of this project make it possible to define and test different sets of adaptation measures for the basin that can be further evaluated through cost-benefit analysis. The integration of the results contributes to efficient decision-making on how to adapt to Global Change impacts. Copyright © 2012 Elsevier B.V. All rights reserved.
Defining clinical deterioration.
Jones, Daryl; Mitchell, Imogen; Hillman, Ken; Story, David
2013-08-01
To review literature reporting adverse events and physiological instability in order to develop frameworks that describe and define clinical deterioration in hospitalised patients. Literature review of publications from 1960 to August 2012. Conception and refinement of models to describe clinical deterioration based on prevailing themes that developed chronologically in adverse event literature. We propose four frameworks or models that define clinical deterioration and discuss the utility of each. Early attempts used retrospective chart review and focussed on the end result of deterioration (adverse events) and iatrogenesis. Subsequent models were also retrospective, but used discrete complications (e.g. sepsis, cardiac arrest) to define deterioration, had a more clinical focus, and identified the concept of antecedent physiological instability. Current models for defining clinical deterioration are based on the presence of abnormalities in vital signs and other clinical observations and attempt to prospectively assist clinicians in predicting subsequent risk. However, use of deranged vital signs in isolation does not consider important patient-, disease-, or system-related factors that are known to adversely affect the outcome of hospitalised patients. These include pre-morbid function, frailty, extent and severity of co-morbidity, nature of presenting illness, delays in responding to deterioration and institution of treatment, and patient response to therapy. There is a need to develop multiple-variable models for deteriorating ward patients similar to those used in intensive care units. Such models may assist clinician education, prospective and real-time patient risk stratification, and guide quality improvement initiatives that prevent and improve response to clinical deterioration. Crown Copyright © 2013. Published by Elsevier Ireland Ltd. All rights reserved.
Ontology patterns for complex topographic feature types
Varanka, Dalia E.
2011-01-01
Complex feature types are defined as integrated relations between basic features for a shared meaning or concept. The shared semantic concept is difficult to define in commonly used geographic information systems (GIS) and remote sensing technologies. The role of spatial relations between complex feature parts was recognized in early GIS literature, but had limited representation in the feature or coverage data models of GIS. Spatial relations are more explicitly specified in semantic technology. In this paper, semantics for topographic feature ontology design patterns (ODP) are developed as data models for the representation of complex features. In the context of topographic processes, component assemblages are supported by resource systems and are found on local landscapes. The topographic ontology is organized across six thematic modules that can account for basic feature types, resource systems, and landscape types. Types of complex feature attributes include location, generative processes and physical description. Node/edge networks model standard spatial relations and relations specific to topographic science to represent complex features. To demonstrate these concepts, data from The National Map of the U. S. Geological Survey was converted and assembled into ODP.
Information Systems Curricula: A Fifty Year Journey
ERIC Educational Resources Information Center
Longenecker, Herbert E., Jr.; Feinstein, David; Clark, Jon D.
2013-01-01
This article presents the results of research to explore the nature of changes in skills over a fifty-year period spanning the life of Information Systems model curricula. Work begun in 1999 was expanded both backwards in time and forwards to 2012 to define skills relevant to Information Systems curricula. The work in 1999 was based on job…
Large-scale systems: Complexity, stability, reliability
NASA Technical Reports Server (NTRS)
Siljak, D. D.
1975-01-01
After showing that a complex dynamic system with a competitive structure has highly reliable stability, a class of noncompetitive dynamic systems for which competitive models can be constructed is defined. It is shown that such a construction is possible in the context of the hierarchic stability analysis. The scheme is based on the comparison principle and vector Liapunov functions.
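For orientation, a generic sketch of the vector-Lyapunov comparison argument referred to above (notation is illustrative, not taken from the report):

```latex
% Hedged sketch: v_i are subsystem Lyapunov functions, W an aggregate comparison matrix.
\begin{align}
  \dot{v}_i(x_i) &\le \sum_{j=1}^{N} w_{ij}\, v_j(x_j), \qquad i = 1,\dots,N, \\
  \dot{v} &\le W v \quad \text{(componentwise)},
\end{align}
% Stability of the linear comparison system \dot{r} = W r (e.g. a suitably
% diagonally dominant W) then implies stability of the interconnected system.
```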
Enhancing metaproteomics-The value of models and defined environmental microbial systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herbst, Florian-Alexander; Lünsmann, Vanessa; Kjeldal, Henrik
2016-01-21
Metaproteomics - the large-scale characterization of the entire protein complement of environmental microbiota at a given point in time - has added unique features and possibilities to the study of environmental microbial communities and to unravelling these “black boxes”. New technical challenges arose that were not an issue for classical proteome analytics, and choosing the appropriate model system for the research question can be difficult. Here, we reviewed different model systems for metaproteome analysis. Following a short introduction to microbial communities and systems, we discussed the most widely used systems, ranging from technical systems through rhizospheric models to systems for the medical field. These include acid mine drainage, anaerobic digesters, activated sludge, planted fixed bed reactors, gastrointestinal simulators and in vivo models. Model systems are useful to evaluate the challenges encountered within (but not limited to) metaproteomics, including species complexity and coverage, biomass availability or reliable protein extraction. The implementation of model systems can be considered a step forward to better understanding microbial responses and the ecological distribution of member organisms. In the future, novel improvements are necessary to fully engage complex environmental systems.
Composite Socio-Technical Systems: A Method for Social Energy Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yingchen; He, Fulin; Hao, Jun
In order to model and study the interactions between social and technical systems, a systemic method, namely the composite socio-technical systems (CSTS) method, is proposed to incorporate social systems, technical systems and the interaction mechanisms between them. A case study on the University of Denver (DU) campus grid is presented in this paper to demonstrate the application of the proposed method. In the case study, the social system, technical system, and the interaction mechanism are defined and modelled within the framework of CSTS. Distributed and centralized control and management schemes are investigated, respectively, and numerical results verify the feasibility and performance of the proposed composite system method.
Automated Environment Generation for Software Model Checking
NASA Technical Reports Server (NTRS)
Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.
2003-01-01
A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.
Infrastructure Vulnerability Assessment Model (I-VAM).
Ezell, Barry Charles
2007-06-01
Quantifying vulnerability to critical infrastructure has not been adequately addressed in the literature. Thus, the purpose of this article is to present a model that quantifies vulnerability. Vulnerability is defined as a measure of system susceptibility to threat scenarios. This article asserts that vulnerability is a condition of the system and it can be quantified using the Infrastructure Vulnerability Assessment Model (I-VAM). The model is presented and then applied to a medium-sized clean water system. The model requires subject matter experts (SMEs) to establish value functions and weights, and to assess protection measures of the system. Simulation is used to account for uncertainty in measurement, aggregate expert assessment, and to yield a vulnerability (Omega) density function. Results demonstrate that I-VAM is useful to decisionmakers who prefer quantification to qualitative treatment of vulnerability. I-VAM can be used to quantify vulnerability to other infrastructures, supervisory control and data acquisition systems (SCADA), and distributed control systems (DCS).
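As a purely illustrative sketch of the kind of aggregation described above (SME-weighted value functions with uncertain assessments propagated by simulation to a vulnerability density), the following Python fragment uses invented weights, scores and a simple "vulnerability = 1 - protection" mapping; none of these numbers or mappings come from the I-VAM article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical SME-elicited weights for three protection measures (sum to 1).
weights = np.array([0.4, 0.35, 0.25])

def value_functions(scores):
    """Map raw protection scores (0-100) onto a 0-1 value scale (illustrative linear form)."""
    return np.clip(scores / 100.0, 0.0, 1.0)

# Uncertain SME assessments of the three measures (mean, standard deviation).
assessment_mean = np.array([60.0, 45.0, 70.0])
assessment_std = np.array([8.0, 10.0, 6.0])

n_draws = 10_000
samples = rng.normal(assessment_mean, assessment_std, size=(n_draws, 3))
protection = value_functions(samples) @ weights   # aggregate protection per draw
omega = 1.0 - protection                          # assumed vulnerability mapping

print(f"mean vulnerability Omega ~ {omega.mean():.2f}, "
      f"90% interval [{np.quantile(omega, 0.05):.2f}, {np.quantile(omega, 0.95):.2f}]")
```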
Considering the difficulty in measuring restoration success for nonpoint source pollutants, nutrient assimilative capacity (NAS) offers an attractive systems-based metric. Here NAS was defined using an impulse-response model of nitrate fate and transport. Eleven parameters were e...
USDA-ARS?s Scientific Manuscript database
Topography exerts critical controls on many hydrologic, geomorphologic, and environmental biophysical processes. Unfortunately many watershed modeling systems use topography only to define basin boundaries and stream channels and do not explicitly account for the topographic controls on processes su...
Cultural propagation on social networks
NASA Astrophysics Data System (ADS)
Kuperman, M. N.
2006-04-01
In this work we present a model for the propagation of culture on networks of different topology and by considering different underlying dynamics. We extend a previous model proposed by Axelrod by letting a majority govern the dynamics of changes. This in turn allows us to define a Lyapunov functional for the system.
Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)
NASA Astrophysics Data System (ADS)
Selvy, Brian M.; Claver, Charles; Angeli, George
2014-08-01
This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
Thoma, Eva C; Heckel, Tobias; Keller, David; Giroud, Nicolas; Leonard, Brian; Christensen, Klaus; Roth, Adrian; Bertinetti-Lapatki, Cristina; Graf, Martin; Patsch, Christoph
2016-10-25
Due to their broad differentiation potential, pluripotent stem cells (PSCs) offer a promising approach for generating relevant cellular models for various applications. While human PSC-based cellular models are already advanced, similar systems for non-human primates (NHPs) are still lacking. However, as NHPs are the most appropriate animals for evaluating the safety of many novel pharmaceuticals, the availability of in vitro systems would be extremely useful to bridge the gap between cellular and animal models. Here, we present a NHP in vitro endothelial cell system using induced pluripotent stem cells (IPSCs) from Cynomolgus monkey (Macaca fascicularis). Based on an adapted protocol for human IPSCs, we directly differentiated macaque IPSCs into endothelial cells under chemically defined conditions. The resulting endothelial cells can be enriched using immuno-magnetic cell sorting and display endothelial marker expression and function. RNA sequencing revealed that the differentiation process closely resembled vasculogenesis. Moreover, we showed that endothelial cells derived from macaque and human IPSCs are highly similar with respect to gene expression patterns and key endothelial functions, such as inflammatory responses. These data demonstrate the power of IPSC differentiation technology to generate defined cell types for use as translational in vitro models to compare cell type-specific responses across species.
A pattern-based analysis of clinical computer-interpretable guideline modeling languages.
Mulyar, Nataliya; van der Aalst, Wil M P; Peleg, Mor
2007-01-01
Languages used to specify computer-interpretable guidelines (CIGs) differ in their approaches to addressing particular modeling challenges. The main goals of this article are: (1) to examine the expressive power of CIG modeling languages, and (2) to define the differences, from the control-flow perspective, between process languages in workflow management systems and modeling languages used to design clinical guidelines. The pattern-based analysis was applied to guideline modeling languages Asbru, EON, GLIF, and PROforma. We focused on control-flow and left other perspectives out of consideration. We evaluated the selected CIG modeling languages and identified their degree of support of 43 control-flow patterns. We used a set of explicitly defined evaluation criteria to determine whether each pattern is supported directly, indirectly, or not at all. PROforma offers direct support for 22 of 43 patterns, Asbru 20, GLIF 17, and EON 11. All four directly support basic control-flow patterns, cancellation patterns, and some advance branching and synchronization patterns. None support multiple instances patterns. They offer varying levels of support for synchronizing merge patterns and state-based patterns. Some support a few scenarios not covered by the 43 control-flow patterns. CIG modeling languages are remarkably close to traditional workflow languages from the control-flow perspective, but cover many fewer workflow patterns. CIG languages offer some flexibility that supports modeling of complex decisions and provide ways for modeling some decisions not covered by workflow management systems. Workflow management systems may be suitable for clinical guideline applications.
Brooker, Simon; Beasley, Michael; Ndinaromtan, Montanan; Madjiouroum, Ester Mobele; Baboguel, Marie; Djenguinabe, Elie; Hay, Simon I.; Bundy, Don A. P.
2002-01-01
OBJECTIVE: To design and implement a rapid and valid epidemiological assessment of helminths among schoolchildren in Chad using ecological zones defined by remote sensing satellite sensor data and to investigate the environmental limits of helminth distribution. METHODS: Remote sensing proxy environmental data were used to define seven ecological zones in Chad. These were combined with population data in a geographical information system (GIS) in order to define a sampling protocol. On this basis, 20 schools were surveyed. Multilevel analysis, by means of generalized estimating equations to account for clustering at the school level, was used to investigate the relationship between infection patterns and key environmental variables. FINDINGS: In a sample of 1023 schoolchildren, 22.5% were infected with Schistosoma haematobium and 32.7% with hookworm. None were infected with Ascaris lumbricoides or Trichuris trichiura. The prevalence of S. haematobium and hookworm showed marked geographical heterogeneity and the observed patterns showed a close association with the defined ecological zones and significant relationships with environmental variables. These results contribute towards defining the thermal limits of geohelminth species. Predictions of infection prevalence were made for each school surveyed with the aid of models previously developed for Cameroon. These models correctly predicted that A. lumbricoides and T. trichiura would not occur in Chad but the predictions for S. haematobium were less reliable at the school level. CONCLUSION: GIS and remote sensing can play an important part in the rapid planning of helminth control programmes where little information on disease burden is available. Remote sensing prediction models can indicate patterns of geohelminth infection but can only identify potential areas of high risk for S. haematobium. PMID:12471398
Using sensitivity analysis in model calibration efforts
Tiedeman, Claire; Hill, Mary C.
2003-01-01
In models of natural and engineered systems, sensitivity analysis can be used to assess relations among system state observations, model parameters, and model predictions. The model itself links these three entities, and model sensitivities can be used to quantify the links. Sensitivities are defined as the derivatives of simulated quantities (such as simulated equivalents of observations, or model predictions) with respect to model parameters. We present four measures calculated from model sensitivities that quantify the observation-parameter-prediction links and that are especially useful during the calibration and prediction phases of modeling. These four measures are composite scaled sensitivities (CSS), prediction scaled sensitivities (PSS), the value of improved information (VOII) statistic, and the observation prediction (OPR) statistic. These measures can be used to help guide initial calibration of models, collection of field data beneficial to model predictions, and recalibration of models updated with new field information. Once model sensitivities have been calculated, each of the four measures requires minimal computational effort. We apply the four measures to a three-layer MODFLOW-2000 (Harbaugh et al., 2000; Hill et al., 2000) model of the Death Valley regional ground-water flow system (DVRFS), located in southern Nevada and California. D’Agnese et al. (1997, 1999) developed and calibrated the model using nonlinear regression methods. Figure 1 shows some of the observations, parameters, and predictions for the DVRFS model. Observed quantities include hydraulic heads and spring flows. The 23 defined model parameters include hydraulic conductivities, vertical anisotropies, recharge rates, evapotranspiration rates, and pumpage. Predictions of interest for this regional-scale model are advective transport paths from potential contamination sites underlying the Nevada Test Site and Yucca Mountain.
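For readers unfamiliar with the scaled-sensitivity notation commonly used with MODFLOW-2000-style analyses, a standard textbook form is sketched below; the symbols are generic and not copied from the paper.

```latex
% Hedged reminder of the usual definitions (Hill & Tiedeman style notation).
\begin{align}
  ss_{ij} &= \frac{\partial y_i'}{\partial b_j}\, b_j\, \omega_i^{1/2}
  && \text{dimensionless scaled sensitivity of simulated value } y_i' \text{ to parameter } b_j, \\
  CSS_j &= \left[ \frac{1}{ND} \sum_{i=1}^{ND} \left( ss_{ij} \right)^2 \right]^{1/2}
  && \text{composite scaled sensitivity over } ND \text{ observations.}
\end{align}
```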
Systems Engineering for Space Exploration Medical Capabilities
NASA Technical Reports Server (NTRS)
Mindock, Jennifer; Reilly, Jeffrey; Rubin, David; Urbina, Michelle; Hailey, Melinda; Hanson, Andrea; Burba, Tyler; McGuire, Kerry; Cerro, Jeffrey; Middour, Chris;
2017-01-01
Human exploration missions that reach destinations beyond low Earth orbit, such as Mars, will present significant new challenges to crew health management. For the medical system, lack of consumable resupply, evacuation opportunities, and real-time ground support are key drivers toward greater autonomy. Recognition of the limited mission and vehicle resources available to carry out exploration missions motivates the Exploration Medical Capability (ExMC) Element's approach to enabling the necessary autonomy. The Element's work must integrate with the overall exploration mission and vehicle design efforts to successfully provide exploration medical capabilities. ExMC is applying systems engineering principles and practices to accomplish its goals. This paper discusses the structured and integrative approach that is guiding the medical system technical development. Assumptions for the required levels of care on exploration missions, medical system goals, and a Concept of Operations are early products that capture and clarify stakeholder expectations. Model-Based Systems Engineering techniques are then applied to define medical system behavior and architecture. Interfaces to other flight and ground systems, and within the medical system are identified and defined. Initial requirements and traceability are established, which sets the stage for identification of future technology development needs. An early approach for verification and validation, taking advantage of terrestrial and near-Earth exploration system analogs, is also defined to further guide system planning and development.
A discrete model of Drosophila eggshell patterning reveals cell-autonomous and juxtacrine effects.
Fauré, Adrien; Vreede, Barbara M I; Sucena, Elio; Chaouiya, Claudine
2014-03-01
The Drosophila eggshell constitutes a remarkable system for the study of epithelial patterning, both experimentally and through computational modeling. Dorsal eggshell appendages arise from specific regions in the anterior follicular epithelium that covers the oocyte: two groups of cells expressing broad (roof cells) bordered by rhomboid expressing cells (floor cells). Despite the large number of genes known to participate in defining these domains and the important modeling efforts put into this developmental system, key patterning events still lack a proper mechanistic understanding and/or genetic basis, and the literature appears to conflict on some crucial points. We tackle these issues with an original, discrete framework that considers single-cell models that are integrated to construct epithelial models. We first build a phenomenological model that reproduces wild type follicular epithelial patterns, confirming EGF and BMP signaling input as sufficient to establish the major features of this patterning system within the anterior domain. Importantly, this simple model predicts an instructive juxtacrine signal linking the roof and floor domains. To explore this prediction, we define a mechanistic model that integrates the combined effects of cellular genetic networks, cell communication and network adjustment through developmental events. Moreover, we focus on the anterior competence region, and postulate that early BMP signaling participates with early EGF signaling in its specification. This model accurately simulates wild type pattern formation and is able to reproduce, with unprecedented level of precision and completeness, various published gain-of-function and loss-of-function experiments, including perturbations of the BMP pathway previously seen as conflicting results. The result is a coherent model built upon rules that may be generalized to other epithelia and developmental systems.
A Consideration of Factors Accounting for Goal Effectiveness: A Longitudinal Study.
ERIC Educational Resources Information Center
Stewart, James H.
This research paper presents a model of organizational effectiveness based on the open system perspective and tests four hypotheses concerning organizational effectiveness factors. Organizational effectiveness can be defined as the extent to which a social system makes progress toward objectives based on the four phases of organizational…
USDA-ARS?s Scientific Manuscript database
The mechanisms controlling allometric development of the mammary ductal tree have largely been defined through key studies in rodent model systems. The development of this system is known to depend on the integrated actions of pituitary and ovarian hormones, locally produced growth factors, extracel...
How Do You Evaluate Everyone Who Isn't a Teacher?
ERIC Educational Resources Information Center
Tucker, Pamela D.; Stronge, James H.
1994-01-01
Most states mandate evaluation of all certified employees, but most school systems lack a prescribed evaluation process for counselors, nurses, librarians, media specialists, and school psychologists. The Professional Support Personnel Evaluation Model defines a prescriptive, yet flexible seven-step process based on identifying system needs and…
Xu, Haiyang; Wang, Ping
2016-01-01
In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we proposed a model-based integration framework for modeling and verification of time properties. Combining with the advantages of MARTE, this framework uses class diagrams to create the static model of the software system, and utilizes state charts to create the dynamic model. In terms of the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can also be verified by using existing formal tools. For the real-time specifications of the software system, we also proposed a generating algorithm for temporal logic formulae, which can automatically extract real-time properties from a time-sensitive live sequence chart (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results showed that the framework could be used to create the system model, as well as precisely analyze and verify the real-time reliability of the UAV flight control system. PMID:27918594
An AI approach for scheduling space-station payloads at Kennedy Space Center
NASA Technical Reports Server (NTRS)
Castillo, D.; Ihrie, D.; Mcdaniel, M.; Tilley, R.
1987-01-01
The Payload Processing for Space-Station Operations (PHITS) is a prototype modeling tool capable of addressing many Space Station related concerns. The system's object oriented design approach coupled with a powerful user interface provide the user with capabilities to easily define and model many applications. PHITS differs from many artificial intelligence based systems in that it couples scheduling and goal-directed simulation to ensure that on-orbit requirement dates are satisfied.
NASA Technical Reports Server (NTRS)
Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok K.; Larsen, Ronald L.
1989-01-01
In a previous work we have defined a general architecture model for autonomous systems, which can be mapped easily to describe the functions of any automated system (SDAG-86-01). In this note, we use the model to describe the problem of thermal management in space stations. First we briefly review the architecture, then we present the environment of our application, and finally we detail the specific function for each functional block of the architecture for that environment.
ERIC Educational Resources Information Center
Witmer, J. Melvin
Five models for the behavior change process at both the individual and the systemic level are proposed. The author sees them as comprising an integrated or eclectic approach, which he defines as using that which is most appropriate for achieving goals. The five models, which are central in the author's program of counselor education, are: (1) the…
Parametric study of laser photovoltaic energy converters
NASA Technical Reports Server (NTRS)
Walker, G. H.; Heinbockel, J. H.
1987-01-01
Photovoltaic converters are of interest for converting laser power to electrical power in a space-based laser power system. This paper describes a model for photovoltaic laser converters and the application of this model to a neodymium laser silicon photovoltaic converter system. A parametric study which defines the sensitivity of the photovoltaic parameters is described. An optimized silicon photovoltaic converter has an efficiency greater than 50 percent for 1000 W/sq cm of neodymium laser radiation.
Process Model of A Fusion Fuel Recovery System for a Direct Drive IFE Power Reactor
NASA Astrophysics Data System (ADS)
Natta, Saswathi; Aristova, Maria; Gentile, Charles
2008-11-01
A task has been initiated to develop a detailed representative model for the fuel recovery system (FRS) in the prospective direct drive inertial fusion energy (IFE) reactor. As part of the conceptual design phase of the project, a chemical process model is developed in order to observe the interaction of system components. This process model is developed using FEMLAB Multiphysics software with the corresponding chemical engineering module (CEM). Initially, the reactants, system structure, and processes are defined using known chemical species of the target chamber exhaust. Each step within the Fuel recovery system is modeled compartmentally and then merged to form the closed loop fuel recovery system. The output, which includes physical properties and chemical content of the products, is analyzed after each step of the system to determine the most efficient and productive system parameters. This will serve to attenuate possible bottlenecks in the system. This modeling evaluation is instrumental in optimizing and closing the fusion fuel cycle in a direct drive IFE power reactor. The results of the modeling are presented in this paper.
Operational Space Weather Activities in the US
NASA Astrophysics Data System (ADS)
Berger, Thomas; Singer, Howard; Onsager, Terrance; Viereck, Rodney; Murtagh, William; Rutledge, Robert
2016-07-01
We review the current activities in the civil operational space weather forecasting enterprise of the United States. The NOAA/Space Weather Prediction Center is the nation's official source of space weather watches, warnings, and alerts, working with partners in the Air Force as well as international operational forecast services to provide predictions, data, and products on a large variety of space weather phenomena and impacts. In October 2015, the White House Office of Science and Technology Policy released the National Space Weather Strategy (NSWS) and associated Space Weather Action Plan (SWAP) that define how the nation will better forecast, mitigate, and respond to an extreme space weather event. The SWAP defines actions involving multiple federal agencies and mandates coordination and collaboration with academia, the private sector, and international bodies to, among other things, develop and sustain an operational space weather observing system; develop and deploy new models of space weather impacts to critical infrastructure systems; define new mechanisms for the transition of research models to operations and to ensure that the research community is supported for, and has access to, operational model upgrade paths; and to enhance fundamental understanding of space weather through support of research models and observations. The SWAP will guide significant aspects of space weather operational and research activities for the next decade, with opportunities to revisit the strategy in the coming years through the auspices of the National Science and Technology Council.
Interplay of node connectivity and epidemic rates in the dynamics of epidemic networks
Kostova, Tanya
2010-07-09
We present and analyze a discrete-time susceptible-infected epidemic network model which represents each host as a separate entity and allows heterogeneous hosts and contacts. We establish a necessary and sufficient condition for global stability of the disease-free equilibrium of the system (defined as epidemic controllability) which defines the epidemic reproduction number of the network. When this condition is not fulfilled, we show that the system has a unique, locally stable equilibrium. As a result, we further derive sufficient conditions for epidemic controllability in terms of the epidemic rates and the network topology.
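The following Python fragment is a toy discrete-time network update with heterogeneous, per-host infection and recovery rates, included only to make the class of model concrete; it is an SIS-style variant with invented parameters, not the authors' exact formulation or their controllability condition.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 50
contact = rng.random((n, n)) < 0.08            # hypothetical contact network
np.fill_diagonal(contact, False)
beta = rng.uniform(0.02, 0.10, size=n)         # per-host susceptibility per infectious contact
gamma = rng.uniform(0.05, 0.20, size=n)        # per-host recovery probability per step

infected = np.zeros(n, dtype=bool)
infected[rng.choice(n, 3, replace=False)] = True   # seed three infections

for t in range(100):
    # probability that each susceptible host escapes infection from all infected contacts
    p_escape = np.prod(np.where(contact[:, infected], 1.0 - beta[:, None], 1.0), axis=1)
    new_inf = (~infected) & (rng.random(n) > p_escape)
    recovered = infected & (rng.random(n) < gamma)
    infected = (infected | new_inf) & ~recovered

print("infected hosts at end:", infected.sum())
```

Whether the infection dies out in such a model depends on the balance of the rates and the network topology, which is the kind of threshold the abstract's reproduction number formalizes.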
Matrix management for aerospace 2000
NASA Technical Reports Server (NTRS)
Mccarthy, J. F., Jr.
1980-01-01
The matrix management approach to program management is an organized effort for attaining program objectives by defining and structuring all elements so as to form a single system whose parts are united by interaction. The objective of the systems approach is uncompromisingly complete coverage of the program management endeavor. Starting with an analysis of the functions necessary to carry out a given program, a model must be defined; a matrix of responsibility assignment must be prepared; and each operational process must be examined to establish how it is to be carried out and how it relates to all other processes.
Stochastic Blockmodeling of the Modules and Core of the Caenorhabditis elegans Connectome
Pavlovic, Dragana M.; Vértes, Petra E.; Bullmore, Edward T.; Schafer, William R.; Nichols, Thomas E.
2014-01-01
Recently, there has been much interest in the community structure or mesoscale organization of complex networks. This structure is characterised either as a set of sparsely inter-connected modules or as a highly connected core with a sparsely connected periphery. However, it is often difficult to disambiguate these two types of mesoscale structure or, indeed, to summarise the full network in terms of the relationships between its mesoscale constituents. Here, we estimate a community structure with a stochastic blockmodel approach, the Erdős-Rényi Mixture Model, and compare it to the much more widely used deterministic methods, such as the Louvain and Spectral algorithms. We used the Caenorhabditis elegans (C. elegans) nervous system (connectome) as a model system in which biological knowledge about each node or neuron can be used to validate the functional relevance of the communities obtained. The deterministic algorithms derived communities with 4–5 modules, defined by sparse inter-connectivity between all modules. In contrast, the stochastic Erdős-Rényi Mixture Model estimated a community with 9 blocks or groups which comprised a similar set of modules but also included a clearly defined core, made of 2 small groups. We show that the “core-in-modules” decomposition of the worm brain network, estimated by the Erdős-Rényi Mixture Model, is more compatible with prior biological knowledge about the C. elegans nervous system than the purely modular decomposition defined deterministically. We also show that the blockmodel can be used both to generate stochastic realisations (simulations) of the biological connectome, and to compress network into a small number of super-nodes and their connectivity. We expect that the Erdős-Rényi Mixture Model may be useful for investigating the complex community structures in other (nervous) systems. PMID:24988196
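A minimal generative sketch of a stochastic blockmodel with a small, densely connected core plus sparser modules is given below for orientation; block sizes and connection probabilities are invented, not estimates fitted to the C. elegans connectome.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical block structure: block 0 is a small core, blocks 1-3 are modules.
sizes = [10, 60, 60, 60]
P = np.array([[0.80, 0.30, 0.30, 0.30],   # core connects widely
              [0.30, 0.20, 0.02, 0.02],   # modules: internally dense,
              [0.30, 0.02, 0.20, 0.02],   # sparsely connected to each other
              [0.30, 0.02, 0.02, 0.20]])

labels = np.repeat(np.arange(len(sizes)), sizes)
n = labels.size
prob = P[labels][:, labels]               # edge probability for every node pair
A = (rng.random((n, n)) < prob).astype(int)
A = np.triu(A, 1)
A = A + A.T                               # symmetric adjacency, no self-loops

print("nodes:", n, "edges:", A.sum() // 2)
```

Fitting such a model to data (rather than simulating from it, as here) is what the Erdős-Rényi Mixture Model estimation in the paper does.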
Methodology For Evaluation Of Technology Impacts In Space Electric Power Systems
NASA Technical Reports Server (NTRS)
Holda, Julie
2004-01-01
The Analysis and Management branch of the Power and Propulsion Office at NASA Glenn Research Center is responsible for performing complex analyses of the space power and In-Space propulsion products developed by GRC. This work quantifies the benefits of the advanced technologies to support on-going advocacy efforts. The Power and Propulsion Office is committed to understanding how the advancement in space technologies could benefit future NASA missions. They support many diverse projects and missions throughout NASA as well as industry and academia. The area of work that we are concentrating on is space technology investment strategies. Our goal is to develop a Monte-Carlo based tool to investigate technology impacts in space electric power systems. The framework is being developed at this stage, which will be used to set up a computer simulation of a space electric power system (EPS). The outcome is expected to be a probabilistic assessment of critical technologies and potential development issues. We are developing methods for integrating existing spreadsheet-based tools into the simulation tool. Also, work is being done on defining interface protocols to enable rapid integration of future tools. The first task was to evaluate Monte Carlo-based simulation programs for statistical modeling of the EPS model: I decided to learn and evaluate Palisade's @Risk and Risk Optimizer software, and to utilize its capabilities for the Electric Power System (EPS) model. I also looked at similar software packages (JMP, SPSS, Crystal Ball, VenSim, Analytica) available from other suppliers and evaluated them. The second task was to develop the framework for the tool, in which we had to define technology characteristics using weighting factors and probability distributions. We also had to define the simulation space and add hard and soft constraints to the model. The third task is to incorporate (preliminary) cost factors into the model. A final task is developing a cross-platform solution of this framework.
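The sketch below illustrates the general shape of such a Monte-Carlo technology-impact study: sample uncertain technology characteristics from assumed distributions and propagate them to a system-level figure of merit. All names, distributions, loads and formulas are hypothetical placeholders, not the GRC EPS model.

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials = 20_000

# Hypothetical technology characteristics sampled from triangular distributions.
specific_power = rng.triangular(80.0, 110.0, 150.0, n_trials)          # array-level W/kg
battery_specific_energy = rng.triangular(120.0, 160.0, 220.0, n_trials)  # Wh/kg

power_required = 25_000.0      # W, hypothetical sunlit load
eclipse_energy = 12_000.0      # Wh, hypothetical eclipse energy requirement

array_mass = power_required / specific_power
battery_mass = eclipse_energy / battery_specific_energy
eps_mass = array_mass + battery_mass

print(f"EPS mass: median {np.median(eps_mass):.0f} kg, "
      f"95th percentile {np.percentile(eps_mass, 95):.0f} kg")
```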
Feature-based data assimilation in geophysics
NASA Astrophysics Data System (ADS)
Morzfeld, Matthias; Adams, Jesse; Lunderman, Spencer; Orozco, Rafael
2018-05-01
Many applications in science require that computational models and data be combined. In a Bayesian framework, this is usually done by defining likelihoods based on the mismatch of model outputs and data. However, matching model outputs and data in this way can be unnecessary or impossible. For example, using large amounts of steady state data is unnecessary because these data are redundant. It is numerically difficult to assimilate data in chaotic systems. It is often impossible to assimilate data of a complex system into a low-dimensional model. As a specific example, consider a low-dimensional stochastic model for the dipole of the Earth's magnetic field, while other field components are ignored in the model. The above issues can be addressed by selecting features of the data, and defining likelihoods based on the features, rather than by the usual mismatch of model output and data. Our goal is to contribute to a fundamental understanding of such a feature-based approach that allows us to assimilate selected aspects of data into models. We also explain how the feature-based approach can be interpreted as a method for reducing an effective dimension and derive new noise models, based on perturbed observations, that lead to computationally efficient solutions. Numerical implementations of our ideas are illustrated in four examples.
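To make the contrast concrete, the toy Python fragment below compares a likelihood built on the full model-output/data mismatch with one built on a single selected feature (the time-averaged level of a long steady-state record). The model, noise levels and feature choice are invented for illustration; they are not the authors' examples.

```python
import numpy as np

rng = np.random.default_rng(4)

def model(theta, n=500):
    """Toy steady-state model: noisy constant level set by parameter theta."""
    return theta + 0.5 * rng.standard_normal(n)

data = model(theta=2.0)

def loglik_direct(theta, sigma=0.5):
    """Mismatch of every model output against every datum (redundant for steady-state data)."""
    resid = data - theta
    return -0.5 * np.sum((resid / sigma) ** 2)

def loglik_feature(theta, sigma_feat=0.05):
    """Likelihood defined on one feature of the data: its time-averaged level."""
    return -0.5 * ((data.mean() - theta) / sigma_feat) ** 2

thetas = np.linspace(1.5, 2.5, 201)
best_direct = thetas[np.argmax([loglik_direct(t) for t in thetas])]
best_feature = thetas[np.argmax([loglik_feature(t) for t in thetas])]
print(f"direct fit: {best_direct:.3f}, feature-based fit: {best_feature:.3f}")
```

Both routes recover essentially the same parameter here, which is the point: the 500 redundant steady-state values carry little more information than the single feature, so assimilating the feature can be much cheaper.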
2010-06-01
autonomic and pain functions, and facilitating/inhibiting voluntary movements. The external segment of the globus pallidus (globus pallidus externa, GPe...or less responsive to pain stimuli. 1.2.4. Other cortico-basal ganglia loops Alexander, Strick and colleagues have additionally defined a number of... orofacial loop and loops through inferotemporal and posterior parietal cortical areas have also been defined. 1.2.5. Interactions between loops Once
Norris, Jill M; White, Deborah E; Nowell, Lorelli; Mrklas, Kelly; Stelfox, Henry T
2017-08-01
Engaging stakeholders from varied organizational levels is essential to successful healthcare quality improvement. However, engagement has been hard to achieve and to measure across diverse stakeholders. Further, current implementation science models provide little clarity about what engagement means, despite its importance. The aim of this study was to understand how stakeholders of healthcare improvement initiatives defined engagement. Participants (n = 86) in this qualitative thematic study were purposively sampled for individual interviews. Participants included leaders, core members, frontline clinicians, support personnel, and other stakeholders of Strategic Clinical Networks in Alberta Health Services, a Canadian provincial health system with over 108,000 employees. We used an iterative thematic approach to analyze participants' responses to the question, "How do you define engagement?" Regardless of their organizational role, participants defined engagement through three interrelated themes. First, engagement was active participation from willing and committed stakeholders, with levels that ranged from information sharing to full decision-making. Second, engagement centered on a shared decision-making process about meaningful change for everyone "around the table," those who are most impacted. Third, engagement was two-way interactions that began early in the change process, where exchanges were respectful and all stakeholders felt heard and understood. This study highlights the commonalities of how stakeholders in a large healthcare system defined engagement-a shared understanding and terminology-to guide and improve stakeholder engagement. Overall, engagement was an active and committed decision-making about a meaningful problem through respectful interactions and dialog where everyone's voice is considered. Our results may be used in conjunction with current implementation models to provide clarity about what engagement means and how to engage various stakeholders.
On the Way to Appropriate Model Complexity
NASA Astrophysics Data System (ADS)
Höge, M.
2016-12-01
When statistical models are used to represent natural phenomena they are often too simple or too complex - this is known. But what exactly is model complexity? Among many other definitions, the complexity of a model can be conceptualized as a measure of statistical dependence between observations and parameters (Van der Linde, 2014). However, several issues remain when working with model complexity: A unique definition for model complexity is missing. Assuming a definition is accepted, how can model complexity be quantified? How can we use a quantified complexity to improve modeling? Generally defined, "complexity is a measure of the information needed to specify the relationships between the elements of organized systems" (Bawden & Robinson, 2015). The complexity of a system changes as the knowledge about the system changes. For models this means that complexity is not a static concept: With more data or higher spatio-temporal resolution of parameters, the complexity of a model changes. There are essentially three categories into which all commonly used complexity measures can be classified: (1) An explicit representation of model complexity as "Degrees of freedom" of a model, e.g. effective number of parameters. (2) Model complexity as code length, a.k.a. "Kolmogorov complexity": The longer the shortest model code, the higher its complexity (e.g. in bits). (3) Complexity defined via information entropy of parametric or predictive uncertainty. Preliminary results show that Bayes' theorem allows for incorporating all parts of the non-static concept of model complexity like data quality and quantity or parametric uncertainty. Therefore, we test how different approaches for measuring model complexity perform in comparison to a fully Bayesian model selection procedure. Ultimately, we want to find a measure that helps to assess the most appropriate model.
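Two of the standard measures alluded to in categories (1) and (3) are written out below in their common textbook forms, as a reminder; these are not the author's own definitions.

```latex
% Effective number of parameters (DIC) and posterior entropy, standard forms.
% D(\theta) = -2 \log p(y \mid \theta) is the deviance.
\begin{align}
  p_D &= \overline{D(\theta)} - D(\bar{\theta})
  && \text{effective number of parameters,} \\
  H(\theta \mid y) &= -\int p(\theta \mid y)\, \log p(\theta \mid y)\, d\theta
  && \text{entropy of the posterior parameter distribution.}
\end{align}
```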
Analysis of GaAs and Si solar energy hybrid systems
NASA Technical Reports Server (NTRS)
Heinbockel, J. H.; Roberts, A. S., Jr.
1977-01-01
Various silicon hybrid systems are modeled and compared with a gallium arsenide hybrid system. The hybrid systems modeled produce electric power and also thermal power which can be used for heating or air conditioning. Various performance indices are defined and used to compare the system performance: capital cost per electric power out; capital cost per total power out; capital cost per electric power plus mechanical power; annual cost per annual electric energy; and annual cost per annual electric energy plus annual mechanical work. These performance indices indicate that concentrator hybrid systems can be cost effective when compared with present day energy costs.
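The performance indices listed above are simple cost ratios; the Python fragment below evaluates them with invented cost and power numbers purely to show how they are formed (the study's actual values are not reproduced here).

```python
# Hypothetical inputs for illustration only.
capital_cost = 150_000.0            # $ installed cost
annual_cost = 18_000.0              # $/yr annualized cost
electric_power = 12.0               # kW average electric output
thermal_power = 30.0                # kW average thermal output
mechanical_power = 5.0              # kW mechanical output
annual_electric_energy = 40_000.0   # kWh/yr
annual_mechanical_energy = 15_000.0 # kWh/yr

indices = {
    "capital cost / electric power": capital_cost / electric_power,
    "capital cost / total power": capital_cost / (electric_power + thermal_power),
    "capital cost / (electric + mechanical power)":
        capital_cost / (electric_power + mechanical_power),
    "annual cost / annual electric energy": annual_cost / annual_electric_energy,
    "annual cost / (electric + mechanical energy)":
        annual_cost / (annual_electric_energy + annual_mechanical_energy),
}
for name, value in indices.items():
    print(f"{name}: {value:,.2f}")
```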
On domain modelling of the service system with its application to enterprise information systems
NASA Astrophysics Data System (ADS)
Wang, J. W.; Wang, H. F.; Ding, J. L.; Furuta, K.; Kanno, T.; Ip, W. H.; Zhang, W. J.
2016-01-01
Information systems are a kind of service system and run throughout every element of a modern industrial and business system, much like blood in our body. Types of information systems are heterogeneous because of extreme uncertainty in changes in modern industrial and business systems. To effectively manage information systems, modelling of the work domain (or domain) of information systems is necessary. In this paper, a domain modelling framework for the service system is proposed and its application to the enterprise information system is outlined. The framework is defined based on application of a general domain modelling tool called function-context-behaviour-principle-state-structure (FCBPSS). The FCBPSS is based on a set of core concepts, namely function, context, behaviour, principle, state and structure, together with system decomposition. Different from many other applications of FCBPSS in systems engineering, the FCBPSS is applied to both infrastructure and substance systems, which is novel and effective for modelling service systems including enterprise information systems. It is to be noted that domain modelling of systems (e.g. enterprise information systems) is a key to integration of heterogeneous systems and to coping with unanticipated situations facing systems.
An analysis of electronic document management in oncology care.
Poulter, Thomas; Gannon, Brian; Bath, Peter A
2012-06-01
In this research in progress, a reference model for the use of electronic patient record (EPR) systems in oncology is described. The model, termed CICERO, comprises technical and functional components, and emphasises usability, clinical safety and user acceptance. One of the functional components of the model-an electronic document and records management (EDRM) system-is monitored in the course of its deployment at a leading oncology centre in the UK. Specifically, the user requirements and design of the EDRM solution are described. The study is interpretative and forms part of a wider research programme to define and validate the CICERO model. Preliminary conclusions confirm the importance of a socio-technical perspective in Onco-EPR system design.
Building Excellence in Project Execution: Integrated Project Management
2015-04-30
…Systems Center Pacific (SSC Pacific) is addressing this challenge by adopting and refining the CMMI Model and building the tenets of integrated project management (IPM) into project planning and execution… successfully managing stakeholder expectations and meeting requirements. Under the Capability Maturity Model Integration (CMMI), IPM is defined as…
An expert system for water quality modelling.
Booty, W G; Lam, D C; Bobba, A G; Wong, I; Kay, D; Kerby, J P; Bowen, G S
1992-12-01
The RAISON-micro (Regional Analysis by Intelligent System ON a micro-computer) expert system is being used to predict the effects of mine effluents on receiving waters in Ontario. The potential of this system to assist regulatory agencies and mining industries to define more acceptable effluent limits was shown in an initial study. This system has been further developed so that the expert system helps the model user choose the most appropriate model for a particular application from a hierarchy of models. The system currently contains seven models which range from steady state to time dependent models, for both conservative and nonconservative substances in rivers and lakes. The menu driven expert system prompts the model user for information such as the nature of the receiving water system, the type of effluent being considered, and the range of background data available for use as input to the models. The system can also be used to determine the nature of the environmental conditions at the site which are not available in the textual information database, such as the components of river flow. Applications of the water quality expert system are presented for representative mine sites in the Timmins area of Ontario.
Agent-based models in translational systems biology
An, Gary; Mi, Qi; Dutta-Moscato, Joyeeta; Vodovotz, Yoram
2013-01-01
Effective translational methodologies for knowledge representation are needed in order to make strides against the constellation of diseases that affect the world today. These diseases are defined by their mechanistic complexity, redundancy, and nonlinearity. Translational systems biology aims to harness the power of computational simulation to streamline drug/device design, simulate clinical trials, and eventually to predict the effects of drugs on individuals. The ability of agent-based modeling to encompass multiple scales of biological process as well as spatial considerations, coupled with an intuitive modeling paradigm, suggests that this modeling framework is well suited for translational systems biology. This review describes agent-based modeling and gives examples of its translational applications in the context of acute inflammation and wound healing. PMID:20835989
UH-60A Black Hawk engineering simulation program. Volume 1: Mathematical model
NASA Technical Reports Server (NTRS)
Howlett, J. J.
1981-01-01
A nonlinear mathematical model of the UH-60A Black Hawk helicopter was developed. This mathematical model, which was based on the Sikorsky General Helicopter (Gen Hel) Flight Dynamics Simulation, provides NASA with an engineering simulation for performance and handling qualities evaluations. This mathematical model is a total systems definition of the Black Hawk helicopter represented at a uniform level of sophistication considered necessary for handling qualities evaluations. The model is a total force, large angle representation in six rigid body degrees of freedom. Rotor blade flapping, lagging, and hub rotational degrees of freedom are also represented. In addition to the basic helicopter modules, supportive modules were defined for the landing interface, power unit, ground effects, and gust penetration. Information defining the cockpit environment relevant to pilot in the loop simulation is presented.
Quantum thermodynamics of the resonant-level model with driven system-bath coupling
NASA Astrophysics Data System (ADS)
Haughian, Patrick; Esposito, Massimiliano; Schmidt, Thomas L.
2018-02-01
We study nonequilibrium thermodynamics in a fermionic resonant-level model with arbitrary coupling strength to a fermionic bath, taking the wide-band limit. In contrast to previous theories, we consider a system where both the level energy and the coupling strength depend explicitly on time. We find that, even in this generalized model, consistent thermodynamic laws can be obtained, up to the second order in the drive speed, by splitting the coupling energy symmetrically between system and bath. We define observables for the system energy, work, heat, and entropy, and calculate them using nonequilibrium Green's functions. We find that the observables fulfill the laws of thermodynamics, and connect smoothly to the known equilibrium results.
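For orientation, a generic form of the driven resonant-level setup and of the symmetric splitting of the coupling energy mentioned above is sketched below; the notation is illustrative and not copied from the paper.

```latex
% Hedged sketch: time-dependent level energy and coupling, wide-band fermionic bath.
\begin{align}
  H(t) &= \epsilon_d(t)\, d^{\dagger} d
        + \sum_k \epsilon_k\, c_k^{\dagger} c_k
        + \sum_k \left[ V_k(t)\, d^{\dagger} c_k + \text{h.c.} \right], \\
  E_S(t) &= \big\langle \epsilon_d(t)\, d^{\dagger} d \big\rangle
          + \tfrac{1}{2} \big\langle H_C(t) \big\rangle,
\end{align}
% where H_C is the coupling term and the factor 1/2 implements the symmetric
% splitting of the coupling energy between system and bath.
```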
Discrete event simulation tool for analysis of qualitative models of continuous processing systems
NASA Technical Reports Server (NTRS)
Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)
1990-01-01
An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
Space station onboard propulsion system: Technology study
NASA Technical Reports Server (NTRS)
Mcallister, J. G.; Rudland, R. S.; Redd, L. R.; Beekman, D. H.; Cuffin, S. M.; Beer, C. M.; Mccarthy, K. K.
1987-01-01
The objective was to prepare for the design of the space station propulsion system. Propulsion system concepts were defined and schematics were developed for the most viable concepts. A dual-mode bipropellant system was found to deliver the largest amount of payload. However, when resupply is considered, an electrolysis system with 10 percent accumulators requires less resupply propellant, though it is penalized by the amount of time required to fill the accumulators and the power requirements for the electrolyzer. A computer simulation was prepared, which was originally intended to simulate the water electrolysis propulsion system but which was expanded to model other types of systems such as cold gas, monopropellant and bipropellant storable systems.
An Adaptive Technique for a Redundant-Sensor Navigation System. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Chien, T. T.
1972-01-01
An on-line adaptive technique is developed to provide a self-contained redundant-sensor navigation system with a capability to utilize its full potentiality in reliability and performance. The gyro navigation system is modeled as a Gauss-Markov process, with degradation modes defined as changes in characteristics specified by parameters associated with the model. The adaptive system is formulated as a multistage stochastic process: (1) a detection system, (2) an identification system and (3) a compensation system. It is shown that the sufficient statistics for the partially observable process in the detection and identification system is the posterior measure of the state of degradation, conditioned on the measurement history.
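A generic first-order Gauss-Markov error model of the kind the abstract refers to is written out below in its standard form; this is the usual textbook parameterization, not necessarily the thesis' exact one.

```latex
% Standard first-order Gauss-Markov (exponentially correlated) noise model.
\begin{align}
  \dot{x}(t) &= -\frac{1}{\tau}\, x(t) + w(t), \qquad
  E\big[w(t)\,w(t')\big] = q\,\delta(t - t'), \\
  E\big[x(t)\,x(t+\Delta)\big] &= \sigma^2 e^{-|\Delta|/\tau}, \qquad
  \sigma^2 = \frac{q\,\tau}{2},
\end{align}
% Degradation modes can then be represented as changes in the parameters (tau, q).
```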
Advantages and Disadvantages of Health Care Accreditation Models.
Tabrizi, Jafar S; Gharibi, Farid; Wilson, Andrew J
2011-01-01
This systematic review seeks to define the general advantages and disadvantages of accreditation programs to assist in choosing the most appropriate approach. Systematic search of SID, Ovid Medline & PubMed databases was conducted by the keywords of accreditation, hospital, medical practice, clinic, accreditation models, health care and Persian meanings. From 2379 initial articles, 83 articles met the full inclusion criteria. From initial analysis, 23 attributes were identified which appeared to define advantages and disadvantages of different accreditation approaches and the available systems were compared on these. Six systems were identified in the international literature including the JCAHO from USA, the Canadian program of CCHSA, and the accreditation programs of UK, Australia, New Zealand and France. The main distinguishing attributes among them were: quality improvement, patient and staff safety, improving health services integration, public's confidence, effectiveness and efficiency of health services, innovation, influence global standards, information management, breadth of activity, history, effective relationship with stakeholders, agreement with AGIL attributes and independence from government. Based on 23 attributes of comprehensive accreditation systems we have defined from a systematic review, the JCAHO accreditation program of USA and then CCHSA of Canada offered the most comprehensive systems with the least disadvantages. Other programs such as the ACHS of Australia, ANAES of France, QHNZ of New Zealand and UK accreditation programs were fairly comparable according to these criteria. However the decision for any country or health system should be based on an assessment weighing up their specific objectives and needs.
BioModels.net Web Services, a free and integrated toolkit for computational modelling software.
Li, Chen; Courtot, Mélanie; Le Novère, Nicolas; Laibe, Camille
2010-05-01
Exchanging and sharing scientific results are essential for researchers in the field of computational modelling. BioModels.net defines agreed-upon standards for model curation. A fundamental one, MIRIAM (Minimum Information Requested in the Annotation of Models), standardises the annotation and curation process of quantitative models in biology. To support this standard, MIRIAM Resources maintains a set of standard data types for annotating models, and provides services for manipulating these annotations. Furthermore, BioModels.net creates controlled vocabularies, such as SBO (Systems Biology Ontology) which strictly indexes, defines and links terms used in Systems Biology. Finally, BioModels Database provides a free, centralised, publicly accessible database for storing, searching and retrieving curated and annotated computational models. Each resource provides a web interface to submit, search, retrieve and display its data. In addition, the BioModels.net team provides a set of Web Services which allows the community to programmatically access the resources. A user is then able to perform remote queries, such as retrieving a model and resolving all its MIRIAM Annotations, as well as getting the details about the associated SBO terms. These web services use established standards. Communications rely on SOAP (Simple Object Access Protocol) messages and the available queries are described in a WSDL (Web Services Description Language) file. Several libraries are provided in order to simplify the development of client software. BioModels.net Web Services take researchers one step further towards simulating and understanding a biological system in its entirety, by allowing them to retrieve biological models in their own tools, combine queries in workflows and efficiently analyse models.
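As an illustration of the SOAP/WSDL access pattern described above, the sketch below shows how a WSDL-consuming client library can call such a service; the WSDL URL and the operation name are hypothetical placeholders for illustration only, not the actual BioModels.net service interface.

```python
# Hedged sketch: calling a SOAP service described by a WSDL file, using the zeep library.
# The URL and operation name below are hypothetical placeholders, not the real interface.
from zeep import Client

WSDL_URL = "https://example.org/biomodels/services?wsdl"   # placeholder WSDL location

client = Client(WSDL_URL)                                  # parse the WSDL and build typed stubs
result = client.service.getModelById("BIOMD0000000001")    # hypothetical operation name
print(result)
```

The pattern is the same regardless of the client library: parse the WSDL once, then call the generated operations as ordinary functions.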
An industrial information integration approach to in-orbit spacecraft
NASA Astrophysics Data System (ADS)
Du, Xiaoning; Wang, Hong; Du, Yuhao; Xu, Li Da; Chaudhry, Sohail; Bi, Zhuming; Guo, Rong; Huang, Yongxuan; Li, Jisheng
2017-01-01
To operate an in-orbit spacecraft, the spacecraft status has to be monitored autonomously by collecting and analysing real-time data, and then detecting abnormalities and malfunctions of system components. To develop an information system for spacecraft state detection, we investigate the feasibility of using ontology-based artificial intelligence in the system development. We propose a new modelling technique based on the semantic web, agent, scenarios and ontologies model. In modelling, the subjects of astronautics fields are classified, corresponding agents and scenarios are defined, and they are connected by the semantic web to analyse data and detect failures. We introduce the modelling methodologies and the resulting framework of the status detection information system in this paper. We discuss system components as well as their interactions in detail. The system has been prototyped and tested to illustrate its feasibility and effectiveness. The proposed modelling technique is generic and can be extended and applied to the system development of other large-scale and complex information systems.
NASA Astrophysics Data System (ADS)
Vasquez, D. A.; Swift, J. N.; Tan, S.; Darrah, T. H.
2013-12-01
The integration of precise geochemical analyses with quantitative engineering modeling into an interactive GIS system allows for a sophisticated and efficient method of reservoir engineering and characterization. Geographic Information Systems (GIS) is utilized as an advanced technique for oil field reservoir analysis by combining field engineering and geological/geochemical spatial datasets with the available systematic modeling and mapping methods to integrate the information into a spatially correlated first-hand approach in defining surface and subsurface characteristics. Three key methods of analysis include: 1) Geostatistical modeling to create a static and volumetric 3-dimensional representation of the geological body, 2) Numerical modeling to develop a dynamic and interactive 2-dimensional model of fluid flow across the reservoir and 3) Noble gas geochemistry to further define the physical conditions, components and history of the geologic system. Results thus far include using engineering algorithms for interpolating electrical well log properties across the field (spontaneous potential, resistivity), yielding a highly accurate and high-resolution 3D model of rock properties. Results so far also include using numerical finite difference methods (Crank-Nicolson) to solve the equations describing the distribution of pressure across the field, yielding a 2D simulation model of fluid flow across the reservoir. Ongoing noble gas geochemistry results will also include determination of the source, thermal maturity and the extent/style of fluid migration (connectivity, continuity and directionality). Future work will include developing an inverse engineering algorithm to model for permeability, porosity and water saturation. This combination of new and efficient technological and analytical capabilities is geared to provide a better understanding of the field geology and hydrocarbon dynamics system, with applications to determine the presence of hydrocarbon pay zones (or other reserves) and improve oil field management (e.g. perforating, drilling, EOR and reserves estimation).
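The pressure simulation described above relies on the Crank-Nicolson finite-difference scheme. A minimal sketch of that scheme for a one-dimensional diffusion-type pressure equation is shown below; the grid size, diffusivity and boundary pressures are illustrative assumptions, not values from the study.

```python
import numpy as np

# Crank-Nicolson sketch for p_t = D * p_xx on a 1D grid (illustrative parameters only).
nx, nt = 51, 200
dx, dt, D = 1.0, 0.5, 0.8
r = D * dt / (2.0 * dx**2)

p = np.zeros(nx)
p[0], p[-1] = 1.0, 0.0                      # fixed boundary pressures (e.g. injector/producer)

# (I - r*L) p^{n+1} = (I + r*L) p^n, with L the second-difference operator.
A = (1 + 2*r) * np.eye(nx) - r * np.eye(nx, k=1) - r * np.eye(nx, k=-1)
B = (1 - 2*r) * np.eye(nx) + r * np.eye(nx, k=1) + r * np.eye(nx, k=-1)
for M in (A, B):                            # Dirichlet boundaries: keep end values fixed
    M[0, :], M[-1, :] = 0.0, 0.0
    M[0, 0], M[-1, -1] = 1.0, 1.0

for _ in range(nt):
    p = np.linalg.solve(A, B @ p)           # implicit, unconditionally stable update
```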
Maximally Expressive Task Modeling
NASA Technical Reports Server (NTRS)
Japp, John; Davis, Elizabeth; Maxwell, Theresa G. (Technical Monitor)
2002-01-01
Planning and scheduling systems organize "tasks" into a timeline or schedule. The tasks are defined within the scheduling system in logical containers called models. The dictionary might define a model of this type as "a system of things and relations satisfying a set of rules that, when applied to the things and relations, produce certainty about the tasks that are being modeled." One challenging domain for a planning and scheduling system is the operation of on-board experiment activities for the Space Station. The equipment used in these experiments is some of the most complex hardware ever developed by mankind, the information sought by these experiments is at the cutting edge of scientific endeavor, and the procedures for executing the experiments are intricate and exacting. Scheduling is made more difficult by a scarcity of space station resources. The models to be fed into the scheduler must describe both the complexity of the experiments and procedures (to ensure a valid schedule) and the flexibilities of the procedures and the equipment (to effectively utilize available resources). Clearly, scheduling space station experiment operations calls for a "maximally expressive" modeling schema. Modeling even the simplest of activities cannot be automated; no sensor can be attached to a piece of equipment that can discern how to use that piece of equipment; no camera can quantify how to operate a piece of equipment. Modeling is a human enterprise-both an art and a science. The modeling schema should allow the models to flow from the keyboard of the user as easily as works of literature flowed from the pen of Shakespeare. The Ground Systems Department at the Marshall Space Flight Center has embarked on an effort to develop a new scheduling engine that is highlighted by a maximally expressive modeling schema. This schema, presented in this paper, is a synergy of technological advances and domain-specific innovations.
TRIM.FaTE Public Reference Library Documentation
TRIM.FaTE is a spatially explicit, compartmental mass balance model that describes the movement and transformation of pollutants over time, through a user-defined, bounded system that includes both biotic and abiotic compartments.
Structuralism and Its Heuristic Implications.
ERIC Educational Resources Information Center
Greene, Ruth M.
1984-01-01
The author defines structuralism (a method for modeling and analyzing event systems in a space-time framework), traces its origins to the work of J. Piaget and M. Foucault, and discusses its implications for learning. (CL)
ERIC Educational Resources Information Center
Rogers, Pat
1972-01-01
Criteria for a reasonable axiomatic system are discussed. A discussion of the historical attempts to prove the independence of Euclid's parallel postulate introduces non-Euclidean geometries. Poincare's model for a non-Euclidean geometry is defined and analyzed. (LS)
A Framework for Curriculum Research.
ERIC Educational Resources Information Center
Kimpston, Richard D.; Rogers, Karen B.
1986-01-01
A framework for generating curriculum research is proposed from a synthesis of Dunkin and Biddle's model of teaching variables with Beauchamp's "curriculum system" planning functions. The framework systematically defines variables that delineate curriculum planning processes. (CJH)
Model verification of mixed dynamic systems. [POGO problem in liquid propellant rockets]
NASA Technical Reports Server (NTRS)
Chrostowski, J. D.; Evensen, D. A.; Hasselman, T. K.
1978-01-01
A parameter-estimation method is described for verifying the mathematical model of mixed (combined interactive components from various engineering fields) dynamic systems against pertinent experimental data. The model verification problem is divided into two separate parts: defining a proper model and evaluating the parameters of that model. The main idea is to use differences between measured and predicted behavior (response) to adjust automatically the key parameters of a model so as to minimize response differences. To achieve the goal of modeling flexibility, the method combines the convenience of automated matrix generation with the generality of direct matrix input. The equations of motion are treated in first-order form, allowing for nonsymmetric matrices, modeling of general networks, and complex-mode analysis. The effectiveness of the method is demonstrated for an example problem involving a complex hydraulic-mechanical system.
Avalanches and scaling collapse in the large-N Kuramoto model
NASA Astrophysics Data System (ADS)
Coleman, J. Patrick; Dahmen, Karin A.; Weaver, Richard L.
2018-04-01
We study avalanches in the Kuramoto model, defined as excursions of the order parameter due to ephemeral episodes of synchronization. We present scaling collapses of the avalanche sizes, durations, heights, and temporal profiles, extracting scaling exponents, exponent relations, and scaling functions that are shown to be consistent with the scaling behavior of the power spectrum, a quantity independent of our particular definition of an avalanche. A comprehensive scaling picture of the noise in the subcritical finite-N Kuramoto model is developed, linking this undriven system to a larger class of driven avalanching systems.
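For orientation, the standard form of the Kuramoto model and its order parameter, whose excursions define the avalanches studied here, can be written as follows (a textbook formulation, not quoted from the paper):

```latex
\dot{\theta}_i = \omega_i + \frac{K}{N}\sum_{j=1}^{N}\sin\!\left(\theta_j - \theta_i\right),
\qquad
r(t)\,e^{i\psi(t)} = \frac{1}{N}\sum_{j=1}^{N} e^{i\theta_j(t)} .
```

Here r(t) is the order parameter; an avalanche in the sense of the abstract is an excursion of r(t) during an ephemeral episode of synchronization.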
A UML-based ontology for describing hospital information system architectures.
Winter, A; Brigl, B; Wendt, T
2001-01-01
To control the heterogeneity inherent in hospital information systems, information management needs appropriate hospital information system modeling methods or techniques. This paper shows that, for several reasons, available modeling approaches are not able to answer relevant questions of information management. To overcome this major deficiency we offer a UML-based ontology for describing hospital information system architectures. This ontology distinguishes three layers: the domain layer, the logical tool layer, and the physical tool layer, and defines the relevant components at each. The relations between these components, especially between components of different layers, make it possible to answer our information management questions.
An ontology for component-based models of water resource systems
NASA Astrophysics Data System (ADS)
Elag, Mostafa; Goodall, Jonathan L.
2013-08-01
Component-based modeling is an approach for simulating water resource systems where a model is composed of a set of components, each with a defined modeling objective, interlinked through data exchanges. Component-based modeling frameworks are used within the hydrologic, atmospheric, and earth surface dynamics modeling communities. While these efforts have been advancing, it has become clear that the water resources modeling community in particular, and arguably the larger earth science modeling community as well, faces a challenge of fully and precisely defining the metadata for model components. The lack of a unified framework for model component metadata limits interoperability between modeling communities and the reuse of models across modeling frameworks due to ambiguity about the model and its capabilities. To address this need, we propose an ontology for water resources model components that describes core concepts and relationships using the Web Ontology Language (OWL). The ontology that we present, which is termed the Water Resources Component (WRC) ontology, is meant to serve as a starting point that can be refined over time through engagement by the larger community until a robust knowledge framework for water resource model components is achieved. This paper presents the methodology used to arrive at the WRC ontology, the WRC ontology itself, and examples of how the ontology can aid in component-based water resources modeling by (i) assisting in identifying relevant models, (ii) encouraging proper model coupling, and (iii) facilitating interoperability across earth science modeling frameworks.
Measures and Metrics of Information Processing in Complex Systems: A Rope of Sand
ERIC Educational Resources Information Center
James, Ryan Gregory
2013-01-01
How much information do natural systems store and process? In this work we attempt to answer this question in multiple ways. We first establish a mathematical framework where natural systems are represented by a canonical form of edge-labeled hidden Markov models called e-machines. Then, utilizing this framework, a variety of measures are defined and…
Ecological Systems Theory: Using Spheres of Influence to Support Small-unit Climate and Training
2016-03-01
...identifying the model's elements and influential individuals, defining spheres of influence, and constructing a model that details the ecological systems ... Research Report 1997: Ecological Systems Theory: Using Spheres of Influence to Support Small-unit Climate and Training. Technical review by Sena Garven, U.S. Army Research Institute, and Michael D. Wood, Walter Reed Army Institute of Research.
Information Sharing for Computing Trust Metrics on COTS Electronic Components
2008-09-01
This report reviews standard software development life cycle (SDLC) models used in the development of a system; there are many well-known SDLC models, the most popular of which are Waterfall, V-shaped, Spiral and Agile, applied to the SDLC or to the software and hardware distribution chain. It then defines Jøsang's model, in which "opinions" are expressed mathematically.
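For context, Jøsang's subjective-logic opinions are commonly written as the tuple below; this is the standard textbook form, not necessarily the exact notation of the report:

```latex
\omega_x = (b_x,\; d_x,\; u_x,\; a_x), \qquad b_x + d_x + u_x = 1, \qquad
\mathrm{E}(\omega_x) = b_x + a_x\, u_x ,
```

where b_x, d_x and u_x are the belief, disbelief and uncertainty masses about a proposition x, and a_x is the base rate used to compute the expected probability E.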
Enabling interoperability in planetary sciences and heliophysics: The case for an information model
NASA Astrophysics Data System (ADS)
Hughes, J. Steven; Crichton, Daniel J.; Raugh, Anne C.; Cecconi, Baptiste; Guinness, Edward A.; Isbell, Christopher E.; Mafi, Joseph N.; Gordon, Mitchell K.; Hardman, Sean H.; Joyner, Ronald S.
2018-01-01
The Planetary Data System has developed the PDS4 Information Model to enable interoperability across diverse science disciplines. The Information Model is based on an integration of International Organization for Standardization (ISO) level standards for trusted digital archives, information model development, and metadata registries. Whereas controlled vocabularies provide a basic level of interoperability by providing a common set of terms for communication between both machines and humans, the Information Model improves interoperability by means of an ontology that provides semantic information or additional related context for the terms. The information model was defined by a team of computer scientists and science experts from each of the diverse disciplines in the Planetary Science community, including Atmospheres, Geosciences, Cartography and Imaging Sciences, Navigational and Ancillary Information, Planetary Plasma Interactions, Ring-Moon Systems, and Small Bodies. The model was designed to be extensible beyond the Planetary Science community; for example, there are overlaps between certain PDS disciplines and the Heliophysics and Astrophysics disciplines. "Interoperability" can apply to many aspects of both the developer and the end-user experience, for example agency-to-agency, semantic level, and application level interoperability. We define these types of interoperability and focus on semantic level interoperability, the type of interoperability most directly enabled by an information model.
LOX/hydrocarbon auxiliary propulsion system study
NASA Technical Reports Server (NTRS)
Orton, G. F.; Mark, T. D.; Weber, D. D.
1982-01-01
Liquid oxygen/hydrocarbon propulsion systems applicable to a second generation orbiter OMS/RCS were compared, and major system/component options were evaluated. A large number of propellant combinations and system concepts were evaluated. The ground rules were defined in terms of candidate propellants, system/component design options, and design requirements. System and engine component math models were incorporated into existing computer codes for system evaluations. The detailed system evaluations and comparisons were performed to identify the recommended propellant combination and system approach.
OBO to UML: Support for the development of conceptual models in the biomedical domain.
Waldemarin, Ricardo C; de Farias, Cléver R G
2018-04-01
A conceptual model abstractly defines a number of concepts and their relationships for the purposes of understanding and communication. Once a conceptual model is available, it can also be used as a starting point for the development of a software system. The development of conceptual models using the Unified Modeling Language (UML) facilitates the representation of modeled concepts and allows software developers to directly reuse these concepts in the design of a software system. The OBO Foundry represents the most relevant collaborative effort towards the development of ontologies in the biomedical domain. The development of UML conceptual models in the biomedical domain may benefit from the use of domain-specific semantics and notation. Further, the development of these models may also benefit from the reuse of knowledge contained in OBO ontologies. This paper investigates the support for the development of conceptual models in the biomedical domain using UML as a conceptual modeling language and using the support provided by the OBO Foundry for the development of biomedical ontologies, namely the entity kind and relationship type definitions provided by the Basic Formal Ontology (BFO) and the OBO Core Relations Ontology (OBO Core), respectively. Further, the paper investigates the support for the reuse of biomedical knowledge currently available in OBOFFF ontologies in the development of these conceptual models. The paper describes a UML profile for the OBO Core Relations Ontology, which basically defines a number of stereotypes to represent the BFO entity kind and OBO Core relationship type definitions. The paper also presents a support toolset consisting of a graphical editor named OBO-RO Editor, which directly supports the development of UML models using the extensions defined by our profile, and a command-line tool named OBO2UML, which directly converts an OBOFFF ontology into a UML model. Copyright © 2018 Elsevier Inc. All rights reserved.
Adaptive hyperspectral imager: design, modeling, and control
NASA Astrophysics Data System (ADS)
McGregor, Scot; Lacroix, Simon; Monmayrant, Antoine
2015-08-01
An adaptive hyperspectral imager is presented. We propose a system with easily adaptable spectral resolution, adjustable acquisition time, and high spatial resolution that is independent of spectral resolution. The system makes it possible to define a variety of acquisition schemes, in particular near-snapshot acquisitions that may be used to measure the spectral content of given or automatically detected regions of interest. The proposed system is modelled and simulated, and tests on a first prototype validate the approach to achieve near-snapshot spectral acquisitions without resorting to any computationally heavy post-processing or cumbersome calibration.
Bounded Parametric Model Checking for Elementary Net Systems
NASA Astrophysics Data System (ADS)
Knapik, Michał; Szreter, Maciej; Penczek, Wojciech
Bounded Model Checking (BMC) is an efficient verification method for reactive systems. BMC has been applied so far to verification of properties expressed in (timed) modal logics, but never to their parametric extensions. In this paper we show, for the first time, that BMC can be extended to PRTECTL - a parametric extension of the existential version of CTL. To this aim, we define a bounded semantics and a translation from PRTECTL to SAT. The implementation of the algorithm for Elementary Net Systems is presented, together with some experimental results.
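To make the BMC idea concrete, the toy sketch below unrolls a small, invented transition system to a fixed depth and searches for a counterexample path; a real implementation, like the one in the paper, would encode the unrolling as a SAT instance rather than enumerate paths explicitly.

```python
from itertools import product

# Toy transition system (invented example): states, initial states, transitions, bad states.
states = {0, 1, 2, 3}
init = {0}
trans = {(0, 1), (1, 2), (2, 1), (1, 3)}
bad = {3}                                    # property to check: "state 3 is never reached"

def bmc(bound):
    """Search for a counterexample path of length <= bound by explicit enumeration."""
    for k in range(1, bound + 1):
        for path in product(states, repeat=k + 1):
            if (path[0] in init
                    and all((path[i], path[i + 1]) in trans for i in range(k))
                    and path[-1] in bad):
                return path                  # counterexample found within the bound
    return None                              # no violation up to the given bound

print(bmc(4))                                # e.g. (0, 1, 3)
```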
NASA Technical Reports Server (NTRS)
O'Neil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.
2005-01-01
The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.
Three-Loop Automatic Control System of the Landfill of Household Solid Waste
NASA Astrophysics Data System (ADS)
Sereda, T. G.; Kostarev, S. N.
2017-05-01
Models for the control of a municipal solid waste (MSW) landfill are analysed. A distributed (spatio-temporal) landfill control model is considered, and a dynamic multi-loop model of landfill control is developed. Controlled parameters (the ratio of CH4/CO2 emission fluxes, concentrations of heavy-metal ions) and control actions (purging the array, irrigation, adding reagents) are defined. Based on laboratory studies, the flows were analysed and a transfer matrix that takes into account the relationships between control loops was developed. A system of differential equations is formulated in the frequency and time domains, and numerical approaches to solving the system in finite-difference form are given.
Fractional discrete-time consensus models for single- and double-summator dynamics
NASA Astrophysics Data System (ADS)
Wyrwas, Małgorzata; Mozyrska, Dorota; Girejko, Ewa
2018-04-01
The leader-following consensus problem of fractional-order multi-agent discrete-time systems is considered. In the systems, interactions between opinions are defined like in Krause and Cucker-Smale models but the memory is included by taking the fractional-order discrete-time operator on the left-hand side of the nonlinear systems. In this paper, we investigate fractional-order models of opinions for the single- and double-summator dynamics of discrete-time by analytical methods as well as by computer simulations. The necessary and sufficient conditions for the leader-following consensus are formulated by proposing a consensus control law for tracking the virtual leader.
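The fractional-order discrete-time operator on the left-hand side of such systems is typically taken in Grünwald-Letnikov form; one common definition (a standard formulation, not necessarily the authors' exact notation) is:

```latex
\left(\Delta^{\alpha} x\right)(n) = \sum_{j=0}^{n} (-1)^{j} \binom{\alpha}{j}\, x(n-j),
\qquad
\binom{\alpha}{j} = \frac{\alpha(\alpha-1)\cdots(\alpha-j+1)}{j!},
```

so that the memory of the system is carried by the weighted sum over all past states.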
Advanced and secure architectural EHR approaches.
Blobel, Bernd
2006-01-01
Electronic Health Records (EHRs) provided as a lifelong patient record advance towards core applications of distributed and co-operating health information systems and health networks. For meeting the challenge of scalable, flexible, portable, secure EHR systems, the underlying EHR architecture must be based on the component paradigm and model driven, separating platform-independent and platform-specific models. To allow manageable models, real systems must be decomposed and simplified. The resulting modelling approach has to follow the ISO Reference Model - Open Distributed Processing (RM-ODP). The ISO RM-ODP describes any system component from different perspectives. Platform-independent perspectives contain the enterprise view (business process, policies, scenarios, use cases), the information view (classes and associations) and the computational view (composition and decomposition), whereas platform-specific perspectives concern the engineering view (physical distribution and realisation) and the technology view (implementation details from protocols up to education and training) on system components. Those views have to be established for components reflecting aspects of all domains involved in healthcare environments including administrative, legal, medical, technical, etc. Thus, security-related component models reflecting all views mentioned have to be established for enabling both application and communication security services as an integral part of the system's architecture. Besides decomposition and simplification of systems regarding the different viewpoints on their components, different levels of system granularity can be defined, hiding internals or focusing on properties of basic components to form a more complex structure. The resulting models describe both structure and behaviour of component-based systems. The described approach has been deployed in different projects defining EHR systems and their underlying architectural principles. In that context, the Australian GEHR project, the openEHR initiative, the revision of CEN ENV 13606 "Electronic Health Record communication", all based on Archetypes, but also the HL7 version 3 activities are discussed in some detail. The latter include the HL7 RIM, the HL7 Development Framework, HL7's Clinical Document Architecture (CDA), as well as the set of models from use cases, activity diagrams and sequence diagrams up to Domain Information Models (DMIMs) and their building blocks, Common Message Element Types (CMETs), constraining models to their underlying concepts. The future-proof EHR architecture as an open, user-centric, user-friendly, flexible, scalable, portable core application in health information systems and health networks has to follow advanced architectural paradigms.
Preliminary study of the space adaptation of the MELiSSA life support system
NASA Astrophysics Data System (ADS)
Mas-Albaigès, Joan L.; Duatis, Jordi; Podhajsky, Sandra; Guirado, Víctor; Poughon, Laurent
MELiSSA (Micro-Ecological Life Support System Alternative) is a European Space Agency (ESA) project focused on the development of a closed regenerative life support system to aid the development of technologies for future life support systems for long term manned planetary missions, e.g. a lunar base or missions to Mars. In order to understand the potential evolution of the MELiSSA concept towards its future use in the referred manned planetary mission context, the MELiSSA Space Adaptation (MSA) activity has been undertaken. MSA's main objective is to model the different MELiSSA compartments using EcosimPro®, a specialized simulation tool for life support applications, in order to define a preliminary MELiSSA implementation for service in a man-tended lunar base scenario, with a four-member crew rotating in six-month increments, and performing the basic LSS functions of air revitalization, food production, and waste and water recycling. The MELiSSA EcosimPro® model features a dedicated library for the different MELiSSA elements (bioreactors, greenhouse, crew, interconnecting elements, etc.). It is used to dimension the MELiSSA system in terms of major parameters like mass, volume and energy needs, evaluate the accuracy of the results and define the strategy for a progressive loop closure from the initial required performance (approx. 100 The MELiSSA configuration(s) obtained through the EcosimPro® simulation are further analysed using the Advanced Life Support System Evaluation (ALISSE) metric, relying on mass, energy, efficiency, human risk, system reliability and crew time, for trade-off and optimization of results. The outcome of the MSA activity is, thus, a potential Life Support System architecture description, based on combined MELiSSA and other physico-chemical technologies, defining its expected performance, associated operational conditions and logistic needs.
Saha, Krishanu; Mei, Ying; Reisterer, Colin M; Pyzocha, Neena Kenton; Yang, Jing; Muffat, Julien; Davies, Martyn C; Alexander, Morgan R; Langer, Robert; Anderson, Daniel G; Jaenisch, Rudolf
2011-11-15
The current gold standard for the culture of human pluripotent stem cells requires the use of a feeder layer of cells. Here, we develop a spatially defined culture system based on UV/ozone radiation modification of typical cell culture plastics to define a favorable surface environment for human pluripotent stem cell culture. Chemical and geometrical optimization of the surfaces enables control of early cell aggregation from fully dissociated cells, as predicted from a numerical model of cell migration, and results in significant increases in cell growth of undifferentiated cells. These chemically defined xeno-free substrates generate more than three times as many cells per surface area as feeder-containing substrates. Further, reprogramming and typical gene-targeting protocols can be readily performed on these engineered surfaces. These substrates provide an attractive cell culture platform for the production of clinically relevant factor-free reprogrammed cells from patient tissue samples and facilitate the definition of standardized scale-up friendly methods for disease modeling and cell therapeutic applications.
Design and Implementation of a Metadata-rich File System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ames, S; Gokhale, M B; Maltzahn, C
2010-01-19
Despite continual improvements in the performance and reliability of large scale file systems, the management of user-defined file system metadata has changed little in the past decade. The mismatch between the size and complexity of large scale data stores and their ability to organize and query their metadata has led to a de facto standard in which raw data is stored in traditional file systems, while related, application-specific metadata is stored in relational databases. This separation of data and semantic metadata requires considerable effort to maintain consistency and can result in complex, slow, and inflexible system operation. To address these problems, we have developed the Quasar File System (QFS), a metadata-rich file system in which files, user-defined attributes, and file relationships are all first class objects. In contrast to hierarchical file systems and relational databases, QFS defines a graph data model composed of files and their relationships. QFS incorporates Quasar, an XPATH-extended query language for searching the file system. Results from our QFS prototype show the effectiveness of this approach. Compared to the de facto standard, the QFS prototype shows superior ingest performance and comparable query performance on user metadata-intensive operations and superior performance on normal file metadata operations.
Satellite services system analysis study. Volume 1, part 2: Executive summary
NASA Technical Reports Server (NTRS)
1981-01-01
The early mission model was developed through a survey of the potential user market. Service functions were defined and a group of design reference missions were selected which represented needs for each of the service functions. Servicing concepts were developed through mission analysis and STS timeline constraint analysis. The hardware needs for accomplishing the service functions were identified with emphasis being placed on applying equipment in the current NASA inventory and that in advanced stages of planning. A more comprehensive service model was developed based on the NASA and DoD mission models segregated by mission class. The number of service events of each class were estimated based on average revisit and service assumptions. Service Kits were defined as collections of equipment applicable to performing one or more service functions. Preliminary design was carried out on a selected set of hardware needed for early service missions. The organization and costing of the satellite service systems were addressed.
Ogawa, K
1992-01-01
This paper proposes a new evaluation and prediction method for computer usability. This method is based on our two previously proposed information transmission measures created from a human-to-computer information transmission model. The model has three information transmission levels: the device, software, and task content levels. Two measures, called the device independent information measure (DI) and the computer independent information measure (CI), defined on the software and task content levels respectively, are given as the amount of information transmitted. Two information transmission rates are defined as DI/T and CI/T, where T is the task completion time: the device independent information transmission rate (RDI), and the computer independent information transmission rate (RCI). The method utilizes the RDI and RCI rates to evaluate the relative usability of software and device operations on different computer systems. Experiments using three different systems, in this case a graphical information input task, confirm that the method offers an efficient way of determining computer usability.
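Written out, the two rates defined in the abstract are simply:

```latex
R_{DI} = \frac{DI}{T}, \qquad R_{CI} = \frac{CI}{T},
```

where T is the task completion time, so that higher rates indicate more information transmitted per unit time at the software and task content levels, respectively.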
A computer program for thermal radiation from gaseous rocket exhaust plumes (GASRAD)
NASA Technical Reports Server (NTRS)
Reardon, J. E.; Lee, Y. C.
1979-01-01
A computer code is presented for predicting incident thermal radiation from defined plume gas properties in either axisymmetric or cylindrical coordinate systems. The radiation model is a statistical band model for exponential line strength distribution with Lorentz/Doppler line shapes for 5 gaseous species (H2O, CO2, CO, HCl and HF) and an approximate (non-scattering) treatment of carbon particles. The Curtis-Godson approximation is used for inhomogeneous gases, but a subroutine is available for using Young's intuitive derivative method for H2O with Lorentz line shape and exponentially-tailed-inverse line strength distribution. The geometry model provides integration over a hemisphere with up to 6 individually oriented identical axisymmetric plumes or a single 3-D plume. Shading surfaces may be used in any of 7 shapes, and a conical limit may be defined for the plume to set individual line-of-sight limits. Intermediate coordinate systems may be specified to simplify input of plumes and shading surfaces.
Enhancing metaproteomics-The value of models and defined environmental microbial systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herbst, Florian-Alexander; Lünsmann, Vanessa; Kjeldal, Henrik
Metaproteomics, the large-scale characterization of the entire protein complement of environmental microbiota at a given point in time, has provided new features to study complex microbial communities in order to unravel these black boxes. Some new technical challenges arose that were not an issue for classical proteome analytics before, and these could be tackled by the application of different model systems. Here, we review different current and future model systems for metaproteome analysis. We introduce model systems for clinical and biotechnological research questions including acid mine drainage, anaerobic digesters, and activated sludge, following a short introduction to microbial communities and metaproteomics. Model systems are useful to evaluate the challenges encountered within (but not limited to) metaproteomics, including species complexity and coverage, biomass availability, or reliable protein extraction. Moreover, the implementation of model systems can be considered as a step forward to better understand microbial community responses and ecological functions of single member organisms. In the future, improvements are necessary to fully explore complex environmental systems by metaproteomics.
Enhancing metaproteomics-The value of models and defined environmental microbial systems
Herbst, Florian-Alexander; Lünsmann, Vanessa; Kjeldal, Henrik; ...
2016-01-21
Metaproteomics, the large-scale characterization of the entire protein complement of environmental microbiota at a given point in time, has provided new features to study complex microbial communities in order to unravel these black boxes. Some new technical challenges arose that were not an issue for classical proteome analytics before, and these could be tackled by the application of different model systems. Here, we review different current and future model systems for metaproteome analysis. We introduce model systems for clinical and biotechnological research questions including acid mine drainage, anaerobic digesters, and activated sludge, following a short introduction to microbial communities and metaproteomics. Model systems are useful to evaluate the challenges encountered within (but not limited to) metaproteomics, including species complexity and coverage, biomass availability, or reliable protein extraction. Moreover, the implementation of model systems can be considered as a step forward to better understand microbial community responses and ecological functions of single member organisms. In the future, improvements are necessary to fully explore complex environmental systems by metaproteomics.
The ISO Edi Conceptual Model Activity and Its Relationship to OSI.
ERIC Educational Resources Information Center
Fincher, Judith A.
1990-01-01
The EDI conceptual model is being developed to define common structures, services, and processes that syntax-specific standards like X12 and EDIFACT could adopt. Open Systems Interconnection (OSI) is of interest to EDI because of its potential to help enable global interoperability across Electronic Data Interchange (EDI) functional groups. A…
Academic Work from a Comparative Perspective: A Survey of Faculty Working Time across 13 Countries
ERIC Educational Resources Information Center
Bentley, Peter James; Kyvik, Svein
2012-01-01
Sociological institutional theory views universities as model driven organizations. The world's stratification system promotes conformity, imitation and isomorphism towards the "best" university models. Accordingly, academic roles may be locally shaped in minor ways, but are defined and measured explicitly in global terms. We test this proposition…
Use a Building Learning Center Enrichment Program to Meet Needs of Gifted/Talented.
ERIC Educational Resources Information Center
Schurr, Sandra
The paper describes the Learning Center Enrichment Program for elementary school gifted and talented children. The nomenclature associated with the program model (learning center, enrichment, and management system) is defined; and it is explained that the program is organized according to the enrichment triad model advocated by J. Renzulli because…
HOW TO MODEL HYDRODYNAMICS AND RESIDENCE TIMES OF 27 ESTUARIES IN 4 MONTHS
The hydrodynamics and residence times of 27 embayments were modeled during the first year of a project whose goal is to define the relation between nitrogen loadings and ecological responses of 44 systems that range from small to the size of Narragansett Bay and Buzzards Bay. The...
In Search of the Nordic Model in Education
ERIC Educational Resources Information Center
Antikainen, Ari
2006-01-01
The Nordic model of education is defined in this article as an attempt to construct a national education system on the foundation of specific local values and practices, but at the same time subject to international influences. According to the author, equity, participation, and welfare are the major goals and the publicly funded comprehensive…
Derivation and definition of a linear aircraft model
NASA Technical Reports Server (NTRS)
Duke, Eugene L.; Antoniewicz, Robert F.; Krambeer, Keith D.
1988-01-01
A linear aircraft model for a rigid aircraft of constant mass flying over a flat, nonrotating earth is derived and defined. The derivation makes no assumptions of reference trajectory or vehicle symmetry. The linear system equations are derived and evaluated along a general trajectory and include both aircraft dynamics and observation variables.
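Such a derivation yields equations in the generic continuous-time linear state-space form shown below (standard notation for orientation; the report additionally evaluates the observation variables along the general trajectory):

```latex
\dot{\mathbf{x}} = A\,\mathbf{x} + B\,\mathbf{u}, \qquad
\mathbf{y} = C\,\mathbf{x} + D\,\mathbf{u},
```

where x is the state perturbation, u the control perturbation, and y the vector of observation variables.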
Population modeling and its role in toxicological studies
Sauer, John R.; Pendleton, Grey W.; Hoffman, David J.; Rattner, Barnett A.; Burton, G. Allen; Cairns, John
1995-01-01
A model could be defined as any abstraction from reality that is used to provide some insight into the real system. In this discussion, we will use a more specific definition that a model is a set of rules or assumptions, expressed as mathematical equations, that describe how animals survive and reproduce, including the external factors that affect these characteristics. A model simplifies a system, retaining essential components while eliminating parts that are not of interest. Ecology has a rich history of using models to gain insight into populations, often borrowing both model structures and analysis methods from demographers and engineers. Much of the development of the models has been a consequence of mathematicians and physicists seeing simple analogies between their models and patterns in natural systems. Consequently, one major application of ecological modeling has been to emphasize the analysis of dynamics of often complex models to provide insight into theoretical aspects of ecology.
Voronoi cell patterns: Theoretical model and applications
NASA Astrophysics Data System (ADS)
González, Diego Luis; Einstein, T. L.
2011-11-01
We use a simple fragmentation model to describe the statistical behavior of the Voronoi cell patterns generated by a homogeneous and isotropic set of points in 1D and in 2D. In particular, we are interested in the distribution of sizes of these Voronoi cells. Our model is completely defined by two probability distributions in 1D and again in 2D, the probability to add a new point inside an existing cell and the probability that this new point is at a particular position relative to the preexisting point inside this cell. In 1D the first distribution depends on a single parameter while the second distribution is defined through a fragmentation kernel; in 2D both distributions depend on a single parameter. The fragmentation kernel and the control parameters are closely related to the physical properties of the specific system under study. We use our model to describe the Voronoi cell patterns of several systems. Specifically, we study the island nucleation with irreversible attachment, the 1D car-parking problem, the formation of second-level administrative divisions, and the pattern formed by the Paris Métro stations.
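As a point of comparison for such a model, the empirical distribution of 2D Voronoi cell sizes for a homogeneous random point set can be generated numerically; the sketch below does only that and does not implement the authors' fragmentation model.

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

# Empirical distribution of 2D Voronoi cell areas for a homogeneous point set.
rng = np.random.default_rng(0)
points = rng.random((5000, 2))
vor = Voronoi(points)

areas = []
for region_index in vor.point_region:
    region = vor.regions[region_index]
    if len(region) == 0 or -1 in region:
        continue                               # skip unbounded cells near the boundary
    polygon = vor.vertices[region]
    areas.append(ConvexHull(polygon).volume)   # for 2D input, .volume is the polygon area

areas = np.asarray(areas)
sizes = areas / areas.mean()                   # normalized cell sizes
print(sizes.mean(), sizes.std())
```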
Neurobiological constraints on behavioral models of motivation.
Nader, K; Bechara, A; van der Kooy, D
1997-01-01
The application of neurobiological tools to behavioral questions has produced a number of working models of the mechanisms mediating the rewarding and aversive properties of stimuli. The authors review and compare three models that differ in the nature and number of the processes identified. The dopamine hypothesis, a single system model, posits that the neurotransmitter dopamine plays a fundamental role in mediating the rewarding properties of all classes of stimuli. In contrast, both nondeprived/deprived and saliency attribution models claim that separate systems make independent contributions to reward. The former identifies the psychological boundary defined by the two systems as being between states of nondeprivation (e.g. food sated) and deprivation (e.g. hunger). The latter identifies a boundary between liking and wanting systems. Neurobiological dissociations provide tests of and explanatory power for behavioral theories of goal-directed behavior.
Solid state SPS microwave generation and transmission study. Volume 2, phase 2: Appendices
NASA Technical Reports Server (NTRS)
Maynard, O. E.
1980-01-01
The solid state sandwich concept for SPS was further defined. The design effort concentrated on the spacetenna, but did include some system analysis for parametric comparison reasons. Basic solid state microwave devices were defined and modeled. An initial conceptual subsystems and system design was performed as well as sidelobe control and system selection. The selected system concept and parametric solid state microwave power transmission system data were assessed relevant to the SPS concept. Although device efficiency was not a goal, the sensitivities of this efficiency to design were parametrically treated. Sidelobe control consisted of various single step tapers, multistep tapers and Gaussian tapers. A hybrid concept using tubes and solid state was evaluated. Thermal analyses are included with emphasis on sensitivities to waste heat radiator form factor, emissivity, absorptivity, amplifier efficiency, material and junction temperature.
Wu, Zujian; Pang, Wei; Coghill, George M
2015-01-01
Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework is able to learn the relationships between biochemical reactants qualitatively and to make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can further perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
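A minimal sketch of the quantitative step, simulated annealing over a kinetic rate so that a simple model reproduces target behaviour, is given below; the one-parameter decay model, the target data and the cooling schedule are invented for illustration and are not the authors' framework.

```python
import math
import random

random.seed(1)

# Toy model: exponential decay x(t) = exp(-k t); anneal the rate k so that
# simulated values match a synthetic "target" time course (invented data ~ exp(-0.5 t)).
times = [0.0, 1.0, 2.0, 3.0, 4.0]
target = [1.0, 0.61, 0.37, 0.22, 0.14]

def cost(k):
    return sum((math.exp(-k * t) - y) ** 2 for t, y in zip(times, target))

k, best_k = 2.0, 2.0
temperature = 1.0
for step in range(5000):
    candidate = abs(k + random.gauss(0.0, 0.1))          # propose a nearby rate
    delta = cost(candidate) - cost(k)
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        k = candidate                                     # accept better or, occasionally, worse moves
    if cost(k) < cost(best_k):
        best_k = k
    temperature *= 0.999                                  # cooling schedule

print(best_k)                                             # should approach ~0.5
```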
Exploration Medical Capability System Engineering Overview
NASA Technical Reports Server (NTRS)
Mindock, J.; McGuire, K.
2018-01-01
Deep Space Gateway and Transport missions will change the way NASA currently practices medicine. The missions will require more autonomous capability compared to current low Earth orbit operations. For the medical system, lack of consumable resupply, evacuation opportunities, and real-time ground support are key drivers toward greater autonomy. Recognition of the limited mission and vehicle resources available to carry out exploration missions motivates the Exploration Medical Capability (ExMC) Element's approach to enabling the necessary autonomy. The ExMC Systems Engineering team's mission is to "Define, develop, validate, and manage the technical system design needed to implement exploration medical capabilities for Mars and test the design in a progression of proving grounds." The Element's work must integrate with the overall exploration mission and vehicle design efforts to successfully provide exploration medical capabilities. ExMC is using Model-Based System Engineering (MBSE) to accomplish its integrative goals. The MBSE approach to medical system design offers a paradigm shift toward greater integration between vehicle and the medical system, and directly supports the transition of Earth-reliant ISS operations to the Earth-independent operations envisioned for Mars. This talk will discuss how ExMC is using MBSE to define operational needs, decompose requirements and architecture, and identify medical capabilities needed to support human exploration. How MBSE is being used to integrate across disciplines and NASA Centers will also be described. The medical system being discussed in this talk is one system within larger habitat systems. Data generated within the medical system will be inputs to other systems and vice versa. This talk will also describe the next steps in model development that include: modeling the different systems that comprise the larger system and interact with the medical system, understanding how the various systems work together, and developing tools to support trade studies.
Exploration Medical Capability System Engineering Overview
NASA Technical Reports Server (NTRS)
McGuire, K.; Mindock, J.
2018-01-01
Deep Space Gateway and Transport missions will change the way NASA currently practices medicine. The missions will require more autonomous capability compared to current low Earth orbit operations. For the medical system, lack of consumable resupply, evacuation opportunities, and real-time ground support are key drivers toward greater autonomy. Recognition of the limited mission and vehicle resources available to carry out exploration missions motivates the Exploration Medical Capability (ExMC) Element's approach to enabling the necessary autonomy. The ExMC Systems Engineering team's mission is to "Define, develop, validate, and manage the technical system design needed to implement exploration medical capabilities for Mars and test the design in a progression of proving grounds." The Element's work must integrate with the overall exploration mission and vehicle design efforts to successfully provide exploration medical capabilities. ExMC is using Model-Based System Engineering (MBSE) to accomplish its integrative goals. The MBSE approach to medical system design offers a paradigm shift toward greater integration between vehicle and the medical system, and directly supports the transition of Earth-reliant ISS operations to the Earth-independent operations envisioned for Mars. This talk will discuss how ExMC is using MBSE to define operational needs, decompose requirements and architecture, and identify medical capabilities needed to support human exploration. How MBSE is being used to integrate across disciplines and NASA Centers will also be described. The medical system being discussed in this talk is one system within larger habitat systems. Data generated within the medical system will be inputs to other systems and vice versa. This talk will also describe the next steps in model development that include: modeling the different systems that comprise the larger system and interact with the medical system, understanding how the various systems work together, and developing tools to support trade studies.
NASA Technical Reports Server (NTRS)
Sorensen, J. A.; Goka, T.; Phatak, A. V.; Schmidt, S. F.
1980-01-01
A detailed system model of a VTOL aircraft approaching a small aviation facility ship was developed and used to investigate several approach guidance concepts. A preliminary analysis of the aircraft-vessel landing guidance requirements was conducted. The various subelements and constraints of the flight system are described including the landing scenario, lift fan aircraft, state rate feedback flight control, MLS-based navigation, sea state induced ship motion, and wake turbulence due to wind-over-deck effects. These elements are integrated into a systems model with various guidance concepts. Guidance is described in terms of lateral, vertical, and longitudinal axes steering modes and approach and landing phases divided by a nominal hover (or stationkeeping) point defined with respect to the landing pad. The approach guidance methods are evaluated, and the two better steering concepts are studied by both single pass and Monte Carlo statistical simulation runs. Four different guidance concepts are defined for further analysis for the landing phase of flight.
A low-cost three-dimensional laser surface scanning approach for defining body segment parameters.
Pandis, Petros; Bull, Anthony Mj
2017-11-01
Body segment parameters are used in many different applications in ergonomics as well as in dynamic modelling of the musculoskeletal system. Body segment parameters can be defined using different methods, including techniques that involve time-consuming manual measurements of the human body, used in conjunction with models or equations. In this study, a scanning technique for measuring subject-specific body segment parameters in an easy, fast, accurate and low-cost way was developed and validated. The scanner can obtain the body segment parameters in a single scanning operation, which takes between 8 and 10 s. The results obtained with the system show a standard deviation of 2.5% in volumetric measurements of the upper limb of a mannequin and 3.1% difference between scanning volume and actual volume. Finally, the maximum mean error for the moment of inertia by scanning a standard-sized homogeneous object was 2.2%. This study shows that a low-cost system can provide quick and accurate subject-specific body segment parameter estimates.
NASA Technical Reports Server (NTRS)
Motyka, P.
1983-01-01
A methodology is developed and applied for quantitatively analyzing the reliability of a dual, fail-operational redundant strapdown inertial measurement unit (RSDIMU). A Markov evaluation model is defined in terms of the operational states of the RSDIMU to predict system reliability. A 27 state model is defined based upon a candidate redundancy management system which can detect and isolate a spectrum of failure magnitudes. The results of parametric studies are presented which show the effect on reliability of the gyro failure rate, both the gyro and accelerometer failure rates together, false alarms, probability of failure detection, probability of failure isolation, and probability of damage effects and mission time. A technique is developed and evaluated for generating dynamic thresholds for detecting and isolating failures of the dual, separated IMU. Special emphasis is given to the detection of multiple, nonconcurrent failures. Digital simulation time histories are presented which show the thresholds obtained and their effectiveness in detecting and isolating sensor failures.
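A much-reduced illustration of the Markov evaluation approach is sketched below: a three-state model (fully operational, fail-operational, system failed) whose state probabilities are propagated over mission time. The transition rates and isolation probability are invented, and the report's actual model has 27 states tied to the candidate redundancy management system.

```python
import numpy as np

# Three-state Markov reliability sketch: 0 = fully operational,
# 1 = one failure detected/isolated (fail-operational), 2 = system failed.
lam = 1e-4        # per-hour sensor failure rate (invented value)
p_isolate = 0.95  # probability a failure is correctly detected and isolated (invented value)
dt = 1.0          # time step, hours

# Per-step transition probability matrix P[i][j] = P(state i -> state j); rows sum to 1.
P = np.array([
    [1 - lam * dt, lam * dt * p_isolate, lam * dt * (1 - p_isolate)],
    [0.0,          1 - lam * dt,         lam * dt],
    [0.0,          0.0,                  1.0],
])

state = np.array([1.0, 0.0, 0.0])     # start fully operational
for _ in range(int(1000 / dt)):       # 1000-hour mission (invented duration)
    state = state @ P

print("P(system failed) after mission:", state[2])
```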
Development of a preprototype trace contaminant control system. [for space stations
NASA Technical Reports Server (NTRS)
1977-01-01
The steady state contaminant load model, based on shuttle equipment and material test programs and on the current space station studies, was revised. An emergency upset contaminant load model based on anticipated emergency upsets that could occur in an operational space station was defined. Control methods for the contaminants generated by the emergency upsets were established by test. Preliminary designs of both steady state and emergency contaminant control systems for the space station application are presented.
Pool, D.R.; Blasch, Kyle W.; Callegary, James B.; Leake, Stanley A.; Graser, Leslie F.
2011-01-01
A numerical flow model (MODFLOW) of the groundwater flow system in the primary aquifers in northern Arizona was developed to simulate interactions between the aquifers, perennial streams, and springs for predevelopment and transient conditions during 1910 through 2005. Simulated aquifers include the Redwall-Muav, Coconino, and basin-fill aquifers. Perennial stream reaches and springs that derive base flow from the aquifers were simulated, including the Colorado River, Little Colorado River, Salt River, Verde River, and perennial reaches of tributary streams. Simulated major springs include Blue Spring, Del Rio Springs, Havasu Springs, Verde River headwater springs, several springs that discharge adjacent to major Verde River tributaries, and many springs that discharge to the Colorado River. Estimates of aquifer hydraulic properties and groundwater budgets were developed from published reports and groundwater-flow models. Spatial extents of aquifers and confining units were developed from geologic data, geophysical models, a groundwater-flow model for the Prescott Active Management Area, drill logs, geologic logs, and geophysical logs. Spatial and temporal distributions of natural recharge were developed by using a water-balance model that estimates recharge from direct infiltration. Additional natural recharge from ephemeral channel infiltration was simulated in alluvial basins. Recharge at wastewater treatment facilities and incidental recharge at agricultural fields and golf courses were also simulated. Estimates of predevelopment rates of groundwater discharge to streams, springs, and evapotranspiration by phreatophytes were derived from previous reports and on the basis of streamflow records at gages. Annual estimates of groundwater withdrawals for agriculture, municipal, industrial, and domestic uses were developed from several sources, including reported withdrawals for nonexempt wells, estimated crop requirements for agricultural wells, and estimated per capita water use for exempt wells. Accuracy of the simulated groundwater-flow system was evaluated by using observational control from water levels in wells, estimates of base flow from streamflow records, and estimates of spring discharge. Major results from the simulations include the importance of variations in recharge rates throughout the study area and recharge along ephemeral and losing stream reaches in alluvial basins. Insights about the groundwater-flow systems in individual basins include the hydrologic influence of geologic structures in some areas and that stream-aquifer interactions along the lower part of the Little Colorado River are an effective control on water level distributions throughout the Little Colorado River Plateau basin. Better information on several aspects of the groundwater flow system is needed to reduce uncertainty of the simulated system. Many areas lack documentation of the response of the groundwater system to changes in withdrawals and recharge. Data needed to define groundwater flow between vertically adjacent water-bearing units is lacking in many areas. Distributions of recharge along losing stream reaches are poorly defined. Extents of aquifers and alluvial lithologies are poorly defined in parts of the Big Chino and Verde Valley sub-basins. Aquifer storage properties are poorly defined throughout most of the study area. Little data exist to define the hydrologic importance of geologic structures such as faults and fractures.
Discharge of regional groundwater flow to the Verde River is difficult to identify in the Verde Valley sub-basin because of unknown contributions from deep percolation of excess surface water irrigation.
Telerobotic control of a mobile coordinated robotic server. M.S. Thesis Annual Technical Report
NASA Technical Reports Server (NTRS)
Lee, Gordon
1993-01-01
The annual report on telerobotic control of a mobile coordinated robotic server is presented. The goal of this effort is to develop advanced control methods for flexible space manipulator systems. To this end, an adaptive fuzzy logic controller was developed in which neither a model structure nor parameter constraints are required for compensation. The work builds on previous research on fuzzy logic controllers. Fuzzy logic controllers have been growing in importance in the field of automatic feedback control. Hardware controllers using fuzzy logic have become available as an alternative to traditional PID controllers. Software has also been introduced to aid in the development of fuzzy logic rule-bases. The advantages of using fuzzy logic controllers include the ability to encode the experience and intuition of expert operators in the rule-base and the fact that a model of the system is not required to construct the controller. A drawback of the classical fuzzy logic controller, however, is the many parameters that must be tuned off-line before the controller is applied in the closed loop. In this report, an adaptive fuzzy logic controller is developed requiring no system model or model structure. The rule-base is defined to approximate a state-feedback controller while a second fuzzy logic algorithm varies, on-line, the parameters that define the controller. Results indicate the approach is viable for on-line adaptive control of systems when the model is too complex or uncertain for application of other, more classical control techniques.
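The structure described above, a fuzzy rule-base approximating state feedback whose parameters are adapted on-line, can be illustrated with a minimal sketch. The code below is a hypothetical illustration, not the report's implementation: it uses triangular membership functions over tracking error and error rate, rule consequents that approximate u = -k1*e - k2*de, and a crude on-line gain adaptation; all names, gains and the toy plant are assumptions.

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b and support [a, c]."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

class AdaptiveFuzzyController:
    """Sketch: rule consequents approximate state feedback u = -k1*e - k2*de,
    with the gains adapted on-line by a simple heuristic."""

    def __init__(self, k1=4.0, k2=1.5, adapt_rate=0.05):
        self.k1, self.k2, self.adapt_rate = k1, k2, adapt_rate
        self.centers = [-1.0, 0.0, 1.0]   # linguistic terms: negative, zero, positive

    def _memberships(self, x):
        a = [tri(x, -2.0, -1.0, 0.0), tri(x, -1.0, 0.0, 1.0), tri(x, 0.0, 1.0, 2.0)]
        s = sum(a) or 1.0
        return [v / s for v in a]

    def control(self, e, de):
        e = max(-1.9, min(1.9, e))        # keep inputs inside the membership support
        de = max(-1.9, min(1.9, de))
        mu_e, mu_de = self._memberships(e), self._memberships(de)
        num, den = 0.0, 0.0
        for i, ce in enumerate(self.centers):
            for j, cde in enumerate(self.centers):
                w = mu_e[i] * mu_de[j]
                num += w * (-self.k1 * ce - self.k2 * cde)   # rule consequent
                den += w
        return num / (den or 1.0)

    def adapt(self, e, de):
        # Second, adaptive layer (reduced here to a heuristic): raise the gains
        # whenever the error is growing rather than decaying.
        if e * de > 0.0:
            self.k1 += self.adapt_rate * abs(e)
            self.k2 += self.adapt_rate * abs(de)

# Usage on an assumed second-order toy plant (not the space manipulator):
ctrl = AdaptiveFuzzyController()
x, v, dt = 1.0, 0.0, 0.01
for _ in range(2000):
    u = ctrl.control(x, v)
    ctrl.adapt(x, v)
    a = u - 0.5 * v                      # unit mass, light damping
    v += a * dt
    x += v * dt
print("final error:", round(x, 4))
```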
Approaching Resistance to Targeted Inhibition of PI3K in Breast Cancer
2011-10-01
promise, there are concerns that drug resistance may emerge within the cancerous cells, thus limiting clinical efficacy. Using genetically defined human...mechanism of such resistance. Using genetically defined human mammary epithelial cells (HMECs), a model system which has previously been used for PI3K...pathway driven transformation due to its dependence on oncogenic PI3K signaling, we screened for emergence of BEZ235-resistance and identified genetic
On Roles of Models in Information Systems
NASA Astrophysics Data System (ADS)
Sølvberg, Arne
The increasing penetration of computers into all aspects of human activity makes it desirable that the interplay among software, data and the domains where computers are applied is made more transparent. An approach to this end is to explicitly relate the modeling concepts of the domains, e.g., natural science, technology and business, to the modeling concepts of software and data. This may make it simpler to build comprehensible integrated models of the interactions between computers and non-computers, e.g., interaction among computers, people, physical processes, biological processes, and administrative processes. This chapter contains an analysis of various facets of the modeling environment for information systems engineering. The lack of satisfactory conceptual modeling tools seems to be central to the unsatisfactory state-of-the-art in establishing information systems. The chapter contains a proposal for defining a concept of information that is relevant to information systems engineering.
Report of the Power Subsystems Panel [spacecraft instrumentation technology]
NASA Technical Reports Server (NTRS)
1979-01-01
Problems in spacecraft power system design, testing, integration, and operation are identified and solutions are defined. The specific technology development problems discussed include substorm and plasma design data, modeling of the power subsystem and components, power system monitoring and degraded system management, rotary joints for transmission of power and signals, nickel cadmium battery manufacturing and application, on-array power management, high voltage technology, and solar arrays.
Heat pipe solar receiver with thermal energy storage
NASA Technical Reports Server (NTRS)
Zimmerman, W. F.
1981-01-01
An HPSR Stirling engine generator system featuring latent heat thermal energy storage, excellent thermal stability, self-regulation, and effective thermal transport at low system delta T is described. The system was supported by component technology testing of heat pipes and by thermal storage and energy transport models which define the expected performance of the system. Preliminary and detailed design efforts were completed, and manufacturing of HPSR components has begun.
Vertical integration models to prepare health systems for capitation.
Cave, D G
1995-01-01
Health systems will profit most under capitation if their vertical integration strategy provides operational stability, a strong primary care physician base, efficient delivery of medical services, and geographic access to physicians. Staff- and equity-based systems best meet these characteristics for success because they have one governance structure and a defined mission statement. Moreover, physician bonds are strong because these systems maximize physicians' income potential and control the revenue stream.
Correcting the initialization of models with fractional derivatives via history-dependent conditions
NASA Astrophysics Data System (ADS)
Du, Maolin; Wang, Zaihua
2016-04-01
Fractional differential equations are more and more used in modeling memory (history-dependent, non-local, or hereditary) phenomena. Conventional initial values of fractional differential equations are defined at a point, while recent works define initial conditions over histories. Using a simple counter-example, we prove that the conventional initialization of fractional differential equations with a Riemann-Liouville derivative is wrong. For a typical fractional differential equation, the initial values were assumed to be arbitrarily given, but we find that one of these values can only be zero. We show that fractional differential equations are infinite dimensional, and that the initial conditions, the initial histories, are defined as functions over intervals. We obtain the equivalent integral equation for the Caputo case. With a simple fractional model of materials, we illustrate that the recovery behavior is correct with the initial creep history, but is wrong with initial values at the starting point of the recovery. We demonstrate the application of initial history by solving a forced fractional Lorenz system numerically.
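For readers unfamiliar with the two derivative definitions at issue, the standard forms for order 0 < α < 1, and the Volterra integral equation equivalent to the Caputo initial-value problem, are the following; this is textbook material quoted for context, not a restatement of the paper's derivation:

\[
{}^{RL}_{\;0}D_t^{\alpha} x(t) = \frac{1}{\Gamma(1-\alpha)}\,\frac{d}{dt}\int_0^t \frac{x(s)}{(t-s)^{\alpha}}\, ds,
\qquad
{}^{C}_{\,0}D_t^{\alpha} x(t) = \frac{1}{\Gamma(1-\alpha)} \int_0^t \frac{x'(s)}{(t-s)^{\alpha}}\, ds .
\]

For the Caputo problem \({}^{C}_{\,0}D_t^{\alpha} x(t) = f(t, x(t))\) with \(x(0)=x_0\), the equivalent integral equation is

\[
x(t) = x_0 + \frac{1}{\Gamma(\alpha)} \int_0^t (t-s)^{\alpha-1} f(s, x(s))\, ds .
\]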
Cloud Computing in the Curricula of Schools of Computer Science and Information Systems
ERIC Educational Resources Information Center
Lawler, James P.
2011-01-01
The cloud continues to be a developing area of information systems. Evangelistic literature in the practitioner field indicates benefit for business firms but disruption for technology departments of the firms. Though the cloud currently is immature in methodology, this study defines a model program by which computer science and information…
In-Flight Pitot-Static Calibration
NASA Technical Reports Server (NTRS)
Foster, John V. (Inventor); Cunningham, Kevin (Inventor)
2016-01-01
A GPS-based pitot-static calibration system uses global output-error optimization. High data rate measurements of static and total pressure, ambient air conditions, and GPS-based ground speed measurements are used to compute pitot-static pressure errors over a range of airspeed. System identification methods rapidly compute optimal pressure error models with defined confidence intervals.
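A minimal sketch of the kind of global output-error optimization described above: the pitot-static error is modeled as a low-order polynomial in indicated airspeed and estimated, together with an assumed constant wind, by matching GPS ground-speed measurements collected on several headings. Variable names, the synthetic data, the polynomial form and the constant-wind assumption are illustrative, not the patented method.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic flight data (assumed): indicated airspeed (m/s), true heading (rad),
# and GPS ground-speed components (m/s) for legs flown on several headings.
v_ind = np.array([50, 60, 70, 80, 60, 70, 80, 50], float)
psi   = np.radians([0, 0, 90, 90, 180, 180, 270, 270])
gs_n  = np.array([53.1, 63.0, 2.9, 3.0, -57.0, -67.1, 3.1, -47.0])
gs_e  = np.array([5.0, 5.1, 75.9, 86.0, 4.9, 5.0, -73.9, -44.9])

def residuals(theta):
    a0, a1, wn, we = theta                 # error-model coefficients and wind (assumed)
    v_true = v_ind + a0 + a1 * v_ind       # calibrated true airspeed
    pred_n = v_true * np.cos(psi) + wn
    pred_e = v_true * np.sin(psi) + we
    return np.concatenate([pred_n - gs_n, pred_e - gs_e])

fit = least_squares(residuals, x0=[0.0, 0.0, 0.0, 0.0])
a0, a1, wn, we = fit.x
# fit.jac at the solution can be used to approximate parameter covariance,
# and hence confidence intervals on the pressure-error model.
print(f"airspeed error model: dV = {a0:.2f} + {a1:.3f} * V_ind")
print(f"estimated wind: N {wn:.1f} m/s, E {we:.1f} m/s")
```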
Vlasov-Maxwell and Vlasov-Poisson equations as models of a one-dimensional electron plasma
NASA Technical Reports Server (NTRS)
Klimas, A. J.; Cooper, J.
1983-01-01
The Vlasov-Maxwell and Vlasov-Poisson systems of equations for a one-dimensional electron plasma are defined and discussed. A method for transforming a solution of one system which is periodic over a bounded or unbounded spatial interval to a similar solution of the other is constructed.
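As a point of reference (standard form, not quoted from the paper), the one-dimensional electron Vlasov-Poisson system for the electron distribution f(x, v, t) with a neutralizing ion background n_0 can be written

\[
\frac{\partial f}{\partial t} + v\,\frac{\partial f}{\partial x} - \frac{e}{m_e}\,E(x,t)\,\frac{\partial f}{\partial v} = 0,
\qquad
\frac{\partial E}{\partial x} = \frac{e}{\varepsilon_0}\left(n_0 - \int_{-\infty}^{\infty} f(x,v,t)\,dv\right),
\]

with the Vlasov-Maxwell (electrostatic, one-dimensional) variant replacing the Poisson equation by Ampère's law, \(\partial E/\partial t = -J/\varepsilon_0\), for the same self-consistent field.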
Establishing an Evidence-Based Adult Education System. NCSALL Occasional Paper.
ERIC Educational Resources Information Center
Comings, John P.; Beder, Hal; Bingman, Beth; Reder, Stephen; Smith, Cristine
To benefit from the support of public and private sector leaders and to ensure that all students receive effective services, the adult education system must identify program models that have empirical evidence to support claims of effectiveness. The U.S. Department of Education's Institute of Education Sciences defines evidence-based education as…
The Open Systems University and Organizational Intelligence.
ERIC Educational Resources Information Center
Counelis, James Steve
The open systems model of the university defines the function of institutional research to be a cybernetic one. The internal and external reality-testing function is a vital duty and a moral charge. Though policy makers and educational practitioners can carry on for a considerable length of time with organizational intelligence of low validity,…
NASA Technical Reports Server (NTRS)
Assanis, D. N.; Ekchian, J. E.; Frank, R. M.; Heywood, J. B.
1985-01-01
A computer simulation of the turbocharged turbocompounded direct-injection diesel engine system was developed in order to study the performance characteristics of the total system as major design parameters and materials are varied. Quasi-steady flow models of the compressor, turbines, manifolds, intercooler, and ducting are coupled with a multicylinder reciprocator diesel model, where each cylinder undergoes the same thermodynamic cycle. The master cylinder model describes the reciprocator intake, compression, combustion and exhaust processes in sufficient detail to define the mass and energy transfers in each subsystem of the total engine system. Appropriate thermal loading models relate the heat flow through critical system components to material properties and design details. From this information, the simulation predicts the performance gains, and assesses the system design trade-offs which would result from the introduction of selected heat transfer reduction materials in key system components, over a range of operating conditions.
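As one example of the quasi-steady component models referred to here (standard textbook relations, not taken from the report), the compressor exit temperature and the turbine specific work can be written in terms of pressure ratio and isentropic efficiency:

\[
T_{2} = T_{1}\left[1 + \frac{1}{\eta_c}\left(\left(\frac{p_2}{p_1}\right)^{(\gamma-1)/\gamma} - 1\right)\right],
\qquad
w_t = \eta_t\, c_p T_3 \left[1 - \left(\frac{p_4}{p_3}\right)^{(\gamma-1)/\gamma}\right].
\]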
An early warning system for marine storm hazard mitigation
NASA Astrophysics Data System (ADS)
Vousdoukas, M. I.; Almeida, L. P.; Pacheco, A.; Ferreira, O.
2012-04-01
This contribution presents efforts towards the development of an operational Early Warning System for storm hazard prediction and mitigation. The system consists of a nested model train comprising specially calibrated Wave Watch III, SWAN and XBeach models. The numerical simulations provide daily forecasts of the hydrodynamic conditions, morphological change and overtopping risk at the area of interest. The model predictions are processed by a 'translation' module which is based on site-specific Storm Impact Indicators (SIIs) (Ciavola et al., 2011, Storm impacts along European coastlines. Part 2: lessons learned from the MICORE project, Environmental Science & Policy, Vol 14), and warnings are issued when pre-defined threshold values are exceeded. For the present site the selected SIIs were (i) the maximum wave run-up height during the simulations; and (ii) the dune-foot horizontal retreat at the end of the simulations. Both SIIs and pre-defined thresholds were carefully selected on the grounds of existing experience and field data. Four risk levels were considered, each associated with an intervention approach recommended to the responsible coastal protection authority. Regular updating of the topography/bathymetry is critical for the performance of the storm impact forecasting, especially when there are significant morphological changes. The system can be extended to other critical problems, such as the implications of global warming and adaptive management strategies, while the approach presently followed, from model calibration to the early warning system for storm hazard mitigation, can be applied to other sites worldwide with minor adaptations.
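A minimal sketch of the 'translation' module described above: model outputs are mapped to one of four risk levels by comparing site-specific Storm Impact Indicators against pre-defined thresholds. The indicator names, threshold values and risk labels are placeholders, not the operational values used at the study site.

```python
# Hypothetical SII thresholds (metres); the real values are site specific.
THRESHOLDS = {
    "runup_height": [1.0, 1.5, 2.0],   # boundaries between the four risk levels
    "dune_retreat": [0.5, 1.5, 3.0],
}
RISK_LEVELS = ["low", "moderate", "high", "extreme"]

def classify(value, boundaries):
    """Return the risk-level index for a single indicator."""
    return sum(value >= b for b in boundaries)

def issue_warning(siis):
    """The overall warning is driven by the worst indicator."""
    level = max(classify(v, THRESHOLDS[k]) for k, v in siis.items())
    return RISK_LEVELS[level]

# Example forecast from the nested model train (values are illustrative):
forecast = {"runup_height": 1.7, "dune_retreat": 0.8}
print(issue_warning(forecast))   # -> "high"
```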
Generic Sensor Failure Modeling for Cooperative Systems
Jäger, Georg; Zug, Sebastian
2018-01-01
The advent of cooperative systems entails a dynamic composition of their components. As this contrasts with current, statically composed systems, new approaches for maintaining their safety are required. In that endeavor, we propose an integration step that evaluates the failure model of shared information in relation to an application’s fault tolerance and thereby promises maintainability of such a system’s safety. However, it also poses new requirements on failure models, which are not fulfilled by state-of-the-art approaches. Consequently, this work presents a mathematically defined generic failure model as well as a processing chain for automatically extracting such failure models from empirical data. By examining data from a Sharp GP2D12 distance sensor, we show that the generic failure model not only fulfills the predefined requirements, but also models failure characteristics appropriately when compared to traditional techniques. PMID:29558435
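A processing chain of the kind described above, extracting a failure model from empirical sensor data, might look roughly like the following sketch: errors of a distance sensor are binned by reference distance and summarized by a per-bin bias and spread, yielding a simple distance-dependent failure model. The synthetic data, bin edges and the Gaussian summary are assumptions for illustration; they are not the paper's mathematically defined model.

```python
import numpy as np

def extract_failure_model(reference, measured, bin_edges):
    """Bin sensor errors by reference distance and fit a per-bin
    summary (bias, standard deviation)."""
    errors = measured - reference
    model = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (reference >= lo) & (reference < hi)
        if mask.any():
            model.append({"range": (lo, hi),
                          "bias": float(errors[mask].mean()),
                          "sigma": float(errors[mask].std(ddof=1))})
    return model

# Synthetic data standing in for GP2D12 measurements (illustrative only).
rng = np.random.default_rng(0)
ref = rng.uniform(0.1, 0.8, 500)                            # true distances (m)
meas = ref + 0.01 + 0.02 * ref * rng.standard_normal(500)   # distance-dependent noise
for entry in extract_failure_model(ref, meas, np.linspace(0.1, 0.8, 8)):
    print(entry)
```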
Assessing the performance of regional landslide early warning models: the EDuMaP method
NASA Astrophysics Data System (ADS)
Calvello, M.; Piciullo, L.
2016-01-01
A schematic of the components of regional early warning systems for rainfall-induced landslides is herein proposed, based on a clear distinction between warning models and warning systems. According to this framework an early warning system comprises a warning model as well as a monitoring and warning strategy, a communication strategy and an emergency plan. The paper proposes the evaluation of regional landslide warning models by means of an original approach, called the "event, duration matrix, performance" (EDuMaP) method, comprising three successive steps: identification and analysis of the events, i.e., landslide events and warning events derived from available landslide and warning databases; definition and computation of a duration matrix, whose elements report the time associated with the occurrence of landslide events in relation to the occurrence of warning events, in their respective classes; evaluation of the early warning model performance by means of performance criteria and indicators applied to the duration matrix. During the first step the analyst identifies and classifies the landslide and warning events, according to their spatial and temporal characteristics, by means of a number of model parameters. In the second step, the analyst computes a time-based duration matrix with a number of rows and columns equal to the number of classes defined for the warning and landslide events, respectively. In the third step, the analyst computes a series of model performance indicators derived from a set of performance criteria, which need to be defined by considering, once again, the features of the warning model. The applicability, potentialities and limitations of the EDuMaP method are tested and discussed using real landslide and warning data from the municipal early warning system operating in Rio de Janeiro (Brazil).
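A schematic sketch of the second EDuMaP step: given landslide events and warning events already assigned to classes, a duration matrix is filled with the time during which each (warning class, landslide class) combination occurred. The event classes, time units and overlap rule used here are simplified assumptions; the method's actual parameters are defined by the analyst.

```python
import numpy as np

def duration_matrix(warning_events, landslide_events, n_warn, n_land):
    """warning_events / landslide_events: lists of (start_h, end_h, class_index).
    Returns an n_warn x n_land matrix of overlap durations in hours."""
    D = np.zeros((n_warn, n_land))
    for w_start, w_end, w_cls in warning_events:
        for l_start, l_end, l_cls in landslide_events:
            overlap = min(w_end, l_end) - max(w_start, l_start)
            if overlap > 0:
                D[w_cls, l_cls] += overlap
    return D

# Illustrative events: two warning levels, two landslide magnitude classes.
warnings_  = [(0, 12, 1), (24, 30, 0)]
landslides = [(6, 8, 1), (26, 27, 0)]
D = duration_matrix(warnings_, landslides, n_warn=2, n_land=2)
print(D)
# One of many possible performance indicators: fraction of landslide time
# covered by the highest warning class.
print("coverage by top warning class:", D[1].sum() / D.sum())
```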
Total Risk Integrated Methodology (TRIM) - TRIM.FaTE
TRIM.FaTE is a spatially explicit, compartmental mass balance model that describes the movement and transformation of pollutants over time, through a user-defined, bounded system that includes both biotic and abiotic compartments.
Ablative Thermal Protection Systems Fundamentals
NASA Technical Reports Server (NTRS)
Beck, Robin A. S.
2017-01-01
This is a presentation of the fundamentals of ablative TPS materials for a short course at TFAWS 2017. It gives an overall description of what an ablator is, the equations that define it, and how to model it.
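One of the defining equations alluded to in the presentation is the quasi-steady surface energy balance of an ablator; a commonly used schematic form (standard ablation-modelling practice, not quoted from the slides) is

\[
q_{conv} + \alpha_w\, q_{rad,in} \;=\; \varepsilon_w \sigma T_w^4 \;+\; q_{cond} \;+\; \dot m\,\Delta H_{abl},
\]

balancing convective and incoming radiative heating against surface re-radiation, conduction into the material, and the energy absorbed by ablative mass loss (char removal and pyrolysis-gas injection).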
The Analog (Computer) As a Physiology Adjunct.
ERIC Educational Resources Information Center
Stewart, Peter A.
1979-01-01
Defines and discusses the analog computer and its use in a physiology laboratory. Includes two examples: (1) The Respiratory Control Function and (2) CO2 Control in the Respiratory System. Presents diagrams and mathematical models. (MA)
Optimal Resource Allocation in Library Systems
ERIC Educational Resources Information Center
Rouse, William B.
1975-01-01
Queueing theory is used to model processes as either waiting or balking processes. The optimal allocation of resources to these processes is defined as that which maximizes the expected value of the decision-maker's utility function. (Author)
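A toy version of the kind of allocation the abstract describes: each library process is modeled as an M/M/1 queue, expected waiting time falls as more service capacity is assigned, and a budget of resource units is allocated greedily to maximize the decision-maker's expected utility. The utility function, arrival/service rates and greedy search are all illustrative assumptions, not the paper's formulation.

```python
# Toy allocation of service capacity across library processes modeled as M/M/1 queues.

def expected_wait(arrival_rate, service_rate):
    """Mean time in an M/M/1 system; infinite if the queue is unstable."""
    return float("inf") if service_rate <= arrival_rate else 1.0 / (service_rate - arrival_rate)

def utility(allocation, processes):
    # Assumed utility: negative total expected waiting time, weighted by importance.
    return -sum(w * expected_wait(lam, mu0 + extra)
                for (lam, mu0, w), extra in zip(processes, allocation))

def bump(alloc, i, step):
    trial = list(alloc)
    trial[i] += step
    return trial

def greedy_allocate(processes, budget, step=1.0):
    allocation = [0.0] * len(processes)
    for _ in range(int(budget / step)):
        best_i = max(range(len(processes)),
                     key=lambda i: utility(bump(allocation, i, step), processes))
        allocation[best_i] += step
    return allocation

# (arrival rate, base service rate, importance weight) for three processes.
processes = [(4.0, 5.0, 1.0), (2.0, 2.5, 2.0), (1.0, 1.2, 0.5)]
print(greedy_allocate(processes, budget=6))
```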
Investigation of bus transit schedule behavior modeling using advanced techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kalaputapu, R.; Demetsky, M.J.
This research focused on investigating the application of artificial neural networks (ANN) and the Box-Jenkins technique for developing and testing schedule behavior models using data obtained for a test route from Tidewater Regional Transit's AVL system. The three ANN architectures investigated were: Feedforward Network, Elman Network and Jordan Network. In addition, five different model structures were investigated. The time-series methodology was adopted for developing the schedule behavior models. Finally, the role of a schedule behavior model within the framework of an intelligent transit management system is defined and the potential utility of the schedule behavior model is discussed using an example application.
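Of the two techniques investigated, the Box-Jenkins approach is straightforward to illustrate: schedule deviations recorded by an AVL system form a time series to which an ARIMA model is fit and used for short-horizon forecasts. The synthetic data and the (p, d, q) order below are placeholders, not the study's fitted models.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic schedule-deviation series (minutes late at a timepoint), standing in
# for AVL observations; a real application would use the recorded data.
rng = np.random.default_rng(1)
deviation = np.cumsum(rng.normal(0.05, 0.5, 200)) + 2.0

model = ARIMA(deviation, order=(1, 1, 1)).fit()   # Box-Jenkins style ARIMA fit
forecast = model.forecast(steps=5)
print("next 5 predicted deviations (min):", np.round(forecast, 2))
```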
Territorial Developments Based on Graffiti: a Statistical Mechanics Approach
2011-10-28
defined on a lattice. We introduce a two-gang Hamiltonian model where agents have red or blue affiliation but are otherwise indistinguishable. In this...ramifications of our results. Keywords: Territorial Formation, Spin Systems, Phase Transitions 1. Introduction Lattice models have been extensively used in...inconsequential. In short, lattice models have proved extremely useful in the context of the physical, biological and even chemical sciences. In more
The Specific Features of design and process engineering in branch of industrial enterprise
NASA Astrophysics Data System (ADS)
Sosedko, V. V.; Yanishevskaya, A. G.
2017-06-01
The production output of an industrial enterprise is organized through well-established working mechanisms at each stage of the product life cycle, from initial design documentation to the finished product and, finally, its disposal. The topic of the article is a mathematical model of the design and process engineering system in a branch of an industrial enterprise, the statistical processing of the estimated results of implementing the developed mathematical model in the branch, and a demonstration of the advantages of applying it at this enterprise. During the creation of the model, the flows of information, orders, parts and modules among the branch's groups of divisions were classified. Proceeding from the analysis of the divisions' activities, data flows, parts and documents, a state graph of design and process engineering was constructed, the transitions were described, and coefficients were assigned. For each state of the constructed state graph the corresponding limiting state probabilities were defined, and Kolmogorov's equations were worked out. By integrating the system of Kolmogorov equations, the probability that the system, the specified divisions and production are active is defined as a function of time at each instant. On the basis of the developed mathematical model of a unified system of design, process engineering and manufacture, and of the state graph, the authors carried out statistical processing of the results of applying the mathematical model and demonstrated the advantage of its application at this enterprise. Studies of the probability of loading the services of the branch and of third-party contractors (based on the orders received by the branch within a month) were conducted. The developed mathematical model of the design, process engineering and manufacturing system can be applied to determine the probability that divisions and manufacturing are active as a function of time at each instant, which makes it possible to account for the workload across the branches of the enterprise.
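The Kolmogorov equations mentioned above take the standard form for a continuous-time state graph with transition intensities λ_ij from state i to state j; the limiting state probabilities follow by setting the derivatives to zero (generic form, the branch-specific coefficients being those assigned in the paper):

\[
\frac{dp_i(t)}{dt} \;=\; \sum_{j \ne i} \lambda_{ji}\, p_j(t) \;-\; p_i(t) \sum_{j \ne i} \lambda_{ij},
\qquad \sum_i p_i(t) = 1 ,
\]

with the limiting probabilities obtained from \(\sum_{j \ne i} \lambda_{ji} p_j = p_i \sum_{j \ne i} \lambda_{ij}\) together with the normalization condition.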
Determination of effective loss factors in reduced SEA models
NASA Astrophysics Data System (ADS)
Chimeno Manguán, M.; Fernández de las Heras, M. J.; Roibás Millán, E.; Simón Hidalgo, F.
2017-01-01
The definition of Statistical Energy Analysis (SEA) models for large complex structures is highly conditioned by the classification of the structure elements into a set of coupled subsystems and the subsequent determination of the loss factors representing both the internal damping and the coupling between subsystems. The accurate definition of the complete system can lead to excessively large models as the size and complexity increase. This fact can also raise practical issues for the experimental determination of the loss factors. This work presents a formulation of reduced SEA models for incomplete systems defined by a set of effective loss factors. This reduced SEA model provides a feasible number of subsystems for the application of the Power Injection Method (PIM). For highly complex structures, access to some components, for instance internal equipment or panels, can be restricted. In these cases the use of PIM to carry out an experimental SEA analysis is not possible. New methods are presented for such cases in combination with the reduced SEA models. These methods allow defining some of the model loss factors that could not be obtained through PIM. The methods are validated with a numerical analysis case and they are also applied to an actual spacecraft structure with accessibility restrictions: a solar wing in folded configuration.
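For context, the steady-state SEA power balance that underlies both the full and the reduced models can be written, for subsystem i with energy E_i at angular frequency ω (standard SEA notation, not the paper's reduced formulation):

\[
P_i \;=\; \omega\,\eta_i E_i \;+\; \omega \sum_{j \ne i}\bigl(\eta_{ij} E_i - \eta_{ji} E_j\bigr),
\]

where η_i are the internal (damping) loss factors and η_ij the coupling loss factors. In the Power Injection Method, power is injected into one subsystem at a time and the resulting energy matrix is inverted to recover the loss factors; the reduced models replace inaccessible subsystems by effective loss factors within the same balance.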
Topological magnetoelectric pump in three dimensions
NASA Astrophysics Data System (ADS)
Fukui, Takahiro; Fujiwara, Takanori
2017-11-01
We study the topological pump for a lattice fermion model mainly in three spatial dimensions. We first calculate the U(1) current density for the Dirac model defined in continuous space-time to review the known results as well as to introduce some technical details convenient for the calculations of the lattice model. We next investigate the U(1) current density for a lattice fermion model, a variant of the Wilson-Dirac model. The model we introduce is defined on a lattice in space but in continuous time, which is suited for the study of the topological pump. For such a model, we derive the conserved U(1) current density and calculate it directly for the (1+1)-dimensional system as well as the (3+1)-dimensional system in the limit of the small lattice constant. We find that the current includes a nontrivial lattice effect characterized by the Chern number, and therefore the pumped particle number is quantized by the topological reason. Finally, we study the topological temporal pump in 3+1 dimensions by numerical calculations. We discuss the relationship between the second Chern number and the first Chern number, the bulk-edge correspondence, and the generalized Streda formula which enables us to compute the second Chern number using the spectral asymmetry.
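The quantization referred to here is the standard relation between pumped charge and the first Chern number; for a (1+1)-dimensional pump with Bloch states parametrized by momentum k and time t over one period T (textbook form, not the paper's lattice-specific expressions):

\[
\Delta Q \;=\; \frac{1}{2\pi} \int_0^{T}\! dt \int_{BZ}\! dk\; F_{kt}(k,t) \;=\; C_1 \in \mathbb{Z},
\qquad
F_{kt} = \partial_k A_t - \partial_t A_k, \quad A_\mu = i\langle u | \partial_\mu u \rangle ,
\]

summed over occupied bands, with the (3+1)-dimensional magnetoelectric pump characterized analogously by the second Chern number built from the non-Abelian field strength.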