Formal Specification of Information Systems Requirements.
ERIC Educational Resources Information Center
Kampfner, Roberto R.
1985-01-01
Presents a formal model for specification of logical requirements of computer-based information systems that incorporates structural and dynamic aspects based on two separate models: the Logical Information Processing Structure and the Logical Information Processing Network. The model's role in systems development is discussed. (MBR)
A Conceptual Model of the Information Requirements of Nursing Organizations
Miller, Emmy
1989-01-01
Three related issues play a role in the identification of the information requirements of nursing organizations. These issues are the current state of computer systems in health care organizations, the lack of a well-defined data set for nursing, and the absence of models representing data and information relevant to clinical and administrative nursing practice. This paper will examine current methods of data collection, processing, and storage in clinical and administrative nursing practice for the purpose of identifying the information requirements of nursing organizations. To satisfy these information requirements, database technology can be used; however, a model for database design is needed that reflects the conceptual framework of nursing and the professional concerns of nurses. A conceptual model of the types of data necessary to produce the desired information will be presented and the relationships among data will be delineated.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-03
... who are subject to the information collection requirements may introduce up to 15 new models in a 3... will be required are to insert the specific information that pertains to the new model. Additionally... model to collect the information and mail it to the Commission. Therefore, an additional 2.5 hours have...
Requirements engineering for cross-sectional information chain models
Hübner, U; Cruel, E; Gök, M; Garthaus, M; Zimansky, M; Remmers, H; Rienhoff, O
2012-01-01
Despite the wealth of literature on requirements engineering, little is known about engineering very generic, innovative and emerging requirements, such as those for cross-sectional information chains. The IKM health project aims at building information chain reference models for the care of patients with chronic wounds, cancer-related pain and back pain. Our question therefore was how to appropriately capture information and process requirements that are both generally applicable and practically useful. To this end, we started with recommendations from clinical guidelines and put them up for discussion in Delphi surveys and expert interviews. Despite the heterogeneity we encountered in all three methods, it was possible to obtain requirements suitable for building reference models. We evaluated three modelling languages and then chose to write the models in UML (class and activity diagrams). On the basis of the current project results, the pros and cons of our approach are discussed. PMID:24199080
Requirements for data integration platforms in biomedical research networks: a reference model.
Ganzinger, Matthias; Knaup, Petra
2015-01-01
Biomedical research networks need to integrate research data among their members and with external partners. To support such data sharing activities, an adequate information technology infrastructure is necessary. To facilitate the establishment of such an infrastructure, we developed a reference model for the requirements. The reference model consists of five reference goals and 15 reference requirements. Using the Unified Modeling Language, the goals and requirements are set into relation to each other. In addition, all goals and requirements are described textually in tables. This reference model can be used by research networks as a basis for a resource efficient acquisition of their project specific requirements. Furthermore, a concrete instance of the reference model is described for a research network on liver cancer. The reference model is transferred into a requirements model of the specific network. Based on this concrete requirements model, a service-oriented information technology architecture is derived and also described in this paper.
NASA Technical Reports Server (NTRS)
Ricks, Wendell R.; Jonsson, Jon E.; Barry, John S.
1996-01-01
Adequately presenting all necessary information on an approach chart represents a challenge for cartographers. Since many tasks associated with using approach charts are cognitive (e.g., planning the approach and monitoring its progress), and since the characteristic of a successful interface is one that conforms to the users' mental models, understanding pilots' underlying models of approach chart information would greatly assist cartographers. To provide such information, a new methodology was developed for this study that enhances traditional information requirements analyses by combining psychometric scaling techniques with a simulation task to provide quantifiable links between pilots' cognitive representations of approach information and their use of approach information. Results of this study should augment previous information requirements analyses by identifying what information is acquired, when it is acquired, and what presentation concepts might facilitate its efficient use by better matching the pilots' cognitive model of the information. The primary finding in this study indicated that pilots mentally organize approach chart information into ten primary categories: communications, geography, validation, obstructions, navigation, missed approach, final items, other runways, visibility requirement, and navigation aids. These similarity categories were found to underlie the pilots' information acquisitions, other mental models, and higher level cognitive processes that are used to accomplish their approach and landing tasks.
NASA Astrophysics Data System (ADS)
Cuntz, Matthias; Mai, Juliane; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis
2015-08-01
Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.
NASA Astrophysics Data System (ADS)
Mai, Juliane; Cuntz, Matthias; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis
2016-04-01
Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.
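To make the evaluation-budget argument above concrete, here is a minimal, self-contained sketch of Morris-style elementary-effects screening, a standard technique in the same family as the sequential screening the abstract describes; the toy model, threshold, and trajectory count are illustrative assumptions, not the mHM implementation. With these settings the budget is n_traj * (1 + n_params) model runs, roughly the "10 times the number of model parameters" quoted above.

```python
# Minimal sketch of parameter screening via Morris-style elementary effects.
# Toy model and threshold are illustrative only; the paper's actual
# sequential screening method for mHM differs in detail.
import numpy as np

def toy_model(p):
    # Stand-in for an expensive model run: only p[0] and p[2] matter.
    return 3.0 * p[0] + 0.5 * p[2] ** 2

def elementary_effects(model, n_params, n_traj=10, delta=0.1, seed=0):
    rng = np.random.default_rng(seed)
    effects = np.zeros(n_params)
    for _ in range(n_traj):
        base = rng.uniform(0.0, 1.0 - delta, n_params)
        y0 = model(base)
        for i in range(n_params):
            perturbed = base.copy()
            perturbed[i] += delta
            effects[i] += abs(model(perturbed) - y0) / delta
    return effects / n_traj  # mean absolute elementary effect per parameter

mu_star = elementary_effects(toy_model, n_params=5)
informative = np.flatnonzero(mu_star > 0.1 * mu_star.max())
print("informative parameters:", informative)  # expect indices 0 and 2
```

Parameters whose mean effect falls below the threshold would be fixed at default values before the expensive Sobol analysis or calibration.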
Requirements for data integration platforms in biomedical research networks: a reference model
Knaup, Petra
2015-01-01
Biomedical research networks need to integrate research data among their members and with external partners. To support such data sharing activities, an adequate information technology infrastructure is necessary. To facilitate the establishment of such an infrastructure, we developed a reference model for the requirements. The reference model consists of five reference goals and 15 reference requirements. Using the Unified Modeling Language, the goals and requirements are set into relation to each other. In addition, all goals and requirements are described textually in tables. This reference model can be used by research networks as a basis for a resource efficient acquisition of their project specific requirements. Furthermore, a concrete instance of the reference model is described for a research network on liver cancer. The reference model is transferred into a requirements model of the specific network. Based on this concrete requirements model, a service-oriented information technology architecture is derived and also described in this paper. PMID:25699205
Requirements for clinical information modelling tools.
Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak
2015-07-01
This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two-round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool claiming to be suitable for clinical information modelling; should a certified list of tools ever be established, any tool that does not meet the essential criteria would be excluded. Recommended requirements are more advanced requirements that may be met by tools offering a superior product, or that are only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, the level of agreement was high enough for the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support from end users. The list could also guide regulators in identifying requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Digital Avionics Information System (DAIS): Training Requirements Analysis Model (TRAMOD).
ERIC Educational Resources Information Center
Czuchry, Andrew J.; And Others
The training requirements analysis model (TRAMOD) described in this report represents an important portion of the larger effort called the Digital Avionics Information System (DAIS) Life Cycle Cost (LCC) Study. TRAMOD is the second of three models that comprise an LCC impact modeling system for use in the early stages of system development. As…
Capturing business requirements for the Swedish national information structure.
Kajbjer, Karin; Johansson, Catharina
2009-01-01
As a subproject of the National Information Structure project of the National Board of Health and Welfare, four different stakeholder groups were used to capture business requirements: subjects of care, health professionals, managers/research, and industry. The process is described, including the formulation of goal models and of concept, process, and information models.
Models Extracted from Text for System-Software Safety Analyses
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2010-01-01
This presentation describes extraction and integration of requirements information and safety information in visualizations to support early review of completeness, correctness, and consistency of lengthy and diverse system safety analyses. Software tools have been developed and extended to perform the following tasks: 1) extract model parts and safety information from text in interface requirements documents, failure modes and effects analyses and hazard reports; 2) map and integrate the information to develop system architecture models and visualizations for safety analysts; and 3) provide model output to support virtual system integration testing. This presentation illustrates the methods and products with a rocket motor initiation case.
Data Requirements and the Basis for Designing Health Information Kiosks.
Afzali, Mina; Ahmadi, Maryam; Mahmoudvand, Zahra
2017-09-01
Health kiosks are an innovative and cost-effective solution that organizations can easily implement to help educate people. The aim was to determine the data requirements and the basis for designing health information kiosks as a new technology for maintaining the health of society. By reviewing the literature, a list of information requirements was compiled in 4 sections (demographic information, general information, diagnostic information and medical history), together with questions related to the objectives, data elements, stakeholders, requirements, infrastructures and applications of health information kiosks. In order to determine the content validity of the designed set, the opinions of 2 physicians and 2 specialists in medical informatics were obtained. The test-retest method was used to measure its reliability. Data were analyzed using SPSS software. In the proposed model for Iran, 170 data elements in 6 sections were presented for expert opinion, and collective agreement was ultimately reached on 106 of them. In providing a model of a health information kiosk, creating a standard data set is a critical point. According to a survey of the literature on health information kiosks, the most important components of a health information kiosk fall into six categories: information needs, data elements, applications, stakeholders, requirements and infrastructure, all of which need to be considered when designing a health information kiosk.
Directory of Energy Information Administration model abstracts 1988
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-01-01
This directory contains descriptions of each basic and auxiliary model, including the title, acronym, purpose, and type, followed by more detailed information on characteristics, uses, and requirements. For developing models, limited information is provided. Sources for additional information are identified. Included in this directory are 44 EIA models active as of February 1, 1988, 16 of which operate on personal computers. Models that run on personal computers are identified by "PC" as part of their acronyms. The main body of this directory is an alphabetical listing of all basic and auxiliary EIA models. Appendix A identifies major EIA modeling systems and the models within these systems, and Appendix B identifies EIA models by type (basic or auxiliary). Appendix C lists developing models and contact persons for those models. A basic model is one designated by the EIA Administrator as being sufficiently important to require sustained support and public scrutiny. An auxiliary model is one designated by the EIA Administrator as being used only occasionally in analyses and therefore requiring minimal levels of documentation. A developing model is one designated by the EIA Administrator as being under development and yet of sufficient interest to require a basic level of documentation at a future date. EIA also leases models developed by proprietary software vendors. Documentation for these "proprietary" models is the responsibility of the companies from which they are leased. EIA has recently leased models from Chase Econometrics, Inc., Data Resources, Inc. (DRI), the Oak Ridge National Laboratory (ORNL), and Wharton Econometric Forecasting Associates (WEFA). Leased models are not abstracted here.
Live, Model, Learn: Experiencing Information Systems Requirements through Simulation
ERIC Educational Resources Information Center
Hartzel, Kathleen S.; Pike, Jacqueline C.
2015-01-01
Information system professionals strive to determine requirements by interviewing clients, observing activities at the client's site, and studying existing system documentation. Still, this often leads to vague and inaccurate requirements documentation. When teaching the skills needed to determine requirements, it is important to recreate a…
Enriching step-based product information models to support product life-cycle activities
NASA Astrophysics Data System (ADS)
Sarigecili, Mehmet Ilteris
The representation and management of product information over its life-cycle require standardized data exchange protocols. The Standard for the Exchange of Product Model Data (STEP) is one such standard and has been widely used by industry. Even though STEP-based product models are well defined and syntactically correct, populating product data according to these models is not easy because they are too big and disorganized. Data exchange specifications (DEXs) and templates provide re-organized information models required in the data exchange of specific activities for various businesses. DEXs show that it is possible to organize STEP-based product models to support different engineering activities at various stages of the product life-cycle. In this study, STEP-based models are enriched and organized to support two engineering activities: materials information declaration and tolerance analysis. Due to new environmental regulations, the substance and materials information in products has to be screened closely by manufacturing industries. This requires a fast, unambiguous and complete product information exchange between the members of a supply chain. Tolerance analysis, on the other hand, is used to verify the functional requirements of an assembly considering the worst-case (i.e., maximum and minimum) conditions for the part/assembly dimensions. Another issue with STEP-based product models is that the semantics of product data are represented implicitly. Hence, it is difficult to interpret the semantics of data for different product life-cycle phases for various application domains. OntoSTEP, developed at NIST, provides semantically enriched product models in OWL. In this thesis, we present how to interpret the GD&T specifications in STEP for tolerance analysis by utilizing OntoSTEP.
NASA Technical Reports Server (NTRS)
Mayer, Richard
1988-01-01
The integrated development support environment (IDSE) is a suite of integrated software tools that provide intelligent support for information modelling. These tools assist in function, information, and process modeling. Additional tools exist to assist in gathering and analyzing information to be modeled. This is a user's guide to the application of the IDSE. Sections covering the requirements and design of each of the tools are presented. There are currently three integrated computer aided manufacturing definition (IDEF) modeling methodologies: IDEF0, IDEF1, and IDEF2. Four appendices describe hardware and software requirements, installation procedures, and basic hardware usage.
Modeling traceability information and functionality requirement in export-oriented tilapia chain.
Zhang, Xiaoshuan; Feng, Jianying; Xu, Mark; Hu, Jinyou
2011-05-01
Tilapia has been named the 'food fish of the 21st century' and has become the most important farmed fish. China is the world leader in tilapia production and export. Identifying information and functional requirements is critical in developing an efficient traceability system because traceability has become a fundamental prerequisite for exporting aquaculture products. This paper examines the export-oriented tilapia chains and information flow in the chains, and identifies the key actors, information requirements and information-capturing points. Unified Modeling Language (UML) technology is adopted to describe the information and functionality requirements for chain traceability. The barriers to traceability system adoption are also identified. The results show that the traceability data consist of four categories that must be recorded by each link in the chain. The functionality requirements are classified into four categories, from fundamental information recording to decisive quality control; the top three barriers to traceability system adoption are high costs of implementing the system; lack of experienced and professional staff; and low levels of government involvement and support. Copyright © 2011 Society of Chemical Industry.
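As a rough sketch of what a per-link traceability record of this kind might look like, the data structure below groups fields into four categories; the category and field names are illustrative assumptions, since the abstract does not enumerate them.

```python
# Sketch of a per-link traceability record for an export tilapia chain.
# The grouping into four data categories follows the abstract; individual
# category and field names are invented, not the paper's UML model.
from dataclasses import dataclass
from typing import Dict

@dataclass
class TraceabilityRecord:
    link: str                     # e.g. "hatchery", "farm", "processor"
    product_info: Dict[str, str]  # species, batch id, quantity
    process_info: Dict[str, str]  # feeding, treatment, processing steps
    quality_info: Dict[str, str]  # test results, certificates
    origin_info: Dict[str, str]   # upstream link and batch references

record = TraceabilityRecord(
    link="farm",
    product_info={"species": "Oreochromis niloticus", "batch": "F-2011-042"},
    process_info={"feed": "pellet-A", "harvest_date": "2011-03-15"},
    quality_info={"antibiotic_screen": "negative"},
    origin_info={"fingerling_batch": "H-2010-317"},
)
print(record.link, record.product_info["batch"])
```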
Information Models, Data Requirements, and Agile Data Curation
NASA Astrophysics Data System (ADS)
Hughes, John S.; Crichton, Dan; Ritschel, Bernd; Hardman, Sean; Joyner, Ron
2015-04-01
The Planetary Data System's next generation system, PDS4, is an example of the successful use of an ontology-based Information Model (IM) to drive the development and operations of a data system. In traditional systems engineering, requirements or statements about what is necessary for the system are collected and analyzed for input into the design stage of systems development. With the advent of big data, the requirements associated with data have begun to dominate, and an ontology-based information model can be used to provide a formalized and rigorous set of data requirements. These requirements address not only the usual issues of data quantity, quality, and disposition but also data representation, integrity, provenance, context, and semantics. In addition, the use of these data requirements during systems development has many characteristics of Agile Curation as proposed by Young et al. [Taking Another Look at the Data Management Life Cycle: Deconstruction, Agile, and Community, AGU 2014], namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. For example, customers can be satisfied through early and continuous delivery of system software and services that are configured directly from the information model. This presentation will describe the PDS4 architecture and its three principal parts: the ontology-based Information Model (IM), the federated registries and repositories, and the REST-based service layer for search, retrieval, and distribution. The development of the IM will be highlighted with special emphasis on knowledge acquisition, the impact of the IM on development and operations, and the use of shared ontologies at multiple governance levels to promote system interoperability and data correlation.
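A toy illustration of the model-driven idea above: a service (here, a validator) is configured directly from an information model rather than hand-coded. The class and attribute definitions are invented for illustration and are not actual PDS4 model content.

```python
# Toy illustration of driving validation directly from an information model.
# Model content here is invented, not actual PDS4 classes or attributes.
MODEL = {
    "Observation": {
        "target_name": {"type": str, "required": True},
        "exposure_s":  {"type": float, "required": True, "min": 0.0},
        "filter":      {"type": str, "required": False},
    }
}

def validate(class_name, record):
    """Check a record against the information model; return a list of errors."""
    errors = []
    for attr, spec in MODEL[class_name].items():
        if attr not in record:
            if spec["required"]:
                errors.append(f"missing required attribute: {attr}")
            continue
        value = record[attr]
        if not isinstance(value, spec["type"]):
            errors.append(f"{attr}: expected {spec['type'].__name__}")
        elif "min" in spec and value < spec["min"]:
            errors.append(f"{attr}: below minimum {spec['min']}")
    return errors

print(validate("Observation", {"target_name": "Mars", "exposure_s": 1.5}))  # []
print(validate("Observation", {"exposure_s": -1.0}))  # two errors
```

Updating the model automatically updates the behavior of every service configured from it, which is the "rapid and flexible response to change" the abstract highlights.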
NASA Technical Reports Server (NTRS)
Milroy, Audrey; Hale, Joe
2006-01-01
NASA's Exploration Systems Mission Directorate (ESMD) is implementing a management approach for modeling and simulation (M&S) that will provide decision-makers information on the model's fidelity, credibility, and quality, including the verification, validation and accreditation information. The NASA MSRR will be implemented leveraging M&S industry best practices. This presentation will discuss the requirements that will enable NASA to capture and make available the "meta data" or "simulation biography" data associated with a model. The presentation will also describe the requirements that drive how NASA will collect and document relevant information for models or suites of models in order to facilitate use and reuse of relevant models and provide visibility across NASA organizations and the larger M&S community.
An Object-Based Requirements Modeling Method.
ERIC Educational Resources Information Center
Cordes, David W.; Carver, Doris L.
1992-01-01
Discusses system modeling and specification as it relates to object-based information systems development and software development. An automated system model based on the objects in the initial requirements document is described, the requirements document translator is explained, and a sample application of the technique is provided. (12…
A Hybrid 3D Indoor Space Model
NASA Astrophysics Data System (ADS)
Jamali, Ali; Rahman, Alias Abdul; Boguslawski, Pawel
2016-10-01
GIS integrates spatial information and spatial analysis. An important example of such integration is emergency response, which requires route planning inside and outside of a building. Route planning requires detailed information on the indoor and outdoor environment. Indoor navigation network models, including the Geometric Network Model (GNM), the Navigable Space Model, the sub-division model and the regular-grid model, lack indoor data sources and abstraction methods. In this paper, a hybrid indoor space model is proposed. In the proposed method, 3D modeling of the indoor navigation network is based on surveying control points and is less dependent on the 3D geometrical building model. This research proposes a method of indoor space modeling for buildings which do not have proper 2D/3D geometrical models or which lack semantic or topological information. The proposed hybrid model consists of topological, geometrical and semantic spaces.
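A minimal sketch of the hybrid idea, combining geometry (surveyed 3D coordinates), topology (edges), and semantics (node kinds) in one navigable graph; the rooms, coordinates, and use of networkx are illustrative assumptions, not the paper's data model.

```python
# Minimal sketch of a hybrid indoor model: nodes carry geometry (surveyed
# 3D coordinates) and semantics (space type); edges supply the topology
# needed for route planning. All names and coordinates are invented.
import math
import networkx as nx

g = nx.Graph()
g.add_node("R101", xyz=(0.0, 0.0, 0.0), kind="office")
g.add_node("C1",   xyz=(5.0, 0.0, 0.0), kind="corridor")
g.add_node("S1",   xyz=(5.0, 8.0, 0.0), kind="stairwell")
g.add_node("R201", xyz=(5.0, 8.0, 3.5), kind="office")

def dist(a, b):
    # Euclidean edge length from the surveyed coordinates.
    return math.dist(g.nodes[a]["xyz"], g.nodes[b]["xyz"])

for u, v in [("R101", "C1"), ("C1", "S1"), ("S1", "R201")]:
    g.add_edge(u, v, length=dist(u, v))

route = nx.shortest_path(g, "R101", "R201", weight="length")
print(route)  # ['R101', 'C1', 'S1', 'R201']
```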
Directory of Energy Information Administration model abstracts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1987-08-11
This report contains brief statements from the model managers about each model's title, acronym, purpose, and status, followed by more detailed information on characteristics, uses, and requirements. Sources for additional information are identified. All models "active" through March 1987 are included. The main body of this directory is an alphabetical list of all active EIA models. Appendix A identifies major EIA modeling systems and the models within these systems, and Appendix B identifies active EIA models by type (basic, auxiliary, and developing). A basic model is one designated by the EIA Administrator as being sufficiently important to require sustained support and public scrutiny. An auxiliary model is one designated by the EIA Administrator as being used only occasionally in analyses, and therefore requires minimal levels of documentation. A developing model is one designated by the EIA Administrator as being under development and yet of sufficient interest to require a basic level of documentation at a future date. EIA also leases models developed by proprietary software vendors. Documentation for these "proprietary" models is the responsibility of the companies from which they are leased. EIA has recently leased models from Chase Econometrics, Inc., Data Resources, Inc. (DRI), the Oak Ridge National Laboratory (ORNL), and Wharton Econometric Forecasting Associates (WEFA). Leased models are not abstracted here. The directory is intended for the use of energy and energy-policy analysts in the public and private sectors.
3D Surveying, Modeling and Geo-Information System of the New Campus of ITB-Indonesia
NASA Astrophysics Data System (ADS)
Suwardhi, D.; Trisyanti, S. W.; Ainiyah, N.; Fajri, M. N.; Hanan, H.; Virtriana, R.; Edmarani, A. A.
2016-10-01
The new campus of ITB-Indonesia, located at Jatinangor, requires good facilities and infrastructure to support all campus activities, which cannot be separated from procurement and maintenance activities. Computer-based technology for the procurement and maintenance of facilities and infrastructure is known as Building Information Modeling (BIM). Nowadays, that technology is more affordable, with free software that is easy to use and tailored to user needs. BIM has some limitations and requires another technology to complement it, namely Geographic Information Systems (GIS). BIM and GIS require surveying data to visualize the landscape and buildings of the Jatinangor ITB campus. This paper presents the on-going internal service program conducted by researchers, academic staff and students for the university. The program includes 3D surveying to meet the data requirements for 3D modeling of buildings in CityGML and the Industry Foundation Classes (IFC) data model. The 3D surveying produces point clouds that can be used to make 3D models. The 3D modeling is divided into low and high levels of detail. The low-level model is stored in a 3D CityGML database, and the high-level model, including interiors, is stored in a BIM server. The 3D models can be used to visualize the buildings and site of the Jatinangor ITB campus. For facility management of the campus, a geo-information system is developed that can be used for planning, constructing, and maintaining Jatinangor ITB's facilities and infrastructure. The system uses openMAINT, an open source solution for property and facility management.
Use of fuzzy sets in modeling of GIS objects
NASA Astrophysics Data System (ADS)
Mironova, Yu N.
2018-05-01
The paper discusses modeling and methods of data visualization in geographic information systems. Information processing in geoinformatics is based on the use of models, so geoinformation modeling is a key link in the chain of geodata processing. Solving problems with geographic information systems often requires entering approximate or insufficiently reliable information about map features into the GIS database. Heterogeneous data of different origin and accuracy carry some degree of uncertainty. In addition, not all information is precise: already during the initial measurements, poorly defined terms and attributes (e.g., "soil, well-drained") are used. Methods are therefore needed for working with uncertain requirements, classes, and boundaries. The author proposes representing such spatial information with fuzzy sets. In terms of its characteristic function, a fuzzy set is a natural generalization of an ordinary set, obtained by rejecting the binary nature of the characteristic function and allowing it to take any value in the interval [0, 1].
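A minimal sketch of that generalization: instead of a 0/1 characteristic function, a fuzzy set assigns membership degrees in [0, 1]. Here a trapezoidal membership function grades an imprecise class like the abstract's "soil, well-drained"; the breakpoints are invented for illustration.

```python
# Sketch of a fuzzy membership function: an ordinary set has a 0/1
# characteristic function; a fuzzy set maps each element into [0, 1].
def trapezoid(x, a, b, c, d):
    """Membership rises on [a, b], is 1 on [b, c], and falls on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Degree to which a soil with a given drainage index belongs to the fuzzy
# class "well-drained" (breakpoints are illustrative assumptions).
print(trapezoid(6.2, a=4.0, b=6.0, c=8.0, d=9.0))  # 1.0 (full member)
print(trapezoid(5.0, a=4.0, b=6.0, c=8.0, d=9.0))  # 0.5 (partial member)
```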
Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong
2015-02-01
Integration of heterogeneous systems is the key to hospital information construction due to the complexity of the healthcare environment. Currently, during the process of healthcare information system integration, people participating in an integration project usually communicate through free-format documents, which impairs the efficiency and adaptability of integration. This paper proposes a method that utilizes Business Process Model and Notation (BPMN) to model integration requirements and automatically transform them into executable integration configurations. Based on the method, a tool was developed to model integration requirements and transform them into integration configurations. In addition, an integration case in a radiology scenario was used to verify the method.
Sizing the science data processing requirements for EOS
NASA Technical Reports Server (NTRS)
Wharton, Stephen W.; Chang, Hyo D.; Krupp, Brian; Lu, Yun-Chi
1991-01-01
The methodology used in the compilation and synthesis of baseline science requirements associated with the 30+ EOS (Earth Observing System) instruments and over 2,400 EOS data products (both output and required input) proposed by EOS investigators is discussed. A brief background on EOS and the EOS Data and Information System (EOSDIS) is presented, and the approach is outlined in terms of a multilayer model. The methodology used to compile, synthesize, and tabulate requirements within the model is described. The principal benefit of this approach is the reduction of effort needed to update the analysis and maintain the accuracy of the science data processing requirements in response to changes in EOS platforms, instruments, data products, processing center allocations, or other model input parameters. The spreadsheets used in the model provide a compact representation, thereby facilitating review and presentation of the information content.
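A toy sketch of the spreadsheet-style roll-up such a multilayer model performs: product-level data rates aggregated to per-instrument processing loads. The instrument names are real EOS instruments, but the products and rates are invented for illustration.

```python
# Toy roll-up of product-level data rates to per-instrument totals,
# mimicking the spreadsheet synthesis described above. Numbers invented.
products = [
    {"instrument": "MODIS", "product": "L1B", "gb_per_day": 60.0},
    {"instrument": "MODIS", "product": "L2-SST", "gb_per_day": 8.5},
    {"instrument": "CERES", "product": "L1", "gb_per_day": 4.2},
]

totals = {}
for p in products:
    totals[p["instrument"]] = totals.get(p["instrument"], 0.0) + p["gb_per_day"]

for inst, gb in sorted(totals.items()):
    print(f"{inst}: {gb:.1f} GB/day")
```

Changing one input row and re-running the roll-up is the "reduction of effort" the abstract claims for its spreadsheet model.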
NASA Technical Reports Server (NTRS)
Southall, J. W.
1979-01-01
The engineering-specified requirements for integrated information processing by means of the Integrated Programs for Aerospace-Vehicle Design (IPAD) system are presented. A data model is described and is based on the design process of a typical aerospace vehicle. General data management requirements are specified for data storage, retrieval, generation, communication, and maintenance. Information management requirements are specified for a two-component data model. In the general portion, data sets are managed as entities, and in the specific portion, data elements and the relationships between elements are managed by the system, allowing user access to individual elements for the purpose of query. Computer program management requirements are specified for support of a computer program library, control of computer programs, and installation of computer programs into IPAD.
NASA Astrophysics Data System (ADS)
ten Veldhuis, Marie-claire; van Riemsdijk, Birna
2013-04-01
Hydrological analysis of urban catchments requires high resolution rainfall and catchment information because of the small size of these catchments, the high spatial variability of the urban fabric, fast runoff processes and related short response times. Rainfall information available from traditional radar and rain gauge networks does not meet the relevant scales of urban hydrology. A new type of weather radar, based on X-band frequency and equipped with Doppler and dual polarimetry capabilities, promises to provide more accurate rainfall estimates at the spatial and temporal scales that are required for urban hydrological analysis. Recently, the RAINGAIN project was started to analyse the applicability of this new type of radar in the context of urban hydrological modelling. In this project, meteorologists and hydrologists work closely together in several stages of urban hydrological analysis: from the acquisition procedure of novel and high-end radar products to data acquisition and processing, rainfall data retrieval, hydrological event analysis and forecasting. The project comprises four pilot locations with various characteristics of weather radar equipment, ground stations, urban hydrological systems, modelling approaches and requirements. Access to data processing and modelling software is handled in different ways in the pilots, depending on ownership and user context. Sharing of data and software among pilots and with the outside world is an ongoing topic of discussion. The availability of high resolution weather data raises requirements with respect to the resolution of hydrological models and input data. This has led to the development of fully distributed hydrological models, the implementation of which remains limited by the unavailability of hydrological input data. On the other hand, if models are to be used in flood forecasting, hydrological models need to be computationally efficient to enable fast responses to extreme event conditions. This presentation will highlight ICT-related requirements and limitations in high resolution urban hydrological modelling and analysis. Further ICT challenges arise in the provision of high resolution radar data for diverging information needs as well as in combination with other data sources in the urban environment. Different types of information are required for such diverse activities as operational flood protection, traffic management, large event organisation, business planning in shopping districts and restaurants, and timing of family activities. These different information needs may require different configurations and data processing for radars and other data sources. An ICT challenge is to develop techniques for deciding how to automatically respond to these diverging information needs (e.g., through (semi-)automated negotiation). Diverse activities also provide a wide variety of information resources that can supplement traditional networks of weather sensors, such as rain sensors on cars and social media. Another ICT challenge is how to combine data from these different sources for answering a particular information need. Examples will be presented of solutions currently being explored.
ERIC Educational Resources Information Center
Mursu, Anja; Luukkonen, Irmeli; Toivanen, Marika; Korpela, Mikko
2007-01-01
Introduction: The purpose of information systems is to facilitate work activities: here we consider how Activity Theory can be applied in information systems development. Method. The requirements for an analytical model for emancipatory, work-oriented information systems research and practice are specified. Previous research work in Activity…
78 FR 12623 - Insurer Reporting Requirements
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-25
... NHTSA's regulation requiring motor vehicle insurers to submit information on the number of thefts and recoveries of insured vehicles and actions taken by the insurer to deter or reduce motor vehicle theft. NHTSA..., which requires insurers to submit information about the make, model, and year of all vehicle thefts, the...
Defining the Core Archive Data Standards of the International Planetary Data Alliance (IPDA)
NASA Technical Reports Server (NTRS)
Hughes, J. Steven; Crichton, Dan; Beebe, Reta; Guinness, Ed; Heather, David; Zender, Joe
2007-01-01
A goal of the International Planetary Data Alliance (IPDA) is to develop a set of archive data standards that enable the sharing of scientific data across international agencies and missions. To help achieve this goal, the IPDA steering committee initiated a six-month project to write requirements for, and draft an information model based on, the Planetary Data System (PDS) archive data standards. The project had a special emphasis on data formats. A set of use case scenarios was first developed, from which a set of requirements was derived for the IPDA archive data standards. The special emphasis on data formats was addressed by identifying data formats that have been used by PDS nodes and other agencies in the creation of successful data sets for the PDS. The dependency of the IPDA information model on the PDS archive standards required the compilation of a formal specification of the archive standards currently in use by the PDS. An ontology modelling tool was chosen to capture the information model from various sources, including the Planetary Science Data Dictionary [1] and the PDS Standards Reference [2]. Exports of the modelling information from the tool database were used to produce the information model document using an object-oriented notation for presenting the model. The tool exports can also be used for software development and are directly accessible by semantic web applications.
Design Requirements for Communication-Intensive Interactive Applications
NASA Astrophysics Data System (ADS)
Bolchini, Davide; Garzotto, Franca; Paolini, Paolo
Online interactive applications call for new requirements paradigms to capture the growing complexity of computer-mediated communication. Crafting successful interactive applications (such as websites and multimedia) involves modeling the requirements for the user experience, including those leading to content design, usable information architecture and interaction, in profound coordination with the communication goals of all stakeholders involved, ranging from persuasion to social engagement, to call for action. To face this grand challenge, we propose a methodology for modeling communication requirements and provide a set of operational conceptual tools to be used in complex projects with multiple stakeholders. Through examples from real-life projects and lessons-learned from direct experience, we draw on the concepts of brand, value, communication goals, information and persuasion requirements to systematically guide analysts to master the multifaceted connections of these elements as drivers to inform successful communication designs.
The Importance of Information Requirements in Designing Acquisition to Information Systems
NASA Technical Reports Server (NTRS)
Davis, Bruce A.; Hill, Chuck; Maughan, Paul M.
1998-01-01
The partnership model used by NASA's Commercial Remote Sensing Program has been successful in better defining remote sensing functional requirements and translation to technical specifications to address environmental needs of the 21st century.
Seaway Information System Management and Control Requirements
DOT National Transportation Integrated Search
1973-10-01
This report examines in detail the control and information system requirements of the St. Lawrence Seaway development program in terms of the needs of the vessel traffic controllers and the management users. Structural control models of Seaway operat...
The application of use case modeling in designing medical imaging information systems.
Safdari, Reza; Farzi, Jebraeil; Ghazisaeidi, Marjan; Mirzaee, Mahboobeh; Goodini, Azadeh
2013-01-01
Introduction. This essay examines the application of use case modeling in analyzing and designing information systems to support medical imaging services. Methods. The application of use case modeling in analyzing and designing health information systems was examined using electronic database (PubMed, Google Scholar) resources, and the characteristics of the modeling approach and its effect on the development and design of health information systems were analyzed. Results. The analysis indicated that provident modeling of health information systems should provide quick access to many health data resources in such a way that patients' data can be used to expand remote services and comprehensive medical imaging advice. These experiences also show that progressing through the infrastructure development stages by gradual and repeated evolution of user requirements is more robust and can shorten the requirements engineering cycle in the design of medical imaging information systems. Conclusion. The use case modeling approach can be effective in directing the problems of health and medical imaging information systems towards understanding, focus at the start and during analysis, better planning, repetition, and control.
Project Shuttle simulation math model coordination catalog, revision 1
NASA Technical Reports Server (NTRS)
1974-01-01
A catalog is presented of subsystem and environment math models used or planned for space shuttle simulations. The purpose is to facilitate sharing of similar math models between shuttle simulations. It provides information on math model requirements, formulations, schedules, and contact persons for further information.
The role of man in flight experiment payload missions. Volume 2: Appendices
NASA Technical Reports Server (NTRS)
Malone, T. B.
1973-01-01
In the study to determine the role of man in Sortie Lab operations, a functional model of a generalized experiment system was developed. The results are presented of a requirements analysis which was conducted to identify performance requirements, information requirements, and interface requirements associated with each function in the model.
ERIC Educational Resources Information Center
Czuchry, Andrew J.; And Others
This user's guide describes the functions, logical operations and subroutines, input data requirements, and available outputs of the Training Requirements Analysis Model (TRAMOD), a computerized analytical life cycle cost modeling system for use in the early stages of system design. Operable in a stand-alone mode, TRAMOD can be used for the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benbennick, M.E.; Broton, M.S.; Fuoto, J.S.
This report describes a model tracking system for a low-level radioactive waste (LLW) disposal facility license application. In particular, the model tracks interrogatories (questions, requests for information, comments) and responses. A set of requirements and desired features for the model tracking system was developed, including required structure and computer screens. Nine tracking systems were then reviewed against the model system requirements and only two were found to meet all requirements. Using Kepner-Tregoe decision analysis, a model tracking system was selected.
Optimal averaging of soil moisture predictions from ensemble land surface model simulations
USDA-ARS?s Scientific Manuscript database
The correct interpretation of ensemble information obtained from the parallel implementation of multiple land surface models (LSMs) requires information concerning the LSM ensemble’s mutual error covariance. Here we propose a new technique for obtaining such information using an instrumental variabl...
49 CFR 573.15 - Public Availability of Motor Vehicle Recall Information.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Internet. The information shall be in a format that is searchable by vehicle make and model and vehicle... following requirements: (1) Be free of charge and not require users to register or submit information, other... (Internet link) to it conspicuously placed on the manufacturer's main United States' Web page; (3) Not...
49 CFR 573.15 - Public availability of motor vehicle recall information.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Internet. The information shall be in a format that is searchable by vehicle make and model and vehicle... following requirements: (1) Be free of charge and not require users to register or submit information, other... (Internet link) to it conspicuously placed on the manufacturer's main United States' Web page; (3) Not...
Clinical professional governance for detailed clinical models.
Goossen, William; Goossen-Baremans, Anneke
2013-01-01
This chapter describes the need for Detailed Clinical Models for contemporary Electronic Health Systems, data exchange and data reuse. It starts with an explanation of the components related to Detailed Clinical Models with a brief summary of knowledge representation, including terminologies representing clinically relevant "things" in the real world, and information models that abstract these in order to let computers process data about these things. Next, Detailed Clinical Models are defined and their purpose is described. The work builds on existing developments around the world and culminates in current work to create a technical specification at the level of the International Standards Organization. The core components of properly expressed Detailed Clinical Models are illustrated, including clinical knowledge and context, data element specification, code bindings to terminologies, and meta-information about authors and versioning, among others. Detailed Clinical Models to date are heavily based on user requirements and specify the conceptual and logical levels of modelling. They are not precise enough for specific implementations, which require an additional step. However, this allows Detailed Clinical Models to serve as specifications for many different kinds of implementations. Examples of Detailed Clinical Models are presented both in text and in Unified Modelling Language. Detailed Clinical Models can be positioned in health information architectures, where they serve at the most detailed granular level. The chapter ends with examples of projects that create and deploy Detailed Clinical Models. All have in common that they can often reuse materials from earlier projects, and that strict governance of these models is essential to use them safely in health care information and communication technology. Clinical validation is one point of such governance, and model testing another. The Plan-Do-Check-Act cycle can be applied for governance of Detailed Clinical Models. Finally, collections of clinical models do require a repository in which they can be stored, searched, and maintained. Governance of Detailed Clinical Models is required at local, national, and international levels.
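As a rough illustration of the granularity involved, the sketch below models one data element of a Detailed Clinical Model with a terminology binding and versioning meta-information. The field names are illustrative assumptions rather than the normative ISO structure; the SNOMED CT code shown is the standard concept for body temperature.

```python
# Sketch of one Detailed Clinical Model data element: a specification with
# a code binding to a terminology plus meta-information (author, version).
# Field names are illustrative, not the ISO technical specification.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class CodeBinding:
    system: str    # e.g. "SNOMED CT"
    code: str
    display: str

@dataclass
class DataElement:
    name: str
    datatype: str            # e.g. "PQ" (physical quantity)
    units: Optional[str]
    binding: CodeBinding
    version: str
    author: str

body_temp = DataElement(
    name="Body temperature",
    datatype="PQ",
    units="Cel",
    binding=CodeBinding(system="SNOMED CT", code="386725007",
                        display="Body temperature"),
    version="1.0",
    author="example author",
)
print(body_temp.binding.code)
```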
Bibliographic Records in an Online Environment
ERIC Educational Resources Information Center
Cossham, Amanda F.
2013-01-01
Introduction: The IFLA functional requirements for bibliographic records model has had a major impact on cataloguing principles and practices over the past fifteen years. This paper evaluates the model in the light of changes in the wider information environment (especially to information resources and retrieval) and in information seeking…
Optimal averaging of soil moisture predictions from ensemble land surface model simulations
USDA-ARS?s Scientific Manuscript database
The correct interpretation of ensemble soil moisture information obtained from the parallel implementation of multiple land surface models (LSMs) requires information concerning the LSM ensemble's mutual error covariance. Here we propose a new technique for obtaining such information using an inst...
NASA Astrophysics Data System (ADS)
Delgado, Francisco
2017-12-01
Quantum information is an emergent area merging physics, mathematics, computer science and engineering. To reach its technological goals, it requires adequate approaches to understand how to combine physical restrictions, computational approaches and technological requirements to obtain functional universal quantum information processing. This work presents the modeling and analysis of a certain general type of Hamiltonian representing several physical systems used in quantum information, and establishes a dynamics reduction in a natural grammar for bipartite processing based on entangled states.
Beware the tail that wags the dog: informal and formal models in biology
Gunawardena, Jeremy
2014-01-01
Informal models have always been used in biology to guide thinking and devise experiments. In recent years, formal mathematical models have also been widely introduced. It is sometimes suggested that formal models are inherently superior to informal ones and that biology should develop along the lines of physics or economics by replacing the latter with the former. Here I suggest to the contrary that progress in biology requires a better integration of the formal with the informal. PMID:25368417
Automatically updating predictive modeling workflows support decision-making in drug design.
Muegge, Ingo; Bentzien, Jörg; Mukherjee, Prasenjit; Hughes, Robert O
2016-09-01
Using predictive models for early decision-making in drug discovery has become standard practice. We suggest that model building needs to be automated with minimum input and low technical maintenance requirements. Models perform best when tailored to answering specific compound-optimization questions. If qualitative answers are required, 2-bin classification models are preferred. Integrating predictive modeling results with structural information stimulates better decision making. For in silico models supporting rapid structure-activity relationship cycles, performance deteriorates within weeks. Frequent automated updates of predictive models ensure best predictions. Consensus between multiple modeling approaches increases the prediction confidence. Combining qualified and nonqualified data optimally uses all available information. Dose predictions provide a holistic alternative to multiple individual property predictions for reaching complex decisions.
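A minimal sketch of the "2-bin classification" idea: binarize a continuous property at a project-relevant cutoff and train a classifier that can be retrained automatically as new data arrive. The descriptors and cutoff here are random stand-ins, not real assay data.

```python
# Minimal sketch of a 2-bin classification model: a continuous property is
# binarized at a cutoff and a classifier predicts the bin. Data are random
# stand-ins for compound descriptors and measured property values.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 16))              # compound descriptors (stand-in)
prop = X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500)
y = (prop > 0.0).astype(int)                # 2 bins: above / below cutoff

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

In the automated-update setting the abstract recommends, this training step would simply be re-run on a schedule against the latest qualified data.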
Proposing a Metaliteracy Model to Redefine Information Literacy
ERIC Educational Resources Information Center
Jacobson, Trudi E.; Mackey, Thomas P.
2013-01-01
Metaliteracy is envisioned as a comprehensive model for information literacy to advance critical thinking and reflection in social media, open learning settings, and online communities. At this critical time in higher education, an expansion of the original definition of information literacy is required to include the interactive production and…
Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M
2000-01-01
Hospital information systems have to support quality improvement objectives. The design issues of a health care information system can be classified into three categories: 1) time-oriented and event-labelled storage of patient data; 2) contextual support of decision-making; 3) capabilities for modular upgrading. The elicitation of requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualize clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the field of blood transfusion. An object-oriented data model of a process has been defined in order to identify its main components: activity, sub-process, resources, constraints, guidelines, parameters and indicators. Although some aspects of activity, such as "where", "what else", and "why", are poorly represented by the data model alone, this method of requirement elicitation fits the dynamics of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for this approach to be generalised within the organisation, for the processes to be interrelated, and for their characteristics to be shared.
NASA Astrophysics Data System (ADS)
Wong, T. E.; Noone, D. C.; Kleiber, W.
2014-12-01
The single largest uncertainty in climate model energy balance is the surface latent heating over tropical land. Furthermore, the partitioning of the total latent heat flux into contributions from surface evaporation and plant transpiration is of great importance, but notoriously poorly constrained. Resolving these issues will require better exploiting information which lies at the interface between observations and advanced modeling tools, both of which are imperfect. There are remarkably few observations which can constrain these fluxes, placing strict requirements on developing statistical methods to maximize the use of limited information to best improve models. Previous work has demonstrated the power of incorporating stable water isotopes into land surface models for further constraining ecosystem processes. We present results from a stable water isotopically-enabled land surface model (iCLM4), including model experiments partitioning the latent heat flux into contributions from plant transpiration and surface evaporation. It is shown that the partitioning results are sensitive to the parameterization of kinetic fractionation used. We discuss and demonstrate an approach to calibrating select model parameters to observational data in a Bayesian estimation framework, requiring Markov Chain Monte Carlo sampling of the posterior distribution, which is shown to constrain uncertain parameters as well as inform relevant values for operational use. Finally, we discuss the application of the estimation scheme to iCLM4, including entropy as a measure of information content and specific challenges which arise in calibration models with a large number of parameters.
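As a concrete illustration of the Bayesian estimation framework mentioned above, here is a minimal Metropolis sampler calibrating a single parameter to synthetic observations; the "model", prior bounds, and step size are toy stand-ins, not iCLM4 or its parameters.

```python
# Minimal sketch of Bayesian parameter calibration by Metropolis sampling.
# Toy model and data only; a land-surface model calibration would replace
# log_posterior's forward model with an expensive simulation.
import numpy as np

rng = np.random.default_rng(42)
obs = 2.0 + rng.normal(scale=0.3, size=50)   # synthetic observations

def log_posterior(theta):
    if not (0.0 < theta < 10.0):             # uniform prior bounds (assumed)
        return -np.inf
    resid = obs - theta                       # toy model: predicts a constant
    return -0.5 * np.sum(resid**2) / 0.3**2   # Gaussian log-likelihood

def metropolis(logp, theta0, n_iter=5000, step=0.1):
    chain = np.empty(n_iter)
    theta, lp = theta0, logp(theta0)
    for i in range(n_iter):
        prop = theta + step * rng.normal()    # random-walk proposal
        lp_prop = logp(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop         # accept
        chain[i] = theta                      # else keep current value
    return chain

chain = metropolis(log_posterior, theta0=1.0)
print(f"posterior mean ~ {chain[1000:].mean():.2f}")  # close to 2.0
```

The spread of the retained chain is the calibrated parameter uncertainty; information-content measures such as entropy, as mentioned above, can be computed from the same samples.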
Exploration Medical System Trade Study Tools Overview
NASA Technical Reports Server (NTRS)
Mindock, J.; Myers, J.; Latorella, K.; Cerro, J.; Hanson, A.; Hailey, M.; Middour, C.
2018-01-01
ExMC is creating an ecosystem of tools to enable well-informed medical system trade studies. The suite of tools addresses important system implementation aspects of the space medical capabilities trade space and is being built using knowledge from the medical community regarding the unique aspects of space flight. Two integrating models, a systems engineering model and a medical risk analysis model, tie the tools together to produce an integrated assessment of the medical system and its ability to achieve medical system target requirements. This presentation will provide an overview of the various tools that are part of the tool ecosystem. Initially, the presentation's focus will address the tools that supply the foundational information to the ecosystem. Specifically, the talk will describe how information about how medicine will be practiced is captured and categorized for efficient utilization in the tool suite. For example, the talk will cover capturing which conditions will be planned for in-mission treatment, planned medical activities (e.g., periodic physical exam), required medical capabilities (e.g., provide imaging), and options to implement the capabilities (e.g., an ultrasound device). Database storage and configuration management will also be discussed. The presentation will include an overview of how these information tools will be tied to parameters in a Systems Modeling Language (SysML) model, allowing traceability to system behavioral, structural, and requirements content. The discussion will also describe an HRP-led enhanced risk assessment model developed to provide quantitative insight into each capability's contribution to mission success. Key outputs from these various tools, to be shared with the space medical and exploration mission development communities, will be assessments of how well medical system implementation options satisfy requirements, and of per-capability contributions toward achieving requirements.
Information Interaction Study for DER and DMS Interoperability
NASA Astrophysics Data System (ADS)
Liu, Haitao; Lu, Yiming; Lv, Guangxian; Liu, Peng; Chen, Yu; Zhang, Xinhui
The Common Information Model (CIM) is an abstract data model that can be used to represent the major objects in Distribution Management System (DMS) applications. Because the CIM does not model Distributed Energy Resources (DERs), it cannot meet the requirements that DER operation and management place on advanced DMS applications. DER modeling was therefore studied from a system point of view, and the article first proposes an extended CIM information model. By analyzing the basic structure of message interaction between the DMS and DERs, a bidirectional message-mapping method based on data exchange is then proposed.
NASA Technical Reports Server (NTRS)
Arnold, S. M.
2006-01-01
Materials property information such as composition and thermophysical/mechanical properties abounds in the literature. Oftentimes, however, the corresponding response curves from which these data are determined are missing or at the very least difficult to retrieve. Further, the paradigm for collecting materials property information has historically centered on (1) properties for materials comparison/selection purposes and (2) input requirements for conventional design/analysis methods. However, just as not all materials are alike or equal, neither are all constitutive models (and thus design/analysis methods) equal; each model typically has its own specific and often unique required materials parameters, some directly measurable and others indirectly measurable. Therefore, the type and extent of materials information routinely collected is not always sufficient to meet the current, much less future, needs of the materials modeling community. Informatics has been defined as the science concerned with gathering, manipulating, storing, retrieving, and classifying recorded information. A key aspect of informatics is its focus on understanding problems and applying information technology as needed to address those problems. The primary objective of this article is to highlight the need for a paradigm shift in materials data collection, analysis, and dissemination so as to maximize the impact on both practitioners and researchers. Our hope is to identify and articulate what constitutes "sufficient" data content (i.e., quality and quantity) for developing, characterizing, and validating sophisticated nonlinear time- and history-dependent (hereditary) constitutive models. Likewise, the informatics infrastructure required for handling the potentially massive amounts of materials data will be discussed.
Information engineering for molecular diagnostics.
Sorace, J. M.; Ritondo, M.; Canfield, K.
1994-01-01
Clinical laboratories are beginning to apply the recent advances in molecular biology to the testing of patient samples. The emerging field of Molecular Diagnostics will require a new Molecular Diagnostics Laboratory Information System which handles the data types, samples and test methods found in this field. The system must be very flexible with regard to supporting ad-hoc queries. The requirements which are shaping the developments in this field are reviewed and a data model developed. Several queries which demonstrate the data model's ability to support the information needs of this area have been developed and run. These results demonstrate the ability of the proposed data model to meet the current and projected needs of this rapidly expanding field. PMID:7949937
ERIC Educational Resources Information Center
Kamis-Gould, Edna; And Others
1991-01-01
A model for quality assurance (QA) in psychiatric hospitals is described. Its functions (general QA, utilization review, clinical records, evaluation, management information systems, risk management, and infection control), subfunctions, and corresponding staffing requirements are reviewed. This model was designed to foster standardization in QA…
Human-telerobot interactions - Information, control, and mental models
NASA Technical Reports Server (NTRS)
Smith, Randy L.; Gillan, Douglas J.
1987-01-01
A part of NASA's Space Station will be a teleoperated robot (telerobot) with arms for grasping and manipulation, feet for holding onto objects, and television cameras for visual feedback. The objective of the work described in this paper is to develop the requirements and specifications for the user-telerobot interface and to determine through research and testing that the interface results in efficient system operation. The focus of the development of the user-telerobot interface is on the information required by the user, the user inputs, and the design of the control workstation. Closely related to both the information required by the user and the user's control of the telerobot is the user's mental model of the relationship between the control inputs and the telerobot's actions.
Next Generation Multimedia Distributed Data Base Systems
NASA Technical Reports Server (NTRS)
Pendleton, Stuart E.
1997-01-01
The paradigm of client/server computing is changing. The model of a server running a monolithic application and supporting clients at the desktop is giving way to a different model that blurs the line between client and server. We are on the verge of plunging into the next generation of computing technology--distributed object-oriented computing. This is not only a change in requirements but a change in opportunities, and requires a new way of thinking for Information System (IS) developers. The information system demands caused by global competition require ever more access to decision-making tools. Simply put, object-oriented technology has been developed to supersede the current design process of information systems, which is not capable of handling next-generation multimedia.
A Statistical Framework for Protein Quantitation in Bottom-Up MS-Based Proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas
2009-08-15
Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and associated confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and informative missingness. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model that carefully accounts for informative missingness in peak intensities and allows unbiased, model-based, protein-level estimation and inference. The model is applicable to both label-based and label-free quantitation experiments. We also provide automated, model-based algorithms for filtering of proteins and peptides as well as imputation of missing values. Two LC/MS datasets are used to illustrate the methods. In simulation studies, our methods are shown to achieve substantially more discoveries than standard alternatives. Availability: The software has been made available in the open-source proteomics platform DAnTE (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu Supplementary information: Supplementary data are available at Bioinformatics online.
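Informative missingness here means peptides are more likely to be missing when their abundance is low, so ignoring the mechanism biases protein-level estimates upward. A minimal censored-likelihood sketch of that idea (a Tobit-style estimate, not the paper's full model; the detection limit and distributional assumptions are invented):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)

# Simulate log-intensities for one protein's peptides; values below a
# detection limit are recorded as missing (informative missingness).
true_mu, sigma, limit = 20.0, 1.5, 19.5
x = rng.normal(true_mu, sigma, 200)
observed = x[x >= limit]
n_missing = np.sum(x < limit)

# The naive estimate ignores why values are missing and is biased upward.
print("naive mean:", observed.mean())

# Censored likelihood: observed peptides contribute a density term, missing
# peptides contribute the probability of falling below the detection limit.
def neg_log_lik(params):
    mu = params[0]
    ll = norm.logpdf(observed, mu, sigma).sum()
    ll += n_missing * norm.logcdf(limit, mu, sigma)
    return -ll

fit = minimize(neg_log_lik, x0=np.array([observed.mean()]))
print("censored MLE:", fit.x[0])   # close to true_mu = 20.0
```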
1991-10-01
Subject terms: engineering management information systems; method formalization; information engineering; process modeling; information systems requirements definition methods; knowledge acquisition methods; systems engineering.
Research on BIM-based building information value chain reengineering
NASA Astrophysics Data System (ADS)
Hui, Zhao; Weishuang, Xie
2017-04-01
Value and value-added factors accrue to building engineering information through a chain of flows, that is, the building information value chain. Starting from a deconstruction of how construction information flows in the traditional information mode, this paper clarifies the value characteristics and requirements of each stage of a construction project. To achieve value-added building information, the paper deconstructs the traditional building information value chain, reengineers the information value chain model on the basis of BIM theory and techniques, builds a value-added management model, and analyses the value of the model.
ERIC Educational Resources Information Center
Yang, Samuel C.
2016-01-01
The author examines the present state of information systems undergraduate programs in the United States. He reviewed 516 institutions and collected data on 234 institutions offering information systems (IS) undergraduate programs. Of seven core courses required by the IS 2010 curriculum model, four are required by more than 50% of the programs,…
A Transparent Translation from Legacy System Model into Common Information Model: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, Fei; Simpson, Jeffrey; Zhang, Yingchen
Advances in smart grid technology are pushing utilities towards better monitoring, control and analysis of distribution systems, and require extensive cyber-based intelligent systems and applications to realize various functionalities. The ability of systems, or components within systems, to interact and exchange services or information with each other is the key to the success of smart grid technologies, and it requires an efficient information exchange and data sharing infrastructure. The Common Information Model (CIM) is a standard that allows different applications to exchange information about an electrical system, and it has become a widely accepted solution for information exchange among different platforms and applications. However, most existing legacy systems were not developed using CIM, but using their own languages. Integrating such legacy systems is a challenge for utilities, and the appropriate utilization of the integrated legacy systems is even more intricate. Thus, this paper develops an approach and an open-source tool to translate legacy system models into CIM format. The developed tool is tested on a commercial distribution management system, and simulation results prove its effectiveness.
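A toy sketch of the kind of legacy-to-CIM translation described, assuming a hypothetical whitespace-delimited legacy record format; the legacy field names and the attribute mapping are invented, and a real translator would emit the CIM RDF schema (IEC 61970/61968) rather than Python dicts:

```python
# Hypothetical legacy feeder records mapped onto CIM-style classes. The
# legacy format, field order, and attribute mapping are invented for
# illustration only.
LEGACY_TO_CIM = {
    "LINE": ("ACLineSegment",
             {"len_km": "Conductor.length", "r_ohm": "ACLineSegment.r"}),
    "XFMR": ("PowerTransformer",
             {"kva": "PowerTransformer.ratedS"}),
}

def translate(legacy_line: str) -> dict:
    """Translate one legacy record into a CIM-style class/attribute dict."""
    kind, name, *pairs = legacy_line.split()
    cim_class, field_map = LEGACY_TO_CIM[kind]
    attrs = {"IdentifiedObject.name": name}
    for pair in pairs:
        key, value = pair.split("=")
        attrs[field_map[key]] = float(value)
    return {"class": cim_class, "attributes": attrs}

print(translate("LINE feeder_12 len_km=1.4 r_ohm=0.21"))
# {'class': 'ACLineSegment', 'attributes': {'IdentifiedObject.name':
#  'feeder_12', 'Conductor.length': 1.4, 'ACLineSegment.r': 0.21}}
```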
12 CFR Appendix B to Part 1022 - Model Notices of Furnishing Negative Information
Code of Federal Regulations, 2010 CFR
2018-01-01
... 12 Banks and Banking 8 2018-01-01 2018-01-01 false Model Notices of Furnishing Negative... REPORTING (REGULATION V) Pt. 1022, App. B Appendix B to Part 1022—Model Notices of Furnishing Negative Information a. Although use of the model notices is not required, a financial institution that is subject to...
Using task analysis to improve the requirements elicitation in health information system.
Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa
2007-01-01
This paper describes the application of task analysis within the design process of a Web-based information system for managing clinical information in hemophilia care, in order to improve the requirements elicitation and, consequently, to validate the domain model obtained in a previous phase of the design process (system analysis). The use of task analysis in this case proved to be a practical and efficient way to improve the requirements engineering process by involving users in the design process.
Information systems - Issues in global habitability
NASA Technical Reports Server (NTRS)
Norman, S. D.; Brass, J. A.; Jones, H.; Morse, D. R.
1984-01-01
The present investigation is concerned with fundamental issues, related to information considerations, which arise in an interdisciplinary approach to questions of global habitability. Information system problems and issues are illustrated with the aid of an example involving biochemical cycling and biochemical productivity. The estimation of net primary production (NPP) as an important consideration in the overall global habitability issue is discussed. The NPP model requires three types of data, related to meteorological information, a land surface inventory, and the vegetation structure. Approaches for obtaining and processing these data are discussed. Attention is given to user requirements, information system requirements, workstations, network communications, hardware/software access, and data management.
Building team adaptive capacity: the roles of sensegiving and team composition.
Randall, Kenneth R; Resick, Christian J; DeChurch, Leslie A
2011-05-01
The current study draws on motivated information processing in groups theory to propose that leadership functions and composition characteristics provide teams with the epistemic and social motivation needed for collective information processing and strategy adaptation. Three-person teams performed a city management decision-making simulation (N=74 teams; 222 individuals). Teams first managed a simulated city that was newly formed and required growth strategies and were then abruptly switched to a second simulated city that was established and required revitalization strategies. Consistent with hypotheses, external sensegiving and team composition enabled distinct aspects of collective information processing. Sensegiving prompted the emergence of team strategy mental models (i.e., cognitive information processing); psychological collectivism facilitated information sharing (i.e., behavioral information processing); and cognitive ability provided the capacity for both the cognitive and behavioral aspects of collective information processing. In turn, team mental models and information sharing enabled reactive strategy adaptation.
Martínez-Costa, Catalina; Cornet, Ronald; Karlsson, Daniel; Schulz, Stefan; Kalra, Dipak
2015-05-01
To improve semantic interoperability of electronic health records (EHRs) by ontology-based mediation across syntactically heterogeneous representations of the same or similar clinical information. Our approach is based on a semantic layer that consists of: (1) a set of ontologies supported by (2) a set of semantic patterns. The first aspect of the semantic layer helps standardize the clinical information modeling task and the second shields modelers from the complexity of ontology modeling. We applied this approach to heterogeneous representations of an excerpt of a heart failure summary. Using a set of finite top-level patterns to derive semantic patterns, we demonstrate that those patterns, or compositions thereof, can be used to represent information from clinical models. Homogeneous querying of the same or similar information, when represented according to heterogeneous clinical models, is feasible. Our approach focuses on the meaning embedded in EHRs, regardless of their structure. This complex task requires a clear ontological commitment (i.e., agreement to consistently use the shared vocabulary within some context), together with formalization rules. These requirements are supported by semantic patterns. Other potential uses of this approach, such as clinical models validation, require further investigation. We show how an ontology-based representation of a clinical summary, guided by semantic patterns, allows homogeneous querying of heterogeneous information structures. Whether there are a finite number of top-level patterns is an open question. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.
Modeling Requirements for Cohort and Register IT.
Stäubert, Sebastian; Weber, Ulrike; Michalik, Claudia; Dress, Jochen; Ngouongo, Sylvie; Stausberg, Jürgen; Winter, Alfred
2016-01-01
The project KoRegIT (funded by TMF e.V.) aimed to develop a generic catalog of requirements for research networks like cohort studies and registers (KoReg). The catalog supports such research networks in building up and managing their organizational and IT infrastructure. The objective was to make transparent the complex relationships between requirements, which are described in use cases from a given text catalog. By analyzing and modeling the requirements, a better understanding and optimizations of the catalog are intended. There are two subgoals: a) to investigate one cohort study and two registers and to model the current state of their IT infrastructure; b) to analyze the current state models and to find simplifications within the generic catalog. Processing the generic catalog was performed by means of text extraction, conceptualization and concept mapping. Methods of enterprise architecture planning (EAP) were then used to model the extracted information. To address objective a), questionnaires were developed utilizing the model. They were used for semi-structured interviews, whose results were evaluated via qualitative content analysis; afterwards the current state was modeled. Objective b) was addressed by model analysis. A given generic text catalog of requirements was transferred into a model. As a result of objective a), current state models of one existing cohort study and two registers were created and analyzed. An optimized model called the KoReg-reference-model is the result of objective b). It is possible to use methods of EAP to model requirements. This enables a better overview of the partly connected requirements by means of visualization. The model-based approach also enables the analysis and comparison of the empirical data from the current state models. Information managers could reduce the effort of planning the IT infrastructure by utilizing the KoReg-reference-model. Modeling the current state and generating reports from the model, which could be used as requirements specifications for bids, is supported, too.
An empirical analysis of executive behaviour with hospital executive information systems in Taiwan.
Huang, Wei-Min
2013-01-01
Existing health information systems largely only support the daily operations of a medical centre, and are unable to generate the information required by executives for decision-making. Building on past research concerning information retrieval behaviour and learning through mental models, this study examines the use of information systems by hospital executives in medical centres. It uses a structural equation model to help find ways hospital executives might use information systems more effectively. The results show that computer self-efficacy directly affects the maintenance of mental models, and that system characteristics directly impact learning styles and information retrieval behaviour. Other results include the significant impact of perceived environmental uncertainty on scan searches; information retrieval behaviour and focused searches on mental models and perceived efficiency; scan searches on mental model building; learning styles and model building on perceived efficiency; and finally the impact of mental model maintenance on perceived efficiency and effectiveness.
SLS Model Based Design: A Navigation Perspective
NASA Technical Reports Server (NTRS)
Oliver, T. Emerson; Anzalone, Evan; Park, Thomas; Geohagan, Kevin
2018-01-01
The SLS Program has implemented a Model-based Design (MBD) and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team is responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1B design, the additional GPS Receiver hardware model is managed as a DMM at the vehicle design level. This paper describes the models, and discusses the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the navigation components.
An analytical approach to customer requirement information processing
NASA Astrophysics Data System (ADS)
Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong
2013-11-01
'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.
Hay, L.; Knapp, L.
1996-01-01
Investigating natural, potential, and man-induced impacts on hydrological systems commonly requires complex modelling with overlapping data requirements, and massive amounts of one- to four-dimensional data at multiple scales and formats. Given the complexity of most hydrological studies, the requisite software infrastructure must incorporate many components including simulation modelling, spatial analysis and flexible, intuitive displays. There is a general requirement for a set of capabilities to support scientific analysis which, at this time, can only come from an integration of several software components. Integration of geographic information systems (GISs) and scientific visualization systems (SVSs) is a powerful technique for developing and analysing complex models. This paper describes the integration of an orographic precipitation model, a GIS and a SVS. The combination of these individual components provides a robust infrastructure which allows the scientist to work with the full dimensionality of the data and to examine the data in a more intuitive manner.
Skill Assessment in Ocean Biological Data Assimilation
NASA Technical Reports Server (NTRS)
Gregg, Watson W.; Friedrichs, Marjorie A. M.; Robinson, Allan R.; Rose, Kenneth A.; Schlitzer, Reiner; Thompson, Keith R.; Doney, Scott C.
2008-01-01
There is growing recognition that rigorous skill assessment is required to understand the ability of ocean biological models to represent ocean processes and distributions. Statistical analysis of model results against observations represents the most quantitative form of skill assessment, and this principle serves as well for data assimilation models. However, skill assessment for data assimilation requires special consideration. This is because there are three sets of information: the free-run model, the data, and the assimilation model, which uses information from both the free-run model and the data. Intercomparison of results among the three sets of information is important and useful for assessment, but is not conclusive since the three information sets are intertwined. An independent data set is necessary for an objective determination. Other useful measures of ocean biological data assimilation assessment include responses of unassimilated variables to the data assimilation, performance outside the prescribed region/time of interest, forecasting, and trend analysis. Examples of each approach from the literature are provided. A comprehensive list of ocean biological data assimilation studies and their applications of skill assessment, in both ecosystem/biogeochemical and fisheries efforts, is summarized.
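A small sketch of the statistical comparison the abstract calls for: basic skill metrics computed for a free-run and an assimilation run against an independent observation set (all numbers are illustrative):

```python
import numpy as np

def skill(model, obs):
    """Basic skill statistics for comparing model output with observations."""
    model, obs = np.asarray(model), np.asarray(obs)
    bias = np.mean(model - obs)
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    corr = np.corrcoef(model, obs)[0, 1]
    # Skill relative to using the observed mean as a predictor (1 = perfect).
    mef = 1.0 - np.sum((model - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
    return {"bias": bias, "rmse": rmse, "r": corr, "model_efficiency": mef}

obs = np.array([0.8, 1.1, 1.9, 2.4, 3.0])      # e.g., observed chlorophyll
free_run = np.array([1.2, 1.0, 1.6, 2.9, 3.5])
assim = np.array([0.9, 1.1, 1.8, 2.5, 3.1])
print(skill(free_run, obs))
print(skill(assim, obs))                        # should score better
```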
40 CFR 91.504 - Maintenance of records; submittal of information.
Code of Federal Regulations, 2010 CFR
2010-07-01
... testing required for the engine family in a model year. Records may be retained as hard copy (i.e., on... hard copy is retained. (c) The manufacturer must, upon request by the Administrator, submit the... testing using an EPA information format. The Administrator may exempt manufacturers from this requirement...
NASA Technical Reports Server (NTRS)
Reil, Robin L.
2014-01-01
Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the "traditional" document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g., time and cost saved, increased product quality). This paper presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to demonstrate the completeness and consistency of the requirements to the mission concept. Anecdotal information and process-duration metrics are presented for both the MBSE and original DBSE design efforts of SporeSat.
ERIC Educational Resources Information Center
Smith, Hubert Gene
The objectives of the study presented in the dissertation were to identify present and anticipated information requirements of the various departments within the Oklahoma State Department of Vocational and Technical Education, to design a computerized information system model utilizing an integrated systems concept to meet information…
Using conceptual work products of health care to design health IT.
Berry, Andrew B L; Butler, Keith A; Harrington, Craig; Braxton, Melissa O; Walker, Amy J; Pete, Nikki; Johnson, Trevor; Oberle, Mark W; Haselkorn, Jodie; Paul Nichol, W; Haselkorn, Mark
2016-02-01
This paper introduces a new, model-based design method for interactive health information technology (IT) systems. This method extends workflow models with models of conceptual work products. When the health care work being modeled is substantially cognitive, tacit, and complex in nature, graphical workflow models can become too complex to be useful to designers. Conceptual models complement and simplify workflows by providing an explicit specification for the information product they must produce. We illustrate how conceptual work products can be modeled using standard software modeling language, which allows them to provide fundamental requirements for what the workflow must accomplish and the information that a new system should provide. Developers can use these specifications to envision how health IT could enable an effective cognitive strategy as a workflow with precise information requirements. We illustrate the new method with a study conducted in an outpatient multiple sclerosis (MS) clinic. This study shows specifically how the different phases of the method can be carried out, how the method allows for iteration across phases, and how the method generated a health IT design for case management of MS that is efficient and easy to use. Copyright © 2015 Elsevier Inc. All rights reserved.
A hierarchical modeling methodology for the definition and selection of requirements
NASA Astrophysics Data System (ADS)
Dufresne, Stephane
This dissertation describes the development of a requirements analysis methodology that takes into account the concept of operations and the hierarchical decomposition of aerospace systems. At the core of the methodology, the Analytic Network Process (ANP) is used to ensure the traceability between the qualitative and quantitative information present in the hierarchical model. The proposed methodology is applied to the requirements definition of a hurricane tracker Unmanned Aerial Vehicle. Three research objectives are identified in this work: (1) improve the requirements mapping process by matching the stakeholder expectations with the concept of operations, systems and available resources; (2) reduce the epistemic uncertainty surrounding the requirements and requirements mapping; and (3) improve the requirements down-selection process by taking into account the level of importance of the criteria and the available resources. Several challenges are associated with the identification and definition of requirements. The complexity of the system implies that a large number of requirements are needed to define the systems. These requirements are defined early in the conceptual design, where the level of knowledge is relatively low and the level of uncertainty is large. The proposed methodology intends to increase the level of knowledge and reduce the level of uncertainty by guiding the design team through a structured process. To address these challenges, a new methodology is created to flow down the requirements from the stakeholder expectations to the systems alternatives. A taxonomy of requirements is created to classify the information gathered during the problem definition. Subsequently, the operational and systems functions and measures of effectiveness are integrated into a hierarchical model to allow the traceability of the information. Monte Carlo methods are used to evaluate the variations of the hierarchical model elements and consequently reduce the epistemic uncertainty. The proposed methodology is applied to the design of a hurricane tracker Unmanned Aerial Vehicle to demonstrate the origin and impact of requirements on the concept of operations and systems alternatives. This research demonstrates that the hierarchical modeling methodology provides a traceable flow-down of the requirements from the problem definition to the systems alternatives phases of conceptual design.
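A minimal sketch of the Monte Carlo step described above: perturbing criterion weights in a much-simplified additive stand-in for the ANP hierarchy to see how stable the down-selection ranking is under epistemic uncertainty; the alternatives, scores, and weights are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# Three candidate UAV configurations scored against three criteria
# (scores and weights are illustrative, not from the dissertation).
scores = np.array([[0.7, 0.5, 0.9],    # alternative A
                   [0.6, 0.8, 0.6],    # alternative B
                   [0.9, 0.4, 0.7]])   # alternative C
w_nominal = np.array([0.5, 0.3, 0.2])

# Perturb the criterion weights to represent epistemic uncertainty,
# renormalise, and count how often each alternative ranks first.
wins = np.zeros(3)
for _ in range(10000):
    w = np.clip(w_nominal + rng.normal(0, 0.1, 3), 1e-6, None)
    w /= w.sum()
    wins[np.argmax(scores @ w)] += 1

print(wins / wins.sum())   # rank stability under weight uncertainty
```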
Research on Capturing of Customer Requirements Based on Innovation Theory
NASA Astrophysics Data System (ADS)
junwu, Ding; dongtao, Yang; zhenqiang, Bao
To capture customer requirements information exactly and effectively, a new customer requirements capturing modeling method was proposed. Based on the analysis of the functional requirement models of previous products and the application of the technological system evolution laws of the Theory of Inventive Problem Solving (TRIZ), customer requirements can be evolved from existing product designs by modifying the functional requirement unit and confirming the direction of evolution design. Finally, a case study is provided to illustrate the feasibility of the proposed approach.
Study of data collection platform concepts: Data collection system user requirements
NASA Technical Reports Server (NTRS)
1973-01-01
The overall purpose of the survey was to provide real world data on user requirements. The intent was to assess data collection system user requirements by questioning actual potential users rather than speculating on requirements. The end results of the survey are baseline requirements models for both a data collection platform and a data collection system. These models were derived from the survey results. The real value of these models lies in the fact that they are based on actual user requirements as delineated in the survey questionnaires. Some users desire data collection platforms of small size and light weight. These sizes and weights are beyond the present state of the art. Also, the survey provided a wealth of information on the nature and constituency of the data collection user community as well as information on user applications for data collection systems. Finally, the data sheds light on the generalized platform concept. That is, the diversity of user requirements shown in the data indicates the difficulty that can be anticipated in attempting to implement such a concept.
A Product Development Decision Model for Cockpit Weather Information System
NASA Technical Reports Server (NTRS)
Sireli, Yesim; Kauffmann, Paul; Gupta, Surabhi; Kachroo, Pushkin; Johnson, Edward J., Jr. (Technical Monitor)
2003-01-01
There is a significant market demand for advanced cockpit weather information products. However, it is unclear how to identify the most promising technological options that provide the desired mix of consumer requirements by employing feasible technical systems at a price that achieves market success. This study develops a unique product development decision model that employs Quality Function Deployment (QFD) and Kano's model of consumer choice. This model is specifically designed for exploration and resolution of this and similar information technology related product development problems.
Geographic information system/watershed model interface
Fisher, Gary T.
1989-01-01
Geographic information systems allow for the interactive analysis of spatial data related to water-resources investigations. A conceptual design for an interface between a geographic information system and a watershed model includes functions for the estimation of model parameter values. Design criteria include ease of use, minimal equipment requirements, a generic data-base management system, and use of a macro language. An application is demonstrated for a 90.1-square-kilometer subbasin of the Patuxent River near Unity, Maryland, that performs automated derivation of watershed parameters for hydrologic modeling.
A Product Development Decision Model for Cockpit Weather Information Systems
NASA Technical Reports Server (NTRS)
Sireli, Yesim; Kauffmann, Paul; Gupta, Surabhi; Kachroo, Pushkin
2003-01-01
There is a significant market demand for advanced cockpit weather information products. However, it is unclear how to identify the most promising technological options that provide the desired mix of consumer requirements by employing feasible technical systems at a price that achieves market success. This study develops a unique product development decision model that employs Quality Function Deployment (QFD) and Kano's model of consumer choice. This model is specifically designed for exploration and resolution of this and similar information technology related product development problems.
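The Kano part of such a decision model classifies each candidate feature from paired functional/dysfunctional survey answers. A sketch using the commonly published Kano evaluation table; the weather-product question wording in the usage example is invented:

```python
# Standard Kano evaluation table: rows = answer to the functional question
# ("How do you feel if the product has this feature?"), columns = answer to
# the dysfunctional question ("...if it does not?").
SCALE = ["like", "must-be", "neutral", "live-with", "dislike"]
TABLE = [  # A=attractive, O=one-dimensional, M=must-be, I=indifferent,
           # R=reverse, Q=questionable
    ["Q", "A", "A", "A", "O"],
    ["R", "I", "I", "I", "M"],
    ["R", "I", "I", "I", "M"],
    ["R", "I", "I", "I", "M"],
    ["R", "R", "R", "R", "Q"],
]

def kano_category(functional: str, dysfunctional: str) -> str:
    return TABLE[SCALE.index(functional)][SCALE.index(dysfunctional)]

# "I like having datalinked weather graphics" / "I dislike not having them"
print(kano_category("like", "dislike"))      # 'O' -> one-dimensional
print(kano_category("neutral", "dislike"))   # 'M' -> must-be
```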
Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2009-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
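A toy sketch of task 3 above, enumerating possible propagation paths from a hazard source to a vulnerable function over an extracted architecture graph; the component names are invented, not from the Orion models:

```python
from collections import deque

# Toy architecture graph: a directed edge means "can propagate a failure to".
edges = {
    "power_bus": ["flight_computer", "pump"],
    "pump": ["coolant_loop"],
    "coolant_loop": ["flight_computer"],
    "flight_computer": ["guidance_function"],
}

def hazard_paths(graph, source, target):
    """Enumerate simple paths from a hazard source to a vulnerable entity."""
    paths, queue = [], deque([[source]])
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            paths.append(path)
            continue
        for nxt in graph.get(path[-1], []):
            if nxt not in path:          # keep paths simple (no cycles)
                queue.append(path + [nxt])
    return paths

for p in hazard_paths(edges, "power_bus", "guidance_function"):
    print(" -> ".join(p))
```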
12 CFR Appendix A to Part 573 - Model Privacy Form
Code of Federal Regulations, 2010 CFR
2010-01-01
... rates and payments; retirement assets; checking account information; employment information; wire... identified as “[account #].” Institutions that require additional or different information, such as a random... for financing; apply for a lease; provide account information; give us your contact information; pay...
12 CFR Appendix A to Part 40 - Model Privacy Form
Code of Federal Regulations, 2010 CFR
2010-01-01
... rates and payments; retirement assets; checking account information; employment information; wire... identified as “[account #].” Institutions that require additional or different information, such as a random... for financing; apply for a lease; provide account information; give us your contact information; pay...
12 CFR Appendix A to Part 716 - Model Privacy Form
Code of Federal Regulations, 2010 CFR
2010-01-01
... rates and payments; retirement assets; checking account information; employment information; wire... identified as “[account #].” Institutions that require additional or different information, such as a random... for financing; apply for a lease; provide account information; give us your contact information; pay...
Models of Human Information Requirements: "When Reasonable Aiding Systems Disagree"
NASA Technical Reports Server (NTRS)
Corker, Kevin; Pisanich, Gregory; Shafto, Michael (Technical Monitor)
1994-01-01
Aircraft flight management and Air Traffic Control (ATC) automation are under development to maximize the economy of flight and to increase the capacity of the terminal area airspace while maintaining levels of flight safety equal to or better than current system performance. These goals are being realized by the introduction of flight management automation aiding and operations support systems on the flight deck and by new developments of ATC aiding systems that seek to optimize scheduling of aircraft while potentially reducing required separation and accounting for weather and wake vortex turbulence. Aiding systems on both the flight deck and the ground operate through algorithmic functions on models of the aircraft and of the airspace. These models may differ from each other as a result of variations in their models of the immediate environment. The resultant flight operations or ATC commands may differ in their response requirements (e.g. different preferred descent speeds or descent initiation points). The human operators in the system must then interact with the automation to reconcile differences and resolve conflicts. We have developed a model of human performance including cognitive functions (decision-making, rule-based reasoning, procedural interruption recovery and forgetting) that supports analysis of the information requirements for resolution of flight aiding and ATC conflicts. The model represents multiple individuals in the flight crew and in ATC. The model is supported in simulation on a Silicon Graphics' workstation using Allegro Lisp. Design guidelines for aviation automation aiding systems have been developed using the model's specification of information and team procedural requirements. Empirical data on flight deck operations from full-mission flight simulation are provided to support the model's predictions. The paper describes the model, its development and implementation, the simulation test of the model predictions, and the empirical validation process. The model and its supporting data provide a generalizable tool that is being expanded to include air/ground compatibility and ATC crew interactions in air traffic management.
NASA Technical Reports Server (NTRS)
Wiswell, E. R.; Cooper, G. R. (Principal Investigator)
1978-01-01
The author has identified the following significant results. The concept of average mutual information in the received spectral random process about the spectral scene was developed. Techniques amenable to implementation on a digital computer were also developed to make the required average mutual information calculations. These techniques required identification of models for the spectral response process of scenes. Stochastic modeling techniques were adapted for use. These techniques were demonstrated on empirical data from wheat and vegetation scenes.
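A small sketch of the average mutual information calculation on a digital computer, using a discretized joint distribution of scene class and quantized spectral response; the counts are invented for illustration:

```python
import numpy as np

def average_mutual_information(joint):
    """I(X;Y) in bits from a joint probability (or count) table p(scene, spectrum)."""
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0                      # skip zero cells (0 log 0 := 0)
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# Illustrative joint counts: rows = scene class (wheat, other vegetation),
# columns = quantized spectral response bins.
counts = np.array([[30.0, 10.0, 2.0],
                   [4.0, 12.0, 28.0]])
print(f"{average_mutual_information(counts):.3f} bits")
```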
From Informal Safety-Critical Requirements to Property-Driven Formal Validation
NASA Technical Reports Server (NTRS)
Cimatti, Alessandro; Roveri, Marco; Susi, Angelo; Tonetta, Stefano
2008-01-01
Most of the efforts in formal methods have historically been devoted to comparing a design against a set of requirements. The validation of the requirements themselves, however, has often been disregarded, and it can be considered a largely open problem, which poses several challenges. The first challenge is given by the fact that requirements are often written in natural language, and may thus contain a high degree of ambiguity. Despite the progress in Natural Language Processing techniques, the task of understanding a set of requirements cannot be automated, and must be carried out by domain experts, who are typically not familiar with formal languages. Furthermore, in order to retain a direct connection with the informal requirements, the formalization cannot follow standard model-based approaches. The second challenge lies in the formal validation of requirements. On one hand, it is not even clear which are the correctness criteria or the high-level properties that the requirements must fulfill. On the other hand, the expressivity of the language used in the formalization may go beyond the theoretical and/or practical capacity of state-of-the-art formal verification. In order to solve these issues, we propose a new methodology that comprises a chain of steps, each supported by a specific tool. The main steps are the following. First, the informal requirements are split into basic fragments, which are classified into categories, and dependency and generalization relationships among them are identified. Second, the fragments are modeled using a visual language such as UML. The UML diagrams are both syntactically restricted (in order to guarantee a formal semantics), and enriched with a highly controlled natural language (to allow for modeling static and temporal constraints). Third, an automatic formal analysis phase iterates over the modeled requirements, by combining several, complementary techniques: checking consistency; verifying whether the requirements entail some desirable properties; verifying whether the requirements are consistent with selected scenarios; diagnosing inconsistencies by identifying inconsistent cores; identifying vacuous requirements; constructing multiple explanations by enabling the fault-tree analysis related to particular fault models; verifying whether the specification is realizable.
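As a much-reduced illustration of the consistency check in the analysis phase: with requirements abstracted to propositional constraints, consistency is the existence of a satisfying scenario. The requirements below are invented, and a real tool would check temporal properties with a model checker rather than brute-force enumeration:

```python
from itertools import product

# Each requirement is a boolean constraint over named signals; the set is
# consistent iff some assignment satisfies all of them.
requirements = {
    "R1: alarm implies braking": lambda s: (not s["alarm"]) or s["brake"],
    "R2: braking forbids acceleration": lambda s: (not s["brake"]) or (not s["accel"]),
    "R3: the alarm case is reachable": lambda s: s["alarm"],
}
signals = ["alarm", "brake", "accel"]

witnesses = []
for vals in product([False, True], repeat=len(signals)):
    scenario = dict(zip(signals, vals))
    if all(check(scenario) for check in requirements.values()):
        witnesses.append(scenario)

print("consistent:", bool(witnesses))
if witnesses:
    print("witness scenario:", witnesses[0])
```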
Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M
2001-12-01
Healthcare institutions are looking at ways to increase their efficiency by reducing costs while providing care services with a high level of safety. Thus, hospital information systems have to support quality improvement objectives. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualise clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the blood transfusion process. An object-oriented data model of a process has been defined in order to organise the data dictionary. Although some aspects of activity, such as 'where', 'what else', and 'why' are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for the processes to be interrelated, and for their characteristics to be shared, in order to avoid data redundancy and to fit the gathering of data with the provision of care.
NASA Astrophysics Data System (ADS)
Valencia, J.; Muñoz-Nieto, A.; Rodriguez-Gonzalvez, P.
2015-02-01
3D virtual modeling, visualization, dissemination and management of urban areas is one of the most exciting challenges that geomatics must face in the coming years. This paper aims to review, compare and analyze the new technologies, policies and software tools in progress to manage urban 3D information. It is assumed that the third dimension increases the quality of the model provided, allowing new approaches to urban planning, conservation and management of architectural and archaeological areas. Although displaying 3D urban environments is an essentially solved problem, some challenges remain for geomatics in the near future. Displaying georeferenced linked information can be considered the first challenge. Another is meeting the technical requirements needed to show this georeferenced information in real time. Are available software tools ready for this challenge? Are they useful for providing the services required in smart cities? Throughout this paper, many practical examples that require 3D georeferenced information and linked data are shown. Computing advances related to 3D spatial databases, and software being developed to turn rendered virtual environments into environments enriched with linked information, are also analyzed. Finally, the standards that the Open Geospatial Consortium has adopted and developed for three-dimensional geographic information are reviewed, with particular emphasis on KML, LandXML, CityGML and the new IndoorGML.
Adaptive control based on retrospective cost optimization
NASA Technical Reports Server (NTRS)
Bernstein, Dennis S. (Inventor); Santillo, Mario A. (Inventor)
2012-01-01
A discrete-time adaptive control law for stabilization, command following, and disturbance rejection that is effective for systems that are unstable, MIMO, and/or nonminimum phase. The adaptive control algorithm includes guidelines concerning the modeling information needed for implementation. This information includes the relative degree, the first nonzero Markov parameter, and the nonminimum-phase zeros. Except when the plant has nonminimum-phase zeros whose absolute value is less than the plant's spectral radius, the required zero information can be approximated by a sufficient number of Markov parameters. No additional information about the poles or zeros need be known. Numerical examples are presented to illustrate the algorithm's effectiveness in handling systems with errors in the required modeling data, unknown latency, sensor noise, and saturation.
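The modeling information named above is straightforward to compute for a known discrete-time linear model: the Markov parameters are H_0 = D and H_k = C A^(k-1) B, and the relative degree is the index of the first nonzero one. A small sketch (the example system is invented):

```python
import numpy as np

def markov_parameters(A, B, C, D, count=8):
    """H_0 = D, H_k = C A^(k-1) B for x[k+1] = A x[k] + B u[k], y = C x + D u."""
    params, Ak = [np.atleast_2d(D)], np.eye(A.shape[0])
    for _ in range(count - 1):
        params.append(C @ Ak @ B)
        Ak = Ak @ A
    return params

A = np.array([[0.5, 1.0],
              [0.0, 0.3]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

H = markov_parameters(A, B, C, D)
# Relative degree = index of the first nonzero Markov parameter.
rel_deg = next(k for k, h in enumerate(H) if np.any(np.abs(h) > 1e-12))
print([h.item() for h in H], "relative degree:", rel_deg)   # here: 2
```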
Cognition and procedure representational requirements for predictive human performance models
NASA Technical Reports Server (NTRS)
Corker, K.
1992-01-01
Models and modeling environments for human performance are becoming significant contributors to early system design and analysis procedures. Issues of levels of automation, physical environment, informational environment, and manning requirements are being addressed by such man/machine analysis systems. The research reported here investigates the close interaction between models of human cognition and models that describe procedural performance. We describe a methodology for the decomposition of aircrew procedures that supports interaction with models of cognition on the basis of procedures observed; that serves to identify cockpit/avionics information sources and crew information requirements; and that provides the structure to support methods for function allocation among crew and aiding systems. Our approach is to develop an object-oriented, modular, executable software representation of the aircrew, the aircraft, and the procedures necessary to satisfy flight-phase goals. We then encode in a time-based language, taxonomies of the conceptual, relational, and procedural constraints among the cockpit avionics and control system and the aircrew. We have designed and implemented a goals/procedures hierarchic representation sufficient to describe procedural flow in the cockpit. We then execute the procedural representation in simulation software and calculate the values of the flight instruments, aircraft state variables and crew resources using the constraints available from the relationship taxonomies. The system provides a flexible, extensible, manipulable and executable representation of aircrew and procedures that is generally applicable to crew/procedure task-analysis. The representation supports developed methods of intent inference, and is extensible to include issues of information requirements and functional allocation. We are attempting to link the procedural representation to models of cognitive functions to establish several intent inference methods including procedural backtracking with concurrent search, temporal reasoning, and constraint checking for partial ordering of procedures. Finally, the representation is being linked to models of human decision making processes that include heuristic, propositional and prescriptive judgement models that are sensitive to the procedural content in which the valuative functions are being performed.
Model-centric approaches for the development of health information systems.
Tuomainen, Mika; Mykkänen, Juha; Luostarinen, Heli; Pöyhölä, Assi; Paakkanen, Esa
2007-01-01
Modeling is used increasingly in healthcare to increase shared knowledge, to improve the processes, and to document the requirements of the solutions related to health information systems (HIS). There are numerous modeling approaches intended to support these goals, but a careful assessment of their strengths, weaknesses and deficiencies is needed. In this paper, we compare three model-centric approaches in the context of HIS development: the Model-Driven Architecture, Business Process Modeling with BPMN and BPEL, and the HL7 Development Framework. The comparison reveals that all these approaches are viable candidates for the development of HIS. However, they have distinct strengths and abstraction levels, they require local and project-specific adaptation and offer varying levels of automation. In addition, illustration of the solutions to the end users must be improved.
Dunham, Kylee; Grand, James B.
2016-01-01
We examined the effects of complexity and priors on the accuracy of models used to estimate ecological and observational processes, and to make predictions regarding population size and structure. State-space models are useful for estimating complex, unobservable population processes and making predictions about future populations based on limited data. To better understand the utility of state space models in evaluating population dynamics, we used them in a Bayesian framework and compared the accuracy of models with differing complexity, with and without informative priors using sequential importance sampling/resampling (SISR). Count data were simulated for 25 years using known parameters and observation process for each model. We used kernel smoothing to reduce the effect of particle depletion, which is common when estimating both states and parameters with SISR. Models using informative priors estimated parameter values and population size with greater accuracy than their non-informative counterparts. While the estimates of population size and trend did not suffer greatly in models using non-informative priors, the algorithm was unable to accurately estimate demographic parameters. This model framework provides reasonable estimates of population size when little to no information is available; however, when information on some vital rates is available, SISR can be used to obtain more precise estimates of population size and process. Incorporating model complexity such as that required by structured populations with stage-specific vital rates affects precision and accuracy when estimating latent population variables and predicting population dynamics. These results are important to consider when designing monitoring programs and conservation efforts requiring management of specific population segments.
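A minimal SISR (sequential importance sampling/resampling) sketch for a state-space population model with Poisson counts; parameters are taken as known, the kernel-smoothing step the abstract uses for joint parameter estimation is omitted, and the model form and values are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a stochastic Gompertz-type population with Poisson observations.
T, r, K, q = 25, 0.4, 500.0, 0.1
n = np.empty(T)
n[0] = 100.0
for t in range(1, T):
    n[t] = n[t - 1] * np.exp(r * (1 - np.log(n[t - 1]) / np.log(K))
                             + rng.normal(0, q))
y = rng.poisson(n)                                # observed counts

# SISR: propagate particles through the process model, weight by the
# observation likelihood, then resample to limit weight degeneracy.
P = 5000
particles = rng.uniform(50, 200, P)
estimates = []
for t in range(T):
    if t > 0:
        particles = particles * np.exp(
            r * (1 - np.log(particles) / np.log(K)) + rng.normal(0, q, P))
    logw = y[t] * np.log(particles) - particles   # Poisson log-likelihood
    w = np.exp(logw - logw.max())                 # (constant term dropped)
    w /= w.sum()
    idx = rng.choice(P, P, p=w)                   # multinomial resampling
    particles = particles[idx]
    estimates.append(particles.mean())

print(np.round(estimates[-5:], 1), "vs true", np.round(n[-5:], 1))
```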
A New Method for Conceptual Modelling of Information Systems
NASA Astrophysics Data System (ADS)
Gustas, Remigijus; Gustiene, Prima
Service architecture is not necessarily bound to the technical aspects of information system development. It can be defined by using conceptual models that are independent of any implementation technology. Unfortunately, the conventional information system analysis and design methods cover just a part of required modelling notations for engineering of service architectures. They do not provide effective support to maintain semantic integrity between business processes and data. Service orientation is a paradigm that can be applied for conceptual modelling of information systems. The concept of service is rather well understood in different domains. It can be applied equally well for conceptualization of organizational and technical information system components. This chapter concentrates on analysis of the differences between service-oriented modelling and object-oriented modelling. Service-oriented method is used for semantic integration of information system static and dynamic aspects.
Fuzzy Naive Bayesian model for medical diagnostic decision support.
Wagholikar, Kavishwar B; Vijayraghavan, Sundararajan; Deshpande, Ashok W
2009-01-01
This work relates to the development of computational algorithms to provide decision support to physicians. The authors propose a Fuzzy Naive Bayesian (FNB) model for medical diagnosis, which extends the Fuzzy Bayesian approach proposed by Okuda. A physician-interview-based method is described to define an orthogonal fuzzy symptom information system, required to apply the model. For the purpose of elaboration and elicitation of characteristics, the algorithm is applied to a simple simulated dataset and compared with the conventional Naive Bayes (NB) approach. As a preliminary evaluation of FNB in a real-world scenario, the comparison is repeated on a real fuzzy dataset of 81 patients diagnosed with infectious diseases. The case study on the simulated dataset elucidates that FNB can outperform NB for diagnosing patients with imprecise, fuzzy information, on account of the following characteristics: 1) it can model the information that values of some attributes are semantically closer than values of other attributes, and 2) it offers a mechanism to temper exaggerations in patient information. Although the algorithm requires precise training data, its utility for fuzzy training data is argued for. This is supported by the case study on the infectious disease dataset, which indicates optimality of FNB over NB for the infectious disease domain. Further case studies on large datasets are required to establish the utility of FNB.
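One common way to realize such a fuzzy extension of Naive Bayes is a membership-weighted expected likelihood per attribute; this is a hedged sketch of that idea, not necessarily Okuda's or the authors' exact formulation, and all diseases, symptoms, and numbers are invented:

```python
import numpy as np

# Symptom values per attribute, class priors, and conditional probabilities
# P(value | disease). All numbers are invented for illustration.
values = {"fever": ["none", "mild", "high"], "rash": ["absent", "present"]}
priors = {"dengue": 0.3, "measles": 0.7}
cond = {
    "dengue": {"fever": [0.05, 0.25, 0.70], "rash": [0.6, 0.4]},
    "measles": {"fever": [0.20, 0.50, 0.30], "rash": [0.1, 0.9]},
}

def fuzzy_posterior(memberships):
    """Memberships: per attribute, a fuzzy membership vector over its values
    (e.g., a temperature of 38.2 C might be 0.6 'mild' and 0.4 'high')."""
    post = {}
    for disease, prior in priors.items():
        p = prior
        for attr, mu in memberships.items():
            # Expected likelihood under the fuzzy evidence for this attribute.
            p *= float(np.dot(mu, cond[disease][attr]))
        post[disease] = p
    z = sum(post.values())
    return {disease: p / z for disease, p in post.items()}

print(fuzzy_posterior({"fever": [0.0, 0.6, 0.4], "rash": [0.2, 0.8]}))
```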
Analysis of information systems for hydropower operations
NASA Technical Reports Server (NTRS)
Sohn, R. L.; Becker, L.; Estes, J.; Simonett, D.; Yeh, W. W. G.
1976-01-01
The operations of hydropower systems were analyzed with emphasis on water resource management, to determine how aerospace derived information system technologies can increase energy output. Better utilization of water resources was sought through improved reservoir inflow forecasting based on use of hydrometeorologic information systems with new or improved sensors, satellite data relay systems, and use of advanced scheduling techniques for water release. Specific mechanisms for increased energy output were determined, principally the use of more timely and accurate short term (0-7 days) inflow information to reduce spillage caused by unanticipated dynamic high inflow events. The hydrometeorologic models used in predicting inflows were examined to determine the sensitivity of inflow prediction accuracy to the many variables employed in the models, and the results used to establish information system requirements. Sensor and data handling system capabilities were reviewed and compared to the requirements, and an improved information system concept outlined.
Analysis of information systems for hydropower operations: Executive summary
NASA Technical Reports Server (NTRS)
Sohn, R. L.; Becker, L.; Estes, J.; Simonett, D.; Yeh, W.
1976-01-01
An analysis was performed of the operations of hydropower systems, with emphasis on water resource management, to determine how aerospace derived information system technologies can effectively increase energy output. Better utilization of water resources was sought through improved reservoir inflow forecasting based on use of hydrometeorologic information systems with new or improved sensors, satellite data relay systems, and use of advanced scheduling techniques for water release. Specific mechanisms for increased energy output were determined, principally the use of more timely and accurate short term (0-7 days) inflow information to reduce spillage caused by unanticipated dynamic high inflow events. The hydrometeorologic models used in predicting inflows were examined in detail to determine the sensitivity of inflow prediction accuracy to the many variables employed in the models, and the results were used to establish information system requirements. Sensor and data handling system capabilities were reviewed and compared to the requirements, and an improved information system concept was outlined.
Application of structured analysis to a telerobotic system
NASA Technical Reports Server (NTRS)
Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven
1990-01-01
The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.
Chen, Elizabeth S; Zhou, Li; Kashyap, Vipul; Schaeffer, Molly; Dykes, Patricia C; Goldberg, Howard S
2008-11-06
As Electronic Healthcare Records become more prevalent, there is an increasing need to ensure unambiguous data capture, interpretation, and exchange within and across heterogeneous applications. To address this need, a common, uniform, and comprehensive approach for representing clinical information is essential. At Partners HealthCare System, we are investigating the development and implementation of enterprise-wide information models to specify the representation of clinical information to support semantic interoperability. This paper summarizes our early experiences in: (1) defining a process for information model development, (2) reviewing and comparing existing healthcare information models, (3) identifying requirements for representation of laboratory and clinical observations, and (4) exploring linkages to existing terminology and data standards. These initial findings provide insight to the various challenges ahead and guidance on next steps for adoption of information models at our organization.
NASA Technical Reports Server (NTRS)
Korram, S.
1977-01-01
The design of general remote sensing-aided methodologies was studied to provide estimates of several important inputs to water yield forecast models. These input parameters are snow area extent, snow water content, and evapotranspiration. The study area is the Feather River Watershed (780,000 hectares) in Northern California. The general approach involved a stepwise sequence of identification of the required information, sample design, measurement/estimation, and evaluation of results. All the relevant and available information types needed in the estimation process were defined. These include Landsat, meteorological satellite, and aircraft imagery; topographic and geologic data; ground truth data; and climatic data from ground stations. A cost-effective multistage sampling approach was employed in quantifying all the required parameters. Physical and statistical models for both snow quantification and evapotranspiration estimation were developed. These models use information obtained from aerial and ground data through an appropriate statistical sampling design.
An efficient temporal database design method based on EER
NASA Astrophysics Data System (ADS)
Liu, Zhi; Huang, Jiping; Miao, Hua
2007-12-01
Many existing methods of modeling temporal information are based on the logical model, which makes relational schema optimization more difficult and more complicated. In this paper, based on the conventional EER model, the authors attempt to analyse and abstract temporal information in the conceptual modelling phase according to the concrete requirements for historical information. A temporal data model named BTEER is then presented. BTEER not only retains all the design ideas and methods of EER, which gives it good upward compatibility, but also effectively supports the modelling of both valid time and transaction time. In addition, BTEER can be transformed to EER easily and automatically. Practice has shown that this method models temporal information well.
Prodinger, Birgit; Tennant, Alan; Stucki, Gerold; Cieza, Alarcos; Üstün, Tevfik Bedirhan
2016-10-01
Our aim was to specify the requirements of an architecture to serve as the foundation for standardized reporting of health information and to provide an exemplary application of this architecture. The World Health Organization's International Classification of Functioning, Disability and Health (ICF) served as the conceptual framework. The ICF Linking Rules were used to establish content comparability. The Rasch measurement model, a special case of additive conjoint measurement that satisfies the criteria required for fundamental measurement, allowed for the development of a common metric foundation for measurement unit conversion. Secondary analysis of data from the North Yorkshire Survey was used to illustrate these methods. Patients completed three instruments, and the items were linked to the ICF. The Rasch measurement model was applied, first to each scale, and then to items across scales that were linked to a common domain. Based on the linking of items to the ICF, the majority of items were grouped into two domains, Mobility and Self-care. Analysis of the individual scales and of items linked to a common domain across scales satisfied the requirements of the Rasch measurement model. Measurement unit conversion between items from the three instruments linked to the Mobility and Self-care domains, respectively, was demonstrated. The realization of an ICF-based architecture for information on patients' functioning enables harmonization of health information while allowing clinicians and researchers to continue using their existing instruments. This architecture will facilitate access to comprehensive and consistently reported health information to serve as the foundation for informed decision-making. © The Author(s) 2016.
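The dichotomous Rasch model underlying such a conversion is P(X = 1 | theta, b) = exp(theta - b) / (1 + exp(theta - b)), where theta is the person measure and b the item difficulty, both in logits. The sketch below shows how two instruments calibrated to a common logit metric let one person measure map to expected raw scores on either instrument; the item difficulties are invented for illustration.

```python
import numpy as np

def p_correct(theta, b):
    """Rasch dichotomous model: P(X=1 | theta, b) = exp(theta-b) / (1+exp(theta-b))."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Illustrative item difficulties (logits) for two instruments whose items were
# linked to the same ICF Mobility domain; the values are made up for the sketch
items_A = np.array([-1.5, -0.5, 0.3, 1.2])
items_B = np.array([-1.0, 0.0, 0.8, 1.5, 2.0])

def expected_raw_score(theta, difficulties):
    # Expected raw score is the sum of item response probabilities
    return p_correct(theta, difficulties).sum()

# Once both instruments sit on the common logit metric, a raw score on A can be
# converted to the equivalent raw score on B through the shared theta
theta = 0.4  # a person's Mobility measure in logits
print("expected score on A:", round(expected_raw_score(theta, items_A), 2))
print("expected score on B:", round(expected_raw_score(theta, items_B), 2))
```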
Online information search behaviour of physicians.
Mikalef, Patrick; Kourouthanassis, Panos E; Pateli, Adamantia G
2017-03-01
Although doctors increasingly engage in online information seeking to complement their medical practice, little is known regarding what online information sources are used and how effective they are. Grounded in self-determination and needs theory, this study posits that doctors tend to use online information sources to fulfil their information requirements in three pre-defined areas: patient care, knowledge development and research activities. Fulfilling these information needs is argued to improve doctors' perceived medical practice competence. A conceptual model is empirically tested by performing PLS-SEM analysis on primary survey data from 303 medical doctors practicing in four major Greek hospitals. Using authoritative online information sources was found to fulfil all types of information needs. By contrast, using non-authoritative information sources had no significant effect. Satisfying information requirements relating to patient care and research activities enhanced doctors' perceptions of their medical practice competence. In contrast, meeting knowledge development information needs had the opposite result. Consistent with past studies, the outcomes indicate that doctors tend to use non-authoritative online information sources; yet their use was found to have no significant value in fulfilling their information requirements. Authoritative online information sources are found to improve perceived medical practice competence by satisfying doctors' diverse information requirements. © 2017 Health Libraries Group.
Nurse Training Act of 1975: Second Report to the Congress, March 15, 1979 (Revised).
ERIC Educational Resources Information Center
Health Resources Administration (DHEW/PHS), Bethesda, MD. Bureau of Health Manpower.
In compliance with section 951 of Public Law 94-63, this second annual report presents and analyzes information on the supply and distribution of, and requirements for, nurses. Chapter 1 presents three models on the requirements for nursing personnel in the nation: The Vector Model (impact of health system changes), Pugh Roberts Model (the system…
ERIC Educational Resources Information Center
Deane, Robert T.; Ro, Kong-Kyun
The analysis and description of four nursing manpower requirements models--the Pugh-Roberts, the Vector, the Community Systems Foundation (CSF), and the Western Interstate Commission for Higher Education (WICHE)--are presented in this report. The introduction provides an overview of the project, which was designed to analyze these different models.…
A requirements index for information processing in hospitals.
Ammenwerth, E; Buchauer, A; Haux, R
2002-01-01
Reference models describing typical information processing requirements in hospitals do not currently exist. This leads to high hospital information system (HIS) management expenses, for example, during tender processes for the acquisition of software application programs. Our aim was, therefore, to develop a comprehensive, lasting, technology-independent, and sufficiently detailed index of requirements for information processing in hospitals in order to reduce the respective expenses. Two dozen German experts established an index of requirements for information processing in university hospitals. This was done in a consensus-based, top-down, cyclic manner. Each functional requirement was derived from information processing functions and sub-functions of a hospital. The result is the first official German version of a requirements index, containing 233 functional requirements and 102 function-independent requirements, focusing on German needs. The functional requirements are structured according to the primary care process from admission to discharge and supplemented by requirements for handling patient records, work organization and resource planning, hospital management, research and education. Both the German version and its English translation are available on the Internet. The index of requirements contains general information processing requirements in hospitals, formulated independently of information processing tools or of HIS architectures. It aims at supporting HIS management, especially HIS strategic planning, HIS evaluation, and tender processes. The index can be regarded as a draft, which must, however, be refined according to the specific aims of a particular project. Although focused on German needs, we expect that it can also be useful in other countries. The high level of interest shown in the index supports its usefulness.
Solid rocket booster thermal radiation model. Volume 2: User's manual
NASA Technical Reports Server (NTRS)
Lee, A. L.
1976-01-01
A user's manual was prepared for the computer program of a solid rocket booster (SRB) thermal radiation model. The following information was included: (1) structure of the program, (2) input information required, (3) examples of input cards and output printout, (4) program characteristics, and (5) program listing.
ERIC Educational Resources Information Center
Jeyaraj, Anand
2010-01-01
The design of enterprise information systems requires students to master technical skills for elicitation, modeling, and reengineering business processes as well as soft skills for information gathering and communication. These tacit skills and behaviors cannot be effectively taught to students; rather, they must be experienced and learned by students. This…
Automatic pattern identification of rock moisture based on the Staff-RF model
NASA Astrophysics Data System (ADS)
Zheng, Wei; Tao, Kai; Jiang, Wei
2018-04-01
Studies on the moisture and damage state of rocks generally focus on qualitative description and mechanical information about the rocks, an approach that is not applicable to real-time safety monitoring of rock masses. In this study, a musical staff computing model is used to quantify the acoustic emission (AE) signals of rocks with different moisture patterns. Then, the random forest (RF) method is adopted to form the Staff-RF model for real-time pattern identification of rock moisture. The entire process requires only the computed information of the AE signal and does not require the mechanical conditions of the rocks.
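A rough sketch of the classification stage only, assuming the staff step yields a fixed-length feature vector per AE window. The features and class structure below are synthetic stand-ins; only the random-forest step mirrors the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in features quantified from AE signals by the "musical staff" step
# (e.g., pitch/energy statistics per time window); real features would come
# from the staff computing model described in the paper
n_per_class, classes = 200, ["dry", "moist", "saturated"]
X = np.vstack([rng.normal(loc=i, scale=0.8, size=(n_per_class, 5))
               for i in range(len(classes))])
y = np.repeat(classes, n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("hold-out accuracy:", round(clf.score(X_te, y_te), 3))
```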
Directory of Energy Information Administration Model Abstracts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1986-07-16
This directory partially fulfills the requirements of Section 8c of the documentation order, which states in part that: The Office of Statistical Standards will annually publish an EIA document based on the collected abstracts and the appendices. This report contains brief statements about each model's title, acronym, purpose, and status, followed by more detailed information on characteristics, uses, and requirements. Sources for additional information are identified. All models active through March 1985 are included. The main body of this directory is an alphabetical list of all active EIA models. Appendix A identifies major EIA modeling systems and the models within these systems, and Appendix B identifies active EIA models by type (basic, auxiliary, and developing). EIA also leases models developed by proprietary software vendors. Documentation for these proprietary models is the responsibility of the companies from which they are leased. EIA has recently leased models from Chase Econometrics, Inc., Data Resources, Inc. (DRI), the Oak Ridge National Laboratory (ORNL), and Wharton Econometric Forecasting Associates (WEFA). Leased models are not abstracted here. The directory is intended for the use of energy and energy-policy analysts in the public and private sectors.
An information model to support user-centered design of medical devices.
Hagedorn, Thomas J; Krishnamurty, Sundar; Grosse, Ian R
2016-08-01
The process of engineering design requires the product development team to balance the needs and limitations of many stakeholders, including those of the user, regulatory organizations, and the designing institution. This is particularly true in medical device design, where additional consideration must be given to a much more complex user base that can only be accessed on a limited basis. Given this inherent challenge, few projects exist that simultaneously consider design domain concepts, such as aspects of a detailed design, alongside a detailed view of the various stakeholders, their capabilities, and the user needs. In this paper, we present a novel information model approach that combines a detailed model of design elements with a model of the design itself, customer requirements, and the capabilities of the customers themselves. The information model is used to facilitate knowledge capture and automated reasoning across domains with a minimal set of rules by adopting a terminology that treats customer and design specific factors identically, thus enabling straightforward assessments. A unique aspect of this approach is that it systematically provides an integrated perspective on the key usability information that drives design decisions towards more universal or effective outcomes, together with the very design information that this usability information affects. This can lead to cost-efficient optimal designs based on a direct inclusion of the needs of customers alongside those of business, marketing, and engineering requirements. Two case studies are presented to show the method's potential as a more effective knowledge management tool with built-in automated inferences that provide design insight, as well as its overall effectiveness as a platform to develop and execute medical device design from a holistic perspective. Copyright © 2016 Elsevier Inc. All rights reserved.
Bitwise efficiency in chaotic models
Düben, Peter; Palmer, Tim
2017-01-01
Motivated by the increasing energy consumption of supercomputing for weather and climate simulations, we introduce a framework for investigating the bit-level information efficiency of chaotic models. In comparison with previous explorations of inexactness in climate modelling, the proposed and tested information metric has three specific advantages: (i) it requires only a single high-precision time series; (ii) information does not grow indefinitely for decreasing time step; and (iii) information is more sensitive to the dynamics and uncertainties of the model rather than to the implementation details. We demonstrate the notion of bit-level information efficiency in two of Edward Lorenz’s prototypical chaotic models: Lorenz 1963 (L63) and Lorenz 1996 (L96). Although L63 is typically integrated in 64-bit ‘double’ floating point precision, we show that only 16 bits have significant information content, given an initial condition uncertainty of approximately 1% of the size of the attractor. This result is sensitive to the size of the uncertainty but not to the time step of the model. We then apply the metric to the L96 model and find that a 16-bit scaled integer model would suffice given the uncertainty of the unresolved sub-grid-scale dynamics. We then show that, by dedicating computational resources to spatial resolution rather than numeric precision in a field programmable gate array (FPGA), we see up to 28.6% improvement in forecast accuracy, an approximately fivefold reduction in the number of logical computing elements required and an approximately 10-fold reduction in energy consumed by the FPGA, for the L96 model. PMID:28989303
Bitwise efficiency in chaotic models
NASA Astrophysics Data System (ADS)
Jeffress, Stephen; Düben, Peter; Palmer, Tim
2017-09-01
Motivated by the increasing energy consumption of supercomputing for weather and climate simulations, we introduce a framework for investigating the bit-level information efficiency of chaotic models. In comparison with previous explorations of inexactness in climate modelling, the proposed and tested information metric has three specific advantages: (i) it requires only a single high-precision time series; (ii) information does not grow indefinitely for decreasing time step; and (iii) information is more sensitive to the dynamics and uncertainties of the model rather than to the implementation details. We demonstrate the notion of bit-level information efficiency in two of Edward Lorenz's prototypical chaotic models: Lorenz 1963 (L63) and Lorenz 1996 (L96). Although L63 is typically integrated in 64-bit 'double' floating point precision, we show that only 16 bits have significant information content, given an initial condition uncertainty of approximately 1% of the size of the attractor. This result is sensitive to the size of the uncertainty but not to the time step of the model. We then apply the metric to the L96 model and find that a 16-bit scaled integer model would suffice given the uncertainty of the unresolved sub-grid-scale dynamics. We then show that, by dedicating computational resources to spatial resolution rather than numeric precision in a field programmable gate array (FPGA), we see up to 28.6% improvement in forecast accuracy, an approximately fivefold reduction in the number of logical computing elements required and an approximately 10-fold reduction in energy consumed by the FPGA, for the L96 model.
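A minimal sketch of the underlying idea: reduce the effective precision of a Lorenz 1963 integration by masking mantissa bits and watch the trajectories diverge. The integrator, step size, and bit count are illustrative choices, not the authors' setup.

```python
import numpy as np

def truncate(x, bits):
    """Keep only `bits` bits of the float64 mantissa (52 bits) of each element."""
    mask = np.uint64(0xFFFFFFFFFFFFFFFF) << np.uint64(52 - bits)
    return (x.view(np.uint64) & mask).view(np.float64)

def l63_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    d = np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    return state + dt * d  # forward Euler, enough for the illustration

state_hi = np.array([1.0, 1.0, 1.0])
state_lo = state_hi.copy()
for _ in range(1000):
    state_hi = l63_step(state_hi)
    state_lo = truncate(l63_step(state_lo), bits=10)  # float16-like mantissa

print("high precision:", state_hi)
print("truncated     :", state_lo)  # chaos amplifies the rounding error
```

Whether the truncated run still carries useful forecast information depends on how the rounding error compares with the initial condition uncertainty, which is the comparison the paper's metric formalizes.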
ERIC Educational Resources Information Center
Greene, Gloria; Robb, Reive
The main concerns of this manpower survey were to examine and, where possible, modify and expand on the manpower planning model generated in the 1982 pilot study, and to use the model to assist with the forecasting of manpower requirements for library and information systems in the Caribbean region. Libraries and information systems in this area…
Supply of genetic information--amount, format, and frequency.
Misztal, I; Lawlor, T J
1999-05-01
The volume and complexity of genetic information is increasing because of new traits and better models. New traits may include reproduction, health, and carcass. More comprehensive models include the test day model in dairy cattle or a growth model in beef cattle. More complex models, which may include nonadditive effects such as inbreeding and dominance, also provide additional information. The amount of information per animal may increase drastically if DNA marker typing becomes routine and quantitative trait loci information is utilized. In many industries, evaluations are run more frequently. They result in faster genetic progress and improved management and marketing opportunities but also in extra costs and information overload. Adopting new technology and making some organizational changes can help realize all the added benefits of the improvements to the genetic evaluation systems at an acceptable cost. Continuous genetic evaluation, in which new records are accepted and breeding values are updated continuously, will relieve time pressures. An online mating system with access to both genetic and marketing information can result in mating recommendations customized for each user. Such a system could utilize inbreeding and dominance information that cannot efficiently be accommodated in the current sire summaries or off-line mating programs. The new systems will require a new organizational approach in which the task of scientists and technicians will not be simply running the evaluations but also providing the research, design, supervision, and maintenance required in the entire system of evaluation, decision making, and distribution.
On Modeling Research Work for Describing and Filtering Scientific Information
NASA Astrophysics Data System (ADS)
Sicilia, Miguel-Ángel
Existing models for Research Information Systems (RIS) properly address the description of people and organizations, projects, facilities and their outcomes, e.g. papers, reports or patents. While this is adequate for the recording and accountability of research investments, helping researchers find relevant people, organizations or results requires considering both the content of research work and its context. The content is not only related to the domain area; it also requires modeling methodological issues, such as variables, instruments or scientific methods, that can then be used as search criteria. The context of research work is determined by the ongoing projects or scientific interests of an individual or a group, and can be expressed using the same methodological concepts. However, modeling methodological issues is notably complex and dependent on the scientific discipline and research area. This paper sketches the main requirements for those models, providing some motivating examples that could serve as a point of departure for future attempts at developing an upper ontology for research methods and tools.
Habitat suitability index models: Black crappie
Edwards, Elizabeth A.; Krieger, Douglas A.; Bacteller, Mary; Maughan, O. Eugene
1982-01-01
Characteristics and habitat requirements of the black crappie (Pomoxis nigromaculatus) are described in a review of Habitat Suitability Index models. This is one in a series of publications to provide information on the habitat requirements of selected fish and wildlife species. Numerous literature sources have been consulted in an effort to consolidate scientific data on species-habitat relationships. These data have subsequently been synthesized into explicit Habitat Suitability Index (HSI) models. The models are based on suitability indices indicating habitat preferences. Indices have been formulated for variables found to affect the life cycle and survival of each species. Habitat Suitability Index (HSI) models are designed to provide information for use in impact assessment and habitat management activities. The HSI technique is a corollary to the U.S. Fish and Wildlife Service's Habitat Evaluation Procedures.
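As a hedged illustration of how an HSI is typically computed (the variables and aggregation rule are model-specific and vary across this publication series), component suitability indices on a 0-1 scale can be combined, for example by a geometric mean so that any limiting factor suppresses the overall index:

```python
import math

# Illustrative suitability indices (0-1) for three habitat variables; the
# actual variables and aggregation rule depend on the particular HSI model
si = {"turbidity": 0.8, "cover": 0.6, "temperature": 0.9}

hsi = math.prod(si.values()) ** (1 / len(si))
print(f"HSI = {hsi:.2f}")  # 0 = unsuitable habitat, 1 = optimal habitat
```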
An integrated approach to system design, reliability, and diagnosis
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Iverson, David L.
1990-01-01
The requirement for ultradependability of computer systems in future avionics and space applications necessitates a top-down, integrated systems engineering approach for design, implementation, testing, and operation. The functional analyses of hardware and software systems must be combined by models that are flexible enough to represent their interactions and behavior. The information contained in these models must be accessible throughout all phases of the system life cycle in order to maintain consistency and accuracy in design and operational decisions. One approach being taken by researchers at Ames Research Center is the creation of an object-oriented environment that integrates information about system components required in the reliability evaluation with behavioral information useful for diagnostic algorithms.
NASA Technical Reports Server (NTRS)
Margaria, Tiziana (Inventor); Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor); Steffen, Bernard (Inventor)
2010-01-01
Systems, methods and apparatus are provided through which, in some embodiments, automata learning algorithms and techniques are implemented to generate a more complete set of scenarios for requirements-based programming. More specifically, a CSP-based, syntax-oriented model construction, which requires the support of a theorem prover, is complemented by model extrapolation via automata learning. This may support the systematic completion of the requirements, which are partial by nature, by providing focus on the most prominent scenarios. This may generalize requirement skeletons by extrapolation and may indicate, by way of automatically generated traces, where the requirement specification is too loose and additional information is required.
Inverse models: A necessary next step in ground-water modeling
Poeter, E.P.; Hill, M.C.
1997-01-01
Inverse models using, for example, nonlinear least-squares regression, provide capabilities that help modelers take full advantage of the insight available from ground-water models. However, lack of information about the requirements and benefits of inverse models is an obstacle to their widespread use. This paper presents a simple ground-water flow problem to illustrate the requirements and benefits of the nonlinear least-squares regression method of inverse modeling and discusses how these attributes apply to field problems. The benefits of inverse modeling include: (1) expedited determination of best-fit parameter values; (2) quantification of the (a) quality of calibration, (b) data shortcomings and needs, and (c) confidence limits on parameter estimates and predictions; and (3) identification of issues that are easily overlooked during nonautomated calibration.
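A minimal sketch of nonlinear least-squares inverse modeling in the spirit of the paper: calibrating two parameters of a hypothetical 1-D uniform-flow head profile and deriving linearised confidence information from the Jacobian. The model and all numbers are invented for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic "observed" heads along a 1-D confined aquifer with uniform flow:
# h(x) = h0 - (q / T) * x, with noise standing in for measurement error
q, T_true, h0_true = 2.0e-4, 5.0e-3, 100.0
x_obs = np.linspace(0.0, 1000.0, 11)
rng = np.random.default_rng(7)
h_obs = h0_true - (q / T_true) * x_obs + rng.normal(0.0, 0.05, x_obs.size)

def residuals(p):
    T, h0 = p
    return h0 - (q / T) * x_obs - h_obs

fit = least_squares(residuals, x0=[1.0e-3, 90.0])
T_est, h0_est = fit.x

# Linearised parameter covariance from the Jacobian quantifies calibration
# quality and approximate confidence limits, two of the cited benefits
dof = x_obs.size - 2
s2 = (fit.fun @ fit.fun) / dof
cov = s2 * np.linalg.inv(fit.jac.T @ fit.jac)
print("T  =", T_est, "+/-", np.sqrt(cov[0, 0]))
print("h0 =", h0_est, "+/-", np.sqrt(cov[1, 1]))
```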
The study on knowledge transferring incentive for information system requirement development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yang
2015-03-10
Information system requirements development is a process of sharing and transferring users' knowledge. Developing tacit requirements is the main difficulty in this process, because such requirements are hard to encode, express, and communicate. Knowledge fusion and cooperative effort are needed to uncover tacit requirements. Against this background, this paper seeks to characterize the dynamic evolution of the effort expended by software developers and users by building an evolutionary game model under an incentive system, and provides an in-depth discussion at the end of the paper.
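One possible reading of such a model is a two-population evolutionary game solved by replicator dynamics; the payoff matrices and incentive parameter below are invented assumptions, not the paper's actual model.

```python
import numpy as np

# Two-population evolutionary game: user shares knowledge (S) or withholds (W);
# developer exerts high elicitation effort (H) or low (L). Payoffs are made up
# and include a hypothetical incentive payment b to the user for sharing.
b = 1.5
U = np.array([[4.0 + b, 1.0 + b],   # user payoff: rows S/W, cols H/L
              [2.0,     2.0]])
D = np.array([[5.0 - b, 2.0],       # developer payoff: rows H/L, cols S/W
              [3.0,     2.5]])

x, y, dt = 0.3, 0.3, 0.01           # initial shares of S and H
for _ in range(5000):               # replicator dynamics, Euler steps
    fx = y * U[0, 0] + (1 - y) * U[0, 1]  # fitness of sharing
    gx = y * U[1, 0] + (1 - y) * U[1, 1]  # fitness of withholding
    fy = x * D[0, 0] + (1 - x) * D[0, 1]  # fitness of high effort
    gy = x * D[1, 0] + (1 - x) * D[1, 1]  # fitness of low effort
    x += dt * x * (1 - x) * (fx - gx)
    y += dt * y * (1 - y) * (fy - gy)

print(f"long-run share sharing: {x:.2f}, high effort: {y:.2f}")
```

With these payoffs the incentive b tips both populations toward the cooperative (share, high-effort) equilibrium, which is the kind of evolutionary outcome such models are built to study.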
NASA Astrophysics Data System (ADS)
Jara, A. J.; Bocchi, Y.; Fernandez, D.; Molina, G.; Gomez, A.
2017-09-01
Smart cities require the support of context-aware and semantically enriched descriptions to enable scalable, cross-domain development of smart applications. For example, general-purpose sensors such as crowd monitoring (counting people in an area) and environmental sensing (pollution, air quality, temperature, humidity, noise) can nowadays be used in multiple solutions with different objectives. For that reason, a data model that offers advanced capabilities for the description of context is required. This paper presents an overview of the available technologies for this purpose and of how the issue is being addressed by the Open and Agile Smart Cities principles and the FIWARE platform through the data models defined by the ETSI ISG Context Information Management (ETSI CIM).
A Dynamic/Anisotropic Low Earth Orbit (LEO) Ionizing Radiation Model
NASA Technical Reports Server (NTRS)
Badavi, Francis F.; West, Katie J.; Nealy, John E.; Wilson, John W.; Abrahms, Briana L.; Luetke, Nathan J.
2006-01-01
The International Space Station (ISS) provides the proving ground for future long duration human activities in space. Ionizing radiation measurements in ISS form the ideal tool for the experimental validation of ionizing radiation environmental models, nuclear transport code algorithms, and nuclear reaction cross sections. Indeed, prior measurements on the Space Transportation System (STS; Shuttle) have provided vital information impacting both the environmental models and the nuclear transport code development by requiring dynamic models of the Low Earth Orbit (LEO) environment. Previous studies using Computer Aided Design (CAD) models of the evolving ISS configurations with Thermo Luminescent Detector (TLD) area monitors, demonstrated that computational dosimetry requires environmental models with accurate non-isotropic as well as dynamic behavior, detailed information on rack loading, and an accurate 6 degree of freedom (DOF) description of ISS trajectory and orientation.
The Design and Implement of Tourism Information System Based on GIS
NASA Astrophysics Data System (ADS)
Chunchang, Fu; Nan, Zhang
Starting from the concept of the geographical information system (GIS), this paper discusses the main contents of geographic information systems and the key technological measures of current tourism information systems, examines the specific requirements and goals for applying a tourism information system, and analyzes methods of realizing a tourism information system in GIS based on a relational database model.
Challenges in Requirements Engineering: A Research Agenda for Conceptual Modeling
NASA Astrophysics Data System (ADS)
March, Salvatore T.; Allen, Gove N.
Domains for which information systems are developed deal primarily with social constructions—conceptual objects and attributes created by human intentions and for human purposes. Information systems play an active role in these domains. They document the creation of new conceptual objects, record and ascribe values to their attributes, initiate actions within the domain, track activities performed, and infer conclusions based on the application of rules that govern how the domain is affected when socially-defined and identified causal events occur. Emerging applications of information technologies evaluate such business rules, learn from experience, and adapt to changes in the domain. Conceptual modeling grammars aimed at representing their system requirements must include conceptual objects, socially-defined events, and the rules pertaining to them. We identify challenges to conceptual modeling research and pose an ontology of the artificial as a step toward meeting them.
Information model construction of MES oriented to mechanical blanking workshop
NASA Astrophysics Data System (ADS)
Wang, Jin-bo; Wang, Jin-ye; Yue, Yan-fang; Yao, Xue-min
2016-11-01
Manufacturing Execution System (MES) is one of the crucial technologies for implementing informatization management in manufacturing enterprises, and the construction of its information model is the basis of MES database development. Based on an analysis of the manufacturing process information in a mechanical blanking workshop and the information requirements of each MES function module, the IDEF1X method was adopted to construct an information model of an MES oriented to the mechanical blanking workshop, and a detailed description of the data structures of each MES function module and their logical relationships was given from the point of view of information relationships, laying the foundation for the design of the MES database.
MARC ES: a computer program for estimating medical information storage requirements.
Konoske, P J; Dobbins, R W; Gauker, E D
1998-01-01
During combat, documentation of medical treatment information is critical for maintaining continuity of patient care. However, knowledge of prior status and treatment of patients is limited to the information noted on a paper field medical card. The Multi-technology Automated Reader Card (MARC), a smart card, has been identified as a potential storage mechanism for casualty medical information. Focusing on data capture and storage technology, this effort developed a Windows program, MARC ES, to estimate storage requirements for the MARC. The program calculates storage requirements for a variety of scenarios using medical documentation requirements, casualty rates, and casualty flows and provides the user with a tool to estimate the space required to store medical data at each echelon of care for selected operational theaters. The program can also be used to identify the point at which data must be uploaded from the MARC if size constraints are imposed. Furthermore, this model can be readily extended to other systems that store or transmit medical information.
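The storage estimate itself is straightforward accumulation arithmetic; the sketch below shows the shape of such a calculation with invented record sizes, record counts, and casualty-flow fractions (the real program's inputs and categories are not reproduced here).

```python
# Back-of-the-envelope version of a MARC ES-style calculation: bytes needed on
# the card as a casualty moves through echelons of care; all values are invented
record_bytes = {"field_card": 256, "treatment_note": 512, "vitals": 64}

# per echelon: {record kind: (records written per casualty,
#                             fraction of casualties reaching this echelon)}
echelons = {
    1: {"field_card": (1, 1.00), "vitals": (2, 1.00)},
    2: {"treatment_note": (1, 0.60), "vitals": (4, 0.60)},
    3: {"treatment_note": (2, 0.25), "vitals": (8, 0.25)},
}

cumulative = 0.0
for level in sorted(echelons):
    added = sum(n * frac * record_bytes[kind]
                for kind, (n, frac) in echelons[level].items())
    cumulative += added
    print(f"echelon {level}: +{added:7.0f} B, cumulative {cumulative:7.0f} B")
# If `cumulative` exceeds the card's capacity, data must be uploaded and
# cleared before the casualty moves to the next echelon of care.
```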
Experiences Using Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1996-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, the formal modeling provided a cost-effective enhancement of the existing verification and validation processes. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
A Management Information System in a Library Environment.
ERIC Educational Resources Information Center
Sutton, Michael J.; Black, John B.
More effective use of diminishing resources was needed to provide the best possible services at the University of Guelph (Ontario, Canada) library. This required the improved decision-making processes of a Library Management Information System (LMIS) to provide systematic information analysis. An information flow model was created, and an…
AIR QUALITY MODELING OF PM AND AIR TOXICS AT NEIGHBORHOOD SCALES
The current interest in fine particles and toxic pollutants provides an impetus for extending air quality modeling capability towards improving exposure modeling and assessments. Human exposure models require information on concentration derived from interpolation of observati...
Using sampling theory as the basis for a conceptual data model
Fred C. Martin; Tonya Baggett; Tom Wolfe
2000-01-01
Greater demands on forest resources require that larger amounts of information be readily available to decisionmakers. To provide more information faster, databases must be developed that are more comprehensive and easier to use. Data modeling is a process for building more complete and flexible databases by emphasizing fundamental relationships over existing or...
Information system modeling for biomedical imaging applications
NASA Astrophysics Data System (ADS)
Hoo, Kent S., Jr.; Wong, Stephen T. C.
1999-07-01
Information system modeling has historically been relegated to a low priority among the designers of information systems. Often there is a rush to design and implement hardware and software solutions after only the briefest assessment of the domain requirements. Although this process results in a rapid development cycle, the system usually does not satisfy the needs of the users, and the developers are forced to re-program certain aspects of the system. It would be much better to create an accurate model of the system based on the domain needs so that the implementation of the solution satisfies the needs of the users immediately. It would also be advantageous to build extensibility into the model so that updates to the system could be carried out in an organized fashion. The significance of this research is the development of a new formal framework for the construction of a multimedia medical information system. This formal framework is constructed using visual modeling, which provides a way of thinking about problems using models organized around real-world ideas. These models provide an abstract way to view complex problems, making them easier to understand. The formal framework is the result of an object-oriented analysis and design process that translates the system's requirements and functionality into software models. The usefulness of this information framework is demonstrated with two different applications in epilepsy research and care, i.e., surgical planning for epilepsy and decision threshold determination.
Valder, Joshua F.; Delzer, Gregory C.; Carter, Janet M.; Smith, Bruce D.; Smith, David V.
2016-09-28
The city of Sioux Falls is the fastest growing community in South Dakota. In response to this continued growth and planning for future development, Sioux Falls requires a sustainable supply of municipal water. Planning and managing sustainable groundwater supplies requires a thorough understanding of local groundwater resources. The Big Sioux aquifer consists of glacial outwash sands and gravels and is hydraulically connected to the Big Sioux River, which provided about 90 percent of the city’s source-water production in 2015. Managing sustainable groundwater supplies also requires an understanding of groundwater availability. An effective mechanism to inform water management decisions is the development and utilization of a groundwater-flow model. A groundwater-flow model provides a quantitative framework for synthesizing field information and conceptualizing hydrogeologic processes. These groundwater-flow models can support decision making processes by mapping and characterizing the aquifer. Accordingly, the city of Sioux Falls partnered with the U.S. Geological Survey to construct a groundwater-flow model. Model inputs will include data from advanced geophysical techniques, specifically airborne electromagnetic methods.
Lockwood, Craig; Stephenson, Matthew; Lizarondo, Lucylynn; van Den Hoek, Joan; Harrison, Margaret
2016-08-01
This paper describes an online facilitation for operationalizing the knowledge-to-action (KTA) model. The KTA model incorporates implementation planning that is optimally suited to the information needs of clinicians. The can-implement© is an evidence implementation process informed by the KTA model. An online counterpart, the can-implement.pro©, was developed to enable greater dissemination and utilization of the can-implement© process. The driver for this work was health professionals' need for facilitation that is iterative, informed by context and localized to the specific needs of users. The literature supporting this paper includes evaluation studies and theoretical concepts relevant to the KTA model, evidence implementation and facilitation. Nursing and other health disciplines require a skill set and resources to successfully navigate the complexity of organizational requirements, inter-professional leadership and day-to-day practical management to implement evidence into clinical practice. The can-implement.pro© provides an accessible, inclusive system for evidence implementation projects. There is empirical support for evidence implementation informed by the KTA model, which in this phase of work has been developed for online uptake. Nurses and other clinicians seeking to implement evidence could benefit from the directed actions, planning advice and information embedded in the phases and steps of can-implement.pro©. © 2016 John Wiley & Sons Australia, Ltd.
Using SCOR as a Supply Chain Management Framework for Government Agency Contract Requirements
NASA Technical Reports Server (NTRS)
Paxton, Joseph; Tucker, Brian
2010-01-01
This paper will present a model that uses the Supply-Chain Operations Reference (SCOR) model as the foundation for a framework to illustrate the information needed throughout a product lifecycle to support a healthy supply chain management function and the subsequent contract requirements to enable it. It will also show where in the supply chain the information must be extracted. The ongoing case study used to exemplify the model is NASA's (National Aeronautics and Space Administration) Ares I program for human spaceflight. Effective supply chain management and contract requirements are ongoing opportunities for continuous improvement within government agencies, specifically the development of systems for human spaceflight operations. Multiple reports from the Government Accountability Office (GAO) reinforce this importance. The SCOR model is a framework for describing a supply chain with process building blocks and business activities. It provides a set of metrics for measuring supply chain performance and best practices for continuous improvement. This paper expands the application of SCOR to also provide the framework for defining the information needed from different levels of the supply chain at different phases of the lifecycle. These needs can be incorporated into contracts to enable more effective supply chain management. Depending on the phase of the lifecycle, effective supply chain management will require involvement from different levels of the organization and different levels of the supply chain.
Kim, Katherine K; Browe, Dennis K; Logan, Holly C; Holm, Roberta; Hack, Lori; Ohno-Machado, Lucila
2014-01-01
There is currently limited information on best practices for the development of governance requirements for distributed research networks (DRNs), an emerging model that promotes clinical data reuse and improves timeliness of comparative effectiveness research. Much of the existing information is based on a single type of stakeholder such as researchers or administrators. This paper reports on a triangulated approach to developing DRN data governance requirements based on a combination of policy analysis with experts, interviews with institutional leaders, and patient focus groups. This approach is illustrated with an example from the Scalable National Network for Effectiveness Research, which resulted in 91 requirements. These requirements were analyzed against the Fair Information Practice Principles (FIPPs) and Health Insurance Portability and Accountability Act (HIPAA) protected versus non-protected health information. The requirements addressed all FIPPs, showing how a DRN's technical infrastructure is able to fulfill HIPAA regulations, protect privacy, and provide a trustworthy platform for research. PMID:24302285
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veeramany, Arun; Unwin, Stephen D.; Coles, Garill A.
2015-12-03
Natural and man-made hazardous events resulting in the loss of grid infrastructure assets challenge the electric power grid's security and resilience. However, planning and allocating appropriate contingency resources for such events requires an understanding of their likelihood and the extent of their potential impact. Where these events are of low likelihood, a risk-informed perspective on planning can be problematic, as there is an insufficient statistical basis to directly estimate the probabilities and consequences of their occurrence. Since risk-informed decisions rely on such knowledge, a basis for modeling the risk associated with high-impact low-frequency events (HILFs) is essential. Insights from such a model can inform where resources are most rationally and effectively expended. The present effort is focused on the development of a HILF risk assessment framework. Such a framework is intended to provide the conceptual and overarching technical basis for the development of HILF risk models that can inform decision makers across numerous stakeholder sectors. The North American Electric Reliability Corporation (NERC) 2014 Standard TPL-001-4 considers severe events for transmission reliability planning, but does not address events of such severity that they have the potential to fail a substantial fraction of grid assets over a region, such as geomagnetic disturbances (GMD), extreme seismic events, and coordinated cyber-physical attacks. These are beyond current planning guidelines. As noted, the risks associated with such events cannot be statistically estimated from historical experience; however, there exists a stable of risk modeling techniques for rare events that have proven valuable across a wide range of engineering application domains. There is active and growing interest in evaluating the value of risk management techniques in the State transmission planning and emergency response communities, some of it in the context of grid modernization activities. The availability of a grid HILF risk model, integrated across multi-hazard domains, which, when interrogated, can support transparent, defensible, and effective decisions, is an attractive prospect among these communities. In this report, we document an integrated HILF risk framework intended to inform the development of risk models. These models would be based on the systematic and comprehensive (to within scope) characterization of hazards to the level of detail required for modeling risk; identification of the stressors associated with the hazards (i.e., the means of impacting grid and supporting infrastructure); characterization of the vulnerability of assets to these stressors and the probabilities of asset compromise; the grid's dynamic response to the asset failures; and assessment of the subsequent severities of consequence with respect to selected impact metrics, such as power outage duration and geographic reach. Specifically, the current framework is being developed to: (1) provide the conceptual and overarching technical paradigms for the development of risk models; (2) identify the classes of models required to implement the framework, providing examples of existing models and identifying where modeling gaps exist; (3) identify the types of data required, addressing circumstances under which data are sparse and the formal elicitation of informed judgment might be required; and (4) identify means by which the resultant risk models might be interrogated to form the necessary basis for risk management.
Real-Time Dynamic Modeling - Data Information Requirements and Flight Test Results
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.; Smith, Mark S.
2008-01-01
Practical aspects of identifying dynamic models for aircraft in real time were studied. Topics include formulation of an equation-error method in the frequency domain to estimate non-dimensional stability and control derivatives in real time, data information content for accurate modeling results, and data information management techniques such as data forgetting, incorporating prior information, and optimized excitation. Real-time dynamic modeling was applied to simulation data and flight test data from a modified F-15B fighter aircraft, and to operational flight data from a subscale jet transport aircraft. Estimated parameter standard errors and comparisons with results from a batch output-error method in the time domain were used to demonstrate the accuracy of the identified real-time models.
Real-Time Dynamic Modeling - Data Information Requirements and Flight Test Results
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.; Smith, Mark S.
2010-01-01
Practical aspects of identifying dynamic models for aircraft in real time were studied. Topics include formulation of an equation-error method in the frequency domain to estimate non-dimensional stability and control derivatives in real time, data information content for accurate modeling results, and data information management techniques such as data forgetting, incorporating prior information, and optimized excitation. Real-time dynamic modeling was applied to simulation data and flight test data from a modified F-15B fighter aircraft, and to operational flight data from a subscale jet transport aircraft. Estimated parameter standard errors, prediction cases, and comparisons with results from a batch output-error method in the time domain were used to demonstrate the accuracy of the identified real-time models.
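A minimal sketch of the frequency-domain equation-error idea on simulated short-period-like data: signals are Fourier-transformed at selected rigid-body frequencies, differentiation becomes multiplication by j*omega, and the dimensional derivatives fall out of a linear least-squares fit. The dynamics, noise levels, and frequency band below are illustrative assumptions, not the flight-test setup.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated short-period-like data: qdot = M_a*alpha + M_q*q + M_de*de
dt, T = 0.02, 20.0
t = np.arange(0.0, T, dt)
de = 0.05 * np.sign(np.sin(2 * np.pi * 0.4 * t))   # square-wave excitation
M_a, M_q, M_de = -4.0, -1.2, -6.0
alpha = np.zeros_like(t); q = np.zeros_like(t)
for k in range(1, t.size):
    qdot = M_a * alpha[k-1] + M_q * q[k-1] + M_de * de[k-1]
    q[k] = q[k-1] + dt * qdot
    alpha[k] = alpha[k-1] + dt * q[k-1]            # crude kinematics
qmeas = q + rng.normal(0.0, 0.002, t.size)

# Equation-error in the frequency domain: transform signals at selected
# frequencies; differentiation becomes multiplication by j*omega, so no
# measured qdot is needed
freqs = np.arange(0.1, 1.5, 0.04)                  # Hz, rigid-body band
omega = 2 * np.pi * freqs
E = np.exp(-1j * np.outer(omega, t)) * dt          # finite Fourier transform
A, Q, DE = E @ alpha, E @ qmeas, E @ de
Y = 1j * omega * Q                                 # transform of qdot
X = np.column_stack([A, Q, DE])
theta, *_ = np.linalg.lstsq(
    np.vstack([X.real, X.imag]), np.concatenate([Y.real, Y.imag]), rcond=None)
print("estimated [M_a, M_q, M_de]:", np.round(theta, 2))
```

Because the regression is linear once the transforms are in hand, the transforms can be updated recursively as each new sample arrives, which is what makes this class of method attractive for real-time use.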
HydroCube: an entity-relationship hydrogeological data model
NASA Astrophysics Data System (ADS)
Wojda, Piotr; Brouyère, Serge; Derouane, Johan; Dassargues, Alain
2010-12-01
Managing, handling and accessing hydrogeological information depends heavily on the applied hydrogeological data models, which differ between institutions and countries. The effective dissemination of hydrogeological information requires the convergence of such models to make hydrogeological information accessible to multiple users such as universities, water suppliers, and administration and research organisations. Furthermore, because hydrogeological studies are complex, they require a wide variety of high-quality hydrogeological data with appropriate metadata in clearly designed and coherent structures. A need exists, therefore, to develop and implement hydrogeological data models that cover, as much as possible, the full hydrogeological domain. A new data model, called HydroCube, was developed for the Walloon Region in Belgium in 2005. The HydroCube model presents an innovative holistic project-based approach which covers a full set of hydrogeological concepts and features, allowing for effective hydrogeological project management. The model stores data relating to the project locality, hydrogeological equipment, and related observations and measurements. In particular, it focuses on specialized hydrogeological field experiments such as pumping and tracer tests. This logical data model uses entity-relationship diagrams and it has been implemented in the Microsoft Access environment. It has been enriched with a fully functional user interface.
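A hypothetical, much-reduced slice of such a project-based schema, sketched as relational tables (here via Python's sqlite3 rather than Microsoft Access); the table and column names are invented and only echo the concepts described in the abstract.

```python
import sqlite3

# Projects own equipment (wells), and specialised field experiments such as
# pumping tests hang off the equipment, with observations hanging off the tests
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE project   (id INTEGER PRIMARY KEY, name TEXT, locality TEXT);
CREATE TABLE well      (id INTEGER PRIMARY KEY,
                        project_id INTEGER REFERENCES project(id),
                        depth_m REAL, x REAL, y REAL);
CREATE TABLE pump_test (id INTEGER PRIMARY KEY,
                        well_id INTEGER REFERENCES well(id),
                        start_utc TEXT, rate_m3_per_h REAL);
CREATE TABLE drawdown  (test_id INTEGER REFERENCES pump_test(id),
                        elapsed_s REAL, drawdown_m REAL);
""")
con.execute("INSERT INTO project VALUES (1, 'Walloon demo', 'Liege')")
con.execute("INSERT INTO well VALUES (1, 1, 55.0, 5.57, 50.64)")
con.execute("INSERT INTO pump_test VALUES (1, 1, '2010-06-01T08:00Z', 12.0)")
con.executemany("INSERT INTO drawdown VALUES (1, ?, ?)",
                [(60, 0.12), (600, 0.45), (3600, 0.80)])
for row in con.execute("""SELECT p.name, w.id, d.elapsed_s, d.drawdown_m
                          FROM project p JOIN well w ON w.project_id = p.id
                          JOIN pump_test t ON t.well_id = w.id
                          JOIN drawdown d ON d.test_id = t.id"""):
    print(row)
```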
30 CFR 550.218 - What air emissions information must accompany the EP?
Code of Federal Regulations, 2013 CFR
2013-07-01
... quality modeling, you must use the guidelines in appendix W of 40 CFR part 51 with a model approved by the.... (f) Modeling report. A modeling report or the modeling results (if § 550.303 requires you to use an...
30 CFR 550.218 - What air emissions information must accompany the EP?
Code of Federal Regulations, 2014 CFR
2014-07-01
... quality modeling, you must use the guidelines in Appendix W of 40 CFR part 51 with a model approved by the.... (f) Modeling report. A modeling report or the modeling results (if § 550.303 requires you to use an...
30 CFR 550.218 - What air emissions information must accompany the EP?
Code of Federal Regulations, 2012 CFR
2012-07-01
... quality modeling, you must use the guidelines in Appendix W of 40 CFR part 51 with a model approved by the.... (f) Modeling report. A modeling report or the modeling results (if § 550.303 requires you to use an...
Space station needs, attributes and architectural options: Mission requirements
NASA Technical Reports Server (NTRS)
1983-01-01
Various mission requirements for the proposed space station are examined. Subjects include modelling methodology, science applications, commercial opportunities, operations analysis, integrated mission requirements, and the role of man in space station functions and activities. The information is presented through the use of graphs.
Automation for System Safety Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul
2009-01-01
This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
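A small sketch of the path-analysis step, assuming the extracted architecture is held as a directed graph; the components and hazards are invented, and networkx stands in for whatever graph machinery the tools actually use.

```python
import networkx as nx

# Hypothetical fragment of an extracted architecture model: directed edges
# represent component/connection relationships along which a hazard can act
g = nx.DiGraph()
g.add_edges_from([
    ("battery_overheat", "power_bus"),
    ("power_bus", "flight_computer"),
    ("coolant_leak", "avionics_bay"),
    ("avionics_bay", "flight_computer"),
    ("flight_computer", "guidance_function"),
])

hazards = ["battery_overheat", "coolant_leak"]
vulnerable = ["guidance_function"]

# Enumerate possible propagation paths from hazard sources to vulnerable
# functions; each path is a candidate scenario for integration testing
for h in hazards:
    for v in vulnerable:
        for path in nx.all_simple_paths(g, h, v):
            print(" -> ".join(path))
```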
Negligence in securing informed consent and medical malpractice.
Perry, C
1988-01-01
The doctrine of informed consent requires that the patient must act voluntarily and in the light of adequate information in order to give legally valid consent to medical care. Different models have been developed by various courts to determine whether the informational requirement, what the physician must disclose to the patient about the potential risks of the proposed treatment, has been met under the tort theory of negligence. To prevail, the patient plaintiff must show that a particular jurisdiction's disclosure standard has been breached, that harm has resulted, and that the defendant physician's negligent failure to discuss certain risks was causally responsible for the patient's failure to withhold consent. Perry discusses possible problems of redundancy or inconsistency concerning the relationship between different models for disclosure and causality, and notes that these problems may have serious implications for patient autonomy.
Report of the panel on geopotential fields: Gravity field, section 8
NASA Technical Reports Server (NTRS)
Anderson, Allen Joel; Kaula, William M.; Lazarewics, Andrew R.; Lefebvre, Michel; Phillips, Roger J.; Rapp, Richard H.; Rummel, Reinhard F.; Smith, David E.; Tapley, Byron D.; Zlotnick, Victor
1991-01-01
The objective of the Geopotential Panel was to develop a program of data acquisition and model development for the Earth's gravity and magnetic fields that meets the basic science requirements of solid Earth and ocean studies. Presented here are the requirements for gravity information and models through the end of the century, the present status of our knowledge, data acquisition techniques, and an outline of a program to meet the requirements.
Habitat Suitability Index Models: Eastern meadowlark
Schroeder, Richard L.; Sousa, Patrick J.
1982-01-01
Habitat preferences of the eastern meadowlark (Sturnella magna) are described in this publication, which is one of a series of Habitat Suitability Index (HSI) models. Habitat use information is presented in a synthesis of the literature on the species-habitat requirements of the eastern meadowlark, followed by the development of the HSI model. The model is presented in three formats: graphic, word, and mathematical, and is designed to provide information for use in impact assessment and habitat management activities.
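HSI models in this series typically combine several suitability indices (each scaled 0 to 1 from a habitat variable) into a single index. The sketch below shows that general pattern with invented variables and break-points; it is not the published eastern meadowlark model.

```python
import math

def piecewise(x, points):
    """Linear interpolation through (variable value, suitability) break-points."""
    (x0, y0) = points[0]
    if x <= x0:
        return y0
    for (x1, y1) in points[1:]:
        if x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
        x0, y0 = x1, y1
    return y0

# Invented suitability curves for two habitat variables.
si_grass = piecewise(70.0, [(0, 0.0), (50, 1.0), (100, 1.0)])   # % grass canopy cover
si_height = piecewise(20.0, [(0, 0.0), (25, 1.0), (60, 0.2)])   # mean grass height, cm

# A common aggregation is the geometric mean of the component indices.
hsi = math.sqrt(si_grass * si_height)
print(f"SI_grass={si_grass:.2f}, SI_height={si_height:.2f}, HSI={hsi:.2f}")
```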
Habitat Suitability Index Models: Pine warbler
Schroeder, Richard L.
1982-01-01
Habitat preferences of the pine warbler (Dendroica pinus) are described in this publication, which is one of a series of Habitat Suitability Index (HSI) models. Habitat use information is presented in a synthesis of the literature on the species-habitat requirements of the pine warbler, followed by the development of the HSI model. The model is presented in three formats: graphic, word, and mathematical, and is designed to provide information for use in impact assessment and habitat management activities.
40 CFR 600.405-77 - Dealer requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977 and Later Model Year Automobiles-Dealer Availability of Fuel Economy Information § 600.405-77 Dealer... information that similar booklets containing the EPA fuel economy information are also available through the...
Message Processing Research from Psychology to Communication.
ERIC Educational Resources Information Center
Basil, Michael D.
Information processing theories have been very useful in psychology. The application of information processing literature to communication, however, requires definitions of audiences and definitions of messages relevant to information-processing theories. In order to establish the relevant aspect of audiences, a multiple-stage model of audiences…
40 CFR 600.405-77 - Dealer requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977 and Later Model Year Automobiles-Dealer Availability of Fuel Economy Information § 600.405-77 Dealer... information that similar booklets containing the EPA fuel economy information are also available through the...
Habitat Suitability Index Models: Pronghorn
Allen, Arthur W.; Cook, John G.; Armbruster, Michael J.
1984-01-01
This is one of a series of publications that provide information on the habitat requirements of selected fish and wildlife species. Literature describing the relationship between habitat variables related to life requisites and habitat suitability for the pronghorn (Antilocapra americana) are synthesized. These data are subsequently used to develop Habitat Suitability Index (HSI) models. The HSI models are designed to provide information that can be used in impact assessment and habitat management.
ERIC Educational Resources Information Center
Sharifzadeh, Aboulqasem; Abdollahzadeh, Gholam Hossein; Sharifi, Mahnoosh
2009-01-01
Capacity Development is needed in the Iranian Agricultural System. Integrating Information and Communication Technologies (ICTs) in the agricultural research system is an appropriate capacity development mechanism. The appropriate application of ICTs and information such as a National Agricultural Information System requires a systemically…
ERIC Educational Resources Information Center
Spears, Janine L.; Parrish, James L., Jr.
2013-01-01
This teaching case introduces students to a relatively simple approach to identifying and documenting security requirements within conceptual models that are commonly taught in systems analysis and design courses. An introduction to information security is provided, followed by a classroom example of a fictitious company, "Fun &…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-22
... Reinvestment Act of 2009 (ARRA), as Further Amended by the Temporary Extension Act (TEA) of 2010, Notice AGENCY... Model Health Care Continuation Coverage Notices required by ARRA, as further amended by TEA. SUMMARY: On... notices required by ARRA, as further amended by TEA. FOR FURTHER INFORMATION CONTACT: Kevin Horahan or...
Informatics in radiology: an information model of the DICOM standard.
Kahn, Charles E; Langlotz, Curtis P; Channin, David S; Rubin, Daniel L
2011-01-01
The Digital Imaging and Communications in Medicine (DICOM) Standard is a key foundational technology for radiology. However, its complexity creates challenges for information system developers because the current DICOM specification requires human interpretation and is subject to nonstandard implementation. To address this problem, a formally sound and computationally accessible information model of the DICOM Standard was created. The DICOM Standard was modeled as an ontology, a machine-accessible and human-interpretable representation that may be viewed and manipulated by information-modeling tools. The DICOM Ontology includes a real-world model and a DICOM entity model. The real-world model describes patients, studies, images, and other features of medical imaging. The DICOM entity model describes connections between real-world entities and the classes that model the corresponding DICOM information entities. The DICOM Ontology was created to support the Cancer Biomedical Informatics Grid (caBIG) initiative, and it may be extended to encompass the entire DICOM Standard and serve as a foundation of medical imaging systems for research and patient care. RSNA, 2010
NASA Astrophysics Data System (ADS)
Park, Jin-Young; Lee, Dong-Eun; Kim, Byung-Soo
2017-10-01
Due to increasing concern about climate change, efforts to reduce environmental load are continuously being made in the construction industry, and LCA (life cycle assessment) is presented as an effective method to assess environmental load. However, since LCA requires information on construction quantities for environmental load estimation, it is not utilized in environmental reviews in the early design phase, where such information is difficult to obtain. In this study, a computation system for construction quantities based on the standard cross section of road drainage facilities was developed to compute the construction quantities required for LCA using only information available in the early design phase, in order to develop and verify the effectiveness of a model that can estimate environmental load. The results showed that the model is effective for use in the early design phase, with a 13.39% mean absolute error rate.
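The reported 13.39% figure is a mean absolute error rate between estimated and actual quantities; a minimal computation of that metric looks like the sketch below (the quantity values are invented).

```python
# Hypothetical actual vs. estimated construction quantities (e.g., m3 of concrete).
actual    = [120.0, 85.0, 240.0, 60.0]
estimated = [131.0, 78.0, 261.0, 52.0]

# Mean absolute error rate (mean absolute percentage error).
maer = sum(abs(e - a) / a for a, e in zip(actual, estimated)) / len(actual) * 100
print(f"mean absolute error rate = {maer:.2f}%")
```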
ERIC Educational Resources Information Center
Almond, Russell G.
2007-01-01
Over the course of instruction, instructors generally collect a great deal of information about each student. Integrating that information intelligently requires models for how a student's proficiency changes over time. Armed with such models, instructors can "filter" the data--more accurately estimate the student's current proficiency…
Defining Support Requirements During Conceptual Design of Reusable Launch Vehicles
NASA Technical Reports Server (NTRS)
Morris, W. D.; White, N. H.; Davis, W. T.; Ebeling, C. E.
1995-01-01
Current methods for defining the operational support requirements of new systems are data intensive and require significant design information. Methods are being developed to aid in the analysis process of defining support requirements for new launch vehicles during their conceptual design phase that work with the level of information available during this phase. These methods will provide support assessments based on the vehicle design and the operating scenarios. The results can be used both to define expected support requirements for new launch vehicle designs and to help evaluate the benefits of using new technologies. This paper describes the models, their current status, and provides examples of their use.
The ISACA Business Model for Information Security: An Integrative and Innovative Approach
NASA Astrophysics Data System (ADS)
von Roessing, Rolf
In recent years, information security management has matured into a professional discipline that covers both technical and managerial aspects in an organisational environment. Information security is increasingly dependent on business-driven parameters and interfaces to a variety of organisational units and departments. In contrast, common security models and frameworks have remained largely technical. A review of extant models ranging from [LaBe73] to more recent models shows that technical aspects are covered in great detail, while the managerial aspects of security are often neglected. Likewise, the business view on organisational security is frequently at odds with the demands of information security personnel or information technology management. In practice, senior and executive level management remain comparatively distant from technical requirements. As a result, information security is generally regarded as a cost factor rather than a benefit to the organisation.
75 FR 8279 - Airworthiness Directives; The Boeing Company Model 747 Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
... airplanes. The original NPRM would have superseded an existing AD that currently requires repetitive... inspection of the modified area. The original NPRM proposed to continue to require those actions using revised service information. For certain airplanes, the original NPRM proposed to require new repetitive...
An integrated approach to system design, reliability, and diagnosis
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Iverson, David L.
1990-01-01
The requirement for ultradependability of computer systems in future avionics and space applications necessitates a top-down, integrated systems engineering approach for design, implementation, testing, and operation. The functional analyses of hardware and software systems must be combined by models that are flexible enough to represent their interactions and behavior. The information contained in these models must be accessible throughout all phases of the system life cycle in order to maintain consistency and accuracy in design and operational decisions. One approach being taken by researchers at Ames Research Center is the creation of an object-oriented environment that integrates information about system components required in the reliability evaluation with behavioral information useful for diagnostic algorithms. Procedures have been developed at Ames that perform reliability evaluations during design and failure diagnoses during system operation. These procedures utilize information from a central source, structured as object-oriented fault trees. Fault trees were selected because they are a flexible model widely used in aerospace applications and because they give a concise, structured representation of system behavior. The utility of this integrated environment for aerospace applications in light of our experiences during its development and use is described. The techniques for reliability evaluation and failure diagnosis are discussed, and current extensions of the environment and areas requiring further development are summarized.
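A minimal object-oriented fault tree, in the spirit of the representation described above (though not the Ames implementation), can be written as event and gate objects that compute a top-event probability under an independence assumption:

```python
class BasicEvent:
    def __init__(self, name, prob):
        self.name, self.prob = name, prob
    def probability(self):
        return self.prob

class Gate:
    def __init__(self, name, kind, children):
        self.name, self.kind, self.children = name, kind, children
    def probability(self):
        probs = [c.probability() for c in self.children]
        if self.kind == "AND":          # fails only if all children fail
            p = 1.0
            for q in probs:
                p *= q
            return p
        if self.kind == "OR":           # fails if any child fails (independence assumed)
            p = 1.0
            for q in probs:
                p *= (1.0 - q)
            return 1.0 - p
        raise ValueError(self.kind)

# Hypothetical system: top event occurs if the sensor fails OR both power feeds fail.
tree = Gate("top", "OR", [
    BasicEvent("sensor", 1e-3),
    Gate("power", "AND", [BasicEvent("feed_A", 1e-2), BasicEvent("feed_B", 1e-2)]),
])
print(f"P(top event) = {tree.probability():.6f}")
```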
AGIS: Integration of new technologies used in ATLAS Distributed Computing
NASA Astrophysics Data System (ADS)
Anisenkov, Alexey; Di Girolamo, Alessandro; Alandes Pradillo, Maria
2017-10-01
The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store the different parameters and configuration data needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. Being an intermediate middleware system between clients and external information sources (such as the central BDII, GOCDB and MyOSG), AGIS defines the relations between experiment-specific resources and physical distributed computing capabilities. Having been in production during LHC Run 1, AGIS became the central information system for Distributed Computing in ATLAS and is continuously evolving to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computing model and the data structures used by Distributed Computing applications and services are continuously evolving and tend to fit newer requirements from the ADC community. In this note, we describe the evolution and the recent developments of AGIS functionalities related to the integration of new technologies that have recently become widely used in ATLAS Computing, such as flexible utilization of opportunistic Cloud and HPC resources, ObjectStore services integration for the Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, and the unified storage protocols declaration required for PanDA Pilot site movers. The improvements of the information model and general updates are also shown; in particular, we explain how other collaborations outside ATLAS could benefit from the system as a computing resources information catalogue. AGIS is evolving towards a common information system, not coupled to a specific experiment.
Development of an EVA systems cost model. Volume 3: EVA systems cost model
NASA Technical Reports Server (NTRS)
1975-01-01
The EVA systems cost model presented is based on proposed EVA equipment for the space shuttle program. General information on EVA crewman requirements in a weightless environment and an EVA capabilities overview are provided.
Streamlining environmental product declarations: a stage model
NASA Astrophysics Data System (ADS)
Lefebvre, Elisabeth; Lefebvre, Louis A.; Talbot, Stephane; Le Hen, Gael
2001-02-01
General public environmental awareness and education is increasing, stimulating the demand for reliable, objective and comparable information about products' environmental performance. The recently published standard series ISO 14040 and ISO 14025 normalize the preparation of Environmental Product Declarations (EPDs) containing comprehensive information relevant to a product's environmental impact during its life cycle. So far, only a few environmentally leading manufacturing organizations, mostly from Europe, have experimented with the preparation of EPDs, demonstrating their great potential as a marketing weapon. However, the preparation of EPDs is a complex process, requiring the collection and analysis of massive amounts of information coming from disparate sources (suppliers, sub-contractors, etc.). In the foreseeable future, streamlining the EPD preparation process will require product manufacturers to adapt their information systems (ERP, MES, SCADA) to make them capable of gathering and transmitting the appropriate environmental information. It also requires strong functional integration all along the product supply chain to ensure that all the information is made available in a standardized and timely manner. The goal of the present paper is twofold: first, to propose a transitional model towards green supply chain management and EPD preparation; second, to identify key technologies and methodologies for streamlining the EPD process and, subsequently, the transition toward sustainable product development.
A Reference Architecture for Space Information Management
NASA Technical Reports Server (NTRS)
Mattmann, Chris A.; Crichton, Daniel J.; Hughes, J. Steven; Ramirez, Paul M.; Berrios, Daniel C.
2006-01-01
We describe a reference architecture for space information management systems that elegantly overcomes the rigid design of common information systems in many domains. The reference architecture consists of a set of flexible, reusable, independent models and software components that function in unison, but remain separately managed entities. The main guiding principle of the reference architecture is to separate the various models of information (e.g., data, metadata, etc.) from implemented system code, allowing each to evolve independently. System modularity, systems interoperability, and dynamic evolution of information system components are the primary benefits of the design of the architecture. The architecture requires the use of information models that are substantially more advanced than those used by the vast majority of information systems. These models are more expressive and can be more easily modularized, distributed and maintained than simpler models e.g., configuration files and data dictionaries. Our current work focuses on formalizing the architecture within a CCSDS Green Book and evaluating the architecture within the context of the C3I initiative.
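The separation of information models from system code that this abstract emphasizes can be illustrated by driving validation from a declarative model rather than hard-coded checks. The model fragment below is invented for illustration, not a CCSDS or mission artifact.

```python
# A tiny, hypothetical information model kept as data, separate from code.
MODEL = {
    "observation": {
        "target":   {"type": str, "required": True},
        "duration": {"type": float, "required": True},
        "notes":    {"type": str, "required": False},
    }
}

def validate(kind, record, model=MODEL):
    """Generic validator: behavior changes when the model evolves, not the code."""
    errors = []
    for field, spec in model[kind].items():
        if field not in record:
            if spec["required"]:
                errors.append(f"missing required field: {field}")
        elif not isinstance(record[field], spec["type"]):
            errors.append(f"bad type for {field}")
    return errors

print(validate("observation", {"target": "Mars", "duration": 12.5}))   # []
print(validate("observation", {"duration": "12.5"}))                   # two errors
```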
Initiating Formal Requirements Specifications with Object-Oriented Models
NASA Technical Reports Server (NTRS)
Ampo, Yoko; Lutz, Robyn R.
1994-01-01
This paper reports results of an investigation into the suitability of object-oriented models as an initial step in developing formal specifications. The requirements for two critical system-level software modules were used as target applications. It was found that creating object-oriented diagrams prior to formally specifying the requirements enhanced the accuracy of the initial formal specifications and reduced the effort required to produce them. However, the formal specifications incorporated some information not found in the object-oriented diagrams, such as higher-level strategy or goals of the software.
A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems
NASA Astrophysics Data System (ADS)
Li, Yu; Oberweis, Andreas
Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
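A minimal Petri net interpreter makes the underlying modeling idea concrete: places hold tokens, and a transition fires when all of its input places are marked. This is a plain place/transition net, far simpler than the XML nets used in the chapter; the process names are invented.

```python
# Places with token counts, and transitions as (input places, output places).
marking = {"order_received": 1, "stock_checked": 0, "order_shipped": 0}
transitions = {
    "check_stock": (["order_received"], ["stock_checked"]),
    "ship":        (["stock_checked"], ["order_shipped"]),
}

def enabled(name):
    inputs, _ = transitions[name]
    return all(marking[p] > 0 for p in inputs)

def fire(name):
    inputs, outputs = transitions[name]
    assert enabled(name), f"{name} is not enabled"
    for p in inputs:
        marking[p] -= 1   # consume tokens from input places
    for p in outputs:
        marking[p] += 1   # produce tokens in output places

fire("check_stock")
fire("ship")
print(marking)   # {'order_received': 0, 'stock_checked': 0, 'order_shipped': 1}
```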
San Juan National Forest Land Management Planning Support System (LMPSS) requirements definition
NASA Technical Reports Server (NTRS)
Werth, L. F. (Principal Investigator)
1981-01-01
The role of remote sensing data as it relates to a three-component land management planning system (geographic information, data base management, and planning model) can be understood only when user requirements are known. Personnel at the San Juan National Forest in southwestern Colorado were interviewed to determine data needs for managing and monitoring timber, rangelands, wildlife, fisheries, soils, water, geology and recreation facilities. While all the information required for land management planning cannot be obtained using remote sensing techniques, valuable information can be provided for the geographic information system. A wide range of sensors such as small and large format cameras, synthetic aperture radar, and LANDSAT data should be utilized. Because of the detail and accuracy required, high altitude color infrared photography should serve as the baseline data base and be supplemented and updated with data from the other sensors.
Mental models in risk assessment: informing people about drugs.
Jungermann, H; Schütz, H; Thüring, M
1988-03-01
One way to communicate about the risks of drugs is through the use of package inserts. The problems associated with this medium of informing patients have been investigated by several researchers who found that people require information about drugs they are using, including extensive risk information, and that they are willing to take this information into account in their usage of drugs. But empirical results also show that people easily misinterpret the information given. A conceptual framework is proposed that might be used for better understanding the cognitive processes involved in such a type of risk assessment and communication. It is based on the idea that people develop, through experience, a mental model of how a drug works, which effects it might produce, that contraindications have to be considered, etc. This mental model is "run" when a specific package insert has been read and a specific question arises such as, for example, whether certain symptoms can be explained as normal or whether they require special attention and action. We argue that the mental model approach offers a useful perspective for examining how people understand package inserts, and consequently for improving their content and design. The approach promises to be equally useful for other aspects of risk analysis that are dependent upon human judgment and decision making, e.g., threat diagnosis and human reliability analysis.
On the management and processing of earth resources information
NASA Technical Reports Server (NTRS)
Skinner, C. W.; Gonzalez, R. C.
1973-01-01
The basic concepts of a recently completed large-scale earth resources information system plan are reported. Attention is focused throughout the paper on the information management and processing requirements. After the development of the principal system concepts, a model system for implementation at the state level is discussed.
SLS Navigation Model-Based Design Approach
NASA Technical Reports Server (NTRS)
Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas
2018-01-01
The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center, where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms, which are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of the overlap between the requirements development and management process and the design and analysis process by efficiently combining the control mechanism (i.e., the requirement) and the design mechanism. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable, whereas a requirement is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed; the common environment allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure implementation as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design is described from the perspective of the SLS Navigation Team. The format of the models and the requirements is described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned associated with the implementation of the Model-based Design approach and process, from infancy to verification and certification, are discussed.
Experiences Using Lightweight Formal Methods for Requirements Modeling
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Lutz, Robyn; Covington, Rick; Kelly, John; Ampo, Yoko; Hamilton, David
1997-01-01
This paper describes three case studies in the lightweight application of formal methods to requirements modeling for spacecraft fault protection systems. The case studies differ from previously reported applications of formal methods in that formal methods were applied very early in the requirements engineering process, to validate the evolving requirements. The results were fed back into the projects, to improve the informal specifications. For each case study, we describe what methods were applied, how they were applied, how much effort was involved, and what the findings were. In all three cases, formal methods enhanced the existing verification and validation processes, by testing key properties of the evolving requirements, and helping to identify weaknesses. We conclude that the benefits gained from early modeling of unstable requirements more than outweigh the effort needed to maintain multiple representations.
Dugan, J M; Berrios, D C; Liu, X; Kim, D K; Kaizer, H; Fagan, L M
1999-01-01
Our group has built an information retrieval system based on a complex semantic markup of medical textbooks. We describe the construction of a set of web-based knowledge-acquisition tools that expedites the collection and maintenance of the concepts required for text markup and the search interface required for information retrieval from the marked text. In the text markup system, domain experts (DEs) identify sections of text that contain one or more elements from a finite set of concepts. End users can then query the text using a predefined set of questions, each of which identifies a subset of complementary concepts. The search process matches that subset of concepts to relevant points in the text. The current process requires that the DE invest significant time to generate the required concepts and questions. We propose a new system--called ACQUIRE (Acquisition of Concepts and Queries in an Integrated Retrieval Environment)--that assists a DE in two essential tasks in the text-markup process. First, it helps her to develop, edit, and maintain the concept model: the set of concepts with which she marks the text. Second, ACQUIRE helps her to develop a query model: the set of specific questions that end users can later use to search the marked text. The DE incorporates concepts from the concept model when she creates the questions in the query model. The major benefit of the ACQUIRE system is a reduction in the time and effort required for the text-markup process. We compared the process of concept- and query-model creation using ACQUIRE to the process used in previous work by rebuilding two existing models that we previously constructed manually. We observed a significant decrease in the time required to build and maintain the concept and query models.
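The markup-and-query pattern described above can be sketched as follows: text sections are tagged with concept sets, and a predefined question retrieves every section whose tags cover the question's concepts. The concept names and sections below are invented, not drawn from ACQUIRE.

```python
# Sections of a hypothetical textbook, each marked with a set of concepts.
sections = {
    "sec-1": {"text": "Beta blockers reduce heart rate...",
              "concepts": {"drug:beta_blocker", "effect:bradycardia"}},
    "sec-2": {"text": "In renal failure, dose adjustment...",
              "concepts": {"condition:renal_failure", "drug:beta_blocker"}},
}

# A predefined question maps to the subset of concepts it requires.
questions = {
    "What are the cardiac effects of beta blockers?":
        {"drug:beta_blocker", "effect:bradycardia"},
}

def answer(question):
    """Return every section whose concept tags cover the question's concepts."""
    required = questions[question]
    return [sid for sid, s in sections.items() if required <= s["concepts"]]

print(answer("What are the cardiac effects of beta blockers?"))   # ['sec-1']
```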
Spatial Allocator for air quality modeling
The Spatial Allocator is a set of tools that helps users manipulate and generate data files related to emissions and air quality modeling without requiring the use of a commercial Geographic Information System.
Ahmadi, Maryam; Ghazisaeidi, Marjan; Bashiri, Azadeh
2015-03-18
To improve the design of the electronic health record system in Iran, health information systems must be integrated on the basis of a common language so that information can be interpreted and exchanged with that system. This study provides a conceptual model of a radiology reporting system using the Unified Modeling Language (UML). The proposed model can solve the problem of integrating this information system with the electronic health record system. By using this model and designing its services accordingly, the system can easily connect to the electronic health record in Iran and facilitate the transfer of radiology report data. This cross-sectional study was conducted in 2013. The study population was 22 experts working at the imaging center of Imam Khomeini Hospital in Tehran, and the sample corresponded to the whole population. The research tool was a questionnaire prepared by the researcher to determine the information requirements. Content validity and the test-retest method were used to measure the validity and reliability of the questionnaire, respectively. Data were analyzed with an average index using SPSS, and Visual Paradigm software was used to design the conceptual model. Based on the requirements assessment of the experts and the related literature, administrative, demographic and clinical data, radiological examination results and, if an anesthesia procedure is performed, anesthesia data were suggested as the minimum data set for the radiology report, and the class diagram was designed on this basis. A use case diagram was also drawn after identifying the radiology reporting system process. Given the use of radiology reports in the electronic health record system for diagnosing and managing patients' clinical problems, providing a conceptual model for the radiology reporting system and designing it systematically would eliminate the problem of data sharing between these systems and the electronic health record system.
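The minimum data set suggested by the study (administrative, demographic and clinical data, examination results, and optional anesthesia data) could be captured along the lines of the hypothetical classes below; the field names are illustrative, not the study's class diagram.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Demographics:
    patient_id: str
    name: str
    birth_date: str
    sex: str

@dataclass
class AnesthesiaData:
    agent: str
    dose_mg: float

@dataclass
class RadiologyReport:
    report_id: str
    demographics: Demographics
    clinical_indication: str          # clinical data
    examination_type: str             # administrative data
    findings: str                     # examination result
    impression: str
    anesthesia: Optional[AnesthesiaData] = None   # only if a procedure was performed

report = RadiologyReport(
    report_id="R-001",
    demographics=Demographics("P-42", "A. Example", "1980-01-01", "F"),
    clinical_indication="chronic headache",
    examination_type="brain MRI",
    findings="no acute abnormality",
    impression="normal study",
)
print(report.anesthesia is None)   # True
```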
Habitat Suitability Index Models: Beaver
Allen, Arthur W.
1982-01-01
Habitat preferences of the beaver (Castor canadensis) are described in this publication, which is one of a series of Habitat Suitability Index (HSI) models. Habitat use information is presented in a synthesis of the literature on the species-habitat requirements of the beaver, followed by the development of the HSI model. The model is designed to provide information for use in impact assessment and habitat management activities, and should be used in conjunction with habitat evaluation procedures previously developed by the Fish and Wildlife Service. This revised model updates the original publication dated September 1982.
40 CFR 1054.250 - What records must I keep and what reports must I send to EPA?
Code of Federal Regulations, 2012 CFR
2012-07-01
... model year, you must send us a report describing information about engines you produced during the model... send us. (2) Any of the information we specify in § 1054.205 that you were not required to include in... certificate of conformity. (c) Keep data from routine emission tests (such as test cell temperatures and...
40 CFR 1054.250 - What records must I keep and what reports must I send to EPA?
Code of Federal Regulations, 2014 CFR
2014-07-01
... model year, you must send us a report describing information about engines you produced during the model... send us. (2) Any of the information we specify in § 1054.205 that you were not required to include in... certificate of conformity. (c) Keep data from routine emission tests (such as test cell temperatures and...
40 CFR 1054.250 - What records must I keep and what reports must I send to EPA?
Code of Federal Regulations, 2013 CFR
2013-07-01
... model year, you must send us a report describing information about engines you produced during the model... send us. (2) Any of the information we specify in § 1054.205 that you were not required to include in... certificate of conformity. (c) Keep data from routine emission tests (such as test cell temperatures and...
40 CFR 1054.250 - What records must I keep and what reports must I send to EPA?
Code of Federal Regulations, 2011 CFR
2011-07-01
... model year, you must send us a report describing information about engines you produced during the model... send us. (2) Any of the information we specify in § 1054.205 that you were not required to include in... certificate of conformity. (c) Keep data from routine emission tests (such as test cell temperatures and...
Information data systems for a global change technology initiative architecture trade study
NASA Technical Reports Server (NTRS)
Murray, Nicholas D.
1991-01-01
The Global Change Technology Initiative (GCTI) was established to develop technology which will enable the use of satellite systems for Earth observations on a global scale, enable use of the observations to predictively model Earth's changes, and provide scientists, government, business, and industry with quick access to the resulting information. At LaRC, a GCTI Architecture Trade Study was undertaken to develop and evaluate the architectural implications of meeting the requirements of the global change studies and the eventual implementation of a global change system. The output of the trade study is a set of recommended technologies for the GCTI. That portion of the study concerned with the information data system is documented here. The information data system for an Earth global change modeling system can be very extensive and beyond affordability in terms of today's costs; therefore, an incremental approach to acquiring a system is most likely. An options approach relating levels of capability to needed technologies was developed. The primary drivers of the requirements for the information data system evaluation were the needed science products, the science measurements, the spacecraft orbits, the instrument configurations, and the spacecraft configurations and their attendant architectures. The science product requirements were not studied here; however, some consideration of the product needs was included in the evaluation results. The information data system technology items were identified from the viewpoint of the desirable overall information system characteristics.
Toward designing for trust in database automation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duez, P. P.; Jamieson, G. A.
Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation. One model developed by Lee and See identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process- and performance-related information serve, both individually and through inferences between them, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process. The connection between an AH for an automated tool and a list of information elements at the three levels of attributional abstraction is then direct, providing a method for satisfying information requirements for appropriate trust in automation. In this paper, we will present our method for developing specific information requirements for an automated tool, based on a formal analysis of that tool and the models presented by Lee and See. We will show an example of the application of the AH to automation, in the domain of relational database automation, and the resulting set of specific information elements for appropriate trust in the automated tool. Finally, we will comment on the applicability of this approach to the domain of nuclear plant instrumentation.
Methods to estimate irrigated reference crop evapotranspiration - a review.
Kumar, R; Jat, M K; Shankar, V
2012-01-01
Efficient water management of crops requires accurate irrigation scheduling which, in turn, requires accurate measurement of crop water requirement. Irrigation is applied to replenish depleted moisture for optimum plant growth. Reference evapotranspiration plays an important role in the determination of water requirements for crops and irrigation scheduling. Various models/approaches, varying from empirical to physically based distributed ones, are available for the estimation of reference evapotranspiration. Mathematical models are useful tools to estimate the evapotranspiration and water requirement of crops, which is essential information required to design or choose the best water management practices. In this paper the most commonly used models/approaches, which are suitable for the estimation of daily water requirement for agricultural crops grown in different agro-climatic regions, are reviewed. Further, an effort has been made to compare the accuracy of various widely used methods under different climatic conditions.
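As one example of the empirical end of the spectrum, the Hargreaves-Samani equation estimates daily reference evapotranspiration from air temperature and extraterrestrial radiation; the sketch below uses that widely cited form (the input values are illustrative).

```python
import math

def hargreaves_et0(t_mean, t_max, t_min, ra):
    """Hargreaves-Samani reference evapotranspiration (mm/day).
    Temperatures in deg C; ra = extraterrestrial radiation expressed
    in mm/day of equivalent evaporation."""
    return 0.0023 * ra * (t_mean + 17.8) * math.sqrt(t_max - t_min)

# Illustrative mid-summer values for a semi-arid site.
print(f"ET0 = {hargreaves_et0(28.0, 36.0, 20.0, 16.5):.2f} mm/day")
```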
Scalable DB+IR Technology: Processing Probabilistic Datalog with HySpirit.
Frommholz, Ingo; Roelleke, Thomas
2016-01-01
Probabilistic Datalog (PDatalog, proposed in 1995) is a probabilistic variant of Datalog and a nice conceptual idea to model Information Retrieval in a logical, rule-based programming paradigm. Making PDatalog work in real-world applications requires more than probabilistic facts and rules, and the semantics associated with the evaluation of the programs. We report in this paper some of the key features of the HySpirit system required to scale the execution of PDatalog programs. Firstly, there is the requirement to express probability estimation in PDatalog. Secondly, fuzzy-like predicates are required to model vague predicates (e.g. vague match of attributes such as age or price). Thirdly, to handle large data sets there are scalability issues to be addressed, and therefore, HySpirit provides probabilistic relational indexes and parallel and distributed processing. The main contribution of this paper is a consolidated view on the methods of the HySpirit system to make PDatalog applicable in real-scale applications that involve a wide range of requirements typical for data (information) management and analysis.
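The flavor of probabilistic facts and rules can be conveyed with a toy evaluator: each fact carries a probability, and a conjunctive rule combines body probabilities under an independence assumption. This is a didactic toy, not HySpirit's actual semantics.

```python
# Probabilistic facts: predicate tuple -> probability.
facts = {
    ("about", "doc1", "databases"): 0.8,
    ("about", "doc1", "ir"):        0.5,
    ("about", "doc2", "ir"):        0.9,
}

def prob(goal):
    return facts.get(goal, 0.0)

# Rule: relevant(D) :- about(D, "databases") & about(D, "ir").
# Under an independence assumption, conjunction multiplies probabilities.
def relevant(doc):
    return prob(("about", doc, "databases")) * prob(("about", doc, "ir"))

for doc in ("doc1", "doc2"):
    print(doc, relevant(doc))   # doc1 -> 0.4, doc2 -> 0.0
```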
Ecotoxicological models generally have large data requirements and are frequently based on existing information from diverse sources. Standardizing data for toxicological models may be necessary to reduce extraneous variation and to ensure models reflect intrinsic relationships. ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tome, Carlos N; Caro, J A; Lebensohn, R A
2010-01-01
Advancing the performance of Light Water Reactors, Advanced Nuclear Fuel Cycles, and Advanced Reactors, such as the Next Generation Nuclear Power Plants, requires enhancing our fundamental understanding of fuel and materials behavior under irradiation. The capability to accurately model the nuclear fuel systems to develop predictive tools is critical. Not only are fabrication and performance models needed to understand specific aspects of the nuclear fuel, fully coupled fuel simulation codes are required to achieve licensing of specific nuclear fuel designs for operation. The backbone of these codes, models, and simulations is a fundamental understanding and predictive capability for simulating the phase and microstructural behavior of the nuclear fuel system materials and matrices. In this paper we review the current status of the advanced modeling and simulation of nuclear reactor cladding, with emphasis on what is available and what is to be developed in each scale of the project, how we propose to pass information from one scale to the next, and what experimental information is required for benchmarking and advancing the modeling at each scale level.
NASA Handbook for Models and Simulations: An Implementation Guide for NASA-STD-7009
NASA Technical Reports Server (NTRS)
Steele, Martin J.
2013-01-01
The purpose of this Handbook is to provide technical information, clarification, examples, processes, and techniques to help institute good modeling and simulation practices in the National Aeronautics and Space Administration (NASA). As a companion guide to NASA-STD- 7009, Standard for Models and Simulations, this Handbook provides a broader scope of information than may be included in a Standard and promotes good practices in the production, use, and consumption of NASA modeling and simulation products. NASA-STD-7009 specifies what a modeling and simulation activity shall or should do (in the requirements) but does not prescribe how the requirements are to be met, which varies with the specific engineering discipline, or who is responsible for complying with the requirements, which depends on the size and type of project. A guidance document, which is not constrained by the requirements of a Standard, is better suited to address these additional aspects and provide necessary clarification. This Handbook stems from the Space Shuttle Columbia Accident Investigation (2003), which called for Agency-wide improvements in the "development, documentation, and operation of models and simulations"' that subsequently elicited additional guidance from the NASA Office of the Chief Engineer to include "a standard method to assess the credibility of the models and simulations."2 General methods applicable across the broad spectrum of model and simulation (M&S) disciplines were sought to help guide the modeling and simulation processes within NASA and to provide for consistent reporting ofM&S activities and analysis results. From this, the standardized process for the M&S activity was developed. The major contents of this Handbook are the implementation details of the general M&S requirements ofNASA-STD-7009, including explanations, examples, and suggestions for improving the credibility assessment of an M&S-based analysis.
Cai, Qianqian; Turner, Brett D; Sheng, Daichao; Sloan, Scott
2018-03-01
The kinetics of fluoride sorption by calcite in the presence of metal ions (Co, Mn, Cd and Ba) have been investigated and modelled using the intra-particle diffusion (IPD), pseudo-second order (PSO), and the Hill 4 and Hill 5 kinetic models. Model comparison using the Akaike Information Criterion (AIC), the Schwarz Bayesian Information Criterion (BIC) and the Bayes Factor allows direct comparison of model results irrespective of the number of model parameters. Information Criterion results indicate "very strong" evidence that the Hill 5 model was the best fitting model for all observed data, due to its ability to fit sigmoidal data, with confidence contour analysis showing the model parameters were well constrained by the data. Kinetic results were used to determine the thickness of a calcite permeable reactive barrier required to achieve up to 99.9% fluoride removal at a groundwater flow of 0.1 m/day. Fluoride removal half-life (t0.5) values were found to increase in the order Ba ≈ stonedust (a 99% pure natural calcite) < Cd < Co < Mn. A barrier width of 0.97 ± 0.02 m was found to be required for the fluoride/calcite (stonedust) only system when using no factor of safety, whilst in the presence of Mn and Co the width increased to 2.76 ± 0.28 m and 19.83 ± 0.37 m, respectively. In comparison, the PSO model predicted a required barrier thickness of ∼46.0, 62.6 and 50.3 m, respectively, for the fluoride/calcite, Mn and Co systems under the same conditions.
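A minimal version of the model-comparison step might look like the sketch below: fit the PSO model and a five-parameter sigmoidal (Hill-type) curve to sorption-versus-time data and compare AIC values. The data are synthetic, and the Hill 5 form used is one common parameterization, not necessarily the authors' exact equation.

```python
import numpy as np
from scipy.optimize import curve_fit

def pso(t, qe, k):
    """Pseudo-second-order sorption: q(t) = k*qe^2*t / (1 + k*qe*t)."""
    return k * qe**2 * t / (1.0 + k * qe * t)

def hill5(t, y0, a, t50, b, s):
    """One common 5-parameter sigmoid (asymmetric logistic)."""
    return y0 + a / (1.0 + (t50 / np.maximum(t, 1e-9))**b)**s

def aic(y, yhat, n_params):
    """AIC from residual sum of squares (Gaussian errors assumed)."""
    rss = np.sum((y - yhat)**2)
    n = len(y)
    return n * np.log(rss / n) + 2 * n_params

# Synthetic sigmoidal uptake data with noise.
t = np.linspace(0.5, 48, 25)
rng = np.random.default_rng(1)
q = hill5(t, 0.0, 1.0, 10.0, 3.0, 1.0) + rng.normal(0, 0.02, t.size)

p_pso, _ = curve_fit(pso, t, q, p0=[1.0, 0.1], maxfev=10000)
p_h5, _ = curve_fit(hill5, t, q, p0=[0.0, 1.0, 10.0, 2.0, 1.0], maxfev=10000)

print("AIC  PSO   :", aic(q, pso(t, *p_pso), 2))
print("AIC  Hill 5:", aic(q, hill5(t, *p_h5), 5))   # lower AIC -> preferred model
```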
The PDS4 Information Model and its Role in Agile Science Data Curation
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Crichton, D.
2017-12-01
PDS4 is an information model-driven service architecture supporting the capture, management, distribution and integration of massive planetary science data captured in distributed data archives world-wide. The PDS4 Information Model (IM), the core element of the architecture, was developed using lessons learned from 20 years of archiving Planetary Science Data and best practices for information model development. The foundational principles were adopted from the Open Archival Information System (OAIS) Reference Model (ISO 14721), the Metadata Registry Specification (ISO/IEC 11179), and W3C XML (Extensible Markup Language) specifications. These provided respectively an object oriented model for archive information systems, a comprehensive schema for data dictionaries and hierarchical governance, and rules for rules for encoding documents electronically. The PDS4 Information model is unique in that it drives the PDS4 infrastructure by providing the representation of concepts and their relationships, constraints, rules, and operations; a sharable, stable, and organized set of information requirements; and machine parsable definitions that are suitable for configuring and generating code. This presentation will provide an over of the PDS4 Information Model and how it is being leveraged to develop and evolve the PDS4 infrastructure and enable agile curation of over 30 years of science data collected by the international Planetary Science community.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-01
...The National Highway Traffic Safety Administration (NHTSA) published a document in the Federal Register of June 21, 2010, announcing NHTSA's determination that there were no new model year (MY) 2011 light-duty truck lines subject to the requirements of the Federal motor vehicle theft prevention standard. The final rule also identified those vehicle lines that had been granted an exemption from the parts-marking requirements for the 2011 model year and those vehicle lines the agency removed because certain vehicle lines had been discontinued more than 5 years ago. This document corrects certain information published in the SUPPLEMENTARY INFORMATION section and Appendix A-I listing of the final rule. All previous information associated with the published notice remains the same.
Hydrologic Process-oriented Optimization of Electrical Resistivity Tomography
NASA Astrophysics Data System (ADS)
Hinnell, A.; Bechtold, M.; Ferre, T. A.; van der Kruk, J.
2010-12-01
Electrical resistivity tomography (ERT) is commonly used in hydrologic investigations. Advances in joint and coupled hydrogeophysical inversion have enhanced the quantitative use of ERT to construct and condition hydrologic models (i.e. identify hydrologic structure and estimate hydrologic parameters). However the selection of which electrical resistivity data to collect and use is often determined by a combination of data requirements for geophysical analysis, intuition on the part of the hydrogeophysicist and logistical constraints of the laboratory or field site. One of the advantages of coupled hydrogeophysical inversion is the direct link between the hydrologic model and the individual geophysical data used to condition the model. That is, there is no requirement to collect geophysical data suitable for independent geophysical inversion. The geophysical measurements collected can be optimized for estimation of hydrologic model parameters rather than to develop a geophysical model. Using a synthetic model of drip irrigation we evaluate the value of individual resistivity measurements to describe the soil hydraulic properties and then use this information to build a data set optimized for characterizing hydrologic processes. We then compare the information content in the optimized data set with the information content in a data set optimized using a Jacobian sensitivity analysis.
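A bare-bones version of the measurement-ranking idea: numerically estimate the Jacobian of each candidate geophysical measurement with respect to the hydrologic parameters, and keep the most sensitive measurements. The forward model here is a stand-in toy, not an ERT simulator.

```python
import numpy as np

def forward(params, n_meas=8):
    """Toy stand-in for a coupled hydro-geophysical forward model:
    maps two hydraulic parameters to n_meas 'resistance' readings."""
    a, b = params
    x = np.linspace(0.1, 1.0, n_meas)          # nominal electrode geometries
    return a * np.exp(-b * x) + 0.1 * x

def jacobian(params, eps=1e-6):
    """Finite-difference sensitivity of each measurement to each parameter."""
    base = forward(params)
    J = np.zeros((base.size, len(params)))
    for j in range(len(params)):
        p = np.array(params, dtype=float)
        p[j] += eps
        J[:, j] = (forward(p) - base) / eps
    return J

J = jacobian([1.0, 2.0])
score = np.linalg.norm(J, axis=1)              # per-measurement sensitivity
best = np.argsort(score)[::-1][:3]
print("most informative measurement indices:", best)
```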
Requirement analysis for the one-stop logistics management of fresh agricultural products
NASA Astrophysics Data System (ADS)
Li, Jun; Gao, Hongmei; Liu, Yuchuan
2017-08-01
Issues and concerns regarding food safety, agro-processing, and the environmental and ecological impact of food production have attracted many research interests. Traceability and logistics management of fresh agricultural products is faced with technological challenges including food product labelling and identification, activity/process characterization, and information systems for the supply chain, i.e., from farm to table. The application of a one-stop logistics service focusing on whole supply chain process integration for fresh agricultural products is studied. A collaborative research project for the supply and logistics of fresh agricultural products in Tianjin was performed. Requirement analysis for the one-stop logistics management information system is studied. Model-driven business transformation, an approach that uses formal models to explicitly define the structure and behavior of a business, is applied in the review and analysis process. Specific requirements for the logistics management solutions are proposed. The development of this research is crucial for the solution of a one-stop logistics management information system integration platform for fresh agricultural products.
Kim, Katherine K; Browe, Dennis K; Logan, Holly C; Holm, Roberta; Hack, Lori; Ohno-Machado, Lucila
2014-01-01
There is currently limited information on best practices for the development of governance requirements for distributed research networks (DRNs), an emerging model that promotes clinical data reuse and improves timeliness of comparative effectiveness research. Much of the existing information is based on a single type of stakeholder such as researchers or administrators. This paper reports on a triangulated approach to developing DRN data governance requirements based on a combination of policy analysis with experts, interviews with institutional leaders, and patient focus groups. This approach is illustrated with an example from the Scalable National Network for Effectiveness Research, which resulted in 91 requirements. These requirements were analyzed against the Fair Information Practice Principles (FIPPs) and Health Insurance Portability and Accountability Act (HIPAA) protected versus non-protected health information. The requirements addressed all FIPPs, showing how a DRN's technical infrastructure is able to fulfill HIPAA regulations, protect privacy, and provide a trustworthy platform for research.
System requirements specification for SMART structures mode
NASA Technical Reports Server (NTRS)
1992-01-01
Specified here are the functional and informational requirements for software modules which address the geometric and data modeling needs of the aerospace structural engineer. The modules are to be included as part of the Solid Modeling Aerospace Research Tool (SMART) package developed for the Vehicle Analysis Branch (VAB) at the NASA Langley Research Center (LaRC). The purpose is to precisely state what the SMART Structures modules will do, without consideration of how it will be done. Each requirement is numbered for reference in development and testing.
Framework for a clinical information system.
Van De Velde, R; Lansiers, R; Antonissen, G
2002-01-01
The design and implementation of a Clinical Information System architecture is presented. This architecture has been developed and implemented based on components, following a strong underlying conceptual and technological model. Common Object Request Broker and n-tier technology are used, featuring centralised and departmental clinical information systems as the back-end store for all clinical data. Servers located in the "middle" tier apply the clinical (business) model and application rules. The main characteristics are the focus on modelling and the reuse of both data and business logic. Scalability, as well as adaptability to constantly changing requirements via component-driven computing, are the main reasons for that approach.
Handapangoda, Chintha C; Premaratne, Malin; Paganin, David M; Hendahewa, Priyantha R D S
2008-10-27
A novel algorithm for mapping the photon transport equation (PTE) to Maxwell's equations is presented. Owing to its accuracy, wave propagation through biological tissue is modeled using the PTE. The mapping of the PTE to Maxwell's equations is required to model wave propagation through foreign structures implanted in biological tissue for sensing and characterization of tissue properties. The PTE solves for only the magnitude of the intensity but Maxwell's equations require the phase information as well. However, it is possible to construct the phase information approximately by solving the transport of intensity equation (TIE) using the full multigrid algorithm.
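As a concrete illustration of the phase-construction step described above (and not code from the paper), the sketch below solves a simplified TIE under the additional assumption of near-uniform intensity, where the equation reduces to a Poisson problem. It inverts the Laplacian spectrally via FFT rather than with the full multigrid solver the authors use; the symbols I0, k, and dx are illustrative.

```python
import numpy as np

def tie_phase_fft(dI_dz, I0, k, dx):
    """Recover phase from the axial intensity derivative via the TIE,
    assuming near-uniform intensity I0: laplacian(phi) = -(k/I0) dI/dz.
    Spectral (FFT) inversion; the paper itself uses full multigrid."""
    ny, nx = dI_dz.shape
    fy = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
    fx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    KY, KX = np.meshgrid(fy, fx, indexing="ij")
    k2 = KX**2 + KY**2
    k2[0, 0] = 1.0                       # avoid division by zero at DC
    rhs = -(k / I0) * dI_dz
    phi_hat = np.fft.fft2(rhs) / (-k2)   # invert the Laplacian in Fourier space
    phi_hat[0, 0] = 0.0                  # phase is defined up to a constant
    return np.real(np.fft.ifft2(phi_hat))
```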
An information based approach to improving overhead imagery collection
NASA Astrophysics Data System (ADS)
Sourwine, Matthew J.; Hintz, Kenneth J.
2011-06-01
Recent growth in commercial imaging satellite development has resulted in a complex and diverse set of systems. To simplify this environment for both customer and vendor, an information based sensor management model was built to integrate tasking and scheduling systems. By establishing a relationship between image quality and information, tasking by NIIRS can be utilized to measure the customer's required information content. Focused on a reduction in uncertainty about a target of interest, the sensor manager finds the best sensors to complete the task given the active suite of imaging sensors' functions. This is done through determination of which satellite will meet customer information and timeliness requirements with low likelihood of interference at the highest rate of return.
Evolutionary Capability Delivery of Coast Guard Manpower System
2014-06-01
[Acronym list fragment: ... Office; IID, iterative incremental development model; IT, information technology; MA, major accomplishment; MRA, manpower requirements analysis; MRD, manpower requirements ...] The CG will need to ensure that development is low risk. The CG uses Manpower Requirements Analyses (MRAs) to collect the necessary manpower data to... of users. The CG uses two business processes to manage human capital: Manpower Requirements Analysis (MRA) and Manpower Requirements...
Requirements for Medical Modeling Languages
van der Maas, Arnoud A.F.; Ter Hofstede, Arthur H.M.; Ten Hoopen, A. Johannes
2001-01-01
Objective: The development of tailor-made domain-specific modeling languages is sometimes desirable in medical informatics. Naturally, the development of such languages should be guided. The purpose of this article is to introduce a set of requirements for such languages and show their application in analyzing and comparing existing modeling languages. Design: The requirements arise from the practical experience of the authors and others in the development of modeling languages in both general informatics and medical informatics. The requirements initially emerged from the analysis of information modeling techniques. The requirements are designed to be orthogonal, i.e., one requirement can be violated without violation of the others. Results: The proposed requirements for any modeling language are that it be “formal” with regard to syntax and semantics, “conceptual,” “expressive,” “comprehensible,” “suitable,” and “executable.” The requirements are illustrated using both the medical logic modules of the Arden Syntax as a running example and selected examples from other modeling languages. Conclusion: Activity diagrams of the Unified Modeling Language, task structures for work flows, and Petri nets are discussed with regard to the list of requirements, and various tradeoffs are thus made explicit. It is concluded that this set of requirements has the potential to play a vital role in both the evaluation of existing domain-specific languages and the development of new ones. PMID:11230383
Thermal APU/hydraulics analysis program. User's guide and programmer's manual
NASA Technical Reports Server (NTRS)
Deluna, T. A.
1976-01-01
The user's guide information and the program description necessary to run and gain a general understanding of the Thermal APU/Hydraulics Analysis Program (TAHAP) are presented. This information consists of general descriptions of the APU/hydraulic system and the TAHAP model, input and output data descriptions, and specific subroutine requirements. Deck setups and input data formats are included, and other necessary and/or helpful information for using TAHAP is given. The math model descriptions for the driver program and each of its supporting subroutines are outlined.
Modelling topographic potential for erosion and deposition using GIS
Helena Mitasova; Louis R. Iverson
1996-01-01
Modelling of erosion and deposition in complex terrain within a geographical information system (GIS) requires a high resolution digital elevation model (DEM), reliable estimation of topographic parameters, and formulation of erosion models adequate for digital representation of spatially distributed parameters. Regularized spline with tension was integrated within a...
Function modeling improves the efficiency of spatial modeling using big data from remote sensing
John Hogland; Nathaniel Anderson
2017-01-01
Spatial modeling is an integral component of most geographic information systems (GISs). However, conventional GIS modeling techniques can require substantial processing time and storage space and have limited statistical and machine learning functionality. To address these limitations, many have parallelized spatial models using multiple coding libraries and have...
NASA Astrophysics Data System (ADS)
Kase, Sue E.; Vanni, Michelle; Caylor, Justine; Hoye, Jeff
2017-05-01
The Human-Assisted Machine Information Exploitation (HAMIE) investigation utilizes large-scale online data collection for developing models of information-based problem solving (IBPS) behavior in a simulated time-critical operational environment. These types of environments are characteristic of intelligence workflow processes conducted during human-geo-political unrest situations when the ability to make the best decision at the right time ensures strategic overmatch. The project takes a systems approach to Human Information Interaction (HII) by harnessing the expertise of crowds to model the interaction of the information consumer and the information required to solve a problem at different levels of system restrictiveness and decisional guidance. The design variables derived from Decision Support Systems (DSS) research represent the experimental conditions in this online single-player against-the-clock game where the player, acting in the role of an intelligence analyst, is tasked with a Commander's Critical Information Requirement (CCIR) in an information overload scenario. The player performs a sequence of three information processing tasks (annotation, relation identification, and link diagram formation) with the assistance of 'HAMIE the robot' who offers varying levels of information understanding dependent on question complexity. We provide preliminary results from a pilot study conducted with Amazon Mechanical Turk (AMT) participants on the Volunteer Science scientific research platform.
User modeling for distributed virtual environment intelligent agents
NASA Astrophysics Data System (ADS)
Banks, Sheila B.; Stytz, Martin R.
1999-07-01
This paper emphasizes the requirement for user modeling by presenting the necessary information to motivate the need for and use of user modeling in intelligent agent development. The paper presents information on our current intelligent agent development program, the Symbiotic Information Reasoning and Decision Support (SIRDS) project. We then discuss the areas of intelligent agents and user modeling, which form the foundation of the SIRDS project. Included in the discussion of user modeling are its major components, cognitive modeling and behavioral modeling. We next motivate the need for and use of a methodology for developing user models that encompasses work within cognitive task analysis. We close the paper by drawing conclusions from our current intelligent agent research project and discussing avenues of future research in the utilization of user modeling for the development of intelligent agents for virtual environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharp, J.K.
1997-11-01
This seminar describes a process and methodology that uses structured natural language to enable the construction of precise information requirements directly from users, experts, and managers. The main focus of this natural language approach is to create the precise information requirements and to do so in such a way that the business and technical experts are fully accountable for the results. These requirements can then be implemented using appropriate tools and technology. This requirement set is also a universal learning tool because it has all of the knowledge that is needed to understand a particular process (e.g., expense vouchers, project management, budget reviews, tax laws, machine function).
Inventory of environmental impact models related to energy technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owen, P.T.; Dailey, N.S.; Johnson, C.A.
The purpose of this inventory is to identify and collect data on computer simulations and computational models related to the environmental effects of energy source development, energy conversion, or energy utilization. Information for 33 data fields was sought for each model reported. All of the information which could be obtained within the time allotted for completion of the project is presented for each model listed. Efforts will be continued toward acquiring the needed information. Readers who are interested in these particular models are invited to contact ESIC for assistance in locating them. In addition to the standard bibliographic information, other data fields of interest to modelers, such as computer hardware and software requirements, algorithms, applications, and existing model validation information, are included. Indexes are provided for contact person, acronym, keyword, and title. The models are grouped into the following categories: atmospheric transport, air quality, aquatic transport, terrestrial food chains, soil transport, aquatic food chains, water quality, dosimetry and human effects, animal effects, plant effects, and generalized environmental transport. Within these categories, the models are arranged alphabetically by last name of the contact person.
Rabideau, Dustin J; Pei, Pamela P; Walensky, Rochelle P; Zheng, Amy; Parker, Robert A
2018-02-01
The expected value of sample information (EVSI) can help prioritize research but its application is hampered by computational infeasibility, especially for complex models. We investigated an approach by Strong and colleagues to estimate EVSI by applying generalized additive models (GAM) to results generated from a probabilistic sensitivity analysis (PSA). For 3 potential HIV prevention and treatment strategies, we estimated life expectancy and lifetime costs using the Cost-effectiveness of Preventing AIDS Complications (CEPAC) model, a complex patient-level microsimulation model of HIV progression. We fitted a GAM (a flexible regression model that estimates the functional form as part of the model fitting process) to the incremental net monetary benefits obtained from the CEPAC PSA. For each case study, we calculated the expected value of partial perfect information (EVPPI) using both the conventional nested Monte Carlo approach and the GAM approach. EVSI was calculated using the GAM approach. For all 3 case studies, the GAM approach consistently gave similar estimates of EVPPI compared with the conventional approach. The EVSI behaved as expected: it increased and converged to EVPPI for larger sample sizes. For each case study, generating the PSA results for the GAM approach required 3 to 4 days on a shared cluster, after which EVPPI and EVSI across a range of sample sizes were evaluated in minutes. The conventional approach required approximately 5 weeks for the EVPPI calculation alone. Estimating EVSI using the GAM approach with results from a PSA dramatically reduced the time required to conduct a computationally intense project, which would otherwise have been impractical. Using the GAM approach, we can efficiently provide policy makers with EVSI estimates, even for complex patient-level microsimulation models.
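A minimal sketch of the Strong-style regression estimator of EVPPI described above, assuming PSA draws are already available. The pygam package is used here as one possible GAM implementation; array shapes and variable names are illustrative, not from the paper.

```python
import numpy as np
from pygam import LinearGAM, s  # one possible GAM implementation

def evppi_gam(theta, net_benefit):
    """Regression-based EVPPI estimate from PSA output.
    theta: (n,) PSA draws of the parameter (set) of interest.
    net_benefit: (n, d) net monetary benefit per strategy and draw."""
    n, d = net_benefit.shape
    X = theta.reshape(-1, 1)
    # Fit a smooth approximation of E[NB_d | theta] for each strategy
    fitted = np.column_stack([
        LinearGAM(s(0)).fit(X, net_benefit[:, j]).predict(X)
        for j in range(d)
    ])
    # EVPPI = E_theta[ max_d E(NB_d | theta) ] - max_d E(NB_d)
    return fitted.max(axis=1).mean() - net_benefit.mean(axis=0).max()
```

EVSI follows the same pattern, except the regression is on a summary statistic of a simulated study of a given sample size rather than on the parameter itself.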
1995-09-01
... vital processes of a business. [Keywords: process, IDEF, method, methodology, modeling, knowledge acquisition, requirements definition, information systems, knowledge resources.] Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to... integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems...
Information Security and Data Breach Notification Safeguards
2007-07-31
for unauthorized purposes. Data breach notification requirements obligate covered entities to provide notice to affected persons (e.g., cardholders...customers) about the occurrence of a data security breach involving personally identifiable information. The first data breach notification law was...computerized personal information to disclose any breach of a resident’s personal information. S.B. 1386 was the model for subsequent data breach notification
Development of Optimal Stressor Scenarios for New Operational Energy Systems
2017-12-01
Analyzing the previous model using a design of experiments (DOE) and regression analysis provides critical information about the associated operational...from experimentation. The resulting system requirements can be used to revisit the design requirements and develop a more robust system. This process...stressor scenarios for acceptance testing.
GUIDELINES TO ASSESSING REGIONAL VULNERABILITIES
Decision-makers today face increasingly complex environmental problems that require integrative and innovative approaches for analyzing, modeling, and interpreting various types of information. ReVA acknowledges this need and is designed to evaluate methods and models for synthe...
Artificial retina model for the retinally blind based on wavelet transform
NASA Astrophysics Data System (ADS)
Zeng, Yan-an; Song, Xin-qiang; Jiang, Fa-gang; Chang, Da-ding
2007-01-01
The artificial retina is aimed at stimulating the remaining retinal neurons in patients with degenerated photoreceptors. Microelectrode arrays have been developed for this purpose as part of the stimulator. Designing such microelectrode arrays first requires a suitable mathematical method for human retinal information processing. In this paper, a flexible and adjustable model for extracting human visual information, based on the wavelet transform, is presented. Given the flexibility of the wavelet transform for image information processing and its consistency with human visual information extraction, wavelet transform theory is applied to an artificial retina model for the retinally blind. The response of the model to a synthetic image is shown. The simulated experiment demonstrates that the model behaves in a manner qualitatively similar to biological retinas and thus may serve as a basis for the development of an artificial retina.
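To make the wavelet-based extraction step concrete, here is an illustrative sketch (not the paper's implementation) using the PyWavelets package: a multiresolution split of an input image into a coarse approximation plus orientation-selective detail bands, loosely analogous to retinal multiscale processing. The wavelet family and level count are assumptions.

```python
import numpy as np
import pywt  # PyWavelets

def retina_like_decomposition(image, wavelet="db2", levels=3):
    """Decompose an image into a coarse approximation and per-level
    (horizontal, vertical, diagonal) detail bands via the 2-D DWT."""
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    approx, details = coeffs[0], coeffs[1:]  # coarse-to-fine detail tuples
    return approx, details

# Example on a synthetic stimulus: a bright square on a dark background
img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0
approx, details = retina_like_decomposition(img)
```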
40 CFR 1042.250 - Recordkeeping and reporting.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 45 days after the end of the model year, you must send us a report describing information about... information you send us. (2) Any of the information we specify in § 1042.205 that you were not required to... tests (such as test cell temperatures and relative humidity readings) for one year after we issue the...
40 CFR 1042.250 - Recordkeeping and reporting.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 45 days after the end of the model year, you must send us a report describing information about... information you send us. (2) Any of the information we specify in § 1042.205 that you were not required to... tests (such as test cell temperatures and relative humidity readings) for one year after we issue the...
40 CFR 1042.250 - Recordkeeping and reporting.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 45 days after the end of the model year, you must send us a report describing information about... information you send us. (2) Any of the information we specify in § 1042.205 that you were not required to... tests (such as test cell temperatures and relative humidity readings) for one year after we issue the...
40 CFR 1042.250 - Recordkeeping and reporting.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 45 days after the end of the model year, you must send us a report describing information about... information you send us. (2) Any of the information we specify in § 1042.205 that you were not required to... tests (such as test cell temperatures and relative humidity readings) for one year after we issue the...
ERIC Educational Resources Information Center
Hall, Jacqueline Huynh
2011-01-01
In today's modern business world, most organizations use information as a critical business asset to gain competitive advantage and create market value. Increasingly, an organization's ability to protect information assets plays a critical role in its ability to meet regulatory compliance requirements, increase customer trust, preserve brand…
Information Brokers: Case Studies of Successful Ventures.
ERIC Educational Resources Information Center
Holland Johnson, Alice Jane
This guide is intended for librarians planning to start an information brokerage, whether as an entrepreneur or as a member of a document delivery group in a library. The guide identifies specific skills and relevant characteristics required to establish a successful information brokerage firm and describes a model to assist readers in the process…
ERIC Educational Resources Information Center
Morales-del-Castillo, Jose Manuel; Peis, Eduardo; Moreno, Juan Manuel; Herrera-Viedma, Enrique
2009-01-01
Introduction: In this paper we propose a multi-agent Selective Dissemination of Information service to improve the research community's access to digital library resources. The service also provides a new recommendation approach to satisfy researchers' specific information requirements. Method: The service model is developed by jointly applying…
ERIC Educational Resources Information Center
Kiriakou, Charles M.
2012-01-01
Adoption of a comprehensive information security governance model and security controls is the best option organizations may have to protect their information assets and comply with regulatory requirements. Understanding acceptance factors of the National Institute of Standards and Technology (NIST) Risk Management Framework (RMF) comprehensive…
NASA Technical Reports Server (NTRS)
1979-01-01
Information to identify viable coal gasification and utilization technologies is presented. Analysis capabilities required to support design and implementation of coal based synthetic fuels complexes are identified. The potential market in the Southeast United States for coal based synthetic fuels is investigated. A requirements analysis to identify the types of modeling and analysis capabilities required to conduct and monitor coal gasification project designs is discussed. Models and methodologies to satisfy these requirements are identified and evaluated, and recommendations are developed. Requirements for development of technology and data needed to improve gasification feasibility and economics are examined.
The Planetary Data System Information Model for Geometry Metadata
NASA Astrophysics Data System (ADS)
Guinness, E. A.; Gordon, M. K.
2014-12-01
The NASA Planetary Data System (PDS) has recently developed a new set of archiving standards based on a rigorously defined information model. An important part of the new PDS information model is the model for geometry metadata, which includes, for example, attributes of the lighting and viewing angles of observations, position and velocity vectors of a spacecraft relative to Sun and observing body at the time of observation and the location and orientation of an observation on the target. The PDS geometry model is based on requirements gathered from the planetary research community, data producers, and software engineers who build search tools. A key requirement for the model is that it fully supports the breadth of PDS archives that include a wide range of data types from missions and instruments observing many types of solar system bodies such as planets, ring systems, and smaller bodies (moons, comets, and asteroids). Thus, important design aspects of the geometry model are that it standardizes the definition of the geometry attributes and provides consistency of geometry metadata across planetary science disciplines. The model specification also includes parameters so that the context of values can be unambiguously interpreted. For example, the reference frame used for specifying geographic locations on a planetary body is explicitly included with the other geometry metadata parameters. The structure and content of the new PDS geometry model is designed to enable both science analysis and efficient development of search tools. The geometry model is implemented in XML, as is the main PDS information model, and uses XML schema for validation. The initial version of the geometry model is focused on geometry for remote sensing observations conducted by flyby and orbiting spacecraft. Future releases of the PDS geometry model will be expanded to include metadata for landed and rover spacecraft.
NASA Technical Reports Server (NTRS)
1975-01-01
The Model is described along with data preparation, determining model parameters, initializing and optimizing parameters (calibration), selecting control options, and interpreting results. Some background information is included, and appendices contain a dictionary of variables, a source program listing, and flow charts. The model was operated on an IBM System/360 Model 44, using a Model 2250 keyboard/graphics terminal for interactive operation. The model can be set up and operated in a batch processing mode on any System/360 or 370 that has the memory capacity. The model requires 210K bytes of core storage, and the optimization program, OPSET (which was used previously, but not in this study), requires 240K bytes. The data band for one small watershed requires approximately 32 tracks of disk storage.
One decade of the Data Fusion Information Group (DFIG) model
NASA Astrophysics Data System (ADS)
Blasch, Erik
2015-05-01
The revision of the Joint Directors of the Laboratories (JDL) Information Fusion model in 2004 discussed information processing, incorporated the analyst, and was coined the Data Fusion Information Group (DFIG) model. Since that time, developments in information technology (e.g., cloud computing, applications, and multimedia) have altered the role of the analyst. Data production has outpaced the analyst; however, the analyst still has the role of data refinement and information reporting. In this paper, we highlight three examples being addressed by the DFIG model. The first example is the role of the analyst in providing semantic queries (through an ontology) so that the vast amount of data available can be indexed, accessed, retrieved, and processed. The second is reporting, which requires the analyst to collect the data into a condensed and meaningful form through information management. The last example is the interpretation of the resolved information from data, which must include contextual information not inherent in the data itself. Through a literature review, the DFIG developments in the last decade demonstrate the usability of the DFIG model in bringing together the user (analyst or operator) and the machine (information fusion or manager) in a systems design.
Information Requirements for Integrating Spatially Discrete, Feature-Based Earth Observations
NASA Astrophysics Data System (ADS)
Horsburgh, J. S.; Aufdenkampe, A. K.; Lehnert, K. A.; Mayorga, E.; Hsu, L.; Song, L.; Zaslavsky, I.; Valentine, D. L.
2014-12-01
Several cyberinfrastructures have emerged for sharing observational data collected at densely sampled and/or highly instrumented field sites. These include the CUAHSI Hydrologic Information System (HIS), the Critical Zone Observatory Integrated Data Management System (CZOData), the Integrated Earth Data Applications (IEDA) and EarthChem system, and the Integrated Ocean Observing System (IOOS). These systems rely on standard data encodings and, in some cases, standard semantics for classes of geoscience data. Their focus is on sharing data on the Internet via web services in domain specific encodings or markup languages. While they have made progress in making data available, it still takes investigators significant effort to discover and access datasets from multiple repositories because of inconsistencies in the way domain systems describe, encode, and share data. Yet, there are many scenarios that require efficient integration of these data types across different domains. For example, understanding a soil profile's geochemical response to extreme weather events requires integration of hydrologic and atmospheric time series with geochemical data from soil samples collected over various depth intervals from soil cores or pits at different positions on a landscape. Integrated access to and analysis of data for such studies are hindered because common characteristics of data, including time, location, provenance, methods, and units are described differently within different systems. Integration requires syntactic and semantic translations that can be manual, error-prone, and lossy. We report information requirements identified as part of our work to define an information model for a broad class of earth science data - i.e., spatially-discrete, feature-based earth observations resulting from in-situ sensors and environmental samples. We sought to answer the question: "What information must accompany observational data for them to be archivable and discoverable within a publication system as well as interpretable once retrieved from such a system for analysis and (re)use?" We also describe development of multiple functional schemas (i.e., physical implementations for data storage, transfer, and archival) for the information model that capture the requirements reported here.
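As a hypothetical illustration (not from the paper) of the information the authors say must accompany a spatially discrete, feature-based observation, the following sketch bundles the common characteristics they enumerate, time, location, provenance, methods, and units, into a single record type. All field names are assumptions chosen for readability.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class FeatureObservation:
    """Hypothetical record for one feature-based earth observation,
    carrying the metadata needed for archiving, discovery, and reuse."""
    variable: str                   # what was observed, e.g. "nitrate"
    value: float
    unit: str                       # e.g. "mg/L"
    feature_id: str                 # sampling feature: site, core, pit, ...
    latitude: float
    longitude: float
    time: datetime                  # time of observation or collection
    method: str                     # sensor or laboratory analysis method
    provenance: str                 # producer, processing level, lineage
    depth_interval_m: Optional[tuple] = None  # e.g. soil core interval
```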
Canyon, Deon V; Burkle, Frederick M; Speare, Rick
2015-12-01
Earth's climate is changing and national and international decision-makers are recognizing that global health security requires urgent attention and a significant investment to protect the future. In most locations, current data are inadequate to conduct a full assessment of the direct and indirect health impacts of climate change. All states require this information to evaluate community-level resilience to climate extremes and climate change. A model that is being used successfully in the United Kingdom, Australia, and New Zealand is recommended to generate rapid information to assist decision-makers in the event of a disaster. The model overcomes barriers to success inherent in the traditional 'top-down' approach to managing crises and recognizes the capacity of capable citizens and community organizers to facilitate response and recovery if provided the opportunity and resources. Local information is a prerequisite for strategic and tactical statewide planning. Time and resources are required to analyze risks within each community and what is required to prevent (mitigate), prepare, respond, recover (rehabilitate), anticipate, and assess any threatening events. Specific requirements at all levels from state to community must emphasize community roles by focusing on how best to maintain, respond, and recover public health protections and the infrastructure necessary for health security.
Uncertainty in surface water flood risk modelling
NASA Astrophysics Data System (ADS)
Butler, J. B.; Martin, D. N.; Roberts, E.; Domuah, R.
2009-04-01
Two thirds of the flooding that occurred in the UK during summer 2007 was as a result of surface water (otherwise known as 'pluvial') rather than river or coastal flooding. In response, the Environment Agency and Interim Pitt Reviews have highlighted the need for surface water risk mapping and warning tools to identify, and prepare for, flooding induced by heavy rainfall events. This need is compounded by the likely increase in rainfall intensities due to climate change. The Association of British Insurers has called for the Environment Agency to commission nationwide flood risk maps showing the relative risk of flooding from all sources. At the wider European scale, the recently-published EC Directive on the assessment and management of flood risks will require Member States to evaluate, map and model flood risk from a variety of sources. As such, there is now a clear and immediate requirement for the development of techniques for assessing and managing surface water flood risk across large areas. This paper describes an approach for integrating rainfall, drainage network and high-resolution topographic data using Flowroute™, a high-resolution flood mapping and modelling platform, to produce deterministic surface water flood risk maps. Information is provided from UK case studies to enable assessment and validation of modelled results using historical flood information and insurance claims data. Flowroute was co-developed with flood scientists at Cambridge University specifically to simulate river dynamics and floodplain inundation in complex, congested urban areas in a highly computationally efficient manner. It utilises high-resolution topographic information to route flows around individual buildings so as to enable the prediction of flood depths, extents, durations and velocities. As such, the model forms an ideal platform for the development of surface water flood risk modelling and mapping capabilities. The 2-dimensional component of Flowroute employs uniform flow formulae (Manning's Equation) to direct flow over the model domain, sourcing water from the channel or sea so as to provide a detailed representation of river and coastal flood risk. The initial development step was to include spatially-distributed rainfall as a new source term within the model domain. This required optimisation to improve computational efficiency, given the ubiquity of 'wet' cells early on in the simulation. Collaboration with UK water companies has provided detailed drainage information, and from this a simplified representation of the drainage system has been included in the model via the inclusion of sinks and sources of water from the drainage network. This approach has clear advantages relative to a fully coupled method both in terms of reduced input data requirements and computational overhead. Further, given the difficulties associated with obtaining drainage information over large areas, tests were conducted to evaluate uncertainties associated with excluding drainage information and the impact that this has upon flood model predictions. This information can be used, for example, to inform insurance underwriting strategies and loss estimation as well as for emergency response and planning purposes. The Flowroute surface-water flood risk platform enables efficient mapping of areas sensitive to flooding from high-intensity rainfall events due to topography and drainage infrastructure.
As such, the technology has widespread potential for use as a risk mapping tool by the UK Environment Agency, European Member States, water authorities, local governments and the insurance industry. Keywords: Surface water flooding, Model Uncertainty, Insurance Underwriting, Flood inundation modelling, Risk mapping.
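Since the abstract names Manning's Equation as the uniform flow formula behind the 2-dimensional routing, a minimal worked sketch may help; the roughness, hydraulic radius, and slope values below are illustrative of shallow overland flow, not taken from the paper.

```python
def manning_velocity(n, hydraulic_radius_m, slope):
    """Manning's equation for mean flow velocity (SI units):
    V = (1/n) * R^(2/3) * S^(1/2)."""
    return (1.0 / n) * hydraulic_radius_m ** (2.0 / 3.0) * slope ** 0.5

# Shallow sheet flow over a paved urban surface (illustrative values):
# n = 0.015 (smooth concrete), R = 5 cm flow depth, 1% slope
v = manning_velocity(n=0.015, hydraulic_radius_m=0.05, slope=0.01)
print(f"{v:.2f} m/s")  # ~0.91 m/s
```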
NASA Astrophysics Data System (ADS)
Merla, Yu; Wu, Billy; Yufit, Vladimir; Martinez-Botas, Ricardo F.; Offer, Gregory J.
2018-04-01
Accurate diagnosis of lithium ion battery state-of-health (SOH) is of significant value for many applications, to improve performance, extend life and increase safety. However, in-situ or in-operando diagnosis of SOH often requires robust models. There are many models available however these often require expensive-to-measure ex-situ parameters and/or contain unmeasurable parameters that were fitted/assumed. In this work, we have developed a new empirically parameterised physics-informed equivalent circuit model. Its modular construction and low-cost parametrisation requirements allow end users to parameterise cells quickly and easily. The model is accurate to 19.6 mV for dynamic loads without any global fitting/optimisation, only that of the individual elements. The consequences of various degradation mechanisms are simulated, and the impact of a degraded cell on pack performance is explored, validated by comparison with experiment. Results show that an aged cell in a parallel pack does not have a noticeable effect on the available capacity of other cells in the pack. The model shows that cells perform better when electrodes are more porous towards the separator and have a uniform particle size distribution, validated by comparison with published data. The model is provided with this publication for readers to use.
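For orientation, the sketch below simulates a generic first-order Thevenin equivalent circuit, the simplest member of the ECM family the paper builds on; it is not the authors' physics-informed modular model, and all parameter values are illustrative.

```python
import numpy as np

def simulate_ecm(current, dt, ocv, r0, r1, c1):
    """First-order Thevenin equivalent-circuit cell model:
    v = OCV - i*R0 - v1,  dv1/dt = i/C1 - v1/(R1*C1).
    current: discharge-positive current samples (A); dt in seconds."""
    v1 = 0.0
    terminal = []
    for i in current:
        v1 += dt * (i / c1 - v1 / (r1 * c1))  # RC branch, forward Euler
        terminal.append(ocv - i * r0 - v1)    # instantaneous terminal voltage
    return np.array(terminal)

# 10-minute 1 A discharge pulse followed by 10 minutes of relaxation
i_profile = np.concatenate([np.ones(600), np.zeros(600)])
v = simulate_ecm(i_profile, dt=1.0, ocv=3.7, r0=0.02, r1=0.015, c1=2000.0)
```

A fixed OCV is assumed here for brevity; a practical model would make OCV, R0, R1, and C1 functions of state of charge and temperature, which is where empirically parameterised, physics-informed variants such as the paper's come in.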
NASA Astrophysics Data System (ADS)
Warner, T. T.; Swerdlin, S. P.; Chen, F.; Hayden, M.
2009-05-01
The innovative use of Computational Fluid-Dynamics (CFD) models to define the building- and street-scale atmospheric environment in urban areas can benefit society in a number of ways. Design criteria used by architectural climatologists, who help plan the livable cities of the future, require information about air movement within street canyons for different seasons and weather regimes. Understanding indoor urban air- quality problems and their mitigation, especially for older buildings, requires data on air movement and associated dynamic pressures near buildings. Learning how heat waves and anthropogenic forcing in cities collectively affect the health of vulnerable residents is a problem in building thermodynamics, human behavior, and neighborhood-scale and street-canyon-scale atmospheric sciences. And, predicting the movement of plumes of hazardous material released in urban industrial or transportation accidents requires detailed information about vertical and horizontal air motions in the street canyons. These challenges are closer to being addressed because of advances in CFD modeling, the coupling of CFD models with models of indoor air motion and air quality, and the coupling of CFD models with mesoscale weather-prediction models. This paper will review some of the new knowledge and technologies that are being developed to meet these atmospheric-environment needs of our growing urban populations.
Using building information modeling to track and assess the structural condition of bridges.
DOT National Transportation Integrated Search
2016-08-01
National Bridge Inspection Standards do not require documenting damage locations during an inspection, but bridge evaluation provisions highlight the importance of it. When determining a safe load-carrying capacity of a bridge, damage location inform...
Commitment to Cybersecurity and Information Technology Governance: A Case Study and Leadership Model
ERIC Educational Resources Information Center
Curtis, Scipiaruth Kendall
2012-01-01
The continual emergence of technologies has infiltrated government and industry business infrastructures, requiring the reform of organizations and fragile network infrastructures. Emerging technologies necessitate countermeasures and a commitment to cybersecurity and information technology governance for an organization's survivability and sustainability.…
Section 4. The GIS Weasel User's Manual
Viger, Roland J.; Leavesley, George H.
2007-01-01
The GIS Weasel was designed to aid in the preparation of spatial information for input to lumped and distributed parameter hydrologic or other environmental models. The GIS Weasel provides geographic information system (GIS) tools to help create maps of geographic features relevant to a user's model and to generate parameters from those maps. The operation of the GIS Weasel does not require the user to be a GIS expert, only that the user have an understanding of the spatial information requirements of the environmental simulation model being used. The GIS Weasel software system uses a GIS-based graphical user interface (GUI), the C programming language, and external scripting languages. The software will run on any computing platform where ArcInfo Workstation (version 8.0.2 or later) and the GRID extension are accessible. The user controls the processing of the GIS Weasel by interacting with menus, maps, and tables. The purpose of this document is to describe the operation of the software. This document is not intended to describe the usage of this software in support of any particular environmental simulation model. Such guides are published separately.
Optimal design of focused experiments and surveys
NASA Astrophysics Data System (ADS)
Curtis, Andrew
1999-10-01
Experiments and surveys are often performed to obtain data that constrain some previously underconstrained model. Often, constraints are most desired in a particular subspace of model space. Experiment design optimization requires that the quality of any particular design can be both quantified and then maximized. This study shows how the quality can be defined such that it depends on the amount of information that is focused in the particular subspace of interest. In addition, algorithms are presented which allow one particular focused quality measure (from the class of focused measures) to be evaluated efficiently. A subclass of focused quality measures is also related to the standard variance and resolution measures from linearized inverse theory. The theory presented here requires that the relationship between model parameters and data can be linearized around a reference model without significant loss of information. Physical and financial constraints define the space of possible experiment designs. Cross-well tomographic examples are presented, plus a strategy for survey design to maximize information about linear combinations of parameters such as the bulk modulus, κ = λ + 2μ/3.
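One plausible focused measure, sketched below for illustration, scores a candidate design by the linearized (Gauss-Newton) information it places in the subspace of interest; this is an assumption-laden stand-in, not the paper's exact algorithm, and G and P are generic placeholders.

```python
import numpy as np

def focused_quality(G, P):
    """Score a design by information focused in a model subspace.
    G: (n_data, n_params) linearized forward operator for the design.
    P: (n_params, k) columns spanning the subspace of interest."""
    Q, _ = np.linalg.qr(P)           # orthonormal basis for the subspace
    info = G.T @ G                   # linearized information matrix
    return np.trace(Q.T @ info @ Q)  # information projected onto subspace

# Compare two candidate survey geometries G1, G2 for the same target
# combination, e.g. a column vector P encoding lambda + 2*mu/3.
```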
NASA Astrophysics Data System (ADS)
Aksenova, Olesya; Pachkina, Anna
2017-11-01
The article deals with the need to transform the educational process to meet the requirements of the modern mining industry: cooperatively developing new educational programs and implementing the educational process with modern manufacturing capabilities in mind. The paper argues for introducing, into the training of mining professionals, the study of three-dimensional models of the surface technological complex, ore reserves and the underground digging complex, as well as the creation of these models in different graphic editors and work with the information analysis model obtained on the basis of these three-dimensional models. The paper covers the technological process of manless coal mining at the Polysaevskaya mine, controlled by information analysis models built on the basis of three-dimensional models of individual objects and of the technological process as a whole, which requires staff able to use three-dimensional positioning software within the global frame of reference of miners and equipment.
Information Model Translation to Support a Wider Science Community
NASA Astrophysics Data System (ADS)
Hughes, John S.; Crichton, Daniel; Ritschel, Bernd; Hardman, Sean; Joyner, Ronald
2014-05-01
The Planetary Data System (PDS), NASA's long-term archive for solar system exploration data, has just released PDS4, a modernization of the PDS architecture, data standards, and technical infrastructure. This next generation system positions the PDS to meet the demands of the coming decade, including big data, international cooperation, distributed nodes, and multiple ways of analysing and interpreting data. It also addresses three fundamental project goals: providing more efficient data delivery by data providers to the PDS, enabling a stable, long-term usable planetary science data archive, and enabling services for the data consumer to find, access, and use the data they require in contemporary data formats. The PDS4 information architecture is used to describe all PDS data using a common model. Captured in an ontology modeling tool it supports a hierarchy of data dictionaries built to the ISO/IEC 11179 standard and is designed to increase flexibility, enable complex searches at the product level, and to promote interoperability that facilitates data sharing both nationally and internationally. A PDS4 information architecture design requirement stipulates that the content of the information model must be translatable to external data definition languages such as XML Schema, XMI/XML, and RDF/XML. To support the semantic Web standards we are now in the process of mapping the contents into RDF/XML to support SPARQL capable databases. We are also building a terminological ontology to support virtually unified data retrieval and access. This paper will provide an overview of the PDS4 information architecture focusing on its domain information model and how the translation and mapping are being accomplished.
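To show what the RDF/XML mapping step might look like in practice, here is a small sketch using the rdflib package; the namespace URI and property names are hypothetical placeholders, since the actual PDS4 RDF vocabulary is defined by the PDS itself.

```python
from rdflib import Graph, Literal, Namespace, RDF, URIRef

# Hypothetical vocabulary; the real PDS4 terms come from the PDS ontology
PDS = Namespace("http://example.org/pds4/")

g = Graph()
product = URIRef("http://example.org/pds4/product/urn-nasa-pds-example")
g.add((product, RDF.type, PDS.Product_Observational))
g.add((product, PDS.targetName, Literal("Mars")))
g.add((product, PDS.startDateTime, Literal("2014-05-01T00:00:00Z")))

# RDF/XML serialization, loadable into a SPARQL-capable triple store
print(g.serialize(format="xml"))
```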
ModSAF-based development of operational requirements for light armored vehicles
NASA Astrophysics Data System (ADS)
Rapanotti, John; Palmarini, Marc
2003-09-01
Light Armoured Vehicles (LAVs) are being developed to meet the modern requirements of rapid deployment and operations other than war. To achieve these requirements, passive armour is minimized and survivability depends more on sensors, computers, countermeasures and communications to detect and avoid threats. The performance, reliability, and ultimately the cost of these systems will be determined by the technology trends and the rates at which they mature. Defining vehicle requirements will depend upon an accurate assessment of these trends over a longer term than was previously needed. Modelling and simulation are being developed to study these long-term trends and how they contribute to establishing vehicle requirements. ModSAF is being developed for research and development, in addition to the original requirement of Simulation and Modelling for Acquisition, Rehearsal, Requirements and Training (SMARRT), and is becoming useful as a means for transferring technology to other users, researchers and contractors. This procedure eliminates the need to construct ad hoc models and databases. The integration of various technologies into a Defensive Aids Suite (DAS) can be designed and analyzed by combining field trials and laboratory data with modelling and simulation. ModSAF (Modular Semi-Automated Forces) is used to construct the virtual battlefield and, through scripted input files, a "fixed battle" approach is used to define and implement contributions from three different sources. These contributions include: models of technology and natural phenomena from scientists and engineers, tactics and doctrine from the military and detailed analyses from operations research. This approach ensures the modelling of processes known to be important regardless of the level of information available about the system. Survivability of DAS-equipped vehicles based on future and foreign technology can be investigated by ModSAF and assessed relative to a test vehicle. A vehicle can be modelled phenomenologically until more information is available. These concepts and approach will be discussed in the paper.
Toward a patient-centric medical information model: issues and challenges for US adoption.
Lorence, Daniel; Monatesti, Sabatini; Margenthaler, Robert; Hoadley, Ellen
2005-01-01
As the USA moves, incrementally, toward evidence-based medicine, there is growing awareness of the importance of innovation in information management. Mandates for change include improved use of resources, accelerated diffusion of knowledge and an advanced consumer role. Key among these requirements is the need for a fundamentally different patient information recording system. Within the challenges identified in the most recent national health information technology initiative, we propose a model for an electronic, patient-centric medical information infrastructure, highlighting a transportable, scalable and integrated resource. We identify resources available for technology transfer, promoting consumers as integral parts of the collaborative medical decision-making process.
NASA Astrophysics Data System (ADS)
Bezawada, Rajesh; Uijt de Haag, Maarten
2010-04-01
This paper discusses the results of an initial evaluation study of hazard and integrity monitor functions for use with integrated alerting and notification. The Hazard and Integrity Monitor (HIM) (i) allocates information sources within the Integrated Intelligent Flight Deck (IIFD) to required functionality (like conflict detection and avoidance) and determines required performance of these information sources as part of that function; (ii) monitors or evaluates the required performance of the individual information sources and performs consistency checks among various information sources; (iii) integrates the information to establish tracks of potential hazards that can be used for the conflict probes or conflict prediction for various time horizons including the 10, 5, 3, and <3 minutes used in our scenario; (iv) detects and assesses the class of the hazard and provides possible resolutions. The HIM monitors the operation-dependent performance parameters related to the potential hazards in a manner similar to the Required Navigation Performance (RNP). Various HIM concepts have been implemented and evaluated using a previously developed sensor simulator/synthesizer. Within the simulation framework, various inputs to the IIFD and its subsystems are simulated, synthesized from actual collected data, or played back from actual flight test sensor data. The framework and HIM functions are implemented in Simulink®, a modeling language developed by The Mathworks™. This modeling language allows for test and evaluation of various sensor and communication link configurations as well as the inclusion of feedback from the pilot on the performance of the aircraft.
A study on building data warehouse of hospital information system.
Li, Ping; Wu, Tao; Chen, Mu; Zhou, Bin; Xu, Wei-guo
2011-08-01
Existing hospital information systems with simple statistical functions cannot meet current management needs. It is well known that hospital resources are distributed among hospitals with private property rights, as in the case of the regional coordination of medical services. In this study, to integrate and make full use of medical data effectively, we propose a data warehouse modeling method for the hospital information system. The method can also be employed for a distributed-hospital medical service system. To ensure that hospital information supports the diverse needs of health care, the framework of the hospital information system has three layers: datacenter layer, system-function layer, and user-interface layer. This paper discusses the role of a data warehouse management system in handling hospital information, from the establishment of the data theme to the design of a data model to the establishment of a data warehouse. Online analytical processing tools assist user-friendly multidimensional analysis from a number of different angles to extract the required data and information. Use of the data warehouse improves online analytical processing and mitigates deficiencies in the decision support system. The hospital information system based on a data warehouse effectively employs statistical analysis and data mining technology to handle massive quantities of historical data, and summarizes clinical and hospital information for decision making. This paper proposes the use of a data warehouse for a hospital information system, specifically a data warehouse for the theme of hospital information: determining dimensions, modeling, and so on. The processing of patient information is given as an example that demonstrates the usefulness of this method for hospital information management. Data warehouse technology is an evolving technology, and extracting more decision support information through data mining and decision-making technology will require further research.
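A minimal sketch of the theme-oriented warehouse idea: a fact table of patient encounters with dimension tables supporting OLAP-style roll-ups. All table and column names are illustrative, not from the paper; SQLite is used only to keep the example self-contained.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Dimension tables chosen for the 'hospital encounter' theme
CREATE TABLE dim_department (dept_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INT, month INT);
-- Fact table: one row per encounter, keyed to the dimensions
CREATE TABLE fact_encounter (
    encounter_id INTEGER PRIMARY KEY,
    dept_id INTEGER REFERENCES dim_department(dept_id),
    date_id INTEGER REFERENCES dim_date(date_id),
    cost REAL
);
""")

# A typical multidimensional (OLAP-style) query: cost by department/month
rows = conn.execute("""
SELECT d.name, t.year, t.month, SUM(f.cost)
FROM fact_encounter f
JOIN dim_department d ON f.dept_id = d.dept_id
JOIN dim_date t ON f.date_id = t.date_id
GROUP BY d.name, t.year, t.month
""").fetchall()
```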
Field Markup Language: biological field representation in XML.
Chang, David; Lovell, Nigel H; Dokos, Socrates
2007-01-01
With an ever increasing number of biological models available on the internet, a standardized modeling framework is required to allow information to be accessed or visualized. Based on the Physiome Modeling Framework, the Field Markup Language (FML) is being developed to describe and exchange field information for biological models. In this paper, we describe the basic features of FML, its supporting application framework and its ability to incorporate CellML models to construct tissue-scale biological models. As a typical application example, we present a spatially-heterogeneous cardiac pacemaker model which utilizes both FML and CellML to describe and solve the underlying equations of electrical activation and propagation.
Innovative Organization of Project Activity of Construction Students
NASA Astrophysics Data System (ADS)
Stolbova, I. D.; Aleksandrova, E. P.; Krainova, M. N.
2017-11-01
The competitiveness of the construction industry depends on its adoption of information modeling technologies, which requires the training and development of human resources. The advantages of BIM technologies are considered, and the requirements for specialists capable of promoting information modeling technologies in the construction industry are discussed. For wide application of BIM technologies, the problem of training personnel with a new way of thinking must be solved. When preparing graduates of the major “Construction”, it is necessary to introduce innovative educational technologies aimed at building the students' ability for team work, competences in the field of modern information and communication technologies, and design skills based on spatial modeling. Graphic training is the first professionally oriented discipline for construction students. In this context, it is important to create a learning environment that is close to a professional one. The paper provides examples of practice-oriented assignments based on the project method in the course of students' independent work.
de Lusignan, S; Krause, P; Michalakidis, G; Vicente, M Tristan; Thompson, S; McGilchrist, M; Sullivan, F; van Royen, P; Agreus, L; Desombre, T; Taweel, A; Delaney, B
2012-01-01
To perform a requirements analysis of the barriers to conducting research linking primary care, genetic and cancer data, we extended our initial data-centric approach to include socio-cultural and business requirements. We created reference models of core data requirements common to most studies using unified modelling language (UML), dataflow diagrams (DFD) and business process modelling notation (BPMN). We conducted a stakeholder analysis and constructed DFD and UML diagrams for use cases based on simulated research studies. We used research output as a sensitivity analysis. Differences between the reference model and use cases identified study-specific data requirements. The stakeholder analysis identified tensions, changes in specification, some indifference from data providers, and enthusiastic informaticians urging inclusion of socio-cultural context. We identified requirements to collect information at three levels: micro (data items, which need to be semantically interoperable), meso (the medical record and data extraction), and macro (the health system and socio-cultural issues). BPMN clarified complex business requirements among data providers and vendors, and additional geographical requirements for patients to be represented in both linked datasets. High quality research output was the norm for most repositories. Reference models provide high-level schemata of the core data requirements; however, business requirements modelling identifies stakeholder issues and what needs to be addressed to enable participation.
McBride, Sebastian; Huelse, Martin; Lee, Mark
2013-01-01
Computational visual attention systems have been constructed in order for robots and other devices to detect and locate regions of interest in their visual world. Such systems often attempt to take account of what is known of the human visual system and employ concepts, such as ‘active vision’, to gain various perceived advantages. However, despite the potential for gaining insights from such experiments, the computational requirements for visual attention processing are often not clearly presented from a biological perspective. This was the primary objective of this study, attained through two specific phases of investigation: 1) conceptual modeling of a top-down-bottom-up framework through critical analysis of the psychophysical and neurophysiological literature, 2) implementation and validation of the model into robotic hardware (as a representative of an active vision system). Seven computational requirements were identified: 1) transformation of retinotopic to egocentric mappings, 2) spatial memory for the purposes of medium-term inhibition of return, 3) synchronization of ‘where’ and ‘what’ information from the two visual streams, 4) convergence of top-down and bottom-up information to a centralized point of information processing, 5) a threshold function to elicit saccade action, 6) a function to represent task relevance as a ratio of excitation and inhibition, and 7) derivation of excitation and inhibition values from object-associated feature classes. The model provides further insight into the nature of data representation and transfer between brain regions associated with the vertebrate ‘active’ visual attention system. In particular, the model lends strong support to the functional role of the lateral intraparietal region of the brain as a primary area of information consolidation that directs putative action through the use of a ‘priority map’. PMID:23437044
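A toy numpy sketch (an editorial illustration, not the authors' implementation) of requirements 4 through 6 above: top-down task relevance is formed as an excitation/inhibition ratio, converged with bottom-up salience into a priority map, and thresholded to elicit a saccade. All array shapes and the threshold value are assumptions.

```python
import numpy as np

def priority_map(bottom_up, top_down, inhibition, threshold=0.8):
    """Combine salience maps and decide whether to saccade.
    bottom_up, top_down, inhibition: same-shape non-negative 2-D arrays."""
    relevance = top_down / (inhibition + 1e-6)   # req. 6: excitation/inhibition
    priority = bottom_up * relevance             # req. 4: convergence of streams
    priority = priority / (priority.max() + 1e-12)
    target = np.unravel_index(np.argmax(priority), priority.shape)
    saccade = priority[target] >= threshold      # req. 5: threshold function
    return priority, target, saccade
```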
NASA Astrophysics Data System (ADS)
Sidi, Fatimah; Daud, Maslina; Ahmad, Sabariah; Zainuddin, Naqliyah; Anneisa Abdullah, Syafiqa; Jabar, Marzanah A.; Suriani Affendey, Lilly; Ishak, Iskandar; Sharef, Nurfadhlina Mohd; Zolkepli, Maslina; Nur Majdina Nordin, Fatin; Amat Sejani, Hashimah; Ramadzan Hairani, Saiful
2017-09-01
Information security has been identified by organizations as part of internal operations that need to be well implemented and protected. This is because organizations face a growing number of threats to their networks and services each day, which leads to information security issues. Thus, effective information security management is required in order to protect their information assets. Threat profiling is a method that an organization can use to address these security challenges. Threat profiling allows analysts to understand and organize intelligence information related to threat groups. This paper presents a comparative analysis that was conducted to study the existing threat profiling models. It was found that existing threat models were constructed based on specific objectives, and thus each model is limited to only certain components or factors, such as assets, threat sources, countermeasures, threat agents, threat outcomes and threat actors. It is suggested that threat profiling can be improved by combining the components found in each existing threat profiling model/framework. The proposed model can be used by an organization in executing a proactive approach to incident management.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-18
...The Department of Transportation (DOT) invites public comments about our intention to request the Office of Management and Budget (OMB) approval for a renewal of an information collection. The collection involves vehicle manufacturers submitting reports to the Secretary of Transportation on whether a manufacturer will comply with an applicable average fuel economy standard for the model year for which the report is made, the actions a manufacturer has taken or intends to take to comply with the standard, and other information the Secretary requires by regulation. The information to be collected is necessary because of the requirements of 49 U.S.C. 32902. We are required to publish this notice in the Federal Register by the Paperwork Reduction Act of 1995, Public Law 104-13.
DLA Systems Modernization Methodology: Logical Analysis and Design Procedures
1990-07-01
Information Requirement would have little meaning and thus would lose its value. ... 1.1.3 INPUT PRODUCTS: 1.1.3.1 Enterprise Model Objective List; 1.1.3.2 ... at the same time, the attribute is said to be multi-valued. For example, an E-R model may contain information on the languages an employee speaks ... Relationship model is examined in detail to ensure that each data group contains attributes whose values are absolutely determined by their respective
A risk-based coverage model for video surveillance camera control optimization
NASA Astrophysics Data System (ADS)
Zhang, Hongzhou; Du, Zhiguo; Zhao, Xingtao; Li, Peiyue; Li, Dehua
2015-12-01
Visual surveillance systems for law enforcement or police case investigation differ from traditional applications because they are designed to monitor pedestrians, vehicles, or potential accidents. In the present work, visual surveillance risk is defined as the uncertainty of the visual information about the targets and events being monitored, and risk entropy is introduced to model the requirements that a police surveillance task places on the quality and quantity of video information. The proposed coverage model is applied to calculate the preset FoV position of a PTZ camera.
Privacy-preserving periodical publishing for medical information
NASA Astrophysics Data System (ADS)
Jin, Hua; Ju, Shi-guang; Liu, Shan-cheng
2013-07-01
Existing privacy-preserving publishing models, whether static or dynamic, cannot meet the requirements of periodically publishing medical information. This paper presents a (k,l)-anonymity model that preserves individual associations, together with a principle based on ε-invariance groups for subsequent periodical publishing; the PKIA and PSIGI algorithms are designed for them. The proposed methods preserve more individual associations under privacy protection and achieve better publishing quality. Experiments confirm our theoretical results and the methods' practicability.
International Planetary Data Alliance (IPDA) Information Model
NASA Technical Reports Server (NTRS)
Hughes, John Steven; Beebe, R.; Guinness, E.; Heather, D.; Huang, M.; Kasaba, Y.; Osuna, P.; Rye, E.; Savorskiy, V.
2007-01-01
This document is the third deliverable of the International Planetary Data Alliance (IPDA) Archive Data Standards Requirements Identification project. The goal of the project is to identify a subset of the standards currently in use by NASA's Planetary Data System (PDS) that are appropriate for internationalization. As shown in the highlighted sections of Figure 1, the focus of this project is the Information Model component of the Data Architecture Standards, namely the object models, a data dictionary, and a set of data formats.
Habitat Suitability Index Models: Veery
Sousa, Patrick J.
1982-01-01
Habitat preferences and species characteristics of the veery (Catharus fuscescens) are described in this publication. It is one of a series of Habitat Suitability Index (HSI) models and was developed through an analysis of available scientific data on the habitat requirements of the veery. Habitat use information is presented in a review of the literature, followed by the development of an HSI model. The model is presented in three formats: graphic, word, and mathematical. Suitability index graphs quantify the species-habitat relationship. These data are synthesized into a model designed to provide information for use in impact assessment and habitat management.
A statistical framework for protein quantitation in bottom-up MS-based proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas
2009-08-15
Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and widespread, informative, missing data. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model for protein abundance in terms of peptide peak intensities, applicable to both label-based and label-free quantitation experiments. The model allows for both random and censoring missingness mechanisms and provides naturally for protein-level estimates and confidence measures. The model is also used to derive automated filtering and imputation routines. Three LC-MS datasets are used to illustrate the methods. Availability: The software has been made available in the open-source proteomics platform DAnTE (Polpitiya et al. (2008)) (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu
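One plausible reading of such a model, in notation invented here for illustration (the paper's exact parameterization may differ): the log peak intensity of peptide j of protein i in sample k is decomposed additively, with missingness mixing a random component and left-censoring below a detection limit.

```latex
% Hedged sketch, not the authors' exact formulation:
% y_{ijk}: log peak intensity; \mu_i: protein abundance;
% p_{ij}: peptide-specific effect; s_k: sample effect; \epsilon: residual error
y_{ijk} = \mu_i + p_{ij} + s_k + \epsilon_{ijk}, \qquad
\epsilon_{ijk} \sim \mathcal{N}(0, \sigma^2)
% Missingness: the probability of observing y_{ijk} decreases as it falls
% below a censoring threshold c, plus a small random-dropout component.
```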
McElreath, Richard; Bell, Adrian V; Efferson, Charles; Lubell, Mark; Richerson, Peter J; Waring, Timothy
2008-11-12
The existence of social learning has been confirmed in diverse taxa, from apes to guppies. In order to advance our understanding of the consequences of social transmission and the evolution of behaviour, however, we require statistical tools that can distinguish among diverse social learning strategies. In this paper, we advance two main ideas. First, social learning is diverse, in the sense that individuals can take advantage of different kinds of information and combine them in different ways. Examining learning strategies for different information conditions illuminates the more detailed design of social learning. We construct and analyse an evolutionary model of diverse social learning heuristics, in order to generate predictions and illustrate the impact of design differences on an organism's fitness. Second, in order to eventually escape the laboratory and apply social learning models to natural behaviour, we require statistical methods that do not depend upon tight experimental control. Therefore, we examine strategic social learning in an experimental setting in which the social information itself is endogenous to the experimental group, as it is in natural settings. We develop statistical models for distinguishing among different strategic uses of social information. The experimental data strongly suggest that most participants employ a hierarchical strategy that uses both the average observed pay-offs of options and frequency information, the same model predicted by our evolutionary analysis to dominate a wide range of conditions.
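For concreteness, one standard way to formalize such a hierarchical strategy (notation invented here; the paper's own specification may differ) mixes a pay-off-biased component with a conformist, frequency-dependent component:

```latex
% A_k: running average pay-off of option k; n_k: observed choices of k;
% \beta: pay-off sensitivity; f: conformity exponent;
% \gamma: weight on frequency information relative to pay-off information.
\Pr(\text{choose } k) = (1-\gamma)\,\frac{\exp(\beta A_k)}{\sum_j \exp(\beta A_j)}
  + \gamma\,\frac{n_k^{\,f}}{\sum_j n_j^{\,f}}
```

With f > 1 this reproduces conformist transmission (disproportionately copying the majority option), while setting the frequency weight to zero recovers purely pay-off-based learning.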
NASA Technical Reports Server (NTRS)
Lewandowski, B. E.; DeWitt, J. K.; Gallo, C. A.; Gilkey, K. M.; Godfrey, A. P.; Humphreys, B. T.; Jagodnik, K. M.; Kassemi, M.; Myers, J. G.; Nelson, E. S.;
2017-01-01
MOTIVATION: Spaceflight countermeasures mitigate the harmful effects of the space environment on astronaut health and performance. Exercise has historically been used as a countermeasure to physical deconditioning, and additional countermeasures including lower body negative pressure, blood flow occlusion and artificial gravity are being researched as countermeasures to spaceflight-induced fluid shifts. The NASA Digital Astronaut Project uses computational models of physiological systems to inform countermeasure design and to predict countermeasure efficacy. OVERVIEW: Computational modeling supports the development of the exercise devices that will be flown on NASA's new exploration crew vehicles. Biomechanical modeling is used to inform design requirements to ensure that exercises can be properly performed within the volume allocated for exercise and to determine whether the limited mass, volume and power requirements of the devices will affect biomechanical outcomes. Models of muscle atrophy and bone remodeling can predict device efficacy for protecting musculoskeletal health during long-duration missions. A lumped-parameter whole-body model of the fluids within the body, which includes the blood within the cardiovascular system, the cerebral spinal fluid, interstitial fluid and lymphatic system fluid, estimates compartmental changes in pressure and volume due to gravitational changes. These models simulate fluid shift countermeasure effects and predict the associated changes in tissue strain in areas of physiological interest to aid in predicting countermeasure effectiveness. SIGNIFICANCE: Development and testing of spaceflight countermeasure prototypes are resource-intensive efforts. Computational modeling can supplement this process by performing simulations that reduce the amount of necessary experimental testing. Outcomes of the simulations are often important for the definition of design requirements and the identification of factors essential in ensuring countermeasure efficacy.
BIM and IoT: A Synopsis from GIS Perspective
NASA Astrophysics Data System (ADS)
Isikdag, U.
2015-10-01
Internet-of-Things (IoT) focuses on enabling communication between all devices and things that exist in real life or are virtual. Building Information Models (BIMs) and Building Information Modelling have been buzzwords of the construction industry for the last 15 years. BIMs emerged as a result of a push by the software companies to tackle the problems of inefficient information exchange between different software and to enable true interoperability. In the BIM approach, the most up-to-date and accurate models of a building are stored in shared central databases during the design and construction of a project and at post-construction stages. GIS-based city monitoring and city management applications require the fusion of information acquired from multiple resources: BIMs, city models and sensors. This paper focuses on providing a method for facilitating the GIS-based fusion of information residing in digital building "Models" and information acquired from the city objects, i.e. "Things". Once this information fusion is accomplished, many fields ranging from Emergency Response, Urban Surveillance and Urban Monitoring to Smart Buildings will have potential benefits.
NASA Astrophysics Data System (ADS)
Pascoe, Charlotte; Lawrence, Bryan; Moine, Marie-Pierre; Ford, Rupert; Devine, Gerry
2010-05-01
The EU METAFOR Project (http://metaforclimate.eu) has created a web-based model documentation questionnaire to collect metadata from the modelling groups that are running simulations in support of the Coupled Model Intercomparison Project - 5 (CMIP5). The CMIP5 model documentation questionnaire will retrieve information about the details of the models used, how the simulations were carried out, how the simulations conformed to the CMIP5 experiment requirements and details of the hardware used to perform the simulations. The metadata collected by the CMIP5 questionnaire will allow CMIP5 data to be compared in a scientifically meaningful way. This paper describes the life-cycle of the CMIP5 questionnaire development, which starts with relatively unstructured input from domain specialists and ends with formal XML documents that comply with the METAFOR Common Information Model (CIM). Each development step is associated with a specific tool. (1) Mind maps are used to capture information requirements from domain experts and build a controlled vocabulary, (2) a python parser processes the XML files generated by the mind maps, (3) Django (python) is used to generate the dynamic structure and content of the web-based questionnaire from the processed XML and the METAFOR CIM, (4) python parsers ensure that information entered into the CMIP5 questionnaire is output as CIM-compliant XML, (5) CIM-compliant output allows automatic information capture tools to harvest questionnaire content into databases such as the Earth System Grid (ESG) metadata catalogue. This paper will focus on how Django (python) and XML input files are used to generate the structure and content of the CMIP5 questionnaire. It will also address how the choice of development tools listed above provided a framework that enabled working scientists (who would never ordinarily interact with UML and XML) to be part of the iterative development process and ensure that the CMIP5 model documentation questionnaire reflects what scientists want to know about the models. Keywords: metadata, CMIP5, automatic information capture, tool development
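A minimal sketch of step (2) of the pipeline described above, assuming a FreeMind-style mind-map XML export with nested <node TEXT="..."> elements (the actual METAFOR export format is not specified in the abstract):

```python
import xml.etree.ElementTree as ET

def extract_vocabulary(mindmap_path):
    """Walk a mind-map XML export and collect each term's child terms,
    seeding a controlled vocabulary for questionnaire fields."""
    tree = ET.parse(mindmap_path)
    vocabulary = {}
    for node in tree.iter("node"):          # assumed element name
        term = node.get("TEXT")             # assumed attribute name
        if term:
            vocabulary[term] = [c.get("TEXT") for c in node.findall("node")]
    return vocabulary

# A dict like {"OceanModel": ["ResolutionX", "ResolutionY", ...]} could then
# drive dynamic Django form generation, one field per vocabulary term.
```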
Assessing FAÇADE Visibility in 3d City Models for City Marketing
NASA Astrophysics Data System (ADS)
Albrecht, F.; Moser, J.; Hijazi, I.
2013-08-01
In city marketing, different applications require evaluating the visual impression that displays in the urban environment make on people who visit the city. This research therefore focuses on how visual displays on façades for movie performances are perceived during a cultural event organized for city marketing. We describe the visibility analysis methods that are applicable to the analysis of façades; the methods originate from the domains of Geographic Information Science, architecture and computer graphics. A detailed scenario is described in order to perform a requirements analysis identifying the requirements on visibility information, which needs to describe the visual perception of displays on façades adequately. The requirements are compared to the visibility information that the visibility methods can provide. A discussion of the comparison summarizes the advantages and disadvantages of existing visibility analysis methods for describing the visibility of façades. The results show that some of the reviewed approaches can support the requirements on visibility information, but they also show that for complete support of the entire analysis workflow, unsolved workflow integration issues remain.
The use of analytical models in human-computer interface design
NASA Technical Reports Server (NTRS)
Gugerty, Leo
1993-01-01
Recently, a large number of human-computer interface (HCI) researchers have investigated building analytical models of the user, which are often implemented as computer models. These models simulate the cognitive processes and task knowledge of the user in ways that allow a researcher or designer to estimate various aspects of an interface's usability, such as when user errors are likely to occur. This information can lead to design improvements. Analytical models can supplement design guidelines by providing designers rigorous ways of analyzing the information-processing requirements of specific tasks (i.e., task analysis). These models offer the potential of improving early designs and replacing some of the early phases of usability testing, thus reducing the cost of interface design. This paper describes some of the many analytical models that are currently being developed and evaluates the usefulness of analytical models for human-computer interface design. This paper will focus on computational, analytical models, such as the GOMS model, rather than less formal, verbal models, because the more exact predictions and task descriptions of computational models may be useful to designers. The paper also discusses some of the practical requirements for using analytical models in complex design organizations such as NASA.
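As an illustration of the kind of estimate such models produce, below is a minimal sketch of a Keystroke-Level Model (KLM) calculation, one of the simplest members of the GOMS family; the operator times are the commonly cited textbook values and the task sequence is invented:

```python
# Commonly cited KLM operator times (seconds), after Card, Moran & Newell
OPERATOR_TIMES = {
    "K": 0.28,  # keystroke (average skilled typist)
    "P": 1.10,  # point at a target with the mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_estimate(sequence):
    """Estimate task execution time from a string of KLM operators."""
    return sum(OPERATOR_TIMES[op] for op in sequence)

# Hypothetical task: think, point at a field, home to keyboard, type 5 chars
print(klm_estimate("MPH" + "K" * 5))  # -> 4.25 seconds
```

Summing per-operator times over a task sequence is exactly the kind of rigorous, repeatable task analysis the paper describes: a designer can compare two candidate interfaces before any usability testing takes place.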
Generic Sensor Failure Modeling for Cooperative Systems.
Jäger, Georg; Zug, Sebastian; Casimiro, António
2018-03-20
The advent of cooperative systems entails a dynamic composition of their components. As this contrasts with current, statically composed systems, new approaches for maintaining their safety are required. In that endeavor, we propose an integration step that evaluates the failure model of shared information in relation to an application's fault tolerance and thereby promises maintainability of such a system's safety. However, it also poses new requirements on failure models, which are not fulfilled by state-of-the-art approaches. Consequently, this work presents a mathematically defined generic failure model as well as a processing chain for automatically extracting such failure models from empirical data. By examining data of a Sharp GP2D12 distance sensor, we show that the generic failure model not only fulfills the predefined requirements, but also models failure characteristics appropriately when compared to traditional techniques.
Code of Federal Regulations, 2010 CFR
2010-10-01
... insurance costs for different makes and models of passenger motor vehicles based upon differences in damage susceptibility and crashworthiness, pursuant to section 201(e) of the Motor Vehicle Information and Cost Savings... OF TRANSPORTATION (CONTINUED) INSURANCE COST INFORMATION REGULATION § 582.1 Scope. This part requires...
A Practical Model for Forecasting New Freshman Enrollment during the Application Period.
ERIC Educational Resources Information Center
Paulsen, Michael B.
1989-01-01
A simple and effective model for forecasting freshman enrollment during the application period is presented step by step. The model requires minimal and readily available information, uses a simple linear regression analysis on a personal computer, and provides updated monthly forecasts. (MSE)
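A minimal sketch of the kind of regression the abstract describes; the predictor and all numbers are invented for illustration (the actual model uses whatever application counts the institution has on hand during the application period):

```python
import numpy as np

# Hypothetical historical data: applications received by the end of March
# and the final freshman enrollment for the same admission cycle.
applications = np.array([1200, 1350, 1100, 1500, 1400], dtype=float)
enrollment   = np.array([ 480,  540,  450,  610,  560], dtype=float)

# Ordinary least squares fit: enrollment ~= a * applications + b
a, b = np.polyfit(applications, enrollment, deg=1)

# Updated monthly forecast from the current cycle's count to date
print(a * 1300 + b)
```

Re-running the fit each month with the latest count is all that is needed to produce the updated forecasts the abstract mentions.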
NASA Astrophysics Data System (ADS)
McGibbney, L. J.; Hausman, J.; Laurencelle, J. C.; Toaz, R., Jr.; McAuley, J.; Freeborn, D. J.; Stoner, C.
2016-12-01
The Surface Water & Ocean Topography (SWOT) mission brings together two communities focused on a better understanding of the world's oceans and its terrestrial surface waters. U.S. and French oceanographers and hydrologists and international partners have joined forces to develop this new space mission. At NASA JPL's PO.DAAC, the team is currently engaged in the gathering of SWOT User Stories (access patterns, metadata requirements, primary and value-added product requirements, data access protocols, etc.) to better inform the adaptive planning of what will be known as the next-generation PO.DAAC Information Architecture (IA). The IA effort acknowledges that missions such as SWOT (and NISAR) have few or no precedent in terms of data volume, hot and cold storage, archival, analysis, existing system engineering complexities, etc., and that the only way we can better understand the projected impacts of such requirements is to interface directly with the user community. Additionally, it also acknowledges that collective learning has taken place to understand certain limitations in the existing data models (DM) underlying the existing PO.DAAC Data Management and Archival System. This work documents an evolutionary, use-case-based, standards-driven approach to adapting the legacy DM and accompanying knowledge representation infrastructure at NASA JPL's PO.DAAC to address forthcoming DAAC mission requirements presented by missions such as SWOT. Some of the topics covered in this evolution include, but are not limited to: How we are leveraging lessons learned from the development of existing DM (such as that generated for SMAP) in an attempt to map them to SWOT. What is the governance model for the SWOT IA? What are the 'governing' entities? What is the hierarchy of the 'governed' entities? How are elements grouped? How is the design working group formed? How is model independence maintained and what choices/requirements do we have for the implementation language? The use of standards such as CF Conventions, NetCDF, HDF and ISO Metadata, etc. Beyond SWOT: what choices were made such that the new PO.DAAC IA will be flexible enough and adequately designed so that future missions with even more advanced requirements can be accommodated within PO.DAAC.
Nursing Information Flow in Long-Term Care Facilities.
Wei, Quan; Courtney, Karen L
2018-04-01
Long-term care (LTC), residential care requiring 24-hour nursing services, plays an important role in the health care service delivery system. The purpose of this study was to identify the needed clinical information and information flow to support LTC Registered Nurses (RNs) in care collaboration and clinical decision making. This descriptive qualitative study combines direct observations and semistructured interviews, conducted at Alberta's LTC facilities between May 2014 and August 2015. The constant comparative method (CCM) of joint coding was used for data analysis. Nine RNs from six LTC facilities participated in the study. The RN practice environment includes two essential RN information management aspects: information resources and information spaces. Ten commonly used information resources by RNs included: (1) RN-personal notes; (2) facility-specific templates/forms; (3) nursing processes/tasks; (4) paper-based resident profile; (5) daily care plans; (6) RN-notebooks; (7) medication administration records (MARs); (8) reporting software application (RAI-MDS); (9) people (care providers); and (10) references (i.e., books). Nurses used a combination of shared information spaces, such as the Nurses Station or RN-notebook, and personal information spaces, such as personal notebooks or "sticky" notes. Four essential RN information management functions were identified: collection, classification, storage, and distribution. Six sets of information were necessary to perform RN care tasks and communication, including: (1) admission, discharge, and transfer (ADT); (2) assessment; (3) care plan; (4) intervention (with two subsets: medication and care procedure); (5) report; and (6) reference. Based on the RN information management system requirements, a graphic information flow model was constructed. This baseline study identified key components of a current LTC nursing information management system. The information flow model may assist health information technology (HIT) developers to consolidate the design of HIT solutions for LTC, and serve as a communication tool between nurses and information technology (IT) staff to refine requirements and support further LTC HIT research.
Using chemical organization theory for model checking
Kaleta, Christoph; Richter, Stephan; Dittrich, Peter
2009-01-01
Motivation: The increasing number and complexity of biomodels makes automatic procedures for checking the models' properties and quality necessary. Approaches like elementary mode analysis, flux balance analysis, deficiency analysis and chemical organization theory (OT) require only the stoichiometric structure of the reaction network for derivation of valuable information. In formalisms like the Systems Biology Markup Language (SBML), however, information about the stoichiometric coefficients required for an analysis of chemical organizations can be hidden in kinetic laws. Results: First, we introduce an algorithm that uncovers stoichiometric information that might be hidden in the kinetic laws of a reaction network. This allows us to apply OT to SBML models using modifiers. Second, using the new algorithm, we performed a large-scale analysis of the 185 models contained in the manually curated BioModels Database. We found that for 41 models (22%) the set of organizations changes when modifiers are considered correctly. We discuss one of these models in detail (BIOMD149, a combined model of the ERK- and Wnt-signaling pathways), whose set of organizations changes drastically when modifiers are considered. Third, we found inconsistencies in 5 models (3%) and identified their characteristics. Compared with flux-based methods, OT more accurately identifies [in 26 cases (14%)] the species and reactions that can be present in a long-term simulation of the model. We conclude that our approach is a valuable tool that helps to improve the consistency of biomodels and their repositories. Availability: All data and a Java applet for checking SBML models are available from http://www.minet.uni-jena.de/csb/prj/ot/tools Contact: dittrich@minet.uni-jena.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19468053
Predicting Energy Performance of a Net-Zero Energy Building: A Statistical Approach
Kneifel, Joshua; Webb, David
2016-01-01
Performance-based building requirements have become more prevalent because they give freedom in building design while still maintaining or exceeding the energy performance required by prescriptive-based requirements. In order to determine if building designs reach target energy efficiency improvements, it is necessary to estimate the energy performance of a building using predictive models and different weather conditions. Physics-based whole building energy simulation modeling is the most common approach. However, these physics-based models include underlying assumptions and require significant amounts of information in order to specify the input parameter values. An alternative approach to test the performance of a building is to develop a statistically derived predictive regression model using post-occupancy data that can accurately predict energy consumption and production based on a few common weather-based factors, thus requiring less information than simulation models. A regression model based on measured data should be able to predict energy performance of a building for a given day as long as the weather conditions are similar to those during the data collection time frame. This article uses data from the National Institute of Standards and Technology (NIST) Net-Zero Energy Residential Test Facility (NZERTF) to develop and validate a regression model to predict the energy performance of the NZERTF using two weather variables aggregated to the daily level, applies the model to estimate the energy performance of hypothetical NZERTFs located in different cities in the Mixed-Humid climate zone, and compares these estimates to the results from already existing EnergyPlus whole building energy simulations. This regression model exhibits agreement with EnergyPlus predictive trends in energy production and net consumption, but differs greatly in energy consumption. The model can be used as a framework for alternative and more complex models based on the experimental data collected from the NZERTF. PMID:27956756
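A minimal sketch of the statistical approach this abstract describes, assuming (as stated) two daily weather variables; the variable choices and data are invented for illustration:

```python
import numpy as np

# Hypothetical daily observations from the post-occupancy period:
# columns = [mean outdoor temperature (C), total solar irradiance (kWh/m^2)]
weather = np.array([[ 5.0, 2.1], [12.0, 4.3], [20.0, 5.8], [28.0, 6.0]])
net_energy = np.array([14.2, 6.5, -1.3, 4.8])  # kWh, consumption minus production

# Multiple linear regression with an intercept term
X = np.column_stack([np.ones(len(weather)), weather])
coef, *_ = np.linalg.lstsq(X, net_energy, rcond=None)

# Predict net energy for a new day with weather similar to the training range
new_day = np.array([1.0, 16.0, 5.0])
print(new_day @ coef)
```

As the abstract notes, such a fit is only trustworthy for days whose weather resembles the data collection period, which is why transferring it to other cities in the same climate zone is a meaningful test.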
A Conceptual Framework and Principles for Trusted Pervasive Health
Blobel, Bernd Gerhard; Seppälä, Antto Veikko; Sorvari, Hannu Olavi; Nykänen, Pirkko Anneli
2012-01-01
Background Ubiquitous computing technology, sensor networks, wireless communication and the latest developments of the Internet have enabled the rise of a new concept—pervasive health—which takes place in an open, unsecure, and highly dynamic environment (ie, in the information space). To be successful, pervasive health requires implementable principles for privacy and trustworthiness. Objective This research has two interconnected objectives. The first is to define pervasive health as a system and to understand its trust and privacy challenges. The second goal is to build a conceptual model for pervasive health and use it to develop principles and policies which can make pervasive health trustworthy. Methods In this study, a five-step system analysis method is used. Pervasive health is defined using a metaphor of digital bubbles. A conceptual framework model focused on trustworthiness and privacy is then developed for pervasive health. On that model, principles and rules for trusted information management in pervasive health are defined. Results In the first phase of this study, a new definition of pervasive health was created. Using this model, differences between pervasive health and health care are stated. Reviewed publications demonstrate that the widely used principles of predefined and static trust cannot guarantee trustworthiness and privacy in pervasive health. Instead, such an environment requires personal dynamic and context-aware policies, awareness, and transparency. A conceptual framework model focused on information processing in pervasive health is developed. Using features of pervasive health and relations from the framework model, new principles for trusted pervasive health have been developed. The principles propose that personal health data should be under control of the data subject. The person shall have the right to verify the level of trust of any system which collects or processes his or her health information. Principles require that any stakeholder or system collecting or processing health data must support transparency and shall publish its trust and privacy attributes and even its domain specific policies. Conclusions The developed principles enable trustworthiness and guarantee privacy in pervasive health. The implementation of principles requires new infrastructural services such as trust verification and policy conflict resolution. After implementation, the accuracy and usability of principles should be analyzed. PMID:22481297
Ancient Glass: A Literature Search and its Role in Waste Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strachan, Denis M.; Pierce, Eric M.
2010-07-01
When developing a performance assessment model for the long-term disposal of immobilized low-activity waste (ILAW) glass, it is desirable to determine the durability of glass forms over very long periods of time. However, testing is limited to short time spans, so experiments are performed under conditions that accelerate the key geochemical processes that control weathering. Verification that the models currently being used can reliably calculate the long-term behavior of ILAW glass is a key component of the overall PA strategy. Therefore, Pacific Northwest National Laboratory was contracted by Washington River Protection Solutions, LLC to evaluate alternative strategies that can be used for PA source term model validation. One viable alternative strategy is the use of independent experimental data from archaeological studies of ancient or natural glass contained in the literature. These results represent potential independent experiments dating back to approximately 3600 years ago, or 1600 before the current era (BCE), in the case of ancient glass, and to 10^6 years or older in the case of natural glass. The results of this literature review suggest that additional experimental data may be needed before the results from archaeological studies can be used as a tool for model validation of glass weathering and, more specifically, disposal facility performance. This is largely because none of the existing data sets contains all of the information required to conduct PA source term calculations. For example, in many cases the sediments surrounding the glass were not collected and analyzed; therefore, the data required to compare computer simulations of concentration flux are not available. This type of information is important to understanding the element release profile from the glass to the surrounding environment and provides a metric that can be used to calibrate source term models. Although useful, the available literature sources do not contain the information required to simulate the long-term performance of nuclear waste glasses in near-surface or deep geologic repositories. The information that will be required includes 1) experimental measurements to quantify the model parameters, 2) detailed analyses of altered glass samples, and 3) detailed analyses of the sediment surrounding the ancient glass samples.
ERIC Educational Resources Information Center
Morse, Emile L.; Schmidt, Heidi; Butter, Karen; Rider, Cynthia; Hickey, Thomas B.; O'Neill, Edward T.; Toves, Jenny; Green, Marlan; Soy, Sue; Gunn, Stan; Galloway, Patricia
2002-01-01
Includes four articles that discuss evaluation methods for information management systems under the Defense Advanced Research Projects Agency; building digital libraries at the University of California San Francisco's Tobacco Control Archives; IFLA's Functional Requirements for Bibliographic Records; and designing the Texas email repository model…
Planning Study to Establish DoD Manufacturing Technology Information Analysis Center.
1981-01-01
model for an MTIAC. Type of information inputs from potential MTIAC sources. Processing functions required to produce MTIAC outputs. ... short supply; energy conservation and concerns of energy intensiveness of various manufacturing processes and systems required for production of DOD ... not play a major role in the process of MT invention, innovation, or diffusion. MT productivity efforts for private industry are carried out by
A New Proof of the Expected Frequency Spectrum under the Standard Neutral Model.
Hudson, Richard R
2015-01-01
The sample frequency spectrum is an informative and frequently employed approach for summarizing DNA variation data. Under the standard neutral model, the expectation of the sample frequency spectrum has been derived by at least two distinct approaches. One relies on using results from diffusion approximations to the Wright-Fisher model. The other is based on Pólya urn models that correspond to the standard coalescent model. A new proof of the expected frequency spectrum is presented here. It is a proof by induction that requires neither diffusion results nor the somewhat complex sums and combinatorics of the derivations based on urn models.
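For reference, the classical result being re-derived can be stated compactly. Under the standard neutral model with scaled mutation rate θ, the expected number of segregating sites at which the derived allele appears in exactly i copies in a sample of n sequences is

```latex
\mathbb{E}[\xi_i] = \frac{\theta}{i}, \qquad i = 1, \dots, n-1
```

so the expected spectrum falls off as the reciprocal of the allele count, independently of the sample size n.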
Using a 3D CAD plant model to simplify process hazard reviews
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tolpa, G.
A Hazard and Operability (HAZOP) review is a formal predictive procedure used to identify potential hazard and operability problems associated with certain processes and facilities. The HAZOP procedure takes place several times during the life cycle of the facility. Replacing plastic models, layout and detail drawings with a 3D CAD electronic model provides access to process safety information and a detailed level of plant topology that approaches the visualization capability of the imagination. This paper describes the process that is used for adding the use of a 3D CAD model to flowsheets and proven computer programs for the conduct of hazard and operability reviews. Using flowsheets and study nodes as a road map for the review, the need for layout and other detail drawings is all but eliminated. Using the 3D CAD model again for a post-P&ID HAZOP supports conformance to layout and safety requirements, provides superior visualization of the plant configuration and preserves the owner's equity in the design. The response from the review teams is overwhelmingly in favor of this type of review over a review that uses only drawings. Over the long term, the plant model serves more than just process hazards analysis. Ongoing use of the model can satisfy the required access to process safety information, OSHA documentation and other legal requirements. In this paper, extensive instructions address the logic for the process hazards analysis and the preparation required to assist anyone who wishes to add the use of a 3D model to their review.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-27
... adequate time to digest public comments before it renders a decision. 60 FR 44983, Aug. 29, 1995. Therefore... times the amount of details of information are supplied to support why the customer believes the car..., specific drawings, reflectorization, engineering information such as test or modeling of components. Also...
Hierarchical models and bayesian analysis of bird survey information
John R. Sauer; William A. Link; J. Andrew Royle
2005-01-01
Summary of bird survey information is a critical component of conservation activities, but often our summaries rely on statistical methods that do not accommodate the limitations of the information. Prioritization of species requires ranking and analysis of species by magnitude of population trend, but often magnitude of trend is a misleading measure of actual decline...
Sensor/Response Coordination In A Tactical Self-Protection System
NASA Astrophysics Data System (ADS)
Steinberg, Alan N.
1988-08-01
This paper describes a model for integrating information acquisition functions into a response planner within a tactical self-defense system. This model may be used in defining requirements for sensor systems and for associated processing and control functions in such applications. The goal of information acquisition in a self-defense system is generally not to achieve the best possible estimate of the threat environment, but rather to provide resolution of that environment sufficient to support response decisions. We model the information acquisition problem as that of achieving a partition among possible world states such that the final partition maps into the system's repertoire of possible responses.
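A minimal sketch of that partition idea, with invented world states, observations and response policy: keep acquiring information only until every surviving hypothesis demands the same response.

```python
def resolution_sufficient(states, response_of):
    """True once all remaining world states map to a single response."""
    return len({response_of(s) for s in states}) == 1

# Hypothetical threat states and response policy (invented for illustration)
states = {"radar_A", "radar_B", "ir_seeker"}
response_of = lambda s: "chaff" if s.startswith("radar") else "flare"

# Each observation is a predicate that prunes inconsistent world states
observations = [lambda s: s != "ir_seeker"]  # e.g. an RF emission was detected

for observe in observations:
    if resolution_sufficient(states, response_of):
        break                     # no need for further sensor tasking
    states = {s for s in states if observe(s)}

print(states, "->", response_of(next(iter(states))))  # radar states -> chaff
```

The loop stops as soon as the partition cell containing the true state is fine enough to select a response, which is precisely the weaker requirement the paper contrasts with full threat identification.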
NASA Astrophysics Data System (ADS)
Gigante-Barrera, Ángel; Dindar, Serdar; Kaewunruen, Sakdirat; Ruikar, Darshan
2017-10-01
Railway turnouts are complex systems designed using complex geometries and grades, which makes them difficult to manage in terms of risk prevention. This feature poses a substantial peril to rail users, as it is considered a cause of derailment. In addition, derailment leads to financial losses due to operational downtime and to monetary compensation in case of death or injury. These are fundamental drivers for mitigating risks arising from poor risk management during design. Prevention through Design (PtD) is a process that introduces tacit knowledge from industry professionals during the design process. There is evidence that Building Information Modelling (BIM) can help to mitigate risk from the inception of the project. BIM is considered an Information System (IS) where tacit knowledge can be stored and retrieved from a digital database, making it easy to take prompt decisions, as information is ready to be analysed. BIM at the model element level entails working with 3D elements and embedded data, therefore adding a layer of complexity to the management of information along the different stages of the project and across different disciplines. In order to overcome this problem, the industry has created a framework for model progression specification named Level of Development (LOD). The paper presents an IDM-based framework for design risk mitigation through code validation using the LOD. This effort resulted in risk datasets which describe a rail turnout graphically and non-graphically as the model progresses, thus permitting their inclusion within risk information systems. The assignment of an LOD construct to a set of data requires specialised management and process-related expertise. Furthermore, the selection of a set of LOD constructs requires a purpose-based analysis. Therefore, a framework for LOD construct implementation within the IDM for code checking is required for the industry to progress in this particular field.
Integration of remote sensing based surface information into a three-dimensional microclimate model
NASA Astrophysics Data System (ADS)
Heldens, Wieke; Heiden, Uta; Esch, Thomas; Mueller, Andreas; Dech, Stefan
2017-03-01
Climate change urges cities to consider the urban climate as part of sustainable planning. Urban microclimate models can provide knowledge of the climate at building block level, but they require very detailed information on the area of interest; most microclimate studies therefore rely on assumptions and generalizations to describe the model area. Remote sensing data with area-wide coverage provide a means to derive many parameters at the detailed spatial and thematic scale required by urban climate models. This study shows how microclimate simulations for a series of real-world urban areas can be supported by remote sensing data. In an automated process, surface materials, albedo, LAI/LAD and object height have been derived and integrated into the urban microclimate model ENVI-met. Multiple microclimate simulations have been carried out, both with the dynamic remote-sensing-based input data and with manual, static input data, to analyze the impact of the remote-sensing-based surface information and the suitability of the applied data and techniques. The integration of the remote-sensing-based input data into ENVI-met is greatly eased by an automated processing chain, which avoids tedious manual editing and allows fast, area-wide generation of simulation areas. The analysis of the different modes shows the importance of high-quality height data, detailed surface material information and albedo.
Cultural Consensus Theory: Aggregating Continuous Responses in a Finite Interval
NASA Astrophysics Data System (ADS)
Batchelder, William H.; Strashny, Alex; Romney, A. Kimball
Cultural consensus theory (CCT) consists of cognitive models for aggregating responses of "informants" to test items about some domain of their shared cultural knowledge. This paper develops a CCT model for items requiring bounded numerical responses, e.g. probability estimates, confidence judgments, or similarity judgments. The model assumes that each item generates a latent random representation in each informant, with mean equal to the consensus answer and variance depending jointly on the informant and the location of the consensus answer. The manifest responses may reflect biases of the informants. Markov Chain Monte Carlo (MCMC) methods were used to estimate the model, and simulation studies validated the approach. The model was applied to an existing cross-cultural dataset involving native Japanese and English speakers judging the similarity of emotion terms. The results sharpened earlier studies that showed that both cultures appear to have very similar cognitive representations of emotion terms.
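One way to write the core of such a model, in notation invented here for illustration (the paper's own parameterization may differ): with z_k the consensus answer to item k on the unit interval, informant i's latent representation could be

```latex
X_{ik} = z_k + \epsilon_{ik}, \qquad
\epsilon_{ik} \sim \mathcal{N}\!\left(0,\; \sigma_i^{2}\, z_k (1 - z_k)\right)
```

where \sigma_i captures informant (un)reliability and the factor z_k(1-z_k) makes the noise depend on the location of the consensus answer, shrinking near the endpoints of the interval; a manifest response would then apply an informant-specific bias function to X_{ik}.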
Matrix models for the black hole information paradox
NASA Astrophysics Data System (ADS)
Iizuka, Norihiro; Okuda, Takuya; Polchinski, Joseph
2010-02-01
We study various matrix models with a charge-charge interaction as toy models of the gauge dual of the AdS black hole. These models show a continuous spectrum and power-law decay of correlators at late time and infinite N, implying information loss in this limit. At finite N, the spectrum is discrete and correlators have recurrences, so there is no information loss. We study these models by a variety of techniques, such as Feynman graph expansion, loop equations, and sums over Young tableaux, and we obtain explicitly the leading 1/N^2 corrections for the spectrum and correlators. These techniques are suggestive of possible dual bulk descriptions. At fixed order in 1/N^2 the spectrum remains continuous and no recurrence occurs, so information loss persists. However, the interchange of the long-time and large-N limits is subtle and requires further study.
Informing pedagogy through the brain-targeted teaching model.
Hardiman, Mariale
2012-01-01
Improving teaching to foster creative thinking and problem-solving for students of all ages will require two essential changes in current educational practice. First, to allow more time for deeper engagement with material, it is critical to reduce the vast number of topics often required in many courses. Second, and perhaps more challenging, is the alignment of pedagogy with recent research on cognition and learning. With a growing focus on the use of research to inform teaching practices, educators need a pedagogical framework that helps them interpret and apply research findings. This article describes the Brain-Targeted Teaching Model, a scheme that relates six distinct aspects of instruction to research from the neuro- and cognitive sciences.
Construction of language models for a handwritten mail reading system
NASA Astrophysics Data System (ADS)
Morillot, Olivier; Likforman-Sulem, Laurence; Grosicki, Emmanuèle
2012-01-01
This paper presents a system for the recognition of unconstrained handwritten mail. The main part of this system is an HMM recognizer which uses trigraphs to model contextual information. This recognition system does not require any segmentation into words or characters and works directly at line level. To take linguistic information into account and enhance performance, a language model is introduced. This language model is based on bigrams and built from training document transcriptions only. Different experiments with various vocabulary sizes and language models have been conducted. Word Error Rate and perplexity values are compared to show the value of specific language models fitted to the handwritten mail recognition task.
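A minimal sketch of building such a bigram language model from training transcriptions, with add-one smoothing as one simple choice (the paper does not specify its smoothing scheme):

```python
from collections import Counter

def train_bigram_lm(sentences):
    """Estimate P(w2 | w1) from tokenized training transcriptions."""
    unigrams, bigrams = Counter(), Counter()
    for words in sentences:
        tokens = ["<s>"] + words + ["</s>"]
        unigrams.update(tokens[:-1])
        bigrams.update(zip(tokens[:-1], tokens[1:]))
    vocab_size = len(set(unigrams) | {"</s>"})
    def prob(w1, w2):                       # add-one (Laplace) smoothing
        return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + vocab_size)
    return prob

prob = train_bigram_lm([["dear", "sir"], ["dear", "madam"]])
print(prob("dear", "sir"))  # 2/7 with this toy corpus
```

The perplexity values the abstract compares are then computed from these conditional probabilities over held-out transcriptions.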
General RMP Guidance - Chapter 4: Offsite Consequence Analysis
This chapter provides basic compliance information, not modeling methodologies, for people who plan to do their own air dispersion modeling. OCA is a required part of the risk management program, and involves worst-case and alternative release scenarios.
NASA Technical Reports Server (NTRS)
Pace, Dale K.
2000-01-01
A simulation conceptual model is a simulation developer's way of translating modeling requirements (i.e., what is to be represented by the simulation or its modification) into a detailed design framework (i.e., how it is to be done), from which the software, hardware, networks (in the case of distributed simulation), and systems/equipment that will make up the simulation can be built or modified. A conceptual model is the collection of information which describes a simulation developer's concept about the simulation and its pieces. That information consists of assumptions, algorithms, characteristics, relationships, and data. Taken together, these describe how the simulation developer understands what is to be represented by the simulation (entities, actions, tasks, processes, interactions, etc.) and how that representation will satisfy the requirements to which the simulation responds. Thus the conceptual model is the basis for judgment about simulation fidelity and validity for any condition that is not specifically tested. The more perspicuous and precise the conceptual model, the more likely it is that the simulation development will both fully satisfy requirements and allow demonstration that the requirements are satisfied (i.e., validation). Methods used in simulation conceptual model development have significant implications for simulation management and for assessment of simulation uncertainty. This paper suggests how to develop and document a simulation conceptual model so that the simulation fidelity and validity can be most effectively determined. These ideas for conceptual model development apply to all simulation varieties. The paper relates these ideas to uncertainty assessments as they relate to simulation fidelity and validity. The paper also explores implications for simulation management from conceptual model development methods, especially relative to reuse of simulation components.
NASA Technical Reports Server (NTRS)
Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.
1995-01-01
Computing architectures are being assembled that extend concurrent engineering practices by providing more efficient execution and collaboration on distributed, heterogeneous computing networks. Built on the successes of initial architectures, requirements for a next-generation design computing infrastructure can be developed. These requirements concentrate on those needed by a designer in decision-making processes from product conception to recycling and can be categorized in two areas: design process and design information management. A designer both designs and executes design processes throughout design time to achieve better product and process capabilities while expending fewer resources. In order to accomplish this, information, or more appropriately design knowledge, needs to be adequately managed during product and process decomposition as well as recomposition. A foundation has been laid that captures these requirements in a design architecture called DREAMS (Developing Robust Engineering Analysis Models and Specifications). In addition, a computing infrastructure, called IMAGE (Intelligent Multidisciplinary Aircraft Generation Environment), is being developed that satisfies design requirements defined in DREAMS and incorporates enabling computational technologies.
From Data to Knowledge: GEOSS experience and the GEOSS Knowledge Base contribution to the GCI
NASA Astrophysics Data System (ADS)
Santoro, M.; Nativi, S.; Mazzetti, P., Sr.; Plag, H. P.
2016-12-01
According to systems theory, data is raw: it simply exists and has no significance beyond its existence, while information is data that has been given meaning by way of relational connection. The appropriate collection of information, such that it contributes to understanding, is a process of knowledge creation. The Global Earth Observation System of Systems (GEOSS) developed by the Group on Earth Observations (GEO) is a set of coordinated, independent Earth observation, information and processing systems that interact and provide access to diverse information for a broad range of users in both public and private sectors. GEOSS links these systems to strengthen the monitoring of the state of the Earth. In the past ten years, the development of GEOSS has taught several lessons dealing with the need to move from (open) data to information and knowledge sharing. Advanced user-focused services require moving from a data-driven framework to a knowledge-sharing platform. Such a platform needs to manage information and knowledge, in addition to the datasets linked to them. For this scope, GEO has launched a specific task called the "GEOSS Knowledge Base", which deals with resources such as user requirements, Sustainable Development Goals (SDGs), observation and processing ontologies, publications, guidelines, best practices, business processes/algorithms, and definitions of advanced concepts like Essential Variables (EVs), indicators and strategic goals. In turn, information and knowledge (e.g. guidelines, best practices, user requirements, business processes, algorithms, etc.) can be used to generate additional information and knowledge from shared datasets. To fully utilize and leverage the GEOSS Knowledge Base, the current GEOSS Common Infrastructure (GCI) model will be extended and advanced to consider important concepts and implementation artifacts, such as data processing services and environmental/economic models as well as EVs, Primary Indicators, and SDGs. The new GCI model will link these concepts to the present dataset, observation and sensor concepts, enabling a set of very important new capabilities to be offered to GEOSS users.
Teacher Modeling Using Complex Informational Texts
ERIC Educational Resources Information Center
Fisher, Douglas; Frey, Nancy
2015-01-01
Modeling in complex texts requires that teachers analyze the text for factors of qualitative complexity and then design lessons that introduce students to that complexity. In addition, teachers can model the disciplinary nature of content area texts as well as word-solving and comprehension strategies. A planning guide for think-alouds is included.
Transferability of habitat suitability models for nesting woodpeckers associated with wildfire
Quresh S. Latif; Vicki Saab; Jeff P. Hollenbeck; Jonathan G. Dudley
2016-01-01
Following wildfire, forest managers are challenged with meeting both socioeconomic demands (e.g., salvage logging) and mandates requiring habitat conservation for disturbance-associated wildlife (e.g., woodpeckers). Habitat suitability models for nesting woodpeckers can be informative, but tests of model transferability are needed to understand how broadly...
77 FR 38744 - Airworthiness Directives; Sikorsky Aircraft-Manufactured Model S-64F Helicopters
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-29
... the Sikorsky Aircraft Corporation-manufactured Model S-64F helicopters, now under the Erickson Air-Crane Incorporated (Erickson) Model S-64F type certificate. That AD currently requires inspections... docket shortly after receipt. For service information identified in this proposed AD, contact Erickson...
Sustainability-based decision making is a challenging process that requires balancing trade-offs among social, economic, and environmental components. System Dynamic (SD) models can be useful tools to inform sustainability-based decision making because they provide a holistic co...
Class Extraction and Classification Accuracy in Latent Class Models
ERIC Educational Resources Information Center
Wu, Qiong
2009-01-01
Despite the increasing popularity of latent class models (LCM) in educational research, methodological studies have not yet accumulated much information on the appropriate application of this modeling technique, especially with regard to requirements on sample size and the number of indicators. This dissertation study represented an initial attempt to…
Conservation planning for a species requires knowledge of the species’ population status and distribution. An important step in obtaining this information for many species is the development of models that predict the habitat distribution for the species. Such models can be usef...
Information Technology Security Training Requirements: A Role- and Performance-Based Model
1998-04-01
Volume and Value of Big Healthcare Data
Dinov, Ivo D.
2016-01-01
Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. The Moore's and Kryder's laws of exponential increase of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions. PMID:26998309
CALCULATION OF NONLINEAR CONFIDENCE AND PREDICTION INTERVALS FOR GROUND-WATER FLOW MODELS.
Cooley, Richard L.; Vecchia, Aldo V.
1987-01-01
A method is derived to efficiently compute nonlinear confidence and prediction intervals on any function of parameters derived as output from a mathematical model of a physical system. The method is applied to the problem of obtaining confidence and prediction intervals for manually-calibrated ground-water flow models. To obtain confidence and prediction intervals resulting from uncertainties in parameters, the calibrated model and information on extreme ranges and ordering of the model parameters within one or more independent groups are required. If random errors in the dependent variable are present in addition to uncertainties in parameters, then calculation of prediction intervals also requires information on the extreme range of error expected. A simple Monte Carlo method is used to compute the quantiles necessary to establish probability levels for the confidence and prediction intervals. Application of the method to a hypothetical example showed that inclusion of random errors in the dependent variable in addition to uncertainties in parameters can considerably widen the prediction intervals.
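To make the Monte Carlo step concrete, the sketch below propagates parameters sampled over their extreme ranges through a stand-in model and reads interval endpoints off the output quantiles. The model function, parameter ranges, and error range are illustrative assumptions, not the paper's calibrated ground-water model.

```python
import numpy as np

rng = np.random.default_rng(42)

def predict_head(params):
    # Hypothetical model output: head as a nonlinear function of two
    # parameters (a stand-in for a calibrated ground-water flow model).
    k, recharge = params
    return recharge / k + np.log(k)

# Sample parameters uniformly over their extreme ranges (the inputs the
# abstract says are required), propagate them, and take output quantiles.
n = 10_000
k = rng.uniform(0.5, 2.0, n)          # conductivity range (assumed)
recharge = rng.uniform(0.1, 0.3, n)   # recharge range (assumed)
outputs = np.array([predict_head(p) for p in zip(k, recharge)])

lo, hi = np.quantile(outputs, [0.025, 0.975])
print(f"95% confidence interval on the prediction: [{lo:.3f}, {hi:.3f}]")

# For a prediction interval, add random error in the dependent variable,
# drawn from its stated extreme range, before taking quantiles.
noise = rng.uniform(-0.05, 0.05, n)
plo, phi = np.quantile(outputs + noise, [0.025, 0.975])
print(f"95% prediction interval: [{plo:.3f}, {phi:.3f}]")
```

As the abstract notes, including the error term widens the prediction interval relative to the confidence interval.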
Dynamic information processing states revealed through neurocognitive models of object semantics
Clarke, Alex
2015-01-01
Recognising objects relies on highly dynamic, interactive brain networks to process multiple aspects of object information. To fully understand how different forms of information about objects are represented and processed in the brain requires a neurocognitive account of visual object recognition that combines a detailed cognitive model of semantic knowledge with a neurobiological model of visual object processing. Here we ask how specific cognitive factors are instantiated in our mental processes and how they dynamically evolve over time. We suggest that coarse semantic information, based on generic shared semantic knowledge, is rapidly extracted from visual inputs and is sufficient to drive rapid category decisions. Subsequent recurrent neural activity between the anterior temporal lobe and posterior fusiform supports the formation of object-specific semantic representations – a conjunctive process primarily driven by the perirhinal cortex. These object-specific representations require the integration of shared and distinguishing object properties and support the unique recognition of objects. We conclude that a valuable way of understanding the cognitive activity of the brain is through testing the relationship between specific cognitive measures and dynamic neural activity. This kind of approach allows us to move towards uncovering the information processing states of the brain and how they evolve over time. PMID:25745632
NASA Astrophysics Data System (ADS)
Adams, R.; Quinn, P. F.; Bowes, M. J.
2015-04-01
A model for simulating runoff pathways and water quality fluxes has been developed using the minimum information requirement (MIR) approach. The model, the Catchment Runoff Attenuation Flux Tool (CRAFT), is applicable to mesoscale catchments and focusses primarily on hydrological pathways that mobilise nutrients. Hence CRAFT can be used to investigate the impact of flow pathway management intervention strategies designed to reduce the loads of nutrients into receiving watercourses. The model can help policy makers meet water quality targets and consider methods to obtain "good" ecological status. A case study of the 414 km2 Frome catchment, Dorset, UK, has been described here as an application of CRAFT in order to highlight the above issues at the mesoscale. The model was primarily calibrated on 10-year records of weekly data to reproduce the observed flows and nutrient (nitrate nitrogen - N; phosphorus - P) concentrations. Data from 2 years with sub-daily monitoring at the same site were also analysed. These data highlighted some additional signals in the nutrient flux, particularly of soluble reactive phosphorus, which were not observable in the weekly data. This analysis has prompted the choice of a daily time step as the minimum information requirement to simulate the processes observed at the mesoscale, including the impact of uncertainty. A management intervention scenario was also run to demonstrate how the model can support catchment managers in investigating the effects of reducing the concentrations of N and P in the various flow pathways. This mesoscale modelling tool can help policy makers consider a range of strategies to meet the European Union (EU) water quality targets for this type of catchment.
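The abstract does not reproduce CRAFT's equations, so the sketch below shows only the general shape of a daily time-step runoff and nutrient mass balance; the storage structure, runoff coefficient, and concentration are assumptions for illustration.

```python
def daily_nutrient_flux(rain_mm, store_mm, conc_mg_l, runoff_coef=0.3):
    """One daily step of a minimal runoff/nutrient mass balance.

    Returns (new_store_mm, flow_mm, load_kg_per_ha). The structure is
    hypothetical and is not the CRAFT formulation.
    """
    store_mm += rain_mm
    flow_mm = runoff_coef * store_mm   # a fast pathway mobilises part of storage
    store_mm -= flow_mm
    # 1 mm depth over 1 ha = 10 m^3; (mg/L == g/m^3) * m^3 = g; /1000 -> kg
    load_kg = conc_mg_l * flow_mm * 10.0 / 1000.0
    return store_mm, flow_mm, load_kg

store = 50.0
for day, rain in enumerate([0.0, 12.0, 3.0, 0.0]):
    store, flow, load = daily_nutrient_flux(rain, store, conc_mg_l=1.5)
    print(f"day {day}: flow={flow:.1f} mm, nutrient load={load:.3f} kg/ha")
```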
Users Manual for the Geospatial Stream Flow Model (GeoSFM)
Artan, Guleid A.; Asante, Kwabena; Smith, Jodie; Pervez, Md Shahriar; Entenmann, Debbie; Verdin, James P.; Rowland, James
2008-01-01
The monitoring of wide-area hydrologic events requires the manipulation of large amounts of geospatial and time series data into concise information products that characterize the location and magnitude of the event. To perform these manipulations, scientists at the U.S. Geological Survey Center for Earth Resources Observation and Science (EROS), with the cooperation of the U.S. Agency for International Development, Office of Foreign Disaster Assistance (USAID/OFDA), have implemented a hydrologic modeling system. The system includes a data assimilation component to generate data for a Geospatial Stream Flow Model (GeoSFM) that can be run operationally to identify and map wide-area streamflow anomalies. GeoSFM integrates a geographical information system (GIS) for geospatial preprocessing and postprocessing tasks and hydrologic modeling routines implemented as dynamically linked libraries (DLLs) for time series manipulations. Model results include maps depicting the status of streamflow and soil water conditions. This Users Manual provides step-by-step instructions for running the model and for downloading and processing the input data required for initial model parameterization and daily operation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulvatunyou, Boonserm; Wysk, Richard A.; Cho, Hyunbo
2004-06-01
In today's global manufacturing environment, manufacturing functions are distributed as never before. Design, engineering, fabrication, and assembly of new products are done routinely in many different enterprises scattered around the world. Successful business transactions require the sharing of design and engineering data on an unprecedented scale. This paper describes a framework that facilitates the collaboration of engineering tasks, particularly process planning and analysis, to support such globalized manufacturing activities. The information models of data and the software components that integrate those information models are described. The integration framework uses an Integrated Product and Process Data (IPPD) representation called a Resource Independent Operation Summary (RIOS) to facilitate the communication of business and manufacturing requirements. Hierarchical process modeling, process planning decomposition and an augmented AND/OR directed graph are used in this representation. The Resource Specific Process Planning (RSPP) module assigns required equipment and tools, selects process parameters, and determines manufacturing costs based on two-level hierarchical RIOS data. The shop floor knowledge (resource and process knowledge) and a hybrid approach (heuristic and linear programming) to linearize the AND/OR graph provide the basis for the planning. Finally, a prototype system is developed and demonstrated with an exemplary part. Java and XML (Extensible Markup Language) are used to ensure software and information portability.
NASA Astrophysics Data System (ADS)
Hoffman, Kenneth J.; Keithley, Hudson
1994-12-01
There are few systems which aggregate standardized pertinent clinical observations of discrete patient problems and resolutions. The systematic information supplied by clinicians is generally provided to justify reimbursement from insurers. Insurers, by their nature, are expert in modeling health care costs by diagnosis, procedures, and population risk groups. Medically, they rely on clinician-generated diagnostic and coded procedure information. Clinicians will document a patient's status at a discrete point in time through narrative. Clinical notes do not support aggregate and systematic analysis of outcome. A methodology exists and has been used by the US Army Drug and Alcohol Program to model the clinical activities, associated costs, and data requirements of an outpatient clinic. This has broad applicability for a comprehensive health care system in which patient costs and data requirements can be established.
Towards improving software security by using simulation to inform requirements and conceptual design
Nutaro, James J.; Allgood, Glenn O.; Kuruganti, Teja
2015-06-17
We illustrate the use of modeling and simulation early in the system life-cycle to improve security and reduce costs. The models that we develop for this illustration are inspired by problems in reliability analysis and supervisory control, for which similar models are used to quantify failure probabilities and rates. In the context of security, we propose that models of this general type can be used to understand trades between risk and cost while writing system requirements and during conceptual design, and thereby significantly reduce the need for expensive security corrections after a system enters operation.
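As a hedged illustration of the risk-versus-cost trade the authors propose to examine while writing requirements, the sketch below compares candidate mitigations by annual cost plus expected security loss; the option names, attack rates, and loss figures are invented for the example, not taken from the paper.

```python
# Reliability-style trade study between mitigation cost and expected
# security loss. All numbers here are illustrative assumptions.

def expected_annual_loss(attack_rate, p_success, loss_per_breach):
    # Expected loss = attacks/year * P(attack succeeds) * loss per breach
    return attack_rate * p_success * loss_per_breach

candidates = {
    # mitigation option: (annual cost, residual probability an attack succeeds)
    "baseline":        (0.0,   0.20),
    "input_filtering": (50e3,  0.08),
    "full_isolation":  (220e3, 0.01),
}

for name, (cost, p) in candidates.items():
    total = cost + expected_annual_loss(attack_rate=40, p_success=p,
                                        loss_per_breach=100e3)
    print(f"{name:15s} total expected annual cost: ${total:,.0f}")
```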
USEPA Resistance Management Model development
The US EPA requires registrants of plant incorporated protectant (PIP) crops to provide information relating to the time frame for pest resistance development related to the control traits of the crop. Simulation models are used to evaluate the future conditions for resistance de...
30 CFR 7.303 - Application requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... APPROVAL OF MINING PRODUCTS TESTING BY APPLICANT OR THIRD PARTY Electric Motor Assemblies § 7.303 Application requirements. (a) An application for approval of a motor assembly shall include a composite drawing or drawings with the following information: (1) Model (type), frame size, and rating of the motor...
Dugan, J. M.; Berrios, D. C.; Liu, X.; Kim, D. K.; Kaizer, H.; Fagan, L. M.
1999-01-01
Our group has built an information retrieval system based on a complex semantic markup of medical textbooks. We describe the construction of a set of web-based knowledge-acquisition tools that expedites the collection and maintenance of the concepts required for text markup and the search interface required for information retrieval from the marked text. In the text markup system, domain experts (DEs) identify sections of text that contain one or more elements from a finite set of concepts. End users can then query the text using a predefined set of questions, each of which identifies a subset of complementary concepts. The search process matches that subset of concepts to relevant points in the text. The current process requires that the DE invest significant time to generate the required concepts and questions. We propose a new system--called ACQUIRE (Acquisition of Concepts and Queries in an Integrated Retrieval Environment)--that assists a DE in two essential tasks in the text-markup process. First, it helps her to develop, edit, and maintain the concept model: the set of concepts with which she marks the text. Second, ACQUIRE helps her to develop a query model: the set of specific questions that end users can later use to search the marked text. The DE incorporates concepts from the concept model when she creates the questions in the query model. The major benefit of the ACQUIRE system is a reduction in the time and effort required for the text-markup process. We compared the process of concept- and query-model creation using ACQUIRE to the process used in previous work by rebuilding two existing models that we previously constructed manually. We observed a significant decrease in the time required to build and maintain the concept and query models. PMID:10566457
Computer simulation modeling of recreation use: Current status, case studies, and future directions
David N. Cole
2005-01-01
This report compiles information about recent progress in the application of computer simulation modeling to planning and management of recreation use, particularly in parks and wilderness. Early modeling efforts are described in a chapter that provides an historical perspective. Another chapter provides an overview of modeling options, common data input requirements,...
Toward a Stress Process Model of Children's Exposure to Physical Family and Community Violence
ERIC Educational Resources Information Center
Foster, Holly; Brooks-Gunn, Jeanne
2009-01-01
Theoretically informed models are required to further the comprehensive understanding of children's exposure to violence (ETV). We draw on the stress process paradigm to forward an overall conceptual model of ETV in childhood and adolescence. Around this conceptual model, we synthesize research in four dominant areas of the literature which are detailed but often…
The application of remote sensing to the development and formulation of hydrologic planning models
NASA Technical Reports Server (NTRS)
Castruccio, P. A.; Loats, H. L., Jr.; Fowler, T. R.
1976-01-01
A hydrologic planning model is developed based on remotely sensed inputs. Data from LANDSAT 1 are used to supply the model's quantitative parameters and coefficients. The use of LANDSAT data as information input to all categories of hydrologic models requiring quantitative surface parameters for their effective functioning is also investigated.
Modeling uncertainty in requirements engineering decision support
NASA Technical Reports Server (NTRS)
Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.
2005-01-01
One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support elicitation of uncertain requirements, and to deal with the combination of such information from multiple stakeholders.
DEVS Unified Process for Web-Centric Development and Testing of System of Systems
2008-05-20
gathering from the user. Further, methodologies have been developed to generate DEVS models from BPMN/BPEL-based and message-based requirement specifications... [27] 3. BPMN/BPEL-based system specifications: Business Process Modeling Notation (BPMN) [bpm] or Business Process Execution Language (BPEL) provide a... information is stored in .wsdl and .bpel files for BPEL but in a proprietary format for BPMN. 4. DoDAF-based requirement specifications: Department of
NASA Technical Reports Server (NTRS)
1973-01-01
The traffic analyses and system requirements data generated in the study resulted in the development of two traffic models; the baseline traffic model and the new traffic model. The baseline traffic model provides traceability between the numbers and types of geosynchronous missions considered in the study and the entire spectrum of missions foreseen in the total national space program. The information presented pertaining to the baseline traffic model includes: (1) definition of the baseline traffic model, including identification of specific geosynchronous missions and their payload delivery schedules through 1990; (2) satellite location criteria, including the resulting distribution of the satellite population; (3) geosynchronous orbit saturation analyses, including the effects of satellite physical proximity and potential electromagnetic interference; and (4) platform system requirements analyses, including satellite and mission equipment descriptions, the options and limitations in grouping satellites, and on-orbit servicing criteria (both remotely controlled and man-attended).
NASA Astrophysics Data System (ADS)
Seim, H. E.; Fletcher, M.; Mooers, C. N. K.; Nelson, J. R.; Weisberg, R. H.
2009-05-01
A conceptual design for a southeast United States regional coastal ocean observing system (RCOOS) is built upon a partnership between institutions of the region and among elements of the academic, government and private sectors. This design envisions support of a broad range of applications (e.g., marine operations, natural hazards, and ecosystem-based management) through the routine operation of predictive models that utilize the system observations to ensure their validity. A distributed information management system enables information flow, and a centralized information hub serves to aggregate information regionally and distribute it as needed. A variety of observing assets are needed to satisfy model requirements. An initial distribution of assets is proposed that recognizes the physical structure and forcing in the southeast U.S. coastal ocean. In-situ data collection includes moorings, profilers and gliders to provide 3D, time-dependent sampling, HF radar and surface drifters for synoptic sampling of surface currents, and satellite remote sensing of surface ocean properties. Nested model systems are required to properly represent ocean conditions from the outer edge of the EEZ to the watersheds. An effective RCOOS will depend upon a vital "National Backbone" (federally supported) system of in situ and satellite observations, model products, and data management. This dependence highlights the needs for a clear definition of the National Backbone components and a Concept of Operations (CONOPS) that defines the roles, functions and interactions of regional and federal components of the integrated system. A preliminary CONOPS is offered for the Southeast (SE) RCOOS. Thorough system testing is advocated using a combination of application-specific and process-oriented experiments. Estimates of costs and personnel required as initial components of the SE RCOOS are included. Initial thoughts on the Research and Development program required to support the RCOOS are also outlined.
Heterosexual Anal Intercourse: A Neglected Risk Factor for HIV?
Baggaley, Rebecca F.; Dimitrov, Dobromir; Owen, Branwen N.; Pickles, Michael; Butler, Ailsa R.; Masse, Ben; Boily, Marie-Claude
2014-01-01
Heterosexual anal intercourse confers a much greater risk of HIV transmission than vaginal intercourse, yet its contribution to heterosexual HIV epidemics has been under-researched. In this article we review the current state of knowledge of heterosexual anal intercourse practice worldwide and identify the information required to assess its role in HIV transmission within heterosexual populations, including input measures required to inform mathematical models. We then discuss the evidence relating anal intercourse and HIV with sexual violence. PMID:23279040
Reusable Launch Vehicle (RLV) Market Analysis Model
NASA Technical Reports Server (NTRS)
Prince, Frank A.
1999-01-01
The RLV Market Analysis model is at best a rough order approximation of actual market behavior. However, it does give a quick indication of whether the flights exist to enable an economically viable RLV, and the assumptions necessary for the vehicle to capture those flights. Additional analysis, market research, and updating with the latest information on payloads and launches would improve the model. Plans are to update the model as new information becomes available and new requirements are levied. This tool will continue to be a vital part of NASA's RLV business analysis capability for the foreseeable future.
Water quality modeling using geographic information system (GIS) data
NASA Technical Reports Server (NTRS)
Engel, Bernard A
1992-01-01
Protection of the environment and natural resources at the Kennedy Space Center (KSC) is of great concern. The potential for surface and ground water quality problems resulting from non-point sources of pollution was examined using models. Since the spatial variation of the required parameters was important, geographic information systems (GIS) and their data were used. The potential for groundwater contamination was examined using the SEEPAGE (System for Early Evaluation of the Pollution Potential of Agricultural Groundwater Environments) model. A watershed near the VAB was selected to examine the potential for surface water pollution and erosion using the AGNPS (Agricultural Non-Point Source Pollution) model.
Viger, Roland J.
2008-01-01
This fact sheet provides a high-level description of the GIS Weasel, a software system designed to aid users in preparing spatial information as input to lumped and distributed parameter environmental simulation models (ESMs). The GIS Weasel provides geographic information system (GIS) tools to help create maps of geographic features relevant to the application of a user's ESM and to generate parameters from those maps. The operation of the GIS Weasel does not require a user to be a GIS expert, only that a user has an understanding of the spatial information requirements of the model. The GIS Weasel software system provides a GIS-based graphical user interface (GUI), C programming language executables, and general utility scripts. The software will run on any computing platform where ArcInfo Workstation (version 8.1 or later) and the GRID extension are accessible. The user controls the GIS Weasel by interacting with menus, maps, and tables.
Shuttle operations simulation model programmers'/users' manual
NASA Technical Reports Server (NTRS)
Porter, D. G.
1972-01-01
The prospective user of the shuttle operations simulation (SOS) model is given sufficient information to enable him to perform simulation studies of the space shuttle launch-to-launch operations cycle. The procedures used for modifying the SOS model to meet user requirements are described. The various control card sequences required to execute the SOS model are given. The report is written for users with varying computer simulation experience. A description of the components of the SOS model is included that presents both an explanation of the logic involved in the simulation of the shuttle operations cycle and a description of the routines used to support the actual simulation.
Building hydrologic information systems to promote climate resilience in the Blue Nile/Abay higlands
USDA-ARS?s Scientific Manuscript database
Climate adaptation requires information about climate and land-surface conditions – spatially distributed, and at scales of human influence (the field scale). This article describes a project aimed at combining meteorological data, satellite remote sensing, hydrologic modeling, and downscaled clima...
44 CFR 80.13 - Application information.
Code of Federal Regulations, 2010 CFR
2010-10-01
... ACQUISITION AND RELOCATION FOR OPEN SPACE Requirements Prior to Award § 80.13 Application information. (a) An application for acquisition of property for the purpose of open space must include: (1) A photograph that... deed restriction language, which shall be consistent with the FEMA model deed restriction that the...
44 CFR 80.13 - Application information.
Code of Federal Regulations, 2014 CFR
2014-10-01
... ACQUISITION AND RELOCATION FOR OPEN SPACE Requirements Prior to Award § 80.13 Application information. (a) An application for acquisition of property for the purpose of open space must include: (1) A photograph that... deed restriction language, which shall be consistent with the FEMA model deed restriction that the...
44 CFR 80.13 - Application information.
Code of Federal Regulations, 2011 CFR
2011-10-01
... ACQUISITION AND RELOCATION FOR OPEN SPACE Requirements Prior to Award § 80.13 Application information. (a) An application for acquisition of property for the purpose of open space must include: (1) A photograph that... deed restriction language, which shall be consistent with the FEMA model deed restriction that the...
Iterative combination of national phenotype, genotype, pedigree, and foreign information
USDA-ARS?s Scientific Manuscript database
Single step methods can combine all sources of information into accurate rankings for animals with and without genotypes. Equations that require inverting the genomic relationship matrix G work well with limited numbers of animals, but equivalent models without inversion are needed as numbers increa...
Developing a 3D Road Cadastral System: Comparing Legal Requirements and User Needs
NASA Astrophysics Data System (ADS)
Gristina, S.; Ellul, C.; Scianna, A.
2016-10-01
Road transport has always played an important role in a country's growth and, in order to manage road networks and ensure a high standard of road performance (e.g. durability, efficiency and safety), both public and private road inventories have been implemented using databases and Geographical Information Systems. They enable registering and managing significant amounts of different road information, but to date do not focus on 3D road information, data integration and interoperability. In an increasingly complex 3D urban environment, and in the age of smart cities, however, applications including intelligent transport systems, mobility and traffic management, road maintenance and safety require digital data infrastructures to manage road data: thus new inventories based on integrated 3D road models (queryable, updateable and shareable online) are required. This paper outlines the first step towards the implementation of 3D GIS-based road inventories. Focusing on the case study of the "Road Cadastre" (the Italian road inventory as established by law), it investigates current limitations and required improvements, and also compares the data structure imposed by cadastral legislation with real road users' needs. The study aims to: a) determine whether 3D GIS would improve road cadastre (for better management of data through the complete life-cycle of infrastructure projects); b) define a conceptual model for a 3D road cadastre for Italy (whose general principles may be extended also to other countries).
DOT National Transportation Integrated Search
1998-01-01
To the traveling public, the most readily apparent benefit of AZTech is easy access to traveler information. Providing travelers with real value requires that information be factual, comprehensive and timely. Through AZTech, numerous services are ...
ERIC Educational Resources Information Center
Delcroix, Jean-Claude
There is a general feeling that European telecommunications are delaying the introduction of new information services. This paper responds to some of the questions concerning online information. The views result from research work at DECADE (Belgium) on the requirements of smaller organizations on the one hand and on telecommunications costs on…
2006-11-01
...are later based on that information. Despite their general level of power and resolution, Federal data management and accounting tools have not yet...have begun tracking their historic building and structure inventories using geographic information systems (GISs). A geospatial-referenced data
The Model of ICT-Based Career Information Services and Decision-Making Ability of Learners
ERIC Educational Resources Information Center
Syakir, Muhammad; Mahmud, Alimuddin; Achmad, Arifin
2016-01-01
One of the impacts of information technology in guidance counseling is in the implementation of the support system. Entering the world of globalization, with its rapid technological growth in the breadth of information, requires counseling to adjust to the environment in order to meet the needs of learners. Therefore, cyber-counseling is now developing. It is one of…
A Generalized Decision Framework Using Multi-objective Optimization for Water Resources Planning
NASA Astrophysics Data System (ADS)
Basdekas, L.; Stewart, N.; Triana, E.
2013-12-01
Colorado Springs Utilities (CSU) is currently engaged in an Integrated Water Resource Plan (IWRP) to address the complex planning scenarios, across multiple time scales, currently faced by CSU. The modeling framework developed for the IWRP uses a flexible data-centered Decision Support System (DSS) with a MODSIM-based modeling system to represent the operation of the current CSU raw water system coupled with a state-of-the-art multi-objective optimization algorithm. Three basic components are required for the framework, which can be implemented for planning horizons ranging from seasonal to interdecadal. First, a water resources system model is required that is capable of reasonable system simulation to resolve performance metrics at the appropriate temporal and spatial scales of interest. The system model should be an existing simulation model, or one developed during the planning process with stakeholders, so that 'buy-in' has already been achieved. Second, a hydrologic scenario tool(s) capable of generating a range of plausible inflows for the planning period of interest is required. This may include paleo-informed or climate-change-informed sequences. Third, a multi-objective optimization model that can be wrapped around the system simulation model is required. The new generation of multi-objective optimization models does not require parameterization, which greatly reduces problem complexity. Bridging the gap between research and practice will be evident as we use a case study from CSU's planning process to demonstrate this framework with specific competing water management objectives. Careful formulation of objective functions, choice of decision variables, and system constraints will be discussed. Rather than treating results as theoretically Pareto optimal in a planning process, we use the powerful multi-objective optimization models as tools to more efficiently and effectively move out of the inferior decision space. The use of this framework will help CSU evaluate tradeoffs in a continually changing world.
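One building block of such a framework is identifying non-dominated alternatives. The sketch below is a plain Pareto filter over hypothetical portfolio scores; it stands in for neither CSU's MODSIM-based system nor the specific optimization algorithm used in the IWRP.

```python
import numpy as np

def pareto_front(objectives):
    """Return indices of non-dominated rows (all objectives minimized)."""
    n = len(objectives)
    keep = []
    for i in range(n):
        dominated = any(
            np.all(objectives[j] <= objectives[i]) and
            np.any(objectives[j] < objectives[i])
            for j in range(n) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep

# Hypothetical portfolios scored on (cost, shortage risk, environmental impact)
portfolios = np.array([
    [4.0, 0.10, 0.30],
    [3.0, 0.25, 0.20],
    [5.0, 0.05, 0.40],
    [4.5, 0.12, 0.35],   # dominated by the first row on every objective
])
print("Non-dominated portfolios:", pareto_front(portfolios))
```

Dominated alternatives such as the last row are exactly the "inferior decision space" the abstract describes moving out of.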
Operator interface design considerations for a PACS information management system
NASA Astrophysics Data System (ADS)
Steinke, James E.; Nabijee, Kamal H.; Freeman, Rick H.; Prior, Fred W.
1990-08-01
As prototype PACS grow into fully digital departmental and hospital-wide systems, effective information storage and retrieval mechanisms become increasingly important. Thus far, designers of PACS workstations have concentrated on image communication and display functionality. The new challenge is to provide appropriate operator interface environments to facilitate information retrieval. The "Marburg Model" [1] provides a detailed analysis of the functions, control flows and data structures used in Radiology. It identifies a set of "actors" who perform information manipulation functions. Drawing on this model and its associated methodology it is possible to identify four modes of use of information systems in Radiology: Clinical Routine, Research, Consultation, and Administration. Each mode has its own specific access requirements and views of information. An operator interface strategy appropriate for each mode will be proposed. Clinical Routine mode is the principal concern of PACS primary diagnosis workstations. In a full PACS implementation, such workstations must provide a simple and consistent navigational aid for the on-line image database, a local work list of cases to be reviewed, and easy access to information from other hospital information systems. A hierarchical method of information access is preferred because it provides the ability to start at high-level entities and iteratively narrow the scope of information from which to select subsequent operations. An implementation using hierarchical, nested software windows which fulfills such requirements shall be examined.
Habitat Suitability Index Models: Marten
Allen, Arthur W.
1982-01-01
Habitat preferences and species characteristics of the pine marten (Martes americana) are described in this publication. It is one of a series of Habitat Suitability Index (HSI) models and was developed through an analysis of available scientific data on the species-habitat requirements of the pine marten. Habitat use information is presented in a review of the literature, followed by the development of a HSI model. The model is presented in three formats: graphic, word and mathematical. Suitability index graphs quantify the species-habitat relationship. These data are then synthesized into a model which is designed to provide information for use in impact assessment and habitat management activities.
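HSI models of this series typically map habitat variables through suitability-index curves and combine the indices into a single 0-1 score. The sketch below illustrates that general pattern with invented curves and a geometric-mean aggregation; the breakpoints and the aggregation rule are assumptions, not the published marten equations.

```python
import numpy as np

def suitability_canopy(pct_canopy):
    # Piecewise-linear suitability index read off a hypothetical SI graph
    return np.interp(pct_canopy, [0, 25, 50, 100], [0.0, 0.3, 1.0, 1.0])

def suitability_downed_wood(pct_cover):
    return np.interp(pct_cover, [0, 10, 30, 100], [0.0, 0.5, 1.0, 1.0])

def hsi(pct_canopy, pct_downed_wood):
    """Combine suitability indices into a 0-1 habitat score. The geometric
    mean is one common HSI aggregation; the marten model's own equation
    may differ."""
    s1 = suitability_canopy(pct_canopy)
    s2 = suitability_downed_wood(pct_downed_wood)
    return float(np.sqrt(s1 * s2))

print(f"HSI = {hsi(60, 20):.2f}")  # e.g. 60% canopy, 20% downed-wood cover
```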
Bayesian networks and information theory for audio-visual perception modeling.
Besson, Patricia; Richiardi, Jonas; Bourdin, Christophe; Bringoux, Lionel; Mestre, Daniel R; Vercher, Jean-Louis
2010-09-01
Thanks to their different senses, human observers acquire multiple pieces of information from their environment. Complex cross-modal interactions occur during this perceptual process. This article proposes a framework to analyze and model these interactions through a rigorous and systematic data-driven process. This requires considering the general relationships between the physical events or factors involved in the process, not only in quantitative terms, but also in terms of the influence of one factor on another. We use tools from information theory and probabilistic reasoning to derive relationships between the random variables of interest, where the central notion is that of conditional independence. Using mutual information analysis to guide the model elicitation process, a probabilistic causal model encoded as a Bayesian network is obtained. We exemplify the method by using data collected in an audio-visual localization task for human subjects, and we show that it yields a well-motivated model with good predictive ability. The model elicitation process offers new prospects for the investigation of the cognitive mechanisms of multisensory perception.
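The mutual-information analysis used to guide model elicitation can be illustrated with a simple plug-in estimator over discrete observations; the toy data below are invented for the example.

```python
import numpy as np
from collections import Counter

def mutual_information(x, y):
    """Plug-in mutual information estimate (in bits) for two discrete
    sequences of equal length."""
    n = len(x)
    px, py = Counter(x), Counter(y)
    pxy = Counter(zip(x, y))
    mi = 0.0
    for (a, b), c in pxy.items():
        p_ab = c / n
        mi += p_ab * np.log2(p_ab / ((px[a] / n) * (py[b] / n)))
    return mi

# Illustrative data: audio cue position vs. reported target location
audio = [0, 0, 1, 1, 0, 1, 0, 1]
resp  = [0, 0, 1, 1, 0, 1, 1, 1]
print(f"I(audio; response) = {mutual_information(audio, resp):.3f} bits")
```

High mutual information between two variables flags a dependence worth encoding as an edge in the elicited Bayesian network; near-zero values suggest (conditional) independence.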
Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn
2006-09-01
Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.
NASA Technical Reports Server (NTRS)
Killough, Brian; Stover, Shelley
2008-01-01
The Committee on Earth Observation Satellites (CEOS) provides a brief to the Goddard Institute for Space Studies (GISS) regarding the CEOS Systems Engineering Office (SEO) and current work on climate requirements and analysis. A "system framework" is provided for the Global Earth Observation System of Systems (GEOSS). SEO climate-related tasks are outlined including the assessment of essential climate variable (ECV) parameters, use of the "systems framework" to determine relevant informational products and science models and the performance of assessments and gap analyses of measurements and missions for each ECV. Climate requirements, including instruments and missions, measurements, knowledge and models, and decision makers, are also outlined. These requirements would establish traceability from instruments to products and services allowing for benefit evaluation of instruments and measurements. Additionally, traceable climate requirements would provide a better understanding of global climate models.
Meta-Modeling: A Knowledge-Based Approach to Facilitating Model Construction and Reuse
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Dungan, Jennifer L.
1997-01-01
In this paper, we introduce a new modeling approach called meta-modeling and illustrate its practical applicability to the construction of physically-based ecosystem process models. As a critical adjunct to modeling codes, meta-modeling requires explicit specification of certain background information related to the construction and conceptual underpinnings of a model. This information formalizes the heretofore tacit relationship between the mathematical modeling code and the underlying real-world phenomena being investigated, and gives insight into the process by which the model was constructed. We show how the explicit availability of such information can make models more understandable and reusable and less subject to misinterpretation. In particular, background information enables potential users to better interpret an implemented ecosystem model without direct assistance from the model author. Additionally, we show how the discipline involved in specifying background information leads to improved management of model complexity and fewer implementation errors. We illustrate the meta-modeling approach in the context of the Scientists' Intelligent Graphical Modeling Assistant (SIGMA), a new model construction environment. As the user constructs a model using SIGMA, the system adds appropriate background information that ties the executable model to the underlying physical phenomena under investigation. Not only does this information improve the understandability of the final model, it also serves to reduce the overall time and programming expertise necessary to initially build and subsequently modify models. Furthermore, SIGMA's use of background knowledge helps eliminate coding errors resulting from scientific and dimensional inconsistencies that are otherwise difficult to avoid when building complex models. As a demonstration of SIGMA's utility, the system was used to reimplement and extend a well-known forest ecosystem dynamics model: Forest-BGC.
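As a hedged sketch of the kind of background information such a system attaches, the snippet below pairs model variables with explicit units so a dimensional inconsistency is caught automatically; the variable names and representation are assumptions, not SIGMA's actual formalism.

```python
# Explicit background information: each model variable carries its units,
# so combining unlike quantities raises an error instead of silently
# producing a wrong result. Names and units are illustrative.
UNITS = {
    "net_photosynthesis": "kgC/m2/day",
    "leaf_area_index":    "m2/m2",
    "assimilation_rate":  "kgC/m2/day",
}

def check_units(result_var, operand_vars):
    """Require that an assignment/sum combines like-united quantities."""
    expected = UNITS[result_var]
    for v in operand_vars:
        if UNITS[v] != expected:
            raise ValueError(f"{v} has units {UNITS[v]}, expected {expected}")

check_units("net_photosynthesis", ["assimilation_rate"])   # passes
# check_units("net_photosynthesis", ["leaf_area_index"])   # would raise
```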
Using a Prediction Model to Manage Cyber Security Threats.
Jaganathan, Venkatesh; Cherurveettil, Priyesh; Muthu Sivashanmugam, Premapriya
2015-01-01
Cyber-attacks are an important issue faced by all organizations. Securing information systems is critical. Organizations should be able to understand the ecosystem and predict attacks. Predicting attacks quantitatively should be part of risk management. The cost impact due to worms, viruses, or other malicious software is significant. This paper proposes a mathematical model to predict the impact of an attack based on significant factors that influence cyber security. This model also considers the environmental information required. It is generalized and can be customized to the needs of the individual organization.
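The abstract does not give the model's functional form; as a hedged illustration, the sketch below scores predicted attack impact as a weighted combination of environmental factors, with all factor names and weights assumed rather than taken from the paper.

```python
# Weighted-factor impact score of the general kind the abstract describes.
# Factors, weights, and the 0-1 scale are illustrative assumptions, not
# the authors' fitted model.
FACTORS = {
    "unpatched_hosts_pct": 0.35,
    "external_exposure":   0.25,
    "privileged_accounts": 0.20,
    "detection_latency":   0.20,
}

def predicted_impact(scores):
    """scores: factor name -> normalized value in [0, 1]."""
    return sum(FACTORS[k] * scores[k] for k in FACTORS)

org = {"unpatched_hosts_pct": 0.6, "external_exposure": 0.4,
       "privileged_accounts": 0.3, "detection_latency": 0.7}
print(f"Predicted attack impact (0-1): {predicted_impact(org):.2f}")
```

Customizing the model to an individual organization, as the abstract suggests, would amount to re-estimating the factor set and weights from that organization's environmental data.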
Tapuria, Archana; Evans, Matt; Curcin, Vasa; Austin, Tony; Lea, Nathan; Kalra, Dipak
2017-01-01
The aim of the paper is to establish the requirements and methodology for the development process of GreyMatters, a memory clinic system, outlining the conceptual, practical, technical and ethical challenges, and the experiences of capturing clinical and research oriented data along with the implementation of the system. The methodology for development of the information system involved phases of requirements gathering, modeling and prototype creation, and 'bench testing' the prototype with experts. The standard Institute of Electrical and Electronics Engineers (IEEE) recommended approach for the specifications of software requirements was adopted. An electronic health record (EHR) standard, EN13606, was used, clinical modelling was done through archetypes, and the project complied with data protection and privacy legislation. The requirements for GreyMatters were established. Though the initial development was complex, the requirements, methodology and standards adopted made the construction, deployment, adoption and population of a memory clinic and research database feasible. The electronic patient data, including the assessment scales, provide a rich source of objective data for audits and research and help to establish study feasibility and identify potential participants for clinical trials. The establishment of requirements and methodology, addressing issues of data security and confidentiality, future data compatibility and interoperability and medico-legal aspects such as access controls and audit trails, led to a robust and useful system. The evaluation supports that the system is an acceptable tool for clinical, administrative, and research use and forms a useful part of the wider information architecture.
The secure authorization model for healthcare information system.
Hsu, Wen-Shin; Pan, Jiann-I
2013-10-01
Exploring healthcare systems for assisting medical services or transmitting patients' personal health information in web applications has been widely investigated. Information and communication technologies have been applied to the medical services and healthcare area for a number of years to resolve problems in medical management. In the healthcare system, not all users are allowed to access all the information. Several authorization models for restricting users to access specific information at specific permissions have been proposed. However, as the number of users and the amount of information grow, the difficulties of administering user authorization will increase. This critical problem limits the widespread usage of the healthcare system. This paper proposes an approach for role-based authorization and extends it to deal with authorization of the information in the healthcare system. We propose a role-based authorization model which supports authorizations for different kinds of objects, and a new authorization domain. Based on this model, we discuss the issues and requirements of security in healthcare systems. The security issues for services shared between different healthcare industries will also be discussed.
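A minimal sketch of role-based authorization over different kinds of objects, in the spirit of the model described; the roles, object kinds, and permission sets are illustrative assumptions rather than the paper's schema.

```python
# Role -> set of (object kind, action) permissions. Granting access by
# role rather than per user keeps administration tractable as the number
# of users grows.
ROLE_PERMISSIONS = {
    "physician": {("patient_record", "read"), ("patient_record", "write"),
                  ("lab_result", "read")},
    "nurse":     {("patient_record", "read"), ("lab_result", "read")},
    "clerk":     {("billing", "read"), ("billing", "write")},
}

def is_authorized(role, object_kind, action):
    return (object_kind, action) in ROLE_PERMISSIONS.get(role, set())

assert is_authorized("nurse", "lab_result", "read")
assert not is_authorized("nurse", "patient_record", "write")
print("authorization checks passed")
```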
Wiggins, Paul A
2015-07-21
This article describes the application of a change-point algorithm to the analysis of stochastic signals in biological systems whose underlying state dynamics consist of transitions between discrete states. Applications of this analysis include molecular-motor stepping, fluorophore bleaching, electrophysiology, particle and cell tracking, detection of copy number variation by sequencing, tethered-particle motion, etc. We present a unified approach to the analysis of processes whose noise can be modeled by Gaussian, Wiener, or Ornstein-Uhlenbeck processes. To fit the model, we exploit explicit, closed-form algebraic expressions for maximum-likelihood estimators of model parameters and estimated information loss of the generalized noise model, which can be computed extremely efficiently. We implement change-point detection using the frequentist information criterion, a new information criterion. The frequentist information criterion specifies a single, information-based statistical test that is free from ad hoc parameters and requires no prior probability distribution. We demonstrate this information-based approach in the analysis of simulated and experimental tethered-particle-motion data.
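As a hedged illustration of change-point analysis with closed-form Gaussian likelihoods, the sketch below scans for a single mean shift and accepts it only when it lowers a penalized cost; a BIC-style penalty stands in for the paper's frequentist information criterion, whose exact form is not reproduced here.

```python
import numpy as np

def segment_cost(x):
    # Negative log-likelihood of a Gaussian segment, up to constants,
    # using the closed-form MLE of the variance.
    return 0.5 * len(x) * np.log(np.var(x) + 1e-12)

def single_change_point(x, penalty=None):
    """Return the index of the best single mean-shift change point,
    or None if the penalized cost prefers 'no change'."""
    n = len(x)
    if penalty is None:
        penalty = 2.0 * np.log(n)  # BIC-style stand-in, not the paper's FIC
    best_k, best = None, segment_cost(x)
    for k in range(2, n - 2):
        cost = segment_cost(x[:k]) + segment_cost(x[k:]) + penalty
        if cost < best:
            best_k, best = k, cost
    return best_k

rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(1.5, 1.0, 200)])
print("detected change point near index:", single_change_point(signal))
```

Segmenting a whole trace (e.g. motor stepping) would apply this test recursively or with dynamic programming; the penalty term is what prevents over-segmentation of pure noise.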
Building the bridge between animal movement and population dynamics.
Morales, Juan M; Moorcroft, Paul R; Matthiopoulos, Jason; Frair, Jacqueline L; Kie, John G; Powell, Roger A; Merrill, Evelyn H; Haydon, Daniel T
2010-07-27
While the mechanistic links between animal movement and population dynamics are ecologically obvious, it is much less clear when knowledge of animal movement is a prerequisite for understanding and predicting population dynamics. GPS and other technologies enable detailed tracking of animal location concurrently with acquisition of landscape data and information on individual physiology. These tools can be used to refine our understanding of the mechanistic links between behaviour and individual condition through 'spatially informed' movement models where time allocation to different behaviours affects individual survival and reproduction. For some species, socially informed models that address the movements and average fitness of differently sized groups and how they are affected by fission-fusion processes at relevant temporal scales are required. Furthermore, as most animals revisit some places and avoid others based on their previous experiences, we foresee the incorporation of long-term memory and intention in movement models. The way animals move has important consequences for the degree of mixing that we expect to find both within a population and between individuals of different species. The mixing rate dictates the level of detail required by models to capture the influence of heterogeneity and the dynamics of intra- and interspecific interaction.
Concept-based query language approach to enterprise information systems
NASA Astrophysics Data System (ADS)
Niemi, Timo; Junkkari, Marko; Järvelin, Kalervo
2014-01-01
In enterprise information systems (EISs) it is necessary to model, integrate and compute very diverse data. In advanced EISs the stored data often are based both on structured (e.g. relational) and semi-structured (e.g. XML) data models. In addition, the ad hoc information needs of end-users may require the manipulation of data-oriented (structural), behavioural and deductive aspects of data. Contemporary languages capable of treating this kind of diversity suit only persons with good programming skills. In this paper we present a concept-oriented query language approach to manipulate this diversity so that the programming skill requirements are considerably reduced. In our query language, the features which need technical knowledge are hidden in application-specific concepts and structures. Therefore, users need not be aware of the underlying technology. Application-specific concepts and structures are represented by the modelling primitives of the extended RDOOM (relational deductive object-oriented modelling) which contains primitives for all crucial real world relationships (is-a relationship, part-of relationship, association), XML documents and views. Our query language also supports intensional and extensional-intensional queries, in addition to conventional extensional queries. In its query formulation, the end-user combines available application-specific concepts and structures through shared variables.
Dashboard systems: implementing pharmacometrics from bench to bedside.
Mould, Diane R; Upton, Richard N; Wojciechowski, Jessica
2014-09-01
In recent years, there has been increasing interest in the development of medical decision-support tools, including dashboard systems. Dashboard systems are software packages that integrate information and calculations about therapeutics from multiple components into a single interface for use in the clinical environment. Given the high cost of medical care, and the increasing need to demonstrate positive clinical outcomes for reimbursement, dashboard systems may become an important tool for improving patient outcomes, improving clinical efficiency and containing healthcare costs. Similarly, the costs associated with drug development are also rising. The use of model-based drug development (MBDD) has been proposed as a tool to streamline this process, facilitating the selection of appropriate doses and making informed go/no-go decisions. However, complete implementation of MBDD has not always been successful owing to a variety of factors, including the resources required to provide timely modeling and simulation updates. The application of dashboard systems in drug development reduces the resource requirement and may expedite updating models as new data are collected, allowing modeling results to be available in a timely fashion. In this paper, we present some background information on dashboard systems and propose the use of these systems both in the clinic and during drug development.
PDS4 - Some Principles for Agile Data Curation
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Joyner, R.; Algermissen, S.; Padams, J.
2015-12-01
PDS4, a research data management and curation system for NASA's Planetary Science Archive, was developed using principles that promote the characteristics of agile development. The result is an efficient system that produces better research data products while using fewer resources (time, effort, and money) and maximizes their usefulness for current and future scientists. The key principle is architectural. The PDS4 information architecture is developed and maintained independently of the infrastructure's process, application and technology architectures. The information architecture is based on an ontology-based information model developed to leverage best practices from standard reference models for digital archives, digital object registries, and metadata registries and to capture domain knowledge from a panel of planetary science domain experts. The information model provides a sharable, stable, and formal set of information requirements for the system and is the primary source for information to configure most system components, including the product registry, search engine, validation and display tools, and production pipelines. Multi-level governance is also allowed for the effective management of the informational elements at the common, discipline, and project level. This presentation will describe the development principles, components, and uses of the information model and how an information model-driven architecture exhibits characteristics of agile curation including early delivery, evolutionary development, adaptive planning, continuous improvement, and rapid and flexible response to change.
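A model-driven architecture of this kind can be illustrated by letting a small information model drive a generic validator, so tools follow the model rather than hard-coding product structure; the attribute names below are invented, not PDS4's actual schema.

```python
# A tiny "information model" drives a generic validator: updating the
# model updates the tool's behavior without code changes. The attributes
# here are illustrative assumptions, not PDS4's real schema.
MODEL = {
    "product_observational": {
        "logical_identifier": {"type": str, "required": True},
        "version_id":         {"type": str, "required": True},
        "start_date_time":    {"type": str, "required": False},
    }
}

def validate(product_type, record):
    spec = MODEL[product_type]
    errors = []
    for attr, rule in spec.items():
        if attr not in record:
            if rule["required"]:
                errors.append(f"missing required attribute: {attr}")
        elif not isinstance(record[attr], rule["type"]):
            errors.append(f"{attr} should be {rule['type'].__name__}")
    return errors

print(validate("product_observational", {"logical_identifier": "urn:ex:1"}))
```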
75 FR 27560 - Proposed Information Collection Activity; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-17
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Administration for Children and Families Proposed... LIHEAP and Detailed Model Plan. OMB No.: 0970-0075. Description: States, including the District of... annual application (Model Plan) that meets the LIHEAP statutory and regulatory requirements prior to...
ERIC Educational Resources Information Center
Litaker, R. Gregory
The applications of a recently developed computer program for microcomputers in developing models in an institutional research environment are considered. The VISICALC program requires no user programming skills, is available for all major brands of microcomputers, and provides for easy exchange of information between users of different computing…
49 CFR 545.4 - Response to inquiries.
Code of Federal Regulations, 2012 CFR
2012-10-01
... information identifying the vehicles (by make, model, and vehicle identification number) that have been... vehicles (by make, model, and vehicle identification number) that are excluded from the requirements of 49... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL MOTOR VEHICLE THEFT PREVENTION STANDARD PHASE-IN AND SMALL...
49 CFR 545.4 - Response to inquiries.
Code of Federal Regulations, 2014 CFR
2014-10-01
... information identifying the vehicles (by make, model, and vehicle identification number) that have been... vehicles (by make, model, and vehicle identification number) that are excluded from the requirements of 49... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL MOTOR VEHICLE THEFT PREVENTION STANDARD PHASE-IN AND SMALL...
49 CFR 545.4 - Response to inquiries.
Code of Federal Regulations, 2013 CFR
2013-10-01
... information identifying the vehicles (by make, model, and vehicle identification number) that have been... vehicles (by make, model, and vehicle identification number) that are excluded from the requirements of 49... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL MOTOR VEHICLE THEFT PREVENTION STANDARD PHASE-IN AND SMALL...
49 CFR 545.4 - Response to inquiries.
Code of Federal Regulations, 2011 CFR
2011-10-01
... information identifying the vehicles (by make, model, and vehicle identification number) that have been... vehicles (by make, model, and vehicle identification number) that are excluded from the requirements of 49... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION FEDERAL MOTOR VEHICLE THEFT PREVENTION STANDARD PHASE-IN AND SMALL...
Modeling Rare and Unique Documents: Using FRBRoo/CIDOC CRM
ERIC Educational Resources Information Center
Le Boeuf, Patrick
2012-01-01
Both the library and the museum communities have developed conceptual models for the information they produce about the collections they hold: FRBR (Functional Requirements for Bibliographic Records) and CIDOC CRM (Conceptual Reference Model). But neither proves perfectly adequate when it comes to some specific types of rare and unique materials:…
The creation of Physiologically Based Pharmacokinetic (PBPK) models for a new chemical requires the selection of an appropriate model structure and the collection of a large amount of data for parameterization. Commonly, a large proportion of the needed information is collected ...
Getting a Picture that Is Both Accurate and Stable: Situation Models and Epistemic Validation
ERIC Educational Resources Information Center
Schroeder, Sascha; Richter, Tobias; Hoever, Inga
2008-01-01
Text comprehension entails the construction of a situation model that prepares individuals for situated action. In order to meet this function, situation model representations are required to be both accurate and stable. We propose a framework according to which comprehenders rely on epistemic validation to prevent inaccurate information from…
Information processing requirements for on-board monitoring of automatic landing
NASA Technical Reports Server (NTRS)
Sorensen, J. A.; Karmarkar, J. S.
1977-01-01
A systematic procedure is presented for determining the information processing requirements for on-board monitoring of automatic landing systems. The monitoring system detects landing anomalies through use of appropriate statistical tests. The time-to-correct aircraft perturbations is determined from covariance analyses using a sequence of suitable aircraft/autoland/pilot models. The covariance results are used to establish landing safety and a fault recovery operating envelope via an event outcome tree. This procedure is demonstrated with examples using the NASA Terminal Configured Vehicle (B-737 aircraft). The procedure can also be used to define decision height, assess monitoring implementation requirements, and evaluate alternate autoland configurations.
40 CFR 600.405-08 - Dealer requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977 and Later Model Year Automobiles-Dealer Availability of Fuel Economy Information § 600.405-08 Dealer... sale a copy of the annual Fuel Economy Guide containing the information specified in § 600.407. The...
Creating Better Library Information Systems: The Road to FRBR-Land
ERIC Educational Resources Information Center
Mercun, Tanja; Švab, Katarina; Harej, Viktor; Žumer, Maja
2013-01-01
Introduction: To provide valuable services in the future, libraries will need to create better information systems and set up an infrastructure more in line with the current technologies. The "Functional Requirements for Bibliographic Records" conceptual model provides a basis for this transformation, but there are still a number of…
Information Model for Reusability in Clinical Trial Documentation
ERIC Educational Resources Information Center
Bahl, Bhanu
2013-01-01
In clinical research, New Drug Application (NDA) to health agencies requires generation of a large number of documents throughout the clinical development life cycle, many of which are also submitted to public databases and external partners. Current processes to assemble the information, author, review and approve the clinical research documents,…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-22
... Proposed Information Collection to OMB ``Logic Model'' Grant Performance Report Standard AGENCY: Office of... proposal. Applicants of HUD Federal Financial Assistance are required to indicate intended results and impacts. Grant recipients report against their baseline performance standards. This process standardizes...
40 CFR 600.405-08 - Dealer requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Fuel Economy Regulations for 1977 and Later Model Year Automobiles-Dealer Availability of Fuel Economy Information § 600.405-08 Dealer... sale a copy of the annual Fuel Economy Guide containing the information specified in § 600.407. The...
Supporting the Educational Needs of Students with Orthopedic Impairments.
ERIC Educational Resources Information Center
Heller, Kathryn Wolff; Swinehart-Jones, Dawn
2003-01-01
This article provides information on orthopedic impairments and the unique knowledge and skills required to provide these students with an appropriate education. Information on current practice is provided, as well as training and technical assistance models that can be used to help provide teachers with the necessary training. (Contains…
49 CFR 580.7 - Disclosure of odometer information for leased motor vehicles.
Code of Federal Regulations, 2014 CFR
2014-10-01
... the vehicle, including its make, model, year, and body type, and its vehicle identification number; (7... motor vehicles. 580.7 Section 580.7 Transportation Other Regulations Relating to Transportation... DISCLOSURE REQUIREMENTS § 580.7 Disclosure of odometer information for leased motor vehicles. (a) Before...
49 CFR 580.7 - Disclosure of odometer information for leased motor vehicles.
Code of Federal Regulations, 2010 CFR
2010-10-01
... the vehicle, including its make, model, year, and body type, and its vehicle identification number; (7... motor vehicles. 580.7 Section 580.7 Transportation Other Regulations Relating to Transportation... DISCLOSURE REQUIREMENTS § 580.7 Disclosure of odometer information for leased motor vehicles. (a) Before...
49 CFR 580.7 - Disclosure of odometer information for leased motor vehicles.
Code of Federal Regulations, 2012 CFR
2012-10-01
... the vehicle, including its make, model, year, and body type, and its vehicle identification number; (7... motor vehicles. 580.7 Section 580.7 Transportation Other Regulations Relating to Transportation... DISCLOSURE REQUIREMENTS § 580.7 Disclosure of odometer information for leased motor vehicles. (a) Before...
49 CFR 580.7 - Disclosure of odometer information for leased motor vehicles.
Code of Federal Regulations, 2013 CFR
2013-10-01
... the vehicle, including its make, model, year, and body type, and its vehicle identification number; (7... motor vehicles. 580.7 Section 580.7 Transportation Other Regulations Relating to Transportation... DISCLOSURE REQUIREMENTS § 580.7 Disclosure of odometer information for leased motor vehicles. (a) Before...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-09
... procedures DOE uses to process loan applications submitted to DOE's Advanced Technology Vehicles... information. The procedures are modeled after existing procedures DOE uses to process loan applications... requirements as described above for any information submitted through the Title XVII loan application process...
Modeling and Frequency Tracking of Marine Mammal Whistle Calls
2009-02-01
retrieve embedded information from watermarked synthetic whistle calls. Different fundamental frequency watermarking schemes are proposed based on...unmodified frequency contour is relatively constant, there is little frequency separation between information bits, and watermark retrieval requires...
Enhanced Online Access Requires Redesigned Delivery Options and Cost Models
ERIC Educational Resources Information Center
Stern, David
2007-01-01
Rapidly developing online information technologies provide dramatically new capabilities and opportunities, and place new responsibilities on all involved to recreate networks for scholarly communication. Collaborations between all segments of the information network are made possible and necessary as we attempt to find a balanced and mutually…
LISPA (Library and Information Center Staff Planning Advisor): A Microcomputer-Based System.
ERIC Educational Resources Information Center
Devadason, F. J.; Vespry, H. A.
1996-01-01
Describes LISPA (Library and Information Center Staff Planning Advisor), a set of programs based on Ranganathan's staff plan model. LISPA particularly aids in planning for library staff requirements, both professional and paraprofessional, in developing countries where automated systems for other library operations are not yet available.…
NASA Technical Reports Server (NTRS)
Dickinson, William B.
1995-01-01
An Earth Sciences Data and Information System (ESDIS) Project Management Plan (PMP) is prepared. An ESDIS Project Systems Engineering Management Plan (SEMP) consistent with the developed PMP is also prepared. ESDIS and related EOS program requirements developments, management and analysis processes are evaluated. Opportunities to improve the effectiveness of these processes and program/project responsiveness to requirements are identified. Overall ESDIS cost estimation processes are evaluated, and recommendations to improve cost estimating and modeling techniques are developed. ESDIS schedules and scheduling tools are evaluated. Risk assessment, risk mitigation strategies and approaches, and use of risk information in management decision-making are addressed.
Advances in analytical chemistry
NASA Technical Reports Server (NTRS)
Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.
1991-01-01
Implementation of computer programs based on multivariate statistical algorithms makes possible obtaining reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.
Small passenger car transmission test-Chevrolet 200 transmission
NASA Technical Reports Server (NTRS)
Bujold, M. P.
1980-01-01
The small passenger car transmission was tested to supply electric vehicle manufacturers with technical information regarding the performance of commercially available transmissions, enabling them to design a more energy-efficient vehicle. With this information the manufacturers could estimate vehicle driving range as well as speed and torque requirements for specific road-load performance characteristics. A 1979 Chevrolet Model 200 automatic transmission was tested per a passenger car automatic transmission test code (SAE J651b), which required drive performance, coast performance, and no-load test conditions. The transmission attained maximum efficiencies in the mid-eighty percent range for both drive performance tests and coast performance tests. Torque, speed and efficiency curves map the complete performance characteristics for the Chevrolet Model 200 transmission.
eHealth integration and interoperability issues: towards a solution through enterprise architecture.
Adenuga, Olugbenga A; Kekwaletswe, Ray M; Coleman, Alfred
2015-01-01
Investments in healthcare information and communication technology (ICT) and health information systems (HIS) continue to increase, creating immense pressure on healthcare ICT and HIS to deliver and to demonstrate the significance of such investments in technology. This study finds that integration and interoperability problems contribute largely to the failure of ICT and HIS investments in healthcare, pointing to the need for a healthcare enterprise architecture for eHealth. This study proposes an eHealth architectural model that accommodates requirements based on healthcare needs, system, implementer, and hardware requirements. The model is adaptable and incorporates the views of developers and users, who hold high hopes for the potential of such systems to change traditional organizational design, intelligence, and decision-making.
Mathematical Modeling of Programmatic Requirements for Yaws Eradication
Mitjà, Oriol; Fitzpatrick, Christopher; Asiedu, Kingsley; Solomon, Anthony W.; Mabey, David C.W.; Funk, Sebastian
2017-01-01
Yaws is targeted for eradication by 2020. The mainstay of the eradication strategy is mass treatment followed by case finding. Modeling has been used to inform programmatic requirements for other neglected tropical diseases and could provide insights into yaws eradication. We developed a model of yaws transmission varying the coverage and number of rounds of treatment. The estimated number of cases arising from an index case (basic reproduction number [R0]) ranged from 1.08 to 3.32. To have 80% probability of achieving eradication, 8 rounds of treatment with 80% coverage were required at low estimates of R0 (1.45). This requirement increased to 95% at high estimates of R0 (2.47). Extending the treatment interval to 12 months increased requirements at all estimates of R0. At high estimates of R0 with 12 monthly rounds of treatment, no combination of variables achieved eradication. Models should be used to guide the scale-up of yaws eradication. PMID:27983500
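The abstract's core calculation, the probability that a given number of treatment rounds at a given coverage achieves eradication for a given R0, can be illustrated with a toy branching-process sketch in Python. This is not the authors' model; the structure, population size, and parameter values below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(42)

def eradication_probability(r0, coverage, rounds, n0=200, trials=2000):
    """Toy branching process: each round, mass treatment cures each active
    case with probability `coverage`; the cases missed by treatment then
    each generate Poisson(r0) secondary cases before the next round."""
    successes = 0
    for _ in range(trials):
        cases = n0
        for _ in range(rounds):
            cases = rng.binomial(cases, 1.0 - coverage)  # cases missed by treatment
            cases = rng.poisson(r0 * cases)              # onward transmission
        if cases == 0:
            successes += 1
    return successes / trials

# Qualitative check: at fixed coverage and rounds, higher R0 estimates
# yield a lower probability of eradication.
for r0 in (1.45, 2.47):
    print(r0, eradication_probability(r0, coverage=0.80, rounds=4))
```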
Technical requirements for bioassay support services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hickman, D.P.; Anderson, A.L.
1991-05-01
This document provides the technical basis for the Chem-Nuclear Geotech (Geotech) bioassay program. It includes information and details that can be used as a model in providing technical contents and requirements for bioassay laboratory support, either internally or in solicitations by Geotech to obtain subcontractor laboratory support. It provides a detailed summary and description of the types of bioassay samples to be expected in support of Geotech remedial projects for the US Department of Energy and the bioassay services and analytical requirements necessary to process such samples, including required limits of sensitivity. General responsibilities of the bioassay laboratory are also addressed, including quality assurance. Peripheral information of importance to the program is included in the appendices of this document. 7 tabs.
Participatory interaction design in user requirements specification in healthcare.
Martikainen, Susanna; Ikävalko, Pauliina; Korpela, Mikko
2010-01-01
Healthcare information systems are accused of poor usability even in the popular media in Finland. Doctors especially have been very critical and have actively expressed their opinions in public. User involvement and user-centered design methods are seen as the key solution to usability problems. In this paper we describe a research case in which participatory methods were tried out in healthcare information systems development for medicinal care in a hospital. The study was part of a larger research project on Activity-driven Information Systems Development in healthcare. The study started by investigating and modeling the present state of medicinal care in the hospital. After that, it was important to define and model the goal state. The goal state, facilitated by the would-be software package, was modeled with the help of user interface drawings as one form of prototyping. Traditional usability methods were extended during the study. According to the health professionals' feedback, the use of participatory and user-centered interaction design methods, particularly user interface drawings, enabled them to describe their requirements and create a common understanding with the system developers.
The National Map seamless digital elevation model specifications
Archuleta, Christy-Ann M.; Constance, Eric W.; Arundel, Samantha T.; Lowe, Amanda J.; Mantey, Kimberly S.; Phillips, Lori A.
2017-08-02
This specification documents the requirements and standards used to produce the seamless elevation layers for The National Map of the United States. Seamless elevation data are available for the conterminous United States, Hawaii, Alaska, and the U.S. territories, in three different resolutions—1/3-arc-second, 1-arc-second, and 2-arc-second. These specifications include requirements and standards information about source data requirements, spatial reference system, distribution tiling schemes, horizontal resolution, vertical accuracy, digital elevation model surface treatment, georeferencing, data source and tile dates, distribution and supporting file formats, void areas, metadata, spatial metadata, and quality assurance and control.
Application of Artificial Intelligence for Bridge Deterioration Model.
Chen, Zhang; Wu, Yangyang; Li, Li; Sun, Lijun
2015-01-01
The deterministic bridge deterioration model updating problem is well established in bridge management, while the traditional methods and approaches for this problem require manual intervention. An artificial-intelligence-based approach was presented to self-updated parameters of the bridge deterioration model in this paper. When new information and data are collected, a posterior distribution was constructed to describe the integrated result of historical information and the new gained information according to Bayesian theorem, which was used to update model parameters. This AI-based approach is applied to the case of updating parameters of bridge deterioration model, which is the data collected from bridges of 12 districts in Shanghai from 2004 to 2013, and the results showed that it is an accurate, effective, and satisfactory approach to deal with the problem of the parameter updating without manual intervention.
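The mechanism the abstract describes, a posterior that integrates historical information with newly collected inspection data per Bayes' theorem, can be sketched with a simple conjugate (normal-normal) update. This is a generic illustration rather than the paper's AI-based implementation, and all numbers are invented.

```python
import numpy as np

def update_rate(prior_mean, prior_var, observations, obs_var):
    """Normal-normal conjugate update of a scalar deterioration-rate
    parameter: posterior precision is the sum of prior precision and
    data precision, per Bayes' theorem."""
    observations = np.asarray(observations, dtype=float)
    n = observations.size
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var + observations.sum() / obs_var)
    return post_mean, post_var

# Each year's new inspection data tightens the estimate without manual tuning.
mean, var = 0.80, 0.04                      # historical prior (illustrative)
for batch in ([0.75, 0.90], [1.10, 0.95, 1.00]):
    mean, var = update_rate(mean, var, batch, obs_var=0.09)
    print(f"posterior mean={mean:.3f}, var={var:.4f}")
```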
Framework for a clinical information system.
Van de Velde, R
2000-01-01
The current status of our work towards the design and implementation of a reference architecture for a Clinical Information System is presented. This architecture has been developed and implemented based on components following a strong underlying conceptual and technological model. Common Object Request Broker and n-tier technology featuring centralised and departmental clinical information systems as the back-end store for all clinical data are used. Servers located in the 'middle' tier apply the clinical (business) model and application rules to communicate with so-called 'thin client' workstations. The main characteristics are the focus on modelling and reuse of both data and business logic as there is a shift away from data and functional modelling towards object modelling. Scalability as well as adaptability to constantly changing requirements via component driven computing are the main reasons for that approach.
Constructing RBAC Based Security Model in u-Healthcare Service Platform
Shin, Moon Sun; Jeon, Heung Seok; Ju, Yong Wan; Lee, Bum Ju; Jeong, Seon-Phil
2015-01-01
In today's era of aging society, people want to handle personal health care by themselves in everyday life. In particular, the evolution of medical and IT convergence technology and mobile smart devices has made it possible for people to gather information on their health status anytime and anywhere easily using biometric information acquisition devices. Healthcare information systems can contribute to the improvement of the nation's healthcare quality and the reduction of related cost. However, there are no perfect security models or mechanisms for healthcare service applications, and privacy information can therefore be leaked. In this paper, we examine security requirements related to privacy protection in u-healthcare service and propose an extended RBAC based security model. We propose and design u-healthcare service integration platform (u-HCSIP) applying RBAC security model. The proposed u-HCSIP performs four main functions: storing and exchanging personal health records (PHR), recommending meals and exercise, buying/selling private health information or experience, and managing personal health data using smart devices. PMID:25695104
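At its core, the extended RBAC model rests on the standard role-based check: a request is allowed only if one of the requester's roles grants the permission. A minimal sketch follows; the paper's u-healthcare extensions are richer than this, and every role, permission, and user name below is hypothetical.

```python
# Hypothetical role/permission tables for a u-healthcare platform sketch.
ROLE_PERMISSIONS = {
    "patient":   {"read_own_phr", "exchange_own_phr", "sell_health_data"},
    "physician": {"read_assigned_phr", "write_assigned_phr", "recommend_plan"},
    "admin":     {"manage_accounts"},
}
USER_ROLES = {"alice": {"patient"}, "dr_bob": {"physician"}}

def is_authorized(user: str, permission: str) -> bool:
    """Core RBAC decision: permitted iff some role of the user grants it."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

assert is_authorized("alice", "read_own_phr")
assert not is_authorized("alice", "write_assigned_phr")
```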
López, Diego M; Blobel, Bernd; Gonzalez, Carolina
2010-01-01
Requirement analysis, design, implementation, evaluation, use, and maintenance of semantically interoperable Health Information Systems (HIS) have to be based on eHealth standards. HIS-DF is a comprehensive approach for HIS architectural development based on standard information models and vocabulary. The empirical validity of HIS-DF has not been demonstrated so far. Through an empirical experiment, the paper demonstrates that using HIS-DF and HL7 information models, the semantic quality of HIS architecture can be improved compared to architectures developed using the traditional RUP process. Semantic quality of the architecture has been measured in terms of the model's completeness and validity metrics. The experimental results demonstrated an increased completeness of 14.38% and an increased validity of 16.63% when using the HIS-DF and HL7 information models in a sample HIS development project. Quality assurance of the system architecture in earlier stages of HIS development implies an increased quality of the final HIS, and thus an indirect positive impact on patient care.
On Utilizing Optimal and Information Theoretic Syntactic Modeling for Peptide Classification
NASA Astrophysics Data System (ADS)
Aygün, Eser; Oommen, B. John; Cataltepe, Zehra
Syntactic methods in pattern recognition have been used extensively in bioinformatics, and in particular, in the analysis of gene and protein expressions, and in the recognition and classification of bio-sequences. These methods are almost universally distance-based. This paper concerns the use of an Optimal and Information Theoretic (OIT) probabilistic model [11] to achieve peptide classification using the information residing in their syntactic representations. The latter has traditionally been achieved using the edit distances required in the respective peptide comparisons. We advocate that one can model the differences between compared strings as a mutation model consisting of random Substitutions, Insertions and Deletions (SID) obeying the OIT model. Thus, in this paper, we show that the probability measure obtained from the OIT model can be perceived as a sequence similarity metric, using which a Support Vector Machine (SVM)-based peptide classifier, referred to as OIT_SVM, can be devised.
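One way to operationalize "probability measure as a sequence similarity metric" is to feed pairwise similarities to an SVM as a precomputed kernel. The sketch below substitutes a trivial placeholder similarity for the OIT probability measure (which the paper defines and is not reproduced here); the peptide strings and labels are invented.

```python
import numpy as np
from sklearn.svm import SVC

def similarity(a: str, b: str) -> float:
    """Placeholder stand-in for the OIT measure: fraction of aligned
    positions that match. The OIT model instead scores random
    Substitutions, Insertions and Deletions (SID) probabilistically."""
    return sum(x == y for x, y in zip(a, b)) / max(len(a), len(b))

peptides = ["ACDEFG", "ACDEFH", "WYYWYT", "WYYWYS"]
labels = [0, 0, 1, 1]
K = np.array([[similarity(a, b) for b in peptides] for a in peptides])
clf = SVC(kernel="precomputed").fit(K, labels)   # Gram matrix on training pairs

K_test = np.array([[similarity("ACDEFC", p) for p in peptides]])
print(clf.predict(K_test))                       # similar to the class-0 peptides
```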
NASA Astrophysics Data System (ADS)
Elliott, Thomas J.; Gu, Mile
2018-03-01
Continuous-time stochastic processes pervade everyday experience, and the simulation of models of these processes is of great utility. Classical models of systems operating in continuous-time must typically track an unbounded amount of information about past behaviour, even for relatively simple models, enforcing limits on precision due to the finite memory of the machine. However, quantum machines can require less information about the past than even their optimal classical counterparts to simulate the future of discrete-time processes, and we demonstrate that this advantage extends to the continuous-time regime. Moreover, we show that this reduction in the memory requirement can be unboundedly large, allowing for arbitrary precision even with a finite quantum memory. We provide a systematic method for finding superior quantum constructions, and a protocol for analogue simulation of continuous-time renewal processes with a quantum machine.
3D Modelling of Urban Terrain (Modelisation 3D de milieu urbain)
2011-09-01
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schultz, Peter Andrew
The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This Report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum scale M&S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.
Laurence, Caroline O; Heywood, Troy; Bell, Janice; Atkinson, Kaye; Karnon, Jonathan
2018-03-27
Health workforce planning models have been developed to estimate the future health workforce requirements for the population they serve and have been used to inform policy decisions. The aim of this study was to adapt and further develop a need-based GP workforce simulation model to incorporate the current and estimated geographic distribution of patients and GPs. A need-based simulation model that estimates the supply of GPs and levels of services required in South Australia (SA) was adapted and applied to the Western Australian (WA) workforce. The main outcome measure was the difference between the number of full-time equivalent (FTE) GPs supplied and the number required from 2013 to 2033. The base scenario estimated a shortage of GPs in WA from 2019 onwards, reaching 493 FTE GPs in 2033, while for SA, estimates showed an oversupply over the projection period. The WA urban and rural models estimated an urban shortage of GPs over this period. A reduced international medical graduate recruitment scenario resulted in estimated shortfalls of GPs by 2033 for both WA and SA. The WA-specific scenarios of lower population projections and registrar work value resulted in a reduced shortage of FTE GPs in 2033, while unfilled training places increased the shortfall of FTE GPs in 2033. The simulation model incorporates contextual differences in its structure, allowing within- and cross-jurisdictional comparisons of workforce estimations. It also provides greater insights into the drivers of supply and demand and the impact of changes in workforce policy, promoting more informed decision-making.
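The supply-versus-requirement projection at the heart of such need-based models can be sketched as a simple annual loop: supply evolves with entries and exits, while the requirement scales with projected population need. All rates and numbers below are illustrative placeholders, not the study's inputs.

```python
def project_gap(supply0, entries_per_year, exit_rate, fte_per_1000, population):
    """Yearly FTE GP gap: (projected supply) minus (need-based requirement)."""
    supply, gaps = supply0, []
    for pop in population:
        supply = supply * (1 - exit_rate) + entries_per_year  # workforce flows
        required = fte_per_1000 * pop / 1000.0                # need-based demand
        gaps.append(supply - required)
    return gaps

pop = [2.6e6 * 1.015 ** t for t in range(21)]   # 2013-2033, 1.5%/yr growth (made up)
gaps = project_gap(supply0=2200, entries_per_year=110, exit_rate=0.04,
                   fte_per_1000=0.9, population=pop)
print(f"estimated FTE gap in 2033: {gaps[-1]:+.0f}")  # negative = shortage
```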
Common Data Models and Efficient Reproducible Workflows for Distributed Ocean Model Skill Assessment
NASA Astrophysics Data System (ADS)
Signell, R. P.; Snowden, D. P.; Howlett, E.; Fernandes, F. A.
2014-12-01
Model skill assessment requires discovery, access, analysis, and visualization of information from both sensors and models, and traditionally has been possible only by a few experts. The US Integrated Ocean Observing System (US-IOOS) consists of 17 Federal Agencies and 11 Regional Associations that produce data from various sensors and numerical models; exactly the information required for model skill assessment. US-IOOS is seeking to develop documented skill assessment workflows that are standardized, efficient, and reproducible so that a much wider community can participate in the use and assessment of model results. Standardization requires common data models for observational and model data. US-IOOS relies on the CF Conventions for observations and structured grid data, and on the UGRID Conventions for unstructured (e.g. triangular) grid data. This allows applications to obtain only the data they require in a uniform and parsimonious way using web services: OPeNDAP for model output and OGC Sensor Observation Service (SOS) for observed data. Reproducibility is enabled with IPython Notebooks shared on GitHub (http://github.com/ioos). These capture the entire skill assessment workflow, including user input, search, access, analysis, and visualization, ensuring that workflows are self-documenting and reproducible by anyone, using free software. Python packages for common data models are Pyugrid and the British Met Office Iris package. Python packages required to run the workflows (pyugrid, pyoos, and the British Met Office Iris package) are also available on GitHub and on Binstar.org so that users can run scenarios using the free Anaconda Python distribution. Hosted services such as Wakari enable anyone to reproduce these workflows for free, without installing any software locally, using just their web browser. We are also experimenting with Wakari Enterprise, which allows multi-user access from a web browser to an IPython Server running where large quantities of model output reside, increasing the efficiency. The open development and distribution of these workflows, and the software on which they depend, is an educational resource for those new to the field and a center of focus where practitioners can contribute new software and ideas.
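The standards-based access pattern the abstract describes (common data models plus web services) can be sketched by opening a CF-style dataset over OPeNDAP and subsetting it server-side. The endpoint URL and variable name below are hypothetical placeholders, not actual US-IOOS services.

```python
from netCDF4 import Dataset  # netCDF4 accepts OPeNDAP URLs directly

url = "https://example.org/thredds/dodsC/ocean_model_output"  # hypothetical endpoint
ds = Dataset(url)                        # opens remotely; no full-file download
temp = ds.variables["sea_water_temperature"]
print(temp.dimensions, getattr(temp, "units", "unknown"))
surface = temp[0, 0, :, :]               # only this subset crosses the network
ds.close()
```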
Nonlinear Unsteady Aerodynamic Modeling Using Wind Tunnel and Computational Data
NASA Technical Reports Server (NTRS)
Murphy, Patrick C.; Klein, Vladislav; Frink, Neal T.
2016-01-01
Extensions to conventional aircraft aerodynamic models are required to adequately predict responses when nonlinear unsteady flight regimes are encountered, especially at high incidence angles and under maneuvering conditions. For a number of reasons, such as loss of control, both military and civilian aircraft may extend beyond normal and benign aerodynamic flight conditions. In addition, military applications may require controlled flight beyond the normal envelope, and civilian flight may require adequate recovery or prevention methods from these adverse conditions. These requirements have led to the development of more general aerodynamic modeling methods and provided impetus for researchers to improve both techniques and the degree of collaboration between analytical and experimental research efforts. In addition to more general mathematical model structures, dynamic test methods have been designed to provide sufficient information to allow model identification. This paper summarizes research to develop a modeling methodology appropriate for modeling aircraft aerodynamics that include nonlinear unsteady behaviors using both experimental and computational test methods. This work was done at Langley Research Center, primarily under the NASA Aviation Safety Program, to address aircraft loss of control, prevention, and recovery aerodynamics.
NASA Astrophysics Data System (ADS)
Kolodny, Michael A.
2017-05-01
Today's battlefield space is extremely complex, dealing with an enemy that is neither well-defined nor well-understood. Adversaries are composed of widely-distributed, loosely-networked groups engaging in nefarious activities. Situational understanding is needed by decision makers; understanding of adversarial capabilities and intent is essential. Information needed at any time is dependent on the mission/task at hand. Information sources potentially providing mission-relevant information are disparate and numerous; they include sensors, social networks, fusion engines, the internet, etc. Management of these multi-dimensional informational sources is critical. This paper will present a new approach being undertaken to answer the challenge of enhancing battlefield understanding by optimizing the assignment of available informational sources (means) to required missions/tasks, as well as determining the "goodness" of the information acquired in meeting the capabilities needed. Requirements are usually expressed in terms of a presumed technology solution (e.g., imagery). A metaphor of the "magic rabbits" was conceived to remove presumed technology solutions from requirements by claiming the "required" technology is obsolete. Instead, intelligent "magic rabbits" are used to provide needed information. The question then becomes: "WHAT INFORMATION DO YOU NEED THE RABBITS TO PROVIDE YOU?" This paper will describe a new approach called Mission-Informed Needed Information - Discoverable, Available Sensing Sources (MINI-DASS) that designs a process that builds information acquisition missions and determines what the "magic rabbits" need to provide in a manner that is machine understandable. Also described are the Missions and Means Framework (MMF) model used, the process flow utilized, the approach to developing an ontology of information source means, and the approach for determining the value of the information acquired.
Eguzkiza, Aitor; Trigo, Jesús Daniel; Martínez-Espronceda, Miguel; Serrano, Luis; Andonegui, José
2015-08-01
Most healthcare services use information and communication technologies to reduce and redistribute the workload associated with follow-up of chronic conditions. However, the lack of normalization of the information handled in and exchanged between such services hinders the scalability and extendibility. The use of medical standards for modelling and exchanging information, especially dual-model based approaches, can enhance the features of screening services. Hence, the approach of this paper is twofold. First, this article presents a generic methodology to model patient-centered clinical processes. Second, a proof of concept of the proposed methodology was conducted within the diabetic retinopathy (DR) screening service of the Health Service of Navarre (Spain) in compliance with a specific dual-model norm (openEHR). As a result, a set of elements required for deploying a model-driven DR screening service has been established, namely: clinical concepts, archetypes, termsets, templates, guideline definition rules, and user interface definitions. This model fosters reusability, because those elements are available to be downloaded and integrated in any healthcare service, and interoperability, since from then on such services can share information seamlessly.
Hydrological modeling of upper Indus Basin and assessment of deltaic ecology
USDA-ARS?s Scientific Manuscript database
Managing water resources is mostly required at the watershed scale, where the complex hydrological processes and interactions linking land surface, climatic factors and human activities can be studied. A Geographical Information System-based watershed model, the Soil and Water Assessment Tool (SWAT), is applied f...
GIS and crop simulation modelling applications in climate change research
USDA-ARS?s Scientific Manuscript database
The challenges that climate change presents humanity require an unprecedented ability to predict the responses of crops to environment and management. Geographic information systems (GIS) and crop simulation models are two powerful and highly complementary tools that are increasingly used for such p...
AIR QUALITY MODELING AT COARSE-TO-FINE SCALES IN URBAN AREAS
Urban air toxics control strategies are moving towards a community based modeling approach, with an emphasis on assessing those areas that experience high air toxic concentration levels, the so-called "hot spots". This approach will require information that accurately maps and...
DOT National Transportation Integrated Search
1993-12-01
This report presents a comprehensive modeling framework for user responses to Advanced Traveler Information Systems (ATIS) services and identifies the data needs for the validation of such a framework. The authors present overviews of the framework b...
The study and implementation of the wireless network data security model
NASA Astrophysics Data System (ADS)
Lin, Haifeng
2013-03-01
In recent years, with the rapid development of Internet technology and the advent of the information age, demand for information products and the market for information technology have grown strongly. In particular, network security requirements have become more sophisticated. This paper analyzes data security vulnerabilities in wireless networks and lists the serious defects and related problems in the wireless network framework. It proposes virtual private network technology and a wireless network security defense structure, and presents an intrusion detection model and detection strategies for wireless networks.
Reorienting health systems to meet the demand for consumer health solutions.
Buckeridge, David L
2014-01-01
There is a clear and pronounced gap between the demand for and access to consumer health solutions. Existing health information systems and broader health system factors such as funding models are reasons for this gap. There are strong arguments from the perspectives of the consumer and population health for closing this gap, but the case from the perspective of the current health system is mixed. Closing the gap will require a concerted effort to reorient health information systems and funding models to support online access by consumers to health information and health services.
Bayesian Integration of Information in Hippocampal Place Cells
Madl, Tamas; Franklin, Stan; Chen, Ke; Montaldi, Daniela; Trappl, Robert
2014-01-01
Accurate spatial localization requires a mechanism that corrects for errors, which might arise from inaccurate sensory information or neuronal noise. In this paper, we propose that hippocampal place cells might implement such an error correction mechanism by integrating different sources of information in an approximately Bayes-optimal fashion. We compare the predictions of our model with physiological data from rats. Our results suggest that useful predictions regarding the firing fields of place cells can be made based on a single underlying principle, Bayesian cue integration, and that such predictions are possible using a remarkably small number of model parameters. PMID:24603429
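For two independent Gaussian cues, the "approximately Bayes-optimal" integration the abstract describes has a closed form: precision-weighted averaging. A minimal sketch with illustrative numbers only:

```python
def combine_cues(mu1, var1, mu2, var2):
    """Bayes-optimal fusion of two independent Gaussian cues: the posterior
    mean is the precision-weighted average, and the posterior variance is
    smaller than either cue's variance (the error-correction effect)."""
    post_var = 1.0 / (1.0 / var1 + 1.0 / var2)
    post_mean = post_var * (mu1 / var1 + mu2 / var2)
    return post_mean, post_var

# Noisy self-motion estimate fused with a sharper visual landmark cue:
print(combine_cues(0.52, 0.09, 0.40, 0.01))  # pulled strongly toward the landmark
```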
2018-03-13
PROTOCOL #: FDG20160012A. PROTOCOL TITLE: Accelerating Coagulation... (Defense Technical Information Center abstract submission form; no abstract text available.)
2018-03-09
PROTOCOL #: FDG20170005A. PROTOCOL TITLE: Determining... (Defense Technical Information Center abstract submission form; no abstract text available.)
ERIC Educational Resources Information Center
Tajuddin, Muhammad
2015-01-01
Information System (IS) is a requirement for private colleges in improving their governance to reach Good University Governance (GUG). From 2006 to 2008 information technology (IT) assistance had been granted to 1,072 private colleges and continued by grant development program. Considering such a big IT grant, there is a need to study the IT grant…
Sub-component modeling for face image reconstruction in video communications
NASA Astrophysics Data System (ADS)
Shiell, Derek J.; Xiao, Jing; Katsaggelos, Aggelos K.
2008-08-01
Emerging communications trends point to streaming video as a new form of content delivery. These systems are implemented over wired networks, such as cable or Ethernet, and over wireless networks, cell phones, and portable game systems. These communications systems require sophisticated methods of compression and error-resilient encoding to enable communications across band-limited and noisy delivery channels. Additionally, the transmitted video data must be of high enough quality to ensure a satisfactory end-user experience. Traditionally, video compression makes use of temporal and spatial coherence to reduce the information required to represent an image. In many communications systems, the communications channel is characterized by a probabilistic model which describes the capacity or fidelity of the channel. The implication is that information is lost or distorted in the channel, and requires concealment on the receiving end. We demonstrate a generative model based transmission scheme to compress human face images in video, which has the advantages of a potentially higher compression ratio, while maintaining robustness to errors and data corruption. This is accomplished by training an offline face model and using the model to reconstruct face images on the receiving end. We propose a sub-component AAM that models the appearance of sub-facial components individually, and show face reconstruction results under different types of video degradation using weighted and non-weighted versions of the sub-component AAM.
Hadwin, Julie A; Garner, Matthew; Perez-Olivas, Gisela
2006-11-01
The aim of this paper is to explore parenting as one potential route through which information processing biases for threat develop in children. It reviews information processing biases in childhood anxiety in the context of theoretical models and empirical research in the adult anxiety literature. Specifically, it considers how adult models have been used and adapted to develop a theoretical framework with which to investigate information processing biases in children. The paper then considers research which specifically aims to understand the relationship between parenting and the development of information processing biases in children. It concludes that a clearer theoretical framework is required to understand the significance of information biases in childhood anxiety, as well as their origins in parenting.
A nursing-specific model of EPR documentation: organizational and professional requirements.
von Krogh, Gunn; Nåden, Dagfinn
2008-01-01
To present the Norwegian KPO documentation model (quality assurance, problem solving, and caring) and the requirements and multiple electronic patient record (EPR) functions the model is designed to address. The model's professional substance, a conceptual framework for nursing practice, is developed by examining, reorganizing, and completing existing frameworks. The model's methodology, an information management system, is developed using an expert group. Both model elements were clinically tested over a period of 1 year. The model is designed for nursing documentation in step with statutory, organizational, and professional requirements. Complete documentation is arranged for by incorporating the Nursing Minimum Data Set. Systematic and comprehensive documentation is arranged for by establishing categories as provided in the model's framework domains. Consistent documentation is arranged for by incorporating NANDA-I Nursing Diagnoses, the Nursing Intervention Classification, and the Nursing Outcome Classification. The model can be used as a tool in cooperation with vendors to ensure the interests of the nursing profession are met when developing EPR solutions in healthcare. The model can provide clinicians with a framework for documentation in step with legal and organizational requirements while retaining the ability to record all aspects of clinical nursing.
Cognitive load reducing in destination decision system
NASA Astrophysics Data System (ADS)
Wu, Chunhua; Wang, Cong; Jiang, Qien; Wang, Jian; Chen, Hong
2007-12-01
With limited cognitive resources, the quantity of information a person can process is limited. If that limit is exceeded, the whole cognitive process is affected, and so is the final decision. Research on effective ways to reduce cognitive load proceeds from two directions: cutting down the number of alternatives, and directing the user to allocate limited attention resources based on selective visual attention theory. Decision-making is such a complex process that people usually have difficulty expressing their requirements completely. An effective method to elicit users' hidden requirements is put forward in this paper. With more requirements captured, the destination decision system can filter out a larger number of inappropriate alternatives. Different pieces of information have different utility; if information with high utility attracts attention easily, the decision can be made more easily. After analyzing current selective visual attention theory, a new presentation style based on the user's visual attention is also put forward in this paper. This model arranges information presentation according to the movement of the user's line of sight. Through visual attention, users can focus their limited attention resources on the important information. Capturing hidden requirements and presenting information based on selective visual attention are effective ways of reducing cognitive load.
MMM: A toolbox for integrative structure modeling.
Jeschke, Gunnar
2018-01-01
Structural characterization of proteins and their complexes may require integration of restraints from various experimental techniques. MMM (Multiscale Modeling of Macromolecules) is a Matlab-based open-source modeling toolbox for this purpose, with a particular emphasis on distance distribution restraints obtained from electron paramagnetic resonance experiments on spin-labelled proteins and nucleic acids, and on their combination with atomistic structures of domains or whole protomers, small-angle scattering data, secondary structure information, homology information, and elastic network models. MMM integrates not only various types of restraints but also various existing modeling tools, by providing a common graphical user interface to them. The types of restraints that can support such modeling and the available model types are illustrated by recent application examples.
An Optical Model for Estimating the Underwater Light Field from Remote Sensing
NASA Technical Reports Server (NTRS)
Liu, Cheng-Chien; Miller, Richard L.
2002-01-01
A model of the wavelength-integrated scalar irradiance for a vertically homogeneous water column is developed. It runs twenty thousand times faster than simulations obtained using full Hydrolight code and limits the percentage error to less than 3.7%. Both the distribution of incident sky radiance and a wind-roughened surface are integrated in the model. Our model removes common limitations of earlier models and can be applied to waters with any composition of the inherent optical properties. Implementation of this new model, as well as the ancillary information required for processing global-scale satellite data, is discussed. This new model is fast, accurate, and flexible and therefore provides important information of the underwater light field from remote sensing.
The role of modelling in prioritising and planning clinical trials.
Chilcott, J; Brennan, A; Booth, A; Karnon, J; Tappenden, P
2003-01-01
To identify the role of modelling in planning and prioritising trials. The review focuses on modelling methods used in the construction of disease models and on methods for their analysis and interpretation. Searches were initially developed in MEDLINE and then translated into other databases. Systematic reviews of the methodological and case study literature were undertaken. Search strategies focused on the intersection between three domains: modelling, health technology assessment and prioritisation. The review found that modelling can extend the validity of trials by: generalising from trial populations to specific target groups; generalising to other settings and countries; extrapolating trial outcomes to the longer term; linking intermediate outcome measures to final outcomes; extending analysis to the relevant comparators; adjusting for prognostic factors in trials; and synthesising research results. The review suggested that modelling may offer greatest benefits where the impact of a technology occurs over a long duration, where disease/technology characteristics are not observable, where there are long lead times in research, or for rapidly changing technologies. It was also found that modelling can inform the key parameters for research: sample size, trial duration and population characteristics. One-way, multi-way and threshold sensitivity analysis have been used in informing these aspects but are flawed. The payback approach has been piloted and while there have been weaknesses in its implementation, the approach does have potential. Expected value of information analysis is the only existing methodology that has been applied in practice and can address all these issues. The potential benefit of this methodology is that the value of research is directly related to its impact on technology commissioning decisions, and is demonstrated in real and absolute rather than relative terms; it assesses the technical efficiency of different types of research. Modelling is not a substitute for data collection. However, modelling can identify trial designs of low priority in informing health technology commissioning decisions. Good practice in undertaking and reporting economic modelling studies requires further dissemination and support, specifically in sensitivity analyses, model validation and the reporting of assumptions. Case studies of the payback approach using stochastic sensitivity analyses should be developed. Use of overall expected value of perfect information should be encouraged in modelling studies seeking to inform prioritisation and planning of health technology assessments. Research is required to assess if the potential benefits of value of information analysis can be realised in practice; on the definition of an adequate objective function; on methods for analysing computationally expensive models; and on methods for updating prior probability distributions.
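The expected value of perfect information (EVPI) endorsed by the review has a direct Monte Carlo definition: the expected payoff of deciding after uncertainty resolves, minus the payoff of the best decision made now. A sketch with invented net-benefit numbers follows; the payoff function and parameter distribution are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

theta = rng.normal(0.55, 0.15, size=100_000)   # uncertain effectiveness parameter
nb = np.stack([np.zeros_like(theta),           # strategy 0: comparator
               20_000 * theta - 10_000])       # strategy 1: new technology (toy)

ev_current = nb.mean(axis=1).max()   # choose best strategy given current info
ev_perfect = nb.max(axis=0).mean()   # learn theta first, then choose per draw
print("EVPI per decision:", ev_perfect - ev_current)
```

Multiplying this per-decision value by the number of patients affected over the technology's lifetime gives an upper bound on what further research, such as a trial, is worth.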
A reporting protocol for thermochronologic modeling illustrated with data from the Grand Canyon
NASA Astrophysics Data System (ADS)
Flowers, Rebecca M.; Farley, Kenneth A.; Ketcham, Richard A.
2015-12-01
Apatite (U-Th)/He and fission-track dates, as well as 4He/3He and fission-track length data, provide rich thermal history information. However, numerous choices and assumptions are required on the long road from raw data and observations to potentially complex geologic interpretations. This paper outlines a conceptual framework for this path, with the aim of promoting a broader understanding of how thermochronologic conclusions are derived. The tiered structure consists of thermal history model inputs at Level 1, thermal history model outputs at Level 2, and geologic interpretations at Level 3. Because inverse thermal history modeling is at the heart of converting thermochronologic data to interpretation, for others to evaluate and reproduce conclusions derived from thermochronologic results it is necessary to publish all data required for modeling, report all model inputs, and clearly and completely depict model outputs. Here we suggest a generalized template for a model input table with which to arrange, report and explain the choice of inputs to thermal history models. Model inputs include the thermochronologic data, additional geologic information, and system- and model-specific parameters. As an example we show how the origin of discrepant thermochronologic interpretations in the Grand Canyon can be better understood by using this disciplined approach.
NASA Astrophysics Data System (ADS)
Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; Zhang, Guannan; Ye, Ming; Wu, Jianfeng; Wu, Jichun
2017-12-01
Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost of surrogate construction and consequently improve the efficiency of the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search for informative samples, and a robust stopping criterion to terminate the sample search that guarantees the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity in comparison with two widely used experimental design methods. The application of the TEAD-based surrogate method to two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.
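A minimal sketch of a TEAD-style hybrid score follows, assuming gradients at the sampled points are available (e.g., by finite differences); the weighting and the cheap inverse-distance surrogate are our simplifications, not the paper's exact score function.

```python
import numpy as np

def tead_style_score(candidates, X, y, grads, w=0.5):
    """Score candidate samples: an exploration term (distance to the
    nearest existing sample) plus an exploitation term (disagreement
    between a cheap surrogate and a first-order Taylor prediction)."""
    scores = []
    for c in candidates:
        d = np.linalg.norm(X - c, axis=1)
        i = int(np.argmin(d))                  # nearest sampled point
        taylor = y[i] + grads[i] @ (c - X[i])  # first-order Taylor estimate
        wts = 1.0 / (d + 1e-12)                # inverse-distance surrogate
        surrogate = wts @ y / wts.sum()
        scores.append(w * d[i] + (1 - w) * abs(surrogate - taylor))
    return np.array(scores)  # pick argmax as the next training sample
```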
A User-Centered Approach to Adaptive Hypertext Based on an Information Relevance Model
NASA Technical Reports Server (NTRS)
Mathe, Nathalie; Chen, James
1994-01-01
Rapid and effective access to information in large electronic documentation systems can be facilitated if information relevant in an individual user's context can be automatically supplied to that user. However, most of this knowledge on contextual relevance is not found within the contents of documents; it is established incrementally by users during information access. We propose a new model for interactively learning contextual relevance during information retrieval, and incrementally adapting retrieved information to individual user profiles. The model, called a relevance network, records the relevance of references based on user feedback for specific queries and user profiles. It also generalizes such knowledge to later derive relevant references for similar queries and profiles. The relevance network lets users filter information by context of relevance. Compared to other approaches, it does not require any prior knowledge or training. More importantly, our approach to adaptivity is user-centered. It facilitates acceptance and understanding by users by giving them shared control over the adaptation without disturbing their primary task. Users easily control when to adapt and when to use the adapted system. Lastly, the model is independent of the particular application used to access information, and supports sharing of adaptations among users.
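A toy sketch of the relevance-network idea follows; the storage scheme and similarity-weighted generalization are our own minimal reading of the abstract, with all names hypothetical.

```python
from collections import defaultdict

class RelevanceNetwork:
    """Store user feedback as relevance weights per (query, profile)
    context, then rank references for a new context by
    similarity-weighted lookup over previously seen contexts."""
    def __init__(self):
        self.weights = defaultdict(float)   # (context, ref) -> relevance

    def feedback(self, context, ref, delta=1.0):
        self.weights[(context, ref)] += delta

    def rank(self, context, similarity):
        scores = defaultdict(float)
        for (ctx, ref), w in self.weights.items():
            scores[ref] += similarity(context, ctx) * w
        return sorted(scores, key=scores.get, reverse=True)
```

A caller supplies the similarity function over (query, profile) contexts, which is what lets feedback gathered in one context generalize to similar queries and profiles without prior training.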
Janssen, Sander J C; Porter, Cheryl H; Moore, Andrew D; Athanasiadis, Ioannis N; Foster, Ian; Jones, James W; Antle, John M
2017-07-01
Agricultural modeling has long suffered from fragmentation in model implementation. Many models are developed, there is much redundancy, models are often poorly coupled, model component re-use is rare, and it is frequently difficult to apply models to generate real solutions for the agricultural sector. To improve this situation, we argue that an open, self-sustained, and committed community is required to co-develop agricultural models and associated data and tools as a common resource. Such a community can benefit from recent developments in information and communications technology (ICT). We examine how such developments can be leveraged to design and implement the next generation of data, models, and decision support tools for agricultural production systems. Our objective is to assess relevant technologies for their maturity, expected development, and potential to benefit the agricultural modeling community. The technologies considered encompass methods for collaborative development and for involving stakeholders and users in development in a transdisciplinary manner. Our qualitative evaluation suggests that, as an overall research challenge, the interoperability of data sources, modular granular open models, reference data sets for applications, and specific user requirements analysis methodologies need to be addressed to allow agricultural modeling to enter the big data era. This will enable much higher analytical capacities and the integrated use of new data sources. Overall, agricultural systems modeling needs to rapidly adopt and absorb state-of-the-art data and ICT technologies with a focus on the needs of beneficiaries and on facilitating those who develop applications of their models. This adoption requires the widespread uptake of a set of best practices as standard operating procedures.
Information system and website design to support the automotive manufacture ERP system
NASA Astrophysics Data System (ADS)
Amran, T. G.; Azmi, N.; Surjawati, A. A.
2017-12-01
This research creates an on-time production system design based on the Heijunka model, so that product diversity across all models can meet time and capacity requirements, retain production flexibility and high quality, satisfy customers' demands, and remain realistic in production, while also creating a web-based local-component order information system that supports the Enterprise Resource Planning (ERP) system. The Heijunka levelling model, with heuristic and stochastic components, has been implemented for production volumes of up to 3000 units at Suzuki International Manufacturing. Inefficiency in the local order information system demanded a new information system design integrated into the ERP. The kaizen needed is a supplier network through which all vendors can download the data needed to deliver components to the company, and for vendors' internal use as well. The model design is considered effective in that it can be used to keep production running according to schedule, and efficient in that it demonstrably reduces lost time and stock.
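For readers unfamiliar with Heijunka, a toy levelling sequence follows; the "largest deficit from ideal share" rule is a common textbook heuristic, not necessarily the heuristic/stochastic model used in this paper.

```python
def heijunka_sequence(demand):
    """Toy leveled-production sequence: at each slot, build the model
    whose actual output trails its ideal cumulative share the most."""
    total = sum(demand.values())
    produced = {m: 0 for m in demand}
    seq = []
    for k in range(1, total + 1):
        # ideal cumulative output of model m after k slots: k * demand[m] / total
        m = max(demand, key=lambda m: k * demand[m] / total - produced[m])
        produced[m] += 1
        seq.append(m)
    return seq

print(heijunka_sequence({"A": 2, "B": 1, "C": 1}))  # ['A', 'B', 'C', 'A']
```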
Information architecture for a federated health record server.
Kalra, D; Lloyd, D; Austin, T; O'Connor, A; Patterson, D; Ingram, D
2002-01-01
This paper describes the information models that have been used to implement a federated health record server and to deploy it in a live clinical setting. The authors, working at the Centre for Health Informatics and Multiprofessional Education (University College London), have built up over a decade of experience within Europe on the requirements and information models that are needed to underpin comprehensive multi-professional electronic health records. This work has involved collaboration with a wide range of health care and informatics organisations and partners in the healthcare computing industry across Europe through the EU Health Telematics projects GEHR, Synapses, EHCR-SupA, SynEx and Medicate. The resulting architecture models have fed into recent European standardisation work in this area, such as CEN TC/251 ENV 13606. UCL has implemented a federated health record server based on these models, which is now running in the Department of Cardiovascular Medicine at the Whittington Hospital in North London. The information models described in this paper reflect a refinement based on this implementation experience.
Atmospheric correction for remote sensing image based on multi-spectral information
NASA Astrophysics Data System (ADS)
Wang, Yu; He, Hongyan; Tan, Wei; Qi, Wenwen
2018-03-01
The light collected by remote sensors in space must transit the Earth's atmosphere. All satellite images are affected at some level by lightwave scattering and absorption from aerosols, water vapor and particulates in the atmosphere. To generate high-quality scientific data, atmospheric correction is required to remove atmospheric effects and to convert digital number (DN) values to surface reflectance (SR). Every optical satellite in orbit observes the Earth through the same atmosphere, but each satellite image is impacted differently because atmospheric conditions are constantly changing. A detailed physics-based radiative transfer model such as 6SV requires considerable key ancillary information about the atmospheric conditions at the acquisition time. This paper investigates the simultaneous acquisition of atmospheric radiation parameters based on multi-spectral information, in order to improve estimates of surface reflectance through physics-based atmospheric correction. Ancillary information on aerosol optical depth (AOD) and total water vapor (TWV), derived from the multi-spectral information based on specific spectral properties, was used for the 6SV model. The experimentation was carried out on images from Sentinel-2, which carries a Multispectral Instrument (MSI) recording in 13 spectral bands covering a wide range of wavelengths from 440 up to 2200 nm. The results suggest that per-pixel atmospheric correction through the 6SV model, integrating AOD and TWV derived from multispectral information, is better suited to accurate analysis of satellite images and quantitative remote sensing applications.
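As an illustration of the final correction step, a sketch follows that applies per-band coefficients of the kind a 6SV run reports (the xa/xb/xc convention follows the 6S user guide; the sample values are invented).

```python
def surface_reflectance(radiance, xa, xb, xc):
    """Convert TOA radiance to surface reflectance using 6S/6SV-style
    per-band atmospheric correction coefficients."""
    y = xa * radiance - xb
    return y / (1.0 + xc * y)

# Invented coefficients for one band; real values come from running 6SV
# with the AOD and TWV retrieved from the multi-spectral information.
print(surface_reflectance(radiance=120.0, xa=0.0025, xb=0.12, xc=0.45))
```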
Weir, Charlene R; Staggers, Nancy; Gibson, Bryan; Doing-Harris, Kristina; Barrus, Robyn; Dunlea, Robert
2015-04-16
Effective implementation of a Primary Care Medical Home (PCMH) model of care requires integration of patients' contextual information (physical, mental, social and financial status) into an easily retrievable information source for the healthcare team and clinical decision-making. This project explored clinicians' perceptions of the important attributes of contextual information for clinical decision-making, how contextual information is expressed in CPRS clinical documentation, and how clinicians in a highly computerized environment manage information flow related to these areas. A qualitative design using Cognitive Task Analyses and a modified Critical Incident Technique was used. The study was conducted in a large VA with a fully implemented EHR located in the western United States. Seventeen providers working in a PCMH model of care in Primary Care, Home Based Care and Geriatrics reported on a recent difficult transition requiring contextual information for decision-making. The transcribed interviews were qualitatively analyzed for thematic development related to contextual information using an iterative process and multiple reviewers with ATLAS.ti software. Six overarching themes emerged as attributes of contextual information: informativeness, goal language, temporality, source attribution, retrieval effort, and information quality. These results indicate that specific attributes are needed in order for contextual information to fully support clinical decision-making in a Medical Home care delivery environment. Improved EHR designs are needed for ease of contextual information access, for displaying linkages across time and settings, and for explicit linkages to both clinician and patient goals. Implications relevant to providers' information needs, team functioning and EHR design are discussed.
Distributed and Dynamic Storage of Working Memory Stimulus Information in Extrastriate Cortex
Sreenivasan, Kartik K.; Vytlacil, Jason; D'Esposito, Mark
2015-01-01
The predominant neurobiological model of working memory (WM) posits that stimulus information is stored via stable elevated activity within highly selective neurons. Based on this model, which we refer to as the canonical model, the storage of stimulus information is largely associated with lateral prefrontal cortex (lPFC). A growing number of studies describe results that cannot be fully explained by the canonical model, suggesting that it is in need of revision. In the present study, we directly test key elements of the canonical model. We analyzed functional MRI data collected as participants performed a task requiring WM for faces and scenes. Multivariate decoding procedures identified patterns of activity containing information about the items maintained in WM (faces, scenes, or both). While information about WM items was identified in extrastriate visual cortex (EC) and lPFC, only EC exhibited a pattern of results consistent with a sensory representation. Information in both regions persisted even in the absence of elevated activity, suggesting that elevated population activity may not represent the storage of information in WM. Additionally, we observed that WM information was distributed across EC neural populations that exhibited a broad range of selectivity for the WM items rather than restricted to highly selective EC populations. Finally, we determined that activity patterns coding for WM information were not stable, but instead varied over the course of a trial, indicating that the neural code for WM information is dynamic rather than static. Together, these findings challenge the canonical model of WM. PMID:24392897
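A minimal sketch of the multivariate decoding step follows; the arrays and labels are hypothetical stand-ins for region-wise fMRI activity patterns and WM conditions, not the study's data.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

# X: n_trials x n_voxels activity patterns from one region (e.g., EC or
# lPFC); y: maintained WM item per trial (0 = face, 1 = scene).
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 500))
y = rng.integers(0, 2, size=120)

# Above-chance cross-validated accuracy implies the region's activity
# pattern carries information about the item maintained in WM.
acc = cross_val_score(LinearSVC(dual=False), X, y, cv=5).mean()
print(f"decoding accuracy: {acc:.2f}")
```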
Accounting for Incomplete Species Detection in Fish Community Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
McManamay, Ryan A; Orth, Dr. Donald J; Jager, Yetta
2013-01-01
Riverine fish assemblages are heterogeneous and very difficult to characterize with a one-size-fits-all approach to sampling. Furthermore, detecting changes in fish assemblages over time requires accounting for variation in sampling designs. We present a modeling approach that permits heterogeneous sampling by accounting for site and sampling covariates (including method) in a model-based framework for estimation (versus a sampling-based framework). We snorkeled during three surveys and electrofished during a single survey in a suite of delineated habitats stratified by reach types. We developed single-species occupancy models to determine covariates influencing patch occupancy and species detection probabilities, whereas community occupancy models estimated species richness in light of incomplete detections. For most species, information-theoretic criteria showed higher support for models that included patch size and reach as covariates of occupancy. In addition, models including patch size and sampling method as covariates of detection probabilities also had higher support. Detection probability estimates for snorkeling surveys were higher for larger non-benthic species, whereas electrofishing was more effective at detecting smaller benthic species. The number of sites and sampling occasions required to accurately estimate occupancy varied among fish species. For rare benthic species, our results suggested that a higher number of occasions, and especially the addition of electrofishing, may be required to improve detection probabilities and obtain accurate occupancy estimates. Community models suggested that richness was 41% higher than the number of species actually observed, and the addition of an electrofishing survey increased estimated richness by 13%. These results can be useful to future fish assemblage monitoring efforts by informing sampling designs, such as site selection (e.g., stratifying based on patch size) and determining the effort required (e.g., number of sites versus occasions).
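A minimal sketch of the single-species occupancy likelihood follows, with constant occupancy (psi) and detection (p) and no covariates; the detection-history matrix is invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def nll(params, Y):
    """Negative log-likelihood of a constant-psi, constant-p occupancy
    model; Y is sites x occasions of 0/1 detections. A site with no
    detections is either occupied-but-missed or truly unoccupied."""
    psi, p = expit(params)                 # map unconstrained params to (0, 1)
    det = Y.sum(axis=1)                    # detections per site
    K = Y.shape[1]                         # occasions per site
    like = psi * p**det * (1 - p)**(K - det) + (det == 0) * (1 - psi)
    return -np.log(like).sum()

Y = np.array([[0, 1, 0], [0, 0, 0], [1, 1, 0], [0, 0, 1]])
fit = minimize(nll, x0=[0.0, 0.0], args=(Y,))
print(expit(fit.x))                        # [occupancy, detection] estimates
```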
Ontology for assessment studies of human-computer-interaction in surgery.
Machno, Andrej; Jannin, Pierre; Dameron, Olivier; Korb, Werner; Scheuermann, Gerik; Meixensberger, Jürgen
2015-02-01
New technologies improve modern medicine, but may result in unwanted consequences. Some occur due to inadequate human-computer interaction (HCI). To assess these consequences, an investigation model was developed to facilitate the planning, implementation and documentation of studies of HCI in surgery. The investigation model was formalized in the Unified Modeling Language and implemented as an ontology. Four different top-level ontologies were compared: Object-Centered High-level Reference, Basic Formal Ontology, General Formal Ontology (GFO) and Descriptive Ontology for Linguistic and Cognitive Engineering, according to the three major requirements of the investigation model: the domain-specific view, the experimental scenario and the representation of fundamental relations. Furthermore, this article emphasizes the distinction between an "information model" and a "model of meaning" and shows the advantages of implementing the model in an ontology rather than in a database. The comparison shows that GFO fits the defined requirements adequately: the domain-specific view and the fundamental relations can be implemented directly, and only the representation of the experimental scenario requires minor extensions. The other candidates require wide-ranging extensions concerning at least one of the major implementation requirements. Therefore, GFO was selected to realize an appropriate implementation of the developed investigation model. The ensuing development considered the concrete implementation of further model aspects and entities: sub-domains, space and time, processes, properties, relations and functions. The investigation model and its ontological implementation provide a modular guideline for study planning, implementation and documentation within the area of HCI research in surgery. This guideline helps to navigate the whole study process as a kind of standard or good clinical practice, based on the foundational frameworks involved. Furthermore, it allows a structured description of the assessment methods applied within a certain surgical domain to be acquired, and this information to be used in one's own study design or to compare different studies. The investigation model and the corresponding ontology can be used further to create new knowledge bases of HCI assessment in surgery. Copyright © 2014 Elsevier B.V. All rights reserved.
A spatial-dynamic value transfer model of economic losses from a biological invasion
Thomas P. Holmes; Andrew M. Liebhold; Kent F. Kovacs; Betsy Von Holle
2010-01-01
Rigorous assessments of the economic impacts of introduced species at broad spatial scales are required to provide credible information to policy makers. We propose that economic models of aggregate damages induced by biological invasions need to link microeconomic analyses of site-specific economic damages with spatial-dynamic models of value change associated with...
Physiological pharmacokinetic/pharmacodynamic models require Vmax, Km values for the metabolism of OPs by tissue enzymes. Current literature values cannot be easily used in OP PBPK models (i.e., parathion and chlorpyrifos) because standard methodologies were not used in their ...
ERIC Educational Resources Information Center
Mavor, A. S.; And Others
Part of a sustained program that has involved the design of personally tailored information systems responsive to the needs of scientists performing common research and teaching tasks, this project focuses on the procedural and content requirements for accomplishing need diagnosis and presents these requirements as specifications for an…
Models of unit operations used for solid-waste processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Savage, G.M.; Glaub, J.C.; Diaz, L.F.
1984-09-01
This report documents the unit operations models that have been developed for typical refuse-derived-fuel (RDF) processing systems. These models, which represent the mass balances, energy requirements, and economics of the unit operations, are derived, where possible, from basic principles. Empiricism has been invoked where a governing theory has yet to be developed. Field test data and manufacturers' information, where available, supplement the analytical development of the models. A literature review has also been included for the purpose of compiling and discussing in one document the available information pertaining to the modeling of front-end unit operations. Separate analyses have been done for each task.
Perspectives of UV nowcasting to monitor personal pro-health outdoor activities.
Krzyścin, Janusz W; Lesiak, Aleksandra; Narbutt, Joanna; Sobolewski, Piotr; Guzikowski, Jakub
2018-07-01
A nowcasting model for online monitoring of personal outdoor behaviour is proposed. It is envisaged that it will provide an effective e-tool for smartphone users. The model can estimate the maximum duration of safe (without erythema risk) outdoor activity. Moreover, there are options to estimate the duration of sunbathing needed to obtain an adequate amount of vitamin D3 and the doses necessary for antipsoriatic heliotherapy. The application requires information on the starting time of sunbathing and the user's phototype. At the beginning, the user is informed of the approximate duration of sunbathing required to receive the minimum erythemal dose, an adequate amount of vitamin D3, and the dose necessary for antipsoriatic heliotherapy. Every 20 minutes the application recalculates the remaining duration of sunbathing based on the UVI measured in the preceding 20 minutes. If the estimate of the remaining duration is <20 min, the user is informed that the end of sunbathing is approaching. Finally, a warning signal is sent to stop sunbathing if the measured dose reaches the required dose. The proposed model is verified using data collected at two measuring sites during the warm period of 2017 (1 April-30 September) in two large Polish cities (Warsaw and Lodz). The first instrument represents the UVI monitoring station. The information concerning sunbathing duration, which is sent to a remote user, is evaluated on the basis of UVI measurements collected by a second measuring unit at a distance of ~7 km and ~10 km for Warsaw and Lodz, respectively. Statistical analysis of the differences between the sunbathing durations given by the nowcasting model and by observation shows that the model provides reliable doses received by users during outdoor activities in proximity (~10 km) to the UVI source site. A standard 24 h UVI forecast based on prognostic values of total ozone and cloudiness appears to be valid only for sunny days. Copyright © 2018 Elsevier B.V. All rights reserved.
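A back-of-the-envelope sketch of the erythema-safety calculation follows; the 1 UVI = 25 mW/m² conversion is the standard definition, while the default MED value (~250 J/m², roughly skin type II) is an assumption, not the paper's parameterization.

```python
def minutes_to_med(uvi, med_j_m2=250.0):
    """Rough time to accumulate one minimal erythemal dose (MED) at a
    constant UV index; 1 UVI = 25 mW/m^2 erythemally weighted irradiance."""
    irradiance = uvi * 0.025             # W/m^2
    return med_j_m2 / irradiance / 60.0  # minutes

print(minutes_to_med(uvi=6.0))           # ~28 minutes at UVI 6
```

In the nowcasting setting this estimate would be refreshed every 20 minutes from the latest measured UVI, as the abstract describes.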
Farzandipour, Mehrdad; Riazi, Hossein; Jabali, Monireh Sadeqi
2018-01-01
Introduction: System usability assessment is among the important aspects of assessing the quality of clinical information technology, especially where the end users of the system are concerned. This study aims at providing a comprehensive list of system usability requirements. Methods: This research is a descriptive cross-sectional study conducted using the Delphi technique in three phases in 2013. After the experts' ideas were consolidated, the final version of the questionnaire, including 163 items across the three phases, was presented to 40 users of information systems in hospitals. Grading ranged from 0 to 4. Data analysis was conducted using SPSS software. Requirements with a mean score of three or higher were finally confirmed. Results: The list of system usability requirements for the electronic health record was designed and confirmed in nine areas: suitability for the task (24 items), self-descriptiveness (22 items), controllability (19 items), conformity with user expectations (25 items), error tolerance (21 items), suitability for individualization (7 items), suitability for learning (19 items), visual clarity (18 items) and auditory presentation (8 items). Conclusion: A relatively comprehensive model of useful requirements for using EHRs was presented, which can increase functionality, effectiveness and user satisfaction. Thus, it is suggested that the present model be adopted by system designers and healthcare institutions to assess such systems. PMID:29719310
Study design requirements for RNA sequencing-based breast cancer diagnostics.
Mer, Arvind Singh; Klevebring, Daniel; Grönberg, Henrik; Rantalainen, Mattias
2016-02-01
Sequencing-based molecular characterization of tumors provides information required for individualized cancer treatment. There are well-defined molecular subtypes of breast cancer that provide improved prognostication compared to routine biomarkers. However, molecular subtyping is not yet implemented in routine breast cancer care. Clinical translation is dependent on subtype prediction models providing high sensitivity and specificity. In this study we evaluate sample size and RNA-sequencing read requirements for breast cancer subtyping to facilitate rational design of translational studies. We applied subsampling to ascertain the effect of training sample size and the number of RNA sequencing reads on classification accuracy of molecular subtype and routine biomarker prediction models (unsupervised and supervised). Subtype classification accuracy improved with increasing sample size up to N = 750 (accuracy = 0.93), although with a modest improvement beyond N = 350 (accuracy = 0.92). Prediction of routine biomarkers achieved accuracy of 0.94 (ER) and 0.92 (Her2) at N = 200. Subtype classification improved with RNA-sequencing library size up to 5 million reads. Development of molecular subtyping models for cancer diagnostics requires well-designed studies. Sample size and the number of RNA sequencing reads directly influence accuracy of molecular subtyping. Results in this study provide key information for rational design of translational studies aiming to bring sequencing-based diagnostics to the clinic.
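The subsampling design can be emulated with a short learning-curve script; the expression matrix, labels, and classifier below are placeholders, not the study's data or prediction models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical expression matrix (samples x genes) and subtype labels.
rng = np.random.default_rng(1)
X = rng.normal(size=(800, 200))
y = rng.integers(0, 2, size=800)

# Learning curve over training sizes like those examined in the paper.
for n in (100, 200, 350, 750):
    Xtr, Xte, ytr, yte = train_test_split(X, y, train_size=n, random_state=0)
    acc = LogisticRegression(max_iter=1000).fit(Xtr, ytr).score(Xte, yte)
    print(n, round(acc, 2))
```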
A model to inform management actions as a response to chytridiomycosis-associated decline
Converse, Sarah J.; Bailey, Larissa L.; Mosher, Brittany A.; Funk, W. Chris; Gerber, Brian D.; Muths, Erin L.
2017-01-01
Decision-analytic models provide forecasts of how systems of interest will respond to management. These models can be parameterized using empirical data, but sometimes require information elicited from experts. When evaluating the effects of disease in species translocation programs, expert judgment is likely to play a role because complete empirical information will rarely be available. We illustrate development of a decision-analytic model built to inform decision-making regarding translocations and other management actions for the boreal toad (Anaxyrus boreas boreas), a species with declines linked to chytridiomycosis caused by Batrachochytrium dendrobatidis (Bd). Using the model, we explored the management implications of major uncertainties in this system, including whether there is a genetic basis for resistance to pathogenic infection by Bd, how translocation can best be implemented, and the effectiveness of efforts to reduce the spread of Bd. Our modeling exercise suggested that while selection for resistance to pathogenic infection by Bd could increase numbers of sites occupied by toads, and translocations could increase the rate of toad recovery, efforts to reduce the spread of Bd may have little effect. We emphasize the need to continue developing and parameterizing models necessary to assess management actions for combating chytridiomycosis-associated declines.
A model for the electronic support of practice-based research networks.
Peterson, Kevin A; Delaney, Brendan C; Arvanitis, Theodoros N; Taweel, Adel; Sandberg, Elisabeth A; Speedie, Stuart; Richard Hobbs, F D
2012-01-01
The principal goal of the electronic Primary Care Research Network (ePCRN) is to enable the development of an electronic infrastructure to support clinical research activities in primary care practice-based research networks (PBRNs). We describe the model that the ePCRN developed to enhance the growth and to expand the reach of PBRN research. Use cases and activity diagrams were developed from interviews with key informants from 11 PBRNs from the United States and United Kingdom. Discrete functions were identified and aggregated into logical components. Interaction diagrams were created, and an overall composite diagram was constructed describing the proposed software behavior. Software for each component was written and aggregated, and the resulting prototype application was pilot tested for feasibility. A practical model was then created by separating application activities into distinct software packages based on existing PBRN business rules, hardware requirements, network requirements, and security concerns. We present an information architecture that provides for essential interactions, activities, data flows, and structural elements necessary for providing support for PBRN translational research activities. The model describes research information exchange between investigators and clusters of independent data sites supported by a contracted research director. The model was designed to support recruitment for clinical trials, collection of aggregated anonymous data, and retrieval of identifiable data from previously consented patients across hundreds of practices. The proposed model advances our understanding of the fundamental roles and activities of PBRNs and defines the information exchange commonly used by PBRNs to successfully engage community health care clinicians in translational research activities. By describing the network architecture in a language familiar to that used by software developers, the model provides an important foundation for the development of electronic support for essential PBRN research activities.
The Money-Creation Model: Another Pedagogy.
ERIC Educational Resources Information Center
Gamble, Ralph C., Jr.
1991-01-01
Describes graphical techniques to help explain the multiple creation of deposits that accompany lending in a fractional reserve banking system. Presents a model that emphasizes the banking system, the interaction of total permitted, required, and excess reserves and deposits. Argues that the approach simplifies information to examining a slope…
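The multiple-deposit-creation process the article graphs can also be shown numerically; a minimal sketch follows (parameter values invented).

```python
def deposit_expansion(initial_deposit, reserve_ratio, rounds=50):
    """Iterative multiple deposit creation in a fractional reserve
    system: each bank holds required reserves and lends the excess,
    which is re-deposited at the next bank in the chain."""
    total, deposit = 0.0, initial_deposit
    for _ in range(rounds):
        total += deposit
        deposit *= (1 - reserve_ratio)   # excess reserves re-lent
    return total                          # approaches initial / reserve_ratio

print(deposit_expansion(100.0, 0.10))     # ~1000: the simple deposit multiplier
```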
FuGEFlow: data model and markup language for flow cytometry.
Qian, Yu; Tchuvatkina, Olga; Spidlen, Josef; Wilkinson, Peter; Gasparetto, Maura; Jones, Andrew R; Manion, Frank J; Scheuermann, Richard H; Sekaly, Rafick-Pierre; Brinkman, Ryan R
2009-06-16
Flow cytometry technology is widely used in both health care and research. The rapid expansion of flow cytometry applications has outpaced the development of data storage and analysis tools. Collaborative efforts being taken to eliminate this gap include building common vocabularies and ontologies, designing generic data models, and defining data exchange formats. The Minimum Information about a Flow Cytometry Experiment (MIFlowCyt) standard was recently adopted by the International Society for Advancement of Cytometry. This standard guides researchers on the information that should be included in peer reviewed publications, but it is insufficient for data exchange and integration between computational systems. The Functional Genomics Experiment (FuGE) formalizes common aspects of comprehensive and high throughput experiments across different biological technologies. We have extended the FuGE object model to accommodate flow cytometry data and metadata. We used the MagicDraw modelling tool to design a UML model (Flow-OM) according to the FuGE extension guidelines and the AndroMDA toolkit to transform the model to a markup language (Flow-ML). We mapped each MIFlowCyt term to either an existing FuGE class or to a new FuGEFlow class. The development environment was validated by comparing the official FuGE XSD to the schema we generated from the FuGE object model using our configuration. After the Flow-OM model was completed, the final version of the Flow-ML was generated and validated against an example MIFlowCyt-compliant experiment description. The extension of FuGE for flow cytometry has resulted in a generic FuGE-compliant data model (FuGEFlow), which accommodates and links together all information required by MIFlowCyt. The FuGEFlow model can be used to build software and databases using FuGE software toolkits to facilitate automated exchange and manipulation of potentially large flow cytometry experimental data sets. Additional project documentation, including reusable design patterns and a guide for setting up a development environment, was contributed back to the FuGE project. We have shown that an extension of FuGE can be used to transform minimum information requirements in natural language to a markup language in XML. Extending FuGE required significant effort, but in our experience the benefits outweighed the costs. The FuGEFlow is expected to play a central role in describing flow cytometry experiments and ultimately facilitating data exchange, including the public flow cytometry repositories currently under development.
NASA Technical Reports Server (NTRS)
Lee, Jae K.; Randolph, J. C.; Lulla, Kamlesh P.; Helfert, Michael R.
1993-01-01
Because changes in the Earth's environment have become major global issues, continuous, long-term scientific information is required to assess global problems such as deforestation, desertification, greenhouse effects and climate variations. Global change studies require understanding of the interactions of complex processes regulating the Earth system. Space-based Earth observation is an essential element in global change research for documenting changes in the Earth's environment. It provides synoptic data for conceptual predictive modeling of future environmental change. This paper provides a brief overview of remote sensing technology from the perspective of global change research.
A demonstration of the instream flow incremental methodology, Shenandoah River
Zappia, Humbert; Hayes, Donald C.
1998-01-01
Current and projected demands on the water resources of the Shenandoah River have increased concerns about the potential effect of these demands on the natural integrity of the Shenandoah River system. The Instream Flow Incremental Methodology (IFIM) attempts to integrate concepts of water-supply planning, analytical hydraulic engineering models, and empirically derived habitat-versus-flow functions to address water-use and instream-flow issues and questions concerning life-stage-specific effects on selected species and the general well-being of aquatic biological populations. The demonstration project also sets the stage for the identification and compilation of the major instream-flow issues in the Shenandoah River Basin, development of the required multidisciplinary technical team to conduct more detailed studies, and development of basin-specific habitat and flow requirements for fish species, species assemblages, and various water uses in the Shenandoah River Basin. This report presents the results of an IFIM demonstration project, conducted on the main stem Shenandoah River in Virginia during 1996 and 1997, using the Physical Habitat Simulation System (PHABSIM) model. Output from PHABSIM is used to address the general flow requirements for water supply and recreation and habitat for selected life stages of several fish species. The model output is only a small part of the information necessary for effective decision making and management of river resources. The information by itself is usually insufficient for formulation of recommendations regarding instream-flow requirements. Additional information, for example, can be obtained by analysis of habitat time-series data, habitat duration data, and habitat bottlenecks. Alternative-flow analysis and habitat-duration curves are presented.
Rebich, Richard A.
1994-01-01
Available literature and data were reviewed to quantify data requirements for computer simulation of hydrogeologic effects of liquid waste injection in southeastern Mississippi. Emphasis of each review was placed on quantifying physical properties of current Class I injection zones in Harrison and Jackson Counties. Class I injection zones are zones that are used for injection of hazardous or non-hazardous liquid waste below a formation containing the lowermost underground source of drinking water located within one-quarter of a mile of the injection well. Several mathematical models have been developed to simulate injection effects. The Basic Plume Method was selected because it is commonly used in permit applications, and the Intercomp model was selected because it is generally accepted and used in injection-related research. The input data requirements of the two models were combined into a single data requirement list inclusive of physical properties of injection zones only; injected waste and well properties are not included because such information is site-specific by industry, which is beyond the scope of this report. Results of the reviews of available literature and data indicated that Class I permit applications and standard-reference chemistry and physics texts were the primary sources of information to quantify physical properties of injection zones in Harrison and Jackson Counties. With the exception of a few reports and supplementary data for one injection zone in Jackson County, very little additional information pertaining to physical properties of the injection zones was available in sources other than permit applications and standard-reference texts.
A Costing Model for Project-Based Information and Communication Technology Systems
ERIC Educational Resources Information Center
Stewart, Brian; Hrenewich, Dave
2009-01-01
A major difficulty facing IT departments is ensuring that the projects and activities to which information and communications technologies (ICT) resources are committed represent an effective, economic, and efficient use of those resources. This complex problem has no single answer. To determine effective use requires, at the least, a…
An Integrated Model for the Adoption of Information Technologies in U.S. Colleges and Universities
ERIC Educational Resources Information Center
Garcia Molina, Pablo
2013-01-01
This thesis fulfills the requirements of a Doctor of Liberal Studies degree at Georgetown University. It advances our knowledge of the rationale and mechanisms surrounding the spread, adoption and abandonment of information and communication technologies in tertiary education institutions in the United States. This interdisciplinary thesis…
Is Bigger Better? Customer Base Expansion through Word-of-Mouth Reputation
ERIC Educational Resources Information Center
Rob, Rafael; Fishman, Arthur
2005-01-01
A model of gradual reputation formation through a process of continuous investment in product quality is developed. We assume that the ability to produce high-quality products requires continuous investment and that as a consequence of informational frictions, such as search costs, information about firms' past performance diffuses only gradually…
76 FR 13896 - Equal Credit Opportunity
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-15
... based in whole or in part on information in a consumer report. Certain model notices in Regulation B... part on information in a consumer report. The definition of adverse action in section 603(k) of the... report used in taking the adverse action. It also requires a person to disclose that a consumer has a...
ERIC Educational Resources Information Center
Giesen, Martin J.; Cavenaugh, Brenda S.
2006-01-01
Rehabilitation Services Administration (RSA) requires that independent living programs annually report demographic information on consumers receiving services and the numbers receiving specific types of services. Although some states collect information on consumer outcomes (for example, improvement in daily living skills), RSA does not request…
Nonmonotonic Logic for Use in Information Retrieval: An Exploratory Paper.
ERIC Educational Resources Information Center
Hurt, C. D.
1998-01-01
Monotonic logic requires reexamination of the entire logic string when there is a contradiction. Nonmonotonic logic allows the user to withdraw conclusions in the face of contradiction without harm to the logic string, which has considerable application to the field of information searching. Artificial intelligence models and neural networks based…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-02
... driving practices when these vehicles are operated. Estimated Annual Burden: 300 hours. Number of... information because of model changes. The light truck manufacturers gather only pre-existing data for the... average of $35.00 per hour for professional and clerical staff to gather data, distribute and print...
Digital pathology: DICOM-conform draft, testbed, and first results.
Zwönitzer, Ralf; Kalinski, Thomas; Hofmann, Harald; Roessner, Albert; Bernarding, Johannes
2007-09-01
Hospital information systems are now state of the art. Digital Pathology, also labelled Virtual Microscopy, has therefore gained increased attention. Driven by radiology, standardized information models and workflows have been defined worldwide based on DICOM. However, DICOM-conform integration of Digital Pathology into existing clinical information systems imposes new problems requiring specific solutions, concerning the huge amount of data as well as the special structure of the data to be managed, transferred, and stored. We implemented a testbed to realize and evaluate the workflow of digitized slides from acquisition to archiving. The experience led to the draft of a DICOM-conform information model that accounted for the extensions, definitions, and technical requirements necessary to integrate digital pathology in a hospital-wide DICOM environment. Slides were digitized, compressed, and could be viewed remotely. Real-time transfer of the huge amount of data was optimized using streaming techniques. Compared with a recent discussion in the DICOM Working Group for Digital Pathology (WG26), our experiences led to a preference for JPEG2000/JPIP-based streaming of the whole slide image. The results showed that digital pathology is feasible, but strong efforts by users and vendors are still necessary to integrate Digital Pathology into existing information systems.
Modeling the Effects of Transbasin Nonlinear Internal Waves Through the South China Sea Basin
2013-06-01
sound propagation through the SCS needs to be developed to help maintain tactical superiority. This model will provide valuable information for... Modeling of sound propagation through the ocean requires solving the governing spherical wave equation... arrival structure simulation code. The model permits the study of the physics and phenomenology of sound propagation through the SCS
A proposal for cervical screening information systems in developing countries.
Marrett, Loraine D; Robles, Sylvia; Ashbury, Fredrick D; Green, Bo; Goel, Vivek; Luciani, Silvana
2002-11-20
The effective and efficient delivery of cervical screening programs requires information for planning, management, delivery and evaluation. Specially designed systems are generally required to meet these needs. In many developing countries, lack of information systems constitutes an important barrier to development of comprehensive screening programs and the effective control of cervical cancer. Our report outlines a framework for creating such systems in developing countries and describes a conceptual model for a cervical screening information system. The proposed system is modular, recognizing that there will be considerable between-region heterogeneity in current status and priorities. The proposed system is centered on modules that would allow for the assembly and computerization of data on Pap tests, since these represent the main screening modality at the present time. Additional modules would process data and create and maintain a screening database (e.g., standardize, edit, link and update modules) and allow for the integration of other types of data, such as cervical histopathology results. An open systems development model is proposed, since it is most compatible with the goals of local stakeholder involvement and capacity-building. Copyright 2002 Wiley-Liss, Inc.
Source modeling and inversion with near real-time GPS: a GITEWS perspective for Indonesia
NASA Astrophysics Data System (ADS)
Babeyko, A. Y.; Hoechner, A.; Sobolev, S. V.
2010-07-01
We present the GITEWS approach to source modeling for tsunami early warning in Indonesia. A near-field tsunami imposes special requirements on both warning time and the detail of source characterization. To meet these requirements, we employ geophysical and geological information to predefine as many rupture parameters as possible. We discretize the tsunamigenic Sunda plate interface into an ordered grid of patches (150×25) and employ the concept of Green's functions for forward and inverse rupture modeling. The Rupture Generator, a forward modeling tool, additionally employs different scaling laws and slip shape functions to construct physically reasonable source models using basic seismic information only (magnitude and epicenter location). GITEWS runs a library of semi- and fully-synthetic scenarios that is extensively employed for system testing as well as for teaching and training warning center personnel. Near real-time GPS observations are a very valuable complement to the local tsunami warning system. Their inversion provides a quick (within a few minutes of an event) estimate of the earthquake magnitude and rupture position and, given sufficient station coverage, details of the slip distribution.
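A compact sketch of the Green's-function machinery follows: forward prediction is a matrix-vector product, and the inversion is, in its simplest unregularized form, a least-squares solve. Dimensions and values are invented toy stand-ins; the actual GITEWS grid is 150×25 patches.

```python
import numpy as np

# Hypothetical Green's function matrix: G[i, j] maps unit slip on fault
# patch j to the displacement at GPS observation component i.
rng = np.random.default_rng(2)
n_obs, n_patches = 60, 150          # coarsened toy dimensions
G = rng.normal(size=(n_obs, n_patches))

slip = np.zeros(n_patches)
slip[40:50] = 2.0                   # synthetic rupture: 2 m slip on 10 patches
d = G @ slip                        # forward model: predicted GPS offsets

# Inverse model: minimum-norm least squares (real inversions add
# smoothing and positivity constraints, and use more observations).
slip_hat, *_ = np.linalg.lstsq(G, d, rcond=None)
```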
Jarchow, Christopher J.; Hossack, Blake R.; Sigafus, Brent H.; Schwalbe, Cecil R.; Muths, Erin L.
2016-01-01
Managing species with intensive tools such as reintroduction may focus on single sites or entire landscapes. For vagile species, long-term persistence will require colonization and establishment in neighboring habitats; therefore, both suitable colonization sites and suitable dispersal corridors between sites are required. Assessment of landscapes for both requirements can contribute to the ranking and selection of reintroduction areas, thereby improving management success. Following eradication of invasive American Bullfrogs (Lithobates catesbeianus) from most of Buenos Aires National Wildlife Refuge (BANWR; Arizona, United States), larval Chiricahua Leopard Frogs (Lithobates chiricahuensis) from a private pond were reintroduced into three stock ponds. Populations became established at all three reintroduction sites, followed by colonization of neighboring ponds in subsequent years. Our aim was to better understand colonization patterns of the federally threatened L. chiricahuensis, which could help inform other reintroduction efforts. We assessed the influence of four landscape features on colonization. Using surveys from 2007 and information about the landscape, we developed a habitat connectivity model, based on electrical circuit theory, that identified potential dispersal corridors after explicitly accounting for imperfect detection of frogs. Landscape features provided little insight into why some sites were colonized and others were not, a result likely due to the uniformity of the BANWR landscape. While corridor modeling may be effective in more complex landscapes, our results suggest that focusing on local habitat will be more useful at BANWR. We also illustrate that existing data, even when limited in spatial or temporal resolution, can provide information useful in formulating management actions.
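For readers unfamiliar with circuit-theoretic connectivity, a minimal sketch follows: habitat is a graph whose edge conductances encode ease of movement, and low effective resistance between two ponds indicates a strong potential corridor. The toy matrix is ours, not the BANWR model.

```python
import numpy as np

def effective_resistance(W, a, b):
    """Effective resistance between nodes a and b of a habitat graph with
    symmetric conductance matrix W (higher conductance = easier movement)."""
    L = np.diag(W.sum(axis=1)) - W      # graph Laplacian
    Linv = np.linalg.pinv(L)            # Moore-Penrose pseudoinverse
    return Linv[a, a] + Linv[b, b] - 2 * Linv[a, b]

# Three ponds: 0-1 well connected; 2 reachable only via a weak link to 1.
W = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.1],
              [0.0, 0.1, 0.0]])
print(effective_resistance(W, 0, 2))    # high resistance -> weak corridor
```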
Software Development Cost Estimation Executive Summary
NASA Technical Reports Server (NTRS)
Hihn, Jairus M.; Menzies, Tim
2006-01-01
Identify simple fully validated cost models that provide estimation uncertainty with cost estimate. Based on COCOMO variable set. Use machine learning techniques to determine: a) Minimum number of cost drivers required for NASA domain based cost models; b) Minimum number of data records required and c) Estimation Uncertainty. Build a repository of software cost estimation information. Coordinating tool development and data collection with: a) Tasks funded by PA&E Cost Analysis; b) IV&V Effort Estimation Task and c) NASA SEPG activities.
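Since the task builds on the COCOMO variable set, a minimal sketch of the underlying effort equation follows; the nominal coefficients shown are the published COCOMO II values, not NASA-calibrated ones.

```python
def cocomo_effort(kloc, a=2.94, b=0.91, effort_multipliers=()):
    """COCOMO II-style effort in person-months:
    Effort = a * KLOC**b * product(effort multipliers).
    Calibrated models tune a, b and the multiplier set to local data."""
    em = 1.0
    for m in effort_multipliers:
        em *= m
    return a * kloc**b * em

print(cocomo_effort(50, effort_multipliers=(1.1, 0.9)))  # 50 KLOC project
```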
NASA Astrophysics Data System (ADS)
Rababaah, Haroun; Shirkhodaie, Amir
2009-04-01
Rapidly advancing hardware technology, smart sensors and sensor networks are advancing environment sensing. One major potential of this technology is Large-Scale Surveillance Systems (LS3), especially for homeland security, battlefield intelligence, facility guarding and other civilian applications. Efficient and effective deployment of LS3 requires addressing a number of aspects impacting the scalability of such systems. The scalability factors relate to: computation and memory utilization efficiency; communication bandwidth utilization; network topology (e.g., centralized, ad-hoc, hierarchical or hybrid); network communication protocols and data routing schemes; and local and global data/information fusion schemes for situational awareness. Although many models have been proposed to address one aspect or another of these issues, few have addressed the need for a multi-modality, multi-agent data/information fusion model with characteristics satisfying the requirements of current and future intelligent sensors and sensor networks. In this paper, we present a novel scalable fusion engine for multi-modality multi-agent information fusion for LS3. The new fusion engine is based on a concept we call Energy Logic. Experimental results of this work, compared with a fuzzy logic model, strongly supported the validity of the new model and inspired future directions for different levels of fusion and different applications.
2013-06-01
ERDC/CERL CR-13-5, Ontology for Life-Cycle Modeling of Water Distribution Systems: Application of Model View Definition Attributes. Kristine K. Fallon, Robert A... interior plumbing systems and the information exchange requirements for every participant in the design. The findings were used to develop an
OAST system technology planning
NASA Technical Reports Server (NTRS)
Sadin, S. R.
1978-01-01
The NASA Office of Aeronautics and Space Technology developed a planning model for space technology consisting of a space systems technology model, technology forecasts and technology surveys. The technology model describes candidate space missions through the year 2000 and identifies their technology requirements. The technology surveys and technology forecasts provide, respectively, data on the current status and estimates of the projected status of relevant technologies. These tools are used to further the understanding of the activities and resources required to ensure the timely development of technological capabilities. Technology forecasting in the areas of information systems, spacecraft systems, transportation systems, and power systems are discussed.
Barnes, Marcia A.; Raghubar, Kimberly P.; Faulkner, Heather; Denton, Carolyn A.
2014-01-01
Readers construct mental models of situations described by text to comprehend what they read, updating these situation models based on explicitly described and inferred information about causal, temporal, and spatial relations. Fluent adult readers update their situation models while reading narrative text based in part on spatial location information that is consistent with the perspective of the protagonist. The current study investigates whether children update spatial situation models in a similar way, whether there are age-related changes in children's formation of spatial situation models during reading, and whether measures of the ability to construct and update spatial situation models are predictive of reading comprehension. Typically-developing children from ages 9 through 16 years (n=81) were familiarized with a physical model of a marketplace. Then the model was covered, and children read stories that described the movement of a protagonist through the marketplace and were administered items requiring memory for both explicitly stated and inferred information about the character's movements. Accuracy of responses and response times were evaluated. Results indicated that: (a) location and object information during reading appeared to be activated and updated not simply from explicit text-based information but from a mental model of the real world situation described by the text; (b) this pattern showed no age-related differences; and (c) the ability to update the situation model of the text based on inferred information, but not explicitly stated information, was uniquely predictive of reading comprehension after accounting for word decoding. PMID:24315376
Optimizing Coverage of Three-Dimensional Wireless Sensor Networks by Means of Photon Mapping
2013-12-01
information about the monitored space is sensed?” Solving this formulation of the AGP relies upon the creation of a model describing how a set of... simulated photons will propagate in a 3D virtual environment. Furthermore, the photon model requires an efficient data structure with small memory
Process and representation in graphical displays
NASA Technical Reports Server (NTRS)
Gillan, Douglas J.; Lewis, Robert; Rudisill, Marianne
1993-01-01
Our initial model of graphic comprehension has focused on statistical graphs. Like other models of human-computer interaction, models of graphical comprehension can be used by human-computer interface designers and developers to create interfaces that present information in an efficient and usable manner. Our investigation of graph comprehension addresses two primary questions: how do people represent the information contained in a data graph?; and how do they process information from the graph? The topics of focus for graphic representation concern the features into which people decompose a graph and the representations of the graph in memory. The issue of processing can be further analyzed as two questions: what overall processing strategies do people use?; and what are the specific processing skills required?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brost, Randolph C.; McLendon, William Clarence,
2013-01-01
Modeling geospatial information with semantic graphs enables search for sites of interest based on relationships between features, without requiring strong a priori models of feature shape or other intrinsic properties. Geospatial semantic graphs can be constructed from raw sensor data with suitable preprocessing to obtain a discretized representation. This report describes initial work toward extending geospatial semantic graphs to include temporal information, and initial results from applying semantic graph techniques to SAR image data. We describe an efficient graph structure that includes geospatial and temporal information and is designed to support simultaneous spatial and temporal search queries. We also report a preliminary implementation of feature recognition, semantic graph modeling, and graph search based on input SAR data. The report concludes with lessons learned and suggestions for future improvements.
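A toy illustration of a combined spatial-temporal query over such a graph follows; the node/edge schema and the networkx representation are our own invention, not the report's data structure.

```python
import networkx as nx

# Toy geospatial-temporal semantic graph: nodes are extracted features
# with a location and an observation time; edges carry spatial relations.
G = nx.Graph()
G.add_node("building_7", kind="building", lonlat=(-106.6, 35.1), t="2013-01-05")
G.add_node("road_12", kind="road", lonlat=(-106.6, 35.1), t="2013-01-05")
G.add_edge("building_7", "road_12", relation="adjacent_to")

# Simultaneous spatial/temporal query: buildings adjacent to a road,
# observed during January 2013.
hits = [n for n, d in G.nodes(data=True)
        if d["kind"] == "building" and d["t"].startswith("2013-01")
        and any(G.edges[n, m]["relation"] == "adjacent_to"
                and G.nodes[m]["kind"] == "road" for m in G[n])]
print(hits)
```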
Enhanced project management tool
NASA Technical Reports Server (NTRS)
Hsu, Chen-Jung (Inventor); Patel, Hemil N. (Inventor); Maluf, David A. (Inventor); Moh Hashim, Jairon C. (Inventor); Tran, Khai Peter B. (Inventor)
2012-01-01
A system for managing a project that includes multiple tasks and a plurality of workers. Input information includes characterizations based upon a human model, a team model and a product model. Periodic reports, such as one or more of a monthly report, a task plan report, a schedule report, a budget report and a risk management report, are generated and made available for display or further analysis or collection into a customized report template. An extensible database allows searching for information based upon context and upon content. Seven different types of project risks are addressed, including non-availability of required skill mix of workers. The system can be configured to exchange data and results with corresponding portions of similar project analyses, and to provide user-specific access to specified information.
Improving the forecast for biodiversity under climate change.
Urban, M C; Bocedi, G; Hendry, A P; Mihoub, J-B; Pe'er, G; Singer, A; Bridle, J R; Crozier, L G; De Meester, L; Godsoe, W; Gonzalez, A; Hellmann, J J; Holt, R D; Huth, A; Johst, K; Krug, C B; Leadley, P W; Palmer, S C F; Pantel, J H; Schmitz, A; Zollner, P A; Travis, J M J
2016-09-09
New biological models are incorporating the realistic processes underlying biological responses to climate change and other human-caused disturbances. However, these more realistic models require detailed information, which is lacking for most species on Earth. Current monitoring efforts mainly document changes in biodiversity, rather than collecting the mechanistic data needed to predict future changes. We describe and prioritize the biological information needed to inform more realistic projections of species' responses to climate change. We also highlight how trait-based approaches and adaptive modeling can leverage sparse data to make broader predictions. We outline a global effort to collect the data necessary to better understand, anticipate, and reduce the damaging effects of climate change on biodiversity. Copyright © 2016, American Association for the Advancement of Science.
Advanced information processing system: Inter-computer communication services
NASA Technical Reports Server (NTRS)
Burkhardt, Laura; Masotto, Tom; Sims, J. Terry; Whittredge, Roy; Alger, Linda S.
1991-01-01
The purpose is to document the functional requirements and detailed specifications for the Inter-Computer Communications Services (ICCS) of the Advanced Information Processing System (AIPS). An introductory section is provided to outline the overall architecture and functional requirements of the AIPS and to present an overview of the ICCS. An overview of the AIPS architecture as well as a brief description of the AIPS software is given. The guarantees of the ICCS are provided, and the ICCS is described as a seven-layered International Standards Organization (ISO) Model. The ICCS functional requirements, functional design, and detailed specifications as well as each layer of the ICCS are also described. A summary of results and suggestions for future work are presented.
NASA Astrophysics Data System (ADS)
Adams, R.; Quinn, P. F.; Bowes, M. J.
2014-09-01
A model for simulating runoff pathways and water quality fluxes has been developed using the Minimum Information Requirement (MIR) approach. The model, the Catchment Runoff Attenuation Flux Tool (CRAFT), is applicable to meso-scale catchments and focusses primarily on the hydrological pathways that mobilise nutrients. Hence CRAFT can be used to investigate the impact of management intervention strategies designed to reduce the loads of nutrients entering receiving watercourses. The model can help policy makers, for example in Europe, meet water quality targets and consider methods to obtain "good" ecological status. A case study of the 414 km² Frome catchment, Dorset, UK, is described here as an application of the CRAFT model. The model was primarily calibrated on ten years of weekly data to reproduce the observed flows and nutrient (nitrate-nitrogen, N, and phosphorus, P) concentrations. Data from two years of sub-daily, high-resolution monitoring at the same site were also analysed. These data highlighted some additional signals in the nutrient flux, particularly of soluble reactive phosphorus, that were not observable in the weekly data. This analysis prompted the choice of a daily timestep as the minimum information requirement for this meso-scale modelling study. A management intervention scenario was also run to show how the model can support catchment managers in investigating the effect of reducing the concentrations of N and P in the various flow pathways. This scale-appropriate modelling tool can help policy makers consider a range of strategies to meet the European Union (EU) water quality targets for this type of catchment.
Optimal Network Modularity for Information Diffusion
NASA Astrophysics Data System (ADS)
Nematzadeh, Azadeh; Ferrara, Emilio; Flammini, Alessandro; Ahn, Yong-Yeol
2014-08-01
We investigate the impact of community structure on information diffusion with the linear threshold model. Our results demonstrate that modular structure may have counterintuitive effects on information diffusion when social reinforcement is present. We show that strong communities can facilitate global diffusion by enhancing local, intracommunity spreading. Using both analytic approaches and numerical simulations, we demonstrate the existence of an optimal network modularity, where global diffusion requires the minimal number of early adopters.
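As a sketch of the dynamics studied here, the linear threshold rule can be simulated on a two-community network in a few lines of Python; the networkx generator, threshold value, and seeding below are illustrative assumptions, not the authors' exact setup:

import random
import networkx as nx

def linear_threshold(G, seeds, theta=0.4):
    """Spread until a fixed point: a node activates once the fraction
    of its active neighbors reaches the threshold theta."""
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for n in G.nodes():
            if n in active:
                continue
            nbrs = list(G.neighbors(n))
            if nbrs and sum(v in active for v in nbrs) / len(nbrs) >= theta:
                active.add(n)
                changed = True
    return active

# Two equal communities; p_out controls inter-community links (modularity).
G = nx.planted_partition_graph(2, 500, p_in=0.02, p_out=0.002, seed=1)
seeds = random.Random(1).sample(range(500), 25)  # early adopters, community 0
print(len(linear_threshold(G, seeds)) / G.number_of_nodes())

Sweeping p_out while tracking the final adoption fraction reproduces the qualitative finding described: too little inter-community connectivity traps the cascade in one community, while too much dilutes the local reinforcement needed to start it.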
Haeufle, D F B; Günther, M; Wunner, G; Schmitt, S
2014-01-01
In biomechanics and biorobotics, muscles are often associated with reduced movement control effort and simplified control compared to technical actuators. This is based on evidence that the nonlinear muscle properties positively influence movement control. It remains open, however, how to quantify the simplicity aspect of control effort and compare it between systems. Physical measures, such as energy consumption, stability, or jerk, have already been applied to compare biological and technical systems. Here a physical measure of control effort based on information entropy is presented. The idea is that control is simpler if a specific movement is generated with less processed sensor information, depending on the control scheme and the physical properties of the systems being compared. By calculating the Shannon information entropy of all sensor signals required for control, an information cost function can be formulated allowing the comparison of models of biological and technical control systems. Exemplarily applied to (bio-)mechanical models of hopping, the method reveals that the information required for generating hopping with a muscle driven by a simple reflex control scheme is only I=32 bits, versus I=660 bits with a DC motor and a proportional-differential controller. This approach to quantifying control effort captures the simplicity of a control scheme and can be used to compare completely different actuators and control approaches.
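The entropy-based cost can be sketched as follows; the bin counts and toy signals are assumptions for illustration, not the paper's hopping models:

import numpy as np

def signal_entropy_bits(signal, n_bins=16):
    """Shannon entropy (bits) of a quantized sensor signal."""
    counts, _ = np.histogram(signal, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def control_information_cost(sensor_signals, n_bins=16):
    """Total information (bits) in all sensor signals a controller reads."""
    return sum(signal_entropy_bits(s, n_bins) for s in sensor_signals)

# Toy comparison: a reflex controller reading one coarse muscle-length
# signal vs. a PD controller reading position and velocity at higher
# resolution (more bins = more processed sensor information).
t = np.linspace(0, 1, 1000)
reflex_inputs = [np.sin(2 * np.pi * t)]
pd_inputs = [np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)]
print(control_information_cost(reflex_inputs, n_bins=8),
      control_information_cost(pd_inputs, n_bins=64))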
Fast Conceptual Cost Estimating of Aerospace Projects Using Historical Information
NASA Technical Reports Server (NTRS)
Butts, Glenn
2007-01-01
Accurate estimates can be created in less than a minute by applying powerful techniques and algorithms to create an Excel-based parametric cost model. In five easy steps you will learn how to normalize your company's historical cost data to the new project parameters. This paper provides a complete, easy-to-understand, step-by-step how-to guide. Such a guide does not seem to currently exist. Over 2,000 hours of research, data collection, and trial and error, and thousands of lines of Excel Visual Basic for Applications (VBA) code were invested in developing these methods. While VBA is not required to use this information, it increases the power and aesthetics of the model. Implementing all of the steps described, while not required, will increase the accuracy of the results.
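A sketch of the kind of normalization step such a parametric model performs, shown in Python rather than the paper's Excel/VBA setting; the inflation indices and weight exponent below are hypothetical placeholders, not the paper's algorithm:

def normalize_cost(hist_cost, hist_weight, new_weight,
                   hist_year_index, new_year_index, exponent=0.7):
    """Scale a historical cost to a new project: escalate to current-year
    dollars, then apply a weight-based power-law scaling."""
    escalated = hist_cost * (new_year_index / hist_year_index)
    return escalated * (new_weight / hist_weight) ** exponent

# A $12M item built when the cost index was 0.72, re-estimated for a
# 40% heavier design in current dollars (index 1.00):
print(f"${normalize_cost(12e6, 250, 350, 0.72, 1.00):,.0f}")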
Application of information technology to the National Launch System
NASA Technical Reports Server (NTRS)
Mauldin, W. T.; Smith, Carolyn L.; Monk, Jan C.; Davis, Steve; Smith, Marty E.
1992-01-01
The approach to the development of the Unified Information System (UNIS), which is to provide in a timely manner all the information required to manage, design, manufacture, integrate, test, launch, operate, and support the National Launch System (NLS), is described, along with current and planned capabilities. STESYM, a model of the Space Transportation Main Engine (STME) development program, comprises a collection of data models that can be grouped into two primary models: the Engine Infrastructure Model (ENGIM) and the Engine Integrated Cost Model (ENGICOM). ENGIM is an end-to-end model of the infrastructure needed to perform the fabrication, assembly, and testing of the STME program and its components. Together, UNIS and STESYM are to provide NLS managers and engineers with the ability to access various types and files of data quickly and use that data to assess the capabilities of the STME program.
Dynamic access control model for privacy preserving personalized healthcare in cloud environment.
Son, Jiseong; Kim, Jeong-Dong; Na, Hong-Seok; Baik, Doo-Kwon
2015-01-01
When sharing and storing healthcare data in a cloud environment, access control is a central issue for preserving data privacy, as a patient's personal health data may be accessed without permission by many stakeholders. Specifically, dynamic authorization for access to data is required because personal health data is stored in cloud storage via wearable devices. Therefore, we propose a dynamic access control model for preserving the privacy of personal healthcare data in a cloud environment. The proposed model considers context information for dynamic access. According to the proposed model, access control can be dynamically determined by changing context information; this means that even for a subject with the same role in the cloud, access permission is defined differently depending on the context information and access condition. Furthermore, we experimentally demonstrate the ability of the proposed model to provide correct responses by representing dynamic access decisions in real-life personalized healthcare system scenarios.
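A minimal sketch of a context-dependent access decision of the kind such a model formalizes; the roles, attributes, and rules here are invented for illustration, not the paper's policy set:

from dataclasses import dataclass

@dataclass
class Context:
    role: str          # e.g., "doctor", "nurse"
    location: str      # e.g., "hospital", "home"
    time_of_day: int   # 0-23
    emergency: bool

def access_decision(ctx: Context, data_category: str) -> bool:
    """Same role, different permission depending on context."""
    if ctx.emergency:
        return True                          # break-glass access
    if data_category == "vital_signs":
        return ctx.role in {"doctor", "nurse"}
    if data_category == "full_record":
        # Doctors only, and only from the hospital during working hours.
        return (ctx.role == "doctor" and ctx.location == "hospital"
                and 8 <= ctx.time_of_day < 18)
    return False

print(access_decision(Context("doctor", "home", 22, False), "full_record"))      # False
print(access_decision(Context("doctor", "hospital", 10, False), "full_record"))  # True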
Enterprise systems security management: a framework for breakthrough protection
NASA Astrophysics Data System (ADS)
Farroha, Bassam S.; Farroha, Deborah L.
2010-04-01
Securing the DoD information network is a tremendous task due to its size, access locations, and the number of network intrusion attempts on a daily basis. This analysis investigates methods and architecture options to deliver capabilities for a secure information-sharing environment. Crypto-binding and intelligent access controls are basic requirements for secure information sharing in a net-centric environment. We introduce many of the new technology components to secure the enterprise. Cooperative mission requirements lead to developing automatic data discovery, with data stewards granting access to Cross Domain (CD) data repositories or live streaming data. Multiple architecture models are investigated to determine best-of-breed approaches, including SOA and private/public clouds.
Olvingson, C; Hallberg, N; Timpka, T; Lindqvist, K
2002-01-01
To evaluate Use Case Maps (UCMs) as a technique for Requirements Engineering (RE) in the development of information systems with functions for spatial analyses in inter-organizational public health settings. In this study, Participatory Action Research (PAR) is used to explore the UCM notation for requirements elicitation and to gather the opinions of the users. The Delphi technique is used to reach consensus in the construction of UCMs. The results show that UCMs can provide a visualization of the system's functionality and, in combination with PAR, provide a sound basis for gathering requirements in inter-organizational settings. UCMs were found to represent a suitable level for describing the organization and the dynamic flux of information, including spatial resolution, to all stakeholders. Moreover, by using PAR, the voices of the users and their tacit knowledge are captured. Further, UCMs are found useful in generating intuitive requirements through the creation of use cases. With UCMs and PAR it is possible to study the effects of design changes in the general information display and the spatial resolution in the same context. Both requirements on the information system in general and on the functions for spatial analyses can be elicited by identifying the different responsibilities and the demands on spatial resolution associated with the actions of each administrative unit. However, the development process for UCMs is not well documented and needs further investigation and formulation of guidelines.
Niu, Mutian; Kebreab, Ermias; Hristov, Alexander N; Oh, Joonpyo; Arndt, Claudia; Bannink, André; Bayat, Ali R; Brito, André F; Boland, Tommy; Casper, David; Crompton, Les A; Dijkstra, Jan; Eugène, Maguy A; Garnsworthy, Phil C; Haque, Md Najmul; Hellwing, Anne L F; Huhtanen, Pekka; Kreuzer, Michael; Kuhla, Bjoern; Lund, Peter; Madsen, Jørgen; Martin, Cécile; McClelland, Shelby C; McGee, Mark; Moate, Peter J; Muetzel, Stefan; Muñoz, Camila; O'Kiely, Padraig; Peiren, Nico; Reynolds, Christopher K; Schwarm, Angela; Shingfield, Kevin J; Storlien, Tonje M; Weisbjerg, Martin R; Yáñez-Ruiz, David R; Yu, Zhongtang
2018-02-16
Enteric methane (CH4) production from cattle contributes to global greenhouse gas emissions. Measurement of enteric CH4 is complex, expensive, and impractical at large scales; therefore, models are commonly used to predict CH4 production. However, building robust prediction models requires extensive data from animals under different management systems worldwide. The objectives of this study were to (1) collate a global database of enteric CH4 production from individual lactating dairy cattle; (2) determine the availability of key variables for predicting enteric CH4 production (g/day per cow), yield [g/kg dry matter intake (DMI)], and intensity (g/kg energy-corrected milk) and their respective relationships; (3) develop intercontinental and regional models and cross-validate their performance; and (4) assess the trade-off between availability of on-farm inputs and CH4 prediction accuracy. The intercontinental database covered Europe (EU), the United States (US), and Australia (AU). A sequential approach was taken by incrementally adding key variables to develop models with increasing complexity. Methane emissions were predicted by fitting linear mixed models. Within model categories, an intercontinental model with the most available independent variables performed best, with a root mean square prediction error (RMSPE), expressed as a percentage of the mean observed value, of 16.6%, 14.7%, and 19.8% for the intercontinental, EU, and US regions, respectively. Less complex models requiring only DMI had predictive ability comparable to complex models. Enteric CH4 production, yield, and intensity prediction models developed on an intercontinental basis had similar performance across regions; however, intercepts and slopes differed, with implications for prediction. Revised CH4 emission conversion factors for specific regions are required to improve CH4 production estimates in national inventories. In conclusion, information on DMI is required for good prediction, and other factors, such as dietary neutral detergent fiber (NDF) concentration, improve the prediction. For enteric CH4 yield and intensity prediction, information on milk yield and composition is required for better estimation. © 2018 John Wiley & Sons Ltd.
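For reference, the error statistic reported here is conventionally defined as (standard definition, stated for clarity rather than quoted from the paper):

\[ \mathrm{RMSPE}\,(\%) \;=\; \frac{100}{\bar{O}} \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(O_i - P_i\right)^2} \]

where \(O_i\) and \(P_i\) are the observed and predicted CH4 values and \(\bar{O}\) is the mean observed value.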
Standardized reporting of functioning information on ICF-based common metrics.
Prodinger, Birgit; Tennant, Alan; Stucki, Gerold
2018-02-01
In clinical practice and research, a variety of clinical data collection tools are used to collect information on people's functioning for clinical practice, research, and national health information systems. Reporting on ICF-based common metrics enables standardized documentation of functioning information in national health information systems. The objective of this methodological note on applying the ICF in rehabilitation is to demonstrate how to report functioning information collected with a data collection tool on ICF-based common metrics. We first specify the requirements for the standardized reporting of functioning information. Secondly, we introduce the methods needed for transforming functioning data to ICF-based common metrics. Finally, we provide an example. The requirements for standardized reporting are as follows: 1) a common conceptual framework to enable content comparability between any health information; and 2) a measurement framework so that scores from two or more clinical data collection tools can be directly compared. The methods needed to achieve these requirements are the ICF Linking Rules and the Rasch measurement model. Using data collected with the 36-item Short Form Health Survey (SF-36), the World Health Organization Disability Assessment Schedule 2.0 (WHODAS 2.0), and the Stroke Impact Scale 3.0 (SIS 3.0), the application of standardized reporting based on common metrics is demonstrated. A subset of items from the three tools, linked to common chapters of the ICF (d4 Mobility, d5 Self-care and d6 Domestic life), were entered as "super items" into the Rasch model. Good fit was achieved, with no residual local dependency and a unidimensional metric. A transformation table allows for comparison between scales, and between a scale and the reporting common metric. Being able to report functioning information collected with commonly used clinical data collection tools on ICF-based common metrics enables clinicians and researchers to continue using their tools while still being able to compare and aggregate the information within and across tools.
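The measurement framework invoked here is the Rasch model. In its simplest dichotomous form, the probability that person n succeeds on item i depends only on the difference between person ability and item difficulty (the study itself enters polytomous "super items", a generalization of this form):

\[ P(X_{ni} = 1) \;=\; \frac{e^{\,\theta_n - b_i}}{1 + e^{\,\theta_n - b_i}} \]

It is this separation of person parameters \(\theta_n\) from item parameters \(b_i\) that makes a common metric, and hence a transformation table between instruments, possible.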
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-04
...On February 4, 2009, President Obama signed the Children's Health Insurance Program Reauthorization Act of 2009 (CHIPRA, Pub. L. 111-3). CHIPRA includes a requirement that the Departments of Labor and Health and Human Services develop a model notice for employers to use to inform employees of potential opportunities currently available in the State in which the employee resides for group health plan premium assistance under Medicaid and the Children's Health Insurance Program (CHIP). The Department of Labor (Department) is required to provide the model notice to employers within one year of CHIPRA's enactment. This document announces the availability of a Model Employer CHIP Notice. This notice also requests comments regarding compliance with the Employer CHIP Notice requirement for use in the development of future compliance assistance materials and/or regulations.
An online database for informing ecological network models: http://kelpforest.ucsc.edu.
Beas-Luna, Rodrigo; Novak, Mark; Carr, Mark H; Tinker, Martin T; Black, August; Caselle, Jennifer E; Hoban, Michael; Malone, Dan; Iles, Alison
2014-01-01
Ecological network models and analyses are recognized as valuable tools for understanding the dynamics and resiliency of ecosystems, and for informing ecosystem-based approaches to management. However, few databases exist that can provide the life history, demographic and species interaction information necessary to parameterize ecological network models. Faced with the difficulty of synthesizing the information required to construct models for kelp forest ecosystems along the West Coast of North America, we developed an online database (http://kelpforest.ucsc.edu/) to facilitate the collation and dissemination of such information. Many of the database's attributes are novel, yet the structure is applicable and adaptable to other ecosystem modeling efforts. Information for each taxonomic unit includes stage-specific life history, demography, and body-size allometries. Species interactions include trophic, competitive, facilitative, and parasitic forms. Each data entry is temporally and spatially explicit. The online data entry interface allows researchers anywhere to contribute and access information. Quality control is facilitated by attributing each entry to unique contributor identities and source citations. The database has proven useful as an archive of species and ecosystem-specific information in the development of several ecological network models, for informing management actions, and for education purposes (e.g., undergraduate and graduate training). To facilitate adaptation of the database by other researchers for other ecosystems, the code and technical details on how to customize this database and apply it to other ecosystems are freely available at the following link (https://github.com/kelpforest-cameo/databaseui).
Improved Traceability of Mission Concept to Requirements Using Model Based Systems Engineering
NASA Technical Reports Server (NTRS)
Reil, Robin
2014-01-01
Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the traditional document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g., time and cost saved, increased product quality). This thesis presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to analyze the completeness and consistency of the requirements to the mission concept. Overall experience and methodology are presented for both the MBSE and original DBSE design efforts of SporeSat.
NASA Technical Reports Server (NTRS)
Rasmussen, Robert; Bennett, Matthew
2006-01-01
The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture. A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are kept consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.
75 FR 64909 - Fiduciary Requirements for Disclosure in Participant-Directed Individual Account Plans
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-20
...- related information in a form that encourages and facilitates a comparative review among a plan's... alternatives, and, specifically, how participants would react to the Model Comparative Chart for plan... regulation and Model Comparative Chart. Set forth below is an overview of the final regulations and a...
Modelling and Implementation of Catalogue Cards Using FreeMarker
ERIC Educational Resources Information Center
Radjenovic, Jelen; Milosavljevic, Branko; Surla, Dusan
2009-01-01
Purpose: The purpose of this paper is to report on a study involving the specification (using Unified Modelling Language (UML) 2.0) of information requirements and implementation of the software components for generating catalogue cards. The implementation in a Java environment is developed using the FreeMarker software.…
49 CFR 599.302 - Dealer application for reimbursement-submission, contents.
Code of Federal Regulations, 2011 CFR
2011-10-01
...) NATIONAL HIGHWAY TRAFFIC SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) REQUIREMENTS AND... organization that is the purchaser. (B) Residence address (or, for an organization, business address). The full... number. (ii) Trade-in vehicle information. (A) Make. The make of the vehicle. (B) Model. The model of the...
Collaborative Development: A New Culture Affects an Old Organization
ERIC Educational Resources Information Center
Phelps, Jim; Ruzicka, Terry
2008-01-01
At the University of Wisconsin (UW)-Madison, the Registrar's Office and the Division of Information Technology (DoIT) apply a collaborative development process to joint projects. This model differs from a "waterfall" model in that technical and functional staff work closely to develop requirements, prototypes, and the product throughout…
Using Robust Variance Estimation to Combine Multiple Regression Estimates with Meta-Analysis
ERIC Educational Resources Information Center
Williams, Ryan
2013-01-01
The purpose of this study was to explore the use of robust variance estimation for combining commonly specified multiple regression models and for combining sample-dependent focal slope estimates from diversely specified models. The proposed estimator obviates traditionally required information about the covariance structure of the dependent…
Role of Discrepant Questioning Leading to Model Element Modification
ERIC Educational Resources Information Center
Rea-Ramirez, Mary Anne; Nunez-Oviedo, Maria Cecilia; Clement, John
2009-01-01
Discrepant questioning is a teaching technique that can help students "unlearn" misconceptions and process science ideas for deep understanding. Discrepant questioning is a technique in which teachers question students in a way that requires them to examine their ideas or models, without giving information prematurely to the student or passing…
In recent years the applications of regional air quality models are continuously being extended to address atmospheric pollution phenomenon from local to hemispheric spatial scales over time scales ranging from episodic to annual. The need to represent interactions between physic...
Kim, Kwang-Yon; Shin, Seong Eun; No, Kyoung Tai
2015-01-01
Objectives: For successful adoption of legislation controlling the registration and assessment of chemical substances, it is important to obtain sufficient toxicological experimental evidence and other related information. It is also essential to obtain a sufficient number of predicted risk and toxicity results. In particular, methods for predicting the toxicities of chemical substances during acquisition of the required data ultimately become an economical way of dealing with new substances in the future. Although the need for such methods is gradually increasing, the required information about their reliability and applicability range has not been systematically provided. Methods: There are various representative environmental and human toxicity models based on quantitative structure-activity relationships (QSAR). Here, we secured 10 representative QSAR-based prediction models, and the associated information, that can make predictions about substances expected to be regulated. We used models that predict and confirm the usability of the information expected to be collected and submitted according to the legislation. After collecting and evaluating each predictive model and its relevant data, we prepared methods for quantifying scientific validity and reliability, which are essential conditions for using predictive models. Results: We calculated predicted values for the models. Furthermore, we deduced and compared the adequacies of the models using the Alternative Non-testing method assessed for Registration, Evaluation, Authorization, and Restriction of Chemicals (REACH) Substances scoring system, and deduced the applicability domains for each model. Additionally, we calculated and compared inclusion rates of substances expected to be regulated, to confirm applicability. Conclusions: We evaluated and compared the data, adequacy, and applicability of our selected QSAR-based toxicity prediction models, and included them in a database. Based on these data, we aimed to construct a system that can be used with predicted toxicity results. Furthermore, by presenting the suitability of individual predicted results, we aimed to provide a foundation that could be used in actual assessments and regulations. PMID:26206368
Mixed integer programming model for optimizing the layout of an ICU vehicle.
Alejo, Javier Sánchez; Martín, Modoaldo Garrido; Ortega-Mier, Miguel; García-Sánchez, Alvaro
2009-12-08
This paper presents a Mixed Integer Programming (MIP) model for designing the layout of the Intensive Care Units' (ICUs) patient care space. In particular, this MIP model was developed for optimizing the layout for materials to be used in interventions. This work was developed within the framework of a joint project between the Madrid Technical University and the Medical Emergency Services of the Madrid Regional Government (SUMMA 112). The first task was to identify the relevant information to define the characteristics of the new vehicles and, in particular, to obtain a satisfactory interior layout to locate all the necessary materials. This information was gathered from health workers related to ICUs. With that information, an optimization model was developed in order to obtain a solution. From the MIP model, a first solution was obtained, consisting of a grid to locate the different materials needed for the ICUs. The outcome from the MIP model was discussed with health workers to tune the solution, and after slightly altering that solution to meet some requirements that had not been included in the mathematical model, the eventual solution was approved by the persons responsible for specifying the characteristics of the new vehicles. According to the opinion stated by SUMMA 112's medical group responsible for improving the ambulances (the so-called "coaching group"), the outcome was highly satisfactory. Indeed, the final design served as a basis to draw up the requirements of a public tender. The authors advocate this approach to address similar problems within the field of health services, to improve the efficiency and the effectiveness of the processes involved. Problems such as those in operating rooms or emergency rooms, where the availability of a large amount of material is critical, are eligible to be dealt with in a similar manner.
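A minimal sketch of an assignment MIP of this general shape, written with the PuLP library; the materials, cells, frequencies, and reach costs are hypothetical, and the actual SUMMA 112 model encodes many more requirements:

import pulp

# Frequently used material should land in easily reached cells.
materials = {"airway_kit": 10, "iv_kit": 8, "drug_box": 6}   # use frequency
cells = {"c1": 1.0, "c2": 1.5, "c3": 2.0, "c4": 3.0}         # reach cost

prob = pulp.LpProblem("icu_layout", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (materials, cells), cat=pulp.LpBinary)

# Objective: total frequency-weighted reach cost.
prob += pulp.lpSum(materials[m] * cells[c] * x[m][c]
                   for m in materials for c in cells)
for m in materials:                      # each material gets exactly one cell
    prob += pulp.lpSum(x[m][c] for c in cells) == 1
for c in cells:                          # each cell holds at most one material
    prob += pulp.lpSum(x[m][c] for m in materials) <= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print({m: next(c for c in cells if x[m][c].value() == 1) for m in materials})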
Pressurization and expulsion of cryogenic liquids: Generic requirements for a low gravity experiment
NASA Technical Reports Server (NTRS)
Vandresar, Neil T.; Stochl, Robert J.
1991-01-01
Requirements are presented for an experiment designed to obtain data for the pressurization and expulsion of a cryogenic supply tank in a low gravity environment. These requirements are of a generic nature and applicable to any cryogenic fluid of interest, condensible or non-condensible pressurants, and various low gravity test platforms such as the Space Shuttle or a free-flyer. Background information, the thermophysical process, preliminary analytical modeling, and experimental requirements are discussed. Key parameters, measurements, hardware requirements, procedures, a test matrix, and data analysis are outlined.
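For orientation, a common first-order relation in pressurization analysis (a textbook estimate, not taken from this report) sizes the pressurant gas mass from the ideal gas law:

\[ m_{\mathrm{press}} \;\approx\; \frac{P\,V_{\mathrm{ullage}}}{R_{\mathrm{specific}}\,T_{\mathrm{press}}} \]

In practice this is multiplied by an empirically determined collapse factor, because heat and mass transfer at the liquid interface cool the pressurant and raise the required mass; these effects are strongly gravity-dependent, which is part of what motivates low-gravity data of the kind specified here.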
NASA Astrophysics Data System (ADS)
Kuppel, S.; Soulsby, C.; Maneta, M. P.; Tetzlaff, D.
2017-12-01
The utility of field measurements to help constrain the model solution space and identify feasible model configurations has become an increasingly central issue in hydrological model calibration. Sufficiently informative observations are necessary to ensure that the goodness of model-data fit attained effectively translates into more physically sound information for the internal model parameters, as a basis for model structure evaluation. Here we assess to what extent the diversity of information content can inform on the suitability of a complex, process-based ecohydrological model to simulate key water flux and storage dynamics at a long-term research catchment in the Scottish Highlands. We use the fully distributed ecohydrological model EcH2O, calibrated against long-term datasets that encompass hydrologic and energy exchanges and ecological measurements: stream discharge, soil moisture, net radiation above canopy, and pine stand transpiration. Diverse combinations of these constraints were applied using a multi-objective cost function specifically designed to avoid compensatory effects between model-data metrics. Results revealed that calibration against virtually all datasets enabled the model to reproduce streamflow reasonably well. However, parameterizing the model to adequately capture local flux and storage dynamics, such as soil moisture or transpiration, required calibration with specific observations. This indicates that the footprint of the information contained in observations varies for each type of dataset, and that a diverse database, informing about the different compartments of the domain, is critical to test hypotheses of catchment function and identify a consistent model parameterization. The results foster confidence in using EcH2O to help understand current and future ecohydrological couplings in Northern catchments.
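One simple way to build a multi-objective cost that avoids compensation between datasets is to score a parameter set by its worst-fitting component, so that a good streamflow fit cannot mask a poor soil moisture fit. This is a sketch of the idea, not the study's exact function:

def non_compensatory_cost(metrics):
    """metrics: dict mapping dataset name -> efficiency in (-inf, 1],
    e.g. Nash-Sutcliffe. Each is turned into a distance from perfect
    fit; taking the max ensures no component can be offset by another."""
    distances = {name: 1.0 - eff for name, eff in metrics.items()}
    return max(distances.values())

fits = {"discharge": 0.82, "soil_moisture": 0.55,
        "net_radiation": 0.90, "transpiration": 0.61}
print(non_compensatory_cost(fits))   # 0.45, driven by soil moisture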
NADM Conceptual Model 1.0 -- A Conceptual Model for Geologic Map Information
2004-01-01
Executive Summary -- The NADM Data Model Design Team was established in 1999 by the North American Geologic Map Data Model Steering Committee (NADMSC) with the purpose of drafting a geologic map data model for consideration as a standard for developing interoperable geologic map-centered databases by state, provincial, and federal geological surveys. The model is designed to be a technology-neutral conceptual model that can form the basis for a web-based interchange format using evolving information technology (e.g., XML, RDF, OWL), and guide implementation of geoscience databases in a common conceptual framework. The intended purpose is to allow geologic information sharing between geologic map data providers and users, independent of local information system implementation. The model emphasizes geoscience concepts and relationships related to information presented on geologic maps. Design has been guided by an informal requirements analysis, documentation of existing databases, technology developments, and other standardization efforts in the geoscience and computer-science communities. A key aspect of the model is the notion that representation of the conceptual framework (ontology) that underlies geologic map data must be part of the model, because this framework changes with time and understanding, and varies between information providers. The top level of the model distinguishes geologic concepts, geologic representation concepts, and metadata. The geologic representation part of the model provides a framework for representing the ontology that underlies geologic map data through a controlled vocabulary, and for establishing the relationships between this vocabulary and a geologic map visualization or portrayal. Top-level geologic classes in the model are Earth material (substance), geologic unit (parts of the Earth), geologic age, geologic structure, fossil, geologic process, geologic relation, and geologic event.
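The top-level geologic classes might be rendered schematically as follows (an illustrative sketch only; NADM itself is specified as a technology-neutral UML conceptual model, not as code):

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class EarthMaterial:
    """A substance, e.g. granite or shale."""
    name: str

@dataclass
class GeologicAge:
    name: str  # e.g. "Late Cretaceous"

@dataclass
class GeologicUnit:
    """A part of the Earth, described by its materials and age."""
    name: str
    materials: List[EarthMaterial] = field(default_factory=list)
    age: Optional[GeologicAge] = None

unit = GeologicUnit("Pierre Shale",
                    materials=[EarthMaterial("shale")],
                    age=GeologicAge("Late Cretaceous"))
print(unit)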
NASA Technical Reports Server (NTRS)
Throop, David R.
1992-01-01
The paper examines the requirements for the reuse of computational models employed in model-based reasoning (MBR) to support automated inference about mechanisms. Areas in which the theory of MBR is not yet completely adequate for using the information that simulations can yield are identified, and recent work in these areas is reviewed. It is argued that using MBR along with simulations forces the use of specific fault models. Fault models are used so that a particular fault can be instantiated into the model and run. This in turn implies that the component specification language needs to be capable of encoding any fault that might need to be sensed or diagnosed. It also means that the simulation code must anticipate all these faults at the component level.
Functional Requirements for Information Resource Provenance on the Web
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCusker, James P.; Lebo, Timothy; Graves, Alvaro
We provide a means to formally explain the relationship between HTTP URLs and the representations returned when they are requested. According to existing World Wide Web architecture, the URL serves as an identifier for a semiotic referent, while the document returned via HTTP serves as a representation of the same referent. This begins with two sides of a semiotic triangle; the third side is the relationship between the URL and the representation received. We complete this description by extending the library science resource model Functional Requirements for Bibliographic Records (FRBR) with cryptographic message and content digests to create a Functional Requirements for Information Resources (FRIR). We show how applying the FRIR model to HTTP GET and POST transactions disambiguates the many relationships between a given URL and all representations received from its request, provides fine-grained explanations that are complementary to existing explanations of web resources, and integrates easily into the emerging W3C provenance standard.
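The core digest idea can be illustrated in a few lines of Python (a schematic sketch; FRIR proper is an RDF-based extension of FRBR, and the field names below are invented):

import hashlib
import urllib.request

def describe_representation(url):
    """Fetch a URL and compute a content digest that identifies the
    particular representation received: the same URL may yield
    different digests at different times or for different clients."""
    with urllib.request.urlopen(url) as resp:
        body = resp.read()
    return {
        "url": url,                                  # the identifier
        "sha256": hashlib.sha256(body).hexdigest(),  # this representation
        "length": len(body),
    }

# Two GETs of a dynamic page disambiguate into distinct representations:
# print(describe_representation("https://example.org/"))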
Improving size estimates of open animal populations by incorporating information on age
Manly, Bryan F.J.; McDonald, Trent L.; Amstrup, Steven C.; Regehr, Eric V.
2003-01-01
Around the world, a great deal of effort is expended each year to estimate the sizes of wild animal populations. Unfortunately, population size has proven to be one of the most intractable parameters to estimate. The capture-recapture estimation models most commonly used (of the Jolly-Seber type) are complicated and require numerous, sometimes questionable, assumptions. The derived estimates usually have large variances and lack consistency over time. In capture-recapture studies of long-lived animals, the ages of captured animals can often be determined with great accuracy and relative ease. We show how to incorporate age information into size estimates for open populations, where the size changes through births, deaths, immigration, and emigration. The proposed method allows more precise estimates of population size than the usual models, and it can provide these estimates from two sample occasions rather than the three usually required. Moreover, this method does not require specialized programs for capture-recapture data; researchers can derive their estimates using the logistic regression module in any standard statistical package.
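The closing observation, that the estimates fall out of a standard logistic regression routine, can be sketched with simulated data; the age effect and data below are invented, and this is not the authors' exact likelihood:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
age = rng.integers(1, 25, size=n)          # known ages of captured animals
# Toy assumption: capture probability declines with age.
p = 1 / (1 + np.exp(-(1.0 - 0.08 * age)))
captured = rng.binomial(1, p)

X = sm.add_constant(age.astype(float))
fit = sm.GLM(captured, X, family=sm.families.Binomial()).fit()
print(fit.params)   # intercept and age effect on capture probability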
Highway Project Delivery Requirements
DOT National Transportation Integrated Search
1998-07-01
The purpose of the Commercial Vehicle Information Systems and Networks Model Deployment Initiative (CVISN MDI) is to demonstrate the technical and institutional feasibility, costs, and benefits of the primary Intelligent Transportation Systems (ITS) ...
Intertemporal consumption with directly measured welfare functions and subjective expectations
Kapteyn, Arie; Kleinjans, Kristin J.; van Soest, Arthur
2010-01-01
Euler equation estimation of intertemporal consumption models requires many, often unverifiable assumptions. These include assumptions on expectations and preferences. We aim at reducing some of these requirements by using direct subjective information on respondents’ preferences and expectations. The results suggest that individually measured welfare functions and expectations have predictive power for the variation in consumption across households. Furthermore, estimates of the intertemporal elasticity of substitution based on the estimated welfare functions are plausible and of a similar order of magnitude as other estimates found in the literature. The model favored by the data only requires cross-section data for estimation. PMID:20442798
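For reference, the canonical consumption Euler equation whose estimation assumptions the paper seeks to relax is

\[ u'(c_t) \;=\; \beta\, \mathbb{E}_t\!\left[(1 + r_{t+1})\, u'(c_{t+1})\right], \]

where \(\beta\) is the subjective discount factor and \(r_{t+1}\) the real return. With CRRA utility \(u(c) = c^{1-\gamma}/(1-\gamma)\), the intertemporal elasticity of substitution is \(1/\gamma\), the quantity the directly measured welfare functions are used to pin down.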