Sample records for reference information model

  1. Requirements for data integration platforms in biomedical research networks: a reference model.

    PubMed

    Ganzinger, Matthias; Knaup, Petra

    2015-01-01

    Biomedical research networks need to integrate research data among their members and with external partners. To support such data sharing activities, an adequate information technology infrastructure is necessary. To facilitate the establishment of such an infrastructure, we developed a reference model for the requirements. The reference model consists of five reference goals and 15 reference requirements. Using the Unified Modeling Language, the goals and requirements are set into relation to each other. In addition, all goals and requirements are described textually in tables. This reference model can be used by research networks as a basis for a resource efficient acquisition of their project specific requirements. Furthermore, a concrete instance of the reference model is described for a research network on liver cancer. The reference model is transferred into a requirements model of the specific network. Based on this concrete requirements model, a service-oriented information technology architecture is derived and also described in this paper.
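    As a rough illustration of how goal-requirement relations of this kind might be represented in software (the identifiers and texts below are invented, not taken from the paper), consider:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the reference model's structure: reference goals
# linked to the reference requirements that support them. The identifiers
# and texts are invented for illustration only.
@dataclass
class Requirement:
    rid: str
    text: str

@dataclass
class Goal:
    gid: str
    text: str
    requirements: list = field(default_factory=list)

g1 = Goal("G1", "Integrate research data across network members")
g1.requirements.append(Requirement("R1", "Agree on a shared exchange format"))
g1.requirements.append(Requirement("R2", "Control data access for partners"))
print(len(g1.requirements))  # 2
```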

  2. A Reference Architecture for Space Information Management

    NASA Technical Reports Server (NTRS)

    Mattmann, Chris A.; Crichton, Daniel J.; Hughes, J. Steven; Ramirez, Paul M.; Berrios, Daniel C.

    2006-01-01

    We describe a reference architecture for space information management systems that elegantly overcomes the rigid design of common information systems in many domains. The reference architecture consists of a set of flexible, reusable, independent models and software components that function in unison, but remain separately managed entities. The main guiding principle of the reference architecture is to separate the various models of information (e.g., data, metadata, etc.) from implemented system code, allowing each to evolve independently. System modularity, systems interoperability, and dynamic evolution of information system components are the primary benefits of the design of the architecture. The architecture requires the use of information models that are substantially more advanced than those used by the vast majority of information systems. These models are more expressive and can be more easily modularized, distributed and maintained than simpler models, e.g., configuration files and data dictionaries. Our current work focuses on formalizing the architecture within a CCSDS Green Book and evaluating the architecture within the context of the C3I initiative.

  3. Requirements for data integration platforms in biomedical research networks: a reference model

    PubMed Central

    Knaup, Petra

    2015-01-01

    Biomedical research networks need to integrate research data among their members and with external partners. To support such data sharing activities, an adequate information technology infrastructure is necessary. To facilitate the establishment of such an infrastructure, we developed a reference model for the requirements. The reference model consists of five reference goals and 15 reference requirements. Using the Unified Modeling Language, the goals and requirements are set into relation to each other. In addition, all goals and requirements are described textually in tables. This reference model can be used by research networks as a basis for a resource efficient acquisition of their project specific requirements. Furthermore, a concrete instance of the reference model is described for a research network on liver cancer. The reference model is transferred into a requirements model of the specific network. Based on this concrete requirements model, a service-oriented information technology architecture is derived and also described in this paper. PMID:25699205

  4. E-KIT: An Electronic-Knowledge Information Tool for Organizing Site Information and Improving Technical Communication with Stakeholders - 13082

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kautsky, Mark; Findlay, Richard C.; Hodges, Rex A.

    2013-07-01

    Managing technical references for projects that have long histories is hampered by the large collection of documents, each of which might contain discrete pieces of information relevant to the site conceptual model. A database application has been designed to improve the efficiency of retrieving technical information for a project. Although many databases are currently used for accessing analytical and geo-referenced data, applications designed specifically to manage technical reference material for projects are scarce. Retrieving site data from the array of available references becomes an increasingly inefficient use of labor. The electronic-Knowledge Information Tool (e-KIT) is designed as a project-level resource to access and communicate technical information. The e-KIT is a living tool that grows as new information becomes available, and its value to the project increases as the volume of site information increases. Having all references assembled in one location with complete reference citations and links to elements of the site conceptual model offers a way to enhance communication with outside groups. Both published and unpublished references are incorporated into the e-KIT, and the compendium of references serves as a complete bibliography for the project. (authors)

  5. Land-use history and contemporary management inform an ecological reference model for longleaf pine woodland understory plant communities

    Treesearch

    Lars A. Brudvig; John L. Orrock; Ellen I. Damschen; Cathy D. Collins; Philip G. Hahn; W. Brett Mattingly; Joseph W. Veldman; Joan L. Walker

    2014-01-01

    Ecological restoration is frequently guided by reference conditions describing a successfully restored ecosystem; however, the causes and magnitude of ecosystem degradation vary, making simple knowledge of reference conditions insufficient for prioritizing and guiding restoration. Ecological reference models provide further guidance by quantifying reference conditions...

  6. Effects of Information Status and Uniqueness Status on Referent Management in Discourse Comprehension and Planning

    ERIC Educational Resources Information Center

    Brocher, Andreas; Chiriacescu, Sofiana Iulia; von Heusinger, Klaus

    2018-01-01

    In discourse processing, speakers collaborate toward a shared mental model by establishing and recruiting prominence relations between different discourse referents. In this article we investigate to what extent the possibility to infer a referent's existence from preceding context (as indicated by the referent's information status as inferred or…

  7. Component Models for Fuzzy Data

    ERIC Educational Resources Information Center

    Coppi, Renato; Giordani, Paolo; D'Urso, Pierpaolo

    2006-01-01

    The fuzzy perspective in statistical analysis is first illustrated with reference to the "Informational Paradigm" allowing us to deal with different types of uncertainties related to the various informational ingredients (data, model, assumptions). The fuzzy empirical data are then introduced, referring to "J" LR fuzzy variables as observed on "I"…

  8. Modeling Cross-Situational Word-Referent Learning: Prior Questions

    ERIC Educational Resources Information Center

    Yu, Chen; Smith, Linda B.

    2012-01-01

    Both adults and young children possess powerful statistical computation capabilities--they can infer the referent of a word from highly ambiguous contexts involving many words and many referents by aggregating cross-situational statistical information across contexts. This ability has been explained by models of hypothesis testing and by models of…

  9. Virtual Reference Transcript Analysis: A Few Models.

    ERIC Educational Resources Information Center

    Smyth, Joanne

    2003-01-01

    Describes the introduction of virtual, or digital, reference service at the University of New Brunswick libraries. Highlights include analyzing transcripts from LIVE (Library Information in a Virtual Environment); reference question types; ACRL (Association of College and Research Libraries) information literacy competency standards; and the Big 6…

  10. Applying an Information Problem-Solving Model to Academic Reference Work: Findings and Implications.

    ERIC Educational Resources Information Center

    Cottrell, Janet R.; Eisenberg, Michael B.

    2001-01-01

    Examines the usefulness of the Eisenberg-Berkowitz Information Problem-Solving model as a categorization for academic reference encounters. Major trends in the data include a high proportion of questions about location and access of sources, lack of synthesis or production activities, and consistent presence of system problems that impede the…

  11. Potential for Inclusion of Information Encountering within Information Literacy Models

    ERIC Educational Resources Information Center

    Erdelez, Sanda; Basic, Josipa; Levitov, Deborah D.

    2011-01-01

    Introduction: Information encountering (finding information while searching for some other information) is a type of opportunistic discovery of information that complements purposeful approaches to finding information. The motivation for this paper was to determine if the current models of information literacy instruction refer to information…

  12. Building a Unified Information Network.

    ERIC Educational Resources Information Center

    Avram, Henriette D.

    1988-01-01

    Discusses cooperative efforts between research organizations and libraries to create a national information network. Topics discussed include the Linked System Project (LSP); technical processing versus reference and research functions; Open Systems Interconnection (OSI) Reference Model; the National Science Foundation Network (NSFNET); and…

  13. AP233: An Information Model for Systems Engineering

    NASA Technical Reports Server (NTRS)

    Siebes, Georg

    2009-01-01

    In today's world, information is abundant. We have no problems generating it. But we are challenged to find, organize, and exchange information. A standardized model of information can help. Such a model has nearly completed its development for Systems Engineering. It is referred to as AP233 (AP = Application Protocol).

  14. Dental age estimation in Japanese individuals combining permanent teeth and third molars.

    PubMed

    Ramanan, Namratha; Thevissen, Patrick; Fieuws, Steffen; Willems, G

    2012-12-01

    The study aim was, firstly, to verify the Willems et al. model on a Japanese reference sample; secondly, to develop a Japanese reference model based on the Willems et al. method and to verify it; and thirdly, to analyze the age prediction performance when adding tooth development information from third molars to that from permanent teeth. Retrospectively, 1877 panoramic radiographs were selected in the age range between 1 and 23 years (1248 children, 629 sub-adults). Dental development was registered applying Demirjian's stages to the mandibular left permanent teeth in children and Köhler's stages to the third molars. The children's data were, firstly, used to validate the Willems et al. model (developed on a Belgian reference sample) and, secondly, split into a training and a test sample. On the training sample a Japanese reference model was developed based on the Willems method. The developed model and the Willems et al. model were verified on the test sample. Regression analysis was used to detect the age prediction performance when adding third molar scores to permanent tooth scores. The validated Willems et al. model provided a mean absolute error of 0.85 and 0.75 years in females and males, respectively. The mean absolute error in the verified Willems et al. and the developed Japanese reference model was 0.85, 0.77 and 0.79, 0.75 years in females and males, respectively. On average a negligible change in root mean square error values was detected when adding third molar scores to permanent tooth scores. The Belgian sample could be used as a reference model to estimate the age of Japanese individuals. Combining information from the third molars and permanent teeth did not provide a clinically significant improvement over age predictions based on permanent teeth information alone.
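    The mean absolute error used to compare the models above reduces to a simple computation; the ages below are invented for illustration, not data from the study:

```python
# Mean absolute error: the average gap between predicted dental age and
# chronological age. The ages below are invented for illustration.
def mean_absolute_error(predicted, actual):
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(actual)

predicted_ages = [8.2, 10.9, 13.4, 7.1]
actual_ages = [9.0, 10.5, 12.8, 7.9]
mae = mean_absolute_error(predicted_ages, actual_ages)
print(round(mae, 2))  # 0.65 years
```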

  15. The combined effects of self-referent information processing and ruminative responses on adolescent depression.

    PubMed

    Black, Stephanie Winkeljohn; Pössel, Patrick

    2013-08-01

    Adolescents who develop depression have worse interpersonal and affective experiences and are more likely to develop substance problems and/or suicidal ideation compared to adolescents who do not develop depression. This study examined the combined effects of negative self-referent information processing and rumination (i.e., brooding and reflection) on adolescent depressive symptoms. It was hypothesized that the interaction of negative self-referent information processing and brooding would significantly predict depressive symptoms, while the interaction of negative self-referent information processing and reflection would not. Adolescents (n = 92; 13-15 years; 34.7% female) participated in a 6-month longitudinal study. Self-report instruments measured depressive symptoms and rumination; a cognitive task measured information processing. Path modelling in Amos 19.0 was used to analyze the data. The interaction of negative information processing and brooding significantly predicted an increase in depressive symptoms 6 months later. The interaction of negative information processing and reflection did not significantly predict depression; however, the model did not meet a priori standards for accepting the null hypothesis. Results suggest clinicians working with adolescents at risk for depression should consider focusing on the reduction of brooding and negative information processing to reduce long-term depressive symptoms.

  16. Approaches to defining reference regimes for river restoration planning

    NASA Astrophysics Data System (ADS)

    Beechie, T. J.

    2014-12-01

    Reference conditions or reference regimes can be defined using three general approaches: historical analysis, contemporary reference sites, and theoretical or empirical models. For large features (e.g., floodplain channels and ponds), historical data and maps are generally reliable. For smaller features (e.g., pools and riffles in small tributaries), field data from contemporary reference sites are a reasonable surrogate for historical data. Models are generally used for features that have no historical information or present-day reference sites (e.g., beaver pond habitat). Each of these approaches contributes to a watershed-wide understanding of current biophysical conditions relative to potential conditions, which not only helps create a guiding vision for restoration but also helps quantify and locate the largest or most important restoration opportunities. Common uses of geomorphic and biological reference conditions include identifying key areas for habitat protection or restoration and informing the choice of restoration targets. Examples of the use of each of these three approaches to define reference regimes in the western USA illustrate how historical information and current research highlight key restoration opportunities, focus restoration effort in areas that can produce the largest ecological benefit, and contribute to estimating restoration potential and assessing the likelihood of achieving restoration goals.

  17. Improved reference models for middle atmosphere ozone

    NASA Technical Reports Server (NTRS)

    Keating, G. M.; Pitts, M. C.; Chen, C.

    1989-01-01

    Improvements are provided for the ozone reference model which is to be incorporated in the COSPAR International Reference Atmosphere (CIRA). The ozone reference model will provide considerable information on the global ozone distribution, including ozone vertical structure as a function of month and latitude from approximately 25 to 90 km, combining data from five recent satellite experiments (Nimbus 7 LIMS, Nimbus 7 SBUV, AE-2 SAGE, Solar Mesosphere Explorer (SME) UVS, and SME IR). The improved models are described and use reprocessed AE-2 SAGE data (sunset) and extend the use of SAGE data from 1981 to the period 1981-1983. Comparisons are shown between the ozone reference model and various nonsatellite measurements at different levels in the middle atmosphere.

  18. A reference model for scientific information interchange

    NASA Technical Reports Server (NTRS)

    Reich, Lou; Sawyer, Don; Davis, Randy

    1993-01-01

    This paper presents an overview of an Information Interchange Reference Model (IIRM) currently being developed by individuals participating in the Consultative Committee for Space Data Systems (CCSDS) Panel 2, the Planetary Data Systems (PDS), and the Committee on Earth Observing Satellites (CEOS). This is an ongoing research activity and is not an official position by these bodies. This reference model provides a framework for describing and assessing current and proposed methodologies for information interchange within and among the space agencies. It is hoped that this model will improve interoperability between the various methodologies. As such, this model attempts to address key information interchange issues as seen by the producers and users of space-related data and to put them into a coherent framework. Information is understood as the knowledge (e.g., the scientific content) represented by data. Therefore, concern is not primarily on mechanisms for transferring data from user to user (e.g., compact disk read-only memory (CD-ROM), wide-area networks, optical tape, and so forth) but on how information is encoded as data and how the information content is maintained with minimal loss or distortion during transmittal. The model assumes open systems, which means that the protocols or methods used should be fully described and the descriptions publicly available. Ideally these protocols are promoted by recognized standards organizations using processes that permit involvement by those most likely to be affected, thereby enhancing the protocol's stability and the likelihood of wide support.

  19. The Reference Encounter Model.

    ERIC Educational Resources Information Center

    White, Marilyn Domas

    1983-01-01

    Develops model of the reference interview which explicitly incorporates human information processing, particularly schema ideas presented by Marvin Minsky and other theorists in cognitive processing and artificial intelligence. Questions are raised concerning use of content analysis of transcribed verbal protocols as methodology for studying…

  20. Information Literacy for Health Professionals: Teaching Essential Information Skills with the Big6 Information Literacy Model

    ERIC Educational Resources Information Center

    Santana Arroyo, Sonia

    2013-01-01

    Health professionals frequently do not possess the necessary information-seeking abilities to conduct an effective search in databases and Internet sources. Reference librarians may teach health professionals these information and technology skills through the Big6 information literacy model (Big6). This article aims to address this issue. It also…

  1. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence from Word Segmentation

    ERIC Educational Resources Information Center

    Phillips, Lawrence; Pearl, Lisa

    2015-01-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's "cognitive plausibility." We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition…

  2. Requirements engineering for cross-sectional information chain models

    PubMed Central

    Hübner, U; Cruel, E; Gök, M; Garthaus, M; Zimansky, M; Remmers, H; Rienhoff, O

    2012-01-01

    Despite the wealth of literature on requirements engineering, little is known about engineering very generic, innovative and emerging requirements, such as those for cross-sectional information chains. The IKM health project aims at building information chain reference models for the care of patients with chronic wounds, cancer-related pain and back pain. Our question therefore was how to appropriately capture information and process requirements that are both generally applicable and practically useful. To this end, we started with recommendations from clinical guidelines and put them up for discussion in Delphi surveys and expert interviews. Despite the heterogeneity we encountered in all three methods, it was possible to obtain requirements suitable for building reference models. We evaluated three modelling languages and then chose to write the models in UML (class and activity diagrams). On the basis of the current project results, the pros and cons of our approach are discussed. PMID:24199080

  3. Concentration and Diversity of Availability and Use in Information Systems: A Positive Reinforcement Model.

    ERIC Educational Resources Information Center

    Rousseau, Ronald

    1992-01-01

    Proposes a mathematical model to explain the observed concentration or diversity of nominal classes in information retrieval systems. The Lorenz Curve is discussed, Information Production Process (IPP) is explained, and a heuristic explanation of circumstances in which the model might be used is offered. (30 references) (LRW)

  4. Theory and Modelling Resources Cookbook

    NASA Astrophysics Data System (ADS)

    Gray, Norman

    This cookbook is intended to assemble references to resources likely to be of interest to theorists and modellers. It's not a collection of standard recipes, but instead a repository of brief introductions to facilities. It includes references to sources of authoritative information, including those Starlink documents most likely to be of interest to theorists. Although the topics are chosen for their relevance to theoretical work, a good proportion of the information should be of interest to all of the astronomical computing community.

  5. A Theory of Information Genetics: How Four Subforces Generate Information and the Implications for Total Quality Knowledge Management.

    ERIC Educational Resources Information Center

    Tsai, Bor-sheng

    2002-01-01

    Proposes a model called information genetics to elaborate on the origin of information generating. Explains conceptual and data models; and describes a software program that was developed for citation data mining, infomapping, and information repackaging for total quality knowledge management in Web representation. (Contains 112 references.)…

  6. A Unified Mathematical Definition of Classical Information Retrieval.

    ERIC Educational Resources Information Center

    Dominich, Sandor

    2000-01-01

    Presents a unified mathematical definition for the classical models of information retrieval and identifies a mathematical structure behind relevance feedback. Highlights include vector information retrieval; probabilistic information retrieval; and similarity information retrieval. (Contains 118 references.) (Author/LRW)

  7. Sensor trustworthiness in uncertain time varying stochastic environments

    NASA Astrophysics Data System (ADS)

    Verma, Ajay; Fernandes, Ronald; Vadakkeveedu, Kalyan

    2011-06-01

    Persistent surveillance applications require unattended sensors deployed in remote regions to track and monitor some physical stimulant of interest that can be modeled as the output of a time varying stochastic process. However, the accuracy or trustworthiness of the information received through a remote, unattended sensor or sensor network cannot be readily assumed, since sensors may get disabled, corrupted, or even compromised, resulting in unreliable information. The aim of this paper is to develop an information theory based metric to determine sensor trustworthiness from the sensor data in an uncertain and time varying stochastic environment. In this paper we show an information theory based determination of sensor data trustworthiness using an adaptive stochastic reference sensor model that tracks the sensor performance for the time varying physical feature and provides a baseline model that is used to compare and analyze the observed sensor output. We present an approach in which relative entropy is used for reference model adaptation and determination of the divergence of the sensor signal from the estimated reference baseline. We show that KL-divergence is a useful metric that can be successfully used in the determination of sensor failures or sensor malice of various types.
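    A minimal sketch of the divergence computation described above, assuming discrete (binned) sensor output distributions; all numbers are invented for illustration:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Reference (baseline) sensor output distribution over 4 discretized bins.
reference = [0.4, 0.3, 0.2, 0.1]

# A healthy sensor tracks the reference closely; a faulty one drifts.
healthy = [0.38, 0.32, 0.19, 0.11]
faulty = [0.10, 0.10, 0.30, 0.50]

print(kl_divergence(healthy, reference))  # small divergence -> trustworthy
print(kl_divergence(faulty, reference))   # large divergence -> flag sensor
```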

  8. Event-Driven Process Chains (EPC)

    NASA Astrophysics Data System (ADS)

    Mendling, Jan

    This chapter provides a comprehensive overview of Event-driven Process Chains (EPCs) and introduces a novel definition of EPC semantics. EPCs became popular in the 1990s as a conceptual business process modeling language in the context of reference modeling. Reference modeling refers to the documentation of generic business operations in a model, such as service processes in the telecommunications sector, for example. It is claimed that reference models can be reused and adapted as best-practice recommendations in individual companies (see [230, 168, 229, 131, 400, 401, 446, 127, 362, 126]). The roots of reference modeling can be traced back to the Kölner Integrationsmodell (KIM) [146, 147], which was developed in the 1960s and 1970s. In the 1990s, the Institute of Information Systems (IWi) in Saarbrücken worked on a project with SAP to define a suitable business process modeling language to document the processes of the SAP R/3 enterprise resource planning system. There were two results from this joint effort: the definition of EPCs [210] and the documentation of the SAP system in the SAP Reference Model (see [92, 211]). The extensive database of this reference model contains almost 10,000 sub-models, 604 of them non-trivial EPC business process models. The SAP Reference Model had a huge impact, with several researchers referring to it in their publications (see [473, 235, 127, 362, 281, 427, 415]), and motivated the creation of EPC reference models in further domains, including computer integrated manufacturing [377, 379], logistics [229] and retail [52]. The widespread application of EPCs in business process modeling theory and practice is supported by their coverage in seminal textbooks for business process management and information systems in general (see [378, 380, 49, 384, 167, 240]). EPCs are frequently used in practice due to high user acceptance [376] and extensive tool support. Some examples of tools that support EPCs are ARIS Toolset by IDS Scheer AG, AENEIS by ATOSS Software AG, ADONIS by BOC GmbH, Visio by Microsoft Corp., Nautilus by Gedilan Consulting GmbH, and Bonapart by Pikos GmbH. In order to facilitate the interchange of EPC business process models between these tools, there is a tool-neutral interchange format called EPC Markup Language (EPML) [283, 285, 286, 287, 289, 290, 291].

  9. Modeling Cross-Situational Word–Referent Learning: Prior Questions

    PubMed Central

    Yu, Chen; Smith, Linda B.

    2013-01-01

    Both adults and young children possess powerful statistical computation capabilities—they can infer the referent of a word from highly ambiguous contexts involving many words and many referents by aggregating cross-situational statistical information across contexts. This ability has been explained by models of hypothesis testing and by models of associative learning. This article describes a series of simulation studies and analyses designed to understand the different learning mechanisms posited by the 2 classes of models and their relation to each other. Variants of a hypothesis-testing model and a simple or dumb associative mechanism were examined under different specifications of information selection, computation, and decision. Critically, these 3 components of the models interact in complex ways. The models illustrate a fundamental tradeoff between amount of data input and powerful computations: With the selection of more information, dumb associative models can mimic the powerful learning that is accomplished by hypothesis-testing models with fewer data. However, because of the interactions among the component parts of the models, the associative model can mimic various hypothesis-testing models, producing the same learning patterns but through different internal components. The simulations argue for the importance of a compositional approach to human statistical learning: the experimental decomposition of the processes that contribute to statistical learning in human learners and models with the internal components that can be evaluated independently and together. PMID:22229490
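    A minimal sketch of the "dumb" associative mechanism contrasted above with hypothesis testing, assuming simple co-occurrence counting; the vocabulary and situations are invented for illustration:

```python
from collections import defaultdict

# "Dumb" associative sketch: accumulate word-referent co-occurrence counts
# across ambiguous situations, then pick the strongest associate.
def learn(situations):
    counts = defaultdict(float)
    for words, referents in situations:
        for w in words:
            for r in referents:
                counts[(w, r)] += 1.0
    return counts

def best_referent(counts, word, referents):
    return max(referents, key=lambda r: counts[(word, r)])

# Each situation pairs several words with several candidate referents;
# no single situation disambiguates, but aggregation across contexts does.
situations = [
    (["ball", "dog"], ["BALL", "DOG"]),
    (["ball", "cat"], ["BALL", "CAT"]),
    (["dog", "cat"], ["DOG", "CAT"]),
]
counts = learn(situations)
print(best_referent(counts, "ball", ["BALL", "DOG", "CAT"]))  # BALL
```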

  10. The Comprehension and Validation of Social Information.

    ERIC Educational Resources Information Center

    Wyer, Robert S., Jr.; Radvansky, Gabriel A.

    1999-01-01

    Proposes a theory of social cognition to account for the comprehension and verification of social information. The theory views comprehension as a process of constructing situation models of new information on the basis of previously formed models about its referents. The comprehension of both single statements and multiple pieces of information…

  11. Geoid undulations and gravity anomalies over the Aral Sea, the Black Sea and the Caspian Sea from a combined GEOS-3/SEASAT/GEOSAT altimeter data set

    NASA Technical Reports Server (NTRS)

    Au, Andrew Y.; Brown, Richard D.; Welker, Jean E.

    1991-01-01

    Satellite-based altimetric data taken by GEOS-3, SEASAT, and GEOSAT over the Aral Sea, the Black Sea, and the Caspian Sea are analyzed, and a least squares collocation technique is used to predict the geoid undulations on a 0.25x0.25 deg. grid and to transform these geoid undulations to free air gravity anomalies. Rapp's 180x180 geopotential model is used as the reference surface for the collocation procedure. The result of the geoid-to-gravity transformation is, however, sensitive to the information content of the reference geopotential model used. For example, considerable detailed surface gravity data were incorporated into the reference model over the Black Sea, resulting in a reference model with significant information content at short wavelengths. Thus, estimation of short wavelength gravity anomalies from gridded geoid heights is generally reliable over regions such as the Black Sea, using the conventional collocation technique with local empirical covariance functions. Over regions such as the Caspian Sea, where detailed surface data are generally not incorporated into the reference model, unconventional techniques are needed to obtain reliable gravity anomalies. Based on the predicted gravity anomalies over these inland seas, speculative tectonic structures are identified and geophysical processes are inferred.
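    Least squares collocation of the kind used here can be sketched as follows, assuming a one-dimensional toy signal and an exponential covariance function (all values illustrative, not geodetic data):

```python
import numpy as np

# Least squares collocation sketch: predict the signal s at new points
# from noisy observations l via s_hat = C_st (C_tt + D)^-1 l, with an
# assumed exponential covariance function (toy 1-D data, not geodesy).
def covariance(x1, x2, variance=1.0, corr_length=0.5):
    return variance * np.exp(-np.abs(x1[:, None] - x2[None, :]) / corr_length)

obs_x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
obs_l = np.sin(2 * np.pi * obs_x) + 0.01 * np.array([1.0, -1.0, 1.0, -1.0, 1.0])

new_x = np.array([0.125, 0.625])
C_tt = covariance(obs_x, obs_x) + 0.01 * np.eye(obs_x.size)  # + noise D
C_st = covariance(new_x, obs_x)
s_hat = C_st @ np.linalg.solve(C_tt, obs_l)
print(s_hat.shape)  # (2,)
```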

  12. The demand for consumer health information.

    PubMed

    Wagner, T H; Hu, T W; Hibbard, J H

    2001-11-01

    Using data from an evaluation of a community-wide informational intervention, we modeled the demand for medical reference books, telephone advice nurses, and computers for health information. Data were gathered from random household surveys in Boise, ID (experimental site), Billings, MT, and Eugene, OR (control sites). Conditional difference-in-differences show that the intervention increased the use of medical reference books, advice nurses, and computers for health information by approximately 15, 6, and 4%, respectively. The results also suggest that the intervention was associated with a decreased reliance on health professionals for information.
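    The difference-in-differences logic behind such estimates reduces to a simple computation; the usage rates below are invented, chosen only to echo a roughly 15-point effect:

```python
# Difference-in-differences: the change in the treated site minus the
# change in the control sites, removing shared time trends. The usage
# rates (share of households using medical reference books) are invented.
def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

effect = diff_in_diff(treat_pre=0.30, treat_post=0.48,
                      ctrl_pre=0.31, ctrl_post=0.34)
print(round(effect, 2))  # 0.15: a 15-percentage-point increase
```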

  13. Scoring annual earthquake predictions in China

    NASA Astrophysics Data System (ADS)

    Zhuang, Jiancang; Jiang, Changsheng

    2012-02-01

    The Annual Consultation Meeting on Earthquake Tendency in China is held by the China Earthquake Administration (CEA) in order to provide one-year earthquake predictions over most of China. In these predictions, regions of concern are denoted together with the corresponding magnitude range of the largest earthquake expected during the next year. Evaluating the performance of these earthquake predictions is rather difficult, especially for regions that are of no concern, because they are made on arbitrary regions with flexible magnitude ranges. In the present study, the gambling score is used to evaluate the performance of these earthquake predictions. Based on a reference model, this scoring method rewards successful predictions and penalizes failures according to the risk (probability of failure) that the predictors have taken. Using the Poisson model, which is spatially inhomogeneous and temporally stationary, with the Gutenberg-Richter law for earthquake magnitudes as the reference model, we evaluate the CEA predictions based on 1) a partial score for evaluating whether issuing the alarmed regions is based on information that differs from the reference model (knowledge of the average seismicity level) and 2) a complete score that evaluates whether the overall performance of the prediction is better than that of the reference model. The predictions made by the Annual Consultation Meetings on Earthquake Tendency from 1990 to 2003 are found to include significant precursory information, but the overall performance is close to that of the reference model.
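
    The reward/penalty scheme can be sketched generically (a minimal illustration of a gambling score with hypothetical probabilities; the paper's exact partial and complete scores differ in detail): a prediction whose alarmed region has reference probability p earns (1 - p)/p points on success and loses 1 point on failure, so risky successes are rewarded more, and betting exactly according to the reference model has expected score zero.

```python
def gambling_score(bets):
    """bets: list of (p_ref, occurred) pairs, where p_ref is the
    reference-model probability that a qualifying earthquake occurs in
    the alarmed region, and occurred flags whether one actually did.
    A predictor no better than the reference model scores ~0 on average."""
    score = 0.0
    for p_ref, occurred in bets:
        if occurred:
            score += (1.0 - p_ref) / p_ref   # large reward for a risky success
        else:
            score -= 1.0                      # unit penalty for a failure
    return score

# Two hypothetical alarms: a risky success (p = 0.1) and one failure.
print(round(gambling_score([(0.1, True), (0.5, False)]), 6))  # 8.0
```

    The fairness property follows directly: under the reference model a single bet gains (1 - p)/p with probability p and loses 1 with probability 1 - p, which averages to zero.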

  14. Information Interaction: Providing a Framework for Information Architecture.

    ERIC Educational Resources Information Center

    Toms, Elaine G.

    2002-01-01

    Discussion of information architecture focuses on a model of information interaction that bridges the gap between human and computer and between information behavior and information retrieval. Illustrates how the process of information interaction is affected by the user, the system, and the content. (Contains 93 references.) (LRW)

  15. Protocol Information Office (PIO) | Division of Cancer Prevention

    Cancer.gov

    PIO Instructions and Tools: Find instructions, forms, and templates for the management of all types of Division of Cancer Prevention clinical trials. Clinical Trials Reference Materials: Model clinical agreements, human subject protection and informed consent models, gender and minority inclusion information, and monitoring policy and

  16. A User-Centered Approach to Adaptive Hypertext Based on an Information Relevance Model

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie; Chen, James

    1994-01-01

    Rapid and effective access to information in large electronic documentation systems can be facilitated if information relevant in an individual user's context can be automatically supplied to that user. However, most of this knowledge on contextual relevance is not found within the contents of documents; rather, it is established incrementally by users during information access. We propose a new model for interactively learning contextual relevance during information retrieval, and incrementally adapting retrieved information to individual user profiles. The model, called a relevance network, records the relevance of references based on user feedback for specific queries and user profiles. It also generalizes such knowledge to later derive relevant references for similar queries and profiles. The relevance network lets users filter information by context of relevance. Compared to other approaches, it does not require any prior knowledge or training. More importantly, our approach to adaptivity is user-centered. It facilitates acceptance and understanding by users by giving them shared control over the adaptation without disturbing their primary task. Users easily control when to adapt and when to use the adapted system. Lastly, the model is independent of the particular application used to access information, and supports sharing of adaptations among users.

  17. Teaching and Learning Activity Sequencing System using Distributed Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Matsui, Tatsunori; Ishikawa, Tomotake; Okamoto, Toshio

    The purpose of this study is the development of a system that supports teachers in designing lesson plans, especially lesson plans for the new subject "Information Study". We developed a system that generates teaching and learning activity sequences by interlinking a lesson's activities to match the various conditions supplied by the user. Because the user's input comprises multiple pieces of information, contradictions can arise that the system must resolve. This multiobjective optimization problem is solved by Distributed Genetic Algorithms, in which several fitness functions are defined with reference models of the lesson, thinking, and teaching style. Results of various experiments verified the effectiveness and validity of the proposed methods and reference models; on the other hand, some future work on the reference models and evaluation functions was also identified.

  18. Comparing Information Access Approaches.

    ERIC Educational Resources Information Center

    Chalmers, Matthew

    1999-01-01

    Presents a broad view of information access, drawing from philosophy and semiology in constructing a framework for comparative discussion that is used to examine the information representations that underlie four approaches to information access--information retrieval, workflow, collaborative filtering, and the path model. Contains 32 references.…

  19. Improved reference models for middle atmosphere ozone

    NASA Technical Reports Server (NTRS)

    Keating, G. M.; Pitts, M. C.; Chen, C.

    1990-01-01

    This paper describes the improvements introduced into the original version of the ozone reference model of Keating and Young (1985, 1987), which is to be incorporated in the next COSPAR International Reference Atmosphere (CIRA). The ozone reference model will provide information on the global ozone distribution (including the ozone vertical structure as a function of month and latitude from 25 to 90 km), combining data from five recent satellite experiments: the Nimbus 7 LIMS, Nimbus 7 SBUV, AE-2 Stratospheric Aerosol Gas Experiment (SAGE), Solar Mesosphere Explorer (SME) UV Spectrometer, and SME 1.27 Micron Airglow. The improved version of the reference model uses reprocessed AE-2 SAGE data (sunset) and extends the use of SAGE data from 1981 to the 1981-1983 time period. Comparisons are presented between the results of this ozone model and various nonsatellite measurements at different levels in the middle atmosphere.

  20. A Feminist Paradigm for Library and Information Science.

    ERIC Educational Resources Information Center

    Hannigan, Jane Anne; Crew, Hilary

    1993-01-01

    Discussion of feminist scholarship and feminist thinking focuses on feminism in librarianship. Topics addressed include research methodologies; implications for library and information science; a feminist model, including constructed knowledge; standpoint theory; benefits of feminist scholarship; and a library model. (Contains 14 references.) (LRW)

  1. Defining And Employing Reference Conditions For Ecological Restoration Of The Lower Missouri River, USA

    NASA Astrophysics Data System (ADS)

    Jacobson, R. B.; Elliott, C. M.; Reuter, J. M.

    2008-12-01

    Ecological reference conditions are especially challenging for large, intensively managed rivers like the Lower Missouri. Historical information provides broad understanding of how the river has changed, but translating historical information into quantitative reference conditions remains a challenge. Historical information is less available for biological and chemical conditions than for physical conditions. For physical conditions, much of the early historical condition is documented in date-specific measurements or maps, and it is difficult to determine how representative these conditions are for a river system that was characterized historically by large floods and high channel migration rates. As an alternative to a historically defined least-disturbed condition, spatial variation within the Missouri River basin provides potential for defining a best-attainable reference condition. A possibility for the best-attainable condition for channel morphology is an unchannelized segment downstream of the lowermost dam (rkm 1298 - 1203). This segment retains multiple channels and abundant sandbars although it has a highly altered flow regime and a greatly diminished sediment supply. Conversely, downstream river segments have more natural flow regimes, but have been narrowed and simplified for navigation and bank stability. We use two computational tools to compensate for the lack of ideal reference conditions. The first is a hydrologic model that synthesizes natural and altered flow regimes based on 100 years of daily inputs to the river (daily routing model, DRM, US Army Corps of Engineers, 1998); the second tool is hydrodynamic modeling of habitat availability. The flow-regime and hydrodynamic outputs are integrated to define habitat-duration curves as the basis for reference conditions (least-disturbed flow regime and least-disturbed channel morphology). Lacking robust biological response models, we use mean residence time of water and a habitat diversity index as generic ecosystem indicators.

  2. Risk Aversion and the Value of Information.

    ERIC Educational Resources Information Center

    Eeckhoudt, Louis; Godfroid, Phillippe

    2000-01-01

    Explains why risk aversion does not always induce a greater information value, but instead may induce a lower information value when increased. Presents a basic model defining the concept of perfect information value and providing a numerical illustration. Includes references. (CMK)

  3. Probabilistic modeling of discourse-aware sentence processing.

    PubMed

    Dubey, Amit; Keller, Frank; Sturt, Patrick

    2013-07-01

    Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.

  4. "My Understanding Has Grown, My Perspective Has Switched": Linking Informal Writing to Learning Goals

    ERIC Educational Resources Information Center

    Hudd, Suzanne S.; Smart, Robert A.; Delohery, Andrew W.

    2011-01-01

    The use of informal writing is common in sociology. This article presents one model for integrating informal written work with learning goals through a theoretical framework known as concentric thinking. More commonly referred to as "the PTA model" because of the series of cognitive tasks it promotes--prioritization, translation, and analogy…

  5. Theft of information in the take-grant protection model

    NASA Technical Reports Server (NTRS)

    Bishop, Matt

    1989-01-01

    Using the information transfer extensions to the Take-Grant Protection Model, the concept of theft of information is defined and necessary and sufficient conditions for such theft to occur are presented, as well as bounds on the number of actors involved in such theft. Finally, the application of these results to reference monitors is explored.

  6. Learner Perception of Personal Spaces of Information (PSIs): A Mental Model Analysis

    ERIC Educational Resources Information Center

    Hardof-Jaffe, Sharon; Aladjem, Ruthi

    2018-01-01

    A personal space of information (PSI) refers to the collection of digital information items created, saved and organized, on digital devices. PSIs play a central and significant role in learning processes. This study explores the mental models and perceptions of PSIs by learners, using drawing analysis. Sixty-three graduate students were asked to…

  7. Detailed clinical models: representing knowledge, data and semantics in healthcare information technology.

    PubMed

    Goossen, William T F

    2014-07-01

    This paper will present an overview of the developmental effort in harmonizing clinical knowledge modeling using the Detailed Clinical Models (DCMs), and will explain how it can contribute to the preservation of Electronic Health Records (EHR) data. Clinical knowledge modeling is vital for the management and preservation of EHR and data. Such modeling provides common data elements and terminology binding with the intention of capturing and managing clinical information over time and location, independent from technology. Any EHR data exchange without agreed clinical knowledge modeling will potentially result in loss of information. Many past attempts exist to model clinical knowledge for the benefits of semantic interoperability using standardized data representation and common terminologies. The objective of each project is similar with respect to consistent representation of clinical data, using standardized terminologies, and an overall logical approach. However, the conceptual, logical, and technical expressions are quite different in one clinical knowledge modeling approach versus another. Synergies are currently underway under the Clinical Information Modeling Initiative (CIMI) to create a harmonized reference model for clinical knowledge models. The goal of the CIMI is to create a reference model and formalisms based on, for instance, the DCM (ISO/TS 13972), among other work. A global repository of DCMs may potentially be established in the future.

  8. Advantages of a dual-tracer model over reference tissue models for binding potential measurement in tumors

    PubMed Central

    Tichauer, K M; Samkoe, K S; Klubben, W S; Hasan, T; Pogue, B W

    2012-01-01

    The quantification of tumor molecular expression in vivo could have a significant impact for informing and monitoring emerging targeted therapies in oncology. Molecular imaging of targeted tracers can be used to quantify receptor expression in the form of a binding potential (BP) if the arterial input curve or a surrogate of it is also measured. However, the assumptions of the most common approaches (reference tissue models) may not be valid for use in tumors. In this study, the validity of reference tissue models is investigated for use in tumors experimentally and in simulations. Three different tumor lines were grown subcutaneously in athymic mice and the mice were injected with a mixture of an epidermal growth factor receptor- (EGFR-) targeted fluorescent tracer and an untargeted fluorescent tracer. A one-compartment plasma input model demonstrated that the transport kinetics of both tracers were significantly different between tumors and all potential reference tissues, and using the reference tissue model resulted in a theoretical underestimation in BP of 50 ± 37%. On the other hand, the targeted and untargeted tracers demonstrated similar transport kinetics, allowing a dual-tracer approach to be employed to accurately estimate binding potential (with a theoretical error of 0.23 ± 9.07%). These findings highlight the potential for using a dual-tracer approach to quantify receptor expression in tumors with abnormal hemodynamics, possibly to inform the choice or progress of molecular cancer therapies. PMID:23022732

  9. Teaching RFID Information Systems Security

    ERIC Educational Resources Information Center

    Thompson, Dale R.; Di, Jia; Daugherty, Michael K.

    2014-01-01

    The future cyber security workforce needs radio frequency identification (RFID) information systems security (INFOSEC) and threat modeling educational materials. A complete RFID security course with new learning materials and teaching strategies is presented here. A new RFID Reference Model is used in the course to organize discussion of RFID,…

  10. Jobs and Economic Development Impact (JEDI) Model Geothermal User Reference Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, C.; Augustine, C.; Goldberg, M.

    2012-09-01

    The Geothermal Jobs and Economic Development Impact (JEDI) model, developed through the National Renewable Energy Laboratory (NREL), is an Excel-based, user-friendly tool that estimates the economic impacts of constructing and operating hydrothermal and Enhanced Geothermal System (EGS) power generation projects at the local level. The JEDI Model Geothermal User Reference Guide was developed to assist users in using and understanding the model. This guide provides information on the model's underlying methodology, as well as the parameters and references used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features, operation of the model, and a discussion of how the results should be interpreted.

  11. Using a logical information model-driven design process in healthcare.

    PubMed

    Cheong, Yu Chye; Bird, Linda; Tun, Nwe Ni; Brooks, Colleen

    2011-01-01

    A hybrid standards-based approach has been adopted in Singapore to develop a Logical Information Model (LIM) for healthcare information exchange. The Singapore LIM uses a combination of international standards, including ISO13606-1 (a reference model for electronic health record communication), ISO21090 (healthcare datatypes), SNOMED CT (healthcare terminology) and HL7 v2 (healthcare messaging). This logic-based design approach also incorporates mechanisms for achieving bi-directional semantic interoperability.

  12. The Pursuit of Word Meanings

    PubMed Central

    Stevens, Jon Scott; Gleitman, Lila R.; Trueswell, John C.; Yang, Charles

    2016-01-01

    We evaluate here the performance of four models of cross-situational word learning; two global models, which extract and retain multiple referential alternatives from each word occurrence; and two local models, which extract just a single referent from each occurrence. One of these local models, dubbed Pursuit, uses an associative learning mechanism to estimate word-referent probability but pursues and tests the best referent-meaning at any given time. Pursuit is found to perform as well as global models under many conditions extracted from naturalistic corpora of parent-child interactions, even though the model maintains far less information than global models. Moreover, Pursuit is found to best capture human experimental findings from several relevant cross-situational word-learning experiments, including those of Yu and Smith (2007), the paradigm example of a finding believed to support fully global cross-situational models. Implications and limitations of these results are discussed, most notably that the model characterizes only the earliest stages of word learning, when reliance on the co-occurring referent world is at its greatest. PMID:27666335
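
    The single-hypothesis character of Pursuit can be sketched generically (the score schedule, the update rule, and the parameter gamma below are illustrative assumptions, not the paper's exact parameterization): on each occurrence of a word, only the current best hypothesized referent is tested against the scene, strengthened if confirmed and weakened if not, rather than updating every referent in view as a global model would.

```python
import random

def pursuit_update(assoc, word, scene, gamma=0.1):
    """One Pursuit-style update (generic sketch): test only the single
    best current referent hypothesis for `word`, not the whole scene."""
    hyps = assoc.setdefault(word, {})
    if hyps:
        best = max(hyps, key=hyps.get)       # current best hypothesis
        if best in scene:                     # confirmed: strengthen it
            hyps[best] += gamma * (1 - hyps[best])
            return
        hyps[best] *= 1 - gamma               # disconfirmed: weaken it
    new = random.choice(scene)                # pursue a new candidate
    hyps[new] = max(hyps.get(new, 0.0), gamma)

random.seed(1)
assoc = {}
# "ball" always co-occurs with the BALL referent amid varying distractors.
for _ in range(30):
    scene = ["BALL", random.choice(["DOG", "CUP", "HAT"])]
    pursuit_update(assoc, "ball", scene)
print(max(assoc["ball"], key=assoc["ball"].get))
```

    Because a wrong hypothesis keeps losing weight while the consistently co-occurring referent keeps gaining it once pursued, the correct mapping typically dominates after enough exposures, while the learner stores far less than a full word-by-referent co-occurrence table.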

  13. OWL references in ORM conceptual modelling

    NASA Astrophysics Data System (ADS)

    Matula, Jiri; Belunek, Roman; Hunka, Frantisek

    2017-07-01

    Object Role Modelling (ORM) is a fact-based methodology for conceptual modelling. The aim of the paper is to emphasize its close connection to OWL documents and their possible mutual cooperation. The definition of entities or domain values is an indispensable part of the conceptual schema design procedure defined by the ORM methodology. Many of these entities are already defined in OWL documents. Therefore, it is not necessary to declare such entities again; instead, references from OWL documents can be utilized during the modelling of information systems.

  14. THE FUTURE OF COMPUTER-BASED TOXICITY PREDICTION: MECHANISM-BASED MODELS VS. INFORMATION MINING APPROACHES

    EPA Science Inventory


    When we speak of computer-based toxicity prediction, we are generally referring to a broad array of approaches which rely primarily upon chemical structure ...

  15. Quantifying navigational information: The catchment volumes of panoramic snapshots in outdoor scenes.

    PubMed

    Murray, Trevor; Zeil, Jochen

    2017-01-01

    Panoramic views of natural environments provide visually navigating animals with two kinds of information: they define locations because image differences increase smoothly with distance from a reference location and they provide compass information, because image differences increase smoothly with rotation away from a reference orientation. The range over which a given reference image can provide navigational guidance (its 'catchment area') has to date been quantified from the perspective of walking animals by determining how image differences develop across the ground plane of natural habitats. However, to understand the information available to flying animals there is a need to characterize the 'catchment volumes' within which panoramic snapshots can provide navigational guidance. We used recently developed camera-based methods for constructing 3D models of natural environments and rendered panoramic views at defined locations within these models with the aim of mapping navigational information in three dimensions. We find that in relatively open woodland habitats, catchment volumes are surprisingly large extending for metres depending on the sensitivity of the viewer to image differences. The size and the shape of catchment volumes depend on the distance of visual features in the environment. Catchment volumes are smaller for reference images close to the ground and become larger for reference images at some distance from the ground and in more open environments. Interestingly, catchment volumes become smaller when only above horizon views are used and also when views include a 1 km distant panorama. We discuss the current limitations of mapping navigational information in natural environments and the relevance of our findings for our understanding of visual navigation in animals and autonomous robots.
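
    The rotational component of this idea can be illustrated numerically: for a panoramic image, a horizontal pixel shift approximates a rotation of the viewer, and the root-mean-square difference from the reference view is smallest at the reference orientation and grows with the shift. This is a toy sketch, not the authors' 3D rendering pipeline.

```python
import numpy as np

def rotational_image_difference(panorama, reference):
    """RMS pixel difference between a reference panorama and the same
    panorama rotated (column-shifted) by each possible azimuth step."""
    w = panorama.shape[1]
    return np.array([np.sqrt(np.mean((np.roll(panorama, s, axis=1)
                                      - reference) ** 2))
                     for s in range(w)])

# Toy panorama: a smooth brightness gradient around the horizon plus noise.
rng = np.random.default_rng(0)
pano = (np.sin(np.linspace(0, 2 * np.pi, 360))[None, :]
        + 0.01 * rng.standard_normal((10, 360)))
diffs = rotational_image_difference(pano, pano)
# Zero difference at zero rotation; differences grow smoothly away from
# it, which is what makes the view usable as a visual compass.
print(diffs.argmin())  # 0
```

    The same comparison applied to views rendered at displaced locations, rather than rotations, maps out the translational catchment area (or, in 3D, the catchment volume).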

  16. Quantifying navigational information: The catchment volumes of panoramic snapshots in outdoor scenes

    PubMed Central

    Zeil, Jochen

    2017-01-01

    Panoramic views of natural environments provide visually navigating animals with two kinds of information: they define locations because image differences increase smoothly with distance from a reference location and they provide compass information, because image differences increase smoothly with rotation away from a reference orientation. The range over which a given reference image can provide navigational guidance (its ‘catchment area’) has to date been quantified from the perspective of walking animals by determining how image differences develop across the ground plane of natural habitats. However, to understand the information available to flying animals there is a need to characterize the ‘catchment volumes’ within which panoramic snapshots can provide navigational guidance. We used recently developed camera-based methods for constructing 3D models of natural environments and rendered panoramic views at defined locations within these models with the aim of mapping navigational information in three dimensions. We find that in relatively open woodland habitats, catchment volumes are surprisingly large extending for metres depending on the sensitivity of the viewer to image differences. The size and the shape of catchment volumes depend on the distance of visual features in the environment. Catchment volumes are smaller for reference images close to the ground and become larger for reference images at some distance from the ground and in more open environments. Interestingly, catchment volumes become smaller when only above horizon views are used and also when views include a 1 km distant panorama. We discuss the current limitations of mapping navigational information in natural environments and the relevance of our findings for our understanding of visual navigation in animals and autonomous robots. PMID:29088300

  17. A modified Galam’s model for word-of-mouth information exchange

    NASA Astrophysics Data System (ADS)

    Ellero, Andrea; Fasano, Giovanni; Sorato, Annamaria

    2009-09-01

    In this paper we analyze the stochastic model proposed by Galam in [S. Galam, Modelling rumors: The no plane Pentagon French hoax case, Physica A 320 (2003), 571-580] for information spreading in a ‘word-of-mouth’ process among agents, based on a majority rule. Using the communication rules among agents defined in the above reference, we first perform simulations of the ‘word-of-mouth’ process and compare the results with the theoretical values predicted by Galam’s model. Some dissimilarities arise, in particular when a small number of agents is considered. We find motivations for these dissimilarities and suggest some enhancements by introducing a new parameter-dependent model. We propose a modified Galam scheme which is asymptotically coincident with the original model in the above reference. Furthermore, for relatively small values of the parameter, we provide numerical evidence showing that the modified model often outperforms the original one.
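
    A minimal simulation of the majority-rule ‘word-of-mouth’ dynamic looks as follows. This is a generic sketch of Galam-style dynamics with a fixed group size and a simple tie-breaking bias; the paper's group-size distribution and modified scheme differ.

```python
import random

def galam_step(opinions, group_size=3, tie_break=1):
    """One discussion cycle: agents meet in random groups and everyone
    adopts the local majority opinion (ties go to tie_break)."""
    shuffled = opinions[:]
    random.shuffle(shuffled)
    out = []
    for i in range(0, len(shuffled), group_size):
        group = shuffled[i:i + group_size]
        ones = sum(group)
        if 2 * ones > len(group):
            winner = 1
        elif 2 * ones < len(group):
            winner = 0
        else:
            winner = tie_break           # bias applied only on ties
        out.extend([winner] * len(group))
    return out

random.seed(42)
opinions = [1] * 60 + [0] * 40           # 60% initially hold opinion 1
steps = 0
while len(set(opinions)) > 1 and steps < 200:
    opinions = galam_step(opinions)
    steps += 1
print(len(set(opinions)) == 1, steps)
```

    With a finite population the process is stochastic, which is exactly why small-agent simulations can deviate from the model's deterministic mean-field predictions.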

  18. Evolution of an Implementation-Ready Interprofessional Pain Assessment Reference Model

    PubMed Central

    Collins, Sarah A; Bavuso, Karen; Swenson, Mary; Suchecki, Christine; Mar, Perry; Rocha, Roberto A.

    2017-01-01

    Standards to increase consistency of comprehensive pain assessments are important for safety, quality, and analytics activities, including meeting Joint Commission requirements and learning the best management strategies and interventions for the current prescription opioid epidemic. In this study, we describe the development and validation of a Pain Assessment Reference Model ready for implementation on EHR forms and flowsheets. Our process resulted in 5 successive revisions of the reference model, which more than doubled the number of data elements to 47. The organization of the model evolved during validation sessions with panels totaling 48 subject matter experts (SMEs) to include 9 sets of data elements, with one set recommended as a minimal data set. The reference model also evolved when implemented into EHR forms and flowsheets, indicating specifications such as cascading logic that are important to inform secondary use of data. PMID:29854125

  19. Performance Guaranteed Inertia Emulation for Diesel-Wind System Feed Microgrid via Model Reference Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melin, Alexander M.; Zhang, Yichen; Djouadi, Seddik

    In this paper, a model reference control based inertia emulation strategy is proposed. Desired inertia can be precisely emulated through this control strategy so that guaranteed performance is ensured. A typical frequency response model with parametrical inertia is set to be the reference model. A measurement at a specific location delivers the information of the disturbance acting on the diesel-wind system to the reference model. The objective is for the speed of the diesel-wind system to track the reference model. Since active power variation is dominantly governed by mechanical dynamics and modes, only mechanical dynamics and states, i.e., a swing-engine-governor system plus a reduced-order wind turbine generator, are involved in the feedback control design. The controller is implemented in a three-phase diesel-wind system fed microgrid. The results show exact synthetic inertia is emulated, leading to guaranteed performance and safety bounds.
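
    The model-reference idea, making a plant's output track a reference model that encodes the desired dynamics, can be illustrated with a minimal discrete-time sketch. This is a generic first-order model-reference adaptive controller using the MIT rule, with made-up parameters; it is not the paper's swing-equation design or its performance-guaranteed synthesis.

```python
# First-order model-reference adaptive control sketch (MIT rule).
# Plant:      y' = -a*y + b*u        (a, b unknown to the controller)
# Reference:  ym' = -am*ym + bm*r    (desired closed-loop dynamics)
dt, a, b = 0.01, 1.0, 0.5           # true plant parameters (hypothetical)
am, bm = 2.0, 2.0                   # reference model dynamics
gamma = 5.0                         # adaptation gain
theta, y, ym = 0.0, 0.0, 0.0        # adjustable gain and states

for _ in range(20000):              # 200 s of simulated time
    r = 1.0                         # constant command, e.g. a load level
    u = theta * r                   # certainty-equivalence control law
    e = y - ym                      # tracking error w.r.t. reference model
    theta -= dt * gamma * e * ym    # MIT rule: adapt gain to shrink error
    y += dt * (-a * y + b * u)      # Euler step of the plant
    ym += dt * (-am * ym + bm * r)  # Euler step of the reference model

print(round(abs(y - ym), 3))  # 0.0: the plant tracks the reference model
```

    Here the adjustable gain converges so that the plant's steady-state response matches the reference model's; in the paper's setting the reference model instead carries the desired emulated inertia, and tracking it yields the desired frequency response.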

  20. State-and-transition models for heterogeneous landscapes: A strategy for development and application

    USDA-ARS?s Scientific Manuscript database

    Interpretation of assessment and monitoring data requires information about reference conditions and ecological resilience. Reference conditions used as benchmarks can be specified via potential-based land classifications (e.g., ecological sites) that describe the plant communities potentially obser...

  1. 32 CFR 199.21 - Pharmacy benefits program.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... information may include but are not limited to: (A) Medical and pharmaceutical textbooks and reference books... Book) published by the Food and Drug Administration, or any successor to such reference. Generics are...) Cross-sectional or retrospective economic evaluations; (F) Pharmacoeconomic models; (G) Patent...

  2. Cognitive bias in back pain patients attending osteopathy: testing the enmeshment model in reference to future thinking.

    PubMed

    Read, Jessica; Pincus, Tamar

    2004-12-01

    Depressive symptoms are common in chronic pain. Previous research has found differences in information-processing biases in depressed pain patients and depressed people without pain. The schema enmeshment model of pain (SEMP) has been proposed to explain chronic pain patients' information-processing biases. Negative future thinking is common in depression but has not been explored in relation to chronic pain and information-processing models. The study aimed to test the SEMP with reference to future thinking. An information-processing paradigm compared endorsement and recall bias between depressed and non-depressed chronic low back pain patients and control participants. Twenty-five depressed and 35 non-depressed chronic low back pain patients and 25 control participants (student osteopaths) were recruited from an osteopathy practice. Participants were asked to endorse positive and negative ill-health, depression-related, and neutral (control) adjectives, encoded in reference to either current or future time-frame. Incidental recall of the adjectives was then tested. While the expected hypothesis of a recall bias by depressed pain patients towards ill-health stimuli in the current condition was confirmed, the recall bias was not present in the future condition. Additionally, patterns of endorsement and recall bias differed. Results extend understanding of future thinking in chronic pain within the context of the SEMP.

  3. Thinking about Museum Information.

    ERIC Educational Resources Information Center

    Reed, Patricia Ann; Sledge, Jane

    1988-01-01

    Describes work in progress at the Smithsonian Institution in developing a system to understand and articulate the information needed to support collection related functions. The discussion covers the data modeling methodology used and the advantages of this methodology in structuring museum collections information. (one reference) (CLB)

  4. Transmission Line Jobs and Economic Development Impact (JEDI) Model User Reference Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, M.; Keyser, D.

    The Jobs and Economic Development Impact (JEDI) models, developed through the National Renewable Energy Laboratory (NREL), are freely available, user-friendly tools that estimate the potential economic impacts of constructing and operating power generation projects for a range of conventional and renewable energy technologies. The Transmission Line JEDI model can be used to field questions about the economic impacts of transmission lines in a given state, region, or local community. This Transmission Line JEDI User Reference Guide was developed to provide basic instruction on operating the model and understanding the results. This guide also provides information on the model's underlying methodology, as well as the parameters and references used to develop the cost data contained in the model.

  5. 76 FR 5298 - Airworthiness Directives; Empresa Brasileira de Aeronautica S.A. (EMBRAER) Model EMB-500 Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-31

    ...-200. Related Information (h) Refer to MCAI AGÊNCIA NACIONAL DE AVIAÇÃO CIVIL... de Aeronautica S.A. (EMBRAER) Model EMB-500 Airplanes AGENCY: Federal Aviation Administration (FAA... mandatory continuing airworthiness information (MCAI) originated by an aviation authority of another country...

  6. Order-Constrained Reference Priors with Implications for Bayesian Isotonic Regression, Analysis of Covariance and Spatial Models

    NASA Astrophysics Data System (ADS)

    Gong, Maozhen

    Selecting an appropriate prior distribution is a fundamental issue in Bayesian Statistics. In this dissertation, under the framework provided by Berger and Bernardo, I derive the reference priors for several models which include: Analysis of Variance (ANOVA)/Analysis of Covariance (ANCOVA) models with a categorical variable under common ordering constraints, the conditionally autoregressive (CAR) models and the simultaneous autoregressive (SAR) models with a spatial autoregression parameter rho considered. The performances of reference priors for ANOVA/ANCOVA models are evaluated by simulation studies with comparisons to Jeffreys' prior and Least Squares Estimation (LSE). The priors are then illustrated in a Bayesian model of the "Risk of Type 2 Diabetes in New Mexico" data, where the relationship between the type 2 diabetes risk (through Hemoglobin A1c) and different smoking levels is investigated. In both simulation studies and real data set modeling, the reference priors that incorporate internal order information show good performances and can be used as default priors. The reference priors for the CAR and SAR models are also illustrated in the "1999 SAT State Average Verbal Scores" data with a comparison to a Uniform prior distribution. Due to the complexity of the reference priors for both CAR and SAR models, only a portion (12 states in the Midwest) of the original data set is considered. The reference priors can give a different marginal posterior distribution compared to a Uniform prior, which provides an alternative for prior specifications for areal data in Spatial statistics.
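In the single-parameter case the Berger–Bernardo reference prior reduces to Jeffreys' prior, proportional to the square root of the Fisher information. A minimal sketch of that reduction for a Bernoulli model (the simplest case, not the order-constrained ANOVA/ANCOVA or spatial settings studied in the dissertation):

```python
import math

def fisher_info_bernoulli(p, eps=1e-6):
    # Fisher information I(p) = E[(d/dp log f(x|p))^2] for one Bernoulli trial,
    # computed by summing over the two outcomes x in {0, 1}.
    def score(x, p):
        # d/dp log f(x|p), approximated by a central finite difference.
        logf = lambda q: x * math.log(q) + (1 - x) * math.log(1 - q)
        return (logf(p + eps) - logf(p - eps)) / (2 * eps)
    return sum(((1 - p) if x == 0 else p) * score(x, p) ** 2 for x in (0, 1))

# Jeffreys' (= reference) prior is proportional to sqrt(I(p)) = 1/sqrt(p(1-p)),
# i.e. the Beta(1/2, 1/2) density up to a normalizing constant.
for p in (0.1, 0.5, 0.9):
    approx = math.sqrt(fisher_info_bernoulli(p))
    exact = 1.0 / math.sqrt(p * (1 - p))
    print(f"p={p}: sqrt(I)={approx:.4f}, 1/sqrt(p(1-p))={exact:.4f}")
```

The multi-parameter, order-constrained priors in the dissertation require the full stepwise Berger–Bernardo construction, which this toy example does not attempt.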

  7. A Compositional Relevance Model for Adaptive Information Retrieval

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie; Chen, James; Lu, Henry, Jr. (Technical Monitor)

    1994-01-01

    There is a growing need for rapid and effective access to information in large electronic documentation systems. Access can be facilitated if information relevant in the current problem solving context can be automatically supplied to the user. This includes information relevant to particular user profiles, tasks being performed, and problems being solved. However most of this knowledge on contextual relevance is not found within the contents of documents, and current hypermedia tools do not provide any easy mechanism to let users add this knowledge to their documents. We propose a compositional relevance network to automatically acquire the context in which previous information was found relevant. The model records information on the relevance of references based on user feedback for specific queries and contexts. It also generalizes such information to derive relevant references for similar queries and contexts. This model lets users filter information by context of relevance, build personalized views of documents over time, and share their views with other users. It also applies to any type of multimedia information. Compared to other approaches, it is less costly and doesn't require any a priori statistical computation, nor an extended training period. It is currently being implemented into the Computer Integrated Documentation system which enables integration of various technical documents in a hypertext framework.
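As a rough sketch of the idea (not the actual Computer Integrated Documentation implementation), a relevance store keyed by (query, context) can accumulate user feedback and generalize it to similar contexts by context overlap; the queries, context features, and document names below are invented:

```python
from collections import defaultdict

class RelevanceNetwork:
    """Toy sketch of context-dependent relevance feedback: record which
    references users found relevant for a (query, context) pair, and
    generalize to similar contexts via set similarity."""

    def __init__(self):
        # (query, frozenset of context features) -> {reference: feedback total}
        self.store = defaultdict(lambda: defaultdict(float))

    def record(self, query, context, reference, feedback=1.0):
        self.store[(query, frozenset(context))][reference] += feedback

    def rank(self, query, context):
        context = frozenset(context)
        scores = defaultdict(float)
        for (q, ctx), refs in self.store.items():
            if q != query:
                continue
            # Generalize across contexts using Jaccard similarity.
            sim = len(ctx & context) / len(ctx | context) if ctx | context else 0.0
            for ref, fb in refs.items():
                scores[ref] += sim * fb
        return sorted(scores, key=scores.get, reverse=True)

net = RelevanceNetwork()
net.record("engine start", {"role:pilot", "task:preflight"}, "doc-17")
net.record("engine start", {"role:mechanic", "task:repair"}, "doc-42")
# A new pilot context shares a feature with the first record, so doc-17 wins.
print(net.rank("engine start", {"role:pilot", "task:checklist"}))
```

Unlike statistical retrieval models, nothing here is precomputed over a corpus: scores come entirely from incremental feedback, mirroring the paper's claim of no a priori statistical computation or training period.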

  8. 30 CFR 550.219 - What oil and hazardous substance spills information must accompany the EP?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...), (d), and (e)). (b) Modeling report. If you model a potential oil or hazardous substance spill in developing your EP, a modeling report or the modeling results, or a reference to such report or results if...

  9. 30 CFR 550.219 - What oil and hazardous substance spills information must accompany the EP?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...), (d), and (e)). (b) Modeling report. If you model a potential oil or hazardous substance spill in developing your EP, a modeling report or the modeling results, or a reference to such report or results if...

  10. 30 CFR 550.219 - What oil and hazardous substance spills information must accompany the EP?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...), (d), and (e)). (b) Modeling report. If you model a potential oil or hazardous substance spill in developing your EP, a modeling report or the modeling results, or a reference to such report or results if...

  11. Tailoring periodical collections to meet institutional needs.

    PubMed Central

    Delman, B S

    1984-01-01

    A system for tailoring journal collections to meet institutional needs is described. The approach is based on the view that reference work and collection development are variant and complementary forms of the same library function; both tasks have as their objective a literature response to information problems. Utilizing the tools and procedures of the reference search in response to a specific collection development problem topic, the author created a model ranked list of relevant journals. Finally, by linking the model to certain operational and environmental factors in three different health care organizations, he tailored the collection to meet the institutions' respective information needs. PMID:6375775

  12. Multimodal Fusion with Reference: Searching for Joint Neuromarkers of Working Memory Deficits in Schizophrenia

    PubMed Central

    Qi, Shile; Calhoun, Vince D.; van Erp, Theo G. M.; Bustillo, Juan; Damaraju, Eswar; Turner, Jessica A.; Du, Yuhui; Chen, Jiayu; Yu, Qingbao; Mathalon, Daniel H.; Ford, Judith M.; Voyvodic, James; Mueller, Bryon A.; Belger, Aysenil; McEwen, Sarah; Potkin, Steven G.; Preda, Adrian; Jiang, Tianzi

    2017-01-01

    Multimodal fusion is an effective approach to take advantage of cross-information among multiple imaging data to better understand brain diseases. However, most current fusion approaches are blind, without adopting any prior information. To date, there is increasing interest in uncovering the neurocognitive mapping of specific behavioral measurements onto enriched brain imaging data; hence, a supervised, goal-directed model that uses a priori information as a reference to guide multimodal data fusion is needed and a natural option. Here we proposed a fusion with reference model, called “multi-site canonical correlation analysis with reference plus joint independent component analysis” (MCCAR+jICA), which can precisely identify co-varying multimodal imaging patterns closely related to reference information, such as cognitive scores. In a 3-way fusion simulation, the proposed method was compared with its alternatives on estimation accuracy of both target component decomposition and modality linkage detection. MCCAR+jICA outperforms others with higher precision. In human imaging data, working memory performance was utilized as a reference to investigate the covarying functional and structural brain patterns among 3 modalities and how they are impaired in schizophrenia. Two independent cohorts (294 and 83 subjects respectively) were used. Interestingly, similar brain maps were identified between the two cohorts, with substantial overlap in the executive control networks in fMRI, salience network in sMRI, and major white matter tracts in dMRI. These regions have been linked with working memory deficits in schizophrenia in multiple reports, while MCCAR+jICA further verified them in a repeatable, joint manner, demonstrating the potential of such results to serve as neuromarkers for mental disorders. PMID:28708547

  13. Comparing a Japanese and a German hospital information system.

    PubMed

    Jahn, F; Issler, L; Winter, A; Takabayashi, K

    2009-01-01

    To examine the architectural differences and similarities of a Japanese and German hospital information system (HIS) in a case study. This cross-cultural comparison, which focuses on structural quality characteristics, offers the chance to get new insights into different HIS architectures, which possibly cannot be obtained by inner-country comparisons. A reference model for the domain layer of hospital information systems containing the typical enterprise functions of a hospital provides the basis of comparison for the two different hospital information systems. 3LGM(2) models, which describe the two HISs and which are based on that reference model, are used to assess several structural quality criteria. Four of these criteria are introduced in detail. The two examined HISs are different in terms of the four structural quality criteria examined. Whereas the centralized architecture of the hospital information system at Chiba University Hospital causes only few functional redundancies and leads to a low implementation of communication standards, the hospital information system at the University Hospital of Leipzig, having a decentralized architecture, exhibits more functional redundancies and a higher use of communication standards. Using a model-based comparison, it was possible to detect remarkable differences between the observed hospital information systems of completely different cultural areas. However, the usability of 3LGM(2) models for comparisons has to be improved in order to apply key figures and to assess or benchmark the structural quality of health information systems architectures more thoroughly.

  14. Incorporating geographical factors with artificial neural networks to predict reference values of erythrocyte sedimentation rate

    PubMed Central

    2013-01-01

    Background The measurement of the Erythrocyte Sedimentation Rate (ESR) value is a standard procedure performed during a typical blood test. In order to formulate a unified standard of establishing reference ESR values, this paper presents a novel prediction model in which local normal ESR values and corresponding geographical factors are used to predict reference ESR values using multi-layer feed-forward artificial neural networks (ANN). Methods and findings Local normal ESR values were obtained from hospital data, while geographical factors that include altitude, sunshine hours, relative humidity, temperature and precipitation were obtained from the National Geographical Data Information Centre in China. The results show that predicted values are statistically in agreement with measured values. Model results exhibit significant agreement between training data and test data. Consequently, the model is used to predict the unseen local reference ESR values. Conclusions Reference ESR values can be established with geographical factors by using artificial intelligence techniques. ANN is an effective method for simulating and predicting reference ESR values because of its ability to model nonlinear and complex relationships. PMID:23497145
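A minimal sketch of the kind of model described, a one-hidden-layer feed-forward network trained by gradient descent, using invented stand-ins for the five geographical factors; the paper's actual data, preprocessing, and network configuration are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for the five geographical factors (altitude, sunshine
# hours, relative humidity, temperature, precipitation) and a made-up
# nonlinear "reference ESR" response -- illustrative only.
X = rng.uniform(-1, 1, size=(200, 5))
y = (10 + 3 * X[:, 0] - 2 * X[:, 2] + 1.5 * np.tanh(X[:, 3])
     + 0.1 * rng.normal(size=200)).reshape(-1, 1)

# One hidden layer of tanh units, trained by plain batch gradient descent.
W1 = rng.normal(scale=0.5, size=(5, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)          # forward pass
    pred = H @ W2 + b2
    err = pred - y                    # gradient of squared error wrt pred
    gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H**2)    # backpropagate through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"training MSE: {mse:.3f}")
```

In practice one would hold out a test set and tune the architecture, as the paper's agreement between training and test data implies.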

  15. Incorporating geographical factors with artificial neural networks to predict reference values of erythrocyte sedimentation rate.

    PubMed

    Yang, Qingsheng; Mwenda, Kevin M; Ge, Miao

    2013-03-12

    The measurement of the Erythrocyte Sedimentation Rate (ESR) value is a standard procedure performed during a typical blood test. In order to formulate a unified standard of establishing reference ESR values, this paper presents a novel prediction model in which local normal ESR values and corresponding geographical factors are used to predict reference ESR values using multi-layer feed-forward artificial neural networks (ANN). Local normal ESR values were obtained from hospital data, while geographical factors that include altitude, sunshine hours, relative humidity, temperature and precipitation were obtained from the National Geographical Data Information Centre in China. The results show that predicted values are statistically in agreement with measured values. Model results exhibit significant agreement between training data and test data. Consequently, the model is used to predict the unseen local reference ESR values. Reference ESR values can be established with geographical factors by using artificial intelligence techniques. ANN is an effective method for simulating and predicting reference ESR values because of its ability to model nonlinear and complex relationships.

  16. Life sciences domain analysis model

    PubMed Central

    Freimuth, Robert R; Freund, Elaine T; Schick, Lisa; Sharma, Mukesh K; Stafford, Grace A; Suzek, Baris E; Hernandez, Joyce; Hipp, Jason; Kelley, Jenny M; Rokicki, Konrad; Pan, Sue; Buckler, Andrew; Stokes, Todd H; Fernandez, Anna; Fore, Ian; Buetow, Kenneth H

    2012-01-01

    Objective Meaningful exchange of information is a fundamental challenge in collaborative biomedical research. To help address this, the authors developed the Life Sciences Domain Analysis Model (LS DAM), an information model that provides a framework for communication among domain experts and technical teams developing information systems to support biomedical research. The LS DAM is harmonized with the Biomedical Research Integrated Domain Group (BRIDG) model of protocol-driven clinical research. Together, these models can facilitate data exchange for translational research. Materials and methods The content of the LS DAM was driven by analysis of life sciences and translational research scenarios and the concepts in the model are derived from existing information models, reference models and data exchange formats. The model is represented in the Unified Modeling Language and uses ISO 21090 data types. Results The LS DAM v2.2.1 is comprised of 130 classes and covers several core areas including Experiment, Molecular Biology, Molecular Databases and Specimen. Nearly half of these classes originate from the BRIDG model, emphasizing the semantic harmonization between these models. Validation of the LS DAM against independently derived information models, research scenarios and reference databases supports its general applicability to represent life sciences research. Discussion The LS DAM provides unambiguous definitions for concepts required to describe life sciences research. The processes established to achieve consensus among domain experts will be applied in future iterations and may be broadly applicable to other standardization efforts. Conclusions The LS DAM provides common semantics for life sciences research. Through harmonization with BRIDG, it promotes interoperability in translational science. PMID:22744959

  17. The Swedish strategy and method for development of a national healthcare information architecture.

    PubMed

    Rosenälv, Jessica; Lundell, Karl-Henrik

    2012-01-01

    "We need a precise framework of regulations in order to maintain appropriate and structured health care documentation that ensures that the information maintains a sufficient level of quality to be used in treatment, in research and by the actual patient. The users shall be aided by clearly and uniformly defined terms and concepts, and there should be an information structure that clarifies what to document and how to make the information more useful. Most of all, we need to standardize the information, not just the technical systems." (eHälsa - nytta och näring, Riksdag report 2011/12:RFR5, p. 37). In 2010, the Swedish Government adopted National e-Health, the national strategy for accessible and secure information in healthcare. The strategy is a revision and extension of the previous strategy from 2006, which was used as input for the most recent efforts to develop a national information structure utilizing business-oriented generic models. A national decision on healthcare informatics standards was made by the Swedish County Councils, which decided to follow and use EN/ISO 13606 as a standard for the development of a universally applicable information structure, including archetypes and templates. The overall aim of the Swedish strategy for development of a National Healthcare Information Architecture is to achieve high-level semantic interoperability for clinical content and clinical contexts. High-level semantic interoperability requires consistently structured clinical data and other types of data with coherent traceability to be mapped to reference clinical models. Archetypes, which are formal definitions of clinical and demographic concepts and some administrative data, were developed. Each archetype describes the information structure and content of overarching core clinical concepts. Information that is defined in archetypes should be used for different purposes. The generic clinical process model was made concrete and analyzed. For each decision-making step in the process where information is processed, the amount and type of information and its structure were defined in terms of reference templates. Reference templates manage clinical, administrative and demographic types of information in a specific clinical context. Based on a survey of clinical processes at the reference level, specific clinical processes such as diabetes and congestive heart failure in adults were identified. Process-specific templates were defined by using reference templates and populated with information that was relevant to each health problem in a specific clinical context. Throughout this process, medical data for knowledge management were collected for each health problem. In parallel with the efforts to define archetypes and templates, terminology binding work is ongoing. Different strategies are used depending on the terminology binding level.

  18. Building Program Models Incrementally from Informal Descriptions.

    DTIC Science & Technology

    1979-10-01

    specified at each step. Since the user controls the interaction, the user may determine the order in which information flows into PMB. Information is received...until only ten years ago the term "automatic programming" referred to the development of the assemblers, macro expanders, and compilers for these

  19. Cost-effective ways of delivering enquiry services: a rapid review.

    PubMed

    Sutton, Anthea; Grant, Maria J

    2011-12-01

    In recent times of recession and budget cuts, it is more important than ever for library and information services to deliver cost-effective services. This rapid review aims to examine the evidence for the most cost-effective ways of delivering enquiry services. A literature search was conducted on LISA (Library and Information Sciences Abstracts) and MEDLINE. Searches were limited to 2007 onwards. Eight studies met the inclusion criteria. The studies covered hospital and academic libraries in the USA and Canada. Services analysed were 'point-of-care' librarian consultations, staffing models for reference desks and virtual/digital reference services. Transferable lessons, relevant to health library and information services generally, can be drawn from this rapid review. These suggest that 'point-of-care' librarians for primary care practitioners are a cost-effective way of answering questions. Reference desks can be cost-effectively staffed by student employees or general reference staff, although librarian referral must be provided for more complex and subject-specific enquiries. However, it is not possible to draw any conclusions on virtual/digital reference services because of the limited literature available. Further case analysis studies measuring specific services, particularly enquiry services within a health library and information context, are required. © 2011 The authors. Health Information and Libraries Journal © 2011 Health Libraries Group.

  20. Predictive Eco-Cruise Control (ECC) system : model development, modeling and potential benefits.

    DOT National Transportation Integrated Search

    2013-02-01

    The research develops a reference model of a predictive eco-cruise control (ECC) system that intelligently modulates vehicle speed within a pre-set speed range to minimize vehicle fuel consumption levels using roadway topographic information. The stu...

  1. Library and Information Networks: Centralization and Decentralization.

    ERIC Educational Resources Information Center

    Segal, JoAnn S.

    1988-01-01

    Describes the development of centralized library networks and the current factors that make library sharing on a smaller scale feasible. The discussion covers the need to decide the level at which library cooperation should occur and the possibility of linking via the Open Systems Interconnection (OSI) Reference Model. (37 references) (CLB)

  2. The American Archival Profession and Information Technology Standards.

    ERIC Educational Resources Information Center

    Cox, Richard J.

    1992-01-01

    Discussion of the use of standards by archivists highlights the U.S. MARC AMC (Archives-Manuscript Control) format for reporting archival records and manuscripts; their interest in specific standards being developed for the OSI (Open Systems Interconnection) reference model; and the management of records in electronic formats. (16 references) (LAE)

  3. Multiple neural states of representation in short-term memory? It's a matter of attention.

    PubMed

    Larocque, Joshua J; Lewis-Peacock, Jarrod A; Postle, Bradley R

    2014-01-01

    Short-term memory (STM) refers to the capacity-limited retention of information over a brief period of time, and working memory (WM) refers to the manipulation and use of that information to guide behavior. In recent years it has become apparent that STM and WM interact and overlap with other cognitive processes, including attention (the selection of a subset of information for further processing) and long-term memory (LTM-the encoding and retention of an effectively unlimited amount of information for a much longer period of time). Broadly speaking, there have been two classes of memory models: systems models, which posit distinct stores for STM and LTM (Atkinson and Shiffrin, 1968; Baddeley and Hitch, 1974); and state-based models, which posit a common store with different activation states corresponding to STM and LTM (Cowan, 1995; McElree, 1996; Oberauer, 2002). In this paper, we will focus on state-based accounts of STM. First, we will consider several theoretical models that postulate, based on considerable behavioral evidence, that information in STM can exist in multiple representational states. We will then consider how neural data from recent studies of STM can inform and constrain these theoretical models. In the process we will highlight the inferential advantage of multivariate, information-based analyses of neuroimaging data (fMRI and electroencephalography (EEG)) over conventional activation-based analysis approaches (Postle, in press). We will conclude by addressing lingering questions regarding the fractionation of STM, highlighting differences between the attention to information vs. the retention of information during brief memory delays.

  4. Archetype-based semantic integration and standardization of clinical data.

    PubMed

    Moner, David; Maldonado, Jose A; Bosca, Diego; Fernandez, Jesualdo T; Angulo, Carlos; Crespo, Pere; Vivancos, Pedro J; Robles, Montserrat

    2006-01-01

    One of the basic needs of any healthcare professional is to be able to access the clinical information of patients in an understandable and normalized way. The lifelong clinical information of any person supported by electronic means constitutes his/her Electronic Health Record (EHR). This information is usually distributed among several independent and heterogeneous systems that may be syntactically or semantically incompatible. The Dual Model architecture has appeared as a new proposal for maintaining a homogeneous representation of the EHR with a clear separation between information and knowledge. Information is represented by a Reference Model, which describes common data structures with minimal semantics. Knowledge is specified by archetypes, which are formal representations of clinical concepts built upon a particular Reference Model. This kind of architecture was originally intended for the implementation of new clinical information systems, but archetypes can also be used to integrate data from existing, non-normalized systems, at the same time adding semantic meaning to the integrated data. In this paper we explain the possible use of a Dual Model approach for semantic integration and standardization of heterogeneous clinical data sources and present LinkEHR-Ed, a tool for developing archetypes as elements for integration purposes. LinkEHR-Ed has been designed to be easily used by the two main participants in the creation of archetypes for clinical data integration: the health domain expert and the information technologies domain expert.

  5. The Relationship between Simultaneous-Successive Processing and Academic Achievement.

    ERIC Educational Resources Information Center

    Merritt, Frank M.; McCallum, Steve

    The Luria-Das Information Processing Model of human learning holds that information is analysed and coded within the brain in either a simultaneous or a successive fashion. Simultaneous integration refers to the synthesis of separate elements into groups, often with spatial characteristics; successive integration means that information is…

  6. An incompressible fluid flow model with mutual information for MR image registration

    NASA Astrophysics Data System (ADS)

    Tsai, Leo; Chang, Herng-Hua

    2013-03-01

    Image registration is one of the fundamental and essential tasks within image processing. It is a process of determining the correspondence between structures in two images, which are called the template image and the reference image, respectively. The challenge of registration is to find an optimal geometric transformation between corresponding image data. This paper develops a new MR image registration algorithm that uses a closed incompressible viscous fluid model associated with mutual information. In our approach, we treat the image pixels as the fluid elements of a viscous fluid flow governed by the nonlinear Navier-Stokes partial differential equation (PDE). We replace the pressure term with the body force mainly used to guide the transformation with a weighting coefficient, which is expressed by the mutual information between the template and reference images. To solve this modified Navier-Stokes PDE, we adopted the fast numerical techniques proposed by Seibold. The registration process of updating the body force, the velocity and deformation fields is repeated until the mutual information weight reaches a prescribed threshold. We applied our approach to the BrainWeb and real MR images. Consistent with the theory of the proposed fluid model, we found that our method accurately transformed the template images into the reference images based on the intensity flow. Experimental results indicate that our method has potential in a wide variety of medical image registration applications.
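The mutual-information weighting term can be estimated from a joint intensity histogram of the two images. A minimal sketch using synthetic images (a generic estimator, not the authors' implementation):

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Histogram estimate of the mutual information (in nats) between the
    intensity values of two equally sized images."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)      # marginal of image a
    py = pxy.sum(axis=0, keepdims=True)      # marginal of image b
    nz = pxy > 0                             # skip empty cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(2)
template = rng.normal(size=(64, 64))
aligned = template + 0.1 * rng.normal(size=(64, 64))      # near-registered copy
shuffled = rng.permutation(template.ravel()).reshape(64, 64)

# MI is high when intensities correspond and near zero when they do not, which
# is why it can drive the body force toward better alignment.
print(round(mutual_information(template, aligned), 2),
      round(mutual_information(template, shuffled), 2))
```

In the paper this quantity modulates the body force of the fluid model at each iteration; the histogram estimator above is the standard building block for that weight.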

  7. Radiation Models

    ERIC Educational Resources Information Center

    James, W. G. G.

    1970-01-01

    Discusses the historical development of both the wave and the corpuscular photon model of light. Suggests that students should be informed that the two models are complementary and that each model successfully describes a wide range of radiation phenomena. Cites 19 references which might be of interest to physics teachers and students. (LC)

  8. Development of a clinical pharmacy model within an Australian home nursing service using co-creation and participatory action research: the Visiting Pharmacist (ViP) study

    PubMed Central

    Lee, Cik Yin; Beanland, Christine; Goeman, Dianne P; Petrie, Neil; Petrie, Barbara; Vise, Felicity; Gray, June

    2017-01-01

    Objective To develop a collaborative, person-centred model of clinical pharmacy support for community nurses and their medication management clients. Design Co-creation and participatory action research, based on reflection, data collection, interaction and feedback from participants and other stakeholders. Setting A large, non-profit home nursing service in Melbourne, Australia. Participants Older people referred to the home nursing service for medication management, their carers, community nurses, general practitioners (GPs) and pharmacists, a multidisciplinary stakeholder reference group (including consumer representation) and the project team. Data collection and analysis Feedback and reflections from minutes, notes and transcripts from: project team meetings, clinical pharmacists’ reflective diaries and interviews, meetings with community nurses, reference group meetings and interviews and focus groups with 27 older people, 18 carers, 53 nurses, 15 GPs and seven community pharmacists. Results The model was based on best practice medication management standards and designed to address key medication management issues raised by stakeholders. Pharmacist roles included direct client care and indirect care. Direct care included home visits, medication reconciliation, medication review, medication regimen simplification, preparation of medication lists for clients and nurses, liaison and information sharing with prescribers and pharmacies and patient/carer education. Indirect care included providing medicines information and education for nurses and assisting with review and implementation of organisational medication policies and procedures. The model allowed nurses to refer directly to the pharmacist, enabling timely resolution of medication issues. Direct care was provided to 84 older people over a 15-month implementation period. 
Ongoing feedback and consultation, in line with participatory action research principles, informed the development and refinement of the model and identification of enablers and challenges. Conclusions A collaborative, person-centred clinical pharmacy model that addressed the needs of clients, carers, nurses and other stakeholders was successfully developed. The model is likely to have applicability to home nursing services nationally and internationally. PMID:29102998

  9. Multi-model inference for incorporating trophic and climate uncertainty into stock assessments

    NASA Astrophysics Data System (ADS)

    Ianelli, James; Holsman, Kirstin K.; Punt, André E.; Aydin, Kerim

    2016-12-01

    Ecosystem-based fisheries management (EBFM) approaches allow a broader and more extensive consideration of objectives than is typically possible with conventional single-species approaches. Ecosystem linkages may include trophic interactions and climate change effects on productivity for the relevant species within the system. Presently, models are evolving to include a comprehensive set of fishery and ecosystem information to address these broader management considerations. The increased scope of EBFM approaches is accompanied with a greater number of plausible models to describe the systems. This can lead to harvest recommendations and biological reference points that differ considerably among models. Model selection for projections (and specific catch recommendations) often occurs through a process that tends to adopt familiar, often simpler, models without considering those that incorporate more complex ecosystem information. Multi-model inference provides a framework that resolves this dilemma by providing a means of including information from alternative, often divergent models to inform biological reference points and possible catch consequences. We apply an example of this approach to data for three species of groundfish in the Bering Sea: walleye pollock, Pacific cod, and arrowtooth flounder using three models: 1) an age-structured "conventional" single-species model, 2) an age-structured single-species model with temperature-specific weight at age, and 3) a temperature-specific multi-species stock assessment model. The latter two approaches also include consideration of alternative future climate scenarios, adding another dimension to evaluate model projection uncertainty. We show how Bayesian model-averaging methods can be used to incorporate such trophic and climate information to broaden single-species stock assessments by using an EBFM approach that may better characterize uncertainty.
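A simple stand-in for the model-averaging step is to weight each model's biological reference point by Akaike weights; a full Bayesian model average, as in the paper, would use marginal likelihoods instead. The log-likelihoods, parameter counts, and reference-point values below are invented for illustration:

```python
import math

# Hypothetical fits of the three assessment-model types to one stock; the
# "ref_point" column is an invented biological reference point (e.g. in kt).
models = {
    "single-species":            {"loglik": -120.4, "k": 12, "ref_point": 410.0},
    "single-species + temp W@A": {"loglik": -118.9, "k": 14, "ref_point": 385.0},
    "multi-species + temp":      {"loglik": -114.2, "k": 21, "ref_point": 352.0},
}

# Akaike weights: exp(-0.5 * delta AIC), normalized to sum to one.
aic = {m: 2 * v["k"] - 2 * v["loglik"] for m, v in models.items()}
best = min(aic.values())
raw = {m: math.exp(-0.5 * (a - best)) for m, a in aic.items()}
total = sum(raw.values())
weights = {m: w / total for m, w in raw.items()}

# Model-averaged reference point: a weighted combination rather than a choice
# of one "familiar" model, so structural uncertainty carries into the advice.
avg_ref = sum(weights[m] * models[m]["ref_point"] for m in models)
for m, w in weights.items():
    print(f"{m}: weight {w:.3f}")
print(f"model-averaged reference point: {avg_ref:.1f}")
```

The same averaging can be repeated across climate scenarios, giving a second dimension of projection uncertainty as the abstract describes.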

  10. Philosophy and Sociology of Science Evolution and History

    NASA Astrophysics Data System (ADS)

    Rosen, Joe

The following sections are included: * Concrete Versus Abstract Theoretical Models * Introduction: concrete and abstract in Kepler's contribution * Einstein's theory of gravitation and Mach's principle * Unitary symmetry and the structure of hadrons * Conclusion * Dedication * Symmetry, Entropy and Complexity * Introduction * Symmetry Implies Abstraction and Loss of Information * Broken Symmetries - Imposed or Spontaneous * Symmetry, Order and Information * References * Cosmological Surrealism: More Than "Eternal Reality" Is Needed * Pythagoreanism in atomic, nuclear and particle physics * Introduction: Pythagoreanism as part of the Greek scientific world view — and the three questions I will tackle * Point 1: the impact of Gersonides and Crescas, two scientific anti-Aristotelian rebels * Point 2: Kepler's spheres to Bohr's orbits — Pythagoreanisms at last! * Point 3: Aristotle to Maupertuis, Emmy Noether, Schwinger * References * Paradigm Completion For Generalized Evolutionary Theory With Application To Epistemology * Evolution Fully Generalized * Entropy: Gravity as Model * Evolution and Entropy: Measures of Complexity * Extinctions and a Balanced Evolutionary Paradigm * The Evolution of Human Society - the Age of Information as example * High-Energy Physics and the World Wide Web * Twentieth Century Epistemology has Strong (de facto) Evolutionary Elements * The discoveries towards the beginning of the XXth Century * Summary and Conclusions * References * Evolutionary Epistemology and Invalidation * Introduction * Extinctions and A New Evolutionary Paradigm * Evolutionary Epistemology - Active Mutations * Evolutionary Epistemology: Invalidation as An Extinction * References

  11. A Neural Network Architecture For Rapid Model Indexing In Computer Vision Systems

    NASA Astrophysics Data System (ADS)

    Pawlicki, Ted

    1988-03-01

Models of objects stored in memory have been shown to be useful for guiding the processing of computer vision systems. A major consideration in such systems, however, is how stored models are initially accessed and indexed by the system. As the number of stored models increases, the time required to search memory for the correct model becomes high. Parallel, distributed, connectionist neural networks have been shown to have appealing content addressable memory properties. This paper discusses an architecture for efficient storage and reference of model memories stored as stable patterns of activity in a parallel, distributed, connectionist neural network. The emergent properties of content addressability and resistance to noise are exploited to perform indexing of the appropriate object centered model from image centered primitives. The system consists of three network modules, each of which represents information relative to a different frame of reference. The model memory network is a large state space vector where fields in the vector correspond to ordered component objects and relative, object based spatial relationships between the component objects. The component assertion network represents evidence about the existence of object primitives in the input image. It establishes local frames of reference for object primitives relative to the image based frame of reference. The spatial relationship constraint network is an intermediate representation which enables the association between the object based and the image based frames of reference. This intermediate level represents information about possible object orderings and establishes relative spatial relationships from the image based information in the component assertion network below. It is also constrained by the lawful object orderings in the model memory network above. The system design is consistent with current psychological theories of recognition by components. It also seems to support Marr's notions of hierarchical indexing (i.e., the specificity, adjunct, and parent indices). It supports the notion that multiple canonical views of an object may have to be stored in memory to enable its efficient identification. The use of variable fields in the state space vectors appears to keep the number of required nodes in the network down to a tractable number while imposing a semantic value on different areas of the state space. This semantic imposition supports an interface between the analogical aspects of neural networks and the propositional paradigms of symbolic processing.
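    The content-addressable recall the architecture relies on can be illustrated with a minimal Hopfield-style network (an assumption made for illustration; the paper's three-module architecture is considerably richer). Stored "model memories" are ±1 patterns, and a corrupted image-derived cue settles onto the nearest stored model.

```python
# Minimal Hopfield-style content-addressable memory (illustrative only).
# Stored patterns play the role of model memories; a noisy cue stands in
# for image-derived evidence that should index the correct model.

def train(patterns):
    """Hebbian outer-product weights with a zero diagonal."""
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j] / len(patterns)
    return W

def recall(W, cue, sweeps=5):
    """Sequentially update units; the state settles toward a stored pattern."""
    s = list(cue)
    for _ in range(sweeps):
        for i in range(len(s)):
            h = sum(W[i][j] * s[j] for j in range(len(s)))
            s[i] = 1 if h >= 0 else -1
    return s

stored = [[1, -1, 1, -1, 1, -1, 1, -1],   # "model A" memory
          [1, 1, 1, 1, -1, -1, -1, -1]]   # "model B" memory
W = train(stored)
cue = list(stored[0])
cue[2] = -cue[2]                          # corrupt one element of the cue
recovered = recall(W, cue)
```

    The resistance to noise mentioned in the abstract is visible here: the corrupted cue still retrieves the intact "model A" pattern.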

  12. Stochastic Online Learning in Dynamic Networks under Unknown Models

    DTIC Science & Technology

    2016-08-02

    Repeated Game with Incomplete Information, IEEE International Conference on Acoustics, Speech, and Signal Processing. 20-MAR-16, Shanghai, China...in a game theoretic framework for the application of multi-seller dynamic pricing with unknown demand models. We formulated the problem as an...infinitely repeated game with incomplete information and developed a dynamic pricing strategy referred to as Competitive and Cooperative Demand Learning

  13. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections

    PubMed Central

    Bailey, Stephanie L.; Bono, Rose S.; Nash, Denis; Kimmel, April D.

    2018-01-01

    Background Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. Methods We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. Results We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). 
Conclusions Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited. PMID:29570737
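    The version-comparison logic can be sketched in a few lines. The outputs below are hypothetical; only the ±5% material-error threshold is taken from the study.

```python
# Illustrative sketch: detecting an unintentional error by comparing the same
# projection computed by parallel spreadsheet model versions, flagging
# discrepancies beyond a +/-5% "material" threshold. Numbers are made up.

def pct_diff(a, b):
    return (a - b) / b * 100.0

def compare_versions(outputs, reference_key, threshold=5.0):
    """Flag versions whose output differs materially from the reference."""
    ref = outputs[reference_key]
    return {k: abs(pct_diff(v, ref)) > threshold
            for k, v in outputs.items() if k != reference_key}

# Same quantity (e.g., patients retained in care) from three parallel versions;
# the column/row-reference version carries a propagated cell-reference error.
outputs = {
    "named_matrices": 1000.0,     # error-free reference version
    "named_single_cells": 1030.0, # +3%: below the material threshold
    "column_row_refs": 1260.0,    # +26%: material unintentional error
}
flags = compare_versions(outputs, "named_matrices")
```

    Concordant output across versions is treated as error-free; any flagged version is then inspected cell by cell.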

  14. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections.

    PubMed

    Bailey, Stephanie L; Bono, Rose S; Nash, Denis; Kimmel, April D

    2018-01-01

    Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). 
Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited.

  15. Let Documents Talk to Each Other: A Computer Model for Connection of Short Documents.

    ERIC Educational Resources Information Center

    Chen, Z.

    1993-01-01

    Discusses the integration of scientific texts through the connection of documents and describes a computer model that can connect short documents. Information retrieval and artificial intelligence are discussed; a prototype system of the model is explained; and the model is compared to other computer models. (17 references) (LRW)

  16. The International Geomagnetic Reference Field, 2005

    USGS Publications Warehouse

    Rukstales, Kenneth S.; Love, Jeffrey J.

    2007-01-01

    This is a set of five world charts showing the declination, inclination, horizontal intensity, vertical component, and total intensity of the Earth's magnetic field at mean sea level at the beginning of 2005. The charts are based on the International Geomagnetic Reference Field (IGRF) main model for 2005 and secular change model for 2005-2010. The IGRF is referenced to the World Geodetic System 1984 ellipsoid. Additional information about the USGS geomagnetism program is available at: http://geomag.usgs.gov/

  17. Selecting Data-Base Management Software for Microcomputers in Libraries and Information Units.

    ERIC Educational Resources Information Center

    Pieska, K. A. O.

    1986-01-01

    Presents a model for the evaluation of database management systems software from the viewpoint of librarians and information specialists. The properties of data management systems, database management systems, and text retrieval systems are outlined and compared. (10 references) (CLB)

  18. ECONOMIC GROWTH ANALYSIS SYSTEM: REFERENCE MANUAL

    EPA Science Inventory

    The two-volume report describes the development of, and provides information needed to operate, a prototype Economic Growth Analysis System (E-GAS) modeling system. The model will be used to project emissions inventories of volatile organic compounds (VOCs), oxides of nitrogen (...

  19. Land-Use History and Contemporary Management Inform an Ecological Reference Model for Longleaf Pine Woodland Understory Plant Communities.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brudvig, Lars A.; Orrock, John L.; Damschen, Ellen I.

Ecological restoration is frequently guided by reference conditions describing a successfully restored ecosystem; however, the causes and magnitude of ecosystem degradation vary, making simple knowledge of reference conditions insufficient for prioritizing and guiding restoration. Ecological reference models provide further guidance by quantifying reference conditions, as well as conditions at degraded states that deviate from reference conditions. Many reference models remain qualitative, however, limiting their utility. We quantified and evaluated a reference model for southeastern U.S. longleaf pine woodland understory plant communities. We used regression trees to classify 232 longleaf pine woodland sites at three locations along the Atlantic coastal plain based on relationships between understory plant community composition, soils (which broadly structure these communities), and factors associated with understory degradation, including fire frequency, agricultural history, and tree basal area. To understand the spatial generality of this model, we classified all sites together, and for each of the three study locations separately. Both the regional and location-specific models produced quantifiable degradation gradients, i.e., progressive deviation from conditions at 38 reference sites, based on understory species composition, diversity and total cover, litter depth, and other attributes. Regionally, fire suppression was the most important degrading factor, followed by agricultural history, but at individual locations, agricultural history or tree basal area was most important. At one location, the influence of a degrading factor depended on soil attributes. We suggest that our regional model can help prioritize longleaf pine woodland restoration across our study region; however, due to substantial landscape-to-landscape variation, local management decisions should take into account additional factors (e.g., soil attributes). Our study demonstrates the utility of quantifying degraded states and provides a series of hypotheses for future experimental restoration work. More broadly, our work provides a framework for developing and evaluating reference models that incorporate multiple, interactive anthropogenic drivers of ecosystem degradation.

  20. Using a generalised identity reference model with archetypes to support interoperability of demographics information in electronic health record systems.

    PubMed

    Xu Chen; Berry, Damon; Stephens, Gaye

    2015-01-01

Computerised identity management is in general encountered as a low-level mechanism that enables users in a particular system or region to securely access resources. In the Electronic Health Record (EHR), the identifying information of both the healthcare professionals who access the EHR and the patients whose EHR is accessed is subject to change. Demographics services have been developed to manage federated patient and healthcare professional identities and to support challenging healthcare-specific use cases in the presence of diverse and sometimes conflicting demographic identities. Demographics services are not the only use for identities in healthcare. Nevertheless, contemporary EHR specifications limit the types of entities that can be the actor or subject of a record to health professionals and patients, thus limiting the use of two-level models in other healthcare information systems. Demographics are ubiquitous in healthcare, so for a general identity model to be usable, it should be capable of managing demographic information. In this paper, we introduce a generalised identity reference model (GIRM) based on key characteristics of five surveyed demographic models. We evaluate the GIRM by using it to express the EN13606 demographics model in an extensible way at the metadata level and show how two-level modelling can support the exchange of instances of demographic identities. This use of the GIRM to express demographics information shows its application for standards-compliant two-level modelling alongside heterogeneous demographics models. We advocate this approach to facilitate the interoperability of identities between two-level model-based EHR systems and show the validity and the extensibility of using the GIRM for the expression of other health-related identities.
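    The two-level approach can be caricatured as follows: a small generic identity class (the reference-model level) is constrained at runtime by archetype-like definitions (the second level). The class, role, and attribute names here are hypothetical, not the GIRM's actual constructs.

```python
# Toy sketch of two-level modelling for identities: a generic class plus
# archetype-style constraints that specialise it per role. Names are invented.

class Identity:
    def __init__(self, role, attributes):
        self.role = role                # e.g. "patient", "professional"
        self.attributes = attributes    # demographic name/value pairs

# Level two: which attributes an identity of a given role must carry.
ARCHETYPES = {
    "patient": {"family_name", "date_of_birth"},
    "professional": {"family_name", "registration_id"},
}

def conforms(identity):
    """Check an identity instance against its role's archetype constraints."""
    required = ARCHETYPES.get(identity.role, set())
    return required <= identity.attributes.keys()

p = Identity("patient", {"family_name": "Doe", "date_of_birth": "1970-01-01"})
```

    The point of the split is that new kinds of identity (devices, organisations) can be added by defining new archetypes, without changing the reference-model class.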

  1. 76 FR 21815 - Airworthiness Directives; The Boeing Company Model 737 Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-19

    ...)(c) of this service bulletin provides instructions to check for obvious differences in thread shape between thread grooves ``as given in CMM 27-41-01.'' Air Tran noted that CMM 27-41-01 does not provide any... have revised this AD to refer to the new service information. We agree that processes referred to by...

  2. Practitioner Perspectives on a Disaster Management Architecture

    NASA Astrophysics Data System (ADS)

    Moe, K.; Evans, J. D.

    2012-12-01

    The Committee on Earth Observing Satellites (CEOS) Working Group on Information Systems and Services (WGISS) is constructing a high-level reference model for the use of satellites, sensors, models, and associated data products from many different global data and service providers in disaster response and risk assessment. To help streamline broad, effective access to satellite information, the reference model provides structured, shared, holistic views of distributed systems and services - in effect, a common vocabulary describing the system-of-systems building blocks and how they are composed for disaster management. These views are being inferred from real-world experience, by documenting and analyzing how practitioners have gone about using or providing satellite data to manage real disaster events or to assess or mitigate hazard risks. Crucial findings and insights come from case studies of three kinds of experience: - Disaster response and recovery (such as the 2008 Sichuan/Wenchuan earthquake in China; and the 2011 Tohoku earthquake and tsunami in Japan); - Technology pilot projects (such as NASA's Flood Sensor Web pilot in Namibia, or the interagency Virtual Mission Operation Center); - Information brokers (such as the International Charter: Space and Major Disasters, or the U.K.-based Disaster Management Constellation). Each of these experiences sheds light on the scope and stakeholders of disaster management; the information requirements for various disaster types and phases; and the services needed for effective access to information by a variety of users. They also highlight needs and gaps in the supply of satellite information for disaster management. One need stands out: rapid and effective access to complex data from multiple sources, across inter-organizational boundaries. 
This is the near-real-time challenge writ large: gaining access to satellite data resources from multiple organizationally distant and geographically dispersed sources, to meet an urgent need. The case studies and reference model will highlight gaps in data supply and data delivery technologies, and suggest recommended priorities for satellite missions, ground data systems, and third-party service providers.

  3. WATER DISTRIBUTION SYSTEM ANALYSIS: FIELD STUDIES, MODELING AND MANAGEMENT

    EPA Science Inventory

The user's guide entitled “Water Distribution System Analysis: Field Studies, Modeling and Management” is a reference guide for water utilities and an extensive summarization of information designed to provide drinking water utility personnel (and related consultants and research...

  4. Marketing & Libraries Do Mix: A Handbook for Libraries and Information Centers.

    ERIC Educational Resources Information Center

    Tenney, H. Baird; And Others

    This handbook offers a practical set of ideas to help all types of libraries in the task of marketing their services in an increasingly competitive economy and provides a model program as urged by the White House Conference on Library and Information Services. It is aimed at adult information services in particular, with passing references to…

  5. Reference evapotranspiration forecasting based on local meteorological and global climate information screened by partial mutual information

    NASA Astrophysics Data System (ADS)

    Fang, Wei; Huang, Shengzhi; Huang, Qiang; Huang, Guohe; Meng, Erhao; Luan, Jinkai

    2018-06-01

In this study, reference evapotranspiration (ET0) forecasting models are developed for the least economically developed regions subject to meteorological data scarcity. First, partial mutual information (PMI), which captures both linear and nonlinear dependence, is compared with partial linear correlation for its ability to identify relevant predictors and exclude redundant ones. An efficient input selection technique is crucial for decreasing model data requirements. Then, the interconnection between global climate indices and regional ET0 is identified. Relevant climate indices are introduced as additional predictors to supply information about ET0 that would otherwise have to come from unavailable meteorological data. The case study in the Jing River and Beiluo River basins, China, reveals that PMI outperforms partial linear correlation in excluding redundant information, yielding smaller predictor sets. The teleconnection analysis identifies the correlation between Nino 1 + 2 and regional ET0, indicating influences of ENSO events on the evapotranspiration process in the study area. Furthermore, introducing Nino 1 + 2 as a predictor helps to yield more accurate ET0 forecasts. A model performance comparison also shows that non-linear stochastic models (SVR or RF with input selection through PMI) do not always outperform linear models (MLR with inputs screened by linear correlation). However, the former can offer comparable performance while relying on smaller predictor sets. Therefore, efforts such as screening model inputs through PMI and incorporating global climatic indices interconnected with ET0 can benefit the development of ET0 forecasting models suitable for data-scarce regions.
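    The screening idea can be sketched with plain mutual information on discretised data; note that the paper's PMI additionally conditions on already-selected inputs, which is what lets it reject a redundant predictor (plain MI, as below, scores a duplicate predictor as highly as the original). The data here are synthetic.

```python
# Simplified stand-in for PMI-based input screening: rank discretised
# candidate predictors by mutual information with the target. Synthetic data.
import math
from collections import Counter

def mutual_info(xs, ys):
    """Mutual information (nats) between two discrete sequences."""
    n = len(xs)
    pxy, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum(c / n * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# y copies x1; x2 is unrelated noise; x3 duplicates x1 (redundant).
x1 = [0, 1, 0, 1, 1, 0, 1, 0, 0, 1]
x2 = [0, 0, 1, 1, 0, 1, 0, 1, 1, 0]
x3 = list(x1)
y  = list(x1)

scores = {"x1": mutual_info(x1, y), "x2": mutual_info(x2, y),
          "x3": mutual_info(x3, y)}
best = max(scores, key=scores.get)   # top-ranked predictor
```

    Plain MI correctly ranks the irrelevant x2 last, but scores the redundant x3 identically to x1; conditioning on x1 (as PMI does) would drive x3's score toward zero, which is why PMI yields smaller predictor sets.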

  6. A methodology and supply chain management inspired reference ontology for modeling healthcare teams.

    PubMed

    Kuziemsky, Craig E; Yazdi, Sara

    2011-01-01

Numerous studies and strategic plans are advocating more team-based healthcare delivery that is facilitated by information and communication technologies (ICTs). However, before we can design ICTs to support teams, we need a solid conceptual model of team processes and a methodology for using such a model in healthcare settings. This paper draws upon success in the supply chain management domain to develop a reference ontology of healthcare teams and a methodology for modeling teams to instantiate the ontology in specific settings. This research can help us understand how teams function and how we can design ICTs to support teams.

  7. Intuitive presentation of clinical forensic data using anonymous and person-specific 3D reference manikins.

    PubMed

    Urschler, Martin; Höller, Johannes; Bornik, Alexander; Paul, Tobias; Giretzlehner, Michael; Bischof, Horst; Yen, Kathrin; Scheurer, Eva

    2014-08-01

The increasing use of CT/MR devices in forensic analysis motivates the need to present forensic findings from different sources in an intuitive reference visualization, with the aim of combining 3D volumetric images and digital photographs of external findings into a 3D computer graphics model. This model allows a comprehensive presentation of forensic findings in court and enables comparative evaluation studies correlating data sources. The goal of this work was to investigate different methods to generate anonymous and patient-specific 3D models which may be used as reference visualizations. The issue of registering 3D volumetric as well as 2D photographic data to such 3D models is addressed to provide an intuitive context for injury documentation from arbitrary modalities. We present an image processing and visualization work-flow, discuss the major parts of this work-flow, compare the different investigated reference models, and show a number of case studies that underline the suitability of the proposed work-flow for presenting forensically relevant information in 3D visualizations. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  8. Resources for National Water Savings for Outdoor Water Use

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melody, Moya; Stratton, Hannah; Williams, Alison

    2014-05-01

In support of efforts by the U.S. Environmental Protection Agency's (EPA's) WaterSense program to develop a spreadsheet model for calculating the national water and financial savings attributable to WaterSense certification and labeling of weather-based irrigation controllers, Lawrence Berkeley National Laboratory reviewed reports, technical data, and other information related to outdoor water use and irrigation controllers. In this document we categorize and describe the reviewed references, highlighting pertinent data. We relied on these references when developing model parameters and calculating controller savings. We grouped resources into three major categories: landscapes (section 1); irrigation devices (section 2); and analytical and modeling efforts (section 3). Each category is subdivided further as described in its section. References are listed in order of date of publication, most recent first.

  9. A reference model for space data system interconnection services

    NASA Astrophysics Data System (ADS)

    Pietras, John; Theis, Gerhard

    1993-03-01

The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous in the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).
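    The layering convention can be illustrated with a toy stack in which each service interprets only what the layer below delivers. The header sizes and layer names here are invented for illustration; they are not CCSDS-defined services.

```python
# Toy layered stack (illustrative only): each function provides a service to
# the layer above by stripping/interpreting a hypothetical header.

def link_layer(frame: bytes) -> bytes:
    """Strip a hypothetical 2-byte frame header, yielding a packet."""
    return frame[2:]

def packet_layer(packet: bytes) -> bytes:
    """Strip a hypothetical 4-byte packet header, yielding user data."""
    return packet[4:]

def application_layer(data: bytes) -> str:
    """Interpret user data as text telemetry."""
    return data.decode("ascii")

# Each layer consumes the service of the one below it.
frame = b"\x01\x02HDR1TEMP=21C"
reading = application_layer(packet_layer(link_layer(frame)))
```

    The value of the reference-model convention is exactly this separation: a layer can be redefined (e.g., a different packet format) without touching the services above or below it.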

  10. A reference model for space data system interconnection services

    NASA Technical Reports Server (NTRS)

    Pietras, John; Theis, Gerhard

    1993-01-01

The widespread adoption of standard packet-based data communication protocols and services for spaceflight missions provides the foundation for other standard space data handling services. These space data handling services can be defined as increasingly sophisticated processing of data or information received from lower-level services, using a layering approach made famous in the International Organization for Standardization (ISO) Open System Interconnection Reference Model (OSI-RM). The Space Data System Interconnection Reference Model (SDSI-RM) incorporates the conventions of the OSI-RM to provide a framework within which a complete set of space data handling services can be defined. The use of the SDSI-RM is illustrated through its application to data handling services and protocols that have been defined by, or are under consideration by, the Consultative Committee for Space Data Systems (CCSDS).

  11. Surface Gravity Data Contribution to the Puerto Rico and U.S. Virgin Islands Geoid Model

    NASA Astrophysics Data System (ADS)

    Li, X.; Gerhards, C.; Holmes, S. A.; Saleh, J.; Shaw, B.

    2015-12-01

The Gravity for the Redefinition of the American Vertical Datum (GRAV-D) project provides updated local gravity field information for the XGEOID15 models. In particular, its airborne gravity data in the area of Puerto Rico and the U.S. Virgin Islands (PRVI) made substantial improvements (~60%) in the precision of the geoid models at the local GNSS/leveling bench marks in the target area. Fortunately, PRVI is free of the huge systematic error in the North American Vertical Datum of 1988 (NAVD88); thus, the airborne contribution was evaluated more realistically. In addition, the airborne data picked up more detailed gravity field information in the medium wavelength band (spherical harmonic degree 200 to 600) that is largely beyond the resolution of the current satellite missions, especially along the nearby ocean trench areas. Under this circumstance (significant airborne contributions in the medium band), local surface gravity data need to be examined more carefully than before when merged with the satellite and airborne information for local geoid improvement, especially considering the well-known systematic problems in the NGS historical gravity holdings (Saleh et al 2013 JoG). Initial tests showed that it is very important to maintain high consistency between the surface data sets and the airborne-enhanced reference model. In addition, a new aggregation method (Gerhards 2014, Inverse Problems) will also be tested to optimally combine the local surface data with the reference model. The data cleaning and combining procedures in the target area will be summarized here as a reference for future applications.

  12. The Computational Science Education Reference Desk: A tool for increasing inquiry based learning in the science classroom

    NASA Astrophysics Data System (ADS)

    Joiner, D. A.; Stevenson, D. E.; Panoff, R. M.

    2000-12-01

The Computational Science Reference Desk is an online tool designed to provide educators in math, physics, astronomy, biology, chemistry, and engineering with information on how to use computational science to enhance inquiry based learning in the undergraduate and pre-college classroom. The Reference Desk features a showcase of original content exploration activities, including lesson plans and background materials; a catalog of websites which contain models, lesson plans, software, and instructional resources; and a forum to allow educators to communicate their ideas. Many of the recent advances in astronomy rely on the use of computer simulation, and tools are being developed by CSERD to allow students to experiment with some of the models that have guided scientific discovery. One of these models allows students to study how scientists use spectral information to determine the makeup of the interstellar medium by modeling the interstellar extinction curve using spherical grains of silicate, amorphous carbon, or graphite. Students can directly compare their model to the average interstellar extinction curve, and experiment with how small changes in their model alter the shape of the interstellar extinction curve. A simpler model allows students to visualize spatial relationships between the Earth, Moon, and Sun to understand the cause of the phases of the moon. A report on the usefulness of these models in two classes, the Computational Astrophysics workshop at The Shodor Education Foundation and the Conceptual Astronomy class at the University of North Carolina at Greensboro, will be presented.
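    The Earth-Moon-Sun geometry behind the lunar phases reduces to one relation: treating the Sun as distant, the illuminated fraction of the Moon's disc seen from Earth follows from the Moon's elongation from the Sun. A minimal version of such a classroom model (our sketch, not CSERD's actual tool):

```python
# Illuminated fraction of the Moon's disc as a function of its elongation
# (angular separation from the Sun as seen from Earth), assuming a distant
# Sun: f = (1 - cos(elongation)) / 2.
import math

def illuminated_fraction(elongation_deg):
    return (1.0 - math.cos(math.radians(elongation_deg))) / 2.0

# new moon (0 deg) -> 0, first/last quarter (90 deg) -> 0.5, full (180 deg) -> 1
phases = {d: illuminated_fraction(d) for d in (0, 90, 180)}
```

    Students can vary the elongation continuously to see why the phases are a geometric effect of viewing angle, not of Earth's shadow.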

  13. Models as Feedback: Developing Representational Competence in Chemistry

    ERIC Educational Resources Information Center

    Padalkar, Shamin; Hegarty, Mary

    2015-01-01

    Spatial information in science is often expressed through representations such as diagrams and models. Learning the strengths and limitations of these representations and how to relate them are important aspects of developing scientific understanding, referred to as "representational competence." Diagram translation is particularly…

  14. Survey of simulation methods for modeling pulsed sieve-plate extraction columns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burkhart, L.

    1979-03-01

    The report first considers briefly the use of liquid-liquid extraction in nuclear fuel reprocessing and then describes the operation of the pulse column. Currently available simulation models of the column are reviewed, followed by an analysis of the information presently available from which the necessary parameters can be obtained for use in a model of the column. Finally, overall conclusions are given regarding the information needed to develop an accurate model of the column for materials accountability in fuel reprocessing plants. 156 references.

  15. A Modernized National Spatial Reference System in 2022: Focus on the Caribbean Terrestrial Reference Frame

    NASA Astrophysics Data System (ADS)

    Roman, D. R.

    2017-12-01

    In 2022, the National Geodetic Survey will replace all three NAD 83 reference frames with four new terrestrial reference frames. Each frame will be named after a tectonic plate (North American, Pacific, Caribbean and Mariana) and each will be related to the IGS frame through three Euler Pole parameters (EPPs). This talk will focus on practical application in the Caribbean region. A working group is being re-established for development of the North American region and will likely also result in analysis of the Pacific region. Both of these regions are adequately covered with existing CORS sites to model the EPPs. The Mariana region currently lacks sufficient coverage, but a separate project is underway to collect additional information to help define EPPs for that region at a later date. The Caribbean region has robust existing coverage through UNAVCO's COCONet and other data sets, but these require further analysis. This discussion will focus on a practical examination of Caribbean sites to establish candidates for determining the Caribbean frame EPPs, as well as an examination of any remaining velocities that might inform a model of motion within that frame (an Intra-Frame Velocity Model). NGS has a vested interest in defining such a model to meet obligations to U.S. citizens in Puerto Rico and the U.S. Virgin Islands. Beyond this, NGS aims to collaborate with other countries in the region through efforts with SIRGAS and UN-GGIM-Americas toward a more acceptable regional model that serves everyone's needs.
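    The EPP-based relation between frames reduces to a rigid plate rotation, with a site's velocity given by v = ω × r. The sketch below illustrates that computation; the pole position, rotation rate, and site coordinates are invented for illustration and are not actual NGS or ITRF parameters:

```python
# Minimal sketch of turning Euler Pole parameters (EPPs) into a site
# velocity via v = omega x r.  All numeric values below are made up.
import math

R = 6.371e6  # mean Earth radius, m

def unit_vector(lat_deg, lon_deg):
    """Geocentric unit vector for a latitude/longitude in degrees."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))

def site_velocity_mm_per_yr(pole_lat, pole_lon, rate_deg_per_Myr,
                            site_lat, site_lon):
    """Velocity of a site on a rigid plate rotating about an Euler pole."""
    w = math.radians(rate_deg_per_Myr) / 1e6              # rad/yr
    wx, wy, wz = (w * c for c in unit_vector(pole_lat, pole_lon))
    rx, ry, rz = (R * c for c in unit_vector(site_lat, site_lon))
    # cross product omega x r, converted from m/yr to mm/yr
    return ((wy * rz - wz * ry) * 1e3,
            (wz * rx - wx * rz) * 1e3,
            (wx * ry - wy * rx) * 1e3)

# hypothetical pole and a site near Puerto Rico (coordinates illustrative)
vx, vy, vz = site_velocity_mm_per_yr(39.0, -104.0, 0.25, 18.2, -66.5)
print(round(math.hypot(vx, vy, vz), 1), "mm/yr")
```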

  16. A New Socio-technical Model for Studying Health Information Technology in Complex Adaptive Healthcare Systems

    PubMed Central

    Sittig, Dean F.; Singh, Hardeep

    2011-01-01

    Conceptual models have been developed to address challenges inherent in studying health information technology (HIT). This manuscript introduces an 8-dimensional model specifically designed to address the socio-technical challenges involved in design, development, implementation, use, and evaluation of HIT within complex adaptive healthcare systems. The 8 dimensions are not independent, sequential, or hierarchical, but rather are interdependent and interrelated concepts similar to compositions of other complex adaptive systems. Hardware and software computing infrastructure refers to equipment and software used to power, support, and operate clinical applications and devices. Clinical content refers to textual or numeric data and images that constitute the “language” of clinical applications. The human computer interface includes all aspects of the computer that users can see, touch, or hear as they interact with it. People refers to everyone who interacts in some way with the system, from developer to end-user, including potential patient-users. Workflow and communication are the processes or steps involved in assuring that patient care tasks are carried out effectively. Two additional dimensions of the model are internal organizational features (e.g., policies, procedures, and culture) and external rules and regulations, both of which may facilitate or constrain many aspects of the preceding dimensions. The final dimension is measurement and monitoring, which refers to the process of measuring and evaluating both intended and unintended consequences of HIT implementation and use. We illustrate how our model has been successfully applied in real-world complex adaptive settings to understand and improve HIT applications at various stages of development and implementation. PMID:20959322

  17. A new sociotechnical model for studying health information technology in complex adaptive healthcare systems.

    PubMed

    Sittig, Dean F; Singh, Hardeep

    2010-10-01

    Conceptual models have been developed to address challenges inherent in studying health information technology (HIT). This manuscript introduces an eight-dimensional model specifically designed to address the sociotechnical challenges involved in design, development, implementation, use and evaluation of HIT within complex adaptive healthcare systems. The eight dimensions are not independent, sequential or hierarchical, but rather are interdependent and inter-related concepts similar to compositions of other complex adaptive systems. Hardware and software computing infrastructure refers to equipment and software used to power, support and operate clinical applications and devices. Clinical content refers to textual or numeric data and images that constitute the 'language' of clinical applications. The human-computer interface includes all aspects of the computer that users can see, touch or hear as they interact with it. People refers to everyone who interacts in some way with the system, from developer to end user, including potential patient-users. Workflow and communication are the processes or steps involved in ensuring that patient care tasks are carried out effectively. Two additional dimensions of the model are internal organisational features (eg, policies, procedures and culture) and external rules and regulations, both of which may facilitate or constrain many aspects of the preceding dimensions. The final dimension is measurement and monitoring, which refers to the process of measuring and evaluating both intended and unintended consequences of HIT implementation and use. We illustrate how our model has been successfully applied in real-world complex adaptive settings to understand and improve HIT applications at various stages of development and implementation.

  18. Mining of hospital laboratory information systems: a model study defining age- and gender-specific reference intervals and trajectories for plasma creatinine in a pediatric population.

    PubMed

    Søeby, Karen; Jensen, Peter Bjødstrup; Werge, Thomas; Sørensen, Steen

    2015-09-01

    Knowledge of the physiological fluctuation and variation of even commonly used biochemical quantities in extreme age groups and during development is sparse. This challenges the clinical interpretation and utility of laboratory tests in these age groups. To explore the utility of hospital laboratory data as a source of information, we analyzed enzymatic plasma creatinine as a model analyte in two large pediatric hospital samples. Plasma creatinine measurements from 9700 children aged 0-18 years were obtained from hospital laboratory databases and partitioned into high-resolution gender and age groups. Normal probability plots were used to deduce the parameters of the normal distributions of healthy creatinine values in the mixed hospital datasets. Furthermore, temporal trajectories were generated from repeated measurements to examine developmental patterns in periods of changing creatinine levels. Creatinine shows great age dependence from birth throughout childhood. We computed and replicated 95% reference intervals in narrow gender and age bins and showed them to be comparable to those determined in healthy population studies. We identified pronounced transitions in creatinine levels at different time points after birth and around the early teens, which challenges the establishment and usefulness of reference intervals in those age groups. The study documents that hospital laboratory data can inform on the developmental aspects of creatinine, on periods with pronounced heterogeneity, and on valid reference intervals. Furthermore, part of the heterogeneity in the creatinine distribution is likely due to differences between the biological and chronological age of children and should be considered when using age-specific reference intervals.
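    The partitioning step described above can be illustrated with synthetic data: bin laboratory values by age and take the central 95% of each bin. This is only a sketch of the binning and percentile step (the study additionally used normal probability plots to separate healthy from pathological values); the linear age trend and all numbers below are fabricated for illustration:

```python
# Sketch: 95% reference intervals (2.5th-97.5th percentile) per one-year
# age bin.  Data are synthetic (creatinine rising linearly with age).
import random
import statistics
from collections import defaultdict

random.seed(0)
samples = []
for _ in range(9700):                      # same sample size as the study
    age = random.uniform(0, 18)
    samples.append((age, random.gauss(25 + 2.5 * age, 6)))  # umol/L, fabricated

by_bin = defaultdict(list)
for age, value in samples:
    by_bin[int(age)].append(value)         # one-year age bins

for age_bin in sorted(by_bin):
    cuts = statistics.quantiles(by_bin[age_bin], n=40)
    low, high = cuts[0], cuts[-1]          # 2.5th and 97.5th percentiles
    print(f"{age_bin:2d}-{age_bin + 1:2d} y: {low:5.1f}-{high:5.1f} umol/L")
```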

  19. "When information is not enough": A model for understanding BRCA-positive previvors' information needs regarding hereditary breast and ovarian cancer risk.

    PubMed

    Dean, Marleah; Scherr, Courtney L; Clements, Meredith; Koruo, Rachel; Martinez, Jennifer; Ross, Amy

    2017-09-01

    To investigate BRCA-positive, unaffected patients' (referred to as previvors) information needs after testing positive for a deleterious BRCA genetic mutation. 25 qualitative interviews were conducted with previvors. Data were analyzed using the constant comparison method of grounded theory. Analysis revealed a theoretical model of previvors' information needs related to the stage of their health journey. Specifically, a four-stage model was developed based on the data: (1) pre-testing information needs, (2) post-testing information needs, (3) pre-management information needs, and (4) post-management information needs. Two recurring dimensions of desired knowledge also emerged within the stages: personal/social knowledge and medical knowledge. While previvors may be genetically predisposed to develop cancer, they have not been diagnosed with cancer, and therefore have different information needs than cancer patients and cancer survivors. This model can serve as a framework for assisting healthcare providers in meeting the specific information needs of cancer previvors. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Technology mediator: a new role for the reference librarian?

    PubMed Central

    Howse, David K; Bracke, Paul J; Keim, Samuel M

    2006-01-01

    The Arizona Health Sciences Library has collaborated with clinical faculty to develop a federated search engine that is useful for meeting real-time clinical information needs. This article proposes a technology mediation role for the reference librarian that was inspired by the project, and describes the collaborative model used for developing technology-mediated services for targeted users. PMID:17040566

  1. Attachment and the Processing of Social Information in Adolescence

    ERIC Educational Resources Information Center

    Dykas, Matthew J.; Cassidy, Jude

    2007-01-01

    A key proposition of attachment theory is that experience-based cognitive representations of attachment, often referred to as internal working models of attachment, influence the manner in which individuals process attachment-relevant social information (Bowlby, 1969/1982, 1973, 1980; Bretherton & Munholland, 1999; Main, Kaplan, & Cassidy, 1985).…

  2. A system for protecting the environment from ionising radiation: selecting reference fauna and flora, and the possible dose models and environmental geometries that could be applied to them.

    PubMed

    Pentreath, R J; Woodhead, D S

    2001-09-28

    In order to demonstrate, explicitly, that the environment can be protected with respect to controlled sources of ionising radiation, it is essential to have a systematic framework within which dosimetry models for fauna and flora can be used. Because of the practical limitations on what could reasonably be modelled and the amount of information that could reasonably be obtained, it is also necessary to limit the application of such models to a 'set' of fauna and flora within a 'reference' context. This paper, therefore, outlines the factors that will need to be considered to select such 'reference' fauna and flora, and describes some of the factors and constraints necessary to develop the associated dosimetry models. It also describes some of the most basic environmental geometries within which the dose models could be set in order to make comparisons amongst different radiation sources.

  3. Development of a clinical pharmacy model within an Australian home nursing service using co-creation and participatory action research: the Visiting Pharmacist (ViP) study.

    PubMed

    Elliott, Rohan A; Lee, Cik Yin; Beanland, Christine; Goeman, Dianne P; Petrie, Neil; Petrie, Barbara; Vise, Felicity; Gray, June

    2017-11-03

    To develop a collaborative, person-centred model of clinical pharmacy support for community nurses and their medication management clients. Co-creation and participatory action research, based on reflection, data collection, interaction and feedback from participants and other stakeholders. A large, non-profit home nursing service in Melbourne, Australia. Older people referred to the home nursing service for medication management, their carers, community nurses, general practitioners (GPs) and pharmacists, a multidisciplinary stakeholder reference group (including consumer representation) and the project team. Feedback and reflections from minutes, notes and transcripts from: project team meetings, clinical pharmacists' reflective diaries and interviews, meetings with community nurses, reference group meetings and interviews and focus groups with 27 older people, 18 carers, 53 nurses, 15 GPs and seven community pharmacists. The model was based on best practice medication management standards and designed to address key medication management issues raised by stakeholders. Pharmacist roles included direct client care and indirect care. Direct care included home visits, medication reconciliation, medication review, medication regimen simplification, preparation of medication lists for clients and nurses, liaison and information sharing with prescribers and pharmacies and patient/carer education. Indirect care included providing medicines information and education for nurses and assisting with review and implementation of organisational medication policies and procedures. The model allowed nurses to refer directly to the pharmacist, enabling timely resolution of medication issues. Direct care was provided to 84 older people over a 15-month implementation period. Ongoing feedback and consultation, in line with participatory action research principles, informed the development and refinement of the model and identification of enablers and challenges. 
A collaborative, person-centred clinical pharmacy model that addressed the needs of clients, carers, nurses and other stakeholders was successfully developed. The model is likely to have applicability to home nursing services nationally and internationally. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  4. Opportunities and Efficiencies in Building a New Service Desk Model.

    PubMed

    Mayo, Alexa; Brown, Everly; Harris, Ryan

    2017-01-01

    In July 2015, the Health Sciences and Human Services Library (HS/HSL) at the University of Maryland, Baltimore (UMB), merged its reference and circulation services, creating the Information Services Department and Information Services Desk. Designing the Information Services Desk with a team approach allowed for the re-examination of the HS/HSL's service model from the ground up. With the creation of a single service point, the HS/HSL was able to create efficiencies, improve the user experience by eliminating handoffs, create a collaborative team environment, and engage information services staff in a variety of new projects.

  5. Markup of temporal information in electronic health records.

    PubMed

    Hyun, Sookyung; Bakken, Suzanne; Johnson, Stephen B

    2006-01-01

    Temporal information plays a critical role in the understanding of clinical narrative (i.e., free text). We developed a representation for marking up temporal information in a narrative, consisting of five elements: 1) reference point, 2) direction, 3) number, 4) time unit, and 5) pattern. We identified 254 temporal expressions from 50 discharge summaries and represented them using our scheme. The overall inter-rater reliability among raters applying the representation model was 75 percent agreement. The model can contribute to temporal reasoning in computer systems for decision support, data mining, and process and outcomes analyses by providing structured temporal information.
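    The five-element representation can be sketched as a simple record type. The class name and example values below are ours, not the authors' markup syntax, applied to a phrase like "two days after admission":

```python
# Illustrative encoding of the five markup elements from the abstract.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TemporalExpression:
    reference_point: str        # e.g. "admission", "discharge"
    direction: str              # "before" / "after" / "at"
    number: Optional[float]     # magnitude of the offset
    time_unit: Optional[str]    # "day", "week", ...
    pattern: Optional[str]      # recurrence pattern, e.g. "b.i.d.", or None

# Markup for the phrase "two days after admission"
expr = TemporalExpression("admission", "after", 2, "day", None)
print(expr)
```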

  6. [Study on the experimental application of floating-reference method to noninvasive blood glucose sensing].

    PubMed

    Yu, Hui; Qi, Dan; Li, Heng-da; Xu, Ke-xin; Yuan, Wei-jie

    2012-03-01

    Weak signals, a low instrument signal-to-noise ratio, continuous variation of the human physiological environment, and interference from other blood components make it difficult to extract blood glucose information from the near-infrared spectrum in noninvasive blood glucose measurement. The floating-reference method, which analyses the effect of glucose concentration variation on the absorption and scattering coefficients, acquires spectra at the reference point and the measurement point, where the light intensity variations from absorption and scattering cancel each other and are largest, respectively. By using the spectrum from the reference point as a reference, the floating-reference method can reduce the interference from variations in the physiological environment and experimental circumstances. In the present paper, the effectiveness of the floating-reference method in improving prediction precision and stability was assessed through application experiments. A comparison was made between models whose data were processed with and without the floating-reference method. The results showed that the root mean square error of prediction (RMSEP) decreased by up to 34.7%. The floating-reference method could reduce the influence of changes in the samples' state, instrument noise and drift, and improve the models' prediction precision and stability effectively.
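    The reported figure of merit, RMSEP, is the root mean square difference between reference and predicted concentrations over a prediction set. A minimal sketch with toy numbers (the values are not from the study):

```python
# RMSEP: root mean square error of prediction over a test set.
import math

def rmsep(reference, predicted):
    """Root mean square error of prediction."""
    return math.sqrt(sum((r - p) ** 2 for r, p in zip(reference, predicted))
                     / len(reference))

ref  = [5.2, 6.1, 4.8, 7.0]   # reference glucose values (mmol/L), toy data
pred = [5.5, 5.9, 5.0, 6.6]   # model predictions, toy data
print(round(rmsep(ref, pred), 3))
```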

  7. Categorization and modeling of information work for H-journal design.

    PubMed Central

    Sundaram, A.

    1996-01-01

    This paper reports on a doctoral research project that examined the work of reference librarians in the health sciences domain. Categories of information work are derived and used to build Hyper-MedLib, a proof-of-concept h-journal. Findings from this study may be used in the design of electronic documents and information systems for the practice environments. PMID:8947690

  8. Modeling Rare and Unique Documents: Using FRBR[subscript OO]/CIDOC CRM

    ERIC Educational Resources Information Center

    Le Boeuf, Patrick

    2012-01-01

    Both the library and the museum communities have developed conceptual models for the information they produce about the collections they hold: FRBR (Functional Requirements for Bibliographic Records) and CIDOC CRM (Conceptual Reference Model). But neither proves perfectly adequate when it comes to some specific types of rare and unique materials:…

  9. University Library Strategy Development: A Conceptual Model of Researcher Performance to Inform Service Delivery

    ERIC Educational Resources Information Center

    Maddox, Alexia; Zhao, Linlin

    2017-01-01

    This case study presents a conceptual model of researcher performance developed by Deakin University Library, Australia. The model aims to organize research performance data into meaningful researcher profiles, referred to as researcher typologies, which support the demonstration of research impact and value. Three dimensions shaping researcher…

  10. Support time-dependent transformations for surveying and GIS: current status and upcoming challenges

    NASA Astrophysics Data System (ADS)

    Mahmoudabadi, H.; Lercier, D.; Vielliard, S.; Mein, N.; Briggs, G.

    2016-12-01

    The support of time-dependent transformations for surveying and GIS is becoming a critical issue. We need to convert positions from the realizations of the International Terrestrial Reference Frame (ITRF) to any national reference frame. This problem is easy to solve when all of the required information is available, but it becomes complicated in a worldwide context. We propose an overview of the current ITRF-aligned reference frames and describe a global solution to support time-dependent transformations between them and the ITRF. We focus on the uncertainties of the station velocities used. As a first approximation, we use a global tectonic plate model to calculate point velocities and show the impact of the velocity model on the coordinate accuracies. Several countries, particularly in tectonically active regions, are developing semi-dynamic reference frames. These frames include local displacement models that are updated regularly and/or after major events (such as earthquakes). Their integration into surveying and GIS applications is an upcoming challenge. We want to encourage the geodetic community to develop and use standard formats.

  11. ECONOMIC GROWTH ANALYSIS SYSTEM: REFERENCE MANUAL VERSION 2.0

    EPA Science Inventory

    The two-volume report describes the development of and provides information needed to operate, the Economic Growth Analysis System (E-GAS) Version 2.0 model. The model will be used to project emissions inventories of volatile organic compounds (VOCs), oxides of nitrogen (NOx), a...

  12. ECONOMIC GROWTH ANALYSIS SYSTEM: REFERENCE MANUAL VERSION 3.0

    EPA Science Inventory

    The two-volume report describes the development of, and provides information needed to operate, the Economic Growth Analysis System (E-GAS) Version 3.0 model. The model will be used to project emissions inventories of volatile organic compounds, oxides of nitrogen, and carbon mon...

  13. Z39.50 and GILS model. [Government Information Locator Service

    NASA Technical Reports Server (NTRS)

    Christian, Eliot

    1994-01-01

    The Government Information Locator System (GILS) is a component of the National Information Infrastructure (NII) which provides electronic access to sources of publicly accessible information maintained throughout the Federal Government. GILS is an internetworking information resource that identifies other information resources, describes the information available in the referenced resources, and provides assistance in how to obtain the information either directly or through intermediaries. The GILS core content which references each Federal information system holding publicly accessible data or information is described in terms of mandatory and optional core elements.

  14. [Study on Information Extraction of Clinic Expert Information from Hospital Portals].

    PubMed

    Zhang, Yuanpeng; Dong, Jiancheng; Qian, Danmin; Geng, Xingyun; Wu, Huiqun; Wang, Li

    2015-12-01

    Clinic expert information provides important references for residents in need of hospital care. Usually, such information is hidden in the deep web and cannot be directly indexed by search engines. To extract clinic expert information from the deep web, the first challenge is to judge whether a search form belongs to the target domain. This paper proposes a novel method based on a domain model, which is a tree structure constructed from the attributes of search interfaces. With this model, search interfaces can be classified into a domain and filled in with domain keywords. Another challenge is to extract information from the returned web pages indexed by search interfaces. To filter the noise information on a web page, a block importance model is proposed. The experiment results indicated that the domain model yielded a precision 10.83% higher than that of the rule-based method, whereas the block importance model yielded an F₁ measure 10.5% higher than that of the XPath method.
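    The form-judgment step can be caricatured as scoring a search interface's field labels against a domain vocabulary and accepting the form when the score clears a threshold. The sketch below is a loose illustration only: the vocabulary, labels, and threshold are invented, and the paper's actual domain model is a tree of search-interface attributes rather than a flat keyword set:

```python
# Loose sketch of judging whether a search form belongs to the
# clinic-expert domain.  Vocabulary and threshold are invented.
CLINIC_EXPERT_TERMS = {"department", "doctor", "specialty", "title", "expertise"}

def is_clinic_expert_form(field_labels, threshold=0.4):
    """Accept a form when enough of its field labels match the vocabulary."""
    labels = {label.strip().lower() for label in field_labels}
    hits = sum(1 for label in labels
               if any(term in label for term in CLINIC_EXPERT_TERMS))
    return hits / max(len(labels), 1) >= threshold

print(is_clinic_expert_form(["Department", "Doctor name", "Specialty"]))
print(is_clinic_expert_form(["Origin city", "Destination", "Date"]))
```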

  15. Modeling Interoperable Information Systems with 3LGM² and IHE.

    PubMed

    Stäubert, S; Schaaf, M; Jahn, F; Brandner, R; Winter, A

    2015-01-01

    Strategic planning of information systems (IS) in healthcare requires descriptions of the current and the future IS state. Enterprise architecture planning (EAP) tools like the 3LGM² tool help to build up and to analyze IS models. A model of the planned architecture can be derived from an analysis of current-state IS models. Building an interoperable IS, i.e. an IS consisting of interoperable components, can be considered a relevant strategic information management goal for many IS in healthcare. Integrating the Healthcare Enterprise (IHE) is an initiative which targets interoperability by using established standards. To link IHE concepts to 3LGM² concepts within the 3LGM² tool. To describe how an information manager can be supported in handling the complex IHE world and planning interoperable IS using 3LGM² models. To describe how developers or maintainers of IHE profiles can be supported by the representation of IHE concepts in 3LGM². Conceptualization and concept mapping methods are used to assign IHE concepts such as domains, integration profiles, actors and transactions to the concepts of the three-layer graph-based meta-model (3LGM²). IHE concepts were successfully linked to 3LGM² concepts. An IHE master model, i.e. an abstract model for IHE concepts, was built with the help of the 3LGM² tool. Two IHE domains were modeled in detail (ITI, QRPH). We describe two use cases for the representation of IHE concepts and IHE domains as 3LGM² models. Information managers can use the IHE master model as a reference model for modeling interoperable IS based on IHE profiles during EAP activities. IHE developers are supported in analyzing the consistency of IHE concepts with the help of the IHE master model and the functions of the 3LGM² tool. The complex relations between IHE concepts can be modeled by using the EAP method 3LGM². The 3LGM² tool offers visualization and analysis features which are now available for the IHE master model. Thus information managers and IHE developers can use or develop IHE profiles systematically. In order to improve the usability and handling of the IHE master model and its usage as a reference model, some further refinements have to be done. Evaluating the use of the IHE master model by information managers and IHE developers is subject to further research.

  16. Application of the 226Ra–230Th–234U and 227Ac–231Pa–235U radiochronometers to uranium certified reference materials

    DOE PAGES

    Rolison, John M.; Treinen, Kerri C.; McHugh, Kelly C.; ...

    2017-11-06

    Uranium certified reference materials (CRM) issued by New Brunswick Laboratory were subjected to dating using four independent uranium-series radiochronometers. In all cases, there was acceptable agreement between the model ages calculated using the 231Pa–235U, 230Th–234U, 227Ac–235U, or 226Ra–234U radiochronometers and either the certified 230Th–234U model date (CRM 125-A and CRM U630) or the known purification date (CRM U050 and CRM U100). Finally, the agreement between the four independent radiochronometers establishes these uranium certified reference materials as ideal informal standards for validating dating techniques utilized in nuclear forensic investigations in the absence of standards with certified model ages for multiple radiochronometers.
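    A radiometric model age assumes the material was completely purified of the daughter nuclide at time zero, so the measured daughter-to-parent ratio dates the purification. For an initially pure parent, D/P(t) = λP/(λD − λP)·(1 − e^(−(λD−λP)t)), which inverts in closed form. The sketch below uses commonly tabulated half-lives for the 230Th/234U pair; the measured ratio is invented for illustration:

```python
# Model-age sketch for a parent-daughter radiochronometer, assuming
# complete daughter purification at t = 0 (the standard model-age
# assumption).  The closed form inverts the daughter-ingrowth equation.
import math

def model_age_years(daughter_parent_ratio, parent_half_life, daughter_half_life):
    """Model age (years) from a measured daughter/parent atom ratio."""
    lp = math.log(2) / parent_half_life     # parent decay constant, 1/yr
    ld = math.log(2) / daughter_half_life   # daughter decay constant, 1/yr
    return -math.log(1 - daughter_parent_ratio * (ld - lp) / lp) / (ld - lp)

# 234U parent (T1/2 ~ 245,500 yr), 230Th daughter (T1/2 ~ 75,690 yr);
# the atom ratio below is illustrative, not a measured CRM value.
ratio = 2.0e-4
print(round(model_age_years(ratio, 245_500, 75_690), 1), "years")
```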

  17. Application of the 226Ra–230Th–234U and 227Ac–231Pa–235U radiochronometers to uranium certified reference materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rolison, John M.; Treinen, Kerri C.; McHugh, Kelly C.

    Uranium certified reference materials (CRM) issued by New Brunswick Laboratory were subjected to dating using four independent uranium-series radiochronometers. In all cases, there was acceptable agreement between the model ages calculated using the 231Pa–235U, 230Th–234U, 227Ac–235U, or 226Ra–234U radiochronometers and either the certified 230Th–234U model date (CRM 125-A and CRM U630) or the known purification date (CRM U050 and CRM U100). Finally, the agreement between the four independent radiochronometers establishes these uranium certified reference materials as ideal informal standards for validating dating techniques utilized in nuclear forensic investigations in the absence of standards with certified model ages for multiple radiochronometers.

  18. Methods to estimate irrigated reference crop evapotranspiration - a review.

    PubMed

    Kumar, R; Jat, M K; Shankar, V

    2012-01-01

    Efficient water management of crops requires accurate irrigation scheduling which, in turn, requires accurate measurement of the crop water requirement. Irrigation is applied to replenish depleted moisture for optimum plant growth. Reference evapotranspiration plays an important role in the determination of water requirements for crops and in irrigation scheduling. Various models/approaches, varying from empirical to physically based, distributed ones, are available for the estimation of reference evapotranspiration. Mathematical models are useful tools for estimating the evapotranspiration and water requirement of crops, which is essential information required to design or choose the best water management practices. In this paper the most commonly used models/approaches, which are suitable for the estimation of the daily water requirement of agricultural crops grown in different agro-climatic regions, are reviewed. Further, an effort has been made to compare the accuracy of the various widely used methods under different climatic conditions.
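    As an example from the empirical end of the spectrum reviewed here, the Hargreaves-Samani equation estimates daily reference evapotranspiration from temperature extremes and extraterrestrial radiation alone. A sketch with illustrative inputs:

```python
# Hargreaves-Samani (1985) reference evapotranspiration:
#   ET0 = 0.0023 * Ra * (Tmean + 17.8) * sqrt(Tmax - Tmin)
# with Ra expressed as equivalent evaporation (mm/day).
import math

def hargreaves_et0(t_max, t_min, ra_mm_per_day):
    """Daily reference evapotranspiration (mm/day), temperatures in deg C."""
    t_mean = (t_max + t_min) / 2
    return 0.0023 * ra_mm_per_day * (t_mean + 17.8) * math.sqrt(t_max - t_min)

# Illustrative inputs: Tmax 32 C, Tmin 20 C, Ra 15 mm/day equivalent
print(round(hargreaves_et0(32.0, 20.0, 15.0), 2), "mm/day")
```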

  19. A Model-Driven, Science Data Product Registration Service

    NASA Astrophysics Data System (ADS)

    Hardman, S.; Ramirez, P.; Hughes, J. S.; Joyner, R.; Cayanan, M.; Lee, H.; Crichton, D. J.

    2011-12-01

    The Planetary Data System (PDS) has undertaken an effort to overhaul the PDS data architecture (including the data model, data structures, data dictionary, etc.) and to deploy an upgraded software system (including data services, distributed data catalog, etc.) that fully embraces the PDS federation as an integrated system while taking advantage of modern innovations in information technology (including networking capabilities, processing speeds, and software breakthroughs). A core component of this new system is the Registry Service, which will provide functionality for tracking, auditing, locating, and maintaining artifacts within the system. These artifacts range from data and label files to schemas, dictionary definitions for objects and elements, documents, and services. This service offers a single reference implementation of the registry capabilities detailed in the Consultative Committee for Space Data Systems (CCSDS) Registry Reference Model White Book. The CCSDS Reference Model in turn relies heavily on the Electronic Business using eXtensible Markup Language (ebXML) standards for registry services and the registry information model, managed by the OASIS consortium. Registries are pervasive components in most information systems. For example, data dictionaries, service registries, LDAP directory services, and even databases provide registry-like services. These all include an account of informational items that are used in large-scale information systems, ranging from data values such as names and codes to vocabularies, services and software components. The problem is that many of these registry-like services were designed with their own data models associated with the specific type of artifact they track. Additionally, these services each have their own specific interface for interacting with the service.
    This Registry Service implements the data model specified in the ebXML Registry Information Model (RIM) specification, which supports the various artifacts above as well as offering the flexibility to support customer-defined artifacts. Key features of the Registry Service include:
    - Model-based configuration specifying customer-defined artifact types, the metadata attributes to capture for each artifact type, and supported associations and classification schemes.
    - A REST-based external interface that is accessible via the Hypertext Transfer Protocol (HTTP).
    - Federation of Registry Service instances, allowing associations between registered artifacts across registries as well as queries for artifacts across those same registries. A federation also enables features such as replication and synchronization if desired for a given deployment.
    In addition to its use as a core component of the PDS, the generic implementation of the Registry Service facilitates its applicability as a core component in any science data archive or science data system.
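    The track/associate/locate workflow such a registry exposes can be sketched in miniature. This is an illustrative toy (the class, method, and URN names are all hypothetical), not the ebXML RIM schema or the PDS Registry Service API:

```python
from dataclasses import dataclass, field
import itertools


@dataclass
class Artifact:
    guid: str                 # registry-assigned identifier
    kind: str                 # e.g. "Document", "Schema", "Service"
    metadata: dict = field(default_factory=dict)


class Registry:
    """Toy in-memory registry: register, associate, and locate artifacts."""

    def __init__(self):
        self._items = {}      # guid -> Artifact
        self._assocs = []     # (source guid, association kind, target guid)
        self._ids = itertools.count(1)

    def register(self, kind, **metadata):
        artifact = Artifact(f"urn:example:reg:{next(self._ids)}", kind, metadata)
        self._items[artifact.guid] = artifact
        return artifact.guid

    def associate(self, source, kind, target):
        self._assocs.append((source, kind, target))

    def locate(self, **query):
        """Return artifacts whose metadata matches every query key."""
        return [a for a in self._items.values()
                if all(a.metadata.get(k) == v for k, v in query.items())]
```

    In the real service these operations sit behind the REST/HTTP interface, and federation would fan a query like locate() out across multiple registry instances.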

  20. Validating archetypes for the Multiple Sclerosis Functional Composite.

    PubMed

    Braun, Michael; Brandt, Alexander Ulrich; Schulz, Stefan; Boeker, Martin

    2014-08-03

    Numerous information models for electronic health records, such as openEHR archetypes, are available. The quality of such clinical models is important to guarantee standardised semantics and to facilitate their interoperability. However, validation aspects have not yet received sufficient attention. The objective of this report is to investigate the feasibility of archetype development and its community-based validation process, presuming that this review process is a practical way to ensure high-quality information models amending the formal reference model definitions. A standard archetype development approach was applied to a case set of three clinical tests for multiple sclerosis assessment: after an analysis of the tests, the obtained data elements were organised and structured. The appropriate archetype class was selected and the data elements were implemented in an iterative refinement process. Clinical and information modelling experts validated the models in a structured review process. Four new archetypes were developed and publicly deployed in the openEHR Clinical Knowledge Manager, an online platform provided by the openEHR Foundation. Afterwards, these four archetypes were validated by domain experts in a team review. The review was a formalised process, organised in the Clinical Knowledge Manager. Both the development and the review process turned out to be time-consuming tasks, mostly due to difficult selection processes between alternative modelling approaches. The archetype review was a straightforward team process with the goal of validating archetypes pragmatically. The quality of medical information models is crucial to guarantee standardised semantic representation in order to improve interoperability.
    The validation process is a practical way to better harmonise models that diverge due to necessary flexibility left open by the underlying formal reference model definitions. This case study provides evidence that both community- and tool-enabled review processes, structured in the Clinical Knowledge Manager, ensure archetype quality. It offers a pragmatic but feasible way to reduce variation in the representation of clinical information models towards a more unified and interoperable model.

  1. Validating archetypes for the Multiple Sclerosis Functional Composite

    PubMed Central

    2014-01-01

    Background Numerous information models for electronic health records, such as openEHR archetypes, are available. The quality of such clinical models is important to guarantee standardised semantics and to facilitate their interoperability. However, validation aspects have not yet received sufficient attention. The objective of this report is to investigate the feasibility of archetype development and its community-based validation process, presuming that this review process is a practical way to ensure high-quality information models amending the formal reference model definitions. Methods A standard archetype development approach was applied to a case set of three clinical tests for multiple sclerosis assessment: after an analysis of the tests, the obtained data elements were organised and structured. The appropriate archetype class was selected and the data elements were implemented in an iterative refinement process. Clinical and information modelling experts validated the models in a structured review process. Results Four new archetypes were developed and publicly deployed in the openEHR Clinical Knowledge Manager, an online platform provided by the openEHR Foundation. Afterwards, these four archetypes were validated by domain experts in a team review. The review was a formalised process, organised in the Clinical Knowledge Manager. Both the development and the review process turned out to be time-consuming tasks, mostly due to difficult selection processes between alternative modelling approaches. The archetype review was a straightforward team process with the goal of validating archetypes pragmatically. Conclusions The quality of medical information models is crucial to guarantee standardised semantic representation in order to improve interoperability. The validation process is a practical way to better harmonise models that diverge due to necessary flexibility left open by the underlying formal reference model definitions.
This case study provides evidence that both community- and tool-enabled review processes, structured in the Clinical Knowledge Manager, ensure archetype quality. It offers a pragmatic but feasible way to reduce variation in the representation of clinical information models towards a more unified and interoperable model. PMID:25087081

  2. Spreading gossip in social networks.

    PubMed

    Lind, Pedro G; da Silva, Luciano R; Andrade, José S; Herrmann, Hans J

    2007-09-01

    We study a simple model of information propagation in social networks, in which two quantities are introduced: the spread factor, which measures the average maximal reachability of the neighbors of a given node that interchange information among each other, and the spreading time needed for the information to reach such a fraction of nodes. When the information refers to a particular node at which both quantities are measured, the model can be taken as a model for gossip propagation. In this context, we apply the model to real empirical networks of social acquaintances and compare the underlying spreading dynamics with different types of scale-free and small-world networks. We find that the number of friendship connections strongly influences the probability of being gossiped about. Finally, we discuss how the spread factor can be applied to other situations.

  3. Spreading gossip in social networks

    NASA Astrophysics Data System (ADS)

    Lind, Pedro G.; da Silva, Luciano R.; Andrade, José S., Jr.; Herrmann, Hans J.

    2007-09-01

    We study a simple model of information propagation in social networks, in which two quantities are introduced: the spread factor, which measures the average maximal reachability of the neighbors of a given node that interchange information among each other, and the spreading time needed for the information to reach such a fraction of nodes. When the information refers to a particular node at which both quantities are measured, the model can be taken as a model for gossip propagation. In this context, we apply the model to real empirical networks of social acquaintances and compare the underlying spreading dynamics with different types of scale-free and small-world networks. We find that the number of friendship connections strongly influences the probability of being gossiped about. Finally, we discuss how the spread factor can be applied to other situations.
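    The spread factor for a single node can be illustrated in a few lines of plain Python. This is a sketch under one reading of the model (gossip about a victim travels only along edges that connect the victim's neighbours, so the maximal reachable fraction is the size of the largest connected component of the neighbour subgraph); it is not the authors' code, and `adj` is a hypothetical adjacency mapping:

```python
def spread_factor(adj, victim):
    """Largest fraction of the victim's neighbours reachable by gossip
    that hops only along edges between those neighbours.

    adj : dict mapping each node to the set of its neighbours.
    """
    nbrs = adj[victim]
    if not nbrs:
        return 0.0
    best, seen = 0, set()
    for start in nbrs:                    # try each possible originator
        if start in seen:
            continue
        comp, stack = set(), [start]      # DFS restricted to the neighbours
        while stack:
            node = stack.pop()
            if node in comp:
                continue
            comp.add(node)
            stack.extend(adj[node] & nbrs - comp)
        seen |= comp
        best = max(best, len(comp))
    return best / len(nbrs)
```

    For the gossip reading, averaging this quantity over all nodes of an empirical acquaintance network gives the network-level spread factor discussed above.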

  4. Education for Business in Iowa. Curriculum and Reference Guide.

    ERIC Educational Resources Information Center

    University of Northern Iowa, Cedar Falls.

    This business education curriculum model contains elementary, middle/junior high, and high school business education courses for Iowa students in the following areas: accounting, basic business, information processing, marketing, and general topics. A curriculum model provides specific courses for different educational levels. Each area contains…

  5. A Model for Teaching Critical Thinking through Online Searching.

    ERIC Educational Resources Information Center

    Crane, Beverley; Markowitz, Nancy Lourie

    1994-01-01

    Presents a model that uses online searching to teach critical thinking skills in elementary and secondary education based on Bloom's taxonomy. Three levels of activity are described: analyzing a search statement; defining and clarifying a problem; and focusing an information need. (Contains 13 references.) (LRW)

  6. Indexes of the proceedings for the nine symposia (international) on detonation, 1951--89

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crane, S.L.; Deal, W.E.; Ramsay, J.B.

    1993-01-01

    The Proceedings of the nine Detonation Symposia have become the major archival source of information on international research in explosive phenomenology, theory, experimental techniques, numerical modeling, and high-rate reaction chemistry. In many cases, they contain the original reference or the only reference to major progress in the field. For some papers, the information is more complete than the complementary article appearing in a formal journal, yet for others, authors elected to publish only an abstract in the Proceedings. For the large majority of papers, the Symposia Proceedings provide the only published reference to a body of work. This report indexes the nine existing Proceedings of the Detonation Symposia by paper titles, topic phrases, authors, and first appearance of acronyms and code names.

  7. Indexes of the Proceedings for the Ten International Symposia on Detonation 1951-93

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deal, William E.; Ramsay, John B.; Roach, Alita M.

    1998-09-01

    The Proceedings of the ten Detonation Symposia have become the major archival source of information on international research in explosive phenomenology, theory, experimental techniques, numerical modeling, and high-rate reaction chemistry. In many cases, they contain the original reference or the only reference to major progress in the field. For some papers, the information is more complete than the complementary article appearing in a formal journal; yet for others, authors elected to publish only an abstract in the Proceedings. For the large majority of papers, the Symposia Proceedings provide the only published reference to a body of work. This report indexes the ten existing Proceedings of the Detonation Symposia by paper titles, topic phrases, authors, and first appearance of acronyms and code names.

  8. (De-)accentuation and the process of information status: evidence from event-related brain potentials.

    PubMed

    Baumann, Stefan; Schumacher, Petra B

    2012-09-01

    The paper reports on a perception experiment in German that investigated the neuro-cognitive processing of information structural concepts and their prosodic marking using event-related brain potentials (ERPs). Experimental conditions controlled the information status (given vs. new) of referring and non-referring target expressions (nouns vs. adjectives) and were elicited via context sentences which, unlike most previous ERP studies in the field, did not trigger an explicit focus expectation. Target utterances displayed prosodic realizations of the critical words that differed in accent position and accent type. Electrophysiological results showed an effect of information status, maximally distributed over posterior sites, displaying a biphasic N400/Late Positivity pattern for new information. We claim that this pattern reflects increased processing demands associated with new information, with the N400 indicating enhanced costs of linking information with the previous discourse and the Late Positivity indicating the listener's effort to update his/her discourse model. The prosodic manipulation registered more pronounced effects over anterior regions and revealed an enhanced negativity followed by a Late Positivity for deaccentuation, probably also reflecting costs of discourse linking and updating, respectively. The data further lend indirect support to the idea that givenness applies not only to referents but also to non-referential expressions ('lexical givenness').

  9. Identification of nutrition communication styles and strategies: a qualitative study among Dutch GPs.

    PubMed

    van Dillen, Sonja M E; Hiddink, Gerrit J; Koelen, Maria A; de Graaf, Cees; van Woerkum, Cees M J

    2006-10-01

    The objectives of this study were to identify the nutrition communication styles of Dutch GPs, their strategies regarding nutrition communication, and their nutrition information seeking behaviours. A further aim was to provide a hypothetical model for nutrition communication style, including psycho-social and socio-demographic variables. Nine focus groups with 81 GPs were used to obtain GPs' perceptions of nutrition communication. Data were analysed with the computer software program NUD*IST. Five nutrition communication styles were identified, namely the informational, reference, motivational, confrontational and holistic styles. Referring to a dietician, providing advice according to Dietary Guidelines, and offering written education materials were mentioned as strategies regarding nutrition communication. GPs sought nutrition information in scientific studies, specialist literature, and postgraduate training courses. The informational style of nutrition communication was dominant among Dutch GPs. GPs hardly provided maintenance advice for nutrition behaviour. Many GPs referred patients to dieticians, who were viewed as colleagues. GPs tried to get basic information about nutrition by scanning the literature, but they were seldom actively involved in seeking specific nutrition information. Although GPs felt that patients expect expert nutrition information, they perceived their own nutrition knowledge as limited. We advise raising GPs' self-efficacy regarding nutrition communication and building good collaboration with dieticians.

  10. BIBLIO: A Computer System Designed to Support the Near-Library User Model of Information Retrieval.

    ERIC Educational Resources Information Center

    Belew, Richard K.; Holland, Maurita Peterson

    1988-01-01

    Description of the development of the Information Exchange Facility, a prototype microcomputer-based personal bibliographic facility, covers software selection, user selection, overview of the system, and evaluation. The plan for an integrated system, BIBLIO, and the future role of libraries are discussed. (eight references) (MES)

  11. Design and Parametric Study of the Magnetic Sensor for Position Detection in Linear Motor Based on Nonlinear Parametric Model Order Reduction

    PubMed Central

    Paul, Sarbajit; Chang, Junghwan

    2017-01-01

    This paper presents a design approach for a magnetic sensor module to detect mover position using proper orthogonal decomposition-dynamic mode decomposition (POD-DMD)-based nonlinear parametric model order reduction (PMOR). The parameterization of the sensor module is achieved by using the multipolar moment matching method. Several geometric variables of the sensor module are considered while developing the parametric study. The operation of the sensor module is based on the principle of detecting the airgap flux density distribution with a Hall effect IC. Therefore, the design objective is to achieve a peak flux density (PFD) greater than 0.1 T and total harmonic distortion (THD) less than 3%. To fulfill these constraints, the specifications for the sensor module are obtained using the POD-DMD-based reduced model, which provides a platform to analyze a large number of design models quickly, with less computational burden. Finally, the experimental prototype is built to the final specifications and tested. Two different modes, 90° and 120°, are used to obtain the position information of the linear motor mover. The position information thus obtained is compared with the linear scale data, used as a reference signal. The position information obtained using the 120° mode has a standard deviation of 0.10 mm from the reference linear scale signal, whereas the 90° mode position signal shows a deviation of 0.23 mm from the reference. The deviation in the output arises from mechanical tolerances introduced into the specification during the manufacturing process. This provides scope for coupling reliability-based design optimization into the design process as a future extension. PMID:28671580

  12. Design and Parametric Study of the Magnetic Sensor for Position Detection in Linear Motor Based on Nonlinear Parametric model order reduction.

    PubMed

    Paul, Sarbajit; Chang, Junghwan

    2017-07-01

    This paper presents a design approach for a magnetic sensor module to detect mover position using proper orthogonal decomposition-dynamic mode decomposition (POD-DMD)-based nonlinear parametric model order reduction (PMOR). The parameterization of the sensor module is achieved by using the multipolar moment matching method. Several geometric variables of the sensor module are considered while developing the parametric study. The operation of the sensor module is based on the principle of detecting the airgap flux density distribution with a Hall effect IC. Therefore, the design objective is to achieve a peak flux density (PFD) greater than 0.1 T and total harmonic distortion (THD) less than 3%. To fulfill these constraints, the specifications for the sensor module are obtained using the POD-DMD-based reduced model, which provides a platform to analyze a large number of design models quickly, with less computational burden. Finally, the experimental prototype is built to the final specifications and tested. Two different modes, 90° and 120°, are used to obtain the position information of the linear motor mover. The position information thus obtained is compared with the linear scale data, used as a reference signal. The position information obtained using the 120° mode has a standard deviation of 0.10 mm from the reference linear scale signal, whereas the 90° mode position signal shows a deviation of 0.23 mm from the reference. The deviation in the output arises from mechanical tolerances introduced into the specification during the manufacturing process. This provides scope for coupling reliability-based design optimization into the design process as a future extension.
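    The POD half of such a reduced-order model can be sketched with a plain singular value decomposition: field snapshots become columns of a matrix, and the leading left singular vectors form the reduced basis. This is a generic NumPy illustration of that standard construction, not the authors' POD-DMD implementation; the DMD and parameterization stages are omitted:

```python
import numpy as np


def pod_basis(snapshots, energy=0.99):
    """Return the smallest set of POD modes (left singular vectors)
    capturing the requested fraction of the snapshot energy.

    snapshots : (n_points, n_samples) array, one field sample per column.
    """
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    captured = np.cumsum(s**2) / np.sum(s**2)   # cumulative energy fraction
    r = int(np.searchsorted(captured, energy)) + 1
    return U[:, :r], s[:r]
```

    A reduced model then evolves only the r modal coefficients instead of the full field, which is what makes sweeping many candidate sensor geometries cheap.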

  13. Event boundaries and anaphoric reference.

    PubMed

    Thompson, Alexis N; Radvansky, Gabriel A

    2016-06-01

    The current study explored the finding that parsing a narrative into separate events impairs anaphor resolution. According to the Event Horizon Model, when a narrative event boundary is encountered, a new event model is created. Information associated with the prior event model is removed from working memory. So long as the event model containing the anaphor referent is currently being processed, this information should still be available when there is no narrative event boundary, even if reading has been disrupted by a working-memory-clearing distractor task. In those cases, readers may reactivate their prior event model, and anaphor resolution would not be affected. Alternatively, comprehension may not be as event oriented as this account suggests. Instead, any disruption of the contents of working memory during comprehension, event related or not, may be sufficient to disrupt anaphor resolution. In this case, reading comprehension would be more strongly guided by other, more basic language processing mechanisms and the event structure of the described events would play a more minor role. In the current experiments, participants were given stories to read in which we included, between the anaphor and its referent, either the presence of a narrative event boundary (Experiment 1) or a narrative event boundary along with a working-memory-clearing distractor task (Experiment 2). The results showed that anaphor resolution was affected by narrative event boundaries but not by a working-memory-clearing distractor task. This is interpreted as being consistent with the Event Horizon Model of event cognition.

  14. Development and Validation of a Computational Model for Androgen Receptor Activity

    PubMed Central

    2016-01-01

    Testing thousands of chemicals to identify potential androgen receptor (AR) agonists or antagonists would cost millions of dollars and take decades to complete using current validated methods. High-throughput in vitro screening (HTS) and computational toxicology approaches can more rapidly and inexpensively identify potential androgen-active chemicals. We integrated 11 HTS ToxCast/Tox21 in vitro assays into a computational network model to distinguish true AR pathway activity from technology-specific assay interference. The in vitro HTS assays probed perturbations of the AR pathway at multiple points (receptor binding, coregulator recruitment, gene transcription, and protein production) and multiple cell types. Confirmatory in vitro antagonist assay data and cytotoxicity information were used as additional flags for potential nonspecific activity. Validating such alternative testing strategies requires high-quality reference data. We compiled 158 putative androgen-active and -inactive chemicals from a combination of international test method validation efforts and semiautomated systematic literature reviews. Detailed in vitro assay information and results were compiled into a single database using a standardized ontology. Reference chemical concentrations that activated or inhibited AR pathway activity were identified to establish a range of potencies with reproducible reference chemical results. Comparison with existing Tier 1 AR binding data from the U.S. EPA Endocrine Disruptor Screening Program revealed that the model identified binders at relevant test concentrations (<100 μM) and was more sensitive to antagonist activity. The AR pathway model based on the ToxCast/Tox21 assays had balanced accuracies of 95.2% for agonist (n = 29) and 97.5% for antagonist (n = 28) reference chemicals. 
Out of 1855 chemicals screened in the AR pathway model, 220 chemicals demonstrated AR agonist or antagonist activity and an additional 174 chemicals were predicted to have potential weak AR pathway activity. PMID:27933809
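    Balanced accuracy, the metric quoted above, is the mean of sensitivity and specificity. A minimal illustration follows; the confusion-matrix counts are hypothetical, not the paper's data:

```python
def balanced_accuracy(tp, fn, tn, fp):
    """Mean of the true-positive rate (sensitivity) and the true-negative
    rate (specificity); unlike raw accuracy, it is not inflated by
    unequal class sizes."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return (sensitivity + specificity) / 2


# Hypothetical example: 27 of 29 active reference chemicals detected,
# 1 false positive among 129 inactives (illustrative counts only).
score = balanced_accuracy(tp=27, fn=2, tn=128, fp=1)
```
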

  15. Integral Nursing: An Emerging Framework for Engaging the Evolution of the Profession.

    ERIC Educational Resources Information Center

    Fiandt, Kathryn; Forman, John; Megel, Mary Erickson; Pakieser, Ruth A.; Burge, Stephanie

    2003-01-01

    Proposes the Integral Nursing framework, which combines Wilber's All-Quadrant/All-Level model, a heuristic device to organize human experience, and the Spiral Dynamics model of human development organized around value memes or cultural units of information. Includes commentary by Beth L. Rodgers. (Contains 17 references.) (JOW)

  16. Estimation of the upper bound of predictive performance for alternative models that use in vivo reference data (OpenTox USA 2017)

    EPA Science Inventory

    The number of chemicals with limited toxicological information available for chemical safety decision-making has accelerated the development of alternative models, which are often evaluated by reference to animal toxicology studies. In vivo studies are generally considered the standard for hazard ...

  17. Job Aid Manuals for Phase II--DESIGN of the Instructional Systems Development Model.

    ERIC Educational Resources Information Center

    Schulz, Russel E.; Farrell, Jean R.

    Designed to supplement the descriptive authoring flowcharts presented in a companion volume, this manual includes specific guidance, examples, and other information referred to in the flowcharts for the implementation of the second phase of the Instructional Systems Development Model (ISD). The introductory section includes definitions;…

  18. Job Aid Manuals for Phase I--ANALYZE of the Instructional Systems Development Model.

    ERIC Educational Resources Information Center

    Schulz, Russel E.; Farrell, Jean R.

    Designed to supplement the descriptive authoring flowcharts in a companion volume, this manual includes specific guidance, examples, and other information referred to in the flowcharts for the implementation of the first phase of the Instructional Systems Development Model (ISD). The introductory section includes definitions; descriptions of…

  19. Student Flow Model SFM-IA: System Documentation. Technical Report 41B. Preliminary Edition.

    ERIC Educational Resources Information Center

    Busby, John C.; Johnson, Richard S.

    Technical specifications, operating procedures, and reference information for the National Center for Higher Education Management Systems' (NCHEMS) Student Flow Model (SFM) computer programs are presented. Included are narrative descriptions of the system and its modules, specific program documentation for each of the modules, system flowcharts,…

  20. Job Aid Manuals for Phase III--DEVELOP of the Instructional Systems Development Model.

    ERIC Educational Resources Information Center

    Schulz, Russel E.; Farrell, Jean R.

    Designed to supplement the descriptive authoring flowcharts presented in a companion volume, this manual includes specific guidance, examples, and other information referred to in the flowcharts for the implementation of the third phase of the Instructional Systems Development Model (ISD). The introductory section includes definitions;…

  1. Quantification of urban structure on building block level utilizing multisensoral remote sensing data

    NASA Astrophysics Data System (ADS)

    Wurm, Michael; Taubenböck, Hannes; Dech, Stefan

    2010-10-01

    The dynamics of urban environments are a challenge to sustainable development. Urban areas promise wealth, the realization of individual dreams, and power. Hence, many cities are characterized by population growth as well as physical development. Traditional visual mapping and updating of urban structure information is a very laborious and cost-intensive task, especially for large urban areas. For this purpose, we developed a workflow for extracting the relevant information by means of object-based image classification. In this manner, multisensor remote sensing data, very high resolution optical satellite imagery together with height information from a digital surface model, have been analyzed to retrieve a detailed 3D city model with the relevant land-use/land-cover information. This information has been aggregated at the level of the building block to describe the urban structure by physical indicators. The indicators derived from the classification were compared with a reference classification of urban structure types to show their correlation. The indicators were then used in a cluster analysis to group the individual blocks into similar clusters.
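    The abstract does not name the clustering algorithm, so as a stand-in sketch, here is a plain k-means over per-block indicator vectors (pure Python; the data and the choice of k-means are assumptions for illustration, each tuple standing for one building block's physical indicators):

```python
import math
import random


def kmeans(points, k, iters=20, seed=0):
    """Plain k-means: group indicator vectors (one per building block)
    into k clusters of similar urban structure."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)           # initial centres from the data
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:                      # assign to the nearest centre
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            groups[i].append(p)
        centers = [tuple(sum(x) / len(g) for x in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups
```

    Each resulting group collects building blocks with similar indicator profiles, i.e. candidate urban structure types.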

  2. PATIENT'S RIGHT TO INFORMED CONSENT IN REPUBLIC SRPSKA: LEGAL AND ETHICAL ASPECTS (WITH SPECIAL REFERENCE TO PHYSICAL REHABILITATION).

    PubMed

    Milinkovic, Igor; Majstorovic, Biljana

    2014-12-01

    The principle of informed consent, which requires a patient's fully informed consent prior to medical treatment, is closely connected with the value of human dignity. The realization and protection of a patient's dignity is not possible without his/her right to choose the character and scope of medical treatment. This goal cannot be adequately achieved within the traditional model of medical paternalism characterized by the physician's authoritative position. The first part of the article deals with the content and ethical significance of the informed consent doctrine. The legal framework of informed consent in Republic Srpska (RS), one of the two Bosnia and Herzegovina (BH) entities, is analyzed. Special reference is made to the relevance of the informed consent principle within the physical rehabilitation process. Although the ethical aspects of physical rehabilitation are often overlooked, this medical field possesses a strong ethical dimension, including an appropriate realization of the patient's right to informed consent.

  3. Enterprise Reference Library

    NASA Technical Reports Server (NTRS)

    Bickham, Grandin; Saile, Lynn; Havelka, Jacque; Fitts, Mary

    2011-01-01

    Introduction: Johnson Space Center (JSC) offers two extensive libraries that contain journals, research literature and electronic resources. Searching capabilities are available to individuals residing onsite or through a librarian's search. Many individuals have rich collections of references, but no mechanisms exist to share reference libraries across researchers, projects, or directorates. Likewise, information regarding which references are provided to which individuals is not available, resulting in duplicate requests, redundant labor costs and associated copying fees. In addition, this tends to limit collaboration between colleagues and promotes the establishment of individual, unshared silos of information. The Integrated Medical Model (IMM) team has utilized a centralized reference management tool during the development, test, and operational phases of this project. The Enterprise Reference Library project expands the capabilities developed for IMM to address the above issues and enhance collaboration across JSC. Method: After significant market analysis for a multi-user reference management tool, no available commercial tool was found to meet this need, so a software program was built around a commercial tool, Reference Manager 12 by The Thomson Corporation. A use case approach guided the requirements development phase. The premise of the design is that individuals use their own reference management software and export to SharePoint when their library is incorporated into the Enterprise Reference Library. This results in a searchable user-specific library application. An accompanying share folder will warehouse the electronic full-text articles, which allows the global user community to access full-text articles. Discussion: An enterprise reference library solution can provide a multidisciplinary collection of full-text articles.
    This approach improves efficiency in obtaining and storing reference material while greatly reducing labor, purchasing and duplication costs. Most importantly, increasing collaboration across research groups provides unprecedented access to information relevant to NASA's mission. Conclusion: This project is an expansion and cost-effective leveraging of the existing JSC centralized library. Adding keyword and author search capabilities and an alert function for notifications about new articles, based on user profiles, represent examples of future enhancements.

  4. Behavioral model of visual perception and recognition

    NASA Astrophysics Data System (ADS)

    Rybak, Ilya A.; Golovan, Alexander V.; Gusakova, Valentina I.

    1993-09-01

    In the processes of visual perception and recognition, human eyes actively select essential information by way of successive fixations at the most informative points of the image. A behavioral program defining a scanpath over the image is formed at the stage of learning (object memorizing) and consists of sequential motor actions, which are shifts of attention from one point of fixation to another, and the sensory signals expected to arrive in response to each shift of attention. In the modern view of the problem, invariant object recognition is provided by the following: (1) separated processing of 'what' (object features) and 'where' (spatial features) information at high levels of the visual system; (2) mechanisms of visual attention using 'where' information; (3) representation of 'what' information in an object-based frame of reference (OFR). However, most recent models of vision based on the OFR have demonstrated the ability of invariant recognition of only simple objects like letters or binary objects without background, i.e. objects to which a frame of reference is easily attached. In contrast, we use not an OFR but a feature-based frame of reference (FFR), connected with the basic feature (edge) at the fixation point. This has provided our model with the ability to represent complex objects in gray-level images invariantly, but demands realization of the behavioral aspects of vision described above. The developed model contains a neural network subsystem of low-level vision, which extracts a set of primary features (edges) in each fixation, and a high-level subsystem consisting of 'what' (Sensory Memory) and 'where' (Motor Memory) modules. The resolution of primary feature extraction decreases with distance from the point of fixation. The FFR provides both the invariant representation of object features in Sensory Memory and the shifts of attention in Motor Memory.
Object recognition consists in successive recall (from Motor Memory) and execution of shifts of attention and successive verification of the expected sets of features (stored in Sensory Memory). The model shows the ability of recognition of complex objects (such as faces) in gray-level images invariant with respect to shift, rotation, and scale.

  5. Overview of Building Information Modelling (BIM) adoption factors for construction organisations

    NASA Astrophysics Data System (ADS)

    Mohammad, W. N. S. Wan; Abdullah, M. R.; Ismail, S.; Takim, R.

    2018-04-01

Improvement and innovation in building visualization, project coordination and communication are the major benefits that Building Information Modelling (BIM) generates for construction organisations. Although many firms across the world are adopting BIM, they often lack a clear direction for doing so, as no specific reference is available for them to follow. Hence, this paper seeks to identify the factors of BIM adoption from previous research. The methodology used in this paper is a literature review of various sources, such as conference articles and journals, with the findings analysed using content analysis. The findings show that 24 factors influencing the adoption of BIM were identified in the literature, and that four factors, namely vendor, organisational vision, knowledge, and implementation plan, are among the least frequently mentioned by previous researchers.

  6. Technical Note: Atmospheric CO2 inversions on the mesoscale using data-driven prior uncertainties: methodology and system evaluation

    NASA Astrophysics Data System (ADS)

    Kountouris, Panagiotis; Gerbig, Christoph; Rödenbeck, Christian; Karstens, Ute; Koch, Thomas Frank; Heimann, Martin

    2018-03-01

Atmospheric inversions are widely used to optimize surface carbon fluxes on a regional scale using information from atmospheric CO2 dry mole fractions. In many studies the prior flux uncertainty applied to the inversion scheme does not directly reflect the true flux uncertainties but is used to regularize the inverse problem. Here, we implement an inversion scheme using the Jena inversion system and apply a prior flux error structure derived from a model-data residual analysis at high spatial and temporal resolution over a full-year period in the European domain. We analyzed the performance of the inversion system with a synthetic experiment, in which the flux constraint is derived following the same residual analysis but applied to the model-model mismatch. The synthetic study showed good agreement between posterior and true fluxes on European, country, annual and monthly scales. Posterior monthly, country-aggregated fluxes improved their correlation coefficient with the known truth by 7 % compared to the prior estimates, with a mean correlation of 0.92. The ratio of standard deviations between posterior and reference, relative to that between prior and reference, was also reduced by 33 %, to a mean value of 1.15. We identified the temporal and spatial scales on which the inversion system maximizes the derived information; monthly temporal scales at around 200 km spatial resolution seem to maximize the information gain.
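The skill metrics used above (correlation with the known truth, and the ratio of standard deviations between an estimate and the reference) are straightforward to compute. A minimal sketch on synthetic stand-in fluxes, since the study's actual data are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly, country-aggregated fluxes (stand-ins for the study's data).
truth = rng.normal(0.0, 1.0, 120)              # "known truth" reference fluxes
prior = truth + rng.normal(0.0, 0.8, 120)      # prior estimate, larger error
posterior = truth + rng.normal(0.0, 0.3, 120)  # posterior after inversion

def skill(estimate, reference):
    """Correlation with the reference and ratio of standard deviations."""
    r = np.corrcoef(estimate, reference)[0, 1]
    sd_ratio = np.std(estimate) / np.std(reference)
    return r, sd_ratio

r_prior, sd_prior = skill(prior, truth)
r_post, sd_post = skill(posterior, truth)
print(f"prior:     r = {r_prior:.2f}, SD ratio = {sd_prior:.2f}")
print(f"posterior: r = {r_post:.2f}, SD ratio = {sd_post:.2f}")
```

With these noise levels the posterior correlation rises and its SD ratio moves toward 1, mirroring the improvements the abstract reports.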

  7. The PDS4 Information Model and its Role in Agile Science Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D.

    2017-12-01

PDS4 is an information model-driven service architecture supporting the capture, management, distribution and integration of massive planetary science data held in distributed data archives world-wide. The PDS4 Information Model (IM), the core element of the architecture, was developed using lessons learned from 20 years of archiving planetary science data and best practices for information model development. Its foundational principles were adopted from the Open Archival Information System (OAIS) Reference Model (ISO 14721), the Metadata Registry Specification (ISO/IEC 11179), and the W3C XML (Extensible Markup Language) specifications. These provided, respectively, an object-oriented model for archive information systems, a comprehensive schema for data dictionaries and hierarchical governance, and rules for encoding documents electronically. The PDS4 Information Model is unique in that it drives the PDS4 infrastructure by providing the representation of concepts and their relationships, constraints, rules, and operations; a sharable, stable, and organized set of information requirements; and machine-parsable definitions that are suitable for configuring and generating code. This presentation will provide an overview of the PDS4 Information Model and how it is being leveraged to develop and evolve the PDS4 infrastructure and enable agile curation of over 30 years of science data collected by the international planetary science community.

  8. CD-SEM real time bias correction using reference metrology based modeling

    NASA Astrophysics Data System (ADS)

    Ukraintsev, V.; Banke, W.; Zagorodnev, G.; Archie, C.; Rana, N.; Pavlovsky, V.; Smirnov, V.; Briginas, I.; Katnani, A.; Vaid, A.

    2018-03-01

Accuracy of patterning impacts yield, IC performance and technology time to market. Accuracy of patterning relies on optical proximity correction (OPC) models built using CD-SEM inputs and on intra-die critical dimension (CD) control based on CD-SEM. Sub-nanometer measurement uncertainty (MU) of CD-SEM is required for current technologies. The reported design- and process-related bias variation of CD-SEM is in the range of several nanometers. Reference metrology and numerical modeling are used to correct SEM, but both methods are too slow for real-time bias correction. We report on real-time CD-SEM bias correction using empirical models based on reference metrology (RM) data. A significant amount of currently untapped information (sidewall angle, corner rounding, etc.) is obtainable from SEM waveforms. Using additional RM information provided for a specific technology (design rules, materials, processes), CD extraction algorithms can be pre-built and then used in real time for accurate CD extraction from regular CD-SEM images. The art and challenge of SEM modeling lie in finding a robust correlation between SEM waveform features and the bias of CD-SEM, as well as in minimizing the RM inputs needed to create a model that is accurate within the design and process space. The new approach was applied to improve CD-SEM accuracy of 45 nm GATE and 32 nm MET1 OPC 1D models. In both cases the MU of the state-of-the-art CD-SEM was improved by 3x and reduced to the nanometer level. A similar approach can be applied to 2D (end of line, contours, etc.) and 3D (sidewall angle, corner rounding, etc.) cases.

  9. Das US-Fachinformationssystem ERIC und die Entwicklung eines Fachinformationssystems Bildung in der Bundesrepublik Deutschland (The American Subject Information System ERIC and the Development of an Information System on Education in the German Federal Republic).

    ERIC Educational Resources Information Center

    Nachrichten fur Dokumentation, 1982

    1982-01-01

    In order to further develop West German information services for education, it is suggested that the ERIC structural model--a coordinating central office and a network of clearinghouses--be developed as a continuation of the existing "Dokumentationsring Padagogik" (DOPAED) documentation service. (16 references) (EJS)

  10. A Novel Multilayer Correlation Maximization Model for Improving CCA-Based Frequency Recognition in SSVEP Brain-Computer Interface.

    PubMed

    Jiao, Yong; Zhang, Yu; Wang, Yu; Wang, Bei; Jin, Jing; Wang, Xingyu

    2018-05-01

Multiset canonical correlation analysis (MsetCCA) has been successfully applied to optimize the reference signals by extracting common features from multiple sets of electroencephalogram (EEG) data for steady-state visual evoked potential (SSVEP) recognition in brain-computer interface applications. To avoid extracting possible noise components as common features, this study proposes a sophisticated extension of MsetCCA, called the multilayer correlation maximization (MCM) model, for further improving SSVEP recognition accuracy. MCM combines the advantages of both CCA and MsetCCA by carrying out three layers of correlation maximization. The first layer extracts the stimulus frequency-related information using CCA between EEG samples and sine-cosine reference signals. The second layer learns reference signals by extracting the common features with MsetCCA. The third layer re-optimizes the reference signal set using CCA with the sine-cosine reference signals again. An experimental study is implemented to validate the effectiveness of the proposed MCM model in comparison with the standard CCA and MsetCCA algorithms. The superior performance of MCM demonstrates its promising potential for the development of an improved SSVEP-based brain-computer interface.

  11. Modeling on the grand scale: LANDFIRE lessons learned

    Treesearch

    Kori Blankenship; Jim Smith; Randy Swaty; Ayn J. Shlisky; Jeannie Patton; Sarah Hagen

    2012-01-01

Between 2004 and 2009, the LANDFIRE project facilitated the creation of approximately 1,200 unique state-and-transition models (STMs) for all major ecosystems in the United States. The primary goal of the modeling effort was to create a consistent and comprehensive set of STMs describing reference conditions and to inform the mapping of a subset of LANDFIRE’s spatial...

  12. Comparative Analysis of Four Manpower Nursing Requirements Models. Health Manpower References. [Nurse Planning Information Series, No. 6].

    ERIC Educational Resources Information Center

    Deane, Robert T.; Ro, Kong-Kyun

    The analysis and description of four manpower nursing requirements models-- the Pugh-Roberts, the Vector, the Community Systems Foundation (CSF), and the Western Interstate Commission of Higher Education (WICHE)--are presented in this report. The introduction provides an overview of the project which was designed to analyze these different models.…

  13. Making Choices in the Virtual World: The New Model at United Technologies Information Network.

    ERIC Educational Resources Information Center

    Gulliford, Bradley

    1998-01-01

    Describes changes in services of the United Technologies Corporation Information Network from a traditional library system to a virtual system of World Wide Web sites, a document-delivery unit, telephone and e-mail reference, and desktop technical support to provide remote access. Staff time, security, and licensing issues are addressed.…

  14. Developing Managerial Learning Styles in the Context of the Strategic Application of Information and Communications Technologies.

    ERIC Educational Resources Information Center

    Holtham, Clive; Courtney, Nigel

    2001-01-01

    Training for 561 executives in the use of information and communications technologies was based on a model, the Executive Learning Ladder. Results indicated that sense making was accelerated when conducted in peer groups before being extended to less-experienced managers. Learning preference differences played a role. (Contains 38 references.) (SK)

  15. 75 FR 11439 - Airworthiness Directives; Airbus Model A319, A320, and A321 Series Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-11

    ... the erasable programmable read only memory (EPROM) (for certain configurations) in addition to... A320-31-1286, dated January 22, 2008; for related information. Material Incorporated by Reference (i.... For information on the availability of this material at the FAA, call 425-227-1221 or 425-227-1152. (4...

  16. Anatomy of Data Integration

    PubMed Central

    Brazhnik, Olga; Jones, John F.

    2007-01-01

    Producing reliable information is the ultimate goal of data processing. The ocean of data created with the advances of science and technology calls for integration of data coming from heterogeneous sources that are diverse in their purposes, business rules, underlying models and enabling technologies. Reference models, the Semantic Web, standards, ontologies, and other technologies enable fast and efficient merging of heterogeneous data, while the reliability of the produced information is largely defined by how well the data represent reality. In this paper we initiate a framework for assessing the informational value of data that includes data dimensions; aligning data quality with business practices; identifying authoritative sources and integration keys; merging models; and uniting updates of varying frequency and overlapping or gapped data sets. PMID:17071142

  17. Negative self-referential processing is associated with genetic variation in the serotonin transporter-linked polymorphic region (5-HTTLPR): Evidence from two independent studies.

    PubMed

    Dainer-Best, Justin; Disner, Seth G; McGeary, John E; Hamilton, Bethany J; Beevers, Christopher G

    2018-01-01

    The current research examined whether carriers of the short 5-HTTLPR allele (in SLC6A4), who have been shown to selectively attend to negative information, exhibit a bias towards negative self-referent processing. The self-referent encoding task (SRET) was used to measure self-referential processing of positive and negative adjectives. Ratcliff's diffusion model isolated and extracted decision-making components from SRET responses and reaction times. Across the initial (N = 183) and replication (N = 137) studies, results indicated that short 5-HTTLPR allele carriers more easily categorized negative adjectives as self-referential (i.e., higher drift rate). Further, drift rate was associated with recall of negative self-referential stimuli. Findings across both studies provide further evidence that genetic variation may contribute to the etiology of negatively biased processing of self-referent information. Large scale studies examining the genetic contributions to negative self-referent processing may be warranted.

  18. Integrating the Master of Software Assurance Reference Curriculum into the Model Curriculum and Guidelines for Graduate Degree Programs in Information Systems

    DTIC Science & Technology

    2011-02-01

    Model Curriculum and Guidelines for Graduate Degree Programs in Information Systems (MSIS) 2006 is the latest product of a project that has been...conducted for nearly 40 years [Gor- gone 2006]. Various organizations affiliated with the project have developed specifications for the teaching of...considerations helps ensure that an institution’s individual courses of study are relevant to the industry that its students are preparing to enter

  19. Development of a geotechnical GIS for subsurface characterization with three dimensional modeling capabilities.

    DOT National Transportation Integrated Search

    2006-06-01

    The New Hampshire Department of Transportation initiated this research to develop a geographical information system (GIS) that : visualizes subsurface conditions three dimensionally by pulling together geotechnical data containing spatial references....

  20. Improvement of radiology services based on the process management approach.

    PubMed

    Amaral, Creusa Sayuri Tahara; Rozenfeld, Henrique; Costa, Janaina Mascarenhas Hornos; Magon, Maria de Fátima de Andrade; Mascarenhas, Yvone Maria

    2011-06-01

    The health sector requires continuous investments to ensure the improvement of products and services from a technological standpoint, the use of new materials, equipment and tools, and the application of process management methods. Methods associated with the process management approach, such as the development of reference models of business processes, can provide significant innovations in the health sector and respond to the current market trend for modern management in this sector (Gunderman et al. (2008) [4]). This article proposes a process model for diagnostic medical X-ray imaging, from which it derives a primary reference model and describes how this information leads to gains in quality and improvements. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  1. Basic Geometric Support of Systems for Earth Observation from Geostationary and Highly Elliptical Orbits

    NASA Astrophysics Data System (ADS)

    Gektin, Yu. M.; Egoshkin, N. A.; Eremeev, V. V.; Kuznecov, A. E.; Moskatinyev, I. V.; Smelyanskiy, M. B.

    2017-12-01

    A set of standardized models and algorithms for the geometric normalization and georeferencing of images from geostationary and highly elliptical Earth observation systems is considered. The algorithms can process information from modern scanning multispectral sensors with two-coordinate scanning and represent normalized images in an optimal projection. Problems of the high-precision ground calibration of the imaging equipment using reference objects, as well as issues of in-flight calibration and refinement of geometric models using absolute and relative reference points, are considered. Practical testing of the models, algorithms, and technologies is performed in the calibration of sensors for spacecraft of the Electro-L series and during the simulation of the prospective Arktika system.

  2. Impact of the choice of the precipitation reference data set on climate model selection and the resulting climate change signal

    NASA Astrophysics Data System (ADS)

    Gampe, D.; Ludwig, R.

    2017-12-01

    Regional Climate Models (RCMs) that downscale General Circulation Models (GCMs) are the primary tool for projecting future climate and serve as input to many impact models that assess the related changes and impacts under such climate conditions. Such RCMs are made available through the Coordinated Regional Climate Downscaling Experiment (CORDEX). The ensemble of models provides a range of possible future climate changes around the ensemble-mean climate change signal. The model outputs, however, are prone to biases compared to regional observations. A bias correction of these deviations is a crucial step in the impact modelling chain to allow the reproduction of historic conditions of, for example, river discharge. However, the detection and quantification of model biases are highly dependent on the selected regional reference data set. Additionally, in practice it is usually not feasible, due to computational constraints, to consider entire ensembles of climate simulations with all their members as input for impact models that provide information to support decision-making. Although more and more studies focus on model selection based on preservation of the climate model spread, selection based on validity, i.e. the representation of historic conditions, is still a widely applied approach. In this study, several available reference data sets for precipitation are selected to detect the model bias for the reference period 1989 - 2008 over the alpine catchment of the Adige River in Northern Italy. The reference data sets originate from various sources, such as station data and reanalyses. These data sets are remapped to the common RCM grid at 0.11° resolution, and several indicators, such as dry and wet spells, extreme precipitation and general climatology, are calculated to evaluate the capability of the RCMs to reproduce the historical conditions.
The resulting RCM spread is compared against the spread of the reference data set to determine the related uncertainties and detect potential model biases with respect to each reference data set. The RCMs are then ranked based on various statistical measures for each indicator and a score matrix is derived to select a subset of RCMs. We show the impact and importance of the reference data set with respect to the resulting climate change signal on the catchment scale.

  3. Mixture models for protein structure ensembles.

    PubMed

    Hirsch, Michael; Habeck, Michael

    2008-10-01

    Protein structure ensembles provide important insight into the dynamics and function of a protein and contain information that is not captured by a single static structure. However, it is not clear a priori to what extent the variability within an ensemble is caused by internal structural changes. Additional variability results from overall translations and rotations of the molecule, and most experimental data do not provide the information needed to relate the structures to a common reference frame. To report meaningful values of intrinsic dynamics, structural precision, conformational entropy, etc., it is therefore important to disentangle local from global conformational heterogeneity. We consider the task of disentangling local from global heterogeneity as an inference problem. We use probabilistic methods to infer from the protein ensemble missing information on reference frames and stable conformational sub-states. To this end, we model a protein ensemble as a mixture of Gaussian probability distributions of either entire conformations or structural segments. We learn these models from a protein ensemble using the expectation-maximization algorithm. Our first model can be used to find multiple conformers in a structure ensemble. The second model partitions the protein chain into locally stable structural segments or core elements and less structured regions typically found in loops. Both models are simple to implement and contain only a single free parameter: the number of conformers or structural segments. Our models can be used to analyse experimental ensembles, molecular dynamics trajectories and conformational change in proteins. The Python source code for protein ensemble analysis is available from the authors upon request.
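The core machinery described above, a Gaussian mixture fitted by expectation-maximization with the number of components as the single free parameter, can be sketched on toy data. The sketch below uses `sklearn`'s `GaussianMixture` on synthetic "conformations" rather than the authors' own code (which is available only on request); the data and dimensions are made up:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Toy "ensemble": flattened coordinate vectors drawn from two well-separated
# conformational sub-states (60 and 40 members respectively).
state_a = rng.normal(loc=0.0, scale=0.3, size=(60, 6))
state_b = rng.normal(loc=2.0, scale=0.3, size=(40, 6))
ensemble = np.vstack([state_a, state_b])

# EM fit of a two-component Gaussian mixture; n_components is the single
# free parameter, mirroring the model described in the abstract.
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
labels = gmm.fit_predict(ensemble)

# Members of each sub-state should share a component label.
print(np.bincount(labels))
```

The full method additionally infers per-structure rotations and translations to remove global heterogeneity before clustering; that step is not shown here.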

  4. Solid waste projection model: Model version 1. 0 technical reference manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilkins, M.L.; Crow, V.L.; Buska, D.E.

    1990-11-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC). The SWPM system provides a modeling and analysis environment that supports decisions in the process of evaluating various solid waste management alternatives. This document, one of a series describing the SWPM system, contains detailed information regarding the software utilized in developing Version 1.0 of the modeling unit of SWPM. This document is intended for use by experienced software engineers and supports programming, code maintenance, and model enhancement. Those interested in using SWPM should refer to the SWPM Model User's Guide. This document is available from either the PNL project manager (D. L. Stiles, 509-376-4154) or the WHC program monitor (B. C. Anderson, 509-373-2796). 8 figs.

  5. Modeling the Distribution of Fingerprint Characteristics. Revision 1.

    DTIC Science & Technology

    1980-09-19

    The ridge-line details of a print are termed Galton characteristics, since Sir Francis Galton was among the first to study them. (Only the report's table of contents survives in this record: background information on fingerprint types, ridge counts, and the Galton details; data; the Multinomial Markov Model; the Poisson Markov Model; the Infinitely Divisible Model; acknowledgements, references, and appendices.)

  6. Systems and methods that generate height map models for efficient three dimensional reconstruction from depth information

    DOEpatents

    Frahm, Jan-Michael; Pollefeys, Marc Andre Leon; Gallup, David Robert

    2015-12-08

    Methods of generating a three-dimensional representation of an object in a reference plane from a depth map including distances from a reference point to pixels in an image of the object taken from that reference point. Weights are assigned to respective voxels in a three-dimensional grid along rays extending from the reference point through the pixels in the image, based on the distances in the depth map from the reference point to the respective pixels, and a height map comprising an array of height values in the reference plane is formed based on the assigned weights. An n-layer height map may be constructed by generating a probabilistic occupancy grid for the voxels and forming an n-dimensional height map comprising an array of layer height values in the reference plane based on the probabilistic occupancy grid.
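A heavily simplified version of the idea can be sketched in a few lines. The patent casts perspective rays from a reference point; this toy version instead uses one vertical ray per pixel (an orthographic simplification), votes voxels as free or occupied from the depth map, and collapses the grid into a height map. Grid size and depth values are illustrative:

```python
import numpy as np

GRID = 16                                  # voxels per side, and image size

def height_map_from_depth(depth, max_height=GRID):
    """depth[i, j] = distance from the top of the grid down to the surface."""
    weights = np.zeros((GRID, GRID, max_height))
    for k in range(max_height):            # k = height of the voxel layer
        dist_from_top = max_height - 1 - k
        # Voxels above the surface were seen through: negative (free) weight.
        # Voxels at or below the surface get a positive (occupied) weight.
        weights[:, :, k] = np.where(dist_from_top < depth, -1.0, 1.0)
    # Height value per column = number of voxels voted occupied.
    return (weights > 0).sum(axis=2)

depth = np.full((GRID, GRID), 10.0)        # flat floor 10 units down...
depth[4:8, 4:8] = 6.0                      # ...with a 4-unit-tall box on it
hmap = height_map_from_depth(depth)
print(hmap[0, 0], hmap[5, 5])              # prints 6 10
```

The probabilistic n-layer extension would replace the hard ±1 votes with an occupancy probability per voxel and extract one height value per layer.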

  7. BRIDGING GAPS BETWEEN ZOO AND WILDLIFE MEDICINE: ESTABLISHING REFERENCE INTERVALS FOR FREE-RANGING AFRICAN LIONS (PANTHERA LEO).

    PubMed

    Broughton, Heather M; Govender, Danny; Shikwambana, Purvance; Chappell, Patrick; Jolles, Anna

    2017-06-01

    The International Species Information System has set forth an extensive database of reference intervals for zoologic species, allowing veterinarians and game park officials to distinguish normal health parameters from underlying disease processes in captive wildlife. However, several recent studies comparing reference values from captive and free-ranging animals have found significant variation between populations, necessitating the development of separate reference intervals in free-ranging wildlife to aid in the interpretation of health data. Thus, this study characterizes reference intervals for six biochemical analytes, eleven hematologic or immune parameters, and three hormones using samples from 219 free-ranging African lions ( Panthera leo ) captured in Kruger National Park, South Africa. Using the original sample population, exclusion criteria based on physical examination were applied to yield a final reference population of 52 clinically normal lions. Reference intervals were then generated via 90% confidence intervals on log-transformed data using parametric bootstrapping techniques. In addition to the generation of reference intervals, linear mixed-effect models and generalized linear mixed-effect models were used to model associations of each focal parameter with the following independent variables: age, sex, and body condition score. Age and sex were statistically significant drivers for changes in hepatic enzymes, renal values, hematologic parameters, and leptin, a hormone related to body fat stores. Body condition was positively correlated with changes in monocyte counts. Given the large variation in reference values taken from captive versus free-ranging lions, it is our hope that this study will serve as a baseline for future clinical evaluations and biomedical research targeting free-ranging African lions.
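The interval-construction step described above (90% bounds from bootstrapping on log-transformed data) can be sketched as follows. The analyte values, sample size handling, and exact bootstrap recipe here are stand-ins, not the study's own:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical analyte values from a clinically normal reference population
# (stand-ins; the study's final reference population was 52 lions).
values = rng.lognormal(mean=1.0, sigma=0.25, size=52)

log_vals = np.log(values)
mu, sd = log_vals.mean(), log_vals.std(ddof=1)

# Parametric bootstrap: resample from the normal fitted on the log scale,
# recompute the 5th/95th percentile bounds each time, and average them.
n_boot = 2000
lows, highs = np.empty(n_boot), np.empty(n_boot)
for b in range(n_boot):
    sample = rng.normal(mu, sd, size=len(values))
    lows[b], highs[b] = np.percentile(sample, [5, 95])

# Back-transform to the original scale for the 90% reference interval.
ref_low, ref_high = np.exp(lows.mean()), np.exp(highs.mean())
print(f"90% reference interval: [{ref_low:.2f}, {ref_high:.2f}]")
```

In practice one would also report confidence intervals on the interval limits themselves (e.g. bootstrap percentiles of `lows` and `highs`), as reference-interval guidelines recommend.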

  8. Reference frames in allocentric representations are invariant across static and active encoding

    PubMed Central

    Chan, Edgar; Baumann, Oliver; Bellgrove, Mark A.; Mattingley, Jason B.

    2013-01-01

    An influential model of spatial memory—the so-called reference systems account—proposes that relationships between objects are biased by salient axes (“frames of reference”) provided by environmental cues, such as the geometry of a room. In this study, we sought to examine the extent to which a salient environmental feature influences the formation of spatial memories when learning occurs via a single, static viewpoint and via active navigation, where information has to be integrated across multiple viewpoints. In our study, participants learned the spatial layout of an object array that was arranged with respect to a prominent environmental feature within a virtual arena. Location memory was tested using judgments of relative direction. Experiment 1A employed a design similar to previous studies whereby learning of object-location information occurred from a single, static viewpoint. Consistent with previous studies, spatial judgments were significantly more accurate when made from an orientation that was aligned, as opposed to misaligned, with the salient environmental feature. In Experiment 1B, a fresh group of participants learned the same object-location information through active exploration, which required integration of spatial information over time from a ground-level perspective. As in Experiment 1A, object-location information was organized around the salient environmental cue. Taken together, the findings suggest that the learning condition (static vs. active) does not affect the reference system employed to encode object-location information. Spatial reference systems appear to be a ubiquitous property of spatial representations, and might serve to reduce the cognitive demands of spatial processing. PMID:24009595

  9. Predicting oropharyngeal tumor volume throughout the course of radiation therapy from pretreatment computed tomography data using general linear models.

    PubMed

    Yock, Adam D; Rao, Arvind; Dong, Lei; Beadle, Beth M; Garden, Adam S; Kudchadker, Rajat J; Court, Laurence E

    2014-05-01

    The purpose of this work was to develop and evaluate the accuracy of several predictive models of variation in tumor volume throughout the course of radiation therapy. Nineteen patients with oropharyngeal cancers were imaged daily with CT-on-rails for image-guided alignment per an institutional protocol. The daily volumes of 35 tumors in these 19 patients were determined and used to generate (1) a linear model in which tumor volume changed at a constant rate, (2) a general linear model that utilized the power fit relationship between the daily and initial tumor volumes, and (3) a functional general linear model that identified and exploited the primary modes of variation between time series describing the changing tumor volumes. Primary and nodal tumor volumes were examined separately. The accuracy of these models in predicting daily tumor volumes was compared with that of static and linear reference models using leave-one-out cross-validation. In predicting the daily volume of primary tumors, the general linear model and the functional general linear model were more accurate than the static reference model by 9.9% (range: -11.6%-23.8%) and 14.6% (range: -7.3%-27.5%), respectively, and were more accurate than the linear reference model by 14.2% (range: -6.8%-40.3%) and 13.1% (range: -1.5%-52.5%), respectively. In predicting the daily volume of nodal tumors, only the 14.4% (range: -11.1%-20.5%) improvement in accuracy of the functional general linear model compared to the static reference model was statistically significant. A general linear model and a functional general linear model trained on data from a small population of patients can predict the primary tumor volume throughout the course of radiation therapy with greater accuracy than standard reference models. These more accurate models may increase the prognostic value of information about the tumor garnered from pretreatment computed tomography images and facilitate improved treatment management.
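The second model's power fit between daily and initial tumor volumes reduces to a linear fit in log-log space. A sketch on synthetic volumes (the cohort size matches the study's 35 tumors, but the coefficients and noise level are made up, not the study's estimates):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic cohort: initial volumes (cm^3) and daily volumes generated from a
# "true" power law V_day = a * V0**b with per-tumor multiplicative noise.
v0 = rng.uniform(5.0, 50.0, size=35)
true_a, true_b = 0.8, 1.05
v_day = true_a * v0**true_b * rng.lognormal(0.0, 0.05, size=v0.size)

# The power fit is linear in log-log space: log(v_day) = log(a) + b*log(v0).
b_hat, log_a_hat = np.polyfit(np.log(v0), np.log(v_day), deg=1)
a_hat = np.exp(log_a_hat)

predicted = a_hat * v0**b_hat
rel_err = np.abs(predicted - v_day) / v_day
print(f"a = {a_hat:.2f}, b = {b_hat:.2f}, mean rel. error {rel_err.mean():.1%}")
```

Refitting `a` and `b` for each treatment day, and validating with leave-one-out cross-validation as the study does, extends this directly.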

  10. Librarianship, Professionalism, and Social Change.

    ERIC Educational Resources Information Center

    Birdsall, William F.

    1982-01-01

    Argues that librarians should be committed to ensure access to knowledge, adhere to encouraging users to be knowledge self-sufficient, avoid outmoded models of professionalism, and not feel threatened by other information dissemination groups. Included are 26 references. (RAA)

  11. The 1995 revision of the joint US/UK geomagnetic field models - I. Secular variation

    USGS Publications Warehouse

    Macmillan, S.; Barraclough, D.R.; Quinn, J.M.; Coleman, R.J.

    1997-01-01

    We present the methods used to derive mathematical models of the global secular variation of the main geomagnetic field for the period 1985 to 2000. These secular-variation models are used in the construction of the candidate US/UK models for the Definitive Geomagnetic Reference Field at 1990, the International Geomagnetic Reference Field for 1995 to 2000, and the World Magnetic Model for 1995 to 2000 (see paper II, Quinn et al., 1997). The main sources of data for the secular-variation models are geomagnetic observatories and repeat stations. Over areas devoid of these data, secular-variation information is extracted from aeromagnetic and satellite data. We describe how secular variation is predicted up to the year 2000 at the observatories and repeat stations, how the aeromagnetic and satellite data are used, and how all the data are combined to produce the required models.

  12. Output Feedback Adaptive Control of Non-Minimum Phase Systems Using Optimal Control Modification

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan; Hashemi, Kelley E.; Yucelen, Tansel; Arabi, Ehsan

    2018-01-01

    This paper describes output feedback adaptive control approaches for non-minimum phase SISO systems with relative degree 1 and non-strictly positive real (SPR) MIMO systems with uniform relative degree 1 using the optimal control modification method. It is well-known that the standard model-reference adaptive control (MRAC) cannot be used to control non-SPR plants to track an ideal SPR reference model. Due to the ideal property of asymptotic tracking, MRAC attempts an unstable pole-zero cancellation which results in unbounded signals for non-minimum phase SISO systems. The optimal control modification can be used to prevent the unstable pole-zero cancellation which results in a stable adaptation of non-minimum phase SISO systems. However, the tracking performance using this approach could suffer if the unstable zero is located far away from the imaginary axis. The tracking performance can be recovered by using an observer-based output feedback adaptive control approach which uses a Luenberger observer design to estimate the state information of the plant. Instead of explicitly specifying an ideal SPR reference model, the reference model is established from the linear quadratic optimal control to account for the non-minimum phase behavior of the plant. With this non-minimum phase reference model, the observer-based output feedback adaptive control can maintain stability as well as tracking performance. However, in the presence of the mismatch between the SPR reference model and the non-minimum phase plant, the standard MRAC results in unbounded signals, whereas a stable adaptation can be achieved with the optimal control modification. An application of output feedback adaptive control for a flexible wing aircraft illustrates the approaches.
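
    The mechanism the abstract builds on can be illustrated with the textbook scalar case of standard model-reference adaptive control. This is only the baseline scheme (the paper's output-feedback design with the optimal control modification for non-minimum phase plants is not reproduced here), and all plant, reference-model and gain values are hypothetical:

```python
import numpy as np

# Scalar plant x' = a*x + b*u with "unknown" a; stable SPR reference model
# xm' = am*xm + bm*r. All numerical values are illustrative.
a, b = 1.0, 1.0
am, bm = -2.0, 2.0
gamma, dt, T = 10.0, 1e-3, 10.0

x = xm = 0.0
kx = kr = 0.0                # adaptive feedback / feedforward gains
errs = []
for step in range(int(T / dt)):
    t = step * dt
    r = 1.0 if (t % 4.0) < 2.0 else -1.0    # square-wave reference command
    u = kx * x + kr * r
    e = x - xm                              # tracking error
    # Lyapunov-based adaptive laws (sign(b) = +1)
    kx += dt * (-gamma * x * e)
    kr += dt * (-gamma * r * e)
    x += dt * (a * x + b * u)               # forward-Euler integration
    xm += dt * (am * xm + bm * r)
    errs.append(abs(e))

early = float(np.mean(errs[:1000]))
late = float(np.mean(errs[-1000:]))
```

    For this minimum-phase scalar plant the tracking error decays as the gains adapt; the abstract's point is precisely that this asymptotic-tracking mechanism breaks down for non-minimum phase plants, motivating the optimal control modification.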

  13. Framework for a clinical information system.

    PubMed

    Van de Velde, R

    2000-01-01

    The current status of our work towards the design and implementation of a reference architecture for a Clinical Information System is presented. This architecture has been developed and implemented from components following a strong underlying conceptual and technological model. Common Object Request Broker Architecture and n-tier technology are used, with centralised and departmental clinical information systems serving as the back-end store for all clinical data. Servers located in the 'middle' tier apply the clinical (business) model and application rules to communicate with so-called 'thin client' workstations. The main characteristics are the focus on modelling and reuse of both data and business logic, reflecting a shift away from data and functional modelling towards object modelling. Scalability, as well as adaptability to constantly changing requirements via component-driven computing, are the main reasons for this approach.

  14. Clinical, information and business process modeling to promote development of safe and flexible software.

    PubMed

    Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn

    2006-09-01

    Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.

  15. Global Geopotential Modelling from Satellite-to-Satellite Tracking,

    DTIC Science & Technology

    1981-10-01

    measured range-rate sampled at regular intervals. The expansion of the potential has been truncated at degree n = 331, because little information on...averaging interval is 4 s, and sampling takes place every 4 s; if residual data are used, with respect to a reference model of specified accuracy, complete...LEGFDN, MODEL, and NVAR...Sample Output...Appendix C: Detailed Listings Degree by Degree

  16. Design and Establishment of Quality Model of Fundamental Geographic Information Database

    NASA Astrophysics Data System (ADS)

    Ma, W.; Zhang, J.; Zhao, Y.; Zhang, P.; Dang, Y.; Zhao, T.

    2018-04-01

    To make quality evaluation of Fundamental Geographic Information Databases (FGIDB) more comprehensive, objective and accurate, this paper studies and establishes a quality model of FGIDB, formed by the standardization of database construction and quality control, the conformity of data-set quality, and the functionality of the database management system. It also designs the overall principles, contents and methods of quality evaluation for FGIDB, providing a basis and reference for carrying out quality control and quality evaluation. Based on this quality model framework, the quality elements, evaluation items and their properties are designed step by step; connected organically, these quality elements and evaluation items constitute the quality model of the FGIDB. This model is the foundation for stipulating quality requirements and evaluating the quality of the FGIDB, and is of great significance for quality assurance in the design and development stage, requirement formulation in the testing and evaluation stage, and the construction of a standard system for quality evaluation technology.

  17. Trauma-Informed Part C Early Intervention: A Vision, A Challenge, A New Reality

    ERIC Educational Resources Information Center

    Gilkerson, Linda; Graham, Mimi; Harris, Deborah; Oser, Cindy; Clarke, Jane; Hairston-Fuller, Tody C.; Lertora, Jessica

    2013-01-01

    Federal directives require that any child less than 3 years old with a substantiated case of abuse be referred to the early intervention (EI) system. This article details the need and presents a vision for a trauma-informed EI system. The authors describe two exemplary program models which implement this vision and recommend steps which the field…

  18. Dynamic route and departure time choice model based on self-adaptive reference point and reinforcement learning

    NASA Astrophysics Data System (ADS)

    Li, Xue-yan; Li, Xue-mei; Yang, Lingrun; Li, Jing

    2018-07-01

    Most previous studies on dynamic traffic assignment are based on a traditional analytical framework; for instance, the idea of Dynamic User Equilibrium has been widely used to depict both route choice and departure time choice. However, some recent studies have demonstrated that dynamic traffic flow assignment largely depends on travelers' degree of rationality, travelers' heterogeneity and what traffic information the travelers have. In this paper, we develop a new self-adaptive multi-agent model to depict travelers' behavior in dynamic traffic assignment. We use Cumulative Prospect Theory with heterogeneous reference points to model travelers' bounded rationality, and a reinforcement-learning model to depict travelers' route and departure time choices under imperfect information. We design the evolution rule of travelers' expected arrival time and the traffic flow assignment algorithm. Compared with the traditional model, the proposed self-adaptive multi-agent model can effectively help travelers avoid the rush hour. Finally, we report and analyze the effect of travelers' group behavior on the transportation system, and give some insights into the relation between that behavior and the performance of the transportation system.
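
    The day-to-day learning mechanism can be sketched with a minimal multi-agent simulation (plain epsilon-greedy value learning, not the paper's cumulative-prospect-theory agents or departure-time dimension): each agent keeps a learned value per route, congestion feeds back into travel times, and repeated play drives the average travel time down from the naive initial assignment. Route parameters are hypothetical:

```python
import random

random.seed(1)

# Two hypothetical routes with linear congestion: time = free_flow + slope*flow.
free_flow = [10.0, 15.0]
slope = [0.05, 0.01]
n_agents, n_days = 200, 300
alpha, eps = 0.1, 0.1          # learning rate, exploration probability

Q = [[0.0, 0.0] for _ in range(n_agents)]   # per-agent value of each route

def run_day():
    choices = []
    for q in Q:
        if random.random() < eps:                    # explore
            choices.append(random.randrange(2))
        else:                                        # exploit learned values
            choices.append(0 if q[0] >= q[1] else 1)
    flows = [choices.count(r) for r in range(2)]
    times = [free_flow[r] + slope[r] * flows[r] for r in range(2)]
    for q, c in zip(Q, choices):
        q[c] += alpha * (-times[c] - q[c])           # reward = -travel time
    return sum(times[c] for c in choices) / n_agents  # mean experienced time

avg_times = [run_day() for _ in range(n_days)]
first_day = avg_times[0]
settled = sum(avg_times[-20:]) / 20
```

    As agents learn, flows spread across the two routes and the settled average travel time falls from the naive first-day value toward the congestion-balanced level.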

  19. Towards generalised reference condition models for environmental assessment: a case study on rivers in Atlantic Canada.

    PubMed

    Armanini, D G; Monk, W A; Carter, L; Cote, D; Baird, D J

    2013-08-01

    Evaluation of the ecological status of river sites in Canada is supported by building models using the reference condition approach. However, geography, data scarcity and inter-operability constraints have frustrated attempts to monitor national-scale status and trends. This problem is particularly acute in Atlantic Canada, where no ecological assessment system is currently available. Here, we present a reference condition model based on the River Invertebrate Prediction and Classification System approach with regional-scale applicability. To achieve this, we used biological monitoring data collected from wadeable streams across Atlantic Canada together with freely available, nationally consistent geographic information system (GIS) environmental data layers. For the first time, we demonstrated that it is possible to use data generated from different studies, even when collected using different sampling methods, to generate a robust predictive model. This model was successfully generated and tested using GIS-based rather than local habitat variables and showed improved performance when compared to a null model. In addition, ecological quality ratio data derived from the model responded to observed stressors in a test dataset. Implications for future large-scale implementation of river biomonitoring using a standardised approach with global application are presented.

  20. A three-talk model for shared decision making: multistage consultation process

    PubMed Central

    Durand, Marie Anne; Song, Julia; Aarts, Johanna; Barr, Paul J; Berger, Zackary; Cochran, Nan; Frosch, Dominick; Galasiński, Dariusz; Gulbrandsen, Pål; Han, Paul K J; Härter, Martin; Kinnersley, Paul; Lloyd, Amy; Mishra, Manish; Perestelo-Perez, Lilisbeth; Scholl, Isabelle; Tomori, Kounosuke; Trevena, Lyndal; Witteman, Holly O; Van der Weijden, Trudy

    2017-01-01

    Objectives To revise an existing three-talk model for learning how to achieve shared decision making, and to consult with relevant stakeholders to update and obtain wider engagement. Design Multistage consultation process. Setting Key informant group, communities of interest, and survey of clinical specialties. Participants 19 key informants, 153 member responses from multiple communities of interest, and 316 responses to an online survey from medically qualified clinicians from six specialties. Results After extended consultation over three iterations, we revised the three-talk model by making changes to one talk category, adding the need to elicit patient goals, providing a clear set of tasks for each talk category, and adding suggested scripts to illustrate each step. A new three-talk model of shared decision making is proposed, based on “team talk,” “option talk,” and “decision talk,” to depict a process of collaboration and deliberation. Team talk places emphasis on the need to provide support to patients when they are made aware of choices, and to elicit their goals as a means of guiding decision making processes. Option talk refers to the task of comparing alternatives, using risk communication principles. Decision talk refers to the task of arriving at decisions that reflect the informed preferences of patients, guided by the experience and expertise of health professionals. Conclusions The revised three-talk model of shared decision making depicts conversational steps, initiated by providing support when introducing options, followed by strategies to compare and discuss trade-offs, before deliberation based on informed preferences. PMID:29109079

  1. PACS for surgery and interventional radiology: features of a Therapy Imaging and Model Management System (TIMMS).

    PubMed

    Lemke, Heinz U; Berliner, Leonard

    2011-05-01

    Appropriate use of information and communication technology (ICT) and mechatronic (MT) systems is viewed by many experts as a means to improve workflow and quality of care in the operating room (OR). This will require a suitable information technology (IT) infrastructure, as well as communication and interface standards, such as specialized extensions of DICOM, to allow data interchange between surgical system components in the OR. A design of such an infrastructure, sometimes referred to as surgical PACS, but better defined as a Therapy Imaging and Model Management System (TIMMS), will be introduced in this article. A TIMMS should support the essential functions that enable and advance image guided therapy, and in the future, a more comprehensive form of patient-model guided therapy. Within this concept, the "image-centric world view" of the classical PACS technology is complemented by an IT "model-centric world view". Such a view is founded in the special patient modelling needs of an increasing number of modern surgical interventions as compared to the imaging intensive working mode of diagnostic radiology, for which PACS was originally conceptualised and developed. The modelling aspects refer to both patient information and workflow modelling. Standards for creating and integrating information about patients, equipment, and procedures are vitally needed when planning for an efficient OR. The DICOM Working Group 24 (WG-24) has been established to develop DICOM objects and services related to image and model guided surgery. To determine these standards, it is important to define step-by-step surgical workflow practices and create interventional workflow models per procedures or per variable cases. As the boundaries between radiation therapy, surgery and interventional radiology are becoming less well-defined, precise patient models will become the greatest common denominator for all therapeutic disciplines. 
In addition to imaging, the focus of WG-24 is to serve the therapeutic disciplines by enabling modelling technology to be based on standards. Copyright © 2011. Published by Elsevier Ireland Ltd.

  2. Predictors of consistent condom use based on the Information-Motivation-Behavioral Skills (IMB) model among female sex workers in Jinan, China

    PubMed Central

    2011-01-01

    Background Female commercial sex workers (FSWs) are at high risk of human immunodeficiency virus (HIV) transmission in China. This study was designed to examine the predictors of condom use with clients during vaginal intercourse among FSWs based on the Information-Motivation-Behavioral Skills (IMB) model and to describe the relationships between IMB model constructs. Methods A cross-sectional study was conducted in Jinan of Shandong Province, from May to October, 2009. Participants (N = 432) were recruited using Respondent-Driven Sampling (RDS). A self-administered questionnaire was used to collect data. Structural equation modeling was used to assess the IMB model. Results A total of 427 (98.8%) participants completed their questionnaires. Condom use was significantly predicted by social referents support, experiences with and attitudes toward condoms, self-efficacy, and health behaviors and condom use skills. Significant indirect predictors of condom use mediated through behavioral skills included HIV knowledge, social referents support, and substance use. Conclusions These results suggest that the IMB model could be used to predict condom use among Chinese FSWs. Further research is warranted to develop preventive interventions on the basis of the IMB model to promote condom use among FSWs in China. PMID:21329512

  3. On Utilizing Optimal and Information Theoretic Syntactic Modeling for Peptide Classification

    NASA Astrophysics Data System (ADS)

    Aygün, Eser; Oommen, B. John; Cataltepe, Zehra

    Syntactic methods in pattern recognition have been used extensively in bioinformatics, and in particular, in the analysis of gene and protein expressions, and in the recognition and classification of bio-sequences. These methods are almost universally distance-based. This paper concerns the use of an Optimal and Information Theoretic (OIT) probabilistic model [11] to achieve peptide classification using the information residing in their syntactic representations. The latter has traditionally been achieved using the edit distances required in the respective peptide comparisons. We advocate that one can model the differences between compared strings as a mutation model consisting of random Substitutions, Insertions and Deletions (SID) obeying the OIT model. Thus, in this paper, we show that the probability measure obtained from the OIT model can be perceived as a sequence similarity metric, using which a Support Vector Machine (SVM)-based peptide classifier, referred to as OIT_SVM, can be devised.
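
    The edit-distance baseline that the OIT model is positioned against can be sketched as follows: a classical dynamic-programming Levenshtein distance over the substitution, insertion and deletion (SID) operations, normalised into a [0, 1] similarity. The normalisation is a simple stand-in for illustration, not the OIT probability measure itself:

```python
def edit_distance(s: str, t: str) -> int:
    """Levenshtein distance via dynamic programming: unit-cost
    substitutions, insertions and deletions (SID)."""
    prev = list(range(len(t) + 1))
    for i, cs in enumerate(s, 1):
        cur = [i]
        for j, ct in enumerate(t, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (cs != ct)))    # substitution/match
        prev = cur
    return prev[-1]

def similarity(s: str, t: str) -> float:
    """Normalise distance into a [0, 1] similarity usable as a feature
    for an SVM-based classifier (illustrative stand-in only)."""
    if not s and not t:
        return 1.0
    return 1.0 - edit_distance(s, t) / max(len(s), len(t))
```

    For example, edit_distance("kitten", "sitting") is the classic value 3; identical peptide strings score a similarity of 1.0.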

  4. [3-dimensional models of actual or simulated cesarean sections].

    PubMed

    Patzak, B; Schaller, A

    2001-01-01

    Following an etymological and historical introduction, this report describes two three-dimensional wax models of Caesarean sections recently acquired by the Pathological-Anatomical Federal Museum in Vienna. Information is given on their origin, dating and method of production; questions of indication and operative technique and, where uncertain, autopsy (obduction) technique are considered.

  5. Aligning Perceptions of Laboratory Demonstrators' Responsibilities to Inform the Design of a Laboratory Teacher Development Program

    ERIC Educational Resources Information Center

    Flaherty, Aishling; O'Dwyer, Anne; Mannix-McNamara, Patricia; Leahy, J. J.

    2017-01-01

    Throughout countries such as Ireland, the U.K., and Australia, graduate students who fulfill teaching roles in the undergraduate laboratory are often referred to as "laboratory demonstrators". The laboratory demonstrator (LD) model of graduate teaching is similar to the more commonly known graduate teaching assistant (GTA) model that is…

  6. An Associative Index Model for the Results List Based on Vannevar Bush's Selection Concept

    ERIC Educational Resources Information Center

    Cole, Charles; Julien, Charles-Antoine; Leide, John E.

    2010-01-01

    Introduction: We define the results list problem in information search and suggest the "associative index model", an ad-hoc, user-derived indexing solution based on Vannevar Bush's description of an associative indexing approach for his memex machine. We further define what selection means in indexing terms with reference to Charles…

  7. Forecasting and Maximizing Post-Secondary Futures: Dilemmas Over Negative Futures and Their Hidden Costs.

    ERIC Educational Resources Information Center

    Hoffman, Benjamin B.

    Forecasting models for maximizing postsecondary futures and applications of the model are considered. The forecasting of broad human futures has many parallels to human futures in the field of medical prognosis. The concept of "exasperated negative" is used to refer to the suppression of critical information about a negative future with…

  8. America's Youth Are at Risk: Developing Models for Action in the Nation's Public Libraries.

    ERIC Educational Resources Information Center

    Flum, Judith G.; Weisner, Stan

    1993-01-01

    Discussion of public library support systems for at-risk teens focuses on the Bay Area Library and Information System (BALIS) that was developed to improve library services to at-risk teenagers in the San Francisco Bay area. Highlights include needs assessment; staff training; intervention models; and project evaluation. (10 references) (LRW)

  9. An approach for modeling thermal destruction of hazardous wastes in circulating fluidized bed incinerator.

    PubMed

    Patil, M P; Sonolikar, R L

    2008-10-01

    This paper presents a detailed computational fluid dynamics (CFD) based approach for modeling thermal destruction of hazardous wastes in a circulating fluidized bed (CFB) incinerator. The model is based on an Euler-Lagrangian approach in which the gas phase (continuous phase) is treated in a Eulerian reference frame, whereas the waste particulate (dispersed phase) is treated in a Lagrangian reference frame. The reaction chemistry has been modeled through a mixture fraction/PDF approach. The conservation equations for mass, momentum, energy, mixture fraction and other closure equations have been solved using the general purpose CFD code FLUENT 4.5. A finite volume method on a structured grid has been used for solution of the governing equations. The model provides detailed information on the hydrodynamics (gas velocity, particulate trajectories), gas composition (CO, CO2, O2) and temperature inside the riser. The model also allows different operating scenarios to be examined in an efficient manner.

  10. Thermodynamics of mixtures of patchy and spherical colloids of different sizes: A multi-body association theory with complete reference fluid information.

    PubMed

    Bansal, Artee; Valiya Parambathu, Arjun; Asthagiri, D; Cox, Kenneth R; Chapman, Walter G

    2017-04-28

    We present a theory to predict the structure and thermodynamics of mixtures of colloids of different diameters, building on our earlier work [A. Bansal et al., J. Chem. Phys. 145, 074904 (2016)] that considered mixtures with all particles constrained to have the same size. The patchy, solvent particles have short-range directional interactions, while the solute particles have short-range isotropic interactions. The hard-sphere mixture without any association site forms the reference fluid. An important ingredient within the multi-body association theory is the description of clustering of the reference solvent around the reference solute. Here we account for the physical, multi-body clusters of the reference solvent around the reference solute in terms of occupancy statistics in a defined observation volume. These occupancy probabilities are obtained from enhanced sampling simulations, but we also present statistical mechanical models to estimate these probabilities with limited simulation data. Relative to an approach that describes only up to three-body correlations in the reference, incorporating the complete reference information better predicts the bonding state and thermodynamics of the physical solute for a wide range of system conditions. Importantly, analysis of the residual chemical potential of the infinitely dilute solute from molecular simulation and theory shows that whereas the chemical potential is somewhat insensitive to the description of the structure of the reference fluid, the energetic and entropic contributions are not, with the results from the complete reference approach being in better agreement with particle simulations.

  11. Thermodynamics of mixtures of patchy and spherical colloids of different sizes: A multi-body association theory with complete reference fluid information

    NASA Astrophysics Data System (ADS)

    Bansal, Artee; Valiya Parambathu, Arjun; Asthagiri, D.; Cox, Kenneth R.; Chapman, Walter G.

    2017-04-01

    We present a theory to predict the structure and thermodynamics of mixtures of colloids of different diameters, building on our earlier work [A. Bansal et al., J. Chem. Phys. 145, 074904 (2016)] that considered mixtures with all particles constrained to have the same size. The patchy, solvent particles have short-range directional interactions, while the solute particles have short-range isotropic interactions. The hard-sphere mixture without any association site forms the reference fluid. An important ingredient within the multi-body association theory is the description of clustering of the reference solvent around the reference solute. Here we account for the physical, multi-body clusters of the reference solvent around the reference solute in terms of occupancy statistics in a defined observation volume. These occupancy probabilities are obtained from enhanced sampling simulations, but we also present statistical mechanical models to estimate these probabilities with limited simulation data. Relative to an approach that describes only up to three-body correlations in the reference, incorporating the complete reference information better predicts the bonding state and thermodynamics of the physical solute for a wide range of system conditions. Importantly, analysis of the residual chemical potential of the infinitely dilute solute from molecular simulation and theory shows that whereas the chemical potential is somewhat insensitive to the description of the structure of the reference fluid, the energetic and entropic contributions are not, with the results from the complete reference approach being in better agreement with particle simulations.

  12. Influence of an information literacy course on students' information search behavior.

    PubMed

    Weinert, Daniel J; Palmer, Erin M

    2007-01-01

    The purpose of this study was to determine the influence of an information literacy course on students' information gathering behavior. Two student groups, consisting of 69 (Group One) and 177 (Group Two) students, were compared in their performance on a literature review assignment. Group One did not have an information literacy course, while Group Two was the first class to receive a newly introduced course in information literacy. Assignment references served as the dependent variables and included the following categories: total number of references; number and percentage of peer-reviewed journal references; number and percentage of non-peer-reviewed journal references; number and percentage of website references; number and percentage of authority opinion references; and number and percentage of textbook references. Referenced websites were further divided into .com, .org, .edu and .gov domains, for both total number and percent utilization. Independent t-tests were performed between information literacy course status and each of the dependent variables. Descriptive data (prior education, cumulative GPA, average age of student groups) were similar for both groups. Independent t-test analysis revealed a strong association (p < .05) between having the information literacy course and increases in both the number and percentage of peer-reviewed references. The introduction of an information literacy course did influence the information gathering behavior of students, who showed an increased reliance on peer-reviewed references.
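
    The study's comparison design can be sketched as follows, with hypothetical per-student reference counts in place of the real data; scipy.stats.ttest_ind performs the independent (Welch) t-test between the two groups:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical data: peer-reviewed references cited per student, for a
# group without the course (n=69) and a group with it (n=177).
group_one = rng.poisson(3.0, size=69)    # no information literacy course
group_two = rng.poisson(5.0, size=177)   # course taken

# Welch's t-test (unequal variances, unequal group sizes).
t_stat, p_value = stats.ttest_ind(group_two, group_one, equal_var=False)
```

    A positive t statistic with p < .05 would correspond to the study's finding that the course group cited more peer-reviewed references.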

  13. The Effect of Electroencephalogram (EEG) Reference Choice on Information-Theoretic Measures of the Complexity and Integration of EEG Signals

    PubMed Central

    Trujillo, Logan T.; Stanfield, Candice T.; Vela, Ruben D.

    2017-01-01

    Converging evidence suggests that human cognition and behavior emerge from functional brain networks interacting on local and global scales. We investigated two information-theoretic measures of functional brain segregation and integration—interaction complexity CI(X), and integration I(X)—as applied to electroencephalographic (EEG) signals and how these measures are affected by choice of EEG reference. CI(X) is a statistical measure of the system entropy accounted for by interactions among its elements, whereas I(X) indexes the overall deviation from statistical independence of the individual elements of a system. We recorded 72 channels of scalp EEG from human participants who sat in a wakeful resting state (interleaved counterbalanced eyes-open and eyes-closed blocks). CI(X) and I(X) of the EEG signals were computed using four different EEG references: linked-mastoids (LM) reference, average (AVG) reference, a Laplacian (LAP) “reference-free” transformation, and an infinity (INF) reference estimated via the Reference Electrode Standardization Technique (REST). Fourier-based power spectral density (PSD), a standard measure of resting state activity, was computed for comparison and as a check of data integrity and quality. We also performed dipole source modeling in order to assess the accuracy of neural source CI(X) and I(X) estimates obtained from scalp-level EEG signals. CI(X) was largest for the LAP transformation, smallest for the LM reference, and at intermediate values for the AVG and INF references. I(X) was smallest for the LAP transformation, largest for the LM reference, and at intermediate values for the AVG and INF references. Furthermore, across all references, CI(X) and I(X) reliably distinguished between resting-state conditions (larger values for eyes-open vs. eyes-closed). These findings occurred in the context of the overall expected pattern of resting state PSD. 
Dipole modeling showed that simulated scalp EEG-level CI(X) and I(X) reflected changes in underlying neural source dependencies, but only for higher levels of integration and with highest accuracy for the LAP transformation. Our observations suggest that the Laplacian-transformation should be preferred for the computation of scalp-level CI(X) and I(X) due to its positive impact on EEG signal quality and statistics, reduction of volume-conduction, and the higher accuracy this provides when estimating scalp-level EEG complexity and integration. PMID:28790884
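
    The integration measure I(X) = sum_i H(X_i) - H(X), the overall deviation from statistical independence, has a closed form under a Gaussian approximation: half the difference between the summed log variances and the log determinant of the covariance. The sketch below uses that Gaussian estimator on synthetic "channels" (interaction complexity CI(X) is more involved and is not shown); all signal parameters are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

def integration_gaussian(x):
    """Gaussian estimate of integration I(X) = sum_i H(X_i) - H(X), in nats:
    I = 0.5 * (sum(log var_i) - log det(cov)). Rows of x are channels."""
    cov = np.cov(x)
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (np.sum(np.log(np.diag(cov))) - logdet)

n_channels, n_samples = 8, 5000
indep = rng.standard_normal((n_channels, n_samples))   # independent channels
shared = rng.standard_normal(n_samples)
coupled = 0.5 * indep + shared                          # common-source coupling

i_indep = integration_gaussian(indep)     # near zero (no shared structure)
i_coupled = integration_gaussian(coupled) # large (strong common signal)
```

    By Hadamard's inequality the estimate is non-negative, vanishing only when the channels are uncorrelated; a shared source drives it sharply upward, which is the kind of dependence structure the reference choice can distort.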

  14. Approaches to Enhancing Cyber Resilience: Report of the North Atlantic Treaty Organization (NATO) Workshop IST-153

    DTIC Science & Technology

    2018-04-01

    referred to as “defense in depth” and has been the standard model of information security management for at least a decade. Concepts such as mandatory...instrumentation into the system and monitoring this instrumentation with appropriate reports and alerts (e.g., security information event management tools or...Coalition Battle Management Language (C-BML) (NATO 2012) define information (orders, plans, reports, requests, etc.) that can be readily processed by

  15. Communication in diagnostic radiology: meeting the challenges of complexity.

    PubMed

    Larson, David B; Froehle, Craig M; Johnson, Neil D; Towbin, Alexander J

    2014-11-01

    As patients and information flow through the imaging process, value is added step-by-step when information is acquired, interpreted, and communicated back to the referring clinician. However, radiology information systems are often plagued with communication errors and delays. This article presents theories and recommends strategies to continuously improve communication in the complex environment of modern radiology. Communication theories, methods, and systems that have proven their effectiveness in other environments can serve as models for radiology.

  16. Deriving video content type from HEVC bitstream semantics

    NASA Astrophysics Data System (ADS)

    Nightingale, James; Wang, Qi; Grecos, Christos; Goma, Sergio R.

    2014-05-01

    As network service providers seek to improve customer satisfaction and retention levels, they are increasingly moving from traditional quality of service (QoS) driven delivery models to customer-centred quality of experience (QoE) delivery models. QoS models consider only metrics derived from the network; QoE models, however, also consider metrics derived from within the video sequence itself. Various spatial and temporal characteristics of a video sequence have been proposed, both individually and in combination, to derive methods of classifying video content either on a continuous scale or as a set of discrete classes. QoE models can be divided into three broad categories: full-reference, reduced-reference and no-reference models. Due to the need to have the original video available at the client for comparison, full-reference metrics are of limited practical value in adaptive real-time video applications. Reduced-reference metrics often require metadata to be transmitted with the bitstream, while no-reference metrics typically operate in the decompressed domain at the client side and require significant processing to extract spatial and temporal features. This paper proposes a heuristic, no-reference approach to video content classification which is specific to HEVC-encoded bitstreams. The HEVC encoder already makes use of spatial characteristics to determine partitioning of coding units and temporal characteristics to determine the splitting of prediction units. We derive a function that approximates the spatio-temporal characteristics of the video sequence, using the weighted average of the depth at which the coding unit quadtree is split to estimate spatial characteristics and the weighted average of the encoder's prediction mode decisions to estimate temporal characteristics.
Since the video content type of a sequence is determined by using high level information parsed from the video stream, spatio-temporal characteristics are identified without the need for full decoding and can be used in a timely manner to aid decision making in QoE oriented adaptive real time streaming.
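As a hedged illustration of the idea described above, the sketch below combines a weighted-average coding-unit split depth (spatial) with the fraction of intra-coded prediction units (temporal) into discrete content classes. The weights, thresholds, class names, and input encoding are hypothetical, not taken from the paper.

```python
# Hypothetical sketch: classify video content from high-level HEVC
# bitstream statistics, in the spirit of the approach described above.
# Thresholds and class labels are illustrative, not from the paper.

def spatial_score(cu_depths):
    """Average CU quadtree split depth (0-3): deeper splits
    indicate more spatial detail."""
    return sum(cu_depths) / len(cu_depths)

def temporal_score(pu_modes):
    """Fraction of prediction units coded as intra: more intra PUs
    suggest higher temporal activity (inter prediction fails)."""
    return sum(1 for m in pu_modes if m == "intra") / len(pu_modes)

def classify(cu_depths, pu_modes, s_thresh=1.5, t_thresh=0.3):
    """Map parsed bitstream statistics to a discrete content class."""
    s, t = spatial_score(cu_depths), temporal_score(pu_modes)
    if s >= s_thresh and t >= t_thresh:
        return "high-complexity"
    if s >= s_thresh:
        return "spatially-complex"
    if t >= t_thresh:
        return "temporally-complex"
    return "low-complexity"
```

Because the inputs are parsed from the bitstream header layer, no pixel decoding is needed, which is the point the abstract makes about timeliness.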

  17. An Updated Perspective of Single Event Gate Rupture and Single Event Burnout in Power MOSFETs

    NASA Astrophysics Data System (ADS)

    Titus, Jeffrey L.

    2013-06-01

    Studies over the past 25 years have shown that heavy ions can trigger catastrophic failure modes in power MOSFETs [e.g., single-event gate rupture (SEGR) and single-event burnout (SEB)]. In 1996, two papers were published in a special issue of the IEEE Transactions on Nuclear Science [Johnson, Palau, Dachs, Galloway and Schrimpf, “A Review of the Techniques Used for Modeling Single-Event Effects in Power MOSFETs,” IEEE Trans. Nucl. Sci., vol. 43, no. 2, pp. 546-560, Apr. 1996], [Titus and Wheatley, “Experimental Studies of Single-Event Gate Rupture and Burnout in Vertical Power MOSFETs,” IEEE Trans. Nucl. Sci., vol. 43, no. 2, pp. 533-545, Apr. 1996]. Those two papers continue to provide excellent information and references on SEB and SEGR in vertical planar MOSFETs. This paper provides updated references and an updated perspective on SEB and SEGR in vertical planar MOSFETs, as well as references to other device types that exhibit SEB and SEGR effects.

  18. Enhanced semantic interoperability by profiling health informatics standards.

    PubMed

    López, Diego M; Blobel, Bernd

    2009-01-01

    Several standards applied to the healthcare domain support semantic interoperability. These standards are far from being completely adopted in health information system development, however. The objective of this paper is to provide a method and suggest the necessary tooling for reusing standard health information models, thereby supporting the development of semantically interoperable systems and components. The approach is based on the definition of UML Profiles. UML profiling is a formal modeling mechanism to specialize reference meta-models in such a way that it is possible to adapt those meta-models to specific platforms or domains. A health information model can be considered such a meta-model. The first step of the introduced method identifies the standard health information models and the tasks in the software development process in which healthcare information models can be reused. Then, the selected information model is formalized as a UML Profile. That Profile is finally applied to system models, annotating them with the semantics of the information model. The approach is supported by Eclipse-based UML modeling tools. The method is integrated into a comprehensive framework for health information systems development, and the feasibility of the approach is demonstrated in the analysis, design, and implementation of a public health surveillance system, reusing HL7 RIM and DIMs specifications. The paper describes a method and the necessary tooling for reusing standard healthcare information models. UML offers several advantages such as tooling support, graphical notation, exchangeability, extensibility, semi-automatic code generation, etc. The approach presented is also applicable for harmonizing different standard specifications.

  19. A Completely Blind Video Integrity Oracle.

    PubMed

    Mittal, Anish; Saad, Michele A; Bovik, Alan C

    2016-01-01

    Considerable progress has been made toward developing still-picture perceptual quality analyzers that do not require any reference picture and that are not trained on human opinion scores of distorted images. However, there do not yet exist any such completely blind video quality assessment (VQA) models. Here, we attempt to bridge this gap by developing a new VQA model called the video intrinsic integrity and distortion evaluation oracle (VIIDEO). The new model does not require the use of any information other than the video whose quality is being evaluated. VIIDEO embodies models of intrinsic statistical regularities that are observed in natural videos, which are used to quantify disturbances introduced by distortions. An algorithm derived from the VIIDEO model is thereby able to predict the quality of distorted videos without any external knowledge about the pristine source, anticipated distortions, or human judgments of video quality. Even with such a paucity of information, we are able to show that the VIIDEO algorithm performs much better than the legacy full-reference quality measure MSE on the LIVE VQA database and delivers performance comparable to that of a leading blind VQA model trained on human judgments. We believe that the VIIDEO algorithm is a significant step toward making real-time monitoring of completely blind video quality possible.

  20. Multiple reference frames in haptic spatial processing

    NASA Astrophysics Data System (ADS)

    Volčič, R.

    2008-08-01

    The present thesis focused on haptic spatial processing. In particular, our interest was directed to the perception of spatial relations with the main focus on the perception of orientation. To this end, we studied haptic perception in different tasks, either in isolation or in combination with vision. The parallelity task, where participants have to match the orientations of two spatially separated bars, was used in its two-dimensional and three-dimensional versions in Chapter 2 and Chapter 3, respectively. The influence of non-informative vision and visual interference on performance in the parallelity task was studied in Chapter 4. A different task, the mental rotation task, was introduced in a purely haptic study in Chapter 5 and in a visuo-haptic cross-modal study in Chapter 6. The interaction of multiple reference frames and their influence on haptic spatial processing were the common denominators of these studies. In this thesis we approached the problems of which reference frames play the major role in haptic spatial processing and how the relative roles of distinct reference frames change depending on the available information and the constraints imposed by different tasks. We found that the influence of a reference frame centered on the hand was the major cause of the deviations from veridicality observed in both the two-dimensional and three-dimensional studies. The results were described by a weighted average model, in which the hand-centered egocentric reference frame is supposed to have a biasing influence on the allocentric reference frame. Performance in haptic spatial processing has been shown to depend also on sources of information or processing that are not strictly connected to the task at hand. When non-informative vision was provided, a beneficial effect was observed in the haptic performance. This improvement was interpreted as a shift from the egocentric to the allocentric reference frame. 
Moreover, interfering visual information presented in the vicinity of the haptic stimuli parametrically modulated the magnitude of the deviations. The influence of the hand-centered reference frame was shown also in the haptic mental rotation task where participants were quicker in judging the parity of objects when these were aligned with respect to the hands than when they were physically aligned. Similarly, in the visuo-haptic cross-modal mental rotation task the parity judgments were influenced by the orientation of the exploring hand with respect to the viewing direction. This effect was shown to be modulated also by an intervening temporal delay that supposedly counteracts the influence of the hand-centered reference frame. We suggest that the hand-centered reference frame is embedded in a hierarchical structure of reference frames where some of these emerge depending on the demands and the circumstances of the surrounding environment and the needs of an active perceiver.
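The weighted-average account above can be sketched in a few lines. This is my own minimal formulation, not the thesis' model: the produced orientation is a compromise between the allocentric (world-centred) target orientation and a hand-centred egocentric orientation, with a weight `w` for the egocentric bias.

```python
# Minimal sketch (illustrative formulation, not the thesis') of a
# weighted-average model: the matched orientation is a compromise
# between the allocentric target orientation and an egocentric
# (hand-centred) orientation, with weight w for the egocentric bias.

def matched_orientation(allocentric_deg, egocentric_deg, w=0.4):
    """Predicted matched orientation in degrees; w = 0 means purely
    allocentric (veridical) performance."""
    return (1.0 - w) * allocentric_deg + w * egocentric_deg

# Deviation from veridicality grows with the ego/allo discrepancy:
dev = matched_orientation(90.0, 40.0, w=0.4) - 90.0   # about -20 degrees
```

Shifting `w` toward zero models the beneficial effect of non-informative vision or a temporal delay, both interpreted above as reducing the egocentric bias.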

  1. Reconfigurable Control Design with Neural Network Augmentation for a Modified F-15 Aircraft

    NASA Technical Reports Server (NTRS)

    Burken, John J.

    2007-01-01

    The viewgraphs present background information about reconfigurable control design, the design methods used in the paper, control failure survivability results, and time histories of tests. Topics examined include control reconfiguration, general information about adaptive controllers, model reference adaptive control (MRAC), the utility of neural networks, radial basis function (RBF) neural network outputs, neurons, and results of investigations of failures.

  2. Watershed Education for Sustainable Development.

    ERIC Educational Resources Information Center

    Stapp, William B.

    2000-01-01

    Presents information on the Global Rivers Environmental Education Network (GREEN), which is a global communication system for analyzing watershed usage and monitoring the quality and quantity of river water. Describes GREEN's watershed educational model and strategies and international development. (Contains 67 references.) (Author/YDS)

  3. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    PubMed

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and the ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries, generally between 95% and 105%, were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on the analysis of vitamins and caffeine in energy drinks and of aromatic hydrocarbons in motor fuel, with errors within 10%. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in cases of spectral overlap and the absence or inaccessibility of reference materials.
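The Beer-Lambert-Bouguer additivity that the method relies on can be shown with a toy example. Here the chemometric (ICA) step is deliberately replaced by a direct 2x2 linear inversion, and the absorptivities and concentrations are made up; this is only a sketch of the underlying linear-mixing assumption, not the paper's algorithm.

```python
# Illustration of Beer-Lambert additivity: mixture absorbance at each
# wavelength is a linear combination of analyte contributions. The
# ICA resolution step is replaced here by a direct 2x2 inversion;
# absorptivities and concentrations are hypothetical.

def solve2(a11, a12, a21, a22, b1, b2):
    """Solve the 2x2 linear system A c = b by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - a12 * b2) / det,
            (a11 * b2 - b1 * a21) / det)

# Molar absorptivities of two analytes at two wavelengths (made up).
e = {"x": (0.80, 0.20), "y": (0.30, 0.90)}

def concentrations(abs_l1, abs_l2):
    """Recover concentrations of analytes x and y from mixture
    absorbances at the two wavelengths (unit path length)."""
    return solve2(e["x"][0], e["y"][0], e["x"][1], e["y"][1],
                  abs_l1, abs_l2)
```

A mixture with c_x = 0.5 and c_y = 0.25 gives absorbances 0.475 and 0.325 at the two wavelengths, and `concentrations(0.475, 0.325)` recovers those values.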

  4. Alternative Methods for Estimating Plane Parameters Based on a Point Cloud

    NASA Astrophysics Data System (ADS)

    Stryczek, Roman

    2017-12-01

    Non-contact measurement techniques carried out using triangulation optical sensors are increasingly popular in measurements with the use of industrial robots directly on production lines. The result of such measurements is often a cloud of measurement points that is characterized by considerable measuring noise, the presence of a number of points that differ from the reference model, and excessive errors that must be eliminated from the analysis. To obtain vector information from the points contained in the cloud that describe the reference model, the data obtained during a measurement must be subjected to appropriate processing operations. The present paper analyzes the suitability of the methods known as RANdom Sample Consensus (RANSAC), Monte Carlo Method (MCM), and Particle Swarm Optimization (PSO) for the extraction of the reference model. The effectiveness of the tested methods is illustrated by examples of measuring the height of an object and the angle of a plane, based on experiments carried out under workshop conditions.
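Of the three methods compared above, RANSAC is the easiest to sketch. The toy implementation below fits a reference plane z = a*x + b*y + c to a point cloud containing outliers; the iteration count and inlier threshold are illustrative, not the paper's settings.

```python
import random

# Minimal RANSAC sketch for extracting a reference plane
# z = a*x + b*y + c from a point cloud with outliers.
# Parameters (iterations, inlier threshold) are illustrative.

def fit_plane(p, q, r):
    """Plane z = a*x + b*y + c through three points, via Cramer's rule.
    Returns None for a degenerate (collinear) sample."""
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = p, q, r
    det = x1 * (y2 - y3) - y1 * (x2 - x3) + (x2 * y3 - x3 * y2)
    if abs(det) < 1e-12:
        return None
    a = (z1 * (y2 - y3) - y1 * (z2 - z3) + (z2 * y3 - z3 * y2)) / det
    b = (x1 * (z2 - z3) - z1 * (x2 - x3) + (x2 * z3 - x3 * z2)) / det
    c = (x1 * (y2 * z3 - y3 * z2) - y1 * (x2 * z3 - x3 * z2)
         + z1 * (x2 * y3 - x3 * y2)) / det
    return a, b, c

def ransac_plane(points, iters=200, thresh=0.05):
    """Return the plane supported by the most inliers."""
    best, best_inliers = None, -1
    for _ in range(iters):
        model = fit_plane(*random.sample(points, 3))
        if model is None:
            continue
        a, b, c = model
        inliers = sum(1 for x, y, z in points
                      if abs(a * x + b * y + c - z) < thresh)
        if inliers > best_inliers:
            best, best_inliers = model, inliers
    return best
```

Because each candidate plane is scored by its inlier count, points that differ grossly from the reference model are simply outvoted rather than averaged in, which is the property that makes RANSAC attractive for noisy workshop clouds.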

  5. Theft of information in the take-grant protection model

    NASA Technical Reports Server (NTRS)

    Bishop, Matt

    1989-01-01

    Questions of information flow are in many ways more important than questions of access control, because the goal of many security policies is to thwart the unauthorized release of information, not merely the illicit obtaining of access rights to that information. The Take-Grant Protection Model is a theoretical tool for examining such issues because conditions necessary and sufficient for information to flow between two objects, and for rights to objects to be obtained or stolen, are known. These results are extended by examining the question of information flow from an object the owner of which is unwilling to release that information. Necessary and sufficient conditions for such theft of information to occur are derived, and bounds on the number of subjects that must take action for the theft to occur are presented. To emphasize the usefulness of these results, the security policies of complete isolation, transfer of rights with the cooperation of an owner, and transfer of information (but not rights) with the cooperation of the owner are presented; the last is used to model a simple reference monitor guarding a resource.
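As a toy illustration of how rights propagate in such a model, the sketch below repeatedly applies a "take"-style rule: if x holds the take right `t` over y, x may acquire any right y holds over some z. This is heavily simplified; the real Take-Grant Protection Model distinguishes subjects from objects and includes grant and de facto information-flow rules, none of which are modeled here.

```python
# Toy sketch (heavily simplified relative to the Take-Grant Protection
# Model) of rights propagation under a "take" rule: if x has right 't'
# over y, x may acquire any right y holds over some z.

def take_closure(rights):
    """rights: dict mapping (holder, target) -> set of right labels.
    Repeatedly apply the take rule until no new rights appear."""
    rights = {k: set(v) for k, v in rights.items()}
    changed = True
    while changed:
        changed = False
        for (x, y), rs in list(rights.items()):
            if "t" not in rs:
                continue
            for (holder, z), rz in list(rights.items()):
                if holder != y:
                    continue
                cur = rights.setdefault((x, z), set())
                new = rz - cur
                if new:
                    cur |= new
                    changed = True
    return rights
```

Even this toy version shows why take rights are dangerous: holding `t` over a single intermediary is enough to acquire everything that intermediary can reach.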

  6. PDS4 - Some Principles for Agile Data Curation

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; Crichton, D. J.; Hardman, S. H.; Joyner, R.; Algermissen, S.; Padams, J.

    2015-12-01

    PDS4, a research data management and curation system for NASA's Planetary Science Archive, was developed using principles that promote the characteristics of agile development. The result is an efficient system that produces better research data products while using fewer resources (time, effort, and money) and maximizes their usefulness for current and future scientists. The key principle is architectural: the PDS4 information architecture is developed and maintained independently of the infrastructure's process, application, and technology architectures. The information architecture is based on an ontology-based information model developed to leverage best practices from standard reference models for digital archives, digital object registries, and metadata registries, and to capture domain knowledge from a panel of planetary science domain experts. The information model provides a sharable, stable, and formal set of information requirements for the system and is the primary source of information for configuring most system components, including the product registry, search engine, validation and display tools, and production pipelines. Multi-level governance also allows for the effective management of informational elements at the common, discipline, and project levels. This presentation will describe the development principles, components, and uses of the information model and how an information model-driven architecture exhibits characteristics of agile curation, including early delivery, evolutionary development, adaptive planning, continuous improvement, and rapid and flexible response to change.

  7. Information Object Definition–based Unified Modeling Language Representation of DICOM Structured Reporting

    PubMed Central

    Tirado-Ramos, Alfredo; Hu, Jingkun; Lee, K.P.

    2002-01-01

    Supplement 23 to DICOM (Digital Imaging and Communications in Medicine), Structured Reporting, is a specification that supports a semantically rich representation of image and waveform content, enabling experts to share image and related patient information. DICOM SR supports the representation of textual and coded data linked to images and waveforms. Nevertheless, the medical information technology community needs models that work as bridges between the DICOM relational model and open object-oriented technologies. The authors assert that representations of the DICOM Structured Reporting standard, using object-oriented modeling languages such as the Unified Modeling Language, can provide a high-level reference view of the semantically rich framework of DICOM and its complex structures. They have produced an object-oriented model to represent the DICOM SR standard and have derived XML-exchangeable representations of this model using World Wide Web Consortium specifications. They expect the model to benefit developers and system architects who are interested in developing applications that are compliant with the DICOM SR specification. PMID:11751804
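The general pattern of mapping an object-oriented content tree to an XML exchange form can be sketched briefly. The class, tag, and attribute names below are hypothetical illustrations, not taken from DICOM SR or the authors' schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical object model of a structured-report content tree and its
# XML serialization, in the spirit of the UML/XML mapping described
# above. Names are illustrative, not from DICOM SR.

class ContentItem:
    def __init__(self, value_type, value, children=None):
        self.value_type = value_type   # e.g. "TEXT", "CODE", "CONTAINER"
        self.value = value
        self.children = children or []

    def to_xml(self):
        """Serialize this item and its children as an XML element."""
        el = ET.Element("ContentItem", valueType=self.value_type)
        el.text = self.value
        for child in self.children:
            el.append(child.to_xml())
        return el

report = ContentItem("CONTAINER", "Chest CT Report", [
    ContentItem("TEXT", "No acute findings."),
])
xml_str = ET.tostring(report.to_xml(), encoding="unicode")
```

The recursive `to_xml` mirrors the nested container structure that SR content trees share with object-oriented composition, which is why a UML model translates so directly to an XML exchange format.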

  8. Technical attributes, health attribute, consumer attributes and their roles in adoption intention of healthcare wearable technology.

    PubMed

    Zhang, Min; Luo, Meifen; Nie, Rui; Zhang, Yan

    2017-12-01

    This paper aims to explore factors influencing the healthcare wearable technology adoption intention from perspectives of technical attributes (perceived convenience, perceived irreplaceability, perceived credibility and perceived usefulness), health attribute (health belief) and consumer attributes (consumer innovativeness, conspicuous consumption, informational reference group influence and gender difference). By integrating technology acceptance model, health belief model, snob effect and conformity and reference group theory, hypotheses and research model are proposed. The empirical investigation (N=436) collects research data through questionnaire. Results show that the adoption intention of healthcare wearable technology is influenced by technical attributes, health attribute and consumer attributes simultaneously. For technical attributes, perceived convenience and perceived credibility both positively affect perceived usefulness, and perceived usefulness influences adoption intention. The relation between perceived irreplaceability and perceived usefulness is only supported by males. For health attribute, health belief affects perceived usefulness for females. For consumer attributes, conspicuous consumption and informational reference group influence can significantly moderate the relation between perceived usefulness and adoption intention and the relation between consumer innovativeness and adoption intention respectively. What's more, consumer innovativeness significantly affects adoption intention for males. This paper aims to discuss technical attributes, health attribute and consumer attributes and their roles in the adoption intention of healthcare wearable technology. Findings may provide enlightenment to differentiate product developing and marketing strategies and provide some implications for clinical medicine. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Virtual reconstruction of modern and fossil hominoid crania: consequences of reference sample choice.

    PubMed

    Senck, Sascha; Bookstein, Fred L; Benazzi, Stefano; Kastner, Johann; Weber, Gerhard W

    2015-05-01

    Most hominin cranial fossils are incomplete and require reconstruction prior to subsequent analyses. Missing data can be estimated by geometric morphometrics using information from complete specimens, for example, by using thin-plate splines. In this study, we estimate missing data in several virtually fragmented models of hominoid crania (Homo, Pan, Pongo) and fossil hominins (e.g., Australopithecus africanus, Homo heidelbergensis). The aim is to investigate in which way different references influence estimations of cranial shape and how this information can be employed in the reconstruction of fossils. We used a sample of 64 three-dimensional digital models of complete human, chimpanzee, and orangutan crania and a set of 758 landmarks and semilandmarks. The virtually knocked out neurocranial and facial areas that were reconstructed corresponded to those of a real case found in A.L. 444-2 (A. afarensis) cranium. Accuracy of multiple intraspecies and interspecies reconstructions was computed as the maximum square root of the mean squared difference between the original and the reconstruction (root mean square). The results show that the uncertainty in reconstructions is a function of both the geometry of the knockout area and the dissimilarity between the reference sample and the specimen(s) undergoing reconstruction. We suggest that it is possible to estimate large missing cranial areas if the shape of the reference is similar enough to the shape of the specimen reconstructed, though caution must be exercised when employing these reconstructions in subsequent analyses. We provide a potential guide for the choice of the reference by means of bending energy. © 2015 Wiley Periodicals, Inc.
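The root-mean-square accuracy measure used above can be written down directly. The sketch below uses a coordinate-wise formulation over 3-D landmarks with made-up values; the study's landmark sets are of course far larger.

```python
import math

# Sketch of the accuracy measure described above: the root mean square
# of coordinate-wise differences between original and reconstructed
# 3-D landmark configurations. Landmark values are made up.

def rms_error(original, reconstructed):
    """RMS over all coordinates of all paired landmarks."""
    sq = [(a - b) ** 2
          for p, q in zip(original, reconstructed)
          for a, b in zip(p, q)]
    return math.sqrt(sum(sq) / len(sq))
```

Comparing RMS values across reference samples is what lets the authors quantify how much a dissimilar reference inflates reconstruction uncertainty.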

  10. Reference and Information Services. The Bookmark, Volume 41, Number II, Winter 1983.

    ERIC Educational Resources Information Center

    The Bookmark, 1983

    1983-01-01

    Thirteen articles comprise this issue on reference and information services: (1) "Librarianship as Information Resources Management," by Bettina H. Wolff; (2) one librarian's views on misinformation, disinformation, and information overload, by Murray Bob; (3-6) descriptions of reference and information services at the John Jay College…

  11. Handling of subpixel structures in the application of satellite derived irradiance data for solar energy system analysis - a review

    NASA Astrophysics Data System (ADS)

    Beyer, Hans Georg

    2016-04-01

    With the increasing availability of satellite-derived irradiance information, this type of data set is more and more in use for the design and operation of solar energy systems, most notably PV and CSP systems. By this, the need for data measured on-site is reduced. However, due to basic limitations of the satellite-derived data, several requirements of the intended applications cannot be met by this data type directly. Raw satellite information has to be enhanced in both spatial and temporal resolution by additional information to be fully applicable for all aspects of the modelling of solar energy systems. To address this problem, several individual and collaborative projects have been performed in recent years or are ongoing. Approaches are on one hand based on pasting synthesized high-resolution data into the low-resolution original sets. A prerequisite is an appropriate model, validated against real-world data. For the case of irradiance data, these models can be extracted either directly from ground-measured data sets, or from data referring to the cloud situation as gained from the images of sky cameras, or from Monte Carlo-initialized physical models. The current models refer to the spatial structure of the cloud fields. Dynamics are imposed by moving the cloud structures according to a large-scale cloud motion vector, either extracted from the dynamics inferred from consecutive satellite images or taken from a meso-scale meteorological model. Dynamic irradiance information is then derived from the cloud field structure and the cloud motion vector. This contribution, which is linked to Subtask A (Solar Resource Applications for High Penetration of Solar Technologies) of IEA SHC Task 46, will present the different approaches and discuss examples in view of validation, the need for auxiliary information, and general applicability.

  12. A reference model for model-based design of critical infrastructure protection systems

    NASA Astrophysics Data System (ADS)

    Shin, Young Don; Park, Cheol Young; Lee, Jae-Chon

    2015-05-01

    Today's war field environment is becoming more versatile as the activities of unconventional wars, such as terrorist attacks and cyber-attacks, have noticeably increased lately. The damage caused by such unconventional wars has also turned out to be serious, particularly if the targets are critical infrastructures constructed in support of banking and finance, transportation, power, information and communication, government, and so on. Critical infrastructures are usually interconnected with each other and thus are very vulnerable to attack. As such, ensuring the security of critical infrastructures is very important, and thus the concept of critical infrastructure protection (CIP) has emerged. Programs to realize CIP at the national level have taken the form of statutes in each country. On the other hand, each individual critical infrastructure also needs to be protected. The objective of this paper is to study an approach to doing so, which can be called the CIP system (CIPS). There could be a variety of ways to design CIPS's. Instead of considering the design of each individual CIPS, a reference model-based approach is taken in this paper. The reference model represents the design of all the CIPS's that have many design elements in common. In addition, the development of the reference model is carried out using a variety of model diagrams. The modeling language used therein is the Systems Modeling Language (SysML), which was developed and is managed by the Object Management Group (OMG) and is a de facto standard. Using SysML, the structure and operational concept of the reference model are designed to fulfil the goal of CIPS's, resulting in block definition and activity diagrams. As a case study, the operational scenario of a nuclear power plant under terrorist attack is studied using the reference model. The effectiveness of the results is also analyzed using multiple analysis models.
It is thus expected that the approach taken here has some merits over the traditional design methodology of repeating requirements analysis and system design.

  13. Combining cow and bull reference populations to increase accuracy of genomic prediction and genome-wide association studies.

    PubMed

    Calus, M P L; de Haas, Y; Veerkamp, R F

    2013-10-01

    Genomic selection holds the promise to be particularly beneficial for traits that are difficult or expensive to measure, such that access to phenotypes on large daughter groups of bulls is limited. Instead, cow reference populations can be generated, potentially supplemented with existing information from the same or (highly) correlated traits available on bull reference populations. The objective of this study, therefore, was to develop a model to perform genomic predictions and genome-wide association studies based on a combined cow and bull reference data set, with the accuracy of the phenotypes differing between the cow and bull genomic selection reference populations. The developed bivariate Bayesian stochastic search variable selection model allowed for an unbalanced design by imputing residuals in the residual updating scheme for all missing records. The performance of this model is demonstrated on a real data example, where the analyzed trait, being milk fat or protein yield, was either measured only on a cow or a bull reference population, or recorded on both. Our results were that the developed bivariate Bayesian stochastic search variable selection model was able to analyze 2 traits, even though animals had measurements on only 1 of 2 traits. The Bayesian stochastic search variable selection model yielded consistently higher accuracy for fat yield compared with a model without variable selection, both for the univariate and bivariate analyses, whereas the accuracy of both models was very similar for protein yield. The bivariate model identified several additional quantitative trait loci peaks compared with the single-trait models on either trait. In addition, the bivariate models showed a marginal increase in accuracy of genomic predictions for the cow traits (0.01-0.05), although a greater increase in accuracy is expected as the size of the bull population increases. 
Our results emphasize that the chosen value of priors in Bayesian genomic prediction models are especially important in small data sets. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  14. Advantages of soft versus hard constraints in self-modeling curve resolution problems. Alternating least squares with penalty functions.

    PubMed

    Gemperline, Paul J; Cash, Eric

    2003-08-15

    A new algorithm for self-modeling curve resolution (SMCR) that yields improved results by incorporating soft constraints is described. The method uses least squares penalty functions to implement constraints in an alternating least squares algorithm, including nonnegativity, unimodality, equality, and closure constraints. By using least squares penalty functions, soft constraints are formulated rather than hard constraints. Significant benefits are obtained using soft constraints, especially in the form of fewer distortions due to noise in resolved profiles. Soft equality constraints can also be used to introduce incomplete or partial reference information into SMCR solutions. Four different examples demonstrating application of the new method are presented, including resolution of overlapped HPLC-DAD peaks, flow injection analysis data, and batch reaction data measured by UV/visible and near-infrared (NIR) spectroscopy. Each example was selected to show one aspect of the significant advantages of soft constraints over traditionally used hard constraints. The method offers a substantial improvement in the ability to resolve time-dependent concentration profiles from mixture spectra recorded as a function of time.
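The soft-versus-hard distinction can be illustrated on a one-variable toy problem. Below, a least-squares penalty on negative values (soft nonnegativity) is added to a quadratic objective whose unconstrained minimum is slightly negative, as noise might cause in a resolved profile; the target value, penalty weight, and step sizes are illustrative and not from the paper's algorithm.

```python
# Toy illustration of a soft (penalty-based) nonnegativity constraint,
# versus a hard clip to zero. We minimise (x - target)**2 plus a
# least-squares penalty on negative values by gradient descent.
# Target, weight, and learning rate are illustrative.

def soft_nonneg_minimise(target, weight=10.0, lr=0.01, steps=5000):
    """Minimise (x - target)^2 + weight * min(x, 0)^2 by gradient descent."""
    x = 0.0
    for _ in range(steps):
        grad = 2.0 * (x - target)
        if x < 0.0:
            grad += 2.0 * weight * x   # penalty gradient, active only when x < 0
        x -= lr * grad
    return x

x_soft = soft_nonneg_minimise(-0.2)   # noisy estimate pulls toward -0.2
x_hard = max(-0.2, 0.0)               # hard constraint clips to exactly 0
```

The soft solution settles at a small negative value rather than being pinned to zero, which is the mechanism behind the "fewer distortions due to noise" claim above: the data still get a vote.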

  15. Estimation of genomic prediction accuracy from reference populations with varying degrees of relationship.

    PubMed

    Lee, S Hong; Clark, Sam; van der Werf, Julius H J

    2017-01-01

    Genomic prediction is emerging in a wide range of fields including animal and plant breeding, risk prediction in human precision medicine, and forensics. It is desirable to establish a theoretical framework for genomic prediction accuracy when the reference data consist of information sources with varying degrees of relationship to the target individuals. A reference set can contain both close and distant relatives as well as 'unrelated' individuals from the wider population. The various sources of information were modeled as different populations with different effective population sizes (Ne). Both the effective number of chromosome segments (Me) and Ne are considered to be a function of the data used for prediction. We validate our theory with analyses of simulated as well as real data, and illustrate that the variation in genomic relationships with the target is a predictor of the information content of the reference set. With a similar amount of data available for each source, we show that close relatives can have a substantially larger effect on genomic prediction accuracy than less related individuals. We also illustrate that when prediction relies on closer relatives, there is less improvement in prediction accuracy with an increase in training data or marker panel density. We release software that can estimate the expected prediction accuracy and power when combining different reference sources with various degrees of relationship to the target, which is useful when planning genomic prediction (before or after collecting data) in animal, plant and human genetics.
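For context, the classical single-population expectation on which multi-source frameworks like the one above build expresses accuracy in terms of reference size N, heritability h², and the effective number of independent chromosome segments Me (smaller for closer relatives). The numeric inputs below are illustrative.

```python
import math

# Classical deterministic expectation for genomic prediction accuracy:
#   accuracy = sqrt(N * h2 / (N * h2 + Me))
# where N is the reference population size, h2 the heritability, and
# Me the effective number of independent chromosome segments.
# Example values are illustrative.

def expected_accuracy(n, h2, me):
    return math.sqrt(n * h2 / (n * h2 + me))

# Close relatives (small Me) yield higher accuracy than distant ones
# (large Me) at the same reference size and heritability:
acc_close = expected_accuracy(1000, 0.3, 500)
acc_distant = expected_accuracy(1000, 0.3, 5000)
```

Because Me sits in the denominator, the formula also reproduces the abstract's observation that prediction based on close relatives gains less from adding more training data: the N*h2 term already dominates.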

  16. Using features of Arden Syntax with object-oriented medical data models for guideline modeling.

    PubMed

    Peleg, M; Ogunyemi, O; Tu, S; Boxwala, A A; Zeng, Q; Greenes, R A; Shortliffe, E H

    2001-01-01

    Computer-interpretable guidelines (CIGs) can deliver patient-specific decision support at the point of care. CIGs base their recommendations on eligibility and decision criteria that relate medical concepts to patient data. CIG models use expression languages for specifying these criteria, and define models for medical data to which the expressions can refer. In developing version 3 of the GuideLine Interchange Format (GLIF3), we used existing standards as the medical data model and expression language. We investigated the object-oriented HL7 Reference Information Model (RIM) as a default data model. We developed an expression language, called GEL, based on Arden Syntax's logic grammar. Together with other GLIF constructs, GEL reconciles incompatibilities between the data models of Arden Syntax and the HL7 RIM. These incompatibilities include Arden's lack of support for complex data types and time intervals, and the mismatch between Arden's single primary time and multiple time attributes of the HL7 RIM.
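The coupling of a decision criterion to an object-oriented patient data model can be sketched as follows. This is not GEL syntax; the class and attribute names are hypothetical stand-ins for RIM-style observation objects.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative sketch (not GEL syntax) of a decision criterion evaluated
# against an object-oriented patient data model, in the spirit of the
# GLIF3/RIM coupling described above. Names are hypothetical.

@dataclass
class Observation:
    code: str
    value: float
    effective: date     # a time interval would also be representable

@dataclass
class Patient:
    observations: list

def eligible(patient, code="HbA1c", threshold=7.0):
    """Decision criterion: the latest observation with `code`
    exceeds `threshold`."""
    obs = [o for o in patient.observations if o.code == code]
    if not obs:
        return False
    latest = max(obs, key=lambda o: o.effective)
    return latest.value > threshold

p = Patient([Observation("HbA1c", 7.9, date(2001, 3, 1)),
             Observation("HbA1c", 6.4, date(2000, 9, 1))])
```

Note how the criterion needs structured data types and an explicit notion of observation time, exactly the two areas where the abstract says Arden Syntax and the HL7 RIM had to be reconciled.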

  17. Dimensional accuracy of jaw scans performed on alginate impressions or stone models: A practice-oriented study.

    PubMed

    Vogel, Annike B; Kilic, Fatih; Schmidt, Falko; Rübel, Sebastian; Lapatki, Bernd G

    2015-07-01

    Digital jaw models offer more extensive possibilities for analysis than casts and make it easier to share and archive relevant information. The aim of this study was to compare the dimensional accuracy of scans performed on alginate impressions and on stone models to reference scans performed on the underlying resin models. Precision spheres 5 mm in diameter were occlusally fitted at the sites of the first premolars and first molars on a pair of jaw models fabricated from resin. A structured-light scanner was used for digitization. Once the two reference models had been scanned, alginate impressions were taken and scanned no later than 1 h afterwards. A third series of scans was performed on type III stone models derived from the impressions. All scans were analyzed by performing five repeated measurements to determine the distances between the various sphere centers. Compared to the reference scans, the stone-model scans were larger by a mean of 73.6 µm (maxilla) or 65.2 µm (mandible). The impression scans were larger by only 7.7 µm (maxilla) and smaller by only 0.7 µm (mandible). Median standard deviations over the five repeated measurements of 1.0 µm for the reference scans, 2.35 µm for the impression scans, and 2.0 µm for the stone-model scans indicate that the values measured in this study were adequately reproducible. Alginate impressions can be suitably digitized by structured-light scanning and offer considerably better dimensional accuracy than stone models. Both impression scans and stone-model scans, however, appear to offer adequate precision for orthodontic purposes. The main issue with impression scans, namely incomplete representation of model surfaces, is being systematically explored in a follow-up study.

  18. The Librarian as Information Consultant: Transforming Reference for the Information Age

    ERIC Educational Resources Information Center

    Murphy, Sarah Anne

    2011-01-01

    Library users' evolving information needs and their choice of search methods have changed reference work profoundly. Today's reference librarian must work in a whole new way--not only service-focused and businesslike, but even entrepreneurial. Murphy innovatively rethinks the philosophy behind current library reference services in this…

  19. A three-talk model for shared decision making: multistage consultation process.

    PubMed

    Elwyn, Glyn; Durand, Marie Anne; Song, Julia; Aarts, Johanna; Barr, Paul J; Berger, Zackary; Cochran, Nan; Frosch, Dominick; Galasiński, Dariusz; Gulbrandsen, Pål; Han, Paul K J; Härter, Martin; Kinnersley, Paul; Lloyd, Amy; Mishra, Manish; Perestelo-Perez, Lilisbeth; Scholl, Isabelle; Tomori, Kounosuke; Trevena, Lyndal; Witteman, Holly O; Van der Weijden, Trudy

    2017-11-06

    Objectives  To revise an existing three-talk model for learning how to achieve shared decision making, and to consult with relevant stakeholders to update and obtain wider engagement. Design  Multistage consultation process. Setting  Key informant group, communities of interest, and survey of clinical specialties. Participants  19 key informants, 153 member responses from multiple communities of interest, and 316 responses to an online survey from medically qualified clinicians from six specialties. Results  After extended consultation over three iterations, we revised the three-talk model by making changes to one talk category, adding the need to elicit patient goals, providing a clear set of tasks for each talk category, and adding suggested scripts to illustrate each step. A new three-talk model of shared decision making is proposed, based on "team talk," "option talk," and "decision talk," to depict a process of collaboration and deliberation. Team talk places emphasis on the need to provide support to patients when they are made aware of choices, and to elicit their goals as a means of guiding decision making processes. Option talk refers to the task of comparing alternatives, using risk communication principles. Decision talk refers to the task of arriving at decisions that reflect the informed preferences of patients, guided by the experience and expertise of health professionals. Conclusions  The revised three-talk model of shared decision making depicts conversational steps, initiated by providing support when introducing options, followed by strategies to compare and discuss trade-offs, before deliberation based on informed preferences. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  20. Reflecting on the challenges of building a rich interconnected metadata database to describe the experiments of phase six of the coupled climate model intercomparison project (CMIP6) for the Earth System Documentation Project (ES-DOC) and anticipating the opportunities that tooling and services based on rich metadata can provide.

    NASA Astrophysics Data System (ADS)

    Pascoe, C. L.

    2017-12-01

    The Coupled Model Intercomparison Project (CMIP) has coordinated climate model experiments involving multiple international modelling teams since 1995. This has led to a better understanding of past, present, and future climate. The 2017 sixth phase of the CMIP process (CMIP6) consists of a suite of common experiments and 21 separate CMIP-Endorsed Model Intercomparison Projects (MIPs), making a total of 244 separate experiments. Precise descriptions of the suite of CMIP6 experiments have been captured in a Common Information Model (CIM) database by the Earth System Documentation Project (ES-DOC). The database contains descriptions of forcings, model configuration requirements, ensemble information and citation links, as well as text descriptions and information about the rationale for each experiment. The database was built from statements about the experiments found in the academic literature, the MIP submissions to the World Climate Research Programme (WCRP), WCRP summary tables and correspondence with the principal investigators for each MIP. The database was collated using spreadsheets which are archived in the ES-DOC GitHub repository and then rendered on the ES-DOC website. A diagrammatic view of the workflow of building the database of experiment metadata for CMIP6 is shown in the attached figure. The CIM provides the formalism to collect detailed information from diverse sources in a standard way across all the CMIP6 MIPs. The ES-DOC documentation acts as a unified reference for CMIP6 information to be used both by data producers and consumers. This is especially important given the federated nature of the CMIP6 project. Because the CIM allows forcing constraints and other experiment attributes to be referred to by more than one experiment, we can streamline the process of collecting information from modelling groups about how they set up their models for each experiment.
End users of the climate model archive will be able to ask questions enabled by the interconnectedness of the metadata such as "Which MIPs make use of experiment A?" and "Which experiments use forcing constraint B?".

  1. Biomedical data integration - capturing similarities while preserving disparities.

    PubMed

    Bianchi, Stefano; Burla, Anna; Conti, Costanza; Farkash, Ariel; Kent, Carmel; Maman, Yonatan; Shabo, Amnon

    2009-01-01

    One of the challenges of healthcare data processing, analysis and warehousing is the integration of data gathered from disparate and diverse data sources. Promoting the adoption of worldwide accepted information standards along with common terminologies and the use of technologies derived from semantic web representation, is a suitable path to achieve that. To that end, the HL7 V3 Reference Information Model (RIM) [1] has been used as the underlying information model coupled with the Web Ontology Language (OWL) [2] as the semantic data integration technology. In this paper we depict a biomedical data integration process and demonstrate how it was used for integrating various data sources, containing clinical, environmental and genomic data, within Hypergenes, a European Commission funded project exploring the Essential Hypertension [3] disease model.

  2. Development of a paediatric population-based model of the pharmacokinetics of rivaroxaban.

    PubMed

    Willmann, Stefan; Becker, Corina; Burghaus, Rolf; Coboeken, Katrin; Edginton, Andrea; Lippert, Jörg; Siegmund, Hans-Ulrich; Thelen, Kirstin; Mück, Wolfgang

    2014-01-01

    Venous thromboembolism has been increasingly recognised as a clinical problem in the paediatric population. Guideline recommendations for antithrombotic therapy in paediatric patients are based mainly on extrapolation from adult clinical trial data, owing to the limited number of clinical trials in paediatric populations. The oral, direct Factor Xa inhibitor rivaroxaban has been approved in adult patients for several thromboembolic disorders, and its well-defined pharmacokinetic and pharmacodynamic characteristics and efficacy and safety profiles in adults warrant further investigation of this agent in the paediatric population. The objective of this study was to develop and qualify a physiologically based pharmacokinetic (PBPK) model for rivaroxaban doses of 10 and 20 mg in adults and to scale this model to the paediatric population (0-18 years) to inform the dosing regimen for a clinical study of rivaroxaban in paediatric patients. Experimental data sets from phase I studies supported the development and qualification of an adult PBPK model. This adult PBPK model was then scaled to the paediatric population by including anthropometric and physiological information, age-dependent clearance and age-dependent protein binding. The pharmacokinetic properties of rivaroxaban in virtual populations of children were simulated for two body weight-related dosing regimens equivalent to 10 and 20 mg once daily in adults. The quality of the model was judged by means of a visual predictive check. Subsequently, paediatric simulations of the area under the plasma concentration-time curve (AUC), maximum (peak) plasma drug concentration (C max) and concentration in plasma after 24 h (C 24h) were compared with the adult reference simulations. Simulations for AUC, C max and C 24h throughout the investigated age range largely overlapped with values obtained for the corresponding dose in the adult reference simulation for both body weight-related dosing regimens. 
However, pharmacokinetic values in infants and preschool children (body weight <40 kg) were lower than the 90 % confidence interval threshold of the adult reference model and, therefore, indicated that doses in these groups may need to be increased to achieve the same plasma levels as in adults. For children with body weight between 40 and 70 kg, simulated plasma pharmacokinetic parameters (C max, C 24h and AUC) overlapped with the values obtained in the corresponding adult reference simulation, indicating that body weight-related exposure was similar between these children and adults. In adolescents of >70 kg body weight, the simulated 90 % prediction interval values of AUC and C 24h were much higher than the 90 % confidence interval of the adult reference population, owing to the weight-based simulation approach, but for these patients rivaroxaban would be administered at adult fixed doses of 10 and 20 mg. The paediatric PBPK model developed here allowed an exploratory analysis of the pharmacokinetics of rivaroxaban in children to inform the dosing regimen for a clinical study in paediatric patients.

  3. Uncertainty in Reference and Information Service

    ERIC Educational Resources Information Center

    VanScoy, Amy

    2015-01-01

    Introduction: Uncertainty is understood as an important component of the information seeking process, but it has not been explored as a component of reference and information service. Method: Interpretative phenomenological analysis was used to examine the practitioner perspective of reference and information service for eight academic research…

  4. Exploring behavior of an unusual megaherbivore: A spatially explicit foraging model of the hippopotamus

    USGS Publications Warehouse

    Lewison, R.L.; Carter, J.

    2004-01-01

    Herbivore foraging theories have been developed for and tested on herbivores across a range of sizes. Due to logistical constraints, however, little research has focused on foraging behavior of megaherbivores. Here we present a research approach that explores megaherbivore foraging behavior, and assesses the applicability of foraging theories developed on smaller herbivores to megafauna. With simulation models as reference points for the analysis of empirical data, we investigate foraging strategies of the common hippopotamus (Hippopotamus amphibius). Using a spatially explicit individual-based foraging model, we apply traditional herbivore foraging strategies to a model hippopotamus, compare model output, and then relate these results to field data from wild hippopotami. Hippopotami appear to employ foraging strategies that respond to vegetation characteristics, such as vegetation quality, as well as spatial reference information, namely distance to a water source. Model predictions, field observations, and comparisons of the two support that hippopotami generally conform to the central place foraging construct. These analyses point to the applicability of general herbivore foraging concepts to megaherbivores, but also point to important differences between hippopotami and other herbivores. Our synergistic approach of models as reference points for empirical data highlights a useful method of behavioral analysis for hard-to-study megafauna. © 2003 Elsevier B.V. All rights reserved.

  5. Human Augmentation of Reasoning Through Patterning (HARP)

    DTIC Science & Technology

    2008-04-01

    develop what we then referred to as “Uber-CIM,” in which a set of independent but tightly-joined CIM models could be developed. However, although that … analysts to apply “tags” (keywords) to Web-based resources, and to see and leverage the tags and tagged resources of others. Catalyst is a modeling … issues. Catalyst models consist of nodes of information organized into hierarchical tree structures. Nodes can contain attachments or links to tags …

  6. Jobs and Economic Development Impact (JEDI) Model: Offshore Wind User Reference Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lantz, E.; Goldberg, M.; Keyser, D.

    2013-06-01

    The Offshore Wind Jobs and Economic Development Impact (JEDI) model, developed by NREL and MRG & Associates, is a spreadsheet based input-output tool. JEDI is meant to be a user friendly and transparent tool to estimate potential economic impacts supported by the development and operation of offshore wind projects. This guide describes how to use the model as well as technical information such as methodology, limitations, and data sources.

  7. Information, Consistent Estimation and Dynamic System Identification.

    DTIC Science & Technology

    1976-11-01

    … representative model from a given model set, applicable to infinite and even non-compact model sets. … ergodicity. For a thorough development of ergodic theory the reader is referred to, e.g., Doob [1953], Halmos [1956] and Chacon and Ornstein [1959] …

  8. The Consumer Health Information System Adoption Model.

    PubMed

    Monkman, Helen; Kushniruk, Andre W

    2015-01-01

    Derived from overlapping concepts in consumer health, a consumer health information system refers to any of the broad range of applications, tools, and educational resources developed to empower consumers with knowledge, techniques, and strategies to manage their own health. As consumer health information systems become increasingly popular, it is important to explore the factors that impact their adoption and success. Accumulating evidence indicates a relationship between usability, consumers' eHealth literacy skills, and the demands consumer health information systems place on those skills. Here, we present a new model called the Consumer Health Information System Adoption Model, which depicts both consumer eHealth literacy skills and system demands on eHealth literacy as moderators with the potential to affect the strength of the relationship between usefulness and usability (predictors of usage) and adoption, value, and successful use (actual usage outcomes). Strategies for aligning these two moderating factors are described.

  9. A novel no-reference objective stereoscopic video quality assessment method based on visual saliency analysis

    NASA Astrophysics Data System (ADS)

    Yang, Xinyan; Zhao, Wei; Ye, Long; Zhang, Qin

    2017-07-01

    This paper proposes a no-reference objective stereoscopic video quality assessment method, with the aim of bringing the results of objective experiments close to those of subjective assessment. We believe that image regions with different degrees of visual saliency should not receive the same weights when designing an assessment metric. Therefore, we first apply the GBVS algorithm to each frame pair and separate both the left and right viewing images into regions of strong, general, and weak saliency. In addition, local feature information such as blockiness, zero-crossing and depth is extracted and combined in a mathematical model to calculate a quality assessment score. Regions with different saliency degrees are assigned different weights in the mathematical model. Experimental results demonstrate the superiority of our method compared with existing state-of-the-art no-reference objective stereoscopic video quality assessment methods.
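
    As an illustration of the pooling step described (not the paper's fitted model), per-region quality scores can be combined with weights that increase with saliency degree; the function name and weight values below are assumptions.

```python
def weighted_quality(region_scores: dict) -> float:
    """Pool per-region quality scores (e.g. derived from blockiness,
    zero-crossing and depth features) with weights that increase with
    saliency degree. Weight values are illustrative, not the paper's."""
    weights = {"strong": 0.6, "general": 0.3, "weak": 0.1}
    return sum(weights[k] * region_scores[k] for k in weights)

# Higher-saliency regions dominate the pooled score.
q = weighted_quality({"strong": 0.8, "general": 0.6, "weak": 0.4})
```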

  10. CCSDS Spacecraft Monitor and Control Service Framework

    NASA Technical Reports Server (NTRS)

    Merri, Mario; Schmidt, Michael; Ercolani, Alessandro; Dankiewicz, Ivan; Cooper, Sam; Thompson, Roger; Symonds, Martin; Oyake, Amalaye; Vaughs, Ashton; Shames, Peter

    2004-01-01

    This CCSDS paper presents a reference architecture and service framework for spacecraft monitoring and control. It has been prepared by the Spacecraft Monitoring and Control working group of the CCSDS Mission Operations and Information Management Systems (MOIMS) area. In this context, Spacecraft Monitoring and Control (SM&C) refers to end-to-end services between on-board or remote applications and ground-based functions responsible for mission operations. The scope of SM&C includes: 1) Operational Concept: definition of an operational concept that covers a set of standard operations activities related to the monitoring and control of both ground and space segments. 2) Core Set of Services: definition of an extensible set of services to support the operational concept, together with its information model and behaviours. This includes (non-exhaustively) ground systems such as Automatic Command and Control, Data Archiving and Retrieval, Flight Dynamics, Mission Planning and Performance Evaluation. 3) Application-layer Information: definition of the standard information set to be exchanged for SM&C purposes.

  11. Terminological reference of a knowledge-based system: the data dictionary.

    PubMed

    Stausberg, J; Wormek, A; Kraut, U

    1995-01-01

    The development of open and integrated knowledge bases makes new demands on the definition of the terminology used. The definitions should be realized in a data dictionary separate from the knowledge base. Within work on a reference model of medical knowledge, a data dictionary has been developed and used in different applications: a term definition shell, a documentation tool and a knowledge base. The data dictionary includes the part of the terminology that is largely independent of a particular knowledge model. For that reason, the data dictionary can be used as a basis for integrating knowledge bases into information systems, for knowledge sharing and reuse, and for modular development of knowledge-based systems.

  12. Reference Architecture Model Enabling Standards Interoperability.

    PubMed

    Blobel, Bernd

    2017-01-01

    Advanced health and social services paradigms are supported by a comprehensive set of domains managed by different scientific disciplines. Interoperability has to evolve beyond information and communication technology (ICT) concerns to include the real-world business domains and their processes, as well as the individual context of all actors involved. The system must therefore properly reflect the environment in front of and around the computer as an essential and even defining part of the health system. This paper introduces an ICT-independent, system-theoretical, ontology-driven reference architecture model allowing the representation and harmonization of all domains involved, including the transformation into an appropriate ICT design and implementation. The entire process is completely formalized and can therefore be fully automated.

  13. The Terry-Wiseman Security Policy Model and Examples of Its Use

    DTIC Science & Technology

    1990-03-01

    Wiseman Security Policy Model and Examples of Its Use. Author: C L Harrold. Date: March 1990. Abstract: This paper presents a model of security for computer … Evolution of the Model 7. Summary and References. Annex: An Overview of the Z Notation … a computer. The files, objects or register locations in which the information may be stored are modelled by the black boxes. The robots model the …

  14. Examining the cognitive costs of counterfactual language comprehension: Evidence from ERPs.

    PubMed

    Ferguson, Heather J; Cane, James E

    2015-10-05

    Recent empirical research suggests that understanding a counterfactual event (e.g. 'If Josie had revised, she would have passed her exams') activates mental representations of both the factual and counterfactual versions of events. However, it remains unclear when readers switch between these models during comprehension, and whether representing multiple 'worlds' is cognitively effortful. This paper reports two ERP studies where participants read contexts that set up a factual or counterfactual scenario, followed by a second sentence describing a consequence of this event. Critically, this sentence included a noun that was either consistent or inconsistent with the preceding context, and either included a modal verb to indicate reference to the counterfactual-world or not (thus referring to the factual-world). Experiment 2 used adapted versions of the materials used in Experiment 1 to examine the degree to which representing multiple versions of a counterfactual situation makes heavy demands on cognitive resources by measuring individuals' working memory capacity. Results showed that when reference to the counterfactual-world was maintained by the ongoing discourse, readers correctly interpreted events according to the counterfactual-world (i.e. showed larger N400 for inconsistent than consistent words). In contrast, when cues referred back to the factual-world, readers showed no difference between consistent and inconsistent critical words, suggesting that they simultaneously compared information against both possible worlds. These results support previous dual-representation accounts for counterfactuals, and provide new evidence that linguistic cues can guide the reader in selecting which world model to evaluate incoming information against. Crucially, we reveal evidence that maintaining and updating a hypothetical model over time relies upon the availability of cognitive resources. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. NASA's Earth Resources Laboratory - Seventeen years of using remotely sensed satellite data in land applications

    NASA Technical Reports Server (NTRS)

    Cashion, Kenneth D.; Whitehurst, Charles A.

    1987-01-01

    The activities of the Earth Resources Laboratory (ERL) for the past seventeen years are reviewed with particular reference to four typical applications demonstrating the use of remotely sensed data in a geobased information system context. The applications discussed are: a fire control model for the Olympic National Park; wildlife habitat modeling; a resource inventory system including a potential soil erosion model; and a corridor analysis model for locating routes between geographical locations. Some future applications are also discussed.

  16. A Phrase-Based Matching Function.

    ERIC Educational Resources Information Center

    Galbiati, Giulia

    1991-01-01

    Describes the development of an information retrieval system designed for nonspecialist users that is based on the binary vector model. The syntactic structure of phrases used for indexing is examined, queries using an experimental collection of documents are described, and precision values are examined. (19 references) (LRW)

  17. Environmental indicator principium with case references to agricultural soil, water, and air qualities and model-derived indicators

    USDA-ARS?s Scientific Manuscript database

    Environmental indicators are powerful tools for tracking environmental changes, measuring environmental performance, and informing policy makers. With the ubiquitous nature of environmental assets and within the broad themes of environmental disciplines, many diverse environmental indicators, inclu...

  18. Ways to estimate speeds for the purposes of air quality conformity analyses.

    DOT National Transportation Integrated Search

    2002-01-01

    A speed post-processor refers to equations or lookup tables that can determine vehicle speeds on a particular roadway link using only the limited information available in a long-range planning model. An estimated link speed is usually based on volume...

  19. Functional Requirements for Information Resource Provenance on the Web

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCusker, James P.; Lebo, Timothy; Graves, Alvaro

    We provide a means to formally explain the relationship between HTTP URLs and the representations returned when they are requested. According to existing World Wide Web architecture, the URL serves as an identifier for a semiotic referent while the document returned via HTTP serves as a representation of the same referent. This begins with two sides of a semiotic triangle; the third side is the relationship between the URL and the representation received. We complete this description by extending the library science resource model Functional Requirements for Bibliographic Resources (FRBR) with cryptographic message and content digests to create a Functional Requirements for Information Resources (FRIR). We show how applying the FRIR model to HTTP GET and POST transactions disambiguates the many relationships between a given URL and all representations received from its request, provides fine-grained explanations that are complementary to existing explanations of web resources, and integrates easily into the emerging W3C provenance standard.
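
    A minimal sketch of the content-digest idea: hashing the bytes of each representation returned for a URL yields a stable identifier for that particular representation, so distinct responses to the same URL can be told apart. SHA-256 and the function name are illustrative choices, not prescribed by the FRIR model.

```python
import hashlib

def content_digest(representation: bytes) -> str:
    """Digest identifying one concrete representation returned for a
    URL; SHA-256 is an illustrative choice of cryptographic digest."""
    return hashlib.sha256(representation).hexdigest()

# Two responses to the same URL can be distinguished by their digests.
d1 = content_digest(b"<html>version 1</html>")
d2 = content_digest(b"<html>version 2</html>")
assert d1 != d2
```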

  20. Identification of candidate reference chemicals for in vitro steroidogenesis assays.

    PubMed

    Pinto, Caroline Lucia; Markey, Kristan; Dix, David; Browne, Patience

    2018-03-01

    The Endocrine Disruptor Screening Program (EDSP) is transitioning from traditional testing methods to integrating ToxCast/Tox21 in vitro high-throughput screening assays for identifying chemicals with endocrine bioactivity. The ToxCast high-throughput H295R steroidogenesis assay may potentially replace the low-throughput assays currently used in the EDSP Tier 1 battery to detect chemicals that alter the synthesis of androgens and estrogens. Herein, we describe an approach for identifying in vitro candidate reference chemicals that affect the production of androgens and estrogens in models of steroidogenesis. Candidate reference chemicals were identified from a review of H295R and gonad-derived in vitro assays used in methods validation and published in the scientific literature. A total of 29 chemicals affecting androgen and estrogen levels satisfied all criteria for positive reference chemicals, while an additional set of 21 and 15 chemicals partially fulfilled criteria for positive reference chemicals for androgens and estrogens, respectively. The identified chemicals included pesticides, pharmaceuticals, industrial and naturally-occurring chemicals with the capability to increase or decrease the levels of the sex hormones in vitro. Additionally, 14 and 15 compounds were identified as potential negative reference chemicals for effects on androgens and estrogens, respectively. These candidate reference chemicals will be informative for performance-based validation of in vitro steroidogenesis models. Copyright © 2017. Published by Elsevier Ltd.

  1. Assessing and forecasting population health: integrating knowledge and beliefs in a comprehensive framework.

    PubMed

    Van Meijgaard, Jeroen; Fielding, Jonathan E; Kominski, Gerald F

    2009-01-01

    A comprehensive population health-forecasting model has the potential to interject new and valuable information about the future health status of the population based on current conditions, socioeconomic and demographic trends, and potential changes in policies and programs. Our Health Forecasting Model uses a continuous-time microsimulation framework to simulate individuals' lifetime histories by using birth, risk exposures, disease incidence, and death rates to mark changes in the state of the individual. The model generates a reference forecast of future health in California, including details on physical activity, obesity, coronary heart disease, all-cause mortality, and medical expenditures. We use the model to answer specific research questions, inform debate on important policy issues in public health, support community advocacy, and provide analysis on the long-term impact of proposed changes in policies and programs, thus informing stakeholders at all levels and supporting decisions that can improve the health of populations.

  2. Sensitivity tests to define the source apportionment performance criteria in the DeltaSA tool

    NASA Astrophysics Data System (ADS)

    Pernigotti, Denise; Belis, Claudio A.

    2017-04-01

    Identification and quantification of the contribution of emission sources to a given area is a key task for the design of abatement strategies. Moreover, European member states are obliged to report this kind of information for zones where the pollution levels exceed the limit values. At present, little is known about the performance and uncertainty of the variety of methodologies used for source apportionment and the comparability between the results of studies using different approaches. The source apportionment Delta (SA Delta) is a tool developed by the EC-JRC to support the particulate matter source apportionment modellers in the identification of sources (for factor analysis studies) and/or in the measure of their performance. The source identification is performed by the tool measuring the proximity of any user chemical profile to preloaded repository data (SPECIATE and SPECIEUROPE). The model performances criteria are based on standard statistical indexes calculated by comparing participants' source contribute estimates and their time series with preloaded references data. Those preloaded data refer to previous European SA intercomparison exercises: the first with real world data (22 participants), the second with synthetic data (25 participants) and the last with real world data which was also extended to Chemical Transport Models (38 receptor models and 4 CTMs). The references used for the model performances are 'true' (predefined by JRC) for the synthetic while they are calculated as ensemble average of the participants' results in real world intercomparisons. The candidates used for each source ensemble reference calculation were selected among participants results based on a number of consistency checks plus the similarity between their chemical profiles to the repository measured data. The estimation of the ensemble reference uncertainty is crucial in order to evaluate the users' performances against it. 
For this reason, a sensitivity analysis of different methods for estimating the ensemble references' uncertainties was performed by re-analyzing the synthetic intercomparison dataset, the only one in which both the 'true' reference and the ensemble reference contributions were present. The SA Delta tool is now available on-line and will be presented, together with a critical discussion of the sensitivity analysis on the ensemble reference uncertainty. In particular, the degree of mutual agreement among participants on the presence of a given source should be taken into account. The importance of synthetic intercomparisons for detecting biases common to receptor models will also be stressed.
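
    The ensemble-average step described above can be sketched as follows. This is a minimal illustration, not the JRC implementation: the function name, the single z-score screen, and the threshold `z_max=1.5` are assumptions standing in for the tool's full set of consistency checks and chemical-profile similarity tests, and the uncertainty is reported simply as the spread of the retained candidates.

```python
from statistics import mean, stdev

def ensemble_reference(estimates, z_max=1.5):
    """Illustrative ensemble reference for one source.

    estimates: participants' source-contribution estimates.
    z_max: hypothetical z-score screening threshold; the actual JRC
    tool applies several consistency checks and a chemical-profile
    similarity test not reproduced here.
    Returns (reference value, uncertainty), with the uncertainty
    taken simply as the standard deviation of the retained candidates.
    """
    m, s = mean(estimates), stdev(estimates)
    kept = [e for e in estimates if s == 0.0 or abs(e - m) / s <= z_max]
    return mean(kept), (stdev(kept) if len(kept) > 1 else 0.0)
```

    With one grossly deviating participant among several consistent ones, the screen removes the outlier and the reference settles on the consensus value.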

  3. Rényi entropy measure of noise-aided information transmission in a binary channel.

    PubMed

    Chapeau-Blondeau, François; Rousseau, David; Delahaies, Agnès

    2010-05-01

    This paper analyzes a binary channel by means of information measures based on the Rényi entropy. The analysis extends, and contains as a special case, the classic reference model of binary information transmission based on the Shannon entropy measure. The extended model is used to investigate further possibilities and properties of stochastic resonance, or noise-aided information transmission. The results demonstrate that stochastic resonance occurs in the information channel and is registered by the Rényi entropy measures at any finite order, including the Shannon order. Furthermore, under certain conditions, when seeking the Rényi information measures that best exploit stochastic resonance, nontrivial orders differing from the Shannon case usually emerge. In this way, through binary information transmission, stochastic resonance identifies optimal Rényi measures of information differing from the classic Shannon measure. A comparison of the quantitative information measures with visual perception is also proposed in an experiment on noise-aided binary image transmission.
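
    The Rényi entropy underlying these measures, with the Shannon entropy recovered in the limit of order one, can be written down in a few lines. This is a generic sketch; the paper's channel-specific input-output measures are built from entropies of this form but are not reproduced here.

```python
from math import log2

def renyi_entropy(p, alpha):
    """Renyi entropy (in bits) of a discrete distribution p.

    H_alpha(p) = log2(sum_i p_i**alpha) / (1 - alpha); the limit
    alpha -> 1 is the Shannon entropy, so the classic binary-channel
    analysis is recovered as a special case.
    """
    if abs(alpha - 1.0) < 1e-9:  # Shannon limit
        return -sum(pi * log2(pi) for pi in p if pi > 0.0)
    return log2(sum(pi ** alpha for pi in p)) / (1.0 - alpha)
```

    For an equiprobable binary source the entropy is 1 bit at every order; for a biased source the value decreases as the order alpha grows.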

  4. Reduced-Reference Quality Assessment Based on the Entropy of DWT Coefficients of Locally Weighted Gradient Magnitudes.

    PubMed

    Golestaneh, S Alireza; Karam, Lina

    2016-08-24

    Perceptual image quality assessment (IQA) attempts to use computational models to estimate image quality in accordance with subjective evaluations. Reduced-reference (RR) IQA methods make use of partial information, or features extracted from the reference image, to estimate the quality of distorted images. Finding a balance between the number of RR features and the accuracy of the estimated image quality is essential in IQA. In this paper we propose a training-free, low-cost RR IQA method that requires a very small number of RR features (six). The proposed algorithm, REDLOG, is based on the discrete wavelet transform (DWT) of locally weighted gradient magnitudes. We apply the human visual system's contrast sensitivity and neighborhood gradient information to weight the gradient magnitudes in a locally adaptive manner. The RR features are computed by measuring the entropy of each DWT subband at each scale and pooling the subband entropies along all orientations, resulting in L RR features (one average entropy per scale) for an L-level DWT. Extensive experiments on seven large-scale benchmark databases demonstrate that the proposed method delivers highly competitive performance compared to state-of-the-art RR IQA models, as well as full-reference ones, for both natural and texture images. The MATLAB source code of REDLOG and the evaluation results are publicly available online at http://lab.engineering.asu.edu/ivulab/software/redlog/.
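
    The entropy-pooling idea can be illustrated with a minimal sketch. Assumptions not taken from the paper: a one-level Haar transform stands in for the full L-level DWT, the contrast-sensitivity weighting of the gradient magnitudes is omitted, and a 16-bin histogram entropy is used as the per-subband feature.

```python
from math import log2

def haar_dwt2(img):
    """One-level 2-D Haar DWT of a 2-D list with even dimensions.
    Returns the LL, LH, HL, and HH subbands."""
    # transform rows: pairwise average (low-pass) and difference (high-pass)
    lo = [[(r[i] + r[i + 1]) / 2 for i in range(0, len(r), 2)] for r in img]
    hi = [[(r[i] - r[i + 1]) / 2 for i in range(0, len(r), 2)] for r in img]
    def cols(m, op):  # same pairwise transform down the columns
        return [[op(m[j][c], m[j + 1][c]) for c in range(len(m[0]))]
                for j in range(0, len(m), 2)]
    avg = lambda a, b: (a + b) / 2
    dif = lambda a, b: (a - b) / 2
    return cols(lo, avg), cols(lo, dif), cols(hi, avg), cols(hi, dif)

def subband_entropy(band, bins=16):
    """Shannon entropy of a coefficient histogram -- the pooling step
    that turns a whole subband into a single RR feature."""
    vals = [v for row in band for v in row]
    lo_v, hi_v = min(vals), max(vals)
    width = (hi_v - lo_v) / bins or 1.0  # guard against constant subbands
    hist = [0] * bins
    for v in vals:
        hist[min(int((v - lo_v) / width), bins - 1)] += 1
    n = len(vals)
    return -sum(c / n * log2(c / n) for c in hist if c)
```

    A constant region yields an all-zero HH subband with zero entropy, while structured content spreads coefficients across bins and raises the feature value.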

  5. Identification of reliable gridded reference data for statistical downscaling methods in Alberta

    NASA Astrophysics Data System (ADS)

    Eum, H. I.; Gupta, A.

    2017-12-01

    Climate models provide essential information for assessing the impacts of climate change at regional and global scales. Statistical downscaling methods are applied to prepare climate model data for applications such as hydrologic and ecologic modelling at the watershed scale. As the reliability and (spatial and temporal) resolution of statistically downscaled climate data depend mainly on the reference data, identifying the most reliable reference data is crucial for statistical downscaling. A growing number of gridded climate products are available for key climate variables, which are the main input data for regional modelling systems. However, inconsistencies among these climate products, for example different combinations of climate variables, varying data domains and data lengths, and data accuracy varying with the physiographic characteristics of the landscape, have posed significant challenges in selecting the most suitable reference climate data for environmental studies and modelling. Employing various observation-based daily gridded climate products available in the public domain, i.e. thin-plate spline regression products (ANUSPLIN and TPS), an inverse distance method (Alberta Townships), a numerical climate model (North American Regional Reanalysis), and an optimum interpolation technique (Canadian Precipitation Analysis), this study evaluates the accuracy of the climate products at each grid point by comparison with the Adjusted and Homogenized Canadian Climate Data (AHCCD) observations for precipitation and minimum and maximum temperature over the province of Alberta. Based on the performance of the climate products at AHCCD stations, we ranked the reliability of these publicly available products for station elevations discretized into several classes. According to the rank of the climate products for each elevation class, we identified the most reliable climate products based on the elevation of target points.
A web-based system was developed to allow users to easily select the most reliable reference climate data at each target point based on the elevation of the grid cell. By constructing the best combination of reference data for the study domain, the accuracy and reliability of statistically downscaled climate projections can be significantly improved.

  6. Cultural Omnivorousness and Musical Gentrification: An Outline of a Sociological Framework and Its Applications for Music Education Research

    ERIC Educational Resources Information Center

    Dyndahl, Petter; Karlsen, Sidsel; Skårberg, Odd; Nielsen, Siw Graabraek

    2014-01-01

    In this article, we aim to develop a theoretical model to understand what we refer to as "musical gentrification" and to explore how this model might be applied to and inform music education research. We start from a Bourdieusian point of view, elaborating on the connections between social class and cultural capital, and then move on to…

  7. Software Helps Retrieve Information Relevant to the User

    NASA Technical Reports Server (NTRS)

    Mathe, Natalie; Chen, James

    2003-01-01

    The Adaptive Indexing and Retrieval Agent (ARNIE) is a code library, designed to be used by an application program, that assists human users in retrieving desired information in a hypertext setting. Using ARNIE, the program implements a computational model for interactively learning what information each human user considers relevant in context. The model, called a "relevance network," incrementally adapts retrieved information to users' individual profiles on the basis of feedback from the users regarding specific queries. The model also generalizes such knowledge for the subsequent derivation of relevant references for similar queries and profiles, thereby assisting users in filtering information by relevance. ARNIE thus enables users to categorize and share information of interest in various contexts. ARNIE encodes the relevance and structure of information in a neural network dynamically configured with a genetic algorithm. ARNIE maintains an internal database, wherein it saves associations, and from which it returns associated items in response to a query. A C++ compiler for the platform on which ARNIE will be utilized is necessary for creating the ARNIE library but is not necessary for the execution of the software.

  8. A logical approach to semantic interoperability in healthcare.

    PubMed

    Bird, Linda; Brooks, Colleen; Cheong, Yu Chye; Tun, Nwe Ni

    2011-01-01

    Singapore is in the process of rolling out a number of national e-health initiatives, including the National Electronic Health Record (NEHR). A critical enabler in the journey towards semantic interoperability is a Logical Information Model (LIM) that harmonises the semantics of the information structure with the terminology. The Singapore LIM uses a combination of international standards, including ISO 13606-1 (a reference model for electronic health record communication), ISO 21090 (healthcare datatypes), and SNOMED CT (healthcare terminology). The LIM is accompanied by a logical design approach, used to generate interoperability artifacts, and incorporates mechanisms for achieving unidirectional and bidirectional semantic interoperability.

  9. Interoperability of clinical decision-support systems and electronic health records using archetypes: a case study in clinical trial eligibility.

    PubMed

    Marcos, Mar; Maldonado, Jose A; Martínez-Salvador, Begoña; Boscá, Diego; Robles, Montserrat

    2013-08-01

    Clinical decision-support systems (CDSSs) comprise systems as diverse as sophisticated platforms to store and manage clinical data, tools to alert clinicians of problematic situations, and decision-making tools to assist clinicians. Irrespective of the kind of decision-support task, CDSSs should be smoothly integrated within the clinical information system, interacting with other components, in particular with the electronic health record (EHR). However, despite decades of development, most CDSSs lack interoperability features. We deal with the interoperability problem of CDSSs and EHRs by exploiting the dual-model methodology. This methodology distinguishes a reference model and archetypes. A reference model is a stable, small object-oriented model that describes the generic properties of health record information. For their part, archetypes are reusable, domain-specific definitions of clinical concepts in the form of structured and constrained combinations of the entities of the reference model. We rely on archetypes to make the CDSS compatible with EHRs from different institutions. Concretely, we use archetypes to model the clinical concepts that the CDSS requires, in conjunction with a series of knowledge-intensive mappings relating the archetypes to the data sources (EHR and/or other archetypes) they depend on. We introduce a comprehensive approach, including a set of tools as well as methodological guidelines, to deal with the interoperability of CDSSs and EHRs based on archetypes. Archetypes are used to build a conceptual layer, in the form of a virtual health record (VHR), over the EHR contents that need to be integrated and used in the CDSS, associating them with structural and terminology-based semantics. Subsequently, the archetypes are mapped to the EHR by means of an expressive mapping language and specific-purpose tools.
We also describe a case study in which the tools and methodology were employed in a CDSS to support patient recruitment in the framework of a clinical trial for colorectal cancer screening. The utilisation of archetypes not only proved satisfactory for achieving interoperability between CDSSs and EHRs but also offers various advantages, in particular from a data model perspective. First, the VHR/data models we work with are at a high level of abstraction and can incorporate semantic descriptions. Second, archetypes can potentially deal with different EHR architectures, owing to their deliberate independence from the reference model. Third, the archetype instances we obtain are valid instances of the underlying reference model, which would enable, e.g., feeding data derived by abstraction mechanisms back into the EHR. Lastly, the medical and technical validity of archetype models would be assured, since in principle clinicians should be the main actors in their development. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. Nursing Information Flow in Long-Term Care Facilities.

    PubMed

    Wei, Quan; Courtney, Karen L

    2018-04-01

     Long-term care (LTC), residential care requiring 24-hour nursing services, plays an important role in the health care service delivery system. The purpose of this study was to identify the needed clinical information and information flow to support LTC Registered Nurses (RNs) in care collaboration and clinical decision making.  This descriptive qualitative study combines direct observations and semistructured interviews, conducted at Alberta's LTC facilities between May 2014 and August 2015. The constant comparative method (CCM) of joint coding was used for data analysis.  Nine RNs from six LTC facilities participated in the study. The RN practice environment includes two essential RN information management aspects: information resources and information spaces. Ten commonly used information resources by RNs included: (1) RN-personal notes; (2) facility-specific templates/forms; (3) nursing processes/tasks; (4) paper-based resident profile; (5) daily care plans; (6) RN-notebooks; (7) medication administration records (MARs); (8) reporting software application (RAI-MDS); (9) people (care providers); and (10) references (i.e., books). Nurses used a combination of shared information spaces, such as the Nurses Station or RN-notebook, and personal information spaces, such as personal notebooks or "sticky" notes. Four essential RN information management functions were identified: collection, classification, storage, and distribution. Six sets of information were necessary to perform RN care tasks and communication, including: (1) admission, discharge, and transfer (ADT); (2) assessment; (3) care plan; (4) intervention (with two subsets: medication and care procedure); (5) report; and (6) reference. Based on the RN information management system requirements, a graphic information flow model was constructed.  This baseline study identified key components of a current LTC nursing information management system. 
The information flow model may assist health information technology (HIT) developers in consolidating the design of HIT solutions for LTC, and serve as a communication tool between nurses and information technology (IT) staff to refine requirements and support further LTC HIT research. Schattauer GmbH Stuttgart.

  11. Distributed and Dynamic Storage of Working Memory Stimulus Information in Extrastriate Cortex

    PubMed Central

    Sreenivasan, Kartik K.; Vytlacil, Jason; D'Esposito, Mark

    2015-01-01

    The predominant neurobiological model of working memory (WM) posits that stimulus information is stored via stable elevated activity within highly selective neurons. Based on this model, which we refer to as the canonical model, the storage of stimulus information is largely associated with lateral prefrontal cortex (lPFC). A growing number of studies describe results that cannot be fully explained by the canonical model, suggesting that it is in need of revision. In the present study, we directly test key elements of the canonical model. We analyzed functional MRI data collected as participants performed a task requiring WM for faces and scenes. Multivariate decoding procedures identified patterns of activity containing information about the items maintained in WM (faces, scenes, or both). While information about WM items was identified in extrastriate visual cortex (EC) and lPFC, only EC exhibited a pattern of results consistent with a sensory representation. Information in both regions persisted even in the absence of elevated activity, suggesting that elevated population activity may not represent the storage of information in WM. Additionally, we observed that WM information was distributed across EC neural populations that exhibited a broad range of selectivity for the WM items rather than restricted to highly selective EC populations. Finally, we determined that activity patterns coding for WM information were not stable, but instead varied over the course of a trial, indicating that the neural code for WM information is dynamic rather than static. Together, these findings challenge the canonical model of WM. PMID:24392897

  12. The Neuroanatomical, Neurophysiological and Psychological Basis of Memory: Current Models and Their Origins

    PubMed Central

    Camina, Eduardo; Güell, Francisco

    2017-01-01

    This review aims to classify and clarify, from a neuroanatomical, neurophysiological, and psychological perspective, different memory models that are currently widespread in the literature as well as to describe their origins. We believe it is important to consider previous developments without which one cannot adequately understand the kinds of models that are now current in the scientific literature. This article intends to provide a comprehensive and rigorous overview for understanding and ordering the latest scientific advances related to this subject. The main forms of memory presented include sensory memory, short-term memory, and long-term memory. Information from the world around us is first stored by sensory memory, thus enabling the storage and future use of such information. Short-term memory (or working memory) refers to information processed in a short period of time. Long-term memory allows us to store information for long periods of time, including information that can be retrieved consciously (explicit memory) or unconsciously (implicit memory). PMID:28713278

  13. The Neuroanatomical, Neurophysiological and Psychological Basis of Memory: Current Models and Their Origins.

    PubMed

    Camina, Eduardo; Güell, Francisco

    2017-01-01

    This review aims to classify and clarify, from a neuroanatomical, neurophysiological, and psychological perspective, different memory models that are currently widespread in the literature as well as to describe their origins. We believe it is important to consider previous developments without which one cannot adequately understand the kinds of models that are now current in the scientific literature. This article intends to provide a comprehensive and rigorous overview for understanding and ordering the latest scientific advances related to this subject. The main forms of memory presented include sensory memory, short-term memory, and long-term memory. Information from the world around us is first stored by sensory memory, thus enabling the storage and future use of such information. Short-term memory (or working memory) refers to information processed in a short period of time. Long-term memory allows us to store information for long periods of time, including information that can be retrieved consciously (explicit memory) or unconsciously (implicit memory).

  14. Development of Final A-Fault Rupture Models for WGCEP/ NSHMP Earthquake Rate Model 2

    USGS Publications Warehouse

    Field, Edward H.; Weldon, Ray J.; Parsons, Thomas; Wills, Chris J.; Dawson, Timothy E.; Stein, Ross S.; Petersen, Mark D.

    2008-01-01

    This appendix discusses how we compute the magnitude and rate of earthquake ruptures for the seven Type-A faults (Elsinore, Garlock, San Jacinto, S. San Andreas, N. San Andreas, Hayward-Rodgers Creek, and Calaveras) in the WGCEP/NSHMP Earthquake Rate Model 2 (referred to as ERM 2 hereafter). By definition, Type-A faults are those that have relatively abundant paleoseismic information (e.g., mean recurrence-interval estimates). The first section below discusses segmentation-based models, where ruptures are assumed to be confined to one or more identifiable segments. The second section discusses an unsegmented-model option, the third section discusses results and implications, and we end with a discussion of possible future improvements. General background information can be found in the main report.

  15. How Qualitative Methods Can be Used to Inform Model Development.

    PubMed

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  16. Approaches in highly parameterized inversion: bgaPEST, a Bayesian geostatistical approach implementation with PEST: documentation and instructions

    USGS Publications Warehouse

    Fienen, Michael N.; D'Oria, Marco; Doherty, John E.; Hunt, Randall J.

    2013-01-01

    The application bgaPEST is a highly parameterized inversion software package implementing the Bayesian Geostatistical Approach in a framework compatible with the parameter estimation suite PEST. Highly parameterized inversion refers to cases in which parameters are distributed in space or time and are correlated with one another. The Bayesian aspect of bgaPEST is related to Bayesian probability theory, in which prior information about parameters is formally revised on the basis of the calibration dataset used for the inversion. Conceptually, this approach formalizes the conditionality of the estimated parameters on the specific data and model available. The geostatistical component of the method refers to the way in which prior information about the parameters is used. A geostatistical autocorrelation function is used to enforce structure on the parameters to avoid overfitting and unrealistic results. The Bayesian Geostatistical Approach is designed to provide the smoothest solution that is consistent with the data. Optionally, users can specify a level of fit or estimate a balance between fit and model complexity informed by the data. Groundwater and surface-water applications are used as examples in this text, but the possible uses of bgaPEST extend to any distributed-parameter application.

  17. A new item response theory model to adjust data allowing examinee choice

    PubMed Central

    Costa, Marcelo Azevedo; Braga Oliveira, Rivert Paulo

    2018-01-01

    In a typical questionnaire testing situation, examinees are not allowed to choose which items they answer because of a technical issue in obtaining satisfactory statistical estimates of examinee ability and item difficulty. This paper introduces a new item response theory (IRT) model that incorporates information from a novel representation of questionnaire data using network analysis. Three scenarios in which examinees select a subset of items were simulated. In the first scenario, the assumptions required to apply the standard Rasch model are met, thus establishing a reference for parameter accuracy. The second and third scenarios include five increasing levels of violating those assumptions. The results show substantial improvements over the standard model in item parameter recovery. Furthermore, the accuracy was closer to the reference in almost every evaluated scenario. To the best of our knowledge, this is the first proposal to obtain satisfactory IRT statistical estimates in the last two scenarios. PMID:29389996
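
    For context, the reference in the simulations above is the standard Rasch model, in which the probability of a correct response is a logistic function of examinee ability minus item difficulty. Below is a minimal sketch, with a simple maximum-likelihood ability estimate for known item difficulties; the bisection solver and its bounds are illustrative choices, not the paper's procedure (which estimates parameters jointly and augments the model with network information).

```python
from math import exp

def rasch_p(theta, b):
    """Rasch model: probability that an examinee with ability theta
    answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + exp(-(theta - b)))

def estimate_ability(responses, difficulties, lo=-6.0, hi=6.0):
    """ML ability estimate given known item difficulties.

    The Rasch likelihood equation reduces to matching the observed
    raw score with the expected score sum(P(theta, b_i)), which is
    monotone in theta, so bisection suffices. Assumes at least one
    correct and one incorrect response (otherwise the ML estimate
    is unbounded).
    """
    score = sum(responses)
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if sum(rasch_p(mid, b) for b in difficulties) < score:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

    For a symmetric response pattern, such as answering only the two easiest of four items with symmetrically spaced difficulties, the estimate lands at the centre of the difficulty scale.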

  18. Mass storage system reference model, Version 4

    NASA Technical Reports Server (NTRS)

    Coleman, Sam (Editor); Miller, Steve (Editor)

    1993-01-01

    The high-level abstractions that underlie modern storage systems are identified. The information to generate the model was collected from major practitioners who have built and operated large storage facilities, and represents a distillation of the wisdom they have acquired over the years. The model provides a common terminology and set of concepts to allow existing systems to be examined and new systems to be discussed and built. It is intended that the model and the interfaces identified from it will allow and encourage vendors to develop mutually compatible storage components that can be combined to form integrated storage systems and services. The reference model presents an abstract view of the concepts and organization of storage systems. From this abstraction will come the identification of the interfaces and modules that will be used in IEEE storage system standards. The model is not yet suitable as a standard: it does not contain implementation decisions, such as how abstract objects should be broken up into software modules or how software modules should be mapped to hosts; it does not give policy specifications, such as when files should be migrated; it does not describe how the abstract objects should be used or connected; and it does not refer to specific hardware components. In particular, it does not fully specify the interfaces.

  19. Functional correlates of the lateral and medial entorhinal cortex: objects, path integration and local-global reference frames.

    PubMed

    Knierim, James J; Neunuebel, Joshua P; Deshmukh, Sachin S

    2014-02-05

    The hippocampus receives its major cortical input from the medial entorhinal cortex (MEC) and the lateral entorhinal cortex (LEC). It is commonly believed that the MEC provides spatial input to the hippocampus, whereas the LEC provides non-spatial input. We review new data which suggest that this simple dichotomy between 'where' versus 'what' needs revision. We propose a refinement of this model, which is more complex than the simple spatial-non-spatial dichotomy. MEC is proposed to be involved in path integration computations based on a global frame of reference, primarily using internally generated, self-motion cues and external input about environmental boundaries and scenes; it provides the hippocampus with a coordinate system that underlies the spatial context of an experience. LEC is proposed to process information about individual items and locations based on a local frame of reference, primarily using external sensory input; it provides the hippocampus with information about the content of an experience.

  20. Functional correlates of the lateral and medial entorhinal cortex: objects, path integration and local–global reference frames

    PubMed Central

    Knierim, James J.; Neunuebel, Joshua P.; Deshmukh, Sachin S.

    2014-01-01

    The hippocampus receives its major cortical input from the medial entorhinal cortex (MEC) and the lateral entorhinal cortex (LEC). It is commonly believed that the MEC provides spatial input to the hippocampus, whereas the LEC provides non-spatial input. We review new data which suggest that this simple dichotomy between ‘where’ versus ‘what’ needs revision. We propose a refinement of this model, which is more complex than the simple spatial–non-spatial dichotomy. MEC is proposed to be involved in path integration computations based on a global frame of reference, primarily using internally generated, self-motion cues and external input about environmental boundaries and scenes; it provides the hippocampus with a coordinate system that underlies the spatial context of an experience. LEC is proposed to process information about individual items and locations based on a local frame of reference, primarily using external sensory input; it provides the hippocampus with information about the content of an experience. PMID:24366146

  1. Initiation and Modification of Reaction by Energy Addition: Kinetic and Transport Phenomena

    DTIC Science & Technology

    1990-10-01

    The ignition-delay time ranges from about 2 to 100 ps. The results of a computer-modeling calculation of the chemical kinetics suggest that the… Contents: Program Information; 1.0 Research Objectives; 2.0 Analysis; 3.0 Experiment; References; Appendix I. Evaluating a Simple Model for Laminar-Flame-Propagation Rates: I. Planar Geometry; Appendix II. Evaluating a Simple Model for Laminar-Flame-Propagation Rates: II. Spherical Geometry.

  2. Non-Power Purchase Agreement (PPA) Options for Financing Solar Deployment at Universities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Financing solar using power purchase agreements (PPAs) has facilitated solar deployment of more than 100 megawatts (MW) at universities--as compared to 50 MW facilitated by financing models not using PPAs. This brochure, which overviews existing financing models and funding mechanisms available for solar procurement, focuses on non-PPA financing models. For more information on solar deployment at universities using PPAs, refer to Using Power Purchase Agreements for Solar Deployment at Universities.

  3. Integration of implant planning workflows into the PACS infrastructure

    NASA Astrophysics Data System (ADS)

    Gessat, Michael; Strauß, Gero; Burgert, Oliver

    2008-03-01

    The integration of imaging devices, diagnostic workstations, and image servers into Picture Archiving and Communication Systems (PACS) has had an enormous effect on the efficiency of radiology workflows. The standardization of the information exchange between the devices with the DICOM standard has been an essential precondition for that development. For surgical procedures, no such infrastructure exists. With the increasingly important role computerized planning and assistance systems play in the surgical domain, an infrastructure that unifies the communication between devices becomes necessary. In recent publications, the need for a modularized system design has been established. A reference architecture for a Therapy Imaging and Model Management System (TIMMS) has been proposed. It was accepted by the DICOM Working Group 6 as the reference architecture for DICOM developments for surgery. In this paper we propose the inclusion of implant planning systems into the PACS infrastructure. We propose a generic information model for the patient specific selection and positioning of implants from a repository according to patient image data. The information models are based on clinical workflows from ENT, cardiac, and orthopedic surgery as well as technical requirements derived from different use cases and systems. We show an exemplary implementation of the model for application in ENT surgery: the selection and positioning of an ossicular implant in the middle ear. An implant repository is stored in the PACS. It makes use of an experimental implementation of the Surface Mesh Module that is currently being developed as extension to the DICOM standard.

  4. Quicklook overview of model changes in Melcor 2.2: Rev 6342 to Rev 9496

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humphries, Larry L.

    2017-05-01

    MELCOR 2.2 is a significant official release of the MELCOR code with many new models and model improvements. This report provides the code user with a quick review and characterization of the new models added, changes to existing models, the effect of code changes during this code development cycle (rev 6342 to rev 9496), and a preview of validation results with this code version. More detailed information is found in the code Subversion logs as well as the User Guide and Reference Manuals.

  5. Solving large test-day models by iteration on data and preconditioned conjugate gradient.

    PubMed

    Lidauer, M; Strandén, I; Mäntysaari, E A; Pösö, J; Kettunen, A

    1999-12-01

    A preconditioned conjugate gradient method was implemented into an iteration-on-data program for the estimation of breeding values, and its convergence characteristics were studied. An algorithm was used as a reference in which one fixed effect was solved by the Gauss-Seidel method and the other effects by a second-order Jacobi method. Implementation of the preconditioned conjugate gradient required storing four vectors (size equal to the number of unknowns in the mixed model equations) in random access memory and reading the data at each round of iteration. The preconditioner comprised diagonal blocks of the coefficient matrix. Comparison of the algorithms was based on solutions of mixed model equations obtained with a single-trait animal model and a single-trait, random regression test-day model. Data sets for both models used milk yield records of primiparous Finnish dairy cows. The animal model data comprised 665,629 lactation milk yields and the random regression test-day model data 6,732,765 test-day milk yields. Both models included pedigree information on 1,099,622 animals. The animal model (random regression test-day model) required 122 (305) rounds of iteration to converge with the reference algorithm, but only 88 (149) with the preconditioned conjugate gradient. Solving the random regression test-day model with the preconditioned conjugate gradient required 237 megabytes of random access memory and took 14% of the computation time needed by the reference algorithm.
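
    The preconditioned conjugate gradient scheme described above can be sketched in a few lines. The following is a minimal illustration using a plain diagonal (Jacobi) preconditioner on a toy symmetric positive definite system, not the authors' iteration-on-data implementation.

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=1000):
    """Preconditioned conjugate gradient for a SPD matrix A.
    M_inv applies the inverse preconditioner to a residual vector."""
    x = np.zeros_like(b)
    r = b - A @ x                 # initial residual
    z = M_inv(r)                  # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p  # new search direction
        rz = rz_new
    return x

# Tiny SPD example standing in for the mixed model equations.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
diag_inv = 1.0 / np.diag(A)       # Jacobi preconditioner
x = pcg(A, b, lambda r: diag_inv * r)
```

    In the paper, the preconditioner uses diagonal *blocks* of the coefficient matrix rather than single diagonal entries, and the matrix-vector product is formed by reading the data at each iteration instead of storing A.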

  6. A Phenomenographical Study of Voluntary Digital Exclusion

    ERIC Educational Resources Information Center

    Anderson, Derrick L.

    2012-01-01

    Traditionally scholars have used the digital divide and technology acceptance model definitions when examining why some people elect not to use certain information and communications technologies. When examining the phenomenon referred to as voluntary digital exclusion, the use of these classic definitions is woefully inadequate. They do not…

  7. 48 CFR 52.211-6 - Brand name or equal.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) Brand name, if any; and (ii) Make or model number; (3) Include descriptive literature such as illustrations, drawings, or a clear reference to previously furnished descriptive data or information available... product to make it conform to the solicitation requirements. Mark any descriptive material to clearly show...

  8. 48 CFR 52.211-6 - Brand name or equal.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...) Brand name, if any; and (ii) Make or model number; (3) Include descriptive literature such as illustrations, drawings, or a clear reference to previously furnished descriptive data or information available... product to make it conform to the solicitation requirements. Mark any descriptive material to clearly show...

  9. 48 CFR 52.211-6 - Brand name or equal.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) Brand name, if any; and (ii) Make or model number; (3) Include descriptive literature such as illustrations, drawings, or a clear reference to previously furnished descriptive data or information available... product to make it conform to the solicitation requirements. Mark any descriptive material to clearly show...

  10. 48 CFR 52.211-6 - Brand name or equal.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...) Brand name, if any; and (ii) Make or model number; (3) Include descriptive literature such as illustrations, drawings, or a clear reference to previously furnished descriptive data or information available... product to make it conform to the solicitation requirements. Mark any descriptive material to clearly show...

  11. 48 CFR 52.211-6 - Brand name or equal.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...) Brand name, if any; and (ii) Make or model number; (3) Include descriptive literature such as illustrations, drawings, or a clear reference to previously furnished descriptive data or information available... product to make it conform to the solicitation requirements. Mark any descriptive material to clearly show...

  12. Teaching Reading Sourcebook, Second Edition

    ERIC Educational Resources Information Center

    Honig, Bill; Diamond, Linda; Gutlohn, Linda

    2008-01-01

    The "Teaching Reading Sourcebook, Second Edition" is a comprehensive reference about reading instruction. Organized according to the elements of explicit instruction (what? why? when? and how?), the "Sourcebook" includes both a research-informed knowledge base and practical sample lesson models. It teaches the key elements of an effective reading…

  13. Registering Names and Addresses for Information Technology.

    ERIC Educational Resources Information Center

    Knapp, Arthur A.

    The identification of administrative authorities and the development of associated procedures for registering and accessing names and addresses of communications data systems are considered in this paper. It is noted that, for data communications systems using standards based on the Open Systems Interconnection (OSI) Reference Model specified by…

  14. Effect of price and information on the food choices of women university students in Saudi Arabia: An experimental study.

    PubMed

    Halimic, Aida; Gage, Heather; Raats, Monique; Williams, Peter

    2018-04-01

    To explore the impact of price manipulation and healthy eating information on intended food choices. Health information was provided to a random half of subjects (vs. information on Saudi agriculture). Each subject chose from the same lunch menu, containing two healthy and two unhealthy entrees, desserts and beverages, on five occasions. Reference case prices were 5, 3 and 2 Saudi Arabian Riyals (SARs). Prices of healthy and unhealthy items were manipulated up (taxed) and down (subsidized) by 1 SAR in four menu variations (random order); subjects were given a budget enabling full choice within any menu. The number of healthy food choices was compared across the different price combinations, and between information groups. Linear regression modelling explored the effect of the relative prices of healthy/unhealthy options and of information on the number of healthy choices, controlling for dietary behaviours and hunger levels. University campus, Saudi Arabia, 2013. 99 women students. In the reference case, 49.5% of choices were for healthy items. When the price of healthy items was reduced, 58.5% of selections were healthy; 57.2% when the price of unhealthy items rose. In regression modelling, reducing the price of healthy items and increasing the price of unhealthy items increased the number of healthy choices by 5% and 6% respectively. Students reporting a less healthy usual diet selected significantly fewer healthy items. Providing healthy eating information was not a significant influence. Price manipulation offers potential for altering behaviours to combat rising youth obesity in Saudi Arabia. Copyright © 2018 Elsevier Ltd. All rights reserved.
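
    The regression step described above can be sketched with ordinary least squares; the numbers below are fabricated for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical data: each row is one menu occasion.
# Columns: price of healthy items, price of unhealthy items (SAR).
X = np.array([[2.0, 3.0], [3.0, 3.0], [2.0, 4.0], [3.0, 2.0], [2.5, 2.5]])
y = np.array([3, 2, 4, 1, 2])  # number of healthy items chosen

# Ordinary least squares with an intercept column prepended.
X1 = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
# beta[1] < 0: cheaper healthy items -> more healthy choices;
# beta[2] > 0: pricier unhealthy items -> more healthy choices.
```

    The study's actual model additionally controlled for usual dietary behaviours and hunger levels as covariates.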

  15. The MADE Reference Information Model for Interoperable Pervasive Telemedicine Systems.

    PubMed

    Fung, Nick L S; Jones, Valerie M; Hermens, Hermie J

    2017-03-23

    The main objective is to develop and validate a reference information model (RIM) to support semantic interoperability of pervasive telemedicine systems. The RIM is one component within a larger, computer-interpretable "MADE language" developed by the authors in the context of the MobiGuide project. To validate our RIM, we applied it to a clinical guideline for patients with gestational diabetes mellitus (GDM). The RIM is derived from a generic data flow model of disease management which comprises a network of four types of concurrent processes: Monitoring (M), Analysis (A), Decision (D) and Effectuation (E). This resulting MADE RIM, which was specified using the formal Vienna Development Method (VDM), includes six main, high-level data types representing measurements, observations, abstractions, action plans, action instructions and control instructions. The authors applied the MADE RIM to the complete GDM guideline and derived from it a domain information model (DIM) comprising 61 archetypes, specifically 1 measurement, 8 observation, 10 abstraction, 18 action plan, 3 action instruction and 21 control instruction archetypes. It was observed that there are six generic patterns for transforming different guideline elements into MADE archetypes, although a direct mapping does not exist in some cases. Most notable examples are notifications to the patient and/or clinician as well as decision conditions which pertain to specific stages in the therapy. The results provide evidence that the MADE RIM is suitable for modelling clinical data in the design of pervasive telemedicine systems. Together with the other components of the MADE language, the MADE RIM supports development of pervasive telemedicine systems that are interoperable and independent of particular clinical applications.

  16. The Organizations and Functions of Documentation and Information Centres in Defence and Aerospace Environments

    DTIC Science & Technology

    1988-10-20

    express my sincere gratitude to Mrs. D. Patrinou, of the Hellenic Aerospace Industry, for her valuable and constructive comments during the preparation of...cognitive awareness, we get closer to constructing the elusive user model. 3-7 REFERENCES 1 Ziman, J. Knowing everything about nothing: specialization and...Collection, with information from 18 abstract journals - the Construction Criteria Base. with over 50,000 pages of guide specifications and standards

  17. Comprehensive analysis of information dissemination in disasters

    NASA Astrophysics Data System (ADS)

    Zhang, N.; Huang, H.; Su, Boni

    2016-11-01

    China is a country that experiences a large number of disasters. Deaths caused by large-scale disasters and accidents over the past 10 years number around 900,000, and more than 92.8 percent of these deaths could have been avoided had an effective pre-warning system been deployed. Knowledge of the information dissemination characteristics of different media, taking into account governmental assistance (information published by a government) during disasters in urban areas, plays a critical role in gaining response time and reducing deaths and economic losses. In this paper we develop a comprehensive information dissemination model to optimize the efficiency of pre-warning mechanisms. The model can also be used to disseminate information to evacuees making real-time evacuation plans. We analyzed each information dissemination model for pre-warning in disasters, considering 14 media: short message service (SMS), phone, television, radio, news portals, WeChat, microblogs, email, newspapers, loudspeaker vehicles, loudspeakers, oral communication, and passive information acquisition via the visual and auditory senses. Since governmental assistance is very useful in a disaster, we calculated the sensitivity of the governmental assistance ratio. The results provide useful references for information dissemination during disasters in urban areas.

  18. Modeling financial markets by self-organized criticality

    NASA Astrophysics Data System (ADS)

    Biondo, Alessio Emanuele; Pluchino, Alessandro; Rapisarda, Andrea

    2015-10-01

    We present a financial market model, characterized by self-organized criticality, that is able to generate endogenously a realistic price dynamics and to reproduce well-known stylized facts. We consider a community of heterogeneous traders, composed of chartists and fundamentalists, and focus on the role of informative pressure on market participants, showing how the spreading of information, based on a realistic imitative behavior, drives contagion and causes market fragility. In this model imitation is not intended as a change in the agent's group of origin, but refers only to the price formation process. We also introduce into the community a variable number of random traders in order to study their possible beneficial role in stabilizing the market, as found in other studies. Finally, we suggest some counterintuitive policy strategies able to dampen fluctuations by means of a partial reduction of information.

  19. An evaluation of light intensity functions for determination of shaded reference stream metabolism.

    PubMed

    Zell, Chris; Hubbart, Jason A

    2012-04-30

    The performance of three single-station whole-stream metabolism models was evaluated within three shaded, seasonally hypoxic Missouri reference streams using high resolution (15-minute) dissolved oxygen (DO), temperature, and light intensity data collected during the summers (July-September) of 2006-2008. The model incorporating light intensity data consistently achieved a lower root mean square error (median RMSE = 0.20 mg L(-1)) relative to models assuming sinusoidal light intensity functions (median RMSE = 0.28 mg L(-1)) and constant diel temperature (median RMSE = 0.53 mg L(-1)). Incorporation of site-specific light intensity into metabolism models better predicted morning DO concentrations and exposure to hypoxic conditions in shaded study streams. Model choice significantly affected (p < 0.05) rate estimates for daily average photosynthesis. Low reaeration (pooled site mean 1.1 day(-1) at 20 °C) coupled with summer temperatures (pooled site mean = 25.8 °C) and low to moderate community respiration (site median 1.0-3.0 g O(2) m(-2) day(-1)) yielded diel dissolved oxygen concentrations near or below critical aquatic life thresholds in the studied reference streams. Quantifying these process combinations in best-available or least-disturbed (i.e., reference) systems advances our understanding of regional dissolved oxygen expectations and informs environmental management policy. Additional research is warranted to better link landscape processes with distributed sources that contribute to community respiration. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. Information object definition-based unified modeling language representation of DICOM structured reporting: a case study of transcoding DICOM to XML.

    PubMed

    Tirado-Ramos, Alfredo; Hu, Jingkun; Lee, K P

    2002-01-01

    Supplement 23 to DICOM (Digital Imaging and Communications in Medicine), Structured Reporting, is a specification that supports a semantically rich representation of image and waveform content, enabling experts to share image and related patient information. DICOM SR supports the representation of textual and coded data linked to images and waveforms. Nevertheless, the medical information technology community needs models that work as bridges between the DICOM relational model and open object-oriented technologies. The authors assert that representations of the DICOM Structured Reporting standard, using object-oriented modeling languages such as the Unified Modeling Language, can provide a high-level reference view of the semantically rich framework of DICOM and its complex structures. They have produced an object-oriented model to represent the DICOM SR standard and have derived XML-exchangeable representations of this model using World Wide Web Consortium specifications. They expect the model to benefit developers and system architects who are interested in developing applications that are compliant with the DICOM SR specification.
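
    The idea of an object-oriented SR model with an XML-exchangeable representation can be sketched as follows; the class name, attributes, and tags are illustrative stand-ins, not the actual DICOM SR or Supplement 23 schema.

```python
import xml.etree.ElementTree as ET

# A DICOM SR document is a tree of content items; each item has a value
# type, a coded concept name, and either a value or child items.
class ContentItem:
    def __init__(self, value_type, concept, value=None, children=()):
        self.value_type = value_type
        self.concept = concept
        self.value = value
        self.children = list(children)

    def to_xml(self):
        """Transcode this item (and its subtree) to an XML element."""
        el = ET.Element("ContentItem", valueType=self.value_type,
                        concept=self.concept)
        if self.value is not None:
            el.text = str(self.value)
        for child in self.children:
            el.append(child.to_xml())
        return el

finding = ContentItem("TEXT", "Finding", "No acute abnormality")
root = ContentItem("CONTAINER", "Chest X-ray Report", children=[finding])
xml_str = ET.tostring(root.to_xml(), encoding="unicode")
```

    The nesting of content items mirrors the by-reference and by-value relationships of the SR tree, which is what makes a UML class model a natural intermediate representation.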

  1. Automatic generation of computable implementation guides from clinical information models.

    PubMed

    Boscá, Diego; Maldonado, José Alberto; Moner, David; Robles, Montserrat

    2015-06-01

    Clinical information models are increasingly used to describe the contents of Electronic Health Records. Implementation guides are a common specification mechanism used to define such models. They contain, among other reference materials, all the constraints and rules that clinical information must obey. However, these implementation guides are typically oriented toward human readability and thus cannot be processed by computers. As a consequence, they must be reinterpreted and transformed manually into an executable language such as Schematron or Object Constraint Language (OCL). This task can be difficult and error prone due to the large gap between the two representations. The challenge is to specify implementation guides in such a way that humans can read and understand them easily while computers can process them. In this paper, we propose and describe a novel methodology that uses archetypes as the basis for generating implementation guides. We use archetypes to generate formal rules expressed in Natural Rule Language (NRL) and other reference materials usually included in implementation guides, such as sample XML instances. We also generate Schematron rules from the NRL rules to be used for the validation of data instances. We have implemented these methods in LinkEHR, an archetype editing platform, and exemplify our approach by generating NRL rules and implementation guides from EN ISO 13606, openEHR, and HL7 CDA archetypes. Copyright © 2015 Elsevier Inc. All rights reserved.
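
    The generation of an executable validation rule from a declarative constraint can be sketched as follows; the constraint format, context path, and XPath test are illustrative assumptions, not LinkEHR's actual internal representation.

```python
# A declarative constraint of the kind that might be extracted from an
# archetype (all names below are hypothetical, for illustration only).
constraint = {
    "context": "/EHR_EXTRACT/all_compositions",
    "path": "name/originalText",
    "op": "equals",
    "value": "Medication list",
}

def to_schematron(c):
    """Emit a Schematron rule/assert pair for one simple constraint."""
    test = {"equals": f'{c["path"]} = "{c["value"]}"'}[c["op"]]
    return (f'<rule context="{c["context"]}">\n'
            f'  <assert test="{test}">Expected {c["path"]} '
            f'to equal "{c["value"]}".</assert>\n'
            f'</rule>')

rule = to_schematron(constraint)
```

    The point of such a pipeline is that the same declarative source can also render a human-readable sentence for the implementation guide, keeping the two representations in sync.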

  2. Comparing models of the combined-stimulation advantage for speech recognition.

    PubMed

    Micheyl, Christophe; Oxenham, Andrew J

    2012-05-01

    The "combined-stimulation advantage" refers to an improvement in speech recognition when cochlear-implant or vocoded stimulation is supplemented by low-frequency acoustic information. Previous studies have been interpreted as evidence for "super-additive" or "synergistic" effects in the combination of low-frequency and electric or vocoded speech information by human listeners. However, this conclusion was based on predictions of performance obtained using a suboptimal high-threshold model of information combination. The present study shows that a different model, based on Gaussian signal detection theory, can predict surprisingly large combined-stimulation advantages, even when performance with either information source alone is close to chance, without involving any synergistic interaction. A reanalysis of published data using this model reveals that previous results, which have been interpreted as evidence for super-additive effects in perception of combined speech stimuli, are actually consistent with a more parsimonious explanation, according to which the combined-stimulation advantage reflects an optimal combination of two independent sources of information. The present results do not rule out the possible existence of synergistic effects in combined stimulation; however, they emphasize the possibility that the combined-stimulation advantages observed in some studies can be explained simply by non-interactive combination of two information sources.
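
    The non-synergistic combination of two independent Gaussian information sources can be illustrated numerically: under signal detection theory, sensitivities of independent cues add in quadrature, so two weak cues together already predict a sizeable advantage. The 2AFC mapping below is a standard textbook relation, used here only for illustration.

```python
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def pc_2afc(d):
    """Proportion correct in a 2AFC task for sensitivity d'."""
    return phi(d / sqrt(2.0))

# Optimal combination of two independent cues: d' adds in quadrature.
d1, d2 = 0.5, 0.5
d_comb = sqrt(d1 ** 2 + d2 ** 2)
pc_single = pc_2afc(d1)        # each cue alone: barely above chance
pc_combined = pc_2afc(d_comb)  # clear improvement, no synergy required
```

    This is the sense in which a large combined-stimulation advantage is consistent with purely independent information sources.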

  3. Designing Excellence and Quality Model for Training Centers of Primary Health Care: A Delphi Method Study.

    PubMed

    Tabrizi, Jafar-Sadegh; Farahbakhsh, Mostafa; Shahgoli, Javad; Rahbar, Mohammad Reza; Naghavi-Behzad, Mohammad; Ahadi, Hamid-Reza; Azami-Aghdash, Saber

    2015-10-01

    Excellence and quality models are comprehensive methods for improving the quality of healthcare. The aim of this study was to design an excellence and quality model for training centers of primary health care using the Delphi method. First, comprehensive information was collected through a literature review. From the extracted references, 39 models from 34 countries were identified, and the related sub-criteria and standards were extracted from 34 of these 39 models. A primary pattern comprising 8 criteria, 55 sub-criteria, and 236 standards was then developed as a Delphi questionnaire and evaluated in four stages by 9 health care system specialists in Tabriz and 50 specialists from all around the country. After the four stages of evaluation by the specialists, the primary model (8 criteria, 55 sub-criteria, and 236 standards) was reduced to 8 criteria, 45 sub-criteria, and 192 standards. The major criteria of the model are leadership, strategic and operational planning, resource management, information analysis, human resources management, process management, customer results, and functional results; the specialists allocated a total score of 1000 across the criteria. Functional results received the maximum score of 195, whereas planning received the minimum score of 60. Furthermore, leadership had the most sub-criteria (10) and strategic planning the fewest (3). The model introduced in this research was designed following 34 reference models from around the world. This model could provide a proper frame for managers of the health system in improving quality.

  4. Safety assessment of plant varieties using transcriptomics profiling and a one-class classifier.

    PubMed

    van Dijk, Jeroen P; de Mello, Carla Souza; Voorhuijzen, Marleen M; Hutten, Ronald C B; Arisi, Ana Carolina Maisonnave; Jansen, Jeroen J; Buydens, Lutgarde M C; van der Voet, Hilko; Kok, Esther J

    2014-10-01

    An important part of the current hazard identification of novel plant varieties is comparative targeted analysis of the novel and reference varieties. Comparative analysis will become much more informative with unbiased analytical approaches, e.g. omics profiling. Data analysis estimating the similarity of new varieties to a reference baseline class of known safe varieties would subsequently greatly facilitate hazard identification. Further biological and eventually toxicological analysis would then only be necessary for varieties that fall outside this reference class. For this purpose, a one-class classifier tool was explored to assess and classify transcriptome profiles of potato (Solanum tuberosum) varieties in a model study. Profiles of six different varieties, two growing locations, and two years of harvest, including biological and technical replication, were used to build the model. Two scenarios were applied, representing evaluation of a 'different' variety and of a 'similar' variety. Within the model, higher class distances resulted for the 'different' test set than for the 'similar' test set. The present study may contribute to a more global hazard identification of novel plant varieties. Copyright © 2014 Elsevier Inc. All rights reserved.
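
    The one-class idea can be sketched with a simple centroid-and-threshold rule on synthetic profiles; the study itself used a more sophisticated classifier, so this is only an illustration of the "distance to a reference baseline class" concept.

```python
import numpy as np

rng = np.random.default_rng(0)

# Reference baseline class: transcriptome profiles of "known safe"
# varieties (rows = samples, columns = genes); values are synthetic.
reference = rng.normal(0.0, 1.0, size=(40, 5))
centroid = reference.mean(axis=0)

# Acceptance threshold from the reference class itself: the 95th
# percentile of within-class distances to the centroid.
ref_dist = np.linalg.norm(reference - centroid, axis=1)
threshold = np.quantile(ref_dist, 0.95)

def in_reference_class(profile):
    """True if the profile falls inside the reference baseline class."""
    return np.linalg.norm(profile - centroid) <= threshold

different = rng.normal(5.0, 1.0, size=5)   # strongly shifted profile
```

    A 'similar' variety falls inside the threshold and needs no further analysis; a 'different' one, like the shifted profile above, is flagged for follow-up.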

  5. Sentiment analysis in twitter data using data analytic techniques for predictive modelling

    NASA Astrophysics Data System (ADS)

    Razia Sulthana, A.; Jaithunbi, A. K.; Sai Ramesh, L.

    2018-04-01

    Sentiment analysis refers to the natural language processing task of determining whether a piece of text contains subjective information and what kind of subjective information it expresses. The subjective information represents the attitude behind the text: positive, negative or neutral. Automatically understanding the opinions behind user-generated content is of great interest. We analyzed a large volume of tweets, treated as big data, classifying the polarity of words, sentences, or entire documents. We use linear regression to model the relationship between a scalar dependent variable Y and one or more explanatory (independent) variables denoted X. We conduct a series of experiments to test the performance of the system.
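
    The polarity classification step can be sketched with a tiny lexicon-based scorer; this is a hedged illustration of the idea, not the authors' pipeline, and the word lists are made up.

```python
# Minimal lexicon-based polarity scorer (illustrative word lists).
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "terrible", "hate", "poor", "sad"}

def polarity(text):
    """Classify text as positive, negative, or neutral by word counts."""
    words = text.lower().split()
    score = (sum(w in POSITIVE for w in words)
             - sum(w in NEGATIVE for w in words))
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

label = polarity("I love this great phone")
```

    In a predictive-modelling setting, such per-tweet scores become features of the explanatory variables X that the linear regression relates to the outcome Y.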

  6. Parallel updating and weighting of multiple spatial maps for visual stability during whole body motion

    PubMed Central

    Medendorp, W. P.

    2015-01-01

    It is known that the brain uses multiple reference frames to code spatial information, including eye-centered and body-centered frames. When we move our body in space, these internal representations are no longer in register with external space, unless they are actively updated. Whether the brain updates multiple spatial representations in parallel, or whether it restricts its updating mechanisms to a single reference frame from which other representations are constructed, remains an open question. We developed an optimal integration model to simulate the updating of visual space across body motion in multiple or single reference frames. To test this model, we designed an experiment in which participants had to remember the location of a briefly presented target while being translated sideways. The behavioral responses were in agreement with a model that uses a combination of eye- and body-centered representations, weighted according to the reliability in which the target location is stored and updated in each reference frame. Our findings suggest that the brain simultaneously updates multiple spatial representations across body motion. Because both representations are kept in sync, they can be optimally combined to provide a more precise estimate of visual locations in space than based on single-frame updating mechanisms. PMID:26490289
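
    The reliability-weighted combination at the heart of such optimal integration models can be illustrated with inverse-variance weighting of two position estimates; the numbers are illustrative.

```python
# Inverse-variance (reliability-weighted) combination of two estimates
# of the same location, e.g. one eye-centered and one body-centered.
def combine(mu1, var1, mu2, var2):
    w1 = (1 / var1) / (1 / var1 + 1 / var2)  # weight of estimate 1
    w2 = 1 - w1
    mu = w1 * mu1 + w2 * mu2
    var = 1 / (1 / var1 + 1 / var2)  # combined variance is always smaller
    return mu, var

mu, var = combine(10.0, 4.0, 12.0, 4.0)
```

    Because the combined variance is below either input variance, keeping both representations in sync yields a more precise location estimate than either single-frame updating mechanism alone, which is the paper's central claim.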

  7. Research on Geo-information Data Model for Preselected Areas of Geological Disposal of High-level Radioactive Waste

    NASA Astrophysics Data System (ADS)

    Gao, M.; Huang, S. T.; Wang, P.; Zhao, Y. A.; Wang, H. B.

    2016-11-01

    The geological disposal of high-level radioactive waste (hereinafter "geological disposal") is a long-term, complex, and systematic scientific project. The data and information resources produced during research and development (hereinafter "R&D") provide significant support for the R&D of the geological disposal system and lay a foundation for the long-term stability and safety assessment of the repository site. However, the data related to research and engineering in the siting of geological disposal repositories are complicated (multi-source, multi-dimensional, and changeable), and the requirements for data accuracy and comprehensive application have become much higher than before, so the design of the geo-information data model for the disposal repository faces serious challenges. In this paper, the data resources of the pre-selected areas of the repository are comprehensively surveyed and systematically analyzed. Based on a deep understanding of the application requirements, this work addresses key technical problems including a reasonable classification system for multi-source data entities, complex logical relations, and effective physical storage structures. The solution moves beyond the data classification and conventional spatial data organization models applied in traditional industries, and organizes and integrates data around data entities and their spatial relationships, which are independent, complete, and significant for applications in HLW geological disposal. Reasonable, feasible, and flexible conceptual, logical, and physical data models have been established to ensure the effective integration, and to facilitate the application development, of multi-source data in pre-selected areas for geological disposal.

  8. On Meaningful Measurement: Concepts, Technology and Examples.

    ERIC Educational Resources Information Center

    Cheung, K. C.

    This paper discusses how concepts and procedural skills in problem-solving tasks, as well as affects and emotions, can be subjected to meaningful measurement (MM), based on a multisource model of learning and a constructivist information-processing theory of knowing. MM refers to the quantitative measurement of conceptual and procedural knowledge…

  9. Second Generation Weather Impacts Decision Aid User’s Manual

    DTIC Science & Technology

    2013-09-01

    from the pulldown Base Reference Time menu. Most models start at times based on Coordinated Universal Time (UTC) or Zulu time (Z) with the selections...Effects Matrix Z Zulu time 31 No. of Copies Organization 1 DEFENSE TECHNICAL (PDF) INFORMATION CTR DTIC OCA 1 DIRECTOR (PDF) US

  10. Archive, Access, and Supply of Scientifically Derived Data: A Data Model for Multi-Parameterized Querying Where Spectral Data Base Meets GIS-Based Mapping Archive

    NASA Astrophysics Data System (ADS)

    Nass, A.; D'Amore, M.; Helbert, J.

    2018-04-01

    Based on recent discussions within information science and management, an archiving structure and a reference level for derived, already published data significantly support the scientific community by enabling a constant growth of knowledge and understanding.

  11. Sex Education for Deaf-Blind Youths and Adults.

    ERIC Educational Resources Information Center

    Ingraham, Cynthia L.; Vernon, McCay; Clemente, Brenda; Olney, Linda

    2000-01-01

    This article describes a model sex education program developed for youths and adults who are deafblind by the Helen Keller National Center for Deaf-Blind Youths and Adults. It also discusses major related issues and presents general recommendations and a resource for further information. (Contains 11 references.) (Author/CR)

  12. A spatial reference frame model of Beijing based on spatial cognitive experiment

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Zhang, Jing; Liu, Yu

    2006-10-01

    Orientation relations are among the most important spatial relations in GIS. People obtain orientation information through map reading and cognition of the surrounding environment, and then create a spatial reference frame. A city is a special kind of spatial environment, and a person with life experience has some spatial knowledge about the city in which he or she lives. Based on this spatial knowledge of the city environment, people can position themselves, navigate, and correctly understand the meaning embodied in the environment. Beijing, as a real geographic space, has a very distinctive layout that can form a new kind of spatial reference frame. Based on the characteristics of the layout of Beijing, this paper introduces a new spatial reference frame of Beijing and uses two psychological experiments to validate its cognitive plausibility.

  13. Reconstruction of hyperspectral image using matting model for classification

    NASA Astrophysics Data System (ADS)

    Xie, Weiying; Li, Yunsong; Ge, Chiru

    2016-05-01

    Although hyperspectral images (HSIs) captured by satellites provide much information in spectral regions, some bands are redundant or contain large amounts of noise and are not suitable for image analysis. To address this problem, we introduce a method for reconstructing the HSI with noise reduction and contrast enhancement using a matting model for the first time. In the matting model, each spectral band of an HSI is decomposed into three components: an alpha channel, a spectral foreground, and a spectral background. First, one spectral band of the HSI with more refined information than most other bands is selected and treated as the alpha channel of the HSI, from which the spectral foreground and spectral background are estimated. Finally, a combination operation is applied to reconstruct the HSI. In addition, the support vector machine (SVM) classifier and three sparsity-based classifiers, i.e., orthogonal matching pursuit (OMP), simultaneous OMP, and OMP based on first-order neighborhood system weighted classifiers, are utilized on the reconstructed HSI and the original HSI to verify the effectiveness of the proposed method. Specifically, using the reconstructed HSI, the average accuracy of the SVM classifier can be improved by as much as 19%.
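
    The matting decomposition of one band, band = alpha * F + (1 - alpha) * B, can be illustrated with the forward recombination step; estimating F and B from the chosen alpha band is the hard part addressed by the paper and is not shown, so the arrays below are synthetic.

```python
import numpy as np

# Matting model for one spectral band: band = alpha * F + (1 - alpha) * B.
rng = np.random.default_rng(1)
alpha = rng.uniform(0.0, 1.0, size=(4, 4))   # refined "alpha channel" band
F = rng.uniform(0.5, 1.0, size=(4, 4))       # spectral foreground (synthetic)
B = rng.uniform(0.0, 0.2, size=(4, 4))       # spectral background (synthetic)

# Forward recombination: per-pixel blend of foreground and background.
reconstructed = alpha * F + (1.0 - alpha) * B
```

    In the paper this recombination, applied band by band with denoised components, yields the reconstructed HSI that the classifiers are run on.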

  14. Exchanging honest employment references: avoiding the traps of defamation and negligent hiring.

    PubMed

    McConnell, Charles R

    2015-01-01

    In present-day reference checking, many of the same organizations that seek as much information as possible about people they wish to hire resist giving out more than a bare minimum of information to other organizations. The strongest force driving this minimal reference information release is fear of legal action taken because of something said about an individual in a reference response. Many employers seem so frightened of being sued that they share nothing of substance, usually not realizing that in supposedly protecting themselves against defamation charges they are sometimes increasing the risk of negligent hiring charges. However, truthful reference information can be provided with minimal risk if it is provided in good faith, given only to those who have a legitimate need to know, is strictly job related, and is not communicated maliciously. References must always be provided objectively with information verifiable in personnel files.

  15. Data Publication Process for CMIP5 Data and the Role of PIDs within Federated Earth System Science Projects

    NASA Astrophysics Data System (ADS)

    Stockhause, M.; Höck, H.; Toussaint, F.; Weigel, T.; Lautenschlager, M.

    2012-12-01

    We present the publication process for the CMIP5 (Coupled Model Intercomparison Project Phase 5) data with special emphasis on the current role of identifiers and the potential future role of PIDs in such distributed technical infrastructures. The DataCite data publication with DOI assignment finalizes the three-level quality control procedure for CMIP5 data (Stockhause et al., 2012). WDCC utilizes the Assistant System Atarrabi to support the publication process. Atarrabi is a web-based workflow system for metadata reviews by data creators and Publication Agents (PAs). Within the quality checks for level 3, all available information in the different infrastructure components is cross-checked for consistency by the DataCite PA. This information includes: metadata on data, metadata in the long-term archive of the Publication Agency, quality information, and external metadata on model and simulation (CIM). For these consistency checks, metadata related to the data publication has to be identified. The Data Reference Syntax (DRS) convention functions as a global identifier for data. Since the DRS structures the data hierarchically, it can be used to identify data collections like DataCite publication units, i.e. all data belonging to a CMIP5 simulation. Every technical component of the infrastructure uses the DRS or maps to it, but there is no central repository storing DRS_ids. Thus they occasionally have to be mapped. Additional local identifiers are used within the different technical infrastructure components. Identification of related pieces of information in their repositories is cumbersome and tricky for the PA. How could PIDs improve the situation? To establish a reliable distributed data and metadata infrastructure, PIDs for all objects are needed, as well as the relations between them. An ideal data publication scenario for federated community projects within Earth System Sciences, e.g. CMIP, would be: 1. 
Data creators at the modeling centers define their simulation, related metadata, and software, which are assigned PIDs. 2. During ESGF data publication, the data entities are assigned PIDs with references to the PIDs of step 1. Since we deal with different hierarchical levels, the definition of collections on these levels is advantageous. A possible implementation concept using Handles is described by Weigel et al. (2012). 3. Quality results are assigned PID(s) and a reference to the data. A quality PID is added as a reference to the data collection PID. 4. The PA accesses the PID on the data collection to get the data and all related information for cross-checking. The presented example of the technical infrastructure for the CMIP5 data distribution shows the importance of PIDs, especially as the data is distributed over multiple repositories worldwide and additional separate pieces of data-related information are collected independently of the data. References: Stockhause, M., Höck, H., Toussaint, F., Lautenschlager, M. (2012): 'Quality assessment concept of the World Data Center for Climate and its application to CMIP5 data', Geosci. Model Dev. Discuss., 5, 781-802, doi:10.5194/gmdd-5-781-2012. Weigel, T., et al. (2012): 'Structural Elements in a Persistent Identifier Infrastructure and Resulting Benefits for the Earth Science Community', submitted to AGU 2012 Session IN009.
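The way a hierarchical DRS identifier lets every component point at the same publication unit can be sketched as follows. The facet list is a simplified subset of the full CMIP5 DRS (which also carries MIP table, version and variable facets), and the example path is invented for illustration.

```python
# Simplified facet ordering; the real CMIP5 DRS defines more facets.
FACETS = ["activity", "product", "institute", "model",
          "experiment", "frequency", "realm", "ensemble"]

def parse_drs(path):
    """Split a slash-separated DRS-style identifier into named facets."""
    parts = path.strip("/").split("/")
    return dict(zip(FACETS, parts))

def simulation_key(record):
    """A DataCite publication unit collects all data of one simulation,
    so the grouping key ignores frequency/realm of individual datasets."""
    return (record["institute"], record["model"],
            record["experiment"], record["ensemble"])

r = parse_drs("cmip5/output1/MPI-M/MPI-ESM-LR/historical/mon/atmos/r1i1p1")
print(simulation_key(r))  # → ('MPI-M', 'MPI-ESM-LR', 'historical', 'r1i1p1')
```

Two datasets that differ only in, say, frequency map to the same simulation key, which is exactly the collection-level identification the abstract describes.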

  16. NKG201xGIA - first results for a new model of glacial isostatic adjustment in Fennoscandia

    NASA Astrophysics Data System (ADS)

    Steffen, Holger; Barletta, Valentina; Kollo, Karin; Milne, Glenn A.; Nordman, Maaria; Olsson, Per-Anders; Simpson, Matthew J. R.; Tarasov, Lev; Ågren, Jonas

    2016-04-01

    Glacial isostatic adjustment (GIA) is a dominant process in northern Europe, which is observed with several geodetic and geophysical methods. The observed land uplift due to this process amounts to about 1 cm/year in the northern Gulf of Bothnia. GIA affects the establishment and maintenance of reliable geodetic and gravimetric reference networks in the Nordic countries. To support a high level of accuracy in the determination of position, adequate corrections have to be applied with dedicated models. Currently, there are efforts within a Nordic Geodetic Commission (NKG) activity towards a model of glacial isostatic adjustment for Fennoscandia. The new model, NKG201xGIA, to be developed in the near future, will complement the forthcoming empirical NKG land uplift model, which will replace the currently used empirical land uplift model NKG2005LU (Ågren & Svensson, 2007). Together, the models will be a reference for vertical and horizontal motion, gravity and geoid change, and more. NKG201xGIA will also provide uncertainty estimates for each field. Following earlier investigations, the GIA model is based on a combination of an ice and an earth model. The selected reference ice model, GLAC, for Fennoscandia, the Barents/Kara seas and the British Isles is provided by Lev Tarasov and co-workers. Tests of different ice and earth models will be performed based on the expertise of each involved modeler. This includes studies on high-resolution ice sheets, different rheologies, lateral variations in lithosphere and mantle viscosity, and more. This will also be done in co-operation with scientists outside NKG who help in the development and testing of the model. References: Ågren, J., Svensson, R. (2007): Postglacial Land Uplift Model and System Definition for the New Swedish Height System RH 2000. Reports in Geodesy and Geographical Information Systems Rapportserie, LMV-Rapport 4, Lantmäteriet, Gävle.

  17. An Integrative Account of Constraints on Cross-Situational Learning

    PubMed Central

    Yurovsky, Daniel; Frank, Michael C.

    2015-01-01

    Word-object co-occurrence statistics are a powerful information source for vocabulary learning, but there is considerable debate about how learners actually use them. While some theories hold that learners accumulate graded, statistical evidence about multiple referents for each word, others suggest that they track only a single candidate referent. In two large-scale experiments, we show that neither account is sufficient: Cross-situational learning involves elements of both. Further, the empirical data are captured by a computational model that formalizes how memory and attention interact with co-occurrence tracking. Together, the data and model unify opposing positions in a complex debate and underscore the value of understanding the interaction between computational and algorithmic levels of explanation. PMID:26302052

  18. Petroleum Refinery Jobs and Economic Development Impact (JEDI) Model User Reference Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Marshall

    The Jobs and Economic Development Impact (JEDI) models, developed through the National Renewable Energy Laboratory (NREL), are user-friendly tools utilized to estimate the economic impacts at the local level of constructing and operating fuel and power generation projects for a range of conventional and renewable energy technologies. The JEDI Petroleum Refinery Model User Reference Guide was developed to assist users in employing and understanding the model. This guide provides information on the model's underlying methodology, as well as the parameters and references used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features, operation of the model, and a discussion of how the results should be interpreted. Based on project-specific inputs from the user, the model estimates job creation, earning and output (total economic activity) for a given petroleum refinery. This includes the direct, indirect and induced economic impacts to the local economy associated with the refinery's construction and operation phases. Project cost and job data used in the model are derived from the most current cost estimations available. Local direct and indirect economic impacts are estimated using economic multipliers derived from IMPLAN software. By determining the regional economic impacts and job creation for a proposed refinery, the JEDI Petroleum Refinery model can be used to field questions about the added value refineries may bring to the local community.

  19. The total laboratory solution: a new laboratory E-business model based on a vertical laboratory meta-network.

    PubMed

    Friedman, B A

    2001-08-01

    Major forces are now reshaping all businesses on a global basis, including the healthcare and clinical laboratory industries. One of the major forces at work is information technology (IT), which now provides the opportunity to create a new economic and business model for the clinical laboratory industry based on the creation of an integrated vertical meta-network, referred to here as the "total laboratory solution" (TLS). Participants at the most basic level of such a network would include a hospital-based laboratory, a reference laboratory, a laboratory information system/application service provider/laboratory portal vendor, an in vitro diagnostic manufacturer, and a pharmaceutical/biotechnology manufacturer. It is suggested that each of these participants would add value to the network primarily in its area of core competency. Subvariants of such a network have evolved over recent years, but a TLS comprising all or most of these participants does not exist at this time. Although the TLS, enabled by IT and closely akin to the various e-businesses that are now taking shape, offers many advantages from a theoretical perspective over the current laboratory business model, its success will depend largely on (a) market forces, (b) how the collaborative networks are organized and managed, and (c) whether the network can offer healthcare organizations higher quality testing services at lower cost. If the concept is successful, new demands will be placed on hospital-based laboratory professionals to shift the range of professional services that they offer toward clinical consulting, integration of laboratory information from multiple sources, and laboratory information management. These information management and integration tasks can only increase in complexity in the future as new genomic and proteomics testing modalities are developed and come on-line in clinical laboratories.

  20. Business Process Modelling is an Essential Part of a Requirements Analysis. Contribution of EFMI Primary Care Working Group.

    PubMed

    de Lusignan, S; Krause, P; Michalakidis, G; Vicente, M Tristan; Thompson, S; McGilchrist, M; Sullivan, F; van Royen, P; Agreus, L; Desombre, T; Taweel, A; Delaney, B

    2012-01-01

    To perform a requirements analysis of the barriers to conducting research linking primary care, genetic and cancer data. We extended our initial data-centric approach to include socio-cultural and business requirements. We created reference models of core data requirements common to most studies using unified modelling language (UML), dataflow diagrams (DFD) and business process modelling notation (BPMN). We conducted a stakeholder analysis and constructed DFD and UML diagrams for use cases based on simulated research studies. We used research output as a sensitivity analysis. Differences between the reference model and use cases identified study-specific data requirements. The stakeholder analysis identified: tensions, changes in specification, some indifference from data providers, and enthusiastic informaticians urging inclusion of socio-cultural context. We identified requirements to collect information at three levels: micro (data items, which need to be semantically interoperable), meso (the medical record and data extraction), and macro (the health system and socio-cultural issues). BPMN clarified complex business requirements among data providers and vendors, and additional geographical requirements for patients to be represented in both linked datasets. High quality research output was the norm for most repositories. Reference models provide high-level schemata of the core data requirements. However, business requirements modelling identifies stakeholder issues and what needs to be addressed to enable participation.

  1. Long-term pavement performance ancillary information management system (AIMS) reference guide.

    DOT National Transportation Integrated Search

    2012-11-01

    This document provides information on the Long-Term Pavement Performance (LTPP) program ancillary information. : Ancillary information includes data, images, reference materials, resource documents, and other information that : support and extend the...

  2. Building a semantic web-based metadata repository for facilitating detailed clinical modeling in cancer genome studies.

    PubMed

    Sharma, Deepak K; Solbrig, Harold R; Tao, Cui; Weng, Chunhua; Chute, Christopher G; Jiang, Guoqian

    2017-06-05

    Detailed Clinical Models (DCMs) have been regarded as the basis for retaining computable meaning when data are exchanged between heterogeneous computer systems. To better support clinical cancer data capturing and reporting, there is an emerging need to develop informatics solutions for standards-based clinical models in cancer study domains. The objective of the study is to develop and evaluate a cancer genome study metadata management system that serves as a key infrastructure in supporting clinical information modeling in cancer genome study domains. We leveraged a Semantic Web-based metadata repository enhanced with both the ISO 11179 metadata standard and the Clinical Information Modeling Initiative (CIMI) Reference Model. We used the common data elements (CDEs) defined in The Cancer Genome Atlas (TCGA) data dictionary, and extracted the metadata of the CDEs using the NCI Cancer Data Standards Repository (caDSR) CDE dataset rendered in the Resource Description Framework (RDF). The ITEM/ITEM_GROUP pattern defined in the latest CIMI Reference Model is used to represent reusable model elements (mini-Archetypes). We produced a metadata repository with 38 clinical cancer genome study domains, comprising a rich collection of mini-Archetype pattern instances. We performed a case study of the domain "clinical pharmaceutical" in the TCGA data dictionary and demonstrated that the enriched data elements in the metadata repository are very useful in support of building detailed clinical models. Our informatics approach leveraging Semantic Web technologies provides an effective way to build a CIMI-compliant metadata repository that would facilitate detailed clinical modeling to support use cases beyond TCGA in clinical cancer study domains.

  3. Using language models to identify relevant new information in inpatient clinical notes.

    PubMed

    Zhang, Rui; Pakhomov, Serguei V; Lee, Janet T; Melton, Genevieve B

    2014-01-01

    Redundant information in clinical notes within electronic health record (EHR) systems is ubiquitous and may negatively impact the use of these notes by clinicians, and, potentially, the efficiency of patient care delivery. Automated methods to identify redundant versus relevant new information may provide a valuable tool for clinicians to better synthesize patient information and navigate to clinically important details. In this study, we investigated the use of language models for identification of new information in inpatient notes, and evaluated our methods using expert-derived reference standards. The best method achieved precision of 0.743, recall of 0.832 and F1-measure of 0.784. The average proportion of redundant information was similar between inpatient and outpatient progress notes (76.6% (SD=17.3%) and 76.7% (SD=14.0%), respectively). Advanced practice providers tended to have higher rates of redundancy in their notes compared to physicians. Future investigation includes the addition of semantic components and visualization of new information.
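    The precision, recall and F1-measure quoted above combine in the standard way. A minimal sketch, where the token-id sets standing in for the predicted and expert-reference "new information" spans are invented for illustration:

    ```python
    def prf(predicted_new, reference_new):
        """Precision/recall/F1 of predicted 'new information' tokens
        against an expert-derived reference standard (sets of token ids)."""
        tp = len(predicted_new & reference_new)        # true positives
        precision = tp / len(predicted_new) if predicted_new else 0.0
        recall = tp / len(reference_new) if reference_new else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        return precision, recall, f1

    pred = {1, 2, 3, 5, 8}       # tokens the language model flags as new
    gold = {2, 3, 5, 8, 9, 10}   # tokens experts marked as new
    print(tuple(round(x, 3) for x in prf(pred, gold)))  # → (0.8, 0.667, 0.727)
    ```

    Anything outside the predicted set is treated as redundant, so the same function also characterizes how well a method separates the roughly 77% redundant content reported above from the clinically new remainder.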

  4. Using Language Models to Identify Relevant New Information in Inpatient Clinical Notes

    PubMed Central

    Zhang, Rui; Pakhomov, Serguei V.; Lee, Janet T.; Melton, Genevieve B.

    2014-01-01

    Redundant information in clinical notes within electronic health record (EHR) systems is ubiquitous and may negatively impact the use of these notes by clinicians, and, potentially, the efficiency of patient care delivery. Automated methods to identify redundant versus relevant new information may provide a valuable tool for clinicians to better synthesize patient information and navigate to clinically important details. In this study, we investigated the use of language models for identification of new information in inpatient notes, and evaluated our methods using expert-derived reference standards. The best method achieved precision of 0.743, recall of 0.832 and F1-measure of 0.784. The average proportion of redundant information was similar between inpatient and outpatient progress notes (76.6% (SD=17.3%) and 76.7% (SD=14.0%), respectively). Advanced practice providers tended to have higher rates of redundancy in their notes compared to physicians. Future investigation includes the addition of semantic components and visualization of new information. PMID:25954438

  5. Intelligent Analysis in the LOCATE Workspace Layout Tool

    DTIC Science & Technology

    1999-07-01

    ... function values ... Print colour cost displays. ... Figure 10. Consequence for System (Self) Model of User Query in Figure 9. ... as a model for other complex and many-featured applications. ... References: Broadbent, G. (1988). Design in ... models that provide ways of monitoring LOCATE's understanding of what the user is doing, what he or she knows and how that information might be used to ...

  6. Fundamentals of Modeling, Data Assimilation, and High-performance Computing

    NASA Technical Reports Server (NTRS)

    Rood, Richard B.

    2005-01-01

    This lecture will introduce the concepts of modeling, data assimilation and high-performance computing as they relate to the study of atmospheric composition. The lecture will work from basic definitions and will strive to provide a framework for thinking about development and application of models and data assimilation systems. It will not provide technical or algorithmic information, leaving that to textbooks, technical reports, and ultimately scientific journals. References to a number of textbooks and papers will be provided as a gateway to the literature.

  7. Leveraging the UML Metamodel: Expressing ORM Semantics Using a UML Profile

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CUYLER,DAVID S.

    2000-11-01

    Object Role Modeling (ORM) techniques produce a detailed domain model from the perspective of the business owner/customer. The typical process begins with a set of simple sentences reflecting facts about the business. The output of the process is a single model representing primarily the persistent information needs of the business. This type of model contains little, if any, reference to a targeted computerized implementation. It is a model of business entities, not of software classes. Through well-defined procedures, an ORM model can be transformed into a high quality object or relational schema.

  8. Scientific and educational recommender systems

    NASA Astrophysics Data System (ADS)

    Guseva, A. I.; Kireev, V. S.; Bochkarev, P. V.; Kuznetsov, I. A.; Philippov, S. A.

    2017-01-01

    This article discusses questions associated with the use of recommender systems in the preparation of graduates. The objective of this research is the creation of a user model for a recommender system in the sphere of science and education. A detailed review is given of current scientific and social networks for scientists and of the problem of constructing recommender systems in this area. The result of this study is a user information model for such systems. The model is presented in two versions: a full one, in the form of a semantic network, and a short one, in relational form. The relational model is a projection of the semantic network, taking into account restrictions on the number of links that characterize the number of information items (research results) with which the system user interacts.

  9. Australian Seismological Reference Model (AuSREM): crustal component

    NASA Astrophysics Data System (ADS)

    Salmon, M.; Kennett, B. L. N.; Saygin, E.

    2013-01-01

    Although Australia has been the subject of a wide range of seismological studies, these have concentrated on specific features of the continent at crustal scales and on the broad scale features in the mantle. The Australian Seismological Reference Model (AuSREM) is designed to bring together the existing information, and provide a synthesis in the form of a 3-D model that can provide the basis for future refinement from more detailed studies. Extensive studies in the last few decades provide good coverage for much of the continent, and the crustal model builds on the various data sources to produce a representative model that captures the major features of the continental structure and provides a basis for a broad range of further studies. The model is grid based with a 0.5° sampling in latitude and longitude, and is designed to be fully interpolable, so that properties can be extracted at any point. The crustal structure is built from five-layer representations of refraction and receiver function studies and tomographic information. The AuSREM crustal model is available at 1 km intervals. The crustal component makes use of prior compilations of sediment thicknesses, with cross checks against recent reflection profiling, and provides P and S wavespeed distributions through the crust. The primary information for P wavespeed comes from refraction profiles, for S wavespeed from receiver function studies. We are also able to use the results of ambient noise tomography to link the point observations into national coverage. Density values are derived using results from gravity interpretations with an empirical relation between P wavespeed and density. AuSREM is able to build on a new map of depth to Moho, which has been created using all available information including Moho picks from over 12 000 km of full crustal profiling across the continent. 
The crustal component of AuSREM provides a representative model that should be useful for modelling of seismic wave propagation and calculation of crustal corrections for tomography. Other applications include gravity studies and dynamic topography at the continental scale.
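A grid-based model that is "fully interpolable" at its 0.5° sampling can be queried at an arbitrary point with ordinary bilinear interpolation between the four surrounding grid nodes. The grid values, origin and spacing below are invented for illustration, not AuSREM data:

```python
import numpy as np

def bilinear(grid, lat0, lon0, step, lat, lon):
    """Interpolate a gridded property (e.g. P wavespeed) at (lat, lon)
    from a regular grid with origin (lat0, lon0) and spacing `step`."""
    i = (lat - lat0) / step                 # fractional row index
    j = (lon - lon0) / step                 # fractional column index
    i0, j0 = int(i), int(j)
    di, dj = i - i0, j - j0                 # weights within the cell
    return ((1 - di) * (1 - dj) * grid[i0, j0]
            + (1 - di) * dj * grid[i0, j0 + 1]
            + di * (1 - dj) * grid[i0 + 1, j0]
            + di * dj * grid[i0 + 1, j0 + 1])

g = np.array([[1.0, 2.0], [3.0, 4.0]])      # toy 2x2 grid, 0.5 deg spacing
print(bilinear(g, -30.0, 140.0, 0.5, -29.75, 140.25))  # → 2.5
```

At a cell center all four nodes contribute equally; at a node the value is returned exactly, which is the behavior expected of an interpolable reference model.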

  10. An internal reference model-based PRF temperature mapping method with Cramer-Rao lower bound noise performance analysis.

    PubMed

    Li, Cheng; Pan, Xinyi; Ying, Kui; Zhang, Qiang; An, Jing; Weng, Dehe; Qin, Wen; Li, Kuncheng

    2009-11-01

    The conventional phase difference method for MR thermometry suffers from disturbances caused by the presence of lipid protons, motion-induced error, and field drift. A signal model is presented with multi-echo gradient echo (GRE) sequence using a fat signal as an internal reference to overcome these problems. The internal reference signal model is fit to the water and fat signals by the extended Prony algorithm and the Levenberg-Marquardt algorithm to estimate the chemical shifts between water and fat which contain temperature information. A noise analysis of the signal model was conducted using the Cramer-Rao lower bound to evaluate the noise performance of various algorithms, the effects of imaging parameters, and the influence of the water:fat signal ratio in a sample on the temperature estimate. Comparison of the calculated temperature map and thermocouple temperature measurements shows that the maximum temperature estimation error is 0.614 degrees C, with a standard deviation of 0.06 degrees C, confirming the feasibility of this model-based temperature mapping method. The influence of sample water:fat signal ratio on the accuracy of the temperature estimate is evaluated in a water-fat mixed phantom experiment with an optimal ratio of approximately 0.66:1. (c) 2009 Wiley-Liss, Inc.
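    The underlying conversion from an estimated water-fat chemical-shift change to a temperature change can be sketched as below. The PRF coefficient and the baseline water-fat separation are the commonly quoted textbook values, not numbers taken from this abstract, and the sketch omits the Prony/Levenberg-Marquardt fitting used to estimate the shifts:

    ```python
    PPM_PER_DEG_C = -0.01   # commonly quoted water PRF coefficient (assumption)

    def temperature_change(shift_ppm, baseline_shift_ppm):
        """Temperature change from the change in the water-fat chemical
        shift; fat is nearly temperature-insensitive, so the change is
        attributed to the water proton resonance frequency."""
        return (shift_ppm - baseline_shift_ppm) / PPM_PER_DEG_C

    # Assume a baseline water-fat separation of about -3.5 ppm.
    print(round(temperature_change(-3.55, -3.5), 6))  # → 5.0
    ```

    Because the fat reference experiences the same field drift and bulk motion as the water signal, differencing against it cancels those disturbances, which is the motivation for the internal reference model.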

  11. Data Assimilation - Advances and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Brian J.

    2014-07-30

    This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.

  12. Structure and thermodynamics of a mixture of patchy and spherical colloids: A multi-body association theory with complete reference fluid information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bansal, Artee; Asthagiri, D.; Cox, Kenneth R.

    A mixture of solvent particles with short-range, directional interactions and solute particles with short-range, isotropic interactions that can bond multiple times is of fundamental interest in understanding liquids and colloidal mixtures. Because of multi-body correlations, predicting the structure and thermodynamics of such systems remains a challenge. Earlier, Marshall and Chapman [J. Chem. Phys. 139, 104904 (2013)] developed a theory wherein association effects due to interactions multiply the partition function for clustering of particles in a reference hard-sphere system. The multi-body effects are incorporated in the clustering process, which in their work was obtained in the absence of the bulk medium. The bulk solvent effects were then modeled approximately within a second order perturbation approach. However, their approach is inadequate at high densities and for large association strengths. Based on the idea that the clustering of solvent in a defined coordination volume around the solute is related to occupancy statistics in that defined coordination volume, we develop an approach to incorporate the complete information about hard-sphere clustering in a bulk solvent at the density of interest. The occupancy probabilities are obtained from enhanced sampling simulations, but we also develop a concise parametric form to model these probabilities using the quasichemical theory of solutions. We show that incorporating the complete reference information results in an approach that can predict the bonding state and thermodynamics of the colloidal solute for a wide range of system conditions.

  13. A Flexible and Accurate Genotype Imputation Method for the Next Generation of Genome-Wide Association Studies

    PubMed Central

    Howie, Bryan N.; Donnelly, Peter; Marchini, Jonathan

    2009-01-01

    Genotype imputation methods are now being widely used in the analysis of genome-wide association studies. Most imputation analyses to date have used the HapMap as a reference dataset, but new reference panels (such as controls genotyped on multiple SNP chips and densely typed samples from the 1,000 Genomes Project) will soon allow a broader range of SNPs to be imputed with higher accuracy, thereby increasing power. We describe a genotype imputation method (IMPUTE version 2) that is designed to address the challenges presented by these new datasets. The main innovation of our approach is a flexible modelling framework that increases accuracy and combines information across multiple reference panels while remaining computationally feasible. We find that IMPUTE v2 attains higher accuracy than other methods when the HapMap provides the sole reference panel, but that the size of the panel constrains the improvements that can be made. We also find that imputation accuracy can be greatly enhanced by expanding the reference panel to contain thousands of chromosomes and that IMPUTE v2 outperforms other methods in this setting at both rare and common SNPs, with overall error rates that are 15%–20% lower than those of the closest competing method. One particularly challenging aspect of next-generation association studies is to integrate information across multiple reference panels genotyped on different sets of SNPs; we show that our approach to this problem has practical advantages over other suggested solutions. PMID:19543373

  14. Modeling and Simulation Verification, Validation and Accreditation (VV&A): A New Undertaking for the Exploration Systems Mission Directorate

    NASA Technical Reports Server (NTRS)

    Prill, Mark E.

    2005-01-01

    The overall presentation was focused to provide, for the Verification, Validation and Accreditation (VV&A) session audience, a snapshot review of the Exploration Systems Mission Directorate's (ESMD) investigation into implementation of a modeling and simulation (M&S) VV&A program. The presentation provides some legacy ESMD reference material, including information on the then-current organizational structure, and the M&S (Simulation Based Acquisition (SBA)) focus contained therein, to provide a context for the proposed M&S VV&A approach. This reference material briefly highlights the SBA goals and objectives, and outlines FY05 M&S development and implementation consistent with the Subjective Assessment, Constructive Assessment, Operator-in-the-Loop Assessment, Hardware-in-the-Loop Assessment, and In Service Operations Assessment M&S construct, the NASA Exploration Information Ontology Model (NExIOM) data model, and integration with the Windchill-based Integrated Collaborative Environment (ICE). The presentation then addresses the ESMD team's initial conclusions regarding an M&S VV&A program, summarizes the general VV&A implementation approach anticipated, and outlines some of the recognized VV&A program challenges, all within the broader context of the overarching Integrated Modeling and Simulation (IM&S) environment at both the ESMD and Agency (NASA) levels. The presentation concludes with a status on the current M&S organization's progress to date relative to the recommended IM&S implementation activity.

  15. Point Cloud Refinement with a Target-Free Intrinsic Calibration of a Mobile Multi-Beam LIDAR System

    NASA Astrophysics Data System (ADS)

Nouira, H.; Deschaud, J. E.; Goulette, F.

    2016-06-01

LIDAR sensors are widely used in mobile mapping systems. Mobile mapping platforms allow fast acquisition in cities, for example, which would take much longer with static mapping systems. LIDAR sensors provide reliable and precise 3D information that can be used in various applications: mapping of the environment, localization of objects, and detection of changes. With recent developments, multi-beam LIDAR sensors have appeared, able to provide a large amount of data with a high level of detail. A mono-beam LIDAR sensor mounted on a mobile platform requires an extrinsic calibration, so that data acquired and registered in the sensor reference frame can be represented in the body reference frame modeling the mobile system. For a multi-beam LIDAR sensor, the calibration can be separated into two distinct parts: on one hand, an extrinsic calibration, in common with mono-beam LIDAR sensors, which gives the transformation between the sensor Cartesian reference frame and the body reference frame; on the other hand, an intrinsic calibration, which gives the relations between the beams of the multi-beam sensor. This intrinsic calibration depends on a model given by the constructor, but the model can be non-optimal, which introduces errors and noise into the acquired point clouds. In the literature, optimizations of the calibration parameters have been proposed, but they need a specific routine or environment, which can be constraining and time-consuming. In this article, we present an automatic method for improving the intrinsic calibration of a multi-beam LIDAR sensor, the Velodyne HDL-32E. The proposed approach does not need any calibration target and only uses information from the acquired point clouds, which makes it simple and fast to use. A corrected model for the Velodyne sensor is also proposed. An energy function that penalizes points far from local planar surfaces is used to optimize the proposed parameters of the corrected model, and we are able to give a confidence value for the calibration parameters found. Optimization results on both synthetic and real data are presented.
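
The planarity energy described above can be sketched as follows (our reading of the abstract, not the authors' code): fit a plane to each point's nearest neighbours and sum the squared point-to-plane distances; a calibration routine would minimise this energy over the sensor parameters.

```python
# Sketch of a planarity energy for target-free calibration (an
# assumption-laden reading of the abstract, not the authors' code).
import numpy as np

def planarity_energy(points, k=4):
    """Sum of squared distances of each point to a plane fitted
    (via PCA) to its k nearest neighbours."""
    E = 0.0
    for p in points:
        d = np.linalg.norm(points - p, axis=1)
        nbrs = points[np.argsort(d)[1:k + 1]]   # k nearest neighbours
        centroid = nbrs.mean(axis=0)
        cov = np.cov((nbrs - centroid).T)
        w, v = np.linalg.eigh(cov)
        normal = v[:, 0]                        # smallest-variance direction
        E += float(np.dot(p - centroid, normal) ** 2)
    return E

rng = np.random.default_rng(0)
xy = rng.uniform(0, 1, size=(30, 2))
flat = np.c_[xy, np.zeros(30)]                  # well-calibrated: on z = 0
noisy = np.c_[xy, rng.normal(0, 0.1, 30)]       # mis-calibrated: off-plane
e_flat, e_noisy = planarity_energy(flat), planarity_energy(noisy)
# e_flat is (numerically) zero; mis-calibration inflates the energy
```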

  16. Global daily reference evapotranspiration modeling and evaluation

    USGS Publications Warehouse

    Senay, G.B.; Verdin, J.P.; Lietzow, R.; Melesse, Assefa M.

    2008-01-01

Accurate and reliable evapotranspiration (ET) datasets are crucial in regional water and energy balance studies. Due to the complex instrumentation requirements, actual ET values are generally estimated from reference ET values by adjustment factors using coefficients for water stress and vegetation conditions, commonly referred to as crop coefficients. Until recently, the modeling of reference ET has been solely based on important weather variables collected from weather stations that are generally located in selected agro-climatic locations. Since 2001, the National Oceanic and Atmospheric Administration’s Global Data Assimilation System (GDAS) has been producing six-hourly climate parameter datasets that are used to calculate daily reference ET for the whole globe at 1-degree spatial resolution. The U.S. Geological Survey Center for Earth Resources Observation and Science has been producing daily reference ET (ETo) since 2001, and it has been used in a variety of operational hydrological models for drought and streamflow monitoring all over the world. With the increasing availability of local station-based reference ET estimates, we evaluated the GDAS-based reference ET estimates using data from the California Irrigation Management Information System (CIMIS). Daily CIMIS reference ET estimates from 85 stations were compared with GDAS-based reference ET at different spatial and temporal scales using five-year daily data from 2002 through 2006. Despite the large difference in spatial scale (point vs. ∼100 km grid cell) between the two datasets, the correlations between station-based ET and GDAS-ET were very high, ranging from more than 0.97 on a daily basis to more than 0.99 at time scales longer than 10 days. Both the temporal and spatial correspondence in trend/pattern and magnitude between the two datasets was satisfactory, suggesting the reliability of using GDAS parameter-based reference ET for regional water and energy balance studies in many parts of the world. 
While the study revealed the potential of GDAS ETo for large-scale hydrological applications, site-specific use of GDAS ETo in complex hydro-climatic regions such as coastal areas and rugged terrain may require the application of bias correction and/or disaggregation of the GDAS ETo using downscaling techniques.
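
The effect of temporal aggregation on such correlations can be reproduced with synthetic series (illustrative numbers only, not the CIMIS/GDAS data):

```python
# Synthetic station vs. gridded reference-ET series sharing a seasonal
# cycle; aggregation to 10-day means raises the correlation, as in the
# study's daily vs. >10-day comparison.  All numbers are invented.
import numpy as np

rng = np.random.default_rng(1)
days = np.arange(300)
season = 4 + 2 * np.sin(2 * np.pi * days / 365)   # mm/day seasonal cycle
station = season + rng.normal(0, 0.4, days.size)  # point measurement noise
gridded = season + rng.normal(0, 0.4, days.size)  # coarse-cell noise

def corr(a, b):
    return float(np.corrcoef(a, b)[0, 1])

def ten_day_means(x):
    return x.reshape(-1, 10).mean(axis=1)

daily_r = corr(station, gridded)
agg_r = corr(ten_day_means(station), ten_day_means(gridded))
# averaging suppresses independent noise, so agg_r > daily_r
```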

  17. From conservative to reactive transport under diffusion-controlled conditions

    NASA Astrophysics Data System (ADS)

    Babey, Tristan; de Dreuzy, Jean-Raynald; Ginn, Timothy R.

    2016-05-01

We assess the possibility of using conservative transport information, such as that contained in transit time distributions, breakthrough curves and tracer tests, to predict nonlinear fluid-rock interactions in fracture/matrix or mobile/immobile conditions. Reference simulated data are given by conservative and reactive transport simulations in several diffusive porosity structures differing in their topological organization. Reactions include nonlinear kinetically controlled dissolution and desorption. Effective Multi-Rate Mass Transfer models (MRMT) are calibrated solely on conservative transport information, without pore topology information, and provide concentration distributions on which effective reaction rates are estimated. Reference simulated reaction rates and effective reaction rates evaluated by MRMT are compared, as well as characteristic desorption and dissolution times. Although not exactly equal, these indicators remain very close regardless of the porous structure, differing at most by 0.6% and 10% for desorption and dissolution, respectively. At early times, this close agreement arises from the fine characterization of the diffusive porosity close to the mobile zone that controls fast mobile-diffusive exchanges. At intermediate to late times, concentration gradients are strongly reduced by diffusion, and reactivity can be captured by a very limited number of rates. We conclude that effective models calibrated solely on conservative transport information, like MRMT, can accurately estimate monocomponent kinetically controlled nonlinear fluid-rock interactions. Their relevance might extend to more advanced biogeochemical reactions because of the good characterization of conservative concentration distributions, even by parsimonious models (e.g., MRMT with 3-5 rates). We propose a methodology to estimate reactive transport from conservative transport in mobile-immobile conditions.
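
A minimal numerical sketch of the multi-rate mass transfer (MRMT) idea, with invented rates and capacities rather than calibrated values:

```python
# Minimal MRMT sketch: one mobile concentration exchanging with three
# immobile zones, each with a first-order rate.  Rates and capacities
# are illustrative, not values from the paper.
import numpy as np

alphas = np.array([1.0, 0.1, 0.01])   # exchange rates (1/time)
betas = np.array([0.5, 0.3, 0.2])     # immobile-zone capacity ratios

def mrmt(c_m0=1.0, dt=1e-3, steps=5000):
    c_m = c_m0
    c_im = np.zeros_like(alphas)      # immobile zones start empty
    for _ in range(steps):
        flux = alphas * (c_m - c_im)  # first-order exchange
        c_im = c_im + dt * flux
        c_m = c_m - dt * float(betas @ flux)
    return c_m, c_im

c_m, c_im = mrmt()
total = c_m + float(betas @ c_im)     # conserved: stays at c_m0
```

A handful of rates spread over decades of time scales is what lets such a parsimonious model mimic diffusive exchange.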

  18. Voxel inversion of airborne electromagnetic data

    NASA Astrophysics Data System (ADS)

    Auken, E.; Fiandaca, G.; Kirkegaard, C.; Vest Christiansen, A.

    2013-12-01

Inversion of electromagnetic data usually refers to a model space linked to the actual observation points, and for airborne surveys the spatial discretization of the model space reflects the flight lines. In contrast, geological and groundwater models most often refer to a regular voxel grid that is not correlated to the geophysical model space. This means that incorporating the geophysical data into the geological and/or hydrological modelling grids involves a spatial relocation of the models, which is in itself a subtle process where valuable information is easily lost. The integration of prior information, e.g. from boreholes, is likewise difficult when the observation points do not coincide with the position of the prior information, as is the joint inversion of airborne and ground-based surveys. We developed a geophysical inversion algorithm working directly in a voxel grid disconnected from the actual measuring points, which allows geological/hydrogeological models to be informed directly, prior information to be incorporated more easily, and different data types to be integrated in joint inversion. The new voxel model space defines the soil properties (like resistivity) on a set of nodes, and the distribution of the properties is computed everywhere by means of an interpolation function f (e.g. inverse distance or kriging). The position of the nodes is fixed during the inversion and is chosen to sample the soil taking into account topography and inversion resolution. Given this definition of the voxel model space, both 1D and 2D/3D forward responses can be computed. The 1D forward responses are computed as follows: A) a 1D model subdivision, in terms of model thicknesses and direction of the "virtual" horizontal stratification, is defined for each 1D data set; for EM soundings the "virtual" horizontal stratification is set up parallel to the topography at the sounding position. B) the "virtual" 1D models are constructed by interpolating the soil properties at the midpoints of the "virtual" layers. For 2D/3D forward responses the algorithm operates similarly, simply filling the 2D/3D meshes of the forward responses by computing the interpolation values at the centres of the mesh cells. This definition of the voxel model space allows the geophysical information to be incorporated straightforwardly into geological and/or hydrological models, simply by defining the geophysical model space on a voxel (hydro)geological grid. It also simplifies the propagation of the uncertainty of geophysical parameters into the (hydro)geological models. Furthermore, prior information from boreholes, like resistivity logs, can be applied directly to the voxel model space even if the borehole positions do not coincide with the actual observation points: the prior information is constrained to the model parameters through the interpolation function at the borehole locations. The presented algorithm is a further development of the AarhusInv program package developed at Aarhus University (formerly em1dinv), which manages both large-scale AEM surveys and ground-based data. This work has been carried out as part of the HyGEM project, supported by the Danish Council for Strategic Research under grant number DSF 11-116763.
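
The node-based interpolation at the heart of the voxel model space can be sketched with inverse-distance weighting, one of the interpolation functions the abstract mentions (node positions and resistivities invented):

```python
# Inverse-distance interpolation from fixed voxel nodes to arbitrary
# query points (e.g. midpoints of "virtual" 1D layers).  Node
# positions and resistivities are invented for illustration.
import numpy as np

nodes = np.array([[0.0, 0.0, 0.0],
                  [10.0, 0.0, 0.0],
                  [0.0, 0.0, 5.0],
                  [10.0, 0.0, 5.0]])
rho = np.array([20.0, 40.0, 100.0, 200.0])  # resistivity at nodes (ohm-m)

def idw(p, power=2.0, eps=1e-12):
    d = np.linalg.norm(nodes - p, axis=1)
    if d.min() < eps:                        # query coincides with a node
        return float(rho[np.argmin(d)])
    w = 1.0 / d ** power
    return float(w @ rho / w.sum())

# "Virtual" 1D layer midpoints under a sounding at x = 5:
midpoints = [np.array([5.0, 0.0, z]) for z in (0.5, 2.5, 4.5)]
layer_rho = [idw(p) for p in midpoints]      # deeper layers see higher rho
```

A borehole log can constrain the same nodes through the interpolation function evaluated at the borehole position, which is how prior information away from the flight lines enters the inversion.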

  19. The openEHR Java reference implementation project.

    PubMed

    Chen, Rong; Klein, Gunnar

    2007-01-01

The openEHR foundation has developed an innovative design for interoperable and future-proof Electronic Health Record (EHR) systems based on a dual model approach with a stable reference information model complemented by archetypes for specific clinical purposes. A team from Sweden has implemented all the stable specifications in the Java programming language and donated the source code to the openEHR foundation. It was adopted as the openEHR Java Reference Implementation in March 2005 and released under open source licenses. This encourages early EHR implementation projects around the world and a number of groups have already started to use this code. The early Java implementation experience has also led to the publication of the openEHR Java Implementation Technology Specification. A number of design changes to the specifications and important minor corrections have been directly initiated by the implementation project over the last two years. The Java Implementation has been important for the validation and improvement of the openEHR design specifications and provides building blocks for future EHR systems.
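
The dual-model approach can be caricatured in a few lines: a tiny generic reference-model element plus an archetype that constrains it for one clinical purpose (class and field names are invented, not openEHR classes; the real reference model is far richer):

```python
# Hypothetical miniature of the dual-model idea: a generic reference-
# model element plus an archetype constraining it for one clinical
# purpose.  Names and fields are invented, not openEHR classes.

class Element:                  # reference model: stable and generic
    def __init__(self, name, value, units=None):
        self.name, self.value, self.units = name, value, units

class Archetype:                # clinical constraint, evolves separately
    def __init__(self, name, units, lo, hi):
        self.name, self.units, self.lo, self.hi = name, units, lo, hi

    def validate(self, el):
        return (el.name == self.name and el.units == self.units
                and self.lo <= el.value <= self.hi)

body_temp = Archetype("body_temperature", "Cel", 25.0, 45.0)
ok = body_temp.validate(Element("body_temperature", 37.2, "Cel"))   # True
bad = body_temp.validate(Element("body_temperature", 99.1, "Cel"))  # False
```

The reference model stays stable while archetypes like `body_temp` can be added or revised per clinical purpose, which is the separation the dual model buys.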

  20. Mobile, Collaborative Situated Knowledge Creation for Urban Planning

    PubMed Central

    Zurita, Gustavo; Baloian, Nelson

    2012-01-01

Geo-collaboration is an emerging research area in computer science studying the way spatial, geographically referenced information and communication technologies can support collaborative activities. Scenarios in which information associated with a physical location is of paramount importance are often referred to as Situated Knowledge Creation scenarios. To date there are few computer systems supporting knowledge creation that explicitly incorporate physical context as part of the knowledge being managed in mobile face-to-face scenarios. This work presents a collaborative software application supporting visually geo-referenced knowledge creation in mobile working scenarios while the users are interacting face-to-face. The system manages data and information associated with specific physical locations for knowledge creation processes in the field, such as urban planning, identification of specific physical locations, and territorial management, using Tablet-PCs and GPS to geo-reference the data. It presents a model for developing mobile applications supporting situated knowledge creation in the field, introducing the requirements for such an application and the functionalities it should have in order to fulfill them. The paper also presents the results of utility and usability evaluations. PMID:22778639

  2. Quantifying the value of information for uncertainty reduction in chemical EOR modeling

    NASA Astrophysics Data System (ADS)

    Leray, Sarah; Yeates, Christopher; Douarche, Frédéric; Roggero, Frédéric

    2016-04-01

Reservoir modeling is a powerful tool to assess the technical and economic feasibility of chemical Enhanced Oil Recovery (EOR) methods such as the joint injection of surfactant and polymer. Laboratory recovery experiments are usually undertaken on cores to understand recovery mechanisms and to estimate properties that will later be used to build large-scale models. To capture the different processes involved in chemical EOR, models are described by a large number of parameters, which are only partially constrained by recovery experiments and additional characterizations, mainly because of cost and time restrictions or limited representativeness. Among the most uncertain properties is surfactant adsorption, which cannot be straightforwardly derived from bulk or simplified dynamic measurements (e.g. single-phase dynamic adsorption experiments), yet is critical for the economics of the process. Identifying the most informative observations (e.g. saturation scans, pressure differential, surfactant production, oil recovery) is of primary interest to compensate for deficiencies in some characterizations and to improve model robustness and predictive capability. Building a consistent set of recovery experiments that captures the recovery mechanisms is critical as well. To address these inverse-methodology issues, we create a synthetic numerical model with a well-defined set of parameter values, considered to be our reference case. This choice of model is based on a similar real data set and a broad literature review. It consists of a water-wet sandstone subject to typical surfactant-polymer injections. We first study the effect of a salinity gradient injected after a surfactant-polymer slug, as it is known to significantly improve oil recovery. We show that reaching optimal conditions of salinity gradient is a fragile balance between surfactant desorption and interfacial tension increase. This high dependence on surfactant adsorption properties indicates that two recovery tests, with and without salinity gradient, are of great interest for model inversion and characterization of surfactant adsorption. Second, we analyze our capacity to recover the reference model using an assisted history matching method to reproduce a set of synthetic core-scale experiments. To do so, we use the reference model over five configurations of chemical injection to provide baseline recovery data. Then, we consider some uncertainty on model parameters, regarding surfactant adsorption properties amongst others, leading to a total of twelve uncertain parameters. Finally, we extensively explore the parameter space to find several reasonable matches. We show that an additional sixth recovery experiment is necessary to fully constrain the model, and specifically to characterize surfactant adsorption. We also show that production data are not equally informative: pressure differential is, for instance, the least informative, while a saturation scan at the end of the polymer post-flush can greatly help the inversion. The inverse methodology carried out here has also been successfully tested with a real set of coreflood experiments.
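
The assisted-history-matching loop can be caricatured as misfit minimisation against data generated from a known reference case, mirroring the paper's synthetic design (the forward model below is invented for the sketch):

```python
# Caricature of assisted history matching: a made-up forward model
# maps (adsorption, mobility) to a recovery curve; random search looks
# for parameter sets reproducing "observed" data generated from a
# known reference case, as in the paper's synthetic-reference design.
import numpy as np

t = np.linspace(0, 1, 20)                  # scaled injected pore volumes

def forward(adsorption, mobility):
    # invented relation: adsorption lowers recovery, mobility speeds it
    return (1 - adsorption) * (1 - np.exp(-mobility * t))

observed = forward(0.3, 5.0)               # reference case = "truth"

rng = np.random.default_rng(2)
best, best_misfit = None, float("inf")
for _ in range(2000):                      # brute-force exploration
    a = rng.uniform(0.0, 0.6)
    m = rng.uniform(1.0, 10.0)
    misfit = float(np.sum((forward(a, m) - observed) ** 2))
    if misfit < best_misfit:
        best, best_misfit = (a, m), misfit
# best_misfit ends up small, but several distinct (a, m) pairs fit
# almost equally well -- the non-uniqueness extra experiments resolve
```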

  3. The SO(3)×SO(3)×U(1) Hubbard model on a square lattice in terms of c and αν fermions and deconfined η-spinons and spinons

    NASA Astrophysics Data System (ADS)

    Carmelo, J. M. P.

    2012-03-01

In this paper, a general description for the Hubbard model with nearest-neighbor transfer integral t and on-site repulsion U on a square lattice with N_a^2 ≫ 1 sites is introduced. It refers to three types of elementary objects whose occupancy configurations generate the state representations of the model's extended global SO(3)×SO(3)×U(1) symmetry recently found in Ref. [11] (Carmelo and Östlund, 2010). Such objects emerge from a suitable electron-rotated-electron unitary transformation, chosen such that rotated-electron single and double occupancy are good quantum numbers for U≠0. The advantage of the description is that it accounts for the newly found hidden U(1) symmetry in SO(3)×SO(3)×U(1)=[SU(2)×SU(2)×U(1)]/Z_2^2 beyond the well-known SO(4)=[SU(2)×SU(2)]/Z_2 (partial) global symmetry of the model. Specifically, the hidden U(1) symmetry state representations store full information on the positions of the spins of the rotated-electron singly occupied sites relative to the remaining sites. Profiting from that complementary information, for the whole U/4t>0 interaction range independent spin state representations are naturally generated in terms of spin-1/2 spinon occupancy configurations in a spin effective lattice. For all states, such an effective lattice has as many sites as spinons. This allows the extension to intermediate U/4t values of the usual large-U/4t descriptions of the spin degrees of freedom of the electrons that singly occupy sites, now in terms of the spins of the rotated electrons that singly occupy sites. The operator description introduced in this paper brings about a more suitable scenario for handling the effects of hole doping. Within it, such effects are accounted for in terms of the residual interactions of the elementary objects whose occupancy configurations generate the state representations of the charge hidden U(1) symmetry and spin SU(2) symmetry, respectively. This problem is investigated elsewhere. The most interesting physical information revealed by the description refers to the model on the subspace generated by the application of one- and two-electron operators onto zero-magnetization ground states. (This is the square-lattice quantum liquid further studied in Ref. [5] (Carmelo, 2010).) However, to access such information, one must start from the general description introduced in this paper, which refers to the model in the full Hilbert space.

  4. Visual guidance of mobile platforms

    NASA Astrophysics Data System (ADS)

    Blissett, Rodney J.

    1993-12-01

    Two systems are described and results presented demonstrating aspects of real-time visual guidance of autonomous mobile platforms. The first approach incorporates prior knowledge in the form of rigid geometrical models linking visual references within the environment. The second approach is based on a continuous synthesis of information extracted from image tokens to generate a coarse-grained world model, from which potential obstacles are inferred. The use of these techniques in workplace applications is discussed.

  5. A DDC Bibliography on Computers in Information Sciences. Volume I. Information Sciences Series.

    ERIC Educational Resources Information Center

    Defense Documentation Center, Alexandria, VA.

The unclassified and unlimited bibliography compiles references dealing specifically with the role of computers in information sciences. The volume contains 249 annotated references grouped under two major headings: Time Shared, On-Line, and Real Time Systems, and Computer Components. The references are arranged in accession number (AD-number)…

  6. 21 CFR 822.14 - May I reference information previously submitted instead of submitting it again?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false May I reference information previously submitted..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES POSTMARKET SURVEILLANCE Postmarket Surveillance Plan § 822.14 May I reference information previously submitted instead of submitting it again? Yes...

  7. 40 CFR 1042.910 - Reference materials.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Reference materials. 1042.910 Section... Other Reference Information § 1042.910 Reference materials. Documents listed in this section have been... information on the availability of this material at NARA, call 202-741-6030, or go to: http://www.archives.gov...

  8. 40 CFR 1042.910 - Reference materials.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Reference materials. 1042.910 Section... Other Reference Information § 1042.910 Reference materials. Documents listed in this section have been... information on the availability of this material at NARA, call 202-741-6030, or go to: http://www.archives.gov...

  10. What's in a Name? Interlocutors Dynamically Update Expectations about Shared Names

    PubMed Central

    Gegg-Harrison, Whitney M.; Tanenhaus, Michael K.

    2016-01-01

In order to refer using a name, speakers must believe that their addressee knows about the link between the name and the intended referent. In cases where speakers and addressees learned a subset of names together, speakers are adept at using only the names their partner knows. But speakers do not always share such learning experience with their conversational partners. In these situations, what information guides speakers' choice of referring expression? A speaker who is uncertain about a name's common ground (CG) status often uses a name and description together. This N+D form allows speakers to demonstrate knowledge of a name, and could provide, even in the absence of miscommunication, useful evidence to the addressee regarding the speaker's knowledge. In cases where knowledge of one name is associated with knowledge of other names, this could provide indirect evidence regarding knowledge of other names that could support generalizations used to update beliefs about CG. Using Bayesian approaches to language processing as a guiding framework, we predict that interlocutors can use their partner's choice of referring expression, in particular their use of an N+D form, to generate more accurate beliefs regarding their partner's knowledge of other names. In Experiment 1, we find that domain experts are able to use their partner's referring expression choices to generate more accurate estimates of CG. In Experiment 2, we find that interlocutors are able to infer from a partner's use of an N+D form which other names that partner is likely to know or not know. Our results suggest that interlocutors can use the information conveyed in their partner's choice of referring expression to make generalizations that contribute to more accurate beliefs about what is shared with their partner, and further, that models of CG for reference need to account not just for the status of referents, but also for the status of means of referring to those referents. PMID:26955361
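
The Bayesian belief update the authors propose can be sketched with invented likelihoods: hearing a bare name versus an N+D form shifts the estimated probability that the partner knows the domain's names.

```python
# Invented-likelihood sketch of the proposed inference: a partner who
# knows a domain's names tends to use bare names (N); uncertainty
# pushes speakers toward name-plus-description (N+D).  All numbers
# are made up for illustration.

prior_expert = 0.5              # P(partner knows the domain's names)
p_bare = {"expert": 0.8, "novice": 0.2}   # P(uses bare name | type)

def posterior_expert(used_bare_name):
    like_e = p_bare["expert"] if used_bare_name else 1 - p_bare["expert"]
    like_n = p_bare["novice"] if used_bare_name else 1 - p_bare["novice"]
    num = like_e * prior_expert
    return num / (num + like_n * (1 - prior_expert))

after_bare = posterior_expert(True)    # 0.8: bare name raises the belief
after_nd = posterior_expert(False)     # 0.2: N+D form lowers it
```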

  11. [Instrumental, directive, and affective communication in hospital leaflets].

    PubMed

    Vasconcellos-Silva, Paulo Roberto; Uribe Rivera, Francisco Javier; Castiel, Luis David

    2003-01-01

    This study focuses on the typical semantic systems extracted from hospital staff communicative resources which attempt to validate information as an "object" to be transferred to patients. We describe the models of textual communication in 58 patient information leaflets from five hospital units in Brazil, gathered from 1996 to 2002. Three categories were identified, based on the theory of speech acts (Austin, Searle, and Habermas): 1) cognitive-instrumental utterances: descriptions by means of technical terms validated by self-referred, incomplete, or inaccessible argumentation, with an implicit educational function; 2) technical-directive utterances: self-referred (to the context of the source domains), with a shifting of everyday acts to a technical terrain with a disciplinary function and impersonal features; and 3) expressive modulations: need for inter-subjective connections to strengthen bonds of trust and a tendency to use childish arguments. We conclude that the three categories displayed: fragmentary sources; assumption of univocal messages and invariable use of information (idealized motivations and interests, apart from individualized perspectives); and assumption of universal interests as generators of knowledge.

  12. Selection of appropriate reference genes for RT-qPCR analysis in a streptozotocin-induced Alzheimer's disease model of cynomolgus monkeys (Macaca fascicularis).

    PubMed

    Park, Sang-Je; Kim, Young-Hyun; Lee, Youngjeon; Kim, Kyoung-Min; Kim, Heui-Soo; Lee, Sang-Rae; Kim, Sun-Uk; Kim, Sang-Hyun; Kim, Ji-Su; Jeong, Kang-Jin; Lee, Kyoung-Min; Huh, Jae-Won; Chang, Kyu-Tae

    2013-01-01

Reverse transcription quantitative real-time polymerase chain reaction (RT-qPCR) has been widely used to quantify relative gene expression because of the specificity, sensitivity, and accuracy of this technique. In order to obtain reliable gene expression data from RT-qPCR experiments, it is important to utilize optimal reference genes for the normalization of target gene expression under varied experimental conditions. Previously, we developed and validated a novel icv-STZ cynomolgus monkey model for Alzheimer's disease (AD) research. However, in order to enhance the reliability of this disease model, appropriate reference genes must be selected to allow meaningful analysis of the gene expression levels in the icv-STZ cynomolgus monkey brain. In this study, we assessed the expression stability of 9 candidate reference genes in 2 matched-pair brain samples (5 regions) of control cynomolgus monkeys and those that had received intracerebroventricular injection of streptozotocin (icv-STZ). Three well-known analytical programs (geNorm, NormFinder, and BestKeeper) were used to choose suitable reference genes from the total sample group, control group, and icv-STZ group. Combination analysis of the 3 different programs clearly indicated that the ideal reference genes are RPS19 and YWHAZ in the total sample group, GAPDH and RPS19 in the control group, and ACTB and GAPDH in the icv-STZ group. Additionally, we validated the normalization accuracy of the most appropriate reference genes (RPS19 and YWHAZ) by comparison with the least stable gene (TBP) using quantification of the APP and MAPT genes in the total sample group. To the best of our knowledge, this research is the first study to identify and validate the appropriate reference genes in cynomolgus monkey brains. These findings provide useful information for future studies involving the expression of target genes in the cynomolgus monkey.
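
The simplest stability screen behind tools of this kind can be sketched by ranking candidates on the standard deviation of their Ct values (geNorm and NormFinder use more elaborate pairwise and model-based measures; the Ct values below are fabricated):

```python
# Rank candidate reference genes by the standard deviation of their Ct
# values across samples (lower = more stable), the simplest screen in
# the BestKeeper family.  Ct values are fabricated, not study data.
import statistics

ct = {   # candidate gene -> Ct values over the same sample set
    "RPS19": [18.1, 18.3, 18.0, 18.2, 18.1],
    "YWHAZ": [21.0, 21.2, 20.9, 21.1, 21.0],
    "TBP":   [26.0, 27.5, 25.1, 28.0, 26.9],
}

stability = {gene: statistics.stdev(v) for gene, v in ct.items()}
ranked = sorted(stability, key=stability.get)   # most stable first
# TBP, varying by whole cycles, ranks last -- as it did in the study
```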

  13. Detailed clinical models: a review.

    PubMed

    Goossen, William; Goossen-Baremans, Anneke; van der Zel, Michael

    2010-12-01

Due to the increasing use of electronic patient records and other health care information technology, we see an increase in requests to utilize these data. A high level of standardization is required during the gathering of these data in the clinical context in order to use them for analyses. Detailed Clinical Models (DCM) have been created for this purpose, and several initiatives have been implemented in various parts of the world to create standardized models. This paper presents a review of DCM. Two types of analyses are presented: one comparing DCM against health care information architectures, and a second, bottom-up approach from concept analysis to representation. In addition, core parts of the draft ISO standard 13972 on DCM are used, such as clinician involvement, data element specification, modeling, meta information, and repository and governance. Six initiatives were selected: Intermountain Healthcare, 13606/OpenEHR Archetypes, Clinical Templates, Clinical Contents Models, Health Level 7 templates, and Dutch Detailed Clinical Models. Each model selected was reviewed for its overall development, involvement of clinicians, use of data types, code bindings, expressing semantics, modeling, meta information, use of repository and governance. Using both a top-down and bottom-up approach to comparison reveals many commonalities and differences between initiatives. Important differences include the use of, or lack of, a reference model and the expressiveness of models. Applying clinical data element standards facilitates the use of conceptual DCM models in different technical representations.

  14. Using models to manage systems subject to sustainability indicators

    USGS Publications Warehouse

    Hill, M.C.

    2006-01-01

    Mathematical and numerical models can provide insight into sustainability indicators using relevant simulated quantities, which are referred to here as predictions. To be useful, several concerns need to be considered. Four are discussed here: (a) the mathematical and numerical accuracy of the model; (b) the accuracy of the data used in model development; (c) the information that observations provide about aspects of the model important to predictions of interest, as measured using sensitivity analysis; and (d) the existence of plausible alternative models for a given system. The four issues are illustrated using examples from conservative and transport modelling, and using conceptual arguments. Results suggest that ignoring these issues can produce misleading conclusions.
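    Concern (c) is often quantified with scaled sensitivities of predictions to parameters. A toy central-difference sketch; the two-parameter "model" here is invented purely to illustrate the calculation:

```python
def scaled_sensitivity(model, params, j, h=1e-6):
    """Dimensionless scaled sensitivity of a prediction to parameter j:
    b_j * dy/db_j, approximated by central differences."""
    up, dn = list(params), list(params)
    up[j] += h
    dn[j] -= h
    dydb = (model(up) - model(dn)) / (2 * h)
    return params[j] * dydb

# Toy prediction standing in for a simulated quantity of interest:
# y = 2*a + 3*b
model = lambda p: 2 * p[0] + 3 * p[1]
s_a = scaled_sensitivity(model, [1.0, 1.0], 0)  # ~2.0
s_b = scaled_sensitivity(model, [1.0, 1.0], 1)  # ~3.0
# the prediction is more sensitive to b than to a at this point
```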

  15. Emotional Effects on University Choice Behavior: The Influence of Experienced Narrators and Their Characteristics

    PubMed Central

    Callejas-Albiñana, Ana I.; Callejas-Albiñana, Fernando E.; Martínez-Rodríguez, Isabel

    2016-01-01

    This study analyzes the influence that experienced users of university resources might have as narrative sources of information for other students in the process of choosing their schools. Informative videos about the benefits of studying at the university provide a reference model. In these videos, a group of young people present their views and explain their reasons for choosing the university in which they are pursuing their degrees; the various narrators detail all the resources available. This study investigates whether the individual identifiers of these narrators (e.g., gender, age, physical appearance, nonverbal gestures such as smiling, posture) influence perceptions of the credibility of the information they provide. Among a sample of 150 students in their last year of pre-university training, the results demonstrate that the students' ability to identify with the narrators provides information and arouses emotions that inform their perceptions of reliability and therefore their consumption choices. None of these predictors appear to serve as determinants that can be generalized, but if emotional attitudes in response to narratives about the topic (i.e., the university) are positive, then they prompt a change in attitude toward that reference topic too. PMID:27252664

  16. 75 FR 43097 - Airworthiness Directives; The Boeing Company Model 757 Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-23

    ... must be sealed for lightning strike protection. Relevant Service Information AD 2008-23-19 referred to... additional fasteners in the main fuel tanks must be sealed for lightning strike protection. The Federal... bundles inside the left and right equipment cooling system bays, on the left and right rear spars, and on...

  17. The Evolution of SCORM to Tin Can API: Implications for Instructional Design

    ERIC Educational Resources Information Center

    Lindert, Lisa; Su, Bude

    2016-01-01

    Integrating and documenting formal and informal learning experiences is challenging using the current Shareable Content Object Reference Model (SCORM) eLearning standard, which limits the media and data that are obtained from eLearning. In response to SCORM's limitations, corporate, military, and academic institutions have collaborated to develop…

  18. Chapter 13: Tools for analysis

    Treesearch

    William Elliot; Kevin Hyde; Lee MacDonald; James McKean

    2007-01-01

    This chapter presents a synthesis of current computer modeling tools that are, or could be, adopted for use in evaluating the cumulative watershed effects of fuel management. The chapter focuses on runoff, soil erosion and slope stability predictive tools. Readers should refer to chapters on soil erosion and stability for more detailed information on the physical...

  19. Technology-Mediated Advising and Student Support: An Institutional Self-Assessment

    ERIC Educational Resources Information Center

    Community College Research Center, Teachers College, Columbia University, 2017

    2017-01-01

    The rubric aims to help community colleges and broad-access four-year colleges assess their work on technology-mediated advising and student support, sometimes referred to as Integrated Planning and Advising for Student Success (iPASS). This work involves moving from a model of advising focused on information provision or course registration to…

  20. Development, Implementation, and Analysis of an Environmental Simulation Information Reference Library and Archive (ESIRLA)

    DTIC Science & Technology

    1997-12-01

    of the DoD environmental science community to identify cloud modeling and other environmental capabilities that support or could potentially support...benefit of the DoD environmental science community. STC determined the detailed requirements for weather effects products and decision aids for specific Air Force operational electro-optical systems.

  1. Transactional, Cooperative, and Communal: Relating the Structure of Engineering Engagement Programs with the Nature of Partnerships

    ERIC Educational Resources Information Center

    Thompson, Julia D.; Jesiek, Brent K.

    2017-01-01

    This paper examines how the structural features of engineering engagement programs (EEPs) are related to the nature of their service-learning partnerships. "Structure" refers to formal and informal models, processes, and operations adopted or used to describe engagement programs, while "nature" signifies the quality of…

  2. On the Delusiveness of Adopting a Common Space for Modeling IR Objects: Are Queries Documents?

    ERIC Educational Resources Information Center

    Bollmann-Sdorra, Peter; Raghavan, Vjay V.

    1993-01-01

    Proposes that document space and query space have different structures in information retrieval and discusses similarity measures, term independence, and linear structure. Examples are given using the retrieval functions of dot-product, the cosine measure, the coefficient of Jaccard, and the overlap function. (Contains 28 references.) (LRW)
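    The four retrieval functions mentioned can be written down compactly for binary (set-of-terms) representations of queries and documents. A small sketch, assuming unweighted terms:

```python
import math

def dot_product(q, d):
    """Number of terms shared by query and document."""
    return len(q & d)

def cosine(q, d):
    """Shared terms normalized by the vector lengths."""
    return len(q & d) / math.sqrt(len(q) * len(d))

def jaccard(q, d):
    """Coefficient of Jaccard: intersection over union."""
    return len(q & d) / len(q | d)

def overlap(q, d):
    """Overlap function: intersection over the smaller set."""
    return len(q & d) / min(len(q), len(d))

query = {"information", "retrieval"}
doc = {"retrieval", "model", "space"}
# one shared term -> dot 1, Jaccard 1/4, overlap 1/2
```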

  3. Model (Undocumented) Minorities and "Illegal" Immigrants: Centering Asian Americans and US Carcerality in Undocumented Student Discourse

    ERIC Educational Resources Information Center

    Lachica Buenavista, Tracy

    2018-01-01

    As the numbers of immigrant apprehensions, detentions, and deportations increase, and in context of anti-immigrant sentiment, education scholars must better contend with the way that carcerality affects undocumented student experiences. Carcerality refers to social and political systems that formally and informally promote discipline, punishment,…

  4. Looking at a contrast object before speaking boosts referential informativeness, but is not essential.

    PubMed

    Davies, Catherine; Kreysa, Helene

    2017-07-01

    Variation in referential form has traditionally been accounted for by theoretical frameworks focusing on linguistic and discourse features. Despite the explosion of interest in eye tracking methods in psycholinguistics, the role of visual scanning behaviour in informative reference production is yet to be comprehensively investigated. Here we examine the relationship between speakers' fixations to relevant referents and the form of the referring expressions they produce. Overall, speakers were fully informative across simple and (to a lesser extent) more complex displays, providing appropriately modified referring expressions to enable their addressee to locate the target object. Analysis of contrast fixations revealed that looking at a contrast object boosts but is not essential for full informativeness. Contrast fixations which take place immediately before speaking provide the greatest boost. Informative referring expressions were also associated with later speech onsets than underinformative ones. Based on the finding that fixations during speech planning facilitate but do not fully predict informative referring, direct visual scanning is ruled out as a prerequisite for informativeness. Instead, pragmatic expectations of informativeness may play a more important role. Results are consistent with a goal-based link between eye movements and language processing, here applied for the first time to production processes. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Eastern Baltic region vs. Western Europe: modelling age related changes in the pubic symphysis and the auricular surface.

    PubMed

    Jatautis, Šarūnas; Jankauskas, Rimantas

    2018-02-01

    Objectives. The present study addresses the following two main questions: a) Is the pattern of skeletal ageing observed in well-known western European reference collections applicable to modern eastern Baltic populations, or are population-specific standards needed? b) What are the consequences for estimating the age-at-death distribution in the target population when differences in the estimates from reference data are not taken into account? Materials and methods. The dataset consists of a modern Lithuanian osteological reference collection (LORC), which is the only collection of this type in the eastern Baltic countries (n = 381), and two major western European reference collections, Coimbra (n = 264) and Spitalfields (n = 239). The age-related changes were evaluated using the scoring systems of Suchey-Brooks (Brooks & Suchey 1990) and Lovejoy et al. (1985), and were modelled via regression models for multinomial responses. A controlled experiment based on simulations and the Rostock Manifesto estimation protocol (Wood et al. 2002) was then carried out to assess the effect of using estimates from different reference samples and different regression models on estimates of the age-at-death distribution in the hypothetical target population. Results. The following key results were obtained in this study. a) The morphological alterations in the pubic symphysis were much faster among women than among men at comparable ages in all three reference samples. In contrast, we found no strong evidence in any of the reference samples that sex is an important factor explaining the rate of changes in the auricular surface. b) The rate of ageing in the pubic symphysis seems to be similar across the three reference samples, but there is little evidence of a similar pattern in the auricular surface. That is, the estimated rate of age-related changes in the auricular surface was much faster in the LORC and the Coimbra samples than in the Spitalfields sample.
    c) The results of the simulations showed that differences in the estimates from the reference data result in noticeably different age-at-death distributions in the target population. Thus, a degree of bias may be expected if estimates from the western European reference data are used to collect information on ages at death in the eastern Baltic region based on the changes in the auricular surface. d) Moreover, the bias is expected to be more pronounced if the fitted regression model improperly describes the reference data. Conclusions. Differences in the timing of age-related changes in skeletal traits are to be expected among European reference samples, and cannot be ignored when seeking to reliably estimate an age-at-death distribution in the target population. This form of bias should be taken into consideration in further studies of skeletal samples from the eastern Baltic region.
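    Estimation protocols of the Rostock Manifesto kind recover the target age-at-death distribution from reference-based conditional stage probabilities P(stage | age). A minimal EM sketch in Python; the two-age, two-stage numbers below are invented for illustration, not taken from the collections studied:

```python
def estimate_age_distribution(p_stage_given_age, stage_counts, iters=500):
    """EM for the age-class mixture weights w_a, given reference
    probabilities p(stage|age) and observed stage counts in the target."""
    n_ages = len(p_stage_given_age)
    total = sum(stage_counts)
    w = [1.0 / n_ages] * n_ages          # start from a uniform prior
    for _ in range(iters):
        new_w = [0.0] * n_ages
        for s, n_s in enumerate(stage_counts):
            # responsibility of each age class for stage s
            denom = sum(w[a] * p_stage_given_age[a][s] for a in range(n_ages))
            for a in range(n_ages):
                new_w[a] += n_s * w[a] * p_stage_given_age[a][s] / denom
        w = [x / total for x in new_w]
    return w

# Invented reference data: rows are age classes, columns are trait stages
p = [[0.9, 0.1],   # young: mostly stage 0
     [0.2, 0.8]]   # old:   mostly stage 1
w = estimate_age_distribution(p, stage_counts=[41, 59])
# converges towards the weights (0.3, 0.7) that generated these counts
```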

  6. Conclusions about children's reporting accuracy for energy and macronutrients over multiple interviews depend on the analytic approach for comparing reported information to reference information.

    PubMed

    Baxter, Suzanne Domel; Smith, Albert F; Hardin, James W; Nichols, Michele D

    2007-04-01

    Validation study data are used to illustrate that conclusions about children's reporting accuracy for energy and macronutrients over multiple interviews (i.e., time) depend on the analytic approach for comparing reported and reference information: conventional, which disregards the accuracy of reported items and amounts, or reporting-error-sensitive, which classifies reported items as matches (eaten) or intrusions (not eaten), and amounts as corresponding or overreported. Children were observed eating school meals on 1 day (n=12), or 2 (n=13) or 3 (n=79) nonconsecutive days separated by ≥25 days, and were interviewed on the morning after each observation day about intake the previous day. Reference (observed) and reported information were transformed to energy and macronutrients (i.e., protein, carbohydrate, and fat) and compared. The outcome measures for energy and each macronutrient were report rates (reported/reference), correspondence rates (genuine accuracy measures), and inflation ratios (error measures); mixed-model analyses were used. Using the conventional approach for analyzing energy and macronutrients, report rates did not vary systematically over interviews (all four P values >0.61). Using the reporting-error-sensitive approach for analyzing energy and macronutrients, correspondence rates increased over interviews (all four P values <0.04), indicating that reporting accuracy improved over time; inflation ratios decreased, although not significantly, over interviews, also suggesting that reporting accuracy improved over time. Correspondence rates were lower than report rates, indicating that reporting accuracy was worse than implied by conventional measures. When analyzed using the reporting-error-sensitive approach, children's dietary reporting accuracy for energy and macronutrients improved over time, but the conventional approach masked improvements and overestimated accuracy.
The reporting-error-sensitive approach is recommended when analyzing data from validation studies of dietary reporting accuracy for energy and macronutrients.
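    A simplified reading of the three measures can be sketched as follows; the measure names follow the abstract, but the item-level energy amounts and the exact accounting rules are invented for illustration:

```python
def reporting_measures(reference, reported):
    """Report rate, correspondence rate and inflation ratio for one meal.
    `reference` and `reported` map food items to energy (e.g. kcal)."""
    ref_total = sum(reference.values())
    # conventional measure: total reported over total observed
    report_rate = sum(reported.values()) / ref_total
    # corresponding amount: for matched items, the part of the report
    # that is backed by observation
    corresponding = sum(min(amt, reference[item])
                        for item, amt in reported.items() if item in reference)
    # overreported amount: intrusions plus any excess on matched items
    overreported = sum(amt - min(amt, reference.get(item, 0.0))
                       for item, amt in reported.items())
    return (report_rate, corresponding / ref_total, overreported / ref_total)

observed = {"apple": 100.0, "milk": 150.0}    # eaten (kcal), per observer
reported = {"apple": 120.0, "cookie": 80.0}   # child's recall
rates = reporting_measures(observed, reported)
# (0.8, 0.4, 0.4): the conventional report rate of 0.8 hides the
# intrusion and excess that the error-sensitive measures expose
```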

  7. A multi-site cognitive task analysis for biomedical query mediation.

    PubMed

    Hruby, Gregory W; Rasmussen, Luke V; Hanauer, David; Patel, Vimla L; Cimino, James J; Weng, Chunhua

    2016-09-01

    To apply cognitive task analyses of the Biomedical query mediation (BQM) processes for EHR data retrieval at multiple sites towards the development of a generic BQM process model. We conducted semi-structured interviews with eleven data analysts from five academic institutions and one government agency, and performed cognitive task analyses on their BQM processes. A coding schema was developed through iterative refinement and used to annotate the interview transcripts. The annotated dataset was used to reconstruct and verify each BQM process and to develop a harmonized BQM process model. A survey was conducted to evaluate the face and content validity of this harmonized model. The harmonized process model is hierarchical, encompassing tasks, activities, and steps. The face validity evaluation concluded the model to be representative of the BQM process. In the content validity evaluation, out of the 27 tasks for BQM, 19 meet the threshold for semi-valid, including 3 fully valid: "Identify potential index phenotype," "If needed, request EHR database access rights," and "Perform query and present output to medical researcher", and 8 are invalid. We aligned the goals of the tasks within the BQM model with the five components of the reference interview. The similarity between the process of BQM and the reference interview is promising and suggests the BQM tasks are powerful for eliciting implicit information needs. We contribute a BQM process model based on a multi-site study. This model promises to inform the standardization of the BQM process towards improved communication efficiency and accuracy. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  8. A Multi-Site Cognitive Task Analysis for Biomedical Query Mediation

    PubMed Central

    Hruby, Gregory W.; Rasmussen, Luke V.; Hanauer, David; Patel, Vimla; Cimino, James J.; Weng, Chunhua

    2016-01-01

    Objective To apply cognitive task analyses of the Biomedical query mediation (BQM) processes for EHR data retrieval at multiple sites towards the development of a generic BQM process model. Materials and Methods We conducted semi-structured interviews with eleven data analysts from five academic institutions and one government agency, and performed cognitive task analyses on their BQM processes. A coding schema was developed through iterative refinement and used to annotate the interview transcripts. The annotated dataset was used to reconstruct and verify each BQM process and to develop a harmonized BQM process model. A survey was conducted to evaluate the face and content validity of this harmonized model. Results The harmonized process model is hierarchical, encompassing tasks, activities, and steps. The face validity evaluation concluded the model to be representative of the BQM process. In the content validity evaluation, out of the 27 tasks for BQM, 19 meet the threshold for semi-valid, including 3 fully valid: “Identify potential index phenotype,” “If needed, request EHR database access rights,” and “Perform query and present output to medical researcher”, and 8 are invalid. Discussion We aligned the goals of the tasks within the BQM model with the five components of the reference interview. The similarity between the process of BQM and the reference interview is promising and suggests the BQM tasks are powerful for eliciting implicit information needs. Conclusions We contribute a BQM process model based on a multi-site study. This model promises to inform the standardization of the BQM process towards improved communication efficiency and accuracy. PMID:27435950

  9. Information on center characteristics as costs' determinants in multicenter clinical trials: is modeling center effect worth the effort?

    PubMed

    Petrinco, Michele; Pagano, Eva; Desideri, Alessandro; Bigi, Riccardo; Ghidina, Marco; Ferrando, Alberto; Cortigiani, Lauro; Merletti, Franco; Gregori, Dario

    2009-01-01

    Several methodological problems arise when health outcomes and resource utilization are collected at different sites. To avoid misleading conclusions in multi-center economic evaluations, the center effect needs to be taken into adequate consideration. The aim of this article is to compare several models that make use of different amounts of information about the enrolling center. To model the association of total medical costs with the levels of two sets of covariates, one at the patient and one at the center level, we considered four statistical models, based on the Gamma model in the class of Generalized Linear Models with a log link, which use different amounts of information on the enrolling centers. The models were applied to Cost of Strategies after Myocardial Infarction data, an international randomized trial on the costs of uncomplicated acute myocardial infarction (AMI). The simple center-effect adjustment based on a single random effect results in a more conservative estimation of the parameters compared with approaches that make use of deeper information on the centers' characteristics. This study shows, with reference to a real multicenter trial, that center information cannot be neglected and should be collected and included in the analysis, preferably in combination with one or more random effects, thereby also accounting for the heterogeneity among centers due to unobserved center characteristics.
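    The base model here, a Gamma GLM with log link, has an especially simple iteratively reweighted least squares (IRLS) update: for this link/variance pair the working weights are identically 1, so each step is ordinary least squares on the working response. A from-scratch sketch with one covariate and noise-free, invented cost data (real fits would use a GLM library and add the center-level terms):

```python
import math

def fit_gamma_loglink(x, y, iters=100):
    """IRLS for a Gamma GLM with log link: intercept plus one covariate.
    Working response z = eta + (y - mu) / mu; weights are all 1."""
    a, b = math.log(sum(y) / len(y)), 0.0      # start at the null model
    for _ in range(iters):
        eta = [a + b * xi for xi in x]
        mu = [math.exp(e) for e in eta]
        z = [e + (yi - m) / m for e, yi, m in zip(eta, y, mu)]
        # ordinary least squares of z on x
        xbar = sum(x) / len(x)
        zbar = sum(z) / len(z)
        sxx = sum((xi - xbar) ** 2 for xi in x)
        sxz = sum((xi - xbar) * (zi - zbar) for xi, zi in zip(x, z))
        b = sxz / sxx
        a = zbar - b * xbar
    return a, b

# Noise-free illustration: costs generated from exp(0.5 + 0.3 * x)
xs = [0.0, 1.0, 2.0, 3.0]
ys = [math.exp(0.5 + 0.3 * xi) for xi in xs]
a, b = fit_gamma_loglink(xs, ys)   # converges to (0.5, 0.3)
```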

  10. Defining the Core Archive Data Standards of the International Planetary Data Alliance (IPDA)

    NASA Technical Reports Server (NTRS)

    Hughes, J. Steven; Crichton, Dan; Beebe, Reta; Guinness, Ed; Heather, David; Zender, Joe

    2007-01-01

    A goal of the International Planetary Data Alliance (IPDA) is to develop a set of archive data standards that enable the sharing of scientific data across international agencies and missions. To help achieve this goal, the IPDA steering committee initiated a six-month project to write requirements for, and draft an information model based on, the Planetary Data System (PDS) archive data standards. The project had a special emphasis on data formats. A set of use case scenarios were first developed, from which a set of requirements were derived for the IPDA archive data standards. The special emphasis on data formats was addressed by identifying data formats that have been used by PDS nodes and other agencies in the creation of successful data sets for the PDS. The dependency of the IPDA information model on the PDS archive standards required the compilation of a formal specification of the archive standards currently in use by the PDS. An ontology modelling tool was chosen to capture the information model from various sources, including the Planetary Science Data Dictionary [1] and the PDS Standards Reference [2]. Exports of the modelling information from the tool database were used to produce the information model document using an object-oriented notation for presenting the model. The tool exports can also be used for software development and are directly accessible by semantic web applications.

  11. Evolution of biological sequences implies an extreme value distribution of type I for both global and local pairwise alignment scores.

    PubMed

    Bastien, Olivier; Maréchal, Eric

    2008-08-07

    Confidence in pairwise alignments of biological sequences, obtained by various methods such as Blast or Smith-Waterman, is critical for automatic analyses of genomic data. Two statistical models have been proposed. In the asymptotic limit of long sequences, the Karlin-Altschul model is based on the computation of a P-value, assuming that the number of high-scoring matching regions above a threshold is Poisson distributed. Alternatively, the Lipman-Pearson model is based on the computation of a Z-value from a random score distribution obtained by a Monte-Carlo simulation. Z-values allow the deduction of an upper bound on the P-value (1/Z-value^2) following the TULIP theorem. The simulated Z-value distribution is known to fit a Gumbel law. This remarkable property had not been demonstrated and had no obvious biological support. We built a model of evolution of sequences based on aging, as meant in Reliability Theory, using the fact that the amount of information shared between an initial sequence and the sequences in its lineage (i.e., mutual information in Information Theory) is a decreasing function of time. This quantity is simply measured by a sequence alignment score. In systems aging, the failure rate is related to the system's longevity. The system can be a machine with structured components, or a living entity or population. "Reliability" refers to the ability to operate properly according to a standard. Here, the "reliability" of a sequence refers to its ability to conserve a sufficient functional level at the folded and maturated protein level (positive selection pressure). Homologous sequences were considered as systems 1) having a high redundancy of information, reflected by the magnitude of their alignment scores, and 2) whose components are the amino acids that can independently be damaged by random DNA mutations.
    From these assumptions, we deduced that the information shared at each amino acid position evolves at a constant rate, corresponding to the information hazard rate, and that pairwise sequence alignment scores should follow a Gumbel distribution, whose parameters find some theoretical rationale. In particular, one parameter corresponds to the information hazard rate. The extreme value distribution of alignment scores, assessed from high-scoring segment pairs following the Karlin-Altschul model, can thus also be deduced from Reliability Theory applied to molecular sequences. It reflects the redundancy of information between homologous sequences under functional conservative pressure. This model also provides a link between concepts of biological sequence analysis and of systems biology.
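    The Z-value construction and the TULIP bound P ≤ 1/Z² can be sketched without a full Smith-Waterman implementation; here a plain ungapped identity count stands in for the alignment score, and the sequences are invented:

```python
import random

def identity_score(s1, s2):
    """Stand-in alignment score: ungapped positional identity count."""
    return sum(a == b for a, b in zip(s1, s2))

def z_value(s1, s2, shuffles=200, seed=0):
    """Monte-Carlo Z-value (Lipman-Pearson style): compare the real score
    with scores against shuffled versions of the second sequence."""
    rng = random.Random(seed)
    real = identity_score(s1, s2)
    scores = []
    for _ in range(shuffles):
        shuffled = list(s2)
        rng.shuffle(shuffled)
        scores.append(identity_score(s1, shuffled))
    mean = sum(scores) / len(scores)
    var = sum((s - mean) ** 2 for s in scores) / len(scores)
    return (real - mean) / var ** 0.5

seq = "ACDEFGHIKLMNPQRSTVWY" * 2          # a toy 40-residue "protein"
z = z_value(seq, seq)
p_upper = 1.0 / z ** 2                   # TULIP theorem: P-value <= 1/Z^2
# identical sequences give a large Z, hence a tiny upper bound on P
```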

  12. A brief introduction to mixed effects modelling and multi-model inference in ecology

    PubMed Central

    Donaldson, Lynda; Correa-Cano, Maria Eugenia; Goodwin, Cecily E.D.

    2018-01-01

    The use of linear mixed effects models (LMMs) is increasingly common in the analysis of biological data. Whilst LMMs offer a flexible approach to modelling a broad range of data types, ecological data are often complex and require complex model structures, and the fitting and interpretation of such models is not always straightforward. The ability to achieve robust biological inference requires that practitioners know how and when to apply these tools. Here, we provide a general overview of current methods for the application of LMMs to biological data, and highlight the typical pitfalls that can be encountered in the statistical modelling process. We tackle several issues regarding methods of model selection, with particular reference to the use of information theory and multi-model inference in ecology. We offer practical solutions and direct the reader to key references that provide further technical detail for those seeking a deeper understanding. This overview should serve as a widely accessible code of best practice for applying LMMs to complex biological problems and model structures, and in doing so improve the robustness of conclusions drawn from studies investigating ecological and evolutionary questions. PMID:29844961

  13. A brief introduction to mixed effects modelling and multi-model inference in ecology.

    PubMed

    Harrison, Xavier A; Donaldson, Lynda; Correa-Cano, Maria Eugenia; Evans, Julian; Fisher, David N; Goodwin, Cecily E D; Robinson, Beth S; Hodgson, David J; Inger, Richard

    2018-01-01

    The use of linear mixed effects models (LMMs) is increasingly common in the analysis of biological data. Whilst LMMs offer a flexible approach to modelling a broad range of data types, ecological data are often complex and require complex model structures, and the fitting and interpretation of such models is not always straightforward. The ability to achieve robust biological inference requires that practitioners know how and when to apply these tools. Here, we provide a general overview of current methods for the application of LMMs to biological data, and highlight the typical pitfalls that can be encountered in the statistical modelling process. We tackle several issues regarding methods of model selection, with particular reference to the use of information theory and multi-model inference in ecology. We offer practical solutions and direct the reader to key references that provide further technical detail for those seeking a deeper understanding. This overview should serve as a widely accessible code of best practice for applying LMMs to complex biological problems and model structures, and in doing so improve the robustness of conclusions drawn from studies investigating ecological and evolutionary questions.
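    The information-theoretic multi-model inference referenced here typically ranks candidate models by AIC and converts ΔAIC values into Akaike weights. A minimal sketch with made-up AIC values (fitting the LMMs themselves would require a mixed-models library):

```python
import math

def akaike_weights(aics):
    """Akaike weights: relative support for each candidate model,
    w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2),
    where delta_i is the AIC difference from the best model."""
    best = min(aics)
    rel = [math.exp(-(a - best) / 2.0) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Made-up AICs for three candidate mixed-model structures
weights = akaike_weights([100.0, 102.0, 110.0])
# the best model carries most of the weight; the third is negligible
```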

  14. BRENDA in 2013: integrated reactions, kinetic data, enzyme function data, improved disease classification: new options and contents in BRENDA.

    PubMed

    Schomburg, Ida; Chang, Antje; Placzek, Sandra; Söhngen, Carola; Rother, Michael; Lang, Maren; Munaretto, Cornelia; Ulas, Susanne; Stelzer, Michael; Grote, Andreas; Scheer, Maurice; Schomburg, Dietmar

    2013-01-01

    The BRENDA (BRaunschweig ENzyme DAtabase) enzyme portal (http://www.brenda-enzymes.org) is the main information system of functional biochemical and molecular enzyme data and provides access to seven interconnected databases. BRENDA contains 2.7 million manually annotated data on enzyme occurrence, function, kinetics and molecular properties. Each entry is connected to a reference and the source organism. Enzyme ligands are stored with their structures and can be accessed via their names, synonyms or via a structure search. FRENDA (Full Reference ENzyme DAta) and AMENDA (Automatic Mining of ENzyme DAta) are based on text mining methods and represent a complete survey of PubMed abstracts with information on enzymes in different organisms, tissues or organelles. The supplemental database DRENDA provides more than 910 000 new EC number-disease relations in more than 510 000 references from automatic search and a classification of enzyme-disease-related information. KENDA (Kinetic ENzyme DAta), a new amendment, extracts and displays kinetic values from PubMed abstracts. The integration of the EnzymeDetector offers an automatic comparison, evaluation and prediction of enzyme function annotations for prokaryotic genomes. The biochemical reaction database BKM-react contains non-redundant enzyme-catalysed and spontaneous reactions and was developed to facilitate and accelerate the construction of biochemical models.

  15. The Role of Metarepresentation in the Production and Resolution of Referring Expressions.

    PubMed

    Horton, William S; Brennan, Susan E

    2016-01-01

    In this paper we consider the potential role of metarepresentation-the representation of another representation, or as commonly considered within cognitive science, the mental representation of another individual's knowledge and beliefs-in mediating definite reference and common ground in conversation. Using dialogues from a referential communication study in which speakers conversed in succession with two different addressees, we highlight ways in which interlocutors work together to successfully refer to objects, and achieve shared conceptualizations. We briefly review accounts of how such shared conceptualizations could be represented in memory, from simple associations between label and referent, to "triple co-presence" representations that track interlocutors in an episode of referring, to more elaborate metarepresentations that invoke theory of mind, mutual knowledge, or a model of a conversational partner. We consider how some forms of metarepresentation, once created and activated, could account for definite reference in conversation by appealing to ordinary processes in memory. We conclude that any representations that capture information about others' perspectives are likely to be relatively simple and subject to the same kinds of constraints on attention and memory that influence other kinds of cognitive representations.

  16. A post-Bertalanffy Systemics Healthcare Competitive Framework Proposal.

    PubMed

    Fiorini, Rodolfo A; Santacroce, Giulia F

    2014-01-01

    The health information community can take advantage of a new evolutive categorization cybernetic framework. A systemic concept of principles organizing nature is proposed. It can be used as a multiscaling reference framework to conveniently develop successful and competitive antifragile systems and new information management strategies in advanced healthcare organizations (HO) and high-reliability organizations (HRO). The expected impacts are multifarious and quite articulated at different system scale levels; the major one is that, for the first time, biomedical engineering ideal system categorization levels can be matched exactly to practical system modeling interaction styles, with no paradigmatic operational ambiguity and no information loss.

  17. A computer science approach to managing security in health care.

    PubMed

    Asirelli, P; Braccini, G; Caramella, D; Coco, A; Fabbrini, F

    2002-09-01

    The security of electronic medical information is very important for health care organisations, which have to ensure confidentiality, integrity and availability of the information provided. This paper will briefly outline the legal measures adopted by the European Community, Italy and the United States to regulate the use and disclosure of medical records. It will then go on to highlight how information technology can help to address these issues with special reference to the management of organisation policies. To this end, we will present a modelling example for the security policy of a radiological department.

  18. Biologically inspired information theory: Adaptation through construction of external reality models by living systems.

    PubMed

    Nakajima, Toshiyuki

    2015-12-01

    Higher animals act in the world using their external reality models to cope with the uncertain environment. Organisms that have not developed such information-processing organs may also have external reality models built in the form of their biochemical, physiological, and behavioral structures, acquired by natural selection through successful models constructed internally. Organisms subject to illusions would fail to survive in the material universe. How can organisms, or living systems in general, determine the external reality from within? This paper starts with a phenomenological model, in which the self constitutes a reality model developed through the mental processing of phenomena. Then, the it-from-bit concept is formalized using a simple mathematical model. For this formalization, my previous work on an algorithmic process is employed to constitute symbols referring to the external reality, called the inverse causality, with additional improvements to the previous work. Finally, as an extension of this model, the cognizers system model is employed to describe the self as one of many material entities in a world, each of which acts as a subject by responding to the surrounding entities. This model is used to propose a conceptual framework of information theory that can deal with both the qualitative (semantic) and quantitative aspects of the information involved in biological processes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. The Then and Now of Reference Conditions in Streams of the Central Plains

    NASA Astrophysics Data System (ADS)

    Huggins, D.; Angelo, R.; Baker, D. S.; Welker, G.

    2005-05-01

    Models of contemporary and pre-settlement reference conditions were constructed for streams that once drained the tallgrass prairies of Iowa, Nebraska, Kansas and Missouri (e.g., the Western Corn Belt Plains ecoregion), and for streams within the heart of the mixed grass prairie (e.g., the Southwestern Tablelands ecoregion). Data on watershed, habitat, chemistry and biology compiled for existing reference streams (least or minimally impacted systems) were used to characterize contemporary reference conditions. Contemporary reference conditions within these two prairie regions are contrasted against hypothetical pre-settlement conditions using information from the best streams (upper 25%) of the current reference population, historical accounts, museum records, natural heritage programs, the Public Land Survey and current remote sensing data. Similar comparisons were made between historical and current reference conditions for the Southwestern Tablelands located in central Kansas and Oklahoma. Much of this region remains in mixed grass prairie, has limited hydrological alterations (e.g., impoundments, dewatering), and has low human and livestock densities. Within the tablelands, these factors have preserved reference conditions that resemble historic conditions. Qualitative and quantitative comparisons indicate that many regions within the Central Plains require caution when using "least disturbed" reference streams and conditions to identify regional biological integrity goals relative to the Clean Water Act.

  20. The Reality of Reference: Responsibilities and Competencies for Current Reference Librarians

    ERIC Educational Resources Information Center

    Saunders, Laura

    2012-01-01

    Academic reference services face great challenges as they cope with the pace of technological change, competition from other information service providers, and tight budgets. In fact, some critics suggest that reference services are no longer relevant or necessary as more information moves online. This study examines a nationwide survey that…

  1. 76 FR 33031 - Agency Information Collection (Request for Contact Information) Activity Under OMB Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-07

    ... for Contact Information) Activity Under OMB Review AGENCY: Veterans Benefits Administration.... Please refer to ``OMB Control No. 2900-0660'' in any correspondence. FOR FURTHER INFORMATION CONTACT... refer to ``OMB Control No. 2900-0660.'' SUPPLEMENTARY INFORMATION: Title: Request for Contact...

  2. Moving towards ecosystem-based fisheries management: Options for parameterizing multi-species biological reference points

    NASA Astrophysics Data System (ADS)

    Moffitt, Elizabeth A.; Punt, André E.; Holsman, Kirstin; Aydin, Kerim Y.; Ianelli, James N.; Ortiz, Ivonne

    2016-12-01

    Multi-species models can improve our understanding of the effects of fishing so that it is possible to make informed and transparent decisions regarding fishery impacts. Broad application of multi-species assessment models to support ecosystem-based fisheries management (EBFM) requires the development and testing of multi-species biological reference points (MBRPs) for use in harvest-control rules. We outline and contrast several possible MBRPs that range from those that can be readily used in current frameworks to those belonging to a broader EBFM context. We demonstrate each of the possible MBRPs using a simple two species model, motivated by walleye pollock (Gadus chalcogrammus) and Pacific cod (Gadus macrocephalus) in the eastern Bering Sea, to illustrate differences among methods. The MBRPs we outline each differ in how they approach the multiple, potentially conflicting management objectives and trade-offs of EBFM. These options for MBRPs allow multi-species models to be readily adapted for EBFM across a diversity of management mandates and approaches.
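
    The single-species reference points that MBRPs generalize can be made concrete with a toy example. The sketch below uses the textbook Schaefer surplus-production model, not the two-species pollock-cod model of the paper, and the parameter values are hypothetical:

```python
# Minimal sketch: classic single-species reference points from the
# Schaefer surplus-production model  dB/dt = r*B*(1 - B/K) - F*B.
# Parameter values below are hypothetical, not estimates for pollock or cod.

def schaefer_reference_points(r: float, K: float) -> dict:
    """Standard Schaefer reference points for intrinsic growth rate r
    and carrying capacity K."""
    return {"B_MSY": K / 2.0, "F_MSY": r / 2.0, "MSY": r * K / 4.0}

print(schaefer_reference_points(r=0.5, K=1000.0))
# {'B_MSY': 500.0, 'F_MSY': 0.25, 'MSY': 125.0}
```

    Multi-species reference points replace these closed forms with quantities computed from a coupled model (e.g., equilibrium biomasses under joint fishing and predation mortality), which is why their definition involves trade-offs among species.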

  3. Constrained Maximum Likelihood Estimation for Model Calibration Using Summary-level Information from External Big Data Sources

    PubMed Central

    Chatterjee, Nilanjan; Chen, Yi-Hau; Maas, Paige; Carroll, Raymond J.

    2016-01-01

    Information from various public and private data sources of extremely large sample sizes are now increasingly available for research purposes. Statistical methods are needed for utilizing information from such big data sources while analyzing data from individual studies that may collect more detailed information required for addressing specific hypotheses of interest. In this article, we consider the problem of building regression models based on individual-level data from an “internal” study while utilizing summary-level information, such as information on parameters for reduced models, from an “external” big data source. We identify a set of very general constraints that link internal and external models. These constraints are used to develop a framework for semiparametric maximum likelihood inference that allows the distribution of covariates to be estimated using either the internal sample or an external reference sample. We develop extensions for handling complex stratified sampling designs, such as case-control sampling, for the internal study. Asymptotic theory and variance estimators are developed for each case. We use simulation studies and a real data application to assess the performance of the proposed methods in contrast to the generalized regression (GR) calibration methodology that is popular in the sample survey literature. PMID:27570323
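
    The core idea, linking an internal full model to an external reduced-model summary through a parameter constraint, can be illustrated with equality-constrained least squares for linear regression. This is a simplified stand-in for the paper's semiparametric maximum likelihood framework; the data and the external slope value below are simulated assumptions:

```python
import numpy as np

# Hedged sketch: calibrate an "internal" full regression Y ~ X1 + X2 to
# summary-level information from an "external" big data source, namely the
# reduced-model slope of Y on X1 alone.  For linear models the constraint is
#   beta1 + beta2 * Cov(X1, X2) / Var(X1) = theta_ext,
# with the covariate moments estimated from the internal sample.

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)               # correlated covariates
y = 1.0 + 2.0 * x1 + 1.0 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

theta_ext = 2.5                                  # assumed external summary
r = np.cov(x1, x2)[0, 1] / np.var(x1)            # internal moment estimate
a = np.array([0.0, 1.0, r])                      # constraint: a @ beta = theta_ext

# Closed-form equality-constrained least-squares adjustment of the OLS fit
XtX_inv = np.linalg.inv(X.T @ X)
lam = (a @ beta_ols - theta_ext) / (a @ XtX_inv @ a)
beta_con = beta_ols - XtX_inv @ a * lam

print(a @ beta_con)  # ~2.5: the external constraint is satisfied
```

    The constrained estimate trades a little internal-sample fit for consistency with the (much more precise) external summary, which is the intuition behind the efficiency gains reported in the paper.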

  4. Evaluation of Used Fuel Disposition in Clay-Bearing Rock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jové Colón, Carlos F.; Weck, Philippe F.; Sassani, David H.

    2014-08-01

    Radioactive waste disposal in shale/argillite rock formations has been widely considered given its desirable isolation properties (low permeability), geochemically reduced conditions, anomalous groundwater pressures, and widespread geologic occurrence. Clay/shale rock formations are characterized by their high content of clay minerals such as smectites and illites, where diffusive transport and chemisorption phenomena predominate. These, in addition to low permeability, are key attributes of shale to impede radionuclide mobility. Shale host media have been comprehensively studied in international nuclear waste repository programs as part of underground research laboratory (URL) programs in Switzerland, France, Belgium, and Japan. These investigations, in some cases a decade or more long, have produced a large but fundamental body of information spanning from site characterization data (geological, hydrogeological, geochemical, geomechanical) to controlled experiments on the engineered barrier system (EBS) (barrier clay and seals materials). Evaluation of nuclear waste disposal in shale formations in the USA was conducted in the late 1970s and mid-1980s. Most of these studies evaluated the potential for shale to host a nuclear waste repository, but not at the programmatic level of the URLs in international repository programs. This report covers various R&D work and capabilities relevant to disposal of heat-generating nuclear waste in shale/argillite media. Integration and cross-fertilization of these capabilities will be utilized in the development and implementation of the shale/argillite reference case planned for FY15. Disposal R&D activities under the UFDC in the past few years have produced state-of-the-art modeling capabilities for coupled Thermal-Hydrological-Mechanical-Chemical (THMC) processes, used fuel degradation (source term), and thermodynamic modeling and database development to evaluate generic disposal concepts.
The THMC models have been developed for shale repositories, leveraging in large part the information garnered in URLs and laboratory data, to test and demonstrate model prediction capability and to accurately represent the behavior of the EBS and the natural (barrier) system (NS). In addition, experimental work to improve our understanding of clay barrier interactions and TM couplings at high temperatures is key to evaluating thermal effects resulting from relatively high heat loads from waste, and the extent of sacrificial zones in the EBS. To assess the latter, experiments and modeling approaches have provided important information on the stability and fate of barrier materials under high heat loads. This information is central to the assessment of thermal limits and the implementation of the reference case when constraining EBS properties and the repository layout (e.g., waste package and drift spacing). This report comprises various parts, each describing R&D activities applicable to shale/argillite media: for example, progress made on modeling and experimental approaches to analyze physical and chemical interactions affecting clay in the EBS, the NS, and used nuclear fuel (source term) in support of R&D objectives. It also describes the development of a reference case for shale/argillite media. The accomplishments of these activities are summarized as follows: development of a reference case for shale/argillite; investigation of reactive transport and coupled THM processes in the EBS (FY14); update on experimental activities on buffer/backfill interactions at elevated pressure and temperature; thermodynamic database development (evaluation strategy, modeling tools, first-principles modeling of clay, and sorption database assessment); and the ANL mixed potential model for used fuel degradation, applied to argillite and crystalline rock environments.

  5. [Psychoactive drug advertising: analysis of scientific information].

    PubMed

    Mastroianni, Patrícia C; Noto, Ana Regina; Galduróz, José Carlos F

    2008-06-01

    According to the World Health Organization, medicinal drug promotion should be reliable, accurate, truthful, informative, balanced, up-to-date and capable of substantiation. The objective of the present study was to review psychoactive drug advertisements directed at physicians, assessing the consistency of the information with the cited references and the accessibility of those references. Data were collected in the city of Araraquara, Southeastern Brazil, in 2005. A total of 152 drug advertisements, citing 304 references, were collected and reviewed. References were requested directly from pharmaceutical companies' customer services and searched in the UNESP (Ibict, Athenas) and BIREME (SciELO, PubMed, free-access indexed journals) library networks and CAPES journals. Advertisement statements were checked against references using content analysis. Of all references cited in the advertisements studied, 66.7% were accessed. Of 639 promotional statements identified, 346 (54%) were analyzed. The analysis showed that 67.7% of promotional statements in the advertisements were consistent with their references, while the remainder were either partially consistent or inconsistent. In the material analyzed, an average of 2.5 references (range: 1-28) was cited per advertisement. In the text body, 639 pieces of information were identified that were clearly associated with at least one cited reference (an average of 3.5 pieces of information per advertisement). The study results evidenced difficult access to the references. Messages on efficacy, safety and cost, among others, are not always supported by scientific studies. There is a need for regulatory changes and effective monitoring of drug promotional materials.

  6. Assessment of TDEM data sensitivity to changes in geoelectric structure as a result of saltwater intrusion

    NASA Astrophysics Data System (ADS)

    Nenna, V.; Knight, R. J.

    2011-12-01

    Pressure from increasing population as well as agricultural and industrial use of freshwater coastal aquifers makes these groundwater resources increasingly vulnerable to saltwater intrusion. Effective management strategies are required to protect these aquifers from overuse and salination. However, development and implementation of these strategies is often complicated by limited information about the complex hydrogeologic structures, properties and processes that govern groundwater flow and saltwater intrusion. To justify the cost of acquiring additional information, water managers need to demonstrate that the value of the acquired information, in terms of the ability to make a decision, exceeds the cost. Traditional hydrologic measurements from wells can give accurate information on hydrogeologic properties, but they are costly and spatially limited. In this study, we propose the use of time-domain electromagnetic (TDEM) methods as a non-invasive alternative to traditional hydrologic measurements for characterizing saltwater intrusion in an unconfined aquifer in Northern California. The aquifer system in this region consists of the unconfined aquifer and an underlying confined freshwater aquifer, which are separated by a clay layer. At our research site, the water in the unconfined aquifer is saline in places, but the underlying, confined aquifer shows no evidence of saltwater intrusion. Water managers require information about the hydraulic connectivity of these two aquifers, as well as the extent of saltwater intrusion in the unconfined aquifer to mitigate the potential for saltwater intrusion into the confined freshwater aquifer. Prior to October 2007, four monitoring wells were drilled approximately 100 m inland from the coast and spanning roughly 300 m from south to north. Wells were drilled to depths between 280 m and 460 m. During construction, lithology information and drilling samples were collected on 1.5 m intervals. 
Induction logs were also collected, giving the formation conductivity approximately 1 m from the well casing. We use the existing hydrologic data to create a reference case 1D-layered conductivity model at each well location. Using the EM1DTM code developed by UBC-GIF, we simulate the TDEM response to the reference model at 30 time gates. We then investigate the sensitivity of the TDEM measurement by forward modeling the TDEM responses to a large number of model realizations that are consistent with the lithologic and induction logs from each of the four monitoring wells. For each model realization, we draw a layer thickness from a uniform distribution and a conductivity value for each layer from a normal distribution centered on the values from the reference case. We then examine the sensitivity of TDEM data to changes in the hydrogeologic properties. From these models, we evaluate the reliability of TDEM data as a means of determining the integrity and thickness of the clay layer as well as the salinity of the unconfined aquifer. We use these measures of the reliability of TDEM data in assessing the value of the derived information to decision makers as they consider alternatives for developing management strategies for the groundwater system.

  7. Exchanging honest employment references: tiptoeing between defamation and negligent hiring.

    PubMed

    McConnell, Charles R

    2007-01-01

    In present day reference checking, many of the same organizations that seek as much information as possible about people they wish to hire resist giving out more than a bare minimum of information to other organizations. The strongest force driving this minimal reference information release is fear of legal action taken because of something said about an individual ("defamation," supposedly). Many employers seem so frightened of being sued for libel or slander that they share nothing of substance, usually not realizing that in supposedly protecting themselves against defamation charges, they are increasing the risk of negligent hiring charges. However, truthful reference information can be provided with minimal risk if it is provided in good faith, given only to those who have a legitimate need to know, is strictly job related, and is not communicated maliciously. References must always be provided objectively with information verifiable in personnel files.

  8. Personhood, dementia policy and the Irish National Dementia Strategy.

    PubMed

    Hennelly, Niamh; O'Shea, Eamon

    2017-01-01

    Personhood and its realisation in person-centred care is part of the narrative, if not always the reality, of care for people with dementia. This paper examines how personhood is conceptualised and actualised in Ireland through a content analysis of organisational and individual submissions from stakeholders in the development of the Irish National Dementia Strategy, followed by an examination of the Strategy itself. The organisational submissions are further categorised into dementia care models. A structural analysis of the Strategy examines its principles, actions and outcomes in relation to personhood. Of the 72 organisational and individual submissions received in the formulation of the Strategy, 61% contained references to personhood and its synonyms. Of the 35 organisational submissions, 40% fit a biomedical model, 31% a social model and 29% a biopsychosocial model. The Strategy contains one direct reference to personhood and 33 to personhood synonyms. Half of these references were contained within its key principles and objectives; none were associated with priority actions or outcomes. While stakeholders value personhood and the Strategy identifies personhood as an overarching principle, clearer direction on how personhood and person-centred care can be supported in practice and through regulation is necessary in Ireland. The challenge, therefore, is to provide the information, knowledge, incentives and resources for personhood to take hold in dementia care in Ireland.

  9. Rain/No-Rain Identification from Bispectral Satellite Information using Deep Neural Networks

    NASA Astrophysics Data System (ADS)

    Tao, Y.

    2016-12-01

    Satellite-based precipitation estimation products have the advantages of high resolution and global coverage. However, they still suffer from insufficient accuracy. Accurate precipitation estimation from satellite data depends on two key elements: sufficient precipitation information in the satellite observations and proper methodologies to extract that information effectively. This study applies state-of-the-art machine learning methodologies to bispectral satellite information for rain/no-rain detection. Specifically, we use deep neural networks to extract features from the infrared and water vapor channels and connect them to precipitation identification. To evaluate the effectiveness of the methodology, we first apply it to infrared data only (Model DL-IR only), the most commonly used input for satellite-based precipitation estimation. We then incorporate water vapor data (Model DL-IR + WV) to further improve prediction performance. The radar Stage IV dataset is used as the ground measurement for parameter calibration. The operational product Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks Cloud Classification System (PERSIANN-CCS) is used as a reference to compare the performance of both models in both winter and summer seasons. The experiments show significant improvement for both models in precipitation identification. The overall performance gains in the Critical Success Index (CSI) over the verification periods are 21.60% for Model DL-IR only and 43.66% for Model DL-IR + WV, compared to PERSIANN-CCS. Moreover, specific case studies show that the water vapor channel information and the deep neural networks effectively help recover a large number of missing precipitation pixels under warm clouds while reducing false alarms under cold clouds.
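
    The Critical Success Index used to compare the models above is computed from a binary rain/no-rain contingency table; correct negatives do not enter, which makes CSI a stringent score for rare events like precipitation occurrence. A minimal sketch (the counts below are hypothetical, not values from the study):

```python
# CSI = hits / (hits + misses + false_alarms), from a 2x2 rain/no-rain
# contingency table.  Correct negatives (no rain predicted, none observed)
# are deliberately excluded from the score.

def critical_success_index(hits: int, misses: int, false_alarms: int) -> float:
    denom = hits + misses + false_alarms
    return hits / denom if denom else 0.0

# Hypothetical counts for illustration only
print(critical_success_index(hits=60, misses=25, false_alarms=15))  # 0.6
```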

  10. Consistency Analysis of Genome-Scale Models of Bacterial Metabolism: A Metamodel Approach

    PubMed Central

    Ponce-de-Leon, Miguel; Calle-Espinosa, Jorge; Peretó, Juli; Montero, Francisco

    2015-01-01

    Genome-scale metabolic models usually contain inconsistencies that manifest as blocked reactions and gap metabolites. To detect recurrent inconsistencies in metabolic models, a large-scale analysis was performed using a previously published dataset of 130 genome-scale models. The results showed that a large number of reactions (~22%) are blocked in all the models where they are present. To unravel the nature of such inconsistencies, a metamodel was constructed by joining the 130 models into a single network. This metamodel was manually curated using the unconnected-modules approach and was then used as a reference network to perform gap-filling on each individual genome-scale model. Finally, a set of 36 models that had not been considered during the construction of the metamodel was used, as a proof of concept, to extend the metamodel with new biochemical information and to assess its impact on gap-filling results. The analysis performed on the metamodel supported the following conclusions: 1) the recurrent inconsistencies found in the models were already present in the metabolic database used during the reconstruction process; 2) inconsistencies in a metabolic database can propagate to the reconstructed models; 3) there are reactions not manifested as blocked which are active as a consequence of some classes of artifacts; and 4) the results of automatic gap-filling are highly dependent on the consistency and completeness of the metamodel or metabolic database used as the reference network. In conclusion, consistency analysis should be applied to metabolic databases in order to detect and fill gaps as well as to detect and remove artifacts and redundant information. PMID:26629901
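
    Blocked-reaction detection of the kind described above is typically posed as a linear program: a reaction is blocked if its maximum feasible steady-state flux is zero under S v = 0 and the flux bounds. A minimal sketch on a hypothetical four-reaction network (not one of the 130 published models), using scipy:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network.  Metabolites: A, B, C; irreversible reactions, 0 <= v <= 10:
#   R1: -> A,   R2: A -> B,   R3: B -> ,   R4: A -> C
# C has no consuming reaction (a gap metabolite), so R4 should be blocked.
S = np.array([
    [1, -1,  0, -1],   # A
    [0,  1, -1,  0],   # B
    [0,  0,  0,  1],   # C
])
names = ["R1", "R2", "R3", "R4"]

blocked = []
for i, name in enumerate(names):
    c = np.zeros(S.shape[1])
    c[i] = -1.0  # linprog minimizes, so maximize v_i by minimizing -v_i
    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]),
                  bounds=[(0, 10)] * S.shape[1], method="highs")
    if res.status == 0 and -res.fun < 1e-9:
        blocked.append(name)

print(blocked)  # ['R4']
```

    Reversible reactions would additionally require minimizing each flux (flux variability analysis); the irreversible toy case needs only the maximization.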

  11. Estimating the ROI on Implementation of RFID at the Ammunition Storage Warehouse and the 40th Supply Depot: KVA as a Methodology

    DTIC Science & Technology

    2009-12-01

    Balanced Scorecard CAPM Capital Asset Pricing Model DIS Defense Information System DoD Department of...Measurement Tool (PMT) is the Balanced Scorecard (BSC) based on critical success factors and key performance indicators. The MND has referred to Jung’s...authors can replicate the methodology for multiple projects to generate a portfolio of projects. Similar to the Capital Asset Pricing Model (CAPM) or

  12. Research on Crowdsourcing Emergency Information Extraction Based on Events' Frame

    NASA Astrophysics Data System (ADS)

    Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi

    2018-01-01

    At present, common information extraction methods cannot accurately extract structured emergency event information, general information retrieval tools cannot completely identify emergency geographic information, and neither approach provides an accurate assessment of its extraction results. This paper therefore proposes an emergency information collection technique based on an event framework, intended to solve the problem of emergency information extraction. It mainly comprises an emergency information extraction model (EIEM), a complete address recognition method (CARM), and an accuracy evaluation model of emergency information (AEMEI). The EIEM extracts emergency information in structured form and compensates for the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and a shortest-path algorithm to join toponym pieces into a full address. AEMEI evaluates the extracted emergency event results and summarizes the advantages and disadvantages of the event framework. Experiments show that the event frame technique can solve the problem of emergency information extraction and provides reference cases for other applications. When an emergency disaster is about to occur, the relevant departments can query data on past emergencies and make arrangements ahead of time for defense and disaster reduction. The technique can reduce casualties and property damage nationally and worldwide, which is of great significance to the state and society.

  13. Determination of reference ranges for elements in human scalp hair.

    PubMed

    Druyan, M E; Bass, D; Puchyr, R; Urek, K; Quig, D; Harmon, E; Marquardt, W

    1998-06-01

    Expected values, reference ranges, or reference limits are necessary to enable clinicians to apply analytical chemical data in the delivery of health care. Determination of reference ranges is not straightforward in terms of either selecting a reference population or performing statistical analysis. In light of logistical, scientific, and economic obstacles, it is understandable that clinical laboratories often combine approaches in developing health-associated reference values. A laboratory may choose to: 1. Validate reference ranges of other laboratories, published data from clinical research, or both, through comparison with patients' test data. 2. Base the laboratory's reference values on statistical analysis of results from specimens assayed by the clinical reference laboratory itself. 3. Adopt standards or recommendations of regulatory agencies and governmental bodies. 4. Initiate population studies to validate transferred reference ranges or to determine them anew. Effects of external contamination and anecdotal information from clinicians may be considered. The clinical utility of hair analysis is well accepted for some elements; for others, it remains in the realm of clinical investigation. This article elucidates an approach for the establishment of reference ranges for elements in human scalp hair. Observed levels of analytes in hair specimens from both our laboratory's total patient population and a physician-defined healthy American population have been evaluated. Examination of levels of elements often associated with toxicity serves to exemplify the process of determining reference ranges in hair. In addition, the approach serves as a model for setting reference ranges for analytes in a variety of matrices.
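
    Deriving reference values from a laboratory's own results, as in the second option above, commonly uses a nonparametric central 95% interval (2.5th to 97.5th percentiles). A minimal sketch with simulated data (the element, distribution, and values are assumptions, not real hair measurements):

```python
import numpy as np

# Nonparametric reference interval: central 95% of a reference population.
# Trace-element concentrations are often right-skewed, so a lognormal is a
# plausible stand-in here; the parameters are purely illustrative.
rng = np.random.default_rng(42)
element_ppm = rng.lognormal(mean=0.0, sigma=0.5, size=5000)

lower, upper = np.percentile(element_ppm, [2.5, 97.5])
print(f"reference range: {lower:.2f}-{upper:.2f} ppm")
```

    Guideline-based practice typically also requires a minimum reference-sample size (on the order of 120 subjects) before percentile limits are considered reliable.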

  14. Variational Iterative Refinement Source Term Estimation Algorithm Assessment for Rural and Urban Environments

    NASA Astrophysics Data System (ADS)

    Delle Monache, L.; Rodriguez, L. M.; Meech, S.; Hahn, D.; Betancourt, T.; Steinhoff, D.

    2016-12-01

    It is necessary to accurately estimate the initial source characteristics in the event of an accidental or intentional release of a Chemical, Biological, Radiological, or Nuclear (CBRN) agent into the atmosphere. Accurate estimation of the source characteristics is important because they are often unknown, and Atmospheric Transport and Dispersion (AT&D) models rely heavily on these estimates to create hazard assessments. To correctly assess the source characteristics in an operational environment where time is critical, the National Center for Atmospheric Research (NCAR) has developed a Source Term Estimation (STE) method known as the Variational Iterative Refinement STE Algorithm (VIRSA). VIRSA consists of a combination of modeling systems. These systems include an AT&D model, its corresponding STE model, a Hybrid Lagrangian-Eulerian Plume Model (H-LEPM), and its mathematical adjoint model. In an operational scenario where we have information regarding the infrastructure of a city, the AT&D model used is the Urban Dispersion Model (UDM), and when using this model in VIRSA we refer to the system as uVIRSA. In all other scenarios, where the city infrastructure information is not readily available, the AT&D model used is the Second-order Closure Integrated PUFF model (SCIPUFF), and the system is referred to as sVIRSA. VIRSA was originally developed using SCIPUFF 2.4 for the Defense Threat Reduction Agency and integrated into the Hazard Prediction and Assessment Capability and the Joint Program for Information Systems Joint Effects Model. The results discussed here are the verification and validation of the upgraded system with SCIPUFF 3.0 and the newly implemented UDM capability. To verify uVIRSA and sVIRSA, synthetic concentration observation scenarios were created in urban and rural environments, and the results of this verification are shown.
Finally, we validate the STE performance of uVIRSA using scenarios from the Joint Urban 2003 (JU03) experiment, which was held in Oklahoma City and also validate the performance of sVIRSA using scenarios from the FUsing Sensor Integrated Observing Network (FUSION) Field Trial 2007 (FFT07), held at Dugway Proving Grounds in rural Utah.

  15. Cognitive models of pilot categorization and prioritization of flight-deck information

    NASA Technical Reports Server (NTRS)

    Jonsson, Jon E.; Ricks, Wendell R.

    1995-01-01

    In the past decade, automated systems on modern commercial flight decks have increased dramatically. Pilots now regularly interact and share tasks with these systems. This interaction has led human factors research to direct more attention to the pilot's cognitive processing and mental model of the information flow occurring on the flight deck. The experiment reported herein investigated how pilots mentally represent and process information typically available during flight. Fifty-two commercial pilots participated in tasks that required them to provide similarity ratings for pairs of flight-deck information and to prioritize this information under two contextual conditions. Pilots processed the information along three cognitive dimensions. These dimensions included the flight function and the flight action that the information supported and how frequently pilots refer to the information. Pilots classified the information as aviation, navigation, communications, or systems administration information. Prioritization results indicated a high degree of consensus among pilots, while scaling results revealed two dimensions along which information is prioritized. Pilot cognitive workload for flight-deck tasks and the potential for using these findings to operationalize cognitive metrics are evaluated. Such measures may be useful additions for flight-deck human performance evaluation.

  16. 32 CFR 2700.1 - References.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... INFORMATION REGULATIONS Introduction § 2700.1 References. (a) Executive Order 12065, “National Security Information,” June 28, 1978, (hereinafter E.O. 12065). (b) Information Security Oversight Office, Directive No. 1, “National Security Information,” October 2, 1978, (hereinafter ISOO Directive No. 1). ...

  17. Field Guide to Plant Model Systems

    PubMed Central

    Chang, Caren; Bowman, John L.; Meyerowitz, Elliot M.

    2016-01-01

    For the past several decades, advances in plant development, physiology, cell biology, and genetics have relied heavily on the model (or reference) plant Arabidopsis thaliana. Arabidopsis resembles other plants, including crop plants, in many but by no means all respects. Study of Arabidopsis alone provides little information on the evolutionary history of plants, evolutionary differences between species, plants that survive in different environments, or plants that access nutrients and photosynthesize differently. Empowered by the availability of large-scale sequencing and new technologies for investigating gene function, many new plant models are being proposed and studied. PMID:27716506

  18. Reference and Information Services: An Introduction. Third Edition. Library and Information Science Text Series.

    ERIC Educational Resources Information Center

    Bopp, Richard E., Ed.; Smith, Linda C., Ed.

    Like the first two editions, this third edition is designed primarily to provide the beginning student of library and information science with an overview both of the concepts and processes behind today's reference services and of the most important sources consulted in answering common types of reference questions. The first 12 chapters deal with…

  19. Education: A Guide to Reference and Information Sources. Second Edition. Reference Sources in the Social Sciences Series.

    ERIC Educational Resources Information Center

    O'Brien, Nancy Patricia

    The purpose of this guide is to provide information about the key reference and information resources in the field of education. Sources include items published from 1990 through 1998, with selective inclusion of significant or unique works published prior to 1990. The guide is divided into 14 categories that reflect different aspects of…

  20. Back to the Future: Have Remotely Sensed Digital Elevation Models Improved Hydrological Parameter Extraction?

    NASA Astrophysics Data System (ADS)

    Jarihani, B.

    2015-12-01

    Digital Elevation Models (DEMs) that accurately replicate both landscape form and processes are critical to support modeling of environmental processes. Pre-processing analysis of DEMs and extracting characteristics of the watershed (e.g., stream networks, catchment delineation, surface and subsurface flow paths) is essential for hydrological and geomorphic analysis and sediment transport. This study investigates the status of current remotely-sensed DEMs in providing advanced morphometric information on drainage basins, particularly in data-sparse regions. Here we assess the accuracy of three available DEMs: (i) the hydrologically corrected "H-DEM" of Geoscience Australia derived from the Shuttle Radar Topography Mission (SRTM) data; (ii) the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM) version 2 1-arc-second (~30 m) data; and (iii) the 9-arc-second national GEODATA DEM-9S ver3 from Geoscience Australia and the Australian National University. We used ESRI's geospatial data model, Arc Hydro, and HEC-GeoHMS, designed for building hydrologic information systems to synthesize geospatial and temporal water resources data that support hydrologic modeling and analysis. A coastal catchment in northeast Australia was selected as the study site, where very high resolution LiDAR data are available for parts of the area as reference data to assess the accuracy of the other, lower-resolution datasets. This study provides morphometric information for drainage basins as part of broader research on sediment flux from coastal basins to the Great Barrier Reef, Australia. After applying geo-referencing and elevation corrections, streams and sub-basins were delineated for each DEM. Then physical characteristics of streams (i.e., length, upstream and downstream elevation, and slope) and sub-basins (i.e., longest flow lengths, area, relief, and slopes) were extracted and compared with reference datasets from LiDAR. 
Results showed that, in the absence of high-precision, high-resolution DEM data, the ASTER GDEM or SRTM DEM can be used to extract the common morphometric relationships that are widely used in hydrological and geomorphological modelling.
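
    The stream attributes compared above (length, endpoint elevations, slope) can be sketched for an ordered channel profile as follows; this is an illustrative toy, not the study's workflow, and the profile points and function name are invented.

```python
# Illustrative sketch: derive stream-segment attributes -- planimetric
# length, endpoint elevations, and mean slope -- from a list of channel
# points ordered from upstream to downstream. All names are hypothetical.
import math

def stream_attributes(points):
    """points: list of (x, y, z) tuples ordered upstream to downstream."""
    length = sum(
        math.dist(points[i][:2], points[i + 1][:2])
        for i in range(len(points) - 1)
    )
    z_up, z_down = points[0][2], points[-1][2]
    slope = (z_up - z_down) / length if length > 0 else 0.0
    return {"length": length, "z_up": z_up, "z_down": z_down, "slope": slope}

# a three-point channel profile (coordinates in metres, invented)
profile = [(0, 0, 120.0), (300, 400, 110.0), (600, 800, 95.0)]
attrs = stream_attributes(profile)
print(attrs["length"], attrs["slope"])  # 1000.0 0.025
```

    The same attributes computed on each DEM's delineated streams can then be compared against the LiDAR-derived reference values, as done in the study.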

  1. A bi-articular model for scapular-humeral rhythm reconstruction through data from wearable sensors.

    PubMed

    Lorussi, Federico; Carbonaro, Nicola; De Rossi, Danilo; Tognetti, Alessandro

    2016-04-23

    Patient-specific performance assessment of arm movements in daily life activities is fundamental for neurological rehabilitation therapy. In most applications, the shoulder movement is simplified as a ball-and-socket joint, neglecting the movement of the scapular-thoracic complex. This may lead to significant errors. We propose an innovative bi-articular model of the human shoulder for estimating the position of the hand in relation to the sternum. The model takes into account both the scapular-thoracic and gleno-humeral movements and their ratio governed by the scapular-humeral rhythm, fusing the information of inertial and textile-based strain sensors. To feed the reconstruction algorithm based on the bi-articular model, an ad-hoc sensing shirt was developed. The shirt was equipped with two inertial measurement units (IMUs) and an integrated textile strain sensor. We built the bi-articular model starting from the data obtained in two planar movements (arm abduction and flexion in the sagittal plane) and analysing the error between the reference data - measured through an optical reference system - and the ball-and-socket approximation of the shoulder. The 3D model was developed by extending the behaviour of the kinematic chain revealed in the planar trajectories through a parameter identification that takes into account the body structure of the subject. The bi-articular model was evaluated in five subjects in comparison with the optical reference system. The errors were computed in terms of distance between the reference position of the trochlea (end-effector) and the corresponding model estimate. The introduced method remarkably improved the estimation of the position of the trochlea (and consequently the estimation of the hand position during reaching activities), reducing position errors from 11.5 cm to 1.8 cm. 
Thanks to the developed bi-articular model, we demonstrated a reliable estimation of the upper arm kinematics with a minimal sensing system suitable for daily life monitoring of recovery.
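
    A minimal planar sketch of the underlying idea, with entirely hypothetical segment lengths and rhythm ratio (not the authors' identified parameters): splitting arm elevation between a scapular-thoracic and a gleno-humeral rotation moves the end-effector relative to a single ball-and-socket approximation.

```python
# Toy planar comparison: a single-joint shoulder places the whole chain at
# one rotation centre, while a two-joint chain splits the elevation angle
# by a fixed 1:2 scapular-humeral rhythm. Lengths and ratio are invented.
import math

def ball_joint(theta, arm=0.30):
    """Single ball-and-socket shoulder: whole elevation at one centre."""
    return (arm * math.sin(theta), -arm * math.cos(theta))

def bi_articular(theta, clavicle=0.05, arm=0.30, rhythm=1 / 3):
    """Split elevation: fraction `rhythm` at the scapular-thoracic joint."""
    st = rhythm * theta          # scapular-thoracic share
    # proximal segment rotates by st, distal segment by the full theta
    x = clavicle * math.sin(st) + arm * math.sin(theta)
    y = -clavicle * math.cos(st) - arm * math.cos(theta)
    return (x, y)

theta = math.radians(90)
bx, by = ball_joint(theta)
mx, my = bi_articular(theta)
err = math.hypot(bx - mx, by - my)   # discrepancy of the one-joint shortcut
print(round(err * 100, 1), "cm")     # prints 5.0 cm
```

    Even in this toy setting the single-joint shortcut is centimetres off at high elevation, which is the kind of error the bi-articular model is meant to remove.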

  2. West Flank Coso, CA FORGE 3D geologic model

    DOE Data Explorer

    Doug Blankenship

    2016-03-01

    This is an x,y,z file of the West Flank FORGE 3D geologic model. The model was created in EarthVision by Dynamic Graphics, Inc., with a grid spacing of 100 m. Geologic surfaces were extrapolated from the input data using a minimum tension gridding algorithm. The data file is tabular data in a text file, with lithology data associated with x,y,z grid points. All the relevant information (the spatial reference, the projection, etc.) is in the file header, and all the fields in the data file are identified in the header.
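
    A hypothetical sketch of reading such a tabular x,y,z lithology file; the header convention ('#'-prefixed lines) and field layout below are assumptions for illustration, not the actual file format.

```python
# Sketch of parsing a headered x,y,z lithology text file into header
# metadata plus typed rows. The '#' header convention and the four-column
# layout are assumptions, not the documented FORGE format.
import io

def read_xyz_grid(stream):
    header, rows = [], []
    for line in stream:
        line = line.strip()
        if not line:
            continue
        if line.startswith("#"):                 # assumed header marker
            header.append(line.lstrip("# "))
        else:
            x, y, z, lith = line.split()
            rows.append((float(x), float(y), float(z), lith))
    return header, rows

sample = io.StringIO("""# spatial reference: NAD83 / UTM zone 11N
# fields: x y z lithology
417000 3985000 -1200 granodiorite
417100 3985000 -1150 diorite
""")
header, rows = read_xyz_grid(sample)
print(len(rows), rows[0][3])  # 2 granodiorite
```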

  3. Fallon FORGE 3D Geologic Model

    DOE Data Explorer

    Doug Blankenship

    2016-03-01

    An x,y,z scattered data file for the 3D geologic model of the Fallon FORGE site. The model was created in EarthVision by Dynamic Graphics, Inc., with a grid spacing of 100 m. Geologic surfaces were extrapolated from the input data using a minimum tension gridding algorithm. The data file is tabular data in a text file, with lithology data associated with x,y,z grid points. All the relevant information (the spatial reference, the projection, etc.) is in the file header, and all the fields in the data file are identified in the header.

  4. This person is saying bad things about you: The influence of physically and socially threatening context information on the processing of inherently neutral faces.

    PubMed

    Klein, Fabian; Iffland, Benjamin; Schindler, Sebastian; Wabnitz, Pascal; Neuner, Frank

    2015-12-01

    Recent studies have shown that the perceptual processing of human faces is affected by context information, such as previous experiences and information about the person represented by the face. The present study investigated the impact of verbally presented information about the person that varied with respect to affect (neutral, physically threatening, socially threatening) and reference (self-referred, other-referred) on the processing of faces with an inherently neutral expression. Stimuli were presented in a randomized presentation paradigm. Event-related potential (ERP) analysis demonstrated a modulation of the evoked potentials by reference at the EPN (early posterior negativity) and LPP (late positive potential) stages, and an enhancing effect of affective valence on the LPP (700-1000 ms), with socially threatening context information leading to the most pronounced LPP amplitudes. We also found an interaction between reference and valence, with self-referred neutral context information leading to a more pronounced LPP than other-referred neutral context information. Our results indicate an impact of self-reference on early, presumably automatic processing stages and also a strong impact of valence on later stages. Using a randomized presentation paradigm, this study confirms that context information affects the visual processing of faces, ruling out possible confounding factors such as facial configuration or conditional learning effects.

  5. Word learning emerges from the interaction of online referent selection and slow associative learning.

    PubMed

    McMurray, Bob; Horst, Jessica S; Samuelson, Larissa K

    2012-10-01

    Classic approaches to word learning emphasize referential ambiguity: In naming situations, a novel word could refer to many possible objects, properties, actions, and so forth. To solve this, researchers have posited constraints and inference strategies, but assume that determining the referent of a novel word is isomorphic to learning. We present an alternative in which referent selection is an online process and independent of long-term learning. We illustrate this theoretical approach with a dynamic associative model in which referent selection emerges from real-time competition between referents and learning is associative (Hebbian). This model accounts for a range of findings, including the differences in expressive and receptive vocabulary, cross-situational learning under high degrees of ambiguity, accelerating (vocabulary explosion) and decelerating (power law) learning, fast mapping by mutual exclusivity (and differences in bilinguals), improvements in familiar word recognition with development, and correlations between speed of processing and learning. Together, these findings suggest that (a) association learning buttressed by dynamic competition can account for much of the literature; (b) familiar word recognition is subserved by the same processes that identify the referents of novel words (fast mapping); (c) online competition may allow children to leverage information available in the task to augment performance despite slow learning; (d) in complex systems, associative learning is highly multifaceted; and (e) learning and referent selection, though logically distinct, can be subtly related. This suggests more sophisticated ways of describing the interaction between situation- and developmental-time processes and points to the need for considering such interactions as a primary determinant of development. PsycINFO Database Record (c) 2012 APA, all rights reserved.
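
    The two-process idea, online competition for referent selection plus slow Hebbian learning, can be sketched in a toy form (illustrative only, not the authors' model; the vocabulary, learning rate, and choice rule are invented):

```python
# Toy two-process word learner: referent selection is a real-time
# Luce-choice competition over current association strengths, while
# learning is a small Hebbian increment on the reinforced pairing.
import random

random.seed(0)
words = ["ball", "cup", "dax"]
objects = ["BALL", "CUP", "DAX"]
assoc = {(w, o): 0.01 for w in words for o in objects}  # weak initial links

def select_referent(word, present):
    """Probabilistic competition among the referents currently present."""
    weights = [assoc[(word, o)] for o in present]
    r = random.uniform(0, sum(weights))
    for o, w in zip(present, weights):
        r -= w
        if r <= 0:
            return o
    return present[-1]

def trial(word, target, present, lr=0.1):
    choice = select_referent(word, present)
    if choice == target:  # feedback reinforces only the correct pairing
        assoc[(word, target)] += lr
    return choice

for _ in range(200):  # slow associative learning across many situations
    w = random.choice(words)
    trial(w, w.upper(), objects)

for w in words:
    best = max(objects, key=lambda o: assoc[(w, o)])
    print(w, "->", best)  # each word ends up favouring its own referent
```

    The point of the sketch is the separation the abstract argues for: the choice on any single trial is noisy and in-the-moment, yet the slow accumulation of associations still converges on the right word-object mapping.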

  6. New statistical potential for quality assessment of protein models and a survey of energy functions

    PubMed Central

    2010-01-01

    Background Scoring functions, such as molecular mechanics force fields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and other features of scoring functions, such as using information on solvent accessibility or torsion angles, and accounting for secondary structure preferences and side chain orientation. Partially based on the observations made, we present a novel residue-based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. Atom- and residue-level statistical potentials and Linux executables to calculate the energy of a given protein proposed in this work can be downloaded from http://www.fiserlab.org/potentials. Conclusions Among the most influential terms we observed a critical role of a proper reference state definition and the benefits of including information about the microenvironment of interaction centers. Molecular mechanical potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasibly long energy relaxation before energy scores started to correlate with model quality. PMID:20226048
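
    Knowledge-based potentials of the kind surveyed here are commonly built as the negative log-odds of observed contact frequencies against a reference-state expectation. The following sketch shows that generic form with invented contact counts; it is not the paper's shuffled-reference formulation.

```python
# Generic knowledge-based potential: E(a,b) = -ln(p_obs / p_ref), where
# p_obs is the observed frequency of a residue-pair contact and p_ref is
# an independence (reference-state) expectation. Contact data are invented.
import math
from collections import Counter

contacts = [("LEU", "ILE"), ("LEU", "ILE"), ("ASP", "LYS"),
            ("LEU", "ASP"), ("ASP", "LYS"), ("LEU", "ILE")]

pair_counts = Counter(frozenset(p) for p in contacts)
res_counts = Counter(r for p in contacts for r in p)
n_pairs = sum(pair_counts.values())
n_res = sum(res_counts.values())

def energy(a, b):
    p_obs = pair_counts[frozenset((a, b))] / n_pairs
    p_ref = (res_counts[a] / n_res) * (res_counts[b] / n_res)
    if a != b:
        p_ref *= 2  # unordered heterotypic pair can occur two ways
    return -math.log(p_obs / p_ref)

# pairs seen more often than the reference expects score favourably
print(energy("LEU", "ILE") < energy("LEU", "ASP"))  # True
```

    The survey's point is that the choice of p_ref (the reference state) is one of the most influential ingredients: the same counts give different potentials under different reference definitions.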

  7. Establishment of Low Energy Building materials and Equipment Database Based on Property Information

    NASA Astrophysics Data System (ADS)

    Kim, Yumin; Shin, Hyery; Lee, Seungeon

    2018-03-01

    The purpose of this study is to provide a reliable materials-information portal service through the establishment of a public big-data resource that collects and integrates scattered low energy building materials and equipment data. Few existing low energy building materials databases in Korea have provided material properties as factors influencing material pricing. The framework of the database was defined with reference to the Korea On-line E-procurement system. More than 45,000 records were gathered according to the entity specifications, and from the gathered data, price prediction models for chillers were suggested. To improve the usability of the prediction model, detailed properties should be analysed for each item.
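
    A property-based price prediction model of the kind mentioned can be sketched as an ordinary least-squares fit; every number below is invented for illustration and does not reflect the study's data.

```python
# Sketch: least-squares line relating one material property (cooling
# capacity) to price, standing in for the chiller price models described.
# Capacities and prices are invented, exactly linear for clarity.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

capacity = [100, 200, 300, 400]   # hypothetical capacity units
price = [40, 70, 100, 130]        # hypothetical price units

a, b = fit_line(capacity, price)
print(round(a + b * 250, 1))      # predicted price at capacity 250 -> 85.0
```

    In practice such a model would use several properties at once (capacity, efficiency, refrigerant type), which is why the study calls for detailed property analysis per item.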

  8. Reliability measures in item response theory: manifest versus latent correlation functions.

    PubMed

    Milanzi, Elasma; Molenberghs, Geert; Alonso, Ariel; Verbeke, Geert; De Boeck, Paul

    2015-02-01

    For item response theory (IRT) models, which belong to the class of generalized linear or non-linear mixed models, reliability at the scale of observed scores (i.e., manifest correlation) is more difficult to calculate than latent correlation based reliability, but usually of greater scientific interest. This is not least because it cannot be calculated explicitly when the logit link is used in conjunction with normal random effects. As such, approximations such as Fisher's information coefficient, Cronbach's α, or the latent correlation are calculated, allegedly because it is easy to do so. Cronbach's α has well-known and serious drawbacks, Fisher's information is not meaningful under certain circumstances, and there is an important but often overlooked difference between latent and manifest correlations. Here, manifest correlation refers to correlation between observed scores, while latent correlation refers to correlation between scores at the latent (e.g., logit or probit) scale. Thus, using one in place of the other can lead to erroneous conclusions. Taylor series based reliability measures, which are based on manifest correlation functions, are derived and a careful comparison of reliability measures based on latent correlations, Fisher's information, and exact reliability is carried out. The latent correlations are virtually always considerably higher than their manifest counterparts, Fisher's information measure shows no coherent behaviour (it is even negative in some cases), while the newly introduced Taylor series based approximations reflect the exact reliability very closely. Comparisons among the various types of correlations, for various IRT models, are made using algebraic expressions, Monte Carlo simulations, and data analysis. Given the light computational burden and the performance of Taylor series based reliability measures, their use is recommended. © 2014 The British Psychological Society.
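
    The latent-versus-manifest gap can be illustrated by simulation. The sketch below uses a generic Rasch setup (not the paper's Taylor series derivation): two parallel 10-item forms share one latent ability, so the latent correlation is 1, while the manifest correlation between observed sum scores is attenuated.

```python
# Monte Carlo sketch: manifest (observed-score) correlation between two
# parallel Rasch forms sharing one latent ability. Item difficulties and
# sample size are invented for illustration.
import math
import random

random.seed(1)
N_PERSONS, N_ITEMS = 5000, 10
DIFFS = [-1.5 + 0.3 * i for i in range(N_ITEMS)]  # spread of difficulties

def sum_score(theta):
    """Observed score: number of correct Rasch (1PL) responses."""
    return sum(random.random() < 1 / (1 + math.exp(-(theta - b)))
               for b in DIFFS)

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

thetas = [random.gauss(0, 1) for _ in range(N_PERSONS)]
form_a = [sum_score(t) for t in thetas]
form_b = [sum_score(t) for t in thetas]

manifest = pearson(form_a, form_b)
print(f"manifest correlation ~ {manifest:.2f} (latent correlation is 1.0)")
```

    Reporting the latent correlation here would overstate reliability, which is exactly the substitution error the paper warns against.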

  9. Jobs and Economic Development Impact (JEDI) User Reference Guide: Fast Pyrolysis Biorefinery Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yimin; Goldberg, Marshall

    2015-02-01

    This guide -- the JEDI Fast Pyrolysis Biorefinery Model User Reference Guide -- was developed to assist users in operating and understanding the JEDI Fast Pyrolysis Biorefinery Model. The guide provides information on the model's underlying methodology, as well as the parameters and data sources used to develop the cost data utilized in the model. This guide also provides basic instruction on model add-in features and a discussion of how the results should be interpreted. Based on project-specific inputs from the user, the JEDI Fast Pyrolysis Biorefinery Model estimates local (e.g., county- or state-level) job creation, earnings, and output from total economic activity for a given fast pyrolysis biorefinery. These estimates include the direct, indirect and induced economic impacts to the local economy associated with the construction and operation phases of biorefinery projects. Local revenue and supply chain impacts as well as induced impacts are estimated using economic multipliers derived from the IMPLAN software program. By determining the local economic impacts and job creation for a proposed biorefinery, the JEDI Fast Pyrolysis Biorefinery Model can be used to field questions about the added value biorefineries might bring to a local community.

  10. 47 CFR 301.10 - Cross-reference.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 5 2014-10-01 2014-10-01 false Cross-reference. 301.10 Section 301.10 Telecommunication NATIONAL TELECOMMUNICATIONS AND INFORMATION ADMINISTRATION, DEPARTMENT OF COMMERCE RELOCATION OF AND SPECTRUM SHARING BY FEDERAL GOVERNMENT STATIONS General Information § 301.10 Cross-reference. The...

  11. 47 CFR 301.10 - Cross-reference.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 5 2013-10-01 2013-10-01 false Cross-reference. 301.10 Section 301.10 Telecommunication NATIONAL TELECOMMUNICATIONS AND INFORMATION ADMINISTRATION, DEPARTMENT OF COMMERCE RELOCATION OF AND SPECTRUM SHARING BY FEDERAL GOVERNMENT STATIONS General Information § 301.10 Cross-reference. The...

  12. Perceiving Patterns of Reference Service: A Survey

    ERIC Educational Resources Information Center

    Blakely, Florence

    1971-01-01

    Reference librarians must, if they hope to survive, retool in preparation for becoming the interface between the patron and computer-based information systems. This involves sharpening the interview technique and understanding where to plug into the information flow process. (4 references) (Author)

  13. Using a high-dimensional graph of semantic space to model relationships among words

    PubMed Central

    Jackson, Alice F.; Bolger, Donald J.

    2014-01-01

    The GOLD model (Graph Of Language Distribution) is a network model constructed based on co-occurrence in a large corpus of natural language that may be used to explore what information may be present in a graph-structured model of language, and what information may be extracted through theoretically-driven algorithms as well as standard graph analysis methods. The present study employs GOLD to examine two types of relationship between words: semantic similarity and associative relatedness. Semantic similarity refers to the degree of overlap in meaning between words, while associative relatedness refers to the degree to which two words occur in the same schematic context. It is expected that a graph-structured model of language constructed based on co-occurrence should easily capture associative relatedness, because this type of relationship is thought to be present directly in lexical co-occurrence. However, it is hypothesized that semantic similarity may be extracted from the intersection of the set of first-order connections, because two words that are semantically similar may occupy similar thematic or syntactic roles across contexts and thus would co-occur lexically with the same set of nodes. Two versions of the GOLD model that differed in terms of the co-occurrence window, bigGOLD at the paragraph level and smallGOLD at the adjacent word level, were directly compared to the performance of a well-established distributional model, Latent Semantic Analysis (LSA). The superior performance of the GOLD models (big and small) suggests that a single acquisition and storage mechanism, namely co-occurrence, can account for associative and conceptual relationships between words and is more psychologically plausible than models using singular value decomposition (SVD). PMID:24860525
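
    The two graph measures described above can be sketched on a toy corpus (illustrative only, not the GOLD implementation): associative relatedness as direct co-occurrence weight, and semantic similarity as the overlap of first-order neighbour sets.

```python
# Toy co-occurrence graph. Associative relatedness = direct edge weight;
# semantic similarity = Jaccard overlap of first-order neighbour sets.
# The corpus and stop-word handling are invented for illustration.
from collections import defaultdict
from itertools import combinations

corpus = [
    "the doctor treated the patient",
    "the nurse treated the patient",
    "the dog chased the ball",
    "the doctor read the chart",
    "the nurse read the chart",
]

edges = defaultdict(int)
for sentence in corpus:
    words = set(sentence.split()) - {"the"}      # crude stop-word filter
    for a, b in combinations(sorted(words), 2):
        edges[(a, b)] += 1

def relatedness(a, b):
    return edges[tuple(sorted((a, b)))]

def neighbours(w):
    return {x for pair, n in edges.items() if n and w in pair
            for x in pair if x != w}

def similarity(a, b):
    na, nb = neighbours(a), neighbours(b)
    return len(na & nb) / len(na | nb)

print(relatedness("doctor", "nurse"))  # 0: they never co-occur directly
print(similarity("doctor", "nurse"))   # 1.0: but they share all contexts
```

    This is the hypothesis in miniature: "doctor" and "nurse" have zero associative link yet maximal first-order overlap, so similarity is recoverable from the graph even where direct co-occurrence is absent.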

  14. Using a high-dimensional graph of semantic space to model relationships among words.

    PubMed

    Jackson, Alice F; Bolger, Donald J

    2014-01-01

    The GOLD model (Graph Of Language Distribution) is a network model constructed based on co-occurrence in a large corpus of natural language that may be used to explore what information may be present in a graph-structured model of language, and what information may be extracted through theoretically-driven algorithms as well as standard graph analysis methods. The present study employs GOLD to examine two types of relationship between words: semantic similarity and associative relatedness. Semantic similarity refers to the degree of overlap in meaning between words, while associative relatedness refers to the degree to which two words occur in the same schematic context. It is expected that a graph-structured model of language constructed based on co-occurrence should easily capture associative relatedness, because this type of relationship is thought to be present directly in lexical co-occurrence. However, it is hypothesized that semantic similarity may be extracted from the intersection of the set of first-order connections, because two words that are semantically similar may occupy similar thematic or syntactic roles across contexts and thus would co-occur lexically with the same set of nodes. Two versions of the GOLD model that differed in terms of the co-occurrence window, bigGOLD at the paragraph level and smallGOLD at the adjacent word level, were directly compared to the performance of a well-established distributional model, Latent Semantic Analysis (LSA). The superior performance of the GOLD models (big and small) suggests that a single acquisition and storage mechanism, namely co-occurrence, can account for associative and conceptual relationships between words and is more psychologically plausible than models using singular value decomposition (SVD).

  15. E&V (Evaluation and Validation) Reference Manual, Version 1.0.

    DTIC Science & Technology

    1988-07-01

    This manual provides general reference information extracted from indexes and cross references (Chapter 4). It looks at E&V techniques through many different paths, and provides a means to extract useful information along the way. Comments may be sent electronically (preferred) to szymansk@ajpo.sei.cmu.edu or by regular mail to Mr. Raymond Szymanski, AFWAL/AAAF, Wright-Patterson AFB, OH 45433-6543.

  16. Basic anatomical and physiological data for use in radiological protection: reference values. A report of age- and gender-related differences in the anatomical and physiological characteristics of reference individuals. ICRP Publication 89.

    PubMed

    2002-01-01

    This report presents detailed information on age- and gender-related differences in the anatomical and physiological characteristics of reference individuals. These reference values provide needed input to prospective dosimetry calculations for radiation protection purposes for both workers and members of the general public. The purpose of this report is to consolidate and unify in one publication important new information on reference anatomical and physiological values that has become available since Publication 23 was published by the ICRP in 1975. There are two aspects of this work. The first is to revise and extend the information in Publication 23 as appropriate. The second is to provide additional information on individual variation among grossly normal individuals resulting from differences in age, gender, race, or other factors. This publication collects, unifies, and expands the updated ICRP reference values for the purpose of providing a comprehensive and consistent set of age- and gender-specific reference values for anatomical and physiological features of the human body pertinent to radiation dosimetry. The reference values given in this report are based on: (a) anatomical and physiological information not published before by the ICRP; (b) recent ICRP publications containing reference value information; and (c) information in Publication 23 that is still considered valid and appropriate for radiation protection purposes. Moving from the past emphasis on 'Reference Man', the new report presents a series of reference values for both male and female subjects of six different ages: newborn, 1 year, 5 years, 10 years, 15 years, and adult. In selecting reference values, the Commission has used data on Western Europeans and North Americans because these populations have been well studied with respect to anatomy, body composition, and physiology. 
When appropriate, comparisons are made between the chosen reference values and data from several Asian populations. The first section of the report provides summary tables of all the anatomical and physiological parameters given as reference values in this publication. These results give a comprehensive view of reference values for an individual as influenced by age and gender. The second section describes characteristics of dosimetric importance for the embryo and fetus. Information is provided on the development of the total body and the timing of appearance and development of the various organ systems. Reference values are provided on the mass of the total body and selected organs and tissues, as well as a number of physiological parameters. The third section deals with reference values of important anatomical and physiological characteristics of reference individuals from birth to adulthood. This section begins with details on the growth and composition of the total body in males and females. It then describes and quantifies anatomical and physiological characteristics of various organ systems and changes in these characteristics during growth, maturity, and pregnancy. Reference values are specified for characteristics of dosimetric importance. The final section gives a brief summary of the elemental composition of individuals. Focusing on the elements of dosimetric importance, information is presented on the body content of 13 elements: calcium, carbon, chlorine, hydrogen, iodine, iron, magnesium, nitrogen, oxygen, potassium, sodium, sulphur, and phosphorus.

  17. Suicide Awareness Training for Faculty and Staff: A Training Model for School Counselors

    ERIC Educational Resources Information Center

    Gibbons, Melinda M.; Studer, Jeannine R.

    2008-01-01

    Suicide among school-aged youth is a growing concern, and school personnel have a legal obligation to provide suicide prevention programming to faculty and staff. School counselors have the skills to provide such training, as well as to inform staff and faculty of school policy and procedures for referring potentially suicidal students. A…

  18. Defining Quality in Assisted Living: Comparing Apples, Oranges, and Broccoli

    ERIC Educational Resources Information Center

    Hawes, Catherine; Phillips, Charles D.

    2007-01-01

    Purpose: The purpose of this article is to discuss and describe various measures of quality, quality indicators, and uses of information on quality with specific reference to the role or purpose of assisted living. Design and Methods: We reviewed a variety of major studies of assisted living quality. We elaborated models of assisted living based…

  19. Query Health: standards-based, cross-platform population health surveillance

    PubMed Central

    Klann, Jeffrey G; Buck, Michael D; Brown, Jeffrey; Hadley, Marc; Elmore, Richard; Weber, Griffin M; Murphy, Shawn N

    2014-01-01

    Objective Understanding population-level health trends is essential to effectively monitor and improve public health. The Office of the National Coordinator for Health Information Technology (ONC) Query Health initiative is a collaboration to develop a national architecture for distributed, population-level health queries across diverse clinical systems with disparate data models. Here we review Query Health activities, including a standards-based methodology, an open-source reference implementation, and three pilot projects. Materials and methods Query Health defined a standards-based approach for distributed population health queries, using an ontology based on the Quality Data Model and Consolidated Clinical Document Architecture, Health Quality Measures Format (HQMF) as the query language, the Query Envelope as the secure transport layer, and the Quality Reporting Document Architecture as the result language. Results We implemented this approach using Informatics for Integrating Biology and the Bedside (i2b2) and hQuery for data analytics and PopMedNet for access control, secure query distribution, and response. We deployed the reference implementation at three pilot sites: two public health departments (New York City and Massachusetts) and one pilot designed to support Food and Drug Administration post-market safety surveillance activities. The pilots were successful, although improved cross-platform data normalization is needed. Discussion This initiative resulted in a standards-based methodology for population health queries, a reference implementation, and revision of the HQMF standard. It also informed future directions regarding interoperability and data access for ONC's Data Access Framework initiative. Conclusions Query Health was a test of the learning health system that supplied a functional methodology and reference implementation for distributed population health queries that has been validated at three sites. PMID:24699371

  20. Query Health: standards-based, cross-platform population health surveillance.

    PubMed

    Klann, Jeffrey G; Buck, Michael D; Brown, Jeffrey; Hadley, Marc; Elmore, Richard; Weber, Griffin M; Murphy, Shawn N

    2014-01-01

    Understanding population-level health trends is essential to effectively monitor and improve public health. The Office of the National Coordinator for Health Information Technology (ONC) Query Health initiative is a collaboration to develop a national architecture for distributed, population-level health queries across diverse clinical systems with disparate data models. Here we review Query Health activities, including a standards-based methodology, an open-source reference implementation, and three pilot projects. Query Health defined a standards-based approach for distributed population health queries, using an ontology based on the Quality Data Model and Consolidated Clinical Document Architecture, Health Quality Measures Format (HQMF) as the query language, the Query Envelope as the secure transport layer, and the Quality Reporting Document Architecture as the result language. We implemented this approach using Informatics for Integrating Biology and the Bedside (i2b2) and hQuery for data analytics and PopMedNet for access control, secure query distribution, and response. We deployed the reference implementation at three pilot sites: two public health departments (New York City and Massachusetts) and one pilot designed to support Food and Drug Administration post-market safety surveillance activities. The pilots were successful, although improved cross-platform data normalization is needed. This initiative resulted in a standards-based methodology for population health queries, a reference implementation, and revision of the HQMF standard. It also informed future directions regarding interoperability and data access for ONC's Data Access Framework initiative. Query Health was a test of the learning health system that supplied a functional methodology and reference implementation for distributed population health queries that has been validated at three sites. Published by the BMJ Publishing Group Limited. 

  1. Accuracy and coverage of the modernized Polish Maritime differential GPS system

    NASA Astrophysics Data System (ADS)

    Specht, Cezary

    2011-01-01

The DGPS navigation service augments the NAVSTAR Global Positioning System by providing localized pseudorange corrections and ancillary information broadcast from selected marine reference stations. The position and integrity information of the DGPS service satisfies the requirements of coastal navigation and hydrographic surveys. The Polish Maritime DGPS system was established in 1994 and modernized in 2009 to meet the requirements set out in the IMO resolution for a future GNSS while preserving backward signal compatibility of user equipment. After installation of the new L1/L2 reference equipment, performance tests were carried out. The paper presents results of coverage modeling and an accuracy measurement campaign based on long-term signal analyses of the DGPS reference station Rozewie, performed over 26 days in July 2009. The final results made it possible to verify the coverage area of the differential signal from the reference station and to calculate the repeatable and absolute accuracy of the system after the technical modernization. The obtained field-strength coverage and position statistics (215,000 fixes) were compared with past measurements performed in 2002 (coverage) and 2005 (accuracy), when the previous system infrastructure was in operation. So far, no campaigns have been performed on differential Galileo; however, its signals, signal processing, and receiver techniques are comparable to those known from DGPS. Because all satellite differential GNSS systems use the same transmission standard (RTCM), and maritime DGPS radiobeacons are standardized in all radio communication aspects (frequency, binary rate, modulation), the accuracy of differential Galileo can be expected to be similar to that of DGPS. Coverage of the reference station was calculated with dedicated software that computes the signal strength level from transmitter parameters or from a field signal-strength measurement campaign carried out at representative points. The software operates on a Baltic Sea vector map, ground electrical parameters, and models of the atmospheric noise level in the transmission band.
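The repeatable and absolute accuracy figures reported in such campaigns can be illustrated with a minimal sketch: repeatable accuracy measures the scatter of position fixes about their own mean, while absolute accuracy measures the scatter about the surveyed true position, here both expressed as 2DRMS. The function names and the toy fixes below are invented for illustration; the campaign's actual statistics came from 215,000 fixes.

```python
import math

def drms2(fixes, ref):
    """Twice the distance root-mean-square (2DRMS) of horizontal
    position errors relative to a reference point (x, y), in metres."""
    n = len(fixes)
    ms = sum((x - ref[0]) ** 2 + (y - ref[1]) ** 2 for x, y in fixes) / n
    return 2.0 * math.sqrt(ms)

def accuracies(fixes, true_pos):
    """Repeatable accuracy: scatter about the mean fix.
    Absolute accuracy: scatter about the surveyed true position."""
    n = len(fixes)
    mean = (sum(x for x, _ in fixes) / n, sum(y for _, y in fixes) / n)
    return drms2(fixes, mean), drms2(fixes, true_pos)

# Tiny illustration: four fixes biased 1 m east of the true antenna position.
fixes = [(1.0, 0.5), (1.0, -0.5), (0.5, 0.0), (1.5, 0.0)]
rep, abs_acc = accuracies(fixes, (0.0, 0.0))
```

A systematic bias inflates the absolute figure but not the repeatable one, which is why the two are reported separately.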

  2. Modeling Requirements for Cohort and Register IT.

    PubMed

    Stäubert, Sebastian; Weber, Ulrike; Michalik, Claudia; Dress, Jochen; Ngouongo, Sylvie; Stausberg, Jürgen; Winter, Alfred

    2016-01-01

The project KoRegIT (funded by TMF e.V.) aimed to develop a generic catalog of requirements for research networks such as cohort studies and registers (KoReg). The catalog supports such research networks in building up and managing their organizational and IT infrastructure. The goal was to make transparent the complex relationships between requirements, which are described as use cases in a given text catalog; by analyzing and modeling the requirements, a better understanding and optimization of the catalog were intended. There were two subgoals: a) to investigate one cohort study and two registers and to model the current state of their IT infrastructure; b) to analyze the current-state models and to find simplifications within the generic catalog. Processing of the generic catalog was performed by means of text extraction, conceptualization, and concept mapping. Methods of enterprise architecture planning (EAP) were then used to model the extracted information. For objective a), questionnaires were developed from the model and used in semi-structured interviews, whose results were evaluated via qualitative content analysis; afterwards, the current state was modeled. Objective b) was addressed by model analysis. As a result of objective a), current-state models of one existing cohort study and two registers were created and analyzed. An optimized model, called the KoReg reference model, is the result of objective b). It is possible to use EAP methods to model requirements, which gives a better overview of the partly interconnected requirements by means of visualization. The model-based approach also enables analysis and comparison of the empirical data from the current-state models. Information managers can reduce the effort of planning IT infrastructure by utilizing the KoReg reference model. Modeling the current state and generating reports from the model, which can serve as requirements specifications for bids, are supported as well.

  3. Literature review of models on tire-pavement interaction noise

    NASA Astrophysics Data System (ADS)

    Li, Tan; Burdisso, Ricardo; Sandu, Corina

    2018-04-01

    Tire-pavement interaction noise (TPIN) becomes dominant at speeds above 40 km/h for passenger vehicles and 70 km/h for trucks. Several models have been developed to describe and predict the TPIN. However, these models do not fully reveal the physical mechanisms or predict TPIN accurately. It is well known that all the models have both strengths and weaknesses, and different models fit different investigation purposes or conditions. The numerous papers that present these models are widely scattered among thousands of journals, and it is difficult to get the complete picture of the status of research in this area. This review article aims at presenting the history and current state of TPIN models systematically, making it easier to identify and distribute the key knowledge and opinions, and providing insight into the future research trend in this field. In this work, over 2000 references related to TPIN were collected, and 74 models were reviewed from nearly 200 selected references; these were categorized into deterministic models (37), statistical models (18), and hybrid models (19). The sections explaining the models are self-contained with key principles, equations, and illustrations included. The deterministic models were divided into three sub-categories: conventional physics models, finite element and boundary element models, and computational fluid dynamics models; the statistical models were divided into three sub-categories: traditional regression models, principal component analysis models, and fuzzy curve-fitting models; the hybrid models were divided into three sub-categories: tire-pavement interface models, mechanism separation models, and noise propagation models. At the end of each category of models, a summary table is presented to compare these models with the key information extracted. Readers may refer to these tables to find models of their interest. The strengths and weaknesses of the models in different categories were then analyzed. 
Finally, the modeling trend and future direction in this area are given.

  4. A Financial Market Model Incorporating Herd Behaviour.

    PubMed

    Wray, Christopher M; Bishop, Steven R

    2016-01-01

    Herd behaviour in financial markets is a recurring phenomenon that exacerbates asset price volatility, and is considered a possible contributor to market fragility. While numerous studies investigate herd behaviour in financial markets, it is often considered without reference to the pricing of financial instruments or other market dynamics. Here, a trader interaction model based upon informational cascades in the presence of information thresholds is used to construct a new model of asset price returns that allows for both quiescent and herd-like regimes. Agent interaction is modelled using a stochastic pulse-coupled network, parametrised by information thresholds and a network coupling probability. Agents may possess either one or two information thresholds that, in each case, determine the number of distinct states an agent may occupy before trading takes place. In the case where agents possess two thresholds (labelled as the finite state-space model, corresponding to agents' accumulating information over a bounded state-space), and where coupling strength is maximal, an asymptotic expression for the cascade-size probability is derived and shown to follow a power law when a critical value of network coupling probability is attained. For a range of model parameters, a mixture of negative binomial distributions is used to approximate the cascade-size distribution. This approximation is subsequently used to express the volatility of model price returns in terms of the model parameter which controls the network coupling probability. In the case where agents possess a single pulse-coupling threshold (labelled as the semi-infinite state-space model corresponding to agents' accumulating information over an unbounded state-space), numerical evidence is presented that demonstrates volatility clustering and long-memory patterns in the volatility of asset returns. 
Finally, output from the model is compared to both the distribution of historical stock returns and the market price of an equity index option.
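As an illustration only, and not the authors' model, a toy pulse-coupled cascade can be simulated: agents accumulate information states up to a threshold, trade ("fire") on reaching it, and nudge other agents with a coupling probability. All parameters (`n`, `theta`, `p`) are invented, and the coupling is kept subcritical so cascades stay finite.

```python
import random

def cascade_size(n=100, theta=2, p=0.01, rng=None):
    """One cascade in a toy pulse-coupled herding network: each agent
    holds an information state in {0, ..., theta-1}; an agent reaching
    theta trades ("fires") and resets, and every firing nudges each
    other agent one state upward with coupling probability p."""
    rng = rng or random.Random()
    state = [rng.randrange(theta) for _ in range(n)]
    shocked = rng.randrange(n)
    state[shocked] = theta              # exogenous information shock
    queue = [shocked]
    fired = 0
    while queue:
        i = queue.pop()
        if state[i] < theta:
            continue
        state[i] = 0                    # trade and reset
        fired += 1
        for j in range(n):
            if j != i and state[j] < theta and rng.random() < p:
                state[j] += 1
                if state[j] == theta:
                    queue.append(j)
    return fired

rng = random.Random(0)
sizes = [cascade_size(rng=rng) for _ in range(500)]
```

As the coupling probability approaches its critical value, the empirical cascade-size distribution develops the heavy tail that the paper characterizes analytically.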

  5. 33 Shafts Category of Transuranic Waste Stored Below Ground within Area G

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hargis, Kenneth Marshall; Monk, Thomas H

This report compiles information to support the evaluation of alternatives and analysis of regulatory paths forward for the 33 shafts. The historical information includes a form completed by waste generators for each waste package (Reference 6) that included a waste description, estimates of Pu-239 and uranium-235 (U-235) based on an accounting technique, and calculations of mixed fission products (MFP) based on radiation measurements. A 1979 letter and questionnaire (Reference 7) provides information on waste packaging of hot cell waste and the configuration of disposal shafts as storage in the 33 Shafts was initiated. Tables of data by waste package were developed during a review of historical documents that was performed in 2005 (Reference 8). Radiological data was coupled with material-type data to estimate the initial isotopic content of each waste package and an Oak Ridge National Laboratory computer code was used to calculate 2009 decay levels. Other sources of information include a waste disposal logbook for the 33 shafts (Reference 9), reports that summarize remote-handled waste generated at the CMR facility (Reference 10) and placement of waste in the 33 shafts (Reference 11), a report on decommissioning of the LAMPRE reactor (Reference 12), interviews with an employee and manager involved in placing waste in the 33 shafts (References 13 and 14), an interview with a long-time LANL employee involved in waste operations (Reference 15), a 2002 plan for disposition of remote-handled TRU waste (Reference 16), and photographs obtained during field surveys of several shafts in 2007. The WIPP Central Characterization Project (CCP) completed an Acceptable Knowledge (AK) summary report for 16 canisters of remote-handled waste from the CMR Facility that contains information relevant to the 33 Shafts on hot-cell operations and timeline (Reference 17).

  6. With the future behind them: convergent evidence from aymara language and gesture in the crosslinguistic comparison of spatial construals of time.

    PubMed

    Núñez, Rafael E; Sweetser, Eve

    2006-05-06

Cognitive research on metaphoric concepts of time has focused on differences between moving Ego and moving time models, but even more basic is the contrast between Ego- and temporal-reference-point models. Dynamic models appear to be quasi-universal cross-culturally, as does the generalization that in Ego-reference-point models, FUTURE IS IN FRONT OF EGO and PAST IS IN BACK OF EGO. The Aymara language instead has a major static model of time wherein FUTURE IS BEHIND EGO and PAST IS IN FRONT OF EGO; linguistic and gestural data give strong confirmation of this unusual culture-specific cognitive pattern. Gestural data provide crucial information unavailable to purely linguistic analysis, suggesting that when investigating conceptual systems both forms of expression should be analyzed complementarily. Important issues in embodied cognition are raised: how fully shared are bodily grounded motivations for universal cognitive patterns, what makes a rare pattern emerge, and what are the cultural entailments of such patterns?

  7. Strategic development of a multivariate calibration model for the uniformity testing of tablets by transmission NIR analysis.

    PubMed

    Sasakura, D; Nakayama, K; Sakamoto, T; Chikuma, T

    2015-05-01

The use of transmission near-infrared spectroscopy (TNIRS) is of particular interest in the pharmaceutical industry, because TNIRS requires no sample preparation and can analyze several tens of tablet samples in an hour. It can measure all relevant information from a tablet while it is still on the production line. However, TNIRS has a narrow spectral range, and overtone vibrations often overlap. To perform content uniformity testing of tablets by TNIRS, various properties of the tableting process need to be analyzed with a multivariate prediction model, such as Partial Least Squares regression. One issue is that typical approaches rely on several hundred reference samples as the basis of the method rather than on a strategically designed method; many batches are needed to prepare the reference samples, which takes time and is not cost-effective. Our group investigated the concentration dependence of the calibration model with a strategic design. Consequently, we developed a more effective approach to the TNIRS calibration model than the existing methodology.

  8. Technology: Trigger for Change in Reference Librarianship.

    ERIC Educational Resources Information Center

    Hallman, Clark N.

    1990-01-01

    Discussion of the influence of technological developments on social change focuses on the effects of information technology on academic reference librarianship. Highlights include reference skills; electronic resources; microcomputer technology; online catalogs; interaction and communication with users; the need to teach information skills; and…

  9. MRAC Control with Prior Model Knowledge for Asymmetric Damaged Aircraft

    PubMed Central

    Zhang, Jing

    2015-01-01

    This paper develops a novel state-tracking multivariable model reference adaptive control (MRAC) technique utilizing prior knowledge of plant models to recover control performance of an asymmetric structural damaged aircraft. A modification of linear model representation is given. With prior knowledge on structural damage, a polytope linear parameter varying (LPV) model is derived to cover all concerned damage conditions. An MRAC method is developed for the polytope model, of which the stability and asymptotic error convergence are theoretically proved. The proposed technique reduces the number of parameters to be adapted and thus decreases computational cost and requires less input information. The method is validated by simulations on NASA generic transport model (GTM) with damage. PMID:26180839

  10. Estimating groundwater recharge

    USGS Publications Warehouse

    Healy, Richard W.; Scanlon, Bridget R.

    2010-01-01

    Understanding groundwater recharge is essential for successful management of water resources and modeling fluid and contaminant transport within the subsurface. This book provides a critical evaluation of the theory and assumptions that underlie methods for estimating rates of groundwater recharge. Detailed explanations of the methods are provided - allowing readers to apply many of the techniques themselves without needing to consult additional references. Numerous practical examples highlight benefits and limitations of each method. Approximately 900 references allow advanced practitioners to pursue additional information on any method. For the first time, theoretical and practical considerations for selecting and applying methods for estimating groundwater recharge are covered in a single volume with uniform presentation. Hydrogeologists, water-resource specialists, civil and agricultural engineers, earth and environmental scientists and agronomists will benefit from this informative and practical book. It can serve as the primary text for a graduate-level course on groundwater recharge or as an adjunct text for courses on groundwater hydrology or hydrogeology.
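One widely used technique covered in such texts is the water-table fluctuation (WTF) method, in which recharge over a period is estimated as specific yield times the summed water-table rises attributed to recharge events. A minimal sketch, with invented example numbers:

```python
def wtf_recharge(specific_yield, rises_m):
    """Water-table fluctuation (WTF) method: recharge is estimated as
    specific yield times the sum of event water-table rises, each rise
    measured against the extrapolated antecedent recession curve."""
    return specific_yield * sum(rises_m)

# Example: Sy = 0.15 and three event rises of 0.20, 0.35 and 0.10 m
r = wtf_recharge(0.15, [0.20, 0.35, 0.10])   # recharge in metres of water
```

The method's main practical difficulty, separating recharge-driven rises from other causes of water-table change, is exactly the kind of limitation such a book evaluates method by method.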

  11. Validation of extremes within the Perfect-Predictor Experiment of the COST Action VALUE

    NASA Astrophysics Data System (ADS)

    Hertig, Elke; Maraun, Douglas; Wibig, Joanna; Vrac, Mathieu; Soares, Pedro; Bartholy, Judith; Pongracz, Rita; Mares, Ileana; Gutierrez, Jose Manuel; Casanueva, Ana; Alzbutas, Robertas

    2016-04-01

    Extreme events are of widespread concern due to their damaging consequences on natural and anthropogenic systems. From science to applications the statistical attributes of rare and infrequent occurrence and low probability become connected with the socio-economic aspect of strong impact. Specific end-user needs regarding information about extreme events depend on the type of application, but as a joining element there is always the request for easily accessible climate change information with a clear description of their uncertainties and limitations. Within the Perfect-Predictor Experiment of the COST Action VALUE extreme indices modelled from a wide range of downscaling methods are compared to reference indices calculated from observational data. The experiment uses reference data from a selection of 86 weather stations representative of the different climates in Europe. Results are presented for temperature and precipitation extremes and include aspects of the marginal distribution as well as spell-length related aspects.

  12. Determining the True Cost to Deliver Total Hip and Knee Arthroplasty Over the Full Cycle of Care: Preparing for Bundling and Reference-Based Pricing.

    PubMed

    DiGioia, Anthony M; Greenhouse, Pamela K; Giarrusso, Michelle L; Kress, Justina M

    2016-01-01

The Affordable Care Act accelerates health care providers' need to prepare for new care delivery platforms and payment models such as bundling and reference-based pricing (RBP). Thriving in this environment will be difficult without knowing the true cost of care delivery at the level of the clinical condition over the full cycle of care. We describe a project in which we identified true costs for both total hip and total knee arthroplasty. With the same tool, we identified cost drivers in each segment of care delivery and collected patient experience information. Combining cost and experience information with outcomes data we already collect allows us to drive costs down while protecting outcomes and experiences, and compete successfully in bundling and RBP programs.

  13. [Dental education for college students based on WeChat public platform].

    PubMed

    Chen, Chuan-Jun; Sun, Tan

    2016-06-01

The authors propose a model for dental education based on the WeChat public platform. In this model, teachers send various kinds of digital teaching material, such as PowerPoint slides, Word documents, and videos, to the WeChat public platform; students share this material to preview before class and to identify the key knowledge points for in-depth learning in class. Teachers also send reference materials for expanded learning after class. A questionnaire distributed through the WeChat public platform is used to evaluate teaching effectiveness, and improvements can be made based on the feedback. A WeChat-based discussion between students and teachers can be initiated on a specific topic to reach a proper solution. With the development of mobile terminal technology, mobile classes will become a reality in the near future.

  14. Expanding rural primary care training by employing information technologies: the need for participation by medical reference librarians.

    PubMed

    Coggan, J M; Crandall, L A

    1995-01-01

    The use of rural sites to train badly needed primary care providers requires access to sophisticated medical information not traditionally available outside of academic health centers. Medical reference librarians can play a key role in the development of primary care training sites in rural settings. Electronic information technologies, with proactive support from medical reference librarians, can provide current and detailed information without concern for distance from the health science center library. This paper discusses recent developments in technology, describes current challenges to the application of this technology in rural settings, and provides policy recommendations for medical reference librarians to enhance rural primary care training.

  15. Genotype imputation in a coalescent model with infinitely-many-sites mutation

    PubMed Central

    Huang, Lucy; Buzbas, Erkan O.; Rosenberg, Noah A.

    2012-01-01

    Empirical studies have identified population-genetic factors as important determinants of the properties of genotype-imputation accuracy in imputation-based disease association studies. Here, we develop a simple coalescent model of three sequences that we use to explore the theoretical basis for the influence of these factors on genotype-imputation accuracy, under the assumption of infinitely-many-sites mutation. Employing a demographic model in which two populations diverged at a given time in the past, we derive the approximate expectation and variance of imputation accuracy in a study sequence sampled from one of the two populations, choosing between two reference sequences, one sampled from the same population as the study sequence and the other sampled from the other population. We show that under this model, imputation accuracy—as measured by the proportion of polymorphic sites that are imputed correctly in the study sequence—increases in expectation with the mutation rate, the proportion of the markers in a chromosomal region that are genotyped, and the time to divergence between the study and reference populations. Each of these effects derives largely from an increase in information available for determining the reference sequence that is genetically most similar to the sequence targeted for imputation. We analyze as a function of divergence time the expected gain in imputation accuracy in the target using a reference sequence from the same population as the target rather than from the other population. Together with a growing body of empirical investigations of genotype imputation in diverse human populations, our modeling framework lays a foundation for extending imputation techniques to novel populations that have not yet been extensively examined. PMID:23079542
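The intuition that accuracy hinges on choosing the genetically most similar reference can be sketched with a naive template-copying imputer. This is an illustration only, not the paper's coalescent machinery, and the 0/1 sequences below are invented:

```python
def impute(study, ref_same, ref_other, typed):
    """Naive template imputation in the spirit of the three-sequence
    setting: pick whichever reference matches the study sequence at
    more genotyped markers, then copy its alleles at the remaining
    (untyped) markers and score the proportion imputed correctly."""
    def matches(ref):
        return sum(study[i] == ref[i] for i in typed)
    best = ref_same if matches(ref_same) >= matches(ref_other) else ref_other
    untyped = [i for i in range(len(study)) if i not in typed]
    imputed = {i: best[i] for i in untyped}
    accuracy = sum(imputed[i] == study[i] for i in untyped) / len(untyped)
    return imputed, accuracy

# 0/1 alleles at 8 sites; sites 0, 2, 4, 6 genotyped in the study sample
study     = [0, 1, 1, 0, 0, 1, 1, 0]
ref_same  = [0, 1, 1, 0, 0, 1, 1, 1]   # same population: genetically closer
ref_other = [1, 1, 0, 0, 1, 0, 1, 0]
_, acc = impute(study, ref_same, ref_other, typed={0, 2, 4, 6})
```

Denser genotyping and a higher mutation rate both sharpen the choice of `best`, which mirrors the paper's finding that these factors raise expected accuracy.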

  16. Variability analysis of SAR from 20 MHz to 2.4 GHz for different adult and child models using finite-difference time-domain

    NASA Astrophysics Data System (ADS)

    Conil, E.; Hadjem, A.; Lacroux, F.; Wong, M. F.; Wiart, J.

    2008-03-01

This paper deals with the variability of body models used in numerical dosimetry studies. Six adult anthropomorphic voxel models were collected and used to build 5-, 8- and 12-year-old children using a morphing method that respects anatomical parameters. Finite-difference time-domain calculations of the specific absorption rate (SAR) were performed over a frequency range from 20 MHz to 2.4 GHz for isolated models illuminated by plane waves. The whole-body-averaged SAR is presented, as well as averages over specific tissues such as skin, muscle, fat or bone and over specific parts of the body such as the head, legs, arms or torso. The results point out the variability of the adult models: the standard deviation of their whole-body-averaged SAR can reach 40%. All phantoms were exposed at the ICNIRP reference levels. The results show that for adults, compliance with the reference levels ensures compliance with the basic restrictions; for the child models involved in this study, however, the whole-body-averaged SAR exceeds the fundamental safety limits by up to 40%.
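A whole-body-averaged SAR is simply a mass-weighted average of locally absorbed power. A minimal sketch using the rms-field convention SAR = σE²_rms/ρ, with invented voxel values (real FDTD studies compute the fields themselves on millions of voxels):

```python
def whole_body_sar(voxels):
    """Whole-body-averaged SAR: total absorbed power divided by total
    mass. Each voxel is (sigma [S/m], e_rms [V/m], rho [kg/m^3], v [m^3]);
    local SAR = sigma * E_rms^2 / rho, and absorbed power = SAR * mass."""
    power = sum(s * e * e * v for s, e, _, v in voxels)   # watts
    mass = sum(rho * v for _, _, rho, v in voxels)        # kg
    return power / mass                                   # W/kg

# Two-voxel toy "body": a muscle-like and a fat-like voxel of 1 cm^3 each
voxels = [(0.9, 20.0, 1050.0, 1e-6), (0.05, 25.0, 920.0, 1e-6)]
sar = whole_body_sar(voxels)
```

Tissue- or body-part-specific averages, as reported in the paper, follow by restricting the same sums to the voxels of interest.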

  17. Fuzzy portfolio model with fuzzy-input return rates and fuzzy-output proportions

    NASA Astrophysics Data System (ADS)

    Tsaur, Ruey-Chyn

    2015-02-01

    In the finance market, a short-term investment strategy is usually applied in portfolio selection in order to reduce investment risk; however, the economy is uncertain and the investment period is short. Further, an investor has incomplete information for selecting a portfolio with crisp proportions for each chosen security. In this paper we present a new method of constructing fuzzy portfolio model for the parameters of fuzzy-input return rates and fuzzy-output proportions, based on possibilistic mean-standard deviation models. Furthermore, we consider both excess or shortage of investment in different economic periods by using fuzzy constraint for the sum of the fuzzy proportions, and we also refer to risks of securities investment and vagueness of incomplete information during the period of depression economics for the portfolio selection. Finally, we present a numerical example of a portfolio selection problem to illustrate the proposed model and a sensitivity analysis is realised based on the results.
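The possibilistic mean-standard deviation building blocks can be sketched concretely. The Carlsson-Fullér definitions for triangular fuzzy numbers used below are a common choice but an assumption here, not necessarily the paper's exact formulation:

```python
import math

def poss_mean(a, alpha, beta):
    """Carlsson-Fuller possibilistic mean of a triangular fuzzy number
    with peak a, left spread alpha and right spread beta."""
    return a + (beta - alpha) / 6.0

def poss_var(a, alpha, beta):
    """Possibilistic variance of the same triangular fuzzy number."""
    return (alpha + beta) ** 2 / 24.0

def portfolio_mean_std(fuzzy_returns, weights):
    """Possibilistic mean and standard deviation of a portfolio whose
    return is a weighted sum of triangular fuzzy return rates; the sum
    is again triangular, with spreads combining linearly."""
    mean = sum(w * poss_mean(*r) for r, w in zip(fuzzy_returns, weights))
    alpha = sum(w * r[1] for r, w in zip(fuzzy_returns, weights))
    beta = sum(w * r[2] for r, w in zip(fuzzy_returns, weights))
    return mean, math.sqrt(poss_var(0.0, alpha, beta))

# Two securities with (peak, left spread, right spread) fuzzy return rates
rets = [(0.05, 0.02, 0.03), (0.08, 0.04, 0.04)]
m, s = portfolio_mean_std(rets, [0.6, 0.4])
```

Fuzzy-output proportions, as in the paper, would replace the crisp weights above with fuzzy numbers themselves, which is where the fuzzy constraint on their sum comes into play.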

  18. New York State energy-analytic information system: first-stage implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allentuck, J.; Carroll, O.; Fiore, L.

    1979-09-01

So that energy policy by state government may be formulated within the constraints imposed by policy determined at the national level - yet reflect the diverse interests of its citizens - large quantities of data and sophisticated analytic capabilities are required. This report presents the design of an energy-information/analytic system for New York State, the data for a base year, 1976, and projections of these data. At the county level, 1976 energy-supply demand data and electric generating plant data are provided as well. Data-base management is based on System 2000. Three computerized models provide the system's basic analytic capacity. The Brookhaven Energy System Network Simulator provides an integrating framework while a price-response model and a weather-sensitive energy demand model furnished a short-term energy response estimation capability. The operation of these computerized models is described. 62 references, 25 figures, 39 tables.

  19. Measures and limits of models of fixation selection.

    PubMed

    Wilming, Niklas; Betz, Torsten; Kietzmann, Tim C; König, Peter

    2011-01-01

Models of fixation selection are a central tool in the quest to understand how the human mind selects relevant information. Using this tool in the evaluation of competing claims often requires comparing different models' relative performance in predicting eye movements. However, studies use a wide variety of performance measures with markedly different properties, which makes a comparison difficult. We make three main contributions to this line of research. First, we argue for a set of desirable properties, review commonly used measures, and conclude that no single measure unites all desirable properties. However, the area under the ROC curve (a classification measure) and the KL-divergence (a distance measure between probability distributions) combine many desirable properties and allow a meaningful comparison of critical model performance. We give an analytical proof of the linearity of the ROC measure with respect to averaging over subjects and demonstrate an appropriate correction of entropy-based measures like KL-divergence for small sample sizes in the context of eye-tracking data. Second, we provide a lower bound and an upper bound of these measures, based on image-independent properties of fixation data and between-subject consistency, respectively. Based on these bounds it is possible to give a reference frame to judge the predictive power of a model of fixation selection. We provide open-source Python code to compute the reference frame. Third, we show that the upper, between-subject consistency bound holds only for models that predict averages of subject populations. Departing from this, we show that incorporating subject-specific viewing behavior can generate predictions that surpass that upper bound. Taken together, these findings lay out the information required for a well-founded judgment of the quality of any model of fixation selection and should therefore be reported when a new model is introduced.
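The two recommended measures are easy to state concretely. A minimal sketch of the rank-based AUC and a discretised KL divergence follows; the `eps` guard is a crude stand-in for, not a reproduction of, the small-sample correction the paper develops:

```python
import math

def auc(pos, neg):
    """Area under the ROC curve via the rank-sum (Mann-Whitney)
    identity: the probability that a model's saliency value at a
    fixated location exceeds that at a control location, ties half."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def kl(p, q, eps=1e-12):
    """KL divergence D(p || q) between two discretised fixation maps;
    eps regularises empty bins (a crude small-sample guard only)."""
    return sum(pi * math.log((pi + eps) / (qi + eps))
               for pi, qi in zip(p, q) if pi > 0)

# Saliency values at fixated vs. control locations
pos = [0.9, 0.8, 0.6]
neg = [0.7, 0.4, 0.2]
a = auc(pos, neg)
```

AUC scores a model as a binary classifier of fixated locations, while KL compares whole predicted and observed fixation distributions; the paper's point is that the two answer complementary questions and both should be bounded by the data-driven reference frame.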

  20. Investigation Of Integrating Three-Dimensional (3-D) Geometry Into The Visual Anatomical Injury Descriptor (Visual AID) Using WebGL

    DTIC Science & Technology

    2011-08-01

    generated using the Zygote Human Anatomy 3-D model (3). Use of a reference anatomy independent of personal identification, such as Zygote, allows Visual...Zygote Human Anatomy 3D Model, 2010. http://www.zygote.com/ (accessed July 26, 2011). 4. Khronos Group Web site. Khronos to Create New Open Standard for...understanding of the information at hand. In order to fulfill the medical illustration track, I completed a concentration in science, focusing on human

  1. Current NASA Earth Remote Sensing Observations

    NASA Technical Reports Server (NTRS)

    Luvall, Jeffrey C.; Sprigg, William A.; Huete, Alfredo; Pejanovic, Goran; Nickovic, Slobodan; Ponce-Campos, Guillermo; Krapfl, Heide; Budge, Amy; Zelicoff, Alan; Myers, Orrin; hide

    2011-01-01

This slide presentation reviews current NASA Earth remote sensing observations with specific reference to improving public health information through pollen sensing. While instrumentation for pollen sampling exists, it has limitations, such as a lack of stations and a reporting lag time. It is therefore desirable to use remote sensing as an early warning system for public health. Juniper pollen was chosen to test the possibility of using MODIS data and a dust transport model, the Dust REgional Atmospheric Model (DREAM), as an early warning system.

  2. Development Strategy for Mobilecommunications Market in Chinese Rural Area

    NASA Astrophysics Data System (ADS)

    Zhang, Liwei; Zhang, Yanjun; Xu, Liying; Li, Daoliang

Based on a full analysis of the rural mobile communication market, and in order to explore a sustainable development model for mobile operators' information services in rural areas, this paper examines three aspects: demand in the rural mobile communications market, business models for the rural mobile communications market, and development strategies for the rural mobile communications business. It supplies valuable references for operators seeking to win rural users rapidly, develop the rural market effectively, and open up broad space for growth.

  3. Creating an Electronic Reference and Information Database for Computer-aided ECM Design

    NASA Astrophysics Data System (ADS)

    Nekhoroshev, M. V.; Pronichev, N. D.; Smirnov, G. V.

    2018-01-01

    The paper presents a review on electrochemical shaping. An algorithm has been developed to implement a computer shaping model applicable to pulse electrochemical machining. For that purpose, the characteristics of pulse current occurring in electrochemical machining of aviation materials have been studied. Based on integrating the experimental results and comprehensive electrochemical machining process data modeling, a subsystem for computer-aided design of electrochemical machining for gas turbine engine blades has been developed; the subsystem was implemented in the Teamcenter PLM system.

  4. System requirements specification for SMART structures modules

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Specified here are the functional and informational requirements for software modules which address the geometric and data modeling needs of the aerospace structural engineer. The modules are to be included as part of the Solid Modeling Aerospace Research Tool (SMART) package developed for the Vehicle Analysis Branch (VAB) at the NASA Langley Research Center (LaRC). The purpose is to state precisely what the SMART Structures modules will do, without considering how they will do it. Each requirement is numbered for reference in development and testing.

  5. Using a Hierarchical Approach to Model Regional Source Sink Dynamics for Neotropical Nearctic Songbirds to Inform Management Practices on Department of Defense Installations

    DTIC Science & Technology

    2017-03-20

    comparison with the more intensive demographic study . We found support for spatial variation in productivity at both location and station scales. At location...the larger intensive demographic monitoring study , we also fit a productivity model that included a covariate calculated for the 12 stations included...Reference herein to any specific commercial product , process, or service by trade name, trademark, manufacturer, or otherwise, does not necessarily

  6. Is the Sun Setting on Lecture-based Education?

    PubMed Central

    Lowe, Whitney

    2011-01-01

    Lecture-based instructional models have been the mainstay of education for centuries. They excel primarily at delivering information from the one to the many. Educators refer to this model as “the sage on the stage”. Clearly there are educators who relish this role and are strongly opposed to moving away from it. Yet, educational research and new innovative technologies are suggesting that lecture-based classes may no longer be the most effective teaching method for many situations, especially clinical practice. PMID:22211152

  7. Employment references: walking scared between the minefield of defamation and the specter of negligent hiring.

    PubMed

    McConnell, C R

    2000-12-01

    In present-day reference checking, many of the same organizations that seek as much information as possible about people they wish to hire resist giving out more than a bare minimum of information to other organizations. The strongest force driving this minimal release of reference information is fear of legal action over something said about an individual, that is, defamation. Many employers appear so frightened of being sued for libel or slander that they share nothing of substance, usually not realizing that in trying to protect themselves against defamation charges they are increasing the legal risk associated with negligent hiring charges. However, truthful reference information can be provided with minimal risk if it is given in good faith, shared only with someone who has a legitimate need to know, strictly job-related in character, and not communicated maliciously. Also, reference inquiries must always be answered objectively, with information verifiable from the individual's personnel file.

  8. Long-term Science Data Curation Using a Digital Object Model and Open-Source Frameworks

    NASA Astrophysics Data System (ADS)

    Pan, J.; Lenhardt, W.; Wilson, B. E.; Palanisamy, G.; Cook, R. B.

    2010-12-01

    Scientific digital content, including Earth Science observations and model output, has become more heterogeneous in format and more distributed across the Internet. In addition, data and metadata are becoming necessarily linked internally and externally on the Web. As a result, such content has become more difficult for providers to manage and preserve and for users to locate, understand, and consume. Specifically, it is increasingly harder to deliver relevant metadata and data processing lineage information along with the actual content consistently. Readme files, data quality information, production provenance, and other descriptive metadata are often separated at the storage level as well as in the data search and retrieval interfaces available to a user. Critical archival metadata, such as auditing trails and integrity checks, are often even more difficult for users to access, if they exist at all. We investigate the use of several open-source software frameworks to address these challenges. We use the Fedora Commons Framework and its digital object abstraction as the repository, the Drupal CMS as the user interface, and the Islandora module as the connector from Drupal to the Fedora Repository. With the digital object model, metadata describing the data and its provenance can be associated with data content in a formal manner, as can external references and other auxiliary information. Changes are formally audited on an object, and digital contents are versioned and have checksums automatically computed. Further, relationships among objects are formally expressed with RDF triples. Data replication, recovery, and metadata export are supported with standard protocols, such as OAI-PMH.
We provide a tentative comparative analysis of the chosen software stack with the Open Archival Information System (OAIS) reference model, along with our initial results with the existing terrestrial ecology data collections at NASA’s ORNL Distributed Active Archive Center for Biogeochemical Dynamics (ORNL DAAC).
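The relationship machinery described above can be illustrated with a minimal, library-free sketch: each formal relationship among digital objects is an RDF-style (subject, predicate, object) triple. The identifiers and predicate names below are invented for illustration; Fedora's actual relationship vocabulary differs in detail.

```python
# Library-free sketch: relationships among digital objects expressed as
# RDF-style (subject, predicate, object) triples. All identifiers and
# predicate names are invented for illustration.
dataset = "repo:dataset-42"
readme = "repo:readme-42"
provenance = "repo:prov-42"

triples = {
    (readme, "rel:describes", dataset),
    (provenance, "rel:documentsLineageOf", dataset),
    (dataset, "rel:isVersionOf", "repo:dataset-41"),
}

# Query: every object formally related to the dataset as its description
# or provenance record.
related = sorted(s for (s, p, o) in triples if o == dataset)
print(related)
```

Keeping such relationships as explicit triples is what lets a repository answer "what describes this dataset?" without parsing readme files or directory layouts.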

  9. An ambiguity of information content and error in an ill-posed satellite inversion

    NASA Astrophysics Data System (ADS)

    Koner, Prabhat

    According to Rodgers (2000, stochastic approach), the averaging kernel (AK) is the representational matrix for understanding the information content of a stochastic inversion. In the deterministic approach, the corresponding matrix is referred to as the model resolution matrix (MRM; Menke 1989). Analysis of the AK/MRM can only give some understanding of how much regularization is imposed on the inverse problem. The trace of the AK/MRM matrix gives the so-called degrees of freedom for signal (DFS; stochastic) or degrees of freedom in retrieval (DFR; deterministic). The literature offers no physical or mathematical explanation of why the trace of the matrix is a valid way to calculate this quantity. We will present an ambiguity between information and error using a real-life problem, SST retrieval from GOES13. The stochastic information content calculation is based on a linearity assumption; the validity of such mathematics in satellite inversion will be questioned because the underlying radiative transfer is nonlinear and the inverse problem is ill-conditioned. References: Menke, W., 1989: Geophysical Data Analysis: Discrete Inverse Theory. San Diego: Academic Press. Rodgers, C.D., 2000: Inverse Methods for Atmospheric Sounding: Theory and Practice. Singapore: World Scientific.
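The traced quantity in question can be made concrete with a small numerical sketch of the standard linear-retrieval formulas (Rodgers 2000): the averaging kernel is A = G K with gain G = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1, and DFS = tr(A). The Jacobian and covariances below are random placeholders, not GOES13 values.

```python
import numpy as np

# Illustrative linear retrieval: Jacobian K and covariances are random
# placeholders, not GOES13 quantities.
rng = np.random.default_rng(0)
n_obs, n_state = 4, 3
K = rng.standard_normal((n_obs, n_state))      # measurement Jacobian
Se_inv = np.linalg.inv(0.1 * np.eye(n_obs))    # inverse observation-error covariance
Sa_inv = np.linalg.inv(np.eye(n_state))        # inverse a priori covariance

# Gain matrix and averaging kernel, A = G K (Rodgers 2000).
G = np.linalg.inv(K.T @ Se_inv @ K + Sa_inv) @ K.T @ Se_inv
A = G @ K

dfs = float(np.trace(A))   # degrees of freedom for signal
print(round(dfs, 3))
```

Because the a priori term regularizes the inversion, the eigenvalues of A lie strictly between 0 and 1, so DFS is bounded by the state dimension; the abstract's point is that this trace summarizes regularization without by itself separating information from error.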

  10. Three-Dimensional Computer Simulation as an Important Competence Based Aspect of a Modern Mining Professional

    NASA Astrophysics Data System (ADS)

    Aksenova, Olesya; Pachkina, Anna

    2017-11-01

    The article addresses the need to transform the educational process to meet the requirements of the modern mining industry, including the cooperative development of new educational programs and the implementation of an educational process that takes modern manufacturability into account. The paper argues for introducing into the training of mining professionals the study of three-dimensional models of the surface technological complex, ore reserves, and the underground digging complex, as well as the creation of these models in different graphic editors and work with the information analysis model obtained from these three-dimensional models. It covers the technological process of manless coal mining at the Polysaevskaya mine, which is controlled by information analysis models built from three-dimensional models of individual objects and of the technological process as a whole, and which requires staff able to use three-dimensional positioning programs in the global frame of reference of miners and equipment.

  11. Impacts of suppressing guide on information spreading

    NASA Astrophysics Data System (ADS)

    Xu, Jinghong; Zhang, Lin; Ma, Baojun; Wu, Ye

    2016-02-01

    It is quite common for guides to be introduced to suppress information spreading in modern society for various purposes. In this paper, an agent-based model is established to quantitatively analyze the impacts of suppressing guides on information spreading. We find that, with no suppressing guides at all, the spreading threshold depends on the attractiveness of the information and the topology of the social network. Usually, one would expect the presence of suppressing guides in the spreading procedure to result in less diffusion of information within the overall network. However, we find that sometimes the opposite is true: the manipulating nodes of suppressing guides may lead to more extensive information spreading when there are audiences with a reversal mind. These results can provide valuable theoretical references for guiding public opinion on various kinds of information spreading, e.g., rumors or news.

  12. Model-independent plot of dynamic PET data facilitates data interpretation and model selection.

    PubMed

    Munk, Ole Lajord

    2012-02-21

    When testing new PET radiotracers or new applications of existing tracers, the blood-tissue exchange and the metabolism need to be examined. However, conventional plots of measured time-activity curves from dynamic PET do not reveal the inherent kinetic information. A novel model-independent volume-influx plot (vi-plot) was developed and validated. The new vi-plot shows the time course of the instantaneous distribution volume and the instantaneous influx rate. The vi-plot visualises physiological information that facilitates model selection and reveals when a quasi-steady state is reached, which is a prerequisite for the use of the graphical analyses of Logan and Gjedde-Patlak. Both axes of the vi-plot have a direct physiological interpretation, and the plot shows kinetic parameters in close agreement with estimates obtained by non-linear kinetic modelling. The vi-plot is equally useful for analyses of PET data based on a plasma input function or a reference-region input function. The vi-plot is a model-independent and informative plot for data exploration that facilitates the selection of an appropriate method for data analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. A statistical approach to develop a detailed soot growth model using PAH characteristics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raj, Abhijeet; Celnik, Matthew; Shirley, Raphael

    A detailed PAH growth model is developed, which is solved using a kinetic Monte Carlo algorithm. The model describes the structure and growth of planar PAH molecules, and is referred to as the kinetic Monte Carlo-aromatic site (KMC-ARS) model. A detailed PAH growth mechanism based on reactions at radical sites available in the literature, together with additional reactions obtained from quantum chemistry calculations, is used to model the PAH growth processes. New rates for the reactions involved in the cyclodehydrogenation process for the formation of 6-member rings on PAHs are calculated in this work based on density functional theory simulations. The KMC-ARS model is validated by comparing experimentally observed ensembles of PAHs with the computed ensembles for a C2H2 and a C6H6 flame at different heights above the burner. The motivation for this model is the development of a detailed soot particle population balance model which describes the evolution of an ensemble of soot particles based on their PAH structure. However, at present incorporating such a detailed model into a population balance is computationally unfeasible. Therefore, a simpler model referred to as the site-counting model has been developed, which replaces the structural information of the PAH molecules by their functional groups augmented with statistical closure expressions. This closure is obtained from the KMC-ARS model, which is used to develop correlations and statistics in different flame environments that describe such PAH structural information. These correlations and statistics are implemented in the site-counting model, and results from the site-counting model and the KMC-ARS model are in good agreement. Additionally, the effect of steric hindrance in large PAH structures is investigated and correlations for sites unavailable for reaction are presented.
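The kinetic Monte Carlo machinery underlying such a model can be sketched generically: at each step an event is chosen with probability proportional to its rate, and the clock advances by an exponentially distributed waiting time. The rates below are invented placeholders, not the KMC-ARS chemistry.

```python
import math
import random

def kmc_step(rates, rng):
    """One generic kinetic Monte Carlo step: pick an event with probability
    proportional to its rate, then draw the exponential waiting time."""
    total = sum(rates)
    r = rng.random() * total
    acc = 0.0
    event = len(rates) - 1          # fallback for floating-point edge cases
    for i, k in enumerate(rates):
        acc += k
        if r < acc:
            event = i
            break
    dt = -math.log(1.0 - rng.random()) / total   # exponential waiting time
    return event, dt

rng = random.Random(42)
rates = [2.0, 0.5, 1.5]             # hypothetical per-site reaction rates
event, dt = kmc_step(rates, rng)
print(event, dt)
```

In a PAH-growth setting each rate would correspond to a reaction available at a particular aromatic site, with the rate list rebuilt after every step as the molecule's site population changes.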

  14. Annual Energy Outlook 2016 With Projections to 2040

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    The Annual Energy Outlook 2016 (AEO2016), prepared by the U.S. Energy Information Administration (EIA), presents long-term projections of energy supply, demand, and prices through 2040. The projections, focused on U.S. energy markets, are based on results from EIA’s National Energy Modeling System (NEMS). NEMS enables EIA to make projections under alternative, internally consistent sets of assumptions. The analysis in AEO2016 focuses on the Reference case and 17 alternative cases. EIA published an Early Release version of the AEO2016 Reference case (including the U.S. Environmental Protection Agency’s (EPA) Clean Power Plan (CPP)) and a No CPP case (excluding the CPP) in May 2016.

  15. STAGS Example Problems Manual

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Rankin, Charles C.

    2006-01-01

    This document summarizes the STructural Analysis of General Shells (STAGS) development effort, STAGS performance for selected demonstration problems, and STAGS application problems illustrating selected advanced features available in the STAGS Version 5.0. Each problem is discussed including selected background information and reference solutions when available. The modeling and solution approach for each problem is described and illustrated. Numerical results are presented and compared with reference solutions, test data, and/or results obtained from mesh refinement studies. These solutions provide an indication of the overall capabilities of the STAGS nonlinear finite element analysis tool and provide users with representative cases, including input files, to explore these capabilities that may then be tailored to other applications.

  16. Reference values of clinical chemistry and hematology parameters in rhesus monkeys (Macaca mulatta).

    PubMed

    Chen, Younan; Qin, Shengfang; Ding, Yang; Wei, Lingling; Zhang, Jie; Li, Hongxia; Bu, Hong; Lu, Yanrong; Cheng, Jingqiu

    2009-01-01

    Rhesus monkey models are valuable for studies of human biology. Reference values for clinical chemistry and hematology parameters of rhesus monkeys are required for proper data interpretation. Whole blood was collected from 36 healthy Chinese rhesus monkeys (Macaca mulatta) of either sex, 3 to 5 yr old. Routine chemistry and hematology parameters were tested, along with some special coagulation parameters, including thromboelastograph measurements and the activities of coagulation factors. We present here the baseline values of clinical chemistry and hematology parameters in normal Chinese rhesus monkeys. These data may provide valuable information for veterinarians and investigators using rhesus monkeys in experimental studies.

  17. Quality assurance and stability reference (QUASAR) monitoring concept for calibration/validation

    NASA Astrophysics Data System (ADS)

    Teillet, Philippe M.; Horler, D. N.; O'Neill, Norman T.

    1997-12-01

    The paper introduces the concept that calibration/validation (cal/val) can play an essential role in bringing remote sensing to mainstream consumers in an information-based society, provided that cal/val is an integral part of a quality-assurance strategy. A market model for remote sensing is introduced and used to demonstrate that quality assurance is the key to bridging the gap between early adopters of technology and mainstream markets. The paper goes on to propose the semi-continuous monitoring of quality assurance and stability reference (QUASAR) sites as an important first step towards a cal/val infrastructure beneficial to mainstream users. Prospective QUASAR test sites are described.

  18. Successful Reference Training on a Shoestring.

    ERIC Educational Resources Information Center

    Kalvee, Debbie

    1996-01-01

    The Anchorage Library Information Network created the STAR (State of the Art Reference) group to deliver reference training to Alaskan librarians and information providers. Librarians were trained to run workshops that focused on acquiring communications skills through practice, coaching, and reinforcement on the job. Due to budget constraints,…

  19. 77 FR 56712 - Agency Information Collection (Homeless Providers Grant and Per Diem Program) Activities Under...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-13

    ... DEPARTMENT OF VETERANS AFFAIRS [OMB Control No. 2900-0554] Agency Information Collection (Homeless...-7316. Please refer to ``OMB Control No. 2900-0554'' in any correspondence. FOR FURTHER INFORMATION... . Please refer to ``OMB Control No. 2900-0554.'' SUPPLEMENTARY INFORMATION: Titles: a. Homeless Providers...

  20. Advertising, patient decision making, and self-referral for computed tomographic and magnetic resonance imaging.

    PubMed

    Illes, Judy; Kann, Dylan; Karetsky, Kim; Letourneau, Phillip; Raffin, Thomas A; Schraedley-Desmond, Pamela; Koenig, Barbara A; Atlas, Scott W

    Self-referred imaging is one of the latest health care services to be marketed directly to consumers. Most aspects of these services are unregulated, and little is known about the messages in advertising used to attract potential consumers. We conducted a detailed analysis of print advertisements and informational brochures for self-referred imaging with respect to themes, content, accuracy, and emotional valence. Forty print advertisements from US newspapers around the country and 20 informational brochures were analyzed by 2 independent raters according to 7 major themes: health care technology; emotion, empowerment, and assurance; incentives; limited supporting evidence; popular appeal; statistics; and images. The Fisher exact test was used to identify significant differences in information content. Both the advertisements and the brochures emphasized health care and technology information and provided assurances of good health and incentives to self-refer. These materials also encouraged consumers to seek further information from company resources; virtually none referred to noncomplying sources of information or to the risks of having a scan. Images of people commonly portrayed European Americans. We found statistical differences between newspaper advertisements and mailed brochures for references to "prevalence of disease" (P<.001), "death" (P<.003), and "radiation" (P<.001). Statements lacking clear scientific evidence were identified in 38% of the advertisements (n = 15) and 25% of the brochures (n = 5). Direct-to-consumer marketing of self-referred imaging services, in both print advertisements and informational brochures, fails to provide prospective consumers with comprehensive balanced information vital to informed autonomous decision making. Professional guidelines and oversight for advertising and promotion of these services are needed.
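The Fisher exact test used in the content analysis above can be sketched from first principles for a 2x2 count table; the counts below are hypothetical, not the study's data.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables with the same
    margins that are no more probable than the observed table."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def p_table(x):
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p_table(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Hypothetical counts: 1 of 40 ads vs. 10 of 20 brochures mention a theme.
p = fisher_exact_two_sided(1, 39, 10, 10)
print(round(p, 6))
```

The exact test is appropriate here because, with only 40 advertisements and 20 brochures, several theme categories have small expected counts, where a chi-square approximation would be unreliable.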

  1. A reflection and evaluation model of comparative thinking.

    PubMed

    Markman, Keith D; McMullen, Matthew N

    2003-01-01

    This article reviews research on counterfactual, social, and temporal comparisons and proposes a Reflection and Evaluation Model (REM) as an organizing framework. At the heart of the model is the assertion that 2 psychologically distinct modes of mental simulation operate during comparative thinking: reflection, an experiential ("as if") mode of thinking characterized by vividly simulating that information about the comparison standard is true of, or part of, the self; and evaluation, an evaluative mode of thinking characterized by the use of information about the standard as a reference point against which to evaluate one's present standing. Reflection occurs when information about the standard is included in one's self-construal, and evaluation occurs when such information is excluded. The result of reflection is that standard-consistent cognitions about the self become highly accessible, thereby yielding affective assimilation; whereas the result of evaluation is that comparison information is used as a standard against which one's present standing is evaluated, thereby yielding affective contrast. The resulting affect leads to either an increase or decrease in behavioral persistence as a function of the type of task with which one is engaged, and a combination of comparison-derived causal inferences and regulatory focus strategies direct one toward adopting specific future action plans.

  2. Automatic summary generating technology of vegetable traceability for information sharing

    NASA Astrophysics Data System (ADS)

    Zhenxuan, Zhang; Minjing, Peng

    2017-06-01

    To address the problems of excessive data entry and the consequent high costs of data collection that farmers face in vegetable traceability applications, an automatic summary generating technology for vegetable traceability information sharing is proposed. The proposed technology is an effective way for farmers to share real-time vegetable planting information on social networking platforms to enhance their brands and attract more customers. In this research, the factors influencing vegetable traceability for customers were analyzed to establish the sub-indicators and target indicators, and a computing model was proposed based on the collected parameter values of the planted vegetables and the applicable food safety standards and laws. The proposed standard parameter model involves five steps: accessing the database, establishing target indicators, establishing sub-indicators, establishing the standard reference model, and computing indicator scores. By building on and refining the standards for food safety and traceability systems, the proposed technology could be accepted by more and more farmers and customers.
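A minimal sketch of the indicator-scoring idea, with invented parameter names and limits: each sub-indicator scores a measured value against a standard reference range, and a target indicator aggregates the sub-indicator scores.

```python
# Hypothetical sketch of the indicator-scoring idea: each sub-indicator is a
# measured parameter scored against a standard reference range, and target
# indicators aggregate the sub-indicator scores. Names and limits are invented.
standards = {
    "pesticide_residue_mg_kg": (0.0, 0.5),
    "heavy_metal_mg_kg": (0.0, 0.1),
}
measured = {
    "pesticide_residue_mg_kg": 0.2,
    "heavy_metal_mg_kg": 0.02,
}

def sub_score(value, limits):
    lo, hi = limits
    if not (lo <= value <= hi):
        return 0.0                          # violates the standard outright
    return 1.0 - (value - lo) / (hi - lo)   # closer to the limit, lower score

scores = {k: sub_score(measured[k], standards[k]) for k in standards}
target = sum(scores.values()) / len(scores)  # equal-weight target indicator
print(round(target, 2))
```

A real model would weight sub-indicators by regulatory importance rather than equally; the equal weighting here is purely for illustration.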

  3. Shuttle/spacelab contamination environment and effects handbook

    NASA Technical Reports Server (NTRS)

    Bareiss, L. E.; Payton, R. M.; Papazian, H. A.

    1986-01-01

    This handbook is intended to assist users of the Spacelab/Space Transportation System by providing contamination environments and effects information that may be of value in planning, designing, manufacturing, and operating a space flight experiment. A summary of available molecular and particulate contamination data on the Space Transportation System and its facilities is presented. Contamination models, contamination effects, and protection methods information are also presented. In addition to contamination, the effects of the space environments at STS altitudes on spacecraft materials are included. Extensive references, bibliographies, and contacts are provided.

  4. Towards Precise Metadata-set for Discovering 3D Geospatial Models in Geo-portals

    NASA Astrophysics Data System (ADS)

    Zamyadi, A.; Pouliot, J.; Bédard, Y.

    2013-09-01

    Accessing 3D geospatial models, eventually at no cost and for unrestricted use, is certainly an important issue as they become popular among participatory communities, consultants, and officials. Various geo-portals, mainly established for 2D resources, have tried to provide access to existing 3D resources such as digital elevation models, LIDAR, or classic topographic data. Describing the content of data, metadata is a key component of data discovery in geo-portals. An inventory of seven online geo-portals and commercial catalogues shows that the metadata referring to 3D information differs greatly from one geo-portal to another, and even for similar 3D resources within the same geo-portal. The inventory considered 971 data resources associated with elevation. 51% of them were from three geo-portals running at the Canadian federal and municipal levels whose metadata did not consider 3D models under any definition. Among the remaining 49%, which do refer to 3D models, different definitions of terms and metadata were found, resulting in confusion and misinterpretation. The overall assessment of these geo-portals clearly shows that the provided metadata do not integrate specific and common information about 3D geospatial models. Accordingly, the main objective of this research is to improve 3D geospatial model discovery in geo-portals by adding a specific metadata-set. Based on the knowledge and current practices in 3D modeling and in 3D data acquisition and management, a set of metadata is proposed to increase its suitability for 3D geospatial models. This metadata-set enables the definition of genuine classes, fields, and code-lists for a 3D metadata profile. The main structure of the proposal contains 21 metadata classes, classified in three packages: General and Complementary, on contextual and structural information, and Availability, on the transition from storage to delivery format.
The proposed metadata-set is compared with the Canadian Geospatial Data Infrastructure (CGDI) metadata, an implementation of the North American Profile of ISO 19115. The comparison analyzes the two metadata sets against three simulated scenarios for discovering needed 3D geospatial datasets. With respect to metadata specific to 3D geospatial models, the proposed metadata-set has six additional classes covering geometric dimension, level of detail, geometric modeling, topology, and appearance information. In addition, classes on data acquisition, preparation, and modeling, and on physical availability, have been specialized for 3D geospatial models.

  5. An evaluation of selected herbal reference texts and comparison to published reports of adverse herbal events.

    PubMed

    Haller, Christine A; Anderson, Ilene B; Kim, Susan Y; Blanc, Paul D

    2002-01-01

    There has been a recent proliferation of medical reference texts intended to guide practitioners whose patients use herbal therapies. We systematically assessed six herbal reference texts to evaluate the information they contain on herbal toxicity. We selected six major herbal references published from 1996 to 2000 to evaluate the adequacy of their toxicological information in light of published adverse events. To identify herbs most relevant to toxicology, we reviewed herbal-related calls to our regional California Poison Control System, San Francisco division (CPCS-SF) in 1998 and identified the 12 herbs (defined as botanical dietary supplements) most frequently involved in these CPCS-SF referrals. We searched Medline (1966 to 2000) to identify published reports of adverse effects potentially related to these same 12 herbs. We scored each herbal reference text on the basis of information inclusiveness for the target 12 herbs, with a maximal overall score of 3. The herbs, identified on the basis of CPCS-SF call frequency were: St John's wort, ma huang, echinacea, guarana, ginkgo, ginseng, valerian, tea tree oil, goldenseal, arnica, yohimbe and kava kava. The overall herbal reference scores ranged from 2.2 to 0.4 (median 1.1). The Natural Medicines Comprehensive Database received the highest overall score and was the most complete and useful reference source. All of the references, however, lacked sufficient information on management of herbal medicine overdose, and several had incorrect overdose management guidelines that could negatively impact patient care. Current herbal reference texts do not contain sufficient information for the assessment and management of adverse health effects of botanical therapies.

  6. Word learning emerges from the interaction of online referent selection and slow associative learning

    PubMed Central

    McMurray, Bob; Horst, Jessica S.; Samuelson, Larissa K.

    2013-01-01

    Classic approaches to word learning emphasize the problem of referential ambiguity: in any naming situation the referent of a novel word must be selected from many possible objects, properties, actions, etc. To solve this problem, researchers have posited numerous constraints, and inference strategies, but assume that determining the referent of a novel word is isomorphic to learning. We present an alternative model in which referent selection is an online process that is independent of long-term learning. This two timescale approach creates significant power in the developing system. We illustrate this with a dynamic associative model in which referent selection is simulated as dynamic competition between competing referents, and learning is simulated using associative (Hebbian) learning. This model can account for a range of findings including the delay in expressive vocabulary relative to receptive vocabulary, learning under high degrees of referential ambiguity using cross-situational statistics, accelerating (vocabulary explosion) and decelerating (power-law) learning rates, fast-mapping by mutual exclusivity (and differences in bilinguals), improvements in familiar word recognition with development, and correlations between individual differences in speed of processing and learning. Five theoretical points are illustrated. 1) Word learning does not require specialized processes – general association learning buttressed by dynamic competition can account for much of the literature. 2) The processes of recognizing familiar words are not different than those that support novel words (e.g., fast-mapping). 3) Online competition may allow the network (or child) to leverage information available in the task to augment performance or behavior despite what might be relatively slow learning or poor representations. 
4) Even associative learning is more complex than previously thought – a major contributor to performance is the pruning of incorrect associations between words and referents. 5) Finally, the model illustrates that learning and referent selection/word recognition, though logically distinct, can be deeply and subtly related as phenomena like speed of processing and mutual exclusivity may derive in part from the way learning shapes the system. As a whole, this suggests more sophisticated ways of describing the interaction between situation- and developmental-time processes and points to the need for considering such interactions as a primary determinant of development and processing in children. PMID:23088341
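The two-timescale idea, fast online referent selection by competition plus slow associative learning, can be sketched in a toy simulation. The vocabulary size, learning rate, and noise level below are arbitrary choices, not the paper's parameters.

```python
import numpy as np

# Toy two-timescale sketch: fast referent selection by online competition,
# slow Hebbian co-occurrence learning. Sizes, rates, and noise are arbitrary.
rng = np.random.default_rng(1)
n_words = n_objects = 3
W = np.full((n_words, n_objects), 0.1)   # word-object association strengths
eta = 0.1                                # slow associative learning rate

def select_referent(word, present):
    """Online competition: activation is association strength plus a little
    noise, restricted to the objects actually present in the scene."""
    act = W[word] + 0.01 * rng.random(n_objects)
    mask = np.full(n_objects, -np.inf)
    for obj in present:
        mask[obj] = 0.0
    return int(np.argmax(act + mask))

# Cross-situational exposures: word i always co-occurs with object i, plus
# one random distractor (referential ambiguity on every trial).
for _ in range(300):
    word = int(rng.integers(n_words))
    scene = {word, int(rng.integers(n_objects))}
    for obj in scene:                    # slow Hebbian update for co-present objects
        W[word, obj] += eta

print(np.argmax(W, axis=1))              # learned word-to-referent mapping
print(select_referent(0, {0, 1, 2}))
```

Because the correct referent co-occurs on every exposure while any given distractor appears only sometimes, the correct association accumulates fastest, which is the cross-situational-statistics point made in the abstract.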

  7. Data requirements for simulation of hydrogeologic effects of liquid waste injection, Harrison and Jackson Counties, Mississippi

    USGS Publications Warehouse

    Rebich, Richard A.

    1994-01-01

    Available literature and data were reviewed to quantify data requirements for computer simulation of hydrogeologic effects of liquid waste injection in southeastern Mississippi. Emphasis of each review was placed on quantifying physical properties of current Class I injection zones in Harrison and Jackson Counties. Class I injection zones are zones that are used for injection of hazardous or non-hazardous liquid waste below a formation containing the lowermost underground source of drinking water located within one-quarter of a mile of the injection well. Several mathematical models have been developed to simulate injection effects. The Basic Plume Method was selected because it is commonly used in permit applications, and the Intercomp model was selected because it is generally accepted and used in injection-related research. The input data requirements of the two models were combined into a single data requirement list inclusive of physical properties of injection zones only; injected waste and well properties are not included because such information is site-specific by industry, which is beyond the scope of this report. Results of the reviews of available literature and data indicated that Class I permit applications and standard-reference chemistry and physics texts were the primary sources of information to quantify physical properties of injection zones in Harrison and Jackson Counties. With the exception of a few reports and supplementary data for one injection zone in Jackson County, very little additional information pertaining to physical properties of the injection zones was available in sources other than permit applications and standard-reference texts.

  8. 12 CFR 404.3 - Public reference facilities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Public reference facilities. 404.3 Section 404.3 Banks and Banking EXPORT-IMPORT BANK OF THE UNITED STATES INFORMATION DISCLOSURE Procedures for Disclosure of Records Under the Freedom of Information Act. § 404.3 Public reference facilities. Ex-Im Bank...

  9. Thermal quantum coherence and correlation in the extended XY spin chain

    NASA Astrophysics Data System (ADS)

    Sha, Ya-Ting; Wang, Yue; Sun, Zheng-Hang; Hou, Xi-Wen

    2018-05-01

Quantum coherence and correlation of thermal states in the extended XY spin chain are studied in terms of the recently proposed l1 norm, skew information, and Bures distance of geometry discord (BGD), respectively. The entanglement measured via concurrence is calculated for reference. A two-dimensional susceptibility is introduced to explore their capability in highlighting the critical lines associated with quantum phase transitions in the model. It is shown that the susceptibility of the skew information and BGD is a genuine indicator of quantum phase transitions, and characterizes the factorization, whereas the l1 norm is trivial for the factorization. An explicit scaling law of BGD is captured at low temperature in the XY model. In contrast to the entanglement, quantum coherence reveals a kind of long-range nonclassical correlation. Moreover, an explicit relation among the model parameters is extracted for the factorized line in the extended model. These results are instructive for understanding quantum coherence and correlation in quantum information theory, and quantum phase transitions and factorization in condensed-matter physics.

  10. The modular modality frame model: continuous body state estimation and plausibility-weighted information fusion.

    PubMed

    Ehrenfeld, Stephan; Butz, Martin V

    2013-02-01

Humans show admirable capabilities in movement planning and execution. They can perform complex tasks in various contexts, using the available sensory information very effectively. Body models and continuous body state estimations appear necessary to realize such capabilities. We introduce the Modular Modality Frame (MMF) model, which maintains a highly distributed, modularized body model and continuously updates modularized probabilistic body state estimations over time. Modularization is realized with respect to modality frames, that is, sensory modalities in particular frames of reference, and with respect to particular body parts. We evaluate MMF performance on a simulated, nine-degree-of-freedom arm in 3D space. The results show that MMF is able to maintain accurate body state estimations despite high sensor and motor noise. Moreover, by comparing the sensory information available in different modality frames, MMF can identify faulty sensory measurements on the fly. In the near future, applications to lightweight robot control should be pursued. Moreover, MMF may be enhanced with neural encodings by introducing neural population codes and learning techniques. Finally, more dexterous goal-directed behavior should be realized by exploiting the available redundant state representations.

  11. Optimal design of focused experiments and surveys

    NASA Astrophysics Data System (ADS)

    Curtis, Andrew

    1999-10-01

Experiments and surveys are often performed to obtain data that constrain some previously underconstrained model. Often, constraints are most desired in a particular subspace of model space. Experiment design optimization requires that the quality of any particular design can be both quantified and then maximized. This study shows how the quality can be defined such that it depends on the amount of information that is focused in the particular subspace of interest. In addition, algorithms are presented which allow one particular focused quality measure (from the class of focused measures) to be evaluated efficiently. A subclass of focused quality measures is also related to the standard variance and resolution measures from linearized inverse theory. The theory presented here requires that the relationship between model parameters and data can be linearized around a reference model without significant loss of information. Physical and financial constraints define the space of possible experiment designs. Cross-well tomographic examples are presented, plus a strategy for survey design to maximize information about linear combinations of parameters such as bulk modulus, κ = λ + 2μ/3.
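The idea of a focused quality measure can be illustrated with a toy linearized example. The sketch below is an invented illustration, not the paper's algorithm: the matrices `G_a`, `G_b`, the damping `eps`, and the projection `P` are assumptions for the example. Quality is taken as the negative posterior variance projected onto the subspace of interest, here the bulk modulus κ = λ + 2μ/3 as a linear combination of the Lamé parameters.

```python
import numpy as np

def focused_quality(G, P, eps=1e-3):
    # Toy focused quality: negative trace of the damped posterior
    # covariance projected onto the subspace of interest. G is the
    # linearized sensitivity matrix of a candidate design; the rows of
    # P span the model subspace where constraints are desired.
    cov = np.linalg.inv(G.T @ G + eps * np.eye(G.shape[1]))
    return -np.trace(P @ cov @ P.T)

# Subspace of interest: bulk modulus kappa = lam + (2/3) * mu, a linear
# combination of the Lame parameters (lam, mu).
P = np.array([[1.0, 2.0 / 3.0]])
G_a = np.array([[1.0, 0.0], [0.0, 0.1]])         # design A: barely senses mu
G_b = np.array([[1.0, 2.0 / 3.0], [0.5, -0.2]])  # design B: senses kappa directly
better = "B" if focused_quality(G_b, P) > focused_quality(G_a, P) else "A"
```

Under this toy measure, design B is preferred: it constrains the κ combination directly even though neither design resolves both parameters equally well.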

  12. A geographic information system-based 3D city estate modeling and simulation system

    NASA Astrophysics Data System (ADS)

    Chong, Xiaoli; Li, Sha

    2015-12-01

This paper introduces a 3D city simulation system based on a geographic information system (GIS), covering all commercial housing in the city. A regional-scale, GIS-based approach is used to capture, describe, and track the geographical attributes of each house in the city. A sorting algorithm of "Benchmark + Parity Rate" is developed to cluster houses with similar spatial and construction attributes. The system is applicable to digital city modeling, city planning, housing evaluation, housing monitoring, and visualizing housing transactions. Finally, taking the Jingtian area of Shenzhen as an example, each unit of the 35,997 houses in the area can be displayed, tagged, and easily tracked by the GIS-based city modeling and simulation system. The model matches market conditions well and can serve as a reference for house buyers.

  13. Instructional Media Production for Early Childhood Education: A. B. C. Jig-Saw Puzzle, a Model

    ERIC Educational Resources Information Center

    Yusuf, Mudashiru Olalere; Olanrewaju, Olatayo Solomon; Soetan, Aderonke K.

    2015-01-01

In this paper, an a. b. c. jig-saw puzzle was produced for early childhood education using local materials. This study was production-based research, intended to serve as a supplemental or total learning resource. Its production followed four phases of development, referred to as information, design, production, and evaluation. The storyboard cards,…

  14. 75 FR 5821 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-04

    ... Exchange proposes to clarify Nasdaq Rule 7023 to make clear that Historical ModelView information will be available via NasdaqTrader.com and that references to the Historical TotalView data product will be deleted... text of the proposed rule change is available from Nasdaq's Web site at http://nasdaq.cchwallstreet.com...

  15. 78 FR 32078 - Special Conditions: Gulfstream Model G280 Airplane, Enhanced Flight Vision System (EFVS) With...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-29

    ... document refers to a system comprised of a head-up display, imaging sensor(s), and avionics interfaces that display the sensor imagery on the HUD, and which overlay that imagery with alpha-numeric and symbolic... the sensor imagery, with or without other flight information, on a head-down display. For clarity, the...

  16. Using the Activity Model of Inquiry to Enhance General Chemistry Students' Understanding of Nature of Science

    ERIC Educational Resources Information Center

    Marchlewicz, Sara C.; Wink, Donald J.

    2011-01-01

    Nature of science refers to the processes of scientific activity and the social and cultural premises involved in the creation of scientific knowledge. Having an informed view of nature of science is important in the development of scientifically literate citizens. However, students often come to the classroom with misconceptions about nature of…

  17. Application Level Protocol Development for Library and Information Science Applications. Volume 1: Service Definition. Volume 2: Protocol Specification. Report No. TG.1.5; TG.50.

    ERIC Educational Resources Information Center

    Aagaard, James S.; And Others

    This two-volume document specifies a protocol that was developed using the Reference Model for Open Systems Interconnection (OSI), which provides a framework for communications within a heterogeneous network environment. The protocol implements the features necessary for bibliographic searching, record maintenance, and mail transfer between…

  18. [The significance of introducing registry study in the post-marketing safety research for Chinese medicine and pharmacy].

    PubMed

    Liao, Xing; Xie, Yan-Ming; Yang, Wei; Chang, Yan-Peng

    2014-03-01

'Registry study' (patient registry) is a research model in Western medicine that Chinese medicine researchers could draw on, for example for active safety surveillance. This article introduces registry studies from several aspects, including their development history, features, and applications, in order to inform future studies by Chinese medicine researchers.

  19. Developing Competencies in the Entrepreneurial Small Firm for Use of the Internet in the Management of Customer Relations.

    ERIC Educational Resources Information Center

    McGowan, Pauric; Durkin, Mark G.; Allen, Lynsey; Dougan, Colette; Nixon, Sheena

    2001-01-01

    Using an adoption model depicting levels of awareness and value regarding the Internet, interviews with 25 entrepreneurs found the predominant reason for Internet adoption was information gathering and provision. Only one used it as a customer relations tool. Key competencies to enhance Internet use were identified. (Contains 56 references.) (SK)

  20. What Seams Do We Remove in Mobile-Assisted Seamless Learning? A Critical Review of the Literature

    ERIC Educational Resources Information Center

    Wong, Lung-Hsiang; Looi, Chee-Kit

    2011-01-01

    Seamless learning refers to the seamless integration of the learning experiences across various dimensions including formal and informal learning contexts, individual and social learning, and physical world and cyberspace. Inspired by the exposition by Chan et al. (2006) on the seamless learning model supported by the setting of one or more mobile…

  1. Four-Year-Olds Use a Mixture of Spatial Reference Frames

    PubMed Central

    Negen, James; Nardini, Marko

    2015-01-01

Keeping track of unseen objects is an important spatial skill. In order to do this, people must situate the object in terms of different frames of reference, including body position (egocentric frame of reference), landmarks in the surrounding environment (extrinsic frame of reference), or other attached features (intrinsic frame of reference). Nardini et al. hid a toy in one of 12 cups in front of children, turned the array when they were not looking, and then asked them to point to the cup with the toy. This forced children to use the intrinsic frame (information about the array of cups) to locate the hidden toy. Three-year-olds made systematic errors by using the wrong frame of reference, 4-year-olds were at chance, and only 5- and 6-year-olds were successful. Can we better understand the developmental change that takes place at four years? This paper uses a modelling approach to re-examine the data and distinguish three possible strategies that could lead to the previous results at four years: (1) Children were choosing cups randomly, (2) Children were pointing between the egocentric/extrinsic-cued location and the correct target, and (3) Children were pointing near the egocentric/extrinsic-cued location on some trials and near the target on the rest. Results heavily favor the last possibility: 4-year-olds were not just guessing or trying to combine the available frames of reference. They were using the intrinsic frame on some trials, but not doing so consistently. These insights suggest that accounts of improving spatial performance at 4 years need to explain why there is a mixture of responses. Further application of the selected model also suggests that children become both more reliant on the correct frame and more accurate with any chosen frame as they mature. PMID:26133990
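The strategy comparison in this abstract can be caricatured with a deliberately simplified toy model. Everything below is invented for illustration (cup indices, trial count, mixing weight); the paper's actual models were fit to children's pointing data. The sketch contrasts strategy 1 (random guessing) with strategy 3 (a trial-by-trial mixture between the cued cup and the target) via AIC.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cups, n_trials = 12, 200
TARGET, CUED = 3, 9   # correct (intrinsic) cup vs. egocentric/extrinsic-cued cup

# Simulate a child who uses the intrinsic frame on 40% of trials
# (strategy 3: a trial-by-trial mixture, not in-between pointing).
p_true = 0.4
choices = np.where(rng.random(n_trials) < p_true, TARGET, CUED)

# Strategy 1: random guessing over all cups.
ll_random = n_trials * np.log(1.0 / n_cups)

# Strategy 3: mixture between target and cued cup; the mixing weight p
# has a closed-form maximum-likelihood estimate in this toy setup.
p_hat = np.mean(choices == TARGET)
ll_mix = (np.sum(choices == TARGET) * np.log(p_hat)
          + np.sum(choices == CUED) * np.log(1.0 - p_hat))

aic_random = -2 * ll_random        # no free parameters
aic_mix = -2 * ll_mix + 2          # one free parameter (p)
preferred = "mixture" if aic_mix < aic_random else "random"
```

On data generated by a mixture strategy, the mixture model wins the AIC comparison decisively, mirroring the paper's conclusion that responses at four years are a mixture rather than guesses.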

  2. Application of geo-spatial technology in schistosomiasis modelling in Africa: a review.

    PubMed

    Manyangadze, Tawanda; Chimbari, Moses John; Gebreslasie, Michael; Mukaratirwa, Samson

    2015-11-04

    Schistosomiasis continues to impact socio-economic development negatively in sub-Saharan Africa. The advent of spatial technologies, including geographic information systems (GIS), Earth observation (EO) and global positioning systems (GPS) assist modelling efforts. However, there is increasing concern regarding the accuracy and precision of the current spatial models. This paper reviews the literature regarding the progress and challenges in the development and utilization of spatial technology with special reference to predictive models for schistosomiasis in Africa. Peer-reviewed papers identified through a PubMed search using the following keywords: geo-spatial analysis OR remote sensing OR modelling OR earth observation OR geographic information systems OR prediction OR mapping AND schistosomiasis AND Africa were used. Statistical uncertainty, low spatial and temporal resolution satellite data and poor validation were identified as some of the factors that compromise the precision and accuracy of the existing predictive models. The need for high spatial resolution of remote sensing data in conjunction with ancillary data viz. ground-measured climatic and environmental information, local presence/absence intermediate host snail surveys as well as prevalence and intensity of human infection for model calibration and validation are discussed. The importance of a multidisciplinary approach in developing robust, spatial data capturing, modelling techniques and products applicable in epidemiology is highlighted.

  3. A Collaborative Secure Localization Algorithm Based on Trust Model in Underwater Wireless Sensor Networks

    PubMed Central

    Han, Guangjie; Liu, Li; Jiang, Jinfang; Shu, Lei; Rodrigues, Joel J.P.C.

    2016-01-01

Localization is one of the hottest research topics in Underwater Wireless Sensor Networks (UWSNs), since many important applications of UWSNs, e.g., event sensing, target tracking and monitoring, require location information of sensor nodes. Nowadays, a large number of localization algorithms have been proposed for UWSNs, and how to improve location accuracy has been well studied. However, few of them take location reliability or security into consideration. In this paper, we propose a Collaborative Secure Localization algorithm based on a Trust model (CSLT) for UWSNs to ensure location security. Based on the trust model, the secure localization process can be divided into the following five sub-processes: trust evaluation of anchor nodes, initial localization of unknown nodes, trust evaluation of reference nodes, selection of reference nodes, and secondary localization of unknown nodes. Simulation results demonstrate that the proposed CSLT algorithm performs better than the compared related works in terms of location security, average localization accuracy and localization ratio. PMID:26891300
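To give a flavor of why trust scores help localization, here is a generic trust-weighted multilateration sketch. This is not the paper's CSLT algorithm (which involves the five sub-processes above); it is a standard linearized least-squares trilateration, with each equation weighted by invented trust scores, showing how down-weighting a low-trust anchor limits the damage from a corrupted range.

```python
import numpy as np

def trust_weighted_position(anchors, ranges, trust):
    # Generic trust-weighted 2D multilateration (NOT the paper's CSLT):
    # linearize the range equations against the last anchor and solve a
    # weighted least-squares problem, weighting each equation by the
    # trust scores of the anchors involved.
    A = 2.0 * (anchors[:-1] - anchors[-1])
    sq = np.sum(anchors ** 2, axis=1)
    b = ranges[-1] ** 2 - ranges[:-1] ** 2 + sq[:-1] - sq[-1]
    w = trust[:-1] * trust[-1]
    return np.linalg.solve(A.T @ (w[:, None] * A), A.T @ (w * b))

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_pos = np.array([4.0, 6.0])
ranges = np.linalg.norm(anchors - true_pos, axis=1)
ranges[0] += 3.0                          # anchor 0 reports a corrupted range
trust = np.array([0.1, 1.0, 1.0, 1.0])    # ...and has a low trust score
est_trusted = trust_weighted_position(anchors, ranges, trust)
est_uniform = trust_weighted_position(anchors, ranges, np.ones(4))
```

With the corrupted anchor down-weighted, the position estimate stays close to the true location, whereas uniform weighting lets the bad range pull the solution away.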

  4. Inferential revision in narrative texts: An ERP study.

    PubMed

    Pérez, Ana; Cain, Kate; Castellanos, María C; Bajo, Teresa

    2015-11-01

    We evaluated the process of inferential revision during text comprehension in adults. Participants with high or low working memory read short texts, in which the introduction supported two plausible concepts (e.g., 'guitar/violin'), although one was more probable ('guitar'). There were three possible continuations: a neutral sentence, which did not refer back to either concept; a no-revise sentence, which referred to a general property consistent with either concept (e.g., '…beautiful curved body'); and a revise sentence, which referred to a property that was consistent with only the less likely concept (e.g., '…matching bow'). Readers took longer to read the sentence in the revise condition, indicating that they were able to evaluate their comprehension and detect a mismatch. In a final sentence, a target noun referred to the alternative concept supported in the revise condition (e.g., 'violin'). ERPs indicated that both working memory groups were able to evaluate their comprehension of the text (P3a), but only high working memory readers were able to revise their initial incorrect interpretation (P3b) and integrate the new information (N400) when reading the revise sentence. Low working memory readers had difficulties inhibiting the no-longer-relevant interpretation and thus failed to revise their situation model, and they experienced problems integrating semantically related information into an accurate memory representation.

  5. Comparing estimates of genetic variance across different relationship models.

    PubMed

    Legarra, Andres

    2016-02-01

    Use of relationships between individuals to estimate genetic variances and heritabilities via mixed models is standard practice in human, plant and livestock genetics. Different models or information for relationships may give different estimates of genetic variances. However, comparing these estimates across different relationship models is not straightforward as the implied base populations differ between relationship models. In this work, I present a method to compare estimates of variance components across different relationship models. I suggest referring genetic variances obtained using different relationship models to the same reference population, usually a set of individuals in the population. Expected genetic variance of this population is the estimated variance component from the mixed model times a statistic, Dk, which is the average self-relationship minus the average (self- and across-) relationship. For most typical models of relationships, Dk is close to 1. However, this is not true for very deep pedigrees, for identity-by-state relationships, or for non-parametric kernels, which tend to overestimate the genetic variance and the heritability. Using mice data, I show that heritabilities from identity-by-state and kernel-based relationships are overestimated. Weighting these estimates by Dk scales them to a base comparable to genomic or pedigree relationships, avoiding wrong comparisons, for instance, "missing heritabilities". Copyright © 2015 Elsevier Inc. All rights reserved.
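The Dk statistic described in this abstract is simple to compute from a relationship matrix. A minimal sketch, with a hypothetical relationship matrix `K` and variance component `sigma2_hat` invented for illustration:

```python
import numpy as np

def dk_statistic(K):
    # Dk = average self-relationship minus average (self- and across-)
    # relationship over the chosen reference set of individuals.
    return np.mean(np.diag(K)) - np.mean(K)

# Hypothetical relationship matrix for four reference individuals
# (numbers invented for illustration).
K = np.array([
    [1.00, 0.10, 0.05, 0.02],
    [0.10, 1.05, 0.08, 0.03],
    [0.05, 0.08, 0.98, 0.12],
    [0.02, 0.03, 0.12, 1.02],
])
sigma2_hat = 2.5                            # variance component from the mixed model
sigma2_ref = dk_statistic(K) * sigma2_hat   # expected genetic variance in this base
```

Scaling each model's variance estimate by its own Dk refers all estimates to the same reference population, which is what makes them comparable across relationship models.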

  6. An experimental loop design for the detection of constitutional chromosomal aberrations by array CGH

    PubMed Central

    2009-01-01

    Background Comparative genomic hybridization microarrays for the detection of constitutional chromosomal aberrations is the application of microarray technology coming fastest into routine clinical application. Through genotype-phenotype association, it is also an important technique towards the discovery of disease causing genes and genomewide functional annotation in human. When using a two-channel microarray of genomic DNA probes for array CGH, the basic setup consists in hybridizing a patient against a normal reference sample. Two major disadvantages of this setup are (1) the use of half of the resources to measure a (little informative) reference sample and (2) the possibility that deviating signals are caused by benign copy number variation in the "normal" reference instead of a patient aberration. Instead, we apply an experimental loop design that compares three patients in three hybridizations. Results We develop and compare two statistical methods (linear models of log ratios and mixed models of absolute measurements). In an analysis of 27 patients seen at our genetics center, we observed that the linear models of the log ratios are advantageous over the mixed models of the absolute intensities. Conclusion The loop design and the performance of the statistical analysis contribute to the quick adoption of array CGH as a routine diagnostic tool. They lower the detection limit of mosaicisms and improve the assignment of copy number variation for genetic association studies. PMID:19925645

  7. An experimental loop design for the detection of constitutional chromosomal aberrations by array CGH.

    PubMed

    Allemeersch, Joke; Van Vooren, Steven; Hannes, Femke; De Moor, Bart; Vermeesch, Joris Robert; Moreau, Yves

    2009-11-19

    Comparative genomic hybridization microarrays for the detection of constitutional chromosomal aberrations is the application of microarray technology coming fastest into routine clinical application. Through genotype-phenotype association, it is also an important technique towards the discovery of disease causing genes and genomewide functional annotation in human. When using a two-channel microarray of genomic DNA probes for array CGH, the basic setup consists in hybridizing a patient against a normal reference sample. Two major disadvantages of this setup are (1) the use of half of the resources to measure a (little informative) reference sample and (2) the possibility that deviating signals are caused by benign copy number variation in the "normal" reference instead of a patient aberration. Instead, we apply an experimental loop design that compares three patients in three hybridizations. We develop and compare two statistical methods (linear models of log ratios and mixed models of absolute measurements). In an analysis of 27 patients seen at our genetics center, we observed that the linear models of the log ratios are advantageous over the mixed models of the absolute intensities. The loop design and the performance of the statistical analysis contribute to the quick adoption of array CGH as a routine diagnostic tool. They lower the detection limit of mosaicisms and improve the assignment of copy number variation for genetic association studies.
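The loop design's linear model of log ratios can be sketched in a few lines. This is a toy illustration, not the authors' full pipeline: three two-channel arrays each measure a log ratio between two of the three patients at a probe, and a least-squares fit with a sum-to-zero identifiability constraint recovers each patient's relative effect (the log-ratio values are invented).

```python
import numpy as np

# Loop design over three patients (A, B, C): each array yields a log
# ratio between two patients at a probe; a linear model of the log
# ratios recovers each patient's relative effect.
X = np.array([
    [1.0, -1.0, 0.0],   # array 1: patient A vs patient B
    [0.0, 1.0, -1.0],   # array 2: patient B vs patient C
    [-1.0, 0.0, 1.0],   # array 3: patient C vs patient A
    [1.0, 1.0, 1.0],    # sum-to-zero identifiability constraint
])
# Suppose patient A carries an aberration at this probe (invented data):
y = np.array([1.05, -0.03, -0.98, 0.0])  # three noisy log ratios + constraint
est, *_ = np.linalg.lstsq(X, y, rcond=None)
outlier_patient = "ABC"[int(np.argmax(est))]
```

Note how every hybridization compares two patients, so no channel is spent on a reference sample, and a deviating signal points at a specific patient rather than at benign variation in a "normal" reference.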

  8. Practical use of a framework for network science experimentation

    NASA Astrophysics Data System (ADS)

    Toth, Andrew; Bergamaschi, Flavio

    2014-06-01

In 2006, the US Army Research Laboratory (ARL) and the UK Ministry of Defence (MoD) established a collaborative research alliance with academia and industry, the International Technology Alliance (ITA) in Network and Information Sciences, to address fundamental issues concerning network and information sciences that will enhance decision making for coalition operations, enable rapid, secure formation of ad hoc teams in coalition environments, and enhance US and UK capabilities to conduct coalition warfare. Research conducted under the ITA was extended through a collaboration between ARL and IBM UK to characterize and define a software stack and tooling that has become the reference framework for network science experimentation in support of the validation of theoretical research. This paper discusses the composition of the reference framework for experimentation resulting from the ARL/IBM UK collaboration and its use, by the Network Science Collaborative Technology Alliance (NS CTA), in a recent network science experiment conducted at ARL. It also discusses how the experiment was modeled using the reference framework, the integration of two new components, the Apollo Fact-Finder tool and the Medusa Crowd Sensing application, the limitations identified, and how they shall be addressed in future work.

  9. Unified Bayesian Estimator of EEG Reference at Infinity: rREST (Regularized Reference Electrode Standardization Technique).

    PubMed

    Hu, Shiang; Yao, Dezhong; Valdes-Sosa, Pedro A

    2018-01-01

The choice of reference for the electroencephalogram (EEG) is a long-lasting unsolved issue resulting in inconsistent usages and endless debates. Currently, both the average reference (AR) and the reference electrode standardization technique (REST) are two primary, apparently irreconcilable contenders. We propose a theoretical framework to resolve this reference issue by formulating both (a) estimation of potentials at infinity, and (b) determination of the reference, as a unified Bayesian linear inverse problem, which can be solved by maximum a posteriori estimation. We find that AR and REST are very particular cases of this unified framework: AR results from a biophysically non-informative prior, while REST utilizes the prior based on the EEG generative model. To allow for simultaneous denoising and reference estimation, we develop regularized versions of AR and REST, named rAR and rREST, respectively. Both depend on a regularization parameter that is the noise-to-signal variance ratio. Traditional and new estimators are evaluated within this framework, by both simulations and analysis of real resting EEGs. Toward this end, we leverage the MRI and EEG data from 89 subjects who participated in the Cuban Human Brain Mapping Project. Artificial EEGs generated with a known ground truth show that the relative error in estimating the EEG potentials at infinity is lowest for rREST. They also reveal that realistic volume conductor models improve the performance of REST and rREST. Importantly, for practical applications, it is shown that an average lead field gives results comparable to the individual lead field. Finally, it is shown that selection of the regularization parameter with Generalized Cross-Validation (GCV) is close to the "oracle" choice based on the ground truth. When evaluated with the 89 real resting-state EEGs, rREST consistently yields the lowest GCV. This study provides a novel perspective on the EEG reference problem by means of a unified inverse solution framework. It may allow additional principled theoretical formulations and numerical evaluation of performance.
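The regularized MAP estimate and the GCV selection of the regularization parameter can be sketched generically. This is not the paper's lead-field-based REST model; it is a standard ridge/MAP estimator for a linear problem y = Hv + noise, with `H`, the noise level, and the problem sizes invented for illustration, and the regularization parameter `lam` playing the role of the noise-to-signal variance ratio.

```python
import numpy as np

rng = np.random.default_rng(1)

# Generic ridge/MAP sketch of the unified framework (invented H, not
# the paper's lead fields): y = H v + noise.
n_obs, n_src = 60, 20
H = rng.standard_normal((n_obs, n_src))
v_true = rng.standard_normal(n_src)
y = H @ v_true + 0.5 * rng.standard_normal(n_obs)

def gcv(lam):
    # Generalized Cross-Validation score for the ridge smoother at lam.
    A = H @ np.linalg.solve(H.T @ H + lam * np.eye(n_src), H.T)
    resid = y - A @ y
    return (resid @ resid) / (n_obs - np.trace(A)) ** 2

lams = np.logspace(-3, 2, 30)
lam_best = lams[np.argmin([gcv(l) for l in lams])]
v_hat = np.linalg.solve(H.T @ H + lam_best * np.eye(n_src), H.T @ y)
```

As in the paper's evaluation, the GCV score is computed from the data alone, so it can select the regularization parameter without access to the ground truth.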

  10. Field Guide to Plant Model Systems.

    PubMed

    Chang, Caren; Bowman, John L; Meyerowitz, Elliot M

    2016-10-06

    For the past several decades, advances in plant development, physiology, cell biology, and genetics have relied heavily on the model (or reference) plant Arabidopsis thaliana. Arabidopsis resembles other plants, including crop plants, in many but by no means all respects. Study of Arabidopsis alone provides little information on the evolutionary history of plants, evolutionary differences between species, plants that survive in different environments, or plants that access nutrients and photosynthesize differently. Empowered by the availability of large-scale sequencing and new technologies for investigating gene function, many new plant models are being proposed and studied. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. A Time Series Analysis of Cancer-Related Information Seeking: Hints From the Health Information National Trends Survey (HINTS) 2003-2014.

    PubMed

    Huerta, Timothy R; Walker, Daniel M; Johnson, Tyler; Ford, Eric W

    2016-09-01

    Recent technological changes, such as the growth of the Internet, have made cancer information widely available. However, it remains unknown whether changes in access have resulted in concomitant changes in information seeking behavior. Previous work explored the cancer information seeking behaviors of the general population using the 2003 Health Information National Trends Survey (HINTS). This article aims to reproduce, replicate, and extend that existing analysis using the original dataset and five additional iterations of HINTS (2007, 2011, 2012, 2013, 2014). This approach builds on the earlier work by quantifying the magnitude of change in information seeking behaviors. Bivariate comparison of the 2003 and 2014 data revealed very similar results; however, the multivariate model including all years of data indicated differences between the original and extended models: individuals age 65 and older were no longer less likely to seek cancer information than the 18-35 reference population, and Hispanics were also no longer less likely to be cancer information seekers. The results of our analysis indicate an overall shift in cancer information seeking behaviors and also illuminate the impact of increased Internet usage over the past decade, suggesting specific demographic groups that may benefit from cancer information seeking encouragement.

  12. Cumulative effects of restoration efforts on ecological characteristics of an open water area within the Upper Mississippi River

    USGS Publications Warehouse

    Gray, B.R.; Shi, W.; Houser, J.N.; Rogala, J.T.; Guan, Z.; Cochran-Biederman, J. L.

    2011-01-01

Ecological restoration efforts in large rivers generally aim to ameliorate ecological effects associated with large-scale modification of those rivers. This study examined whether the effects of restoration efforts (specifically those of island construction) within a largely open water restoration area of the Upper Mississippi River (UMR) might be seen at the spatial scale of that 3476 ha area. The cumulative effects of island construction, when observed over multiple years, were postulated to have made the restoration area increasingly similar to a positive reference area (a proximate area comprising contiguous backwater areas) and increasingly different from two negative reference areas. The negative reference areas represented the Mississippi River main channel in an area proximate to the restoration area and an open water area in a related Mississippi River reach that has seen relatively little restoration effort. Inferences on the effects of restoration were made by comparing constrained and unconstrained models of summer chlorophyll a (CHL), summer inorganic suspended solids (ISS) and counts of benthic mayfly larvae. Constrained models forced trends in means, or in both means and sampling variances, to become, over time, increasingly similar to those in the positive reference area and increasingly dissimilar to those in the negative reference areas. Trends were estimated over 12- (mayflies) or 14-year sampling periods, and were evaluated using model information criteria. Based on these methods, restoration effects were observed for CHL and mayflies, while evidence in favour of restoration effects on ISS was equivocal. These findings suggest that the cumulative effects of island building at relatively large spatial scales within large rivers may be estimated using data from large-scale surveillance monitoring programs. Published in 2010 by John Wiley & Sons, Ltd.
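The inferential strategy of comparing constrained and unconstrained models with information criteria can be illustrated with a toy example. Everything below is invented (the series, the noise level, the 14-year span) and much simpler than the paper's models: a flat-mean model is compared against a linear-trend model in which the restoration area converges toward the positive reference area's mean.

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(14)

# Invented series: restoration-area metric drifting toward the positive
# reference area's (flat) mean over a 14-year sampling period.
positive_ref_mean = 10.0
restoration = 16.0 - 0.45 * years + rng.normal(0, 0.8, years.size)

def aic(resid, k):
    # Gaussian AIC up to an additive constant: n*log(MSE) + 2k.
    n = resid.size
    return n * np.log(np.mean(resid ** 2)) + 2 * k

# Unconstrained (no-effect) model: flat mean, no convergence.
resid_flat = restoration - restoration.mean()
# Constrained (effect) model: linear trend toward the reference mean.
slope, intercept = np.polyfit(years, restoration, 1)
resid_trend = restoration - (intercept + slope * years)

supports_restoration = (aic(resid_trend, 2) < aic(resid_flat, 1)) and slope < 0
```

When the data really do converge toward the reference, the trend model's lower AIC is taken as evidence of a restoration effect, which is the logic (in miniature) of the study's model comparison.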

  13. Research of ad hoc network based on SINCGARS network

    NASA Astrophysics Data System (ADS)

    Nie, Hao; Cai, Xiaoxia; Chen, Hong; Chen, Jian; Weng, Pengfei

    2016-03-01

With the rapid progress of science and technology, society has entered the era of networked information technology, and only the comprehensive use of electronic warfare and network warfare can maximize access to information and maintain information superiority. A tactical ad hoc network (tactical internet), designed and built to meet specific combat missions and operational requirements and suited to the military's future information needs, would greatly improve the operational efficiency of army command. By studying the US military's SINCGARS network, this paper explores its routing protocol and mobility model to provide a reference for research on our army's networks.

  14. Attributions of blame and responsibility in sexual harassment: reexamining a psychological model.

    PubMed

    Klein, Kristen M; Apple, Kevin J; Kahn, Arnold S

    2011-04-01

    Kelley's (Nebr Symp Motiv 15:192-238, 1967) attribution theory can inform sexual harassment research by identifying how observers use consensus, consistency, and distinctiveness information in determining whether a target or perpetrator is responsible for a sexual harassment situation. In this study, Kelley's theory is applied to a scenario in which a male perpetrator sexually harasses a female target in a university setting. Results from 314 predominantly female college students indicate that consistency and consensus information significantly affect participants' judgments of blame and responsibility for the situation. The authors discuss the importance of the reference groups used to derive consensus and distinctiveness information, and reintroduce Kelley's attribution theory as a means of understanding observers' perceptions of sexual harassment.

  15. Modeling and Implementation of Multi-Position Non-Continuous Rotation Gyroscope North Finder.

    PubMed

    Luo, Jun; Wang, Zhiqian; Shen, Chengwu; Kuijper, Arjan; Wen, Zhuoman; Liu, Shaojin

    2016-09-20

Even when the Global Positioning System (GPS) signal is blocked, a rate gyroscope (gyro) north finder can, to a certain extent, provide the required azimuth reference information. To measure the azimuth between the observer and the north direction very accurately, we propose a multi-position non-continuous rotation gyro north finding scheme. Our generalized mathematical model analyzes the elements that affect azimuth measurement precision and can thus provide high-precision azimuth reference information. Based on the gyro's principle of detecting the projection of the earth rotation rate on its sensitive axis, and on the proposed north finding scheme, we deduce an accurate mathematical model of the gyro outputs against azimuth that accounts for the gyro and shaft misalignments. Combining the gyro output model with the theory of propagation of uncertainty, several approaches to optimize north finding are provided, including reducing the gyro bias error, constraining the gyro random error, increasing the number of rotation points, improving rotation angle measurement precision, and decreasing the gyro and shaft misalignment angles. Following these approaches, a north finder setup was built and an azimuth uncertainty of 18" was obtained. This paper provides a systematic theory for analyzing the gyro north finder scheme in detail, from simulation to implementation. The proposed theory can guide both applied researchers in academia and advanced practitioners in industry in designing high-precision, robust north finders based on different types of rate gyroscopes.
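    The core of such a scheme can be sketched in a few lines. The sketch below is illustrative only (the position count, latitude, bias, and noise level are invented, not the paper's setup): at each known rotation position the gyro output is a cosine of the unknown azimuth plus a bias, which expands into a model that is linear in three parameters and can be solved by ordinary least squares.

    ```python
    import numpy as np

    # Hypothetical sketch of multi-position gyro north finding. At each known
    # rotation position theta_i, the gyro senses the horizontal projection of
    # the earth rotation rate:
    #   y_i = A*cos(az + theta_i) + b,  A = omega_e*cos(latitude), b = gyro bias
    # Expanding the cosine gives a model linear in (A*cos(az), A*sin(az), b),
    # so the azimuth follows from an ordinary least-squares fit.

    OMEGA_E = 7.292e-5  # earth rotation rate, rad/s

    def find_north(positions_rad, gyro_outputs):
        """Estimate azimuth (rad) from gyro readings at known rotation positions."""
        X = np.column_stack([np.cos(positions_rad),
                             -np.sin(positions_rad),
                             np.ones_like(positions_rad)])
        c, s, b = np.linalg.lstsq(X, gyro_outputs, rcond=None)[0]
        return np.arctan2(s, c)  # the bias b is estimated jointly and discarded

    # Simulated example: 8 evenly spaced positions, latitude 30 deg, azimuth 25 deg
    lat, az_true = np.radians(30.0), np.radians(25.0)
    A = OMEGA_E * np.cos(lat)
    pos = np.linspace(0.0, 2 * np.pi, 8, endpoint=False)
    rng = np.random.default_rng(0)
    y = A * np.cos(az_true + pos) + 1e-6 + rng.normal(0.0, 1e-9, pos.size)
    az_est = find_north(pos, y)
    print(np.degrees(az_est))
    ```

    Increasing the number of rotation points, as the abstract recommends, simply adds rows to the least-squares system and averages down the gyro random error.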

  16. Batch statistical process control of a fluid bed granulation process using in-line spatial filter velocimetry and product temperature measurements.

    PubMed

    Burggraeve, A; Van den Kerkhof, T; Hellings, M; Remon, J P; Vervaet, C; De Beer, T

    2011-04-18

Fluid bed granulation is a batch process, which is characterized by the processing of raw materials for a predefined period of time, consisting of a fixed spraying phase and a subsequent drying period. The present study shows the multivariate statistical modeling and control of a fluid bed granulation process based on in-line particle size distribution (PSD) measurements (using spatial filter velocimetry) combined with continuous product temperature registration using a partial least squares (PLS) approach. Via the continuous in-line monitoring of the PSD and product temperature during granulation of various reference batches, a statistical batch model was developed allowing the real-time evaluation and acceptance or rejection of future batches. Continuously monitored PSD and product temperature process data of 10 reference batches (X-data) were used to develop a reference batch PLS model, regressing the X-data versus the batch process time (Y-data). Two PLS components captured 98.8% of the variation in the X-data block. Score control charts in which the average batch trajectory and upper and lower control limits are displayed were developed. Next, these control charts were used to monitor 4 new test batches in real time and to immediately detect any deviations from the expected batch trajectory. By real-time evaluation of new batches using the developed control charts and by computation of contribution plots of deviating process behavior at a certain time point, batch losses or reprocessing can be prevented. Immediately after batch completion, all PSD and product temperature information (i.e., a batch progress fingerprint) was used to estimate some granule properties (density and flowability) at an early stage, which can improve batch release time. Individual PLS models relating the computed scores (X) of the reference PLS model (based on the 10 reference batches) to density and flowability, respectively, as Y-matrix were developed. The scores of the 4 test batches were used to examine the predictive ability of the models. Copyright © 2011 Elsevier B.V. All rights reserved.
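    The score control chart idea can be illustrated with a minimal single-component PLS sketch (the paper's model uses two components and real process data; everything below, including the simulated measurements and 3-sigma limits, is an assumption for illustration): reference batches are unfolded so each row is one time point, a PLS weight vector is computed from X and batch time, and the reference score trajectories define control limits against which a new batch is checked.

    ```python
    import numpy as np

    # Illustrative sketch (not the published model) of batch score control charts.
    # One PLS component is extracted with the NIPALS weight w ~ X'y on unfolded,
    # mean-centered data (X = process measurements, y = batch time); control
    # limits are the mean reference score trajectory +/- 3 sd across batches.

    rng = np.random.default_rng(1)
    n_batches, n_times, n_vars = 10, 50, 6

    t_grid = np.arange(n_times, dtype=float)
    # Simulated reference data: variables drift with batch time plus noise
    base = np.outer(t_grid, rng.normal(1.0, 0.2, n_vars))          # (times, vars)
    X_ref = np.stack([base + rng.normal(0.0, 1.0, base.shape)
                      for _ in range(n_batches)])                  # (batches, times, vars)

    # Fit one PLS component on mean-centered, unfolded data
    Xu = X_ref.reshape(-1, n_vars)
    yu = np.tile(t_grid, n_batches)
    Xc, yc = Xu - Xu.mean(0), yu - yu.mean()
    w = Xc.T @ yc
    w /= np.linalg.norm(w)

    # Score trajectories per reference batch, and 3-sigma control limits
    scores = (X_ref - Xu.mean(0)) @ w                              # (batches, times)
    center = scores.mean(0)
    ucl, lcl = center + 3 * scores.std(0), center - 3 * scores.std(0)

    # Monitor a new batch: flag time points outside the control limits
    X_new = base + rng.normal(0.0, 1.0, base.shape)
    s_new = (X_new - Xu.mean(0)) @ w
    out_of_control = (s_new > ucl) | (s_new < lcl)
    print(out_of_control.sum())
    ```

    A batch that drifts off the reference trajectory crosses the limits at some time point, at which stage contribution plots (not sketched here) identify which measured variables drive the deviation.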

  17. Building Trades. Block VIII. Interior Trim.

    ERIC Educational Resources Information Center

    Texas A and M Univ., College Station. Vocational Instructional Services.

    This curriculum for interior trim provides instructional materials for 18 informational and manipulative lessons. A list of 11 references precedes the course materials. The instructor's plan for each informational lesson begins by providing this information: subject, aim, required teaching aids, required materials, references, and prerequisite…

  18. Model documentation report: Transportation sector model of the National Energy Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-03-01

    This report documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model. This document serves three purposes. First, it is a reference document providing a detailed description of TRAN for model analysts, users, and the public. Second, this report meets the legal requirements of the Energy Information Administration (EIA) to provide adequate documentation in support of its statistical and forecast reports (Public Law 93-275, 57(b)(1)). Third, it permits continuity in model development by providing documentation from which energy analysts can undertake model enhancements, data updates, and parameter refinements.

  19. Stability and Performance Metrics for Adaptive Flight Control

    NASA Technical Reports Server (NTRS)

    Stepanyan, Vahram; Krishnakumar, Kalmanje; Nguyen, Nhan; VanEykeren, Luarens

    2009-01-01

    This paper addresses the problem of verifying adaptive control techniques for enabling safe flight in the presence of adverse conditions. Since adaptive systems are non-linear by design, the existing control verification metrics are not applicable to adaptive controllers. Moreover, these systems are in general highly uncertain. Hence, the system's characteristics cannot be evaluated by relying on the available dynamical models. This necessitates the development of control verification metrics based on the system's input-output information. From this point of view, a set of metrics is introduced that compares the uncertain aircraft's input-output behavior under the action of an adaptive controller to that of a closed-loop linear reference model to be followed by the aircraft. This reference model is constructed for each specific maneuver using the exact aerodynamic and mass properties of the aircraft to meet the stability and performance requirements commonly accepted in flight control. The proposed metrics are unified in the sense that they are model independent and not restricted to any specific adaptive control methods. As an example, we present simulation results for a wing-damaged generic transport aircraft with several existing adaptive controllers.
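    The input-output flavor of such metrics can be sketched as follows. The reference model, its bandwidth, and the norms chosen are illustrative assumptions, not the paper's actual metric definitions: the recorded command drives a first-order linear reference model, and the controller is scored purely by norms of the error between the recorded aircraft response and the reference response, with no model of the adaptive loop itself.

    ```python
    import numpy as np

    # Illustrative sketch (assumed, not the paper's metrics): given a recorded
    # command r(t) and aircraft response y(t) under some adaptive controller,
    # drive a first-order linear reference model
    #   x_ref' = -a * (x_ref - r),   a = 2.0 rad/s (hypothetical bandwidth)
    # with the same command, and score the controller by the L2 and Linf norms
    # of the tracking error e = y - x_ref. Only input-output data is used, so
    # the metric applies regardless of the adaptive control method.

    def reference_response(r, dt, a=2.0):
        """Discretized (forward Euler) first-order reference model driven by r."""
        x = np.zeros_like(r)
        for k in range(1, r.size):
            x[k] = x[k - 1] + dt * a * (r[k - 1] - x[k - 1])
        return x

    def tracking_metrics(y, r, dt, a=2.0):
        e = y - reference_response(r, dt, a)
        return {"l2": np.sqrt(dt * np.sum(e**2)), "linf": np.max(np.abs(e))}

    # Example: a well-adapted response closely follows the reference model
    dt = 0.01
    t = np.arange(0.0, 5.0, dt)
    r = np.where(t >= 1.0, 1.0, 0.0)                    # unit step command at t = 1 s
    y = reference_response(r, dt) + 0.01 * np.sin(5 * t)  # small residual error
    m = tracking_metrics(y, r, dt)
    print(m["linf"])
    ```

    A poorly tuned or slowly adapting controller inflates both norms relative to such a baseline, which is what makes the comparison usable as a verification metric.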

  20. Modular neuron-based body estimation: maintaining consistency over different limbs, modalities, and frames of reference

    PubMed Central

    Ehrenfeld, Stephan; Herbort, Oliver; Butz, Martin V.

    2013-01-01

    This paper addresses the question of how the brain maintains a probabilistic body state estimate over time from a modeling perspective. The neural Modular Modality Frame (nMMF) model simulates such a body state estimation process by continuously integrating redundant, multimodal body state information sources. The body state estimate itself is distributed over separate, but bidirectionally interacting modules. nMMF compares the incoming sensory and present body state information across the interacting modules and fuses the information sources accordingly. At the same time, nMMF enforces body state estimation consistency across the modules. nMMF is able to detect conflicting sensory information and to consequently decrease the influence of implausible sensor sources on the fly. In contrast to the previously published Modular Modality Frame (MMF) model, nMMF offers a biologically plausible neural implementation based on distributed, probabilistic population codes. Besides its neural plausibility, the neural encoding has the advantage of enabling (a) additional probabilistic information flow across the separate body state estimation modules and (b) the representation of arbitrary probability distributions of a body state. The results show that the neural estimates can detect and decrease the impact of false sensory information, can propagate conflicting information across modules, and can improve overall estimation accuracy due to additional module interactions. Even bodily illusions, such as the rubber hand illusion, can be simulated with nMMF. We conclude with an outlook on the potential of modeling human data and of invoking goal-directed behavioral control. PMID:24191151
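    Two of the ingredients the model combines, precision-weighted fusion of redundant estimates and down-weighting of implausible sensors, can be sketched in scalar form. This is a minimal assumption-laden sketch, not the nMMF population-code implementation: each sensor reports a Gaussian estimate, fusion is inverse-variance weighting, and a sensor whose estimate lies too many standard deviations from the fused value has its variance inflated before refusing.

    ```python
    import numpy as np

    # Minimal sketch (assumptions, not the nMMF implementation) of:
    # (1) fusing redundant Gaussian body-state estimates by precision weighting;
    # (2) detecting a conflicting sensor via its normalized distance to the
    #     fused estimate and shrinking its influence on the fly.

    def fuse(means, variances):
        """Precision-weighted fusion of independent Gaussian estimates."""
        w = 1.0 / np.asarray(variances)
        return float(np.sum(w * np.asarray(means)) / np.sum(w)), float(1.0 / np.sum(w))

    def robust_fuse(means, variances, gate=3.0, inflate=100.0):
        """Refuse after inflating the variance of sensors that conflict."""
        mu, _ = fuse(means, variances)
        v = np.asarray(variances, dtype=float)
        d = np.abs(np.asarray(means) - mu) / np.sqrt(v)   # distance in sd units
        adjusted = np.where(d > gate, v * inflate, v)
        return fuse(means, adjusted)

    # Three redundant estimates of one hand coordinate; sensor 3 is implausible
    means, variances = [0.10, 0.12, 0.90], [0.01, 0.01, 0.01]
    naive, _ = fuse(means, variances)
    robust, _ = robust_fuse(means, variances)
    print(round(naive, 3), round(robust, 3))
    ```

    The naive fusion is dragged toward the outlier, while the gated refusion stays close to the two consistent sensors, which is the qualitative behavior the abstract describes for conflicting sensory information.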
