Sample records for information processing approach

  1. The effect of low versus high approach-motivated positive affect on memory for peripherally versus centrally presented information.

    PubMed

    Gable, Philip A; Harmon-Jones, Eddie

    2010-08-01

    Emotions influence attention and processes involved in memory. Although some research has suggested that positive affect categorically influences these processes differently than neutral affect, recent research suggests that motivational intensity of positive affective states influences these processes. The present experiments examined memory for centrally or peripherally presented information after the evocation of approach-motivated positive affect. Experiment 1 found that, relative to neutral conditions, pregoal, approach-motivated positive affect (caused by a monetary incentives task) enhanced memory for centrally presented information, whereas postgoal, low approach-motivated positive affect enhanced memory for peripherally presented information. Experiment 2 found that, relative to a neutral condition, high approach-motivated positive affect (caused by appetitive pictures) enhanced memory for centrally presented information but hindered memory for peripheral information. These results suggest a more complex relationship between positive affect and memory processes and highlight the importance of considering the motivational intensity of positive affects in cognitive processes.

  2. Modeling interdependencies between business and communication processes in hospitals.

    PubMed

    Brigl, Birgit; Wendt, Thomas; Winter, Alfred

    2003-01-01

    The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, no tools are available that specialize in modeling information-driven business processes and their consequences for the communication between information processing tools. Therefore, we present an approach which facilitates the representation and analysis of business processes, the resulting communication processes between application components, and their interdependencies. This approach aims not only to visualize those processes, but also to evaluate whether there are weaknesses in the information processing infrastructure that hinder the smooth implementation of the business processes.

  3. A proven knowledge-based approach to prioritizing process information

    NASA Technical Reports Server (NTRS)

    Corsberg, Daniel R.

    1991-01-01

    Many space-related processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect is rapid analysis of the changing process information. During a disturbance, this task can overwhelm humans as well as computers. Humans deal with this by applying heuristics in determining significant information. A simple, knowledge-based approach to prioritizing information is described. The approach models those heuristics that humans would use in similar circumstances. The approach described has received two patents and was implemented in the Alarm Filtering System (AFS) at the Idaho National Engineering Laboratory (INEL). AFS was first developed for application in a nuclear reactor control room. It has since been used in chemical processing applications, where it has had a significant impact on control room environments. The approach uses knowledge-based heuristics to analyze data from process instrumentation and respond to that data according to knowledge encapsulated in objects and rules. While AFS cannot perform the complete diagnosis and control task, it has proven to be extremely effective at filtering and prioritizing information. AFS was used for over two years as a first level of analysis for human diagnosticians. Given the approach's proven track record in a wide variety of practical applications, it should be useful in both ground- and space-based systems.

  4. Performance Approach, Performance Avoidance and Depth of Information Processing: A Fresh Look at Relations between Students' Academic Motivation and Cognition.

    ERIC Educational Resources Information Center

    Barker, Katrina; Dowson, Martin

    This study combines a trichotomous motivational variable (mastery goal, performance approach, and performance avoidance goal) with an information-processing variable referred to as depth of processing, to investigate the effects of motivation on the encoding and recall of verbal information with a sample of infants and primary grade students…

  5. A social information processing approach to job attitudes and task design.

    PubMed

    Salancik, G R; Pfeffer, J

    1978-06-01

    This article outlines a social information processing approach to explain job attitudes. In comparison with need-satisfaction and expectancy models of job attitudes and motivation, the social information processing perspective emphasizes the effects of context and the consequences of past choices, rather than individual predispositions and rational decision-making processes. When an individual develops statements about attitudes or needs, he or she uses social information--information about past behavior and about what others think. The process of attributing attitudes or needs from behavior is itself affected by commitment processes, by the saliency and relevance of information, and by the need to develop socially acceptable and legitimate rationalizations for actions. Both attitudes and need statements, as well as characterizations of jobs, are affected by informational social influence. The implications of the social information processing perspective for organization development efforts and programs of job redesign are discussed.

  6. Another Look: The Process Approach to Composition Instruction.

    ERIC Educational Resources Information Center

    Pollard, Rita H.

    1991-01-01

    Responds to Thomas Devine's indictment of the process approach to writing instruction, arguing that teaching practices reflecting misapplication of research are often wrongly labeled the process approach and a more precise definition of the process approach should inform debates over its value. Questions Devine's conclusions. (DMM)

  7. Incorporating Edge Information into Best Merge Region-Growing Segmentation

    NASA Technical Reports Server (NTRS)

    Tilton, James C.; Pasolli, Edoardo

    2014-01-01

    We have previously developed a best merge region-growing approach that integrates nonadjacent region object aggregation with the neighboring region merge process usually employed in region-growing segmentation approaches. This approach has been named HSeg, because it provides a hierarchical set of image segmentation results. Up to this point, HSeg considered only global region feature information in the region-growing decision process. We present here three new versions of HSeg that include local edge information in the region-growing decision process at different levels of rigor. We then compare the effectiveness and processing times of these new versions of HSeg with each other and with the original version of HSeg.
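
The merge criterion this abstract describes can be illustrated with a toy example. The sketch below performs a single best-merge selection step, adding an assumed edge-strength penalty to a region dissimilarity measure; the cost function, the `alpha` weight, and the data structures are illustrative assumptions, not the actual HSeg implementation.

```python
# Toy sketch of one best-merge step in region growing with edge information.
# Assumption: merge cost = feature dissimilarity + alpha * boundary edge strength.

def best_merge(regions, adjacency, edge_strength, alpha=0.5):
    """Pick the pair of adjacent regions with the lowest merge cost.

    regions: {region_id: mean feature value}
    adjacency: set of frozenset pairs of adjacent region ids
    edge_strength: {pair: boundary edge magnitude in [0, 1]}
    """
    def cost(pair):
        a, b = tuple(pair)
        dissimilarity = abs(regions[a] - regions[b])
        return dissimilarity + alpha * edge_strength.get(pair, 0.0)

    return min(adjacency, key=cost)

# Regions 1 and 2 are similar and share a weak boundary edge, so they merge first.
regions = {1: 0.10, 2: 0.12, 3: 0.90}
adjacency = {frozenset({1, 2}), frozenset({2, 3})}
edges = {frozenset({1, 2}): 0.05, frozenset({2, 3}): 0.95}
print(sorted(best_merge(regions, adjacency, edges)))  # → [1, 2]
```

A strong edge between two otherwise similar regions raises their merge cost, which is the intuition behind folding local edge information into the decision process.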

  8. Basic disturbances of information processing in psychosis prediction.

    PubMed

    Bodatsch, Mitja; Klosterkötter, Joachim; Müller, Ralf; Ruhrmann, Stephan

    2013-01-01

    The basic symptoms (BS) approach provides a valid instrument in predicting psychosis onset and moreover represents a significant heuristic framework for research. The term "basic symptoms" denotes subtle changes of cognition and perception in the earliest and prodromal stages of psychosis development. BS are thought to correspond to disturbances of neural information processing. Following the heuristic implications of the BS approach, the present paper aims at exploring disturbances of information processing, revealed by functional magnetic resonance imaging (fMRI) and electroencephalography (EEG), as characteristics of the at-risk state of psychosis. Furthermore, since high-risk studies employing ultra-high-risk criteria revealed non-conversion rates commonly exceeding 50%, thus warranting approaches that increase specificity, the potential contribution of neural information processing disturbances to psychosis prediction is reviewed. In summary, the at-risk state seems to be associated with information processing disturbances. Moreover, fMRI investigations suggested that disturbances of language processing domains might be a characteristic of the prodromal state. Neurophysiological studies revealed that disturbances of sensory processing may assist psychosis prediction in allowing for a quantification of risk in terms of magnitude and time. The latter finding represents a significant advancement since an estimation of the time to event has not yet been achieved by clinical approaches. Some evidence suggests a close relationship between self-experienced BS and neural information processing. With regard to future research, the relationship between neural information processing disturbances and different clinical risk concepts warrants further investigation. Thereby, a possible time sequence in the prodromal phase might be of particular interest.

  9. A Cognitive Information Processing Approach to Employment Problem Solving and Decision Making.

    ERIC Educational Resources Information Center

    Sampson, James P., Jr.; Lenz, Janet G.; Reardon, Robert C.; Peterson, Gary W.

    1999-01-01

    Applies a cognitive information processing approach to the specific process of employment problem solving and decision making. Definitions and accompanying employment examples are followed by an exploration of the nature of employment problems. Examples of positive and negative cognitions that have an impact on the effectiveness of employment…

  10. Process-in-Network: A Comprehensive Network Processing Approach

    PubMed Central

    Urzaiz, Gabriel; Villa, David; Villanueva, Felix; Lopez, Juan Carlos

    2012-01-01

    A solid and versatile communications platform is very important in modern Ambient Intelligence (AmI) applications, which usually require the transmission of large amounts of multimedia information over a highly heterogeneous network. This article focuses on the concept of Process-in-Network (PIN), which is defined as the possibility that the network processes information as it is being transmitted, and introduces a more comprehensive approach than current network processing technologies. PIN can take advantage of waiting times in queues of routers, idle processing capacity in intermediate nodes, and the information that passes through the network. PMID:22969390

  11. Conceptual information processing: A robust approach to KBS-DBMS integration

    NASA Technical Reports Server (NTRS)

    Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond

    1987-01-01

    Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivates the investigation of information processing paradigms. One such paradigm is concept based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.

  12. Characterizing the Processes for Navigating Internet Health Information Using Real-Time Observations: A Mixed-Methods Approach.

    PubMed

    Perez, Susan L; Paterniti, Debora A; Wilson, Machelle; Bell, Robert A; Chan, Man Shan; Villareal, Chloe C; Nguyen, Hien Huy; Kravitz, Richard L

    2015-07-20

    Little is known about the processes people use to find health-related information on the Internet or the individual characteristics that shape selection of information-seeking approaches. Our aim was to describe the processes by which users navigate the Internet for information about a hypothetical acute illness and to identify individual characteristics predictive of their information-seeking strategies. Study participants were recruited from public settings and agencies. Interested individuals were screened for eligibility using an online questionnaire. Participants listened to one of two clinical scenarios—consistent with influenza or bacterial meningitis—and then conducted an Internet search. Screen-capture video software captured Internet search mouse clicks and keystrokes. Each step of the search was coded as hypothesis testing (etiology), evidence gathering (symptoms), or action/treatment seeking (behavior). The coded steps were used to form a step-by-step pattern of each participant's information-seeking process. A total of 78 Internet health information seekers ranging from 21-35 years of age and who experienced barriers to accessing health care services participated. We identified 27 unique patterns of information seeking, which were grouped into four overarching classifications based on the number of steps taken during the search, whether a pattern consisted of developing a hypothesis and exploring symptoms before ending the search or searching an action/treatment, and whether a pattern ended with action/treatment seeking. Applying dual-processing theory, we categorized the four overarching pattern classifications as either System 1 (41%, 32/78), unconscious, rapid, automatic, and high capacity processing; or System 2 (59%, 46/78), conscious, slow, and deliberative processing. Using multivariate regression, we found that System 2 processing was associated with higher education and younger age. 
We identified and classified two approaches to processing Internet health information. System 2 processing, a methodical approach, most resembles the strategies for information processing that have been found in other studies to be associated with higher-quality decisions. We conclude that the quality of Internet health-information seeking could be improved through consumer education on methodical Internet navigation strategies and the incorporation of decision aids into health information websites.
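
The coding scheme this abstract describes (hypothesis testing, evidence gathering, action/treatment seeking) lends itself to a brief illustration. The sketch below codes each search step with a single letter and applies a toy classification rule; the rule and its threshold are assumptions for illustration, not the study's actual classifier.

```python
# Illustrative sketch: each Internet-search step is coded as hypothesis testing
# ("H"), evidence gathering ("E"), or action/treatment seeking ("A"), and a
# coded pattern is assigned to a dual-processing category. The rule below
# (length threshold plus presence of H and E steps) is an assumption, not the
# study's published criteria.

def classify_pattern(steps):
    """Classify a coded search pattern as 'System 1' or 'System 2'."""
    deliberative = len(steps) >= 4 and "H" in steps and "E" in steps
    return "System 2" if deliberative else "System 1"

print(classify_pattern(["A"]))                      # short, direct: System 1
print(classify_pattern(["H", "E", "E", "H", "A"]))  # methodical: System 2
```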

  13. Characterizing the Processes for Navigating Internet Health Information Using Real-Time Observations: A Mixed-Methods Approach

    PubMed Central

    Paterniti, Debora A; Wilson, Machelle; Bell, Robert A; Chan, Man Shan; Villareal, Chloe C; Nguyen, Hien Huy; Kravitz, Richard L

    2015-01-01

    Background Little is known about the processes people use to find health-related information on the Internet or the individual characteristics that shape selection of information-seeking approaches. Objective Our aim was to describe the processes by which users navigate the Internet for information about a hypothetical acute illness and to identify individual characteristics predictive of their information-seeking strategies. Methods Study participants were recruited from public settings and agencies. Interested individuals were screened for eligibility using an online questionnaire. Participants listened to one of two clinical scenarios—consistent with influenza or bacterial meningitis—and then conducted an Internet search. Screen-capture video software captured Internet search mouse clicks and keystrokes. Each step of the search was coded as hypothesis testing (etiology), evidence gathering (symptoms), or action/treatment seeking (behavior). The coded steps were used to form a step-by-step pattern of each participant’s information-seeking process. A total of 78 Internet health information seekers ranging from 21-35 years of age and who experienced barriers to accessing health care services participated. Results We identified 27 unique patterns of information seeking, which were grouped into four overarching classifications based on the number of steps taken during the search, whether a pattern consisted of developing a hypothesis and exploring symptoms before ending the search or searching an action/treatment, and whether a pattern ended with action/treatment seeking. Applying dual-processing theory, we categorized the four overarching pattern classifications as either System 1 (41%, 32/78), unconscious, rapid, automatic, and high capacity processing; or System 2 (59%, 46/78), conscious, slow, and deliberative processing. Using multivariate regression, we found that System 2 processing was associated with higher education and younger age. 
Conclusions We identified and classified two approaches to processing Internet health information. System 2 processing, a methodical approach, most resembles the strategies for information processing that have been found in other studies to be associated with higher-quality decisions. We conclude that the quality of Internet health-information seeking could be improved through consumer education on methodical Internet navigation strategies and the incorporation of decision aids into health information websites. PMID:26194787

  14. Applying the Cognitive Information Processing Approach to Career Problem Solving and Decision Making to Women's Career Development.

    ERIC Educational Resources Information Center

    McLennan, Natasha A.; Arthur, Nancy

    1999-01-01

    Outlines an expanded framework of the Cognitive Information Processing (CIP) approach to career problem solving and decision making for career counseling with women. Addresses structural and individual barriers in women's career development and provides practical suggestions for applying and evaluating the CIP approach in career counseling.…

  15. Parallel photonic information processing at gigabyte per second data rates using transient states

    NASA Astrophysics Data System (ADS)

    Brunner, Daniel; Soriano, Miguel C.; Mirasso, Claudio R.; Fischer, Ingo

    2013-01-01

    The increasing demands on information processing require novel computational concepts and true parallelism. Nevertheless, hardware realizations of unconventional computing approaches never exceeded a marginal existence. While the application of optics in super-computing receives reawakened interest, new concepts, partly neuro-inspired, are being considered and developed. Here we experimentally demonstrate the potential of a simple photonic architecture to process information at unprecedented data rates, implementing a learning-based approach. A semiconductor laser subject to delayed self-feedback and optical data injection is employed to solve computationally hard tasks. We demonstrate simultaneous spoken digit and speaker recognition and chaotic time-series prediction at data rates beyond 1 GByte/s. We identify all digits with very low classification errors and perform chaotic time-series prediction with 10% error. Our approach bridges the areas of photonic information processing, cognitive and information science.

  16. Extracting business vocabularies from business process models: SBVR and BPMN standards-based approach

    NASA Astrophysics Data System (ADS)

    Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis

    2013-10-01

    Approaches for the analysis and specification of business vocabularies and rules are very relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common practice of Information Systems Development, business modeling activities are still mostly empirical in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics for Business Vocabularies and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.

  17. Performance Approach, Performance Avoidance and Depth of Information Processing: A Fresh Look at Relations between Students' Academic Motivation and Cognition.

    ERIC Educational Resources Information Center

    Barker, Katrina L.; McInerney, Dennis M.; Dowson, Martin

    2002-01-01

    Examines effects of the motivational approach on the recall of verbal information processed at shallow and deep levels. Explains that students were assigned to a mastery focused condition, performance approach condition, or a control group. Reports that students remembered more stimulus words during cued recall than free recall. Includes…

  18. Informational approach to the analysis of acoustic signals

    NASA Astrophysics Data System (ADS)

    Senkevich, Yuriy; Dyuk, Vyacheslav; Mishchenko, Mikhail; Solodchuk, Alexandra

    2017-10-01

    An information-theoretic approach to the processing of non-stationary signals is illustrated by the linguistic processing of acoustic signals from a seismic event. A method for converting an acoustic signal into an information message by identifying repetitive self-similar patterns is described. Definitions of the event selection indicators in the symbolic recording of the acoustic signal are given. The results of processing an acoustic signal with a computer program that performs linguistic data processing are shown. Advantages and disadvantages of the software algorithms are indicated.

  19. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems.

    PubMed

    Araújo, Luciano V; Malkowski, Simon; Braghetto, Kelly R; Passos-Bueno, Maria R; Zatz, Mayana; Pu, Calton; Ferreira, João E

    2011-12-22

    Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduced the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability, achieved through relational database implementation, and 2) correctness of processes, using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have demonstrated the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end user interfaces.

  20. A rigorous approach to facilitate and guarantee the correctness of the genetic testing management in human genome information systems

    PubMed Central

    2011-01-01

    Background Recent medical and biological technology advances have stimulated the development of new testing systems that have been providing huge, varied amounts of molecular and clinical data. Growing data volumes pose significant challenges for information processing systems in research centers. Additionally, the routines of a genomics laboratory are typically characterized by high parallelism in testing and constant procedure changes. Results This paper describes a formal approach to address this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduced the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to support constant changes in human genome testing and can provide patients with updated results based on the most recent and validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control flow specifications based on process algebra (ACP). The main difference between our approach and related works is that we were able to join two important aspects: 1) process scalability, achieved through relational database implementation, and 2) correctness of processes, using process algebra. Furthermore, the software allows end users to define genetic testing without requiring any knowledge about business process notation or process algebra. Conclusions This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We have demonstrated the feasibility and shown the usability benefits of a rigorous approach that is able to specify, validate, and perform genetic testing using easy end user interfaces. PMID:22369688

  1. From IHE Audit Trails to XES Event Logs Facilitating Process Mining.

    PubMed

    Paster, Ferdinand; Helm, Emmanuel

    2015-01-01

    Recently, Business Intelligence approaches such as process mining have been applied to the healthcare domain. The goal of process mining is to gain knowledge about processes, their compliance, and room for improvement by investigating recorded event data. Previous approaches focused on process discovery using event data from various specific systems. IHE, as a globally recognized basis for healthcare information systems, defines in its ATNA profile how real-world events must be recorded in centralized event logs. The following approach presents how audit trails collected by means of ATNA can be transformed to enable process mining. Using the standardized audit trails makes it possible to apply these methods to all IHE-based information systems.
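
As a rough illustration of the transformation this abstract describes, the sketch below maps a simplified ATNA-style audit record to an XES `<event>` element. The audit field names and the attribute mapping are assumptions for illustration, not the authors' actual tool.

```python
# Illustrative sketch: convert a simplified audit record (a dict with assumed
# field names) into an XES <event> element carrying the standard XES extension
# attributes concept:name, org:resource, and time:timestamp.
import xml.etree.ElementTree as ET

def audit_to_xes_event(audit):
    """Map one audit record to an XES event element."""
    event = ET.Element("event")
    ET.SubElement(event, "string", key="concept:name", value=audit["event_type"])
    ET.SubElement(event, "string", key="org:resource", value=audit["user_id"])
    ET.SubElement(event, "date", key="time:timestamp", value=audit["timestamp"])
    return event

# Hypothetical audit record; a real ATNA audit message is a richer XML schema.
e = audit_to_xes_event({"event_type": "Patient Record Read",
                        "user_id": "dr.smith",
                        "timestamp": "2015-01-01T10:00:00+00:00"})
print(ET.tostring(e, encoding="unicode"))
```

In a full pipeline, events sharing a case identifier would additionally be grouped into XES `<trace>` elements inside a `<log>` before being handed to a process mining tool.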

  2. Process-Driven Culture Learning in American KFL Classroom Settings

    ERIC Educational Resources Information Center

    Byon, Andrew Sangpil

    2007-01-01

    Teaching second language (L2) culture can be either content- or process-driven. The content-driven approach refers to explicit instruction of L2 cultural information. On the other hand, the process-driven approach focuses on students' active participation in cultural learning processes. In this approach, teachers are not only information…

  3. Conceptual models of information processing

    NASA Technical Reports Server (NTRS)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  4. A methodology proposal for collaborative business process elaboration using a model-driven approach

    NASA Astrophysics Data System (ADS)

    Mu, Wenxin; Bénaben, Frédérick; Pingaud, Hervé

    2015-05-01

    Business process management (BPM) principles are commonly used to improve processes within an organisation. But they can equally be applied to supporting the design of an Information System (IS). In a collaborative situation involving several partners, this type of BPM approach may be useful to support the design of a Mediation Information System (MIS), which would ensure interoperability between the partners' ISs (which are assumed to be service oriented). To achieve this objective, the first main task is to build a collaborative business process cartography. The aim of this article is to present a method for bringing together collaborative information and elaborating collaborative business processes from the information gathered (by using a collaborative situation framework, an organisational model, an informational model, a functional model and a metamodel and by using model transformation rules).

  5. Learning on the Fly: Exploring the Informal Learning Process of Aviation Instructors

    ERIC Educational Resources Information Center

    Wofford, Michael Grant; Ellinger, Andrea D.; Watkins, Karen E.

    2013-01-01

    Purpose: This study aims to examine the process of informal learning of aviation instructors. Design/methodology/approach: A qualitative instrumental case study design was used for this study. In-depth, multiple semi-structured interviews and document review were the primary approaches to data collection and the data were analyzed using constant…

  6. Dividing Attention within and between Hemispheres: Testing a Multiple Resources Approach to Limited-Capacity Information Processing.

    ERIC Educational Resources Information Center

    Friedman, Alinda; And Others

    1982-01-01

    Two experiments tested the limiting case of a multiple resources approach to resource allocation in information processing. Results contradict a single-capacity model, supporting the idea that the hemispheres' resource supplies are independent and have implications for both cerebral specialization and divided attention issues. (Author/PN)

  7. Social Information Processing in Preschool Children: Relations to Sociodemographic Risk and Problem Behavior

    ERIC Educational Resources Information Center

    Ziv, Yair; Sorongon, Alberto

    2011-01-01

    Using a multicomponent, process-oriented approach, the links between social information processing during the preschool years and (a) sociodemographic risk and (b) behavior problems in preschool were examined in a community sample of 196 children. Findings provided support for our initial hypotheses that aspects of social information processing in…

  8. The Career Motivation Process Program

    ERIC Educational Resources Information Center

    Garrison, Clifford; And Others

    1975-01-01

    Describes the Career Motivation Process (CMP) program, an experimental approach to career counseling incorporating both the "personality" approach, which centers around personal self-examination, and the "decision-making" approach, which emphasizes the collection of information about possible career options. (JG)

  9. Processing approaches to cognition: the impetus from the levels-of-processing framework.

    PubMed

    Roediger, Henry L; Gallo, David A; Geraci, Lisa

    2002-01-01

    Processing approaches to cognition have a long history, from act psychology to the present, but perhaps their greatest boost was given by the success and dominance of the levels-of-processing framework. We review the history of processing approaches, and explore the influence of the levels-of-processing approach, the procedural approach advocated by Paul Kolers, and the transfer-appropriate processing framework. Processing approaches emphasise the procedures of mind and the idea that memory storage can be usefully conceptualised as residing in the same neural units that originally processed information at the time of encoding. Processing approaches emphasise the unity and interrelatedness of cognitive processes and maintain that they can be dissected into separate faculties only by neglecting the richness of mental life. We end by pointing to future directions for processing approaches.

  10. System approach to modeling of industrial technologies

    NASA Astrophysics Data System (ADS)

    Toropov, V. S.; Toropov, E. S.

    2018-03-01

The authors present a system of methods for modeling and improving industrial technologies. The system consists of an information part and a software part. The information part is structured information about industrial technologies, organized according to a template with several essential categories used to improve the technological process and eliminate weaknesses in the process chain. The base category is the physical effect that takes place as the technological process proceeds. The software part can apply various methods of creative search to the content stored in the information part, with particular attention to energy transformations in the technological process. Applying the system will make it possible to systematize the approach to improving technologies and to obtaining new technical solutions.

  11. Achieving a Risk-Informed Decision-Making Environment at NASA: The Emphasis of NASA's Risk Management Policy

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon

    2010-01-01

This slide presentation reviews the evolution of risk management (RM) at NASA. The aim of the RM approach at NASA is to promote an approach that is heuristic, proactive, and coherent across all of NASA. Risk-Informed Decision Making (RIDM) is a decision-making process that uses a diverse set of performance measures, along with other considerations, within a deliberative process to inform decisions. RIDM is invoked for key decisions such as architecture and design decisions, make-buy decisions, and budget reallocation. The RIDM process and how it relates to the Continuous Risk Management (CRM) process is reviewed.

  12. Understanding price discovery in interconnected markets: Generalized Langevin process approach and simulation

    NASA Astrophysics Data System (ADS)

    Schenck, Natalya A.; Horvath, Philip A.; Sinha, Amit K.

    2018-02-01

While the literature on the price discovery process and information flow between dominant and satellite markets is exhaustive, most studies have applied an approach that can be traced back to Hasbrouck (1995) or Gonzalo and Granger (1995). In this paper, however, we propose a Generalized Langevin process with an asymmetric double-well potential function, with co-integrated time series and interconnected diffusion processes, to model the information flow and price discovery process in two interconnected markets, one dominant and one satellite. A simulated illustration of the model is also provided.
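The dynamics sketched in this abstract can be illustrated numerically. The following is a minimal Euler-Maruyama simulation of two coupled overdamped Langevin processes in an asymmetric double-well potential, with a one-way coupling standing in for information flow from the dominant to the satellite market; the potential and all parameter values are illustrative assumptions, not the paper's model.

```python
import random
import math

def grad_V(x, a=1.0, b=2.0, c=0.3):
    """Gradient of an asymmetric double-well potential V(x) = a*x^4 - b*x^2 + c*x."""
    return 4*a*x**3 - 2*b*x + c

def simulate(n_steps=10000, dt=1e-3, kappa=0.5, sigma=0.5, seed=1):
    """Euler-Maruyama integration of two coupled overdamped Langevin processes:
    a 'dominant' price p and a 'satellite' price q that mean-reverts toward p."""
    rng = random.Random(seed)
    p, q = 1.0, -1.0  # start the two markets in opposite wells
    path = []
    for _ in range(n_steps):
        dWp = rng.gauss(0.0, math.sqrt(dt))
        dWq = rng.gauss(0.0, math.sqrt(dt))
        p += -grad_V(p)*dt + sigma*dWp
        q += (-grad_V(q) + kappa*(p - q))*dt + sigma*dWq  # coupling: information flows p -> q
        path.append((p, q))
    return path

path = simulate()
final_gap = abs(path[-1][0] - path[-1][1])  # distance between the two prices at the end
```

With `kappa = 0` the two series diffuse independently in their wells; increasing `kappa` lets the satellite track the dominant market, which is the qualitative price-discovery effect the model is after.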

  13. Value Driven Information Processing and Fusion

    DTIC Science & Technology

    2016-03-01

The objective of the project is to develop a general framework for value driven decentralized information processing…including: optimal data reduction in a network setting for decentralized inference with quantization constraint; interactive fusion that allows queries and…consensus approach allows a decentralized approach to achieve the optimal error exponent of the centralized counterpart, a conclusion that is signifi…

  14. A Holistic Approach to Networked Information Systems Design and Analysis

    DTIC Science & Technology

    2016-04-15

attain quite substantial savings…Optimal algorithms for energy harvesting in wireless networks: we use a Markov-decision-process (MDP) based…approach to obtain optimal policies for transmissions. The key advantage of our approach is that it holistically considers information and energy in a…coding technique to minimize delays and the number of transmissions in wireless systems. As we approach an era of ubiquitous computing with information…

  15. Enhancing the Teaching-Learning Process: A Knowledge Management Approach

    ERIC Educational Resources Information Center

    Bhusry, Mamta; Ranjan, Jayanthi

    2012-01-01

    Purpose: The purpose of this paper is to emphasize the need for knowledge management (KM) in the teaching-learning process in technical educational institutions (TEIs) in India, and to assert the impact of information technology (IT) based KM intervention in the teaching-learning process. Design/methodology/approach: The approach of the paper is…

  16. Information in general medical practices: the information processing model.

    PubMed

    Crowe, Sarah; Tully, Mary P; Cantrill, Judith A

    2010-04-01

    The need for effective communication and handling of secondary care information in general practices is paramount. To explore practice processes on receiving secondary care correspondence in a way that integrates the information needs and perceptions of practice staff both clinical and administrative. Qualitative study using semi-structured interviews with a wide range of practice staff (n = 36) in nine practices in the Northwest of England. Analysis was based on the framework approach using N-Vivo software and involved transcription, familiarization, coding, charting, mapping and interpretation. The 'information processing model' was developed to describe the six stages involved in practice processing of secondary care information. These included the amendment or updating of practice records whilst simultaneously or separately actioning secondary care recommendations, using either a 'one-step' or 'two-step' approach, respectively. Many factors were found to influence each stage and impact on the continuum of patient care. The primary purpose of processing secondary care information is to support patient care; this study raises the profile of information flow and usage within practices as an issue requiring further consideration.

  17. Virtual HRD and National Culture: An Information Processing Perspective

    ERIC Educational Resources Information Center

    Chung, Chih-Hung; Angnakoon, Putthachat; Li, Jessica; Allen, Jeff

    2016-01-01

    Purpose: The purpose of this study is to provide researchers with a better understanding of the cultural impact on information processing in virtual learning environment. Design/methodology/approach: This study uses a causal loop diagram to depict the cultural impact on information processing in the virtual human resource development (VHRD)…

  18. Information Processing Approaches to Cognitive Development

    DTIC Science & Technology

    1989-08-04

O'Connor (Eds.), Intelligence and learning. New York: Plenum Press. DeLoache, J.S. (1988). The development of representation in young children. In H.W…Klahr, D., & Carver, S.M. (1988). Cognitive objectives in a LOGO debugging curriculum: Instruction, learning, and transfer. Cognitive Psychology, 20…Production system models of learning and development. Cambridge, MA: MIT Press. Two kinds of information processing approaches to cognitive development.

  19. [Cognitive experimental approach to anxiety disorders].

    PubMed

    Azaïs, F

    1995-01-01

Cognitive psychology proposes a functional model to explain the mental organisation leading to emotional disorders. Among these disorders, the anxiety spectrum represents a domain in which this model offers an efficient and comprehensive approach to the pathology. A number of behavioral and cognitive psychotherapeutic methods draw on these cognitive references, but the theoretical concepts of cognitive "schemata" and cognitive "processes" evoked to describe mental functioning in anxiety require an experimental approach for a better rational understanding. Cognitive functions such as perception, attention and memory can be explored efficiently in this domain, allowing a more precise study of each stage of information processing. The cognitive model proposed in the psychopathology of anxiety suggests that anxious subjects are characterized by biases in the processing of emotionally valenced information. This hypothesis implies functional interference in information processing in these subjects, leading to an anxious response to a wide range of stimuli. The experimental approach permits exploration of this hypothesis, using a variety of tasks to test the different cognitive dysfunctions evoked in the anxious cognitive organisation. The impairments revealed in anxiety disorders seem to result from specific biases in threat-related information processing, involving several stages of cognitive processing. Semantic interference, attentional bias, implicit memory bias and priming effects are the disorders most often observed in anxious pathology, such as simple phobia, generalised anxiety, panic disorder and post-traumatic stress disorder. These results suggest a top-down organisation of information processing in anxious subjects, who tend to detect, perceive and label many situations as threatening. The processes of reasoning and elaboration are consequently impaired in their adaptive function in response to threat, leading to the anxious response observed in clinical conditions. The cognitive, behavioral and emotional components of this anxious reaction maintain the stressful experience for the subject, whose sense of cognitive competence remains pathologically diminished. Cognitive psychology proposes an interesting model for the understanding of anxiety, in a domain in which subjectivity could benefit from an experimental approach. (ABSTRACT TRUNCATED AT 400 WORDS)

  20. HMI conventions for process control graphics.

    PubMed

    Pikaar, Ruud N

    2012-01-01

Process operators supervise and control complex processes. To enable the operator to do an adequate job, instrumentation and process control engineers need to address several related topics, such as console design, information design, navigation, and alarm management. In process control upgrade projects, usually a 1:1 conversion of existing graphics is proposed. This paper suggests another approach, efficiently leading to a reduced number of new, powerful process graphics supported by a permanent process overview display. In addition, a road map for structuring content (process information), together with conventions for the presentation of objects, symbols, and so on, has been developed. The impact of the human factors engineering approach on process control upgrade projects is illustrated by several cases.

  1. A Semantic Approach for Geospatial Information Extraction from Unstructured Documents

    NASA Astrophysics Data System (ADS)

    Sallaberry, Christian; Gaio, Mauro; Lesbegueries, Julien; Loustau, Pierre

    Local cultural heritage document collections are characterized by their content, which is strongly attached to a territory and its land history (i.e., geographical references). Our contribution aims at making the content retrieval process more efficient whenever a query includes geographic criteria. We propose a core model for a formal representation of geographic information. It takes into account characteristics of different modes of expression, such as written language, captures of drawings, maps, photographs, etc. We have developed a prototype that fully implements geographic information extraction (IE) and geographic information retrieval (IR) processes. All PIV prototype processing resources are designed as Web Services. We propose a geographic IE process based on semantic treatment as a supplement to classical IE approaches. We implement geographic IR by using intersection computing algorithms that seek out any intersection between formal geocoded representations of geographic information in a user query and similar representations in document collection indexes.
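The intersection-based retrieval step described above can be illustrated with a deliberately crude footprint model. The sketch below uses axis-aligned bounding boxes and a hypothetical index; the PIV prototype's actual geocoded representations and intersection algorithms are richer than this.

```python
from typing import NamedTuple, List, Dict

class BBox(NamedTuple):
    """Axis-aligned bounding box in (lon, lat): a crude geocoded footprint."""
    min_lon: float
    min_lat: float
    max_lon: float
    max_lat: float

def intersects(a: BBox, b: BBox) -> bool:
    """True if the two footprints overlap: the core test of the retrieval step."""
    return not (a.max_lon < b.min_lon or b.max_lon < a.min_lon or
                a.max_lat < b.min_lat or b.max_lat < a.min_lat)

def retrieve(query: BBox, index: Dict[str, BBox]) -> List[str]:
    """Return ids of indexed documents whose footprint intersects the query footprint."""
    return [doc_id for doc_id, box in index.items() if intersects(query, box)]

# Hypothetical index: document id -> footprint of the places it mentions.
index = {
    "doc-pau":     BBox(-0.40, 43.28, -0.30, 43.34),  # around Pau
    "doc-bayonne": BBox(-1.50, 43.46, -1.44, 43.52),
}
hits = retrieve(BBox(-0.45, 43.25, -0.25, 43.40), index)  # query: the Pau area
# → ["doc-pau"]
```

A real system would index formal geocoded representations extracted from text, but the geometric test at retrieval time reduces to overlap checks of this kind.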

  2. Individual Differences in Depth and Breadth of Processing.

    ERIC Educational Resources Information Center

    Schmeck, Ronald R.; McCarthy, Patricia

    Memory has been defined as traces left behind by past information processing. One approach to the study of everyday memory is to isolate reliable differences between individuals in the ways in which they process information when preparing for test events. The Inventory of Learning Processes, consisting of four scales, i.e., Deep Processing,…

  3. Efficient Research Design: Using Value-of-Information Analysis to Estimate the Optimal Mix of Top-down and Bottom-up Costing Approaches in an Economic Evaluation alongside a Clinical Trial.

    PubMed

    Wilson, Edward C F; Mugford, Miranda; Barton, Garry; Shepstone, Lee

    2016-04-01

In designing economic evaluations alongside clinical trials, analysts are frequently faced with alternative methods of collecting the same data, the extremes being top-down ("gross costing") and bottom-up ("micro-costing") approaches. A priori, bottom-up approaches may be considered superior to top-down approaches but are also more expensive to collect and analyze. In this article, we use value-of-information analysis to estimate the efficient mix of observations on each method in a proposed clinical trial. By assigning a prior bivariate distribution to the 2 data collection processes, the predicted posterior (i.e., preposterior) mean and variance of the superior process can be calculated from proposed samples using either process. This is then used to calculate the preposterior mean and variance of incremental net benefit and hence the expected net gain of sampling. We apply this method to a previously collected data set to estimate the value of conducting a further trial and identifying the optimal mix of observations on drug costs at 2 levels: by individual item (process A) and by drug class (process B). We find that substituting a number of observations on process A for process B leads to a modest £35,000 increase in expected net gain of sampling. Drivers of the results are the correlation between the 2 processes and their relative cost. This method has potential use following a pilot study to inform efficient data collection approaches for a subsequent full-scale trial. It provides a formal quantitative approach to inform trialists whether it is efficient to collect resource use data on all patients in a trial or on a subset of patients only or to collect limited data on most and detailed data on a subset. © The Author(s) 2016.
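The preposterior logic behind this kind of calculation can be sketched for the simplest conjugate case: a normal prior on incremental net benefit, normally distributed observations, and expected net gain of sampling computed from the unit normal loss integral. The `engs` helper and all numbers are illustrative assumptions, not the paper's data or code.

```python
import math

def phi(z):
    """Standard normal pdf."""
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

def Phi(z):
    """Standard normal cdf."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

def engs(m0, s0, sigma, n, cost_per_obs, pop=10000):
    """Expected net gain of sampling n observations of incremental net benefit (INB).
    Prior INB ~ N(m0, s0^2); each observation has sampling sd sigma. The sd of the
    preposterior mean drives the per-person EVSI via the unit normal loss integral."""
    post_var = 1.0 / (1.0/s0**2 + n/sigma**2)      # posterior variance after n observations
    v = math.sqrt(s0**2 - post_var)                # sd of the preposterior (predicted posterior) mean
    z = abs(m0) / v
    evsi_per_person = v * (phi(z) - z * (1.0 - Phi(z)))
    return pop * evsi_per_person - n * cost_per_obs

# Trade-off between a precise-but-expensive process A and a noisy-but-cheap process B
# (illustrative sampling sds and costs).
gain_A = engs(m0=500.0, s0=1000.0, sigma=2000.0, n=100, cost_per_obs=50.0)
gain_B = engs(m0=500.0, s0=1000.0, sigma=4000.0, n=100, cost_per_obs=10.0)
```

The paper's bivariate prior additionally lets observations on one process update beliefs about the other, with the correlation between the two processes doing the work; this univariate sketch only shows the preposterior-to-ENGS step.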

  4. Automating "Word of Mouth" to Recommend Classes to Students: An Application of Social Information Filtering Algorithms

    ERIC Educational Resources Information Center

    Booker, Queen Esther

    2009-01-01

    An approach used to tackle the problem of helping online students find the classes they want and need is a filtering technique called "social information filtering," a general approach to personalized information filtering. Social information filtering essentially automates the process of "word-of-mouth" recommendations: items are recommended to a…
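The word-of-mouth automation idea can be sketched in miniature, assuming user-based filtering with cosine similarity over hypothetical class ratings (the article does not specify a particular algorithm):

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse rating vectors (dict: item -> rating)."""
    common = set(u) & set(v)
    num = sum(u[i] * v[i] for i in common)
    den = math.sqrt(sum(x * x for x in u.values())) * math.sqrt(sum(x * x for x in v.values()))
    return num / den if den else 0.0

def recommend(target, others, k=2):
    """Rank items the target has not rated by similarity-weighted ratings of the
    k most like-minded users: automated word of mouth."""
    ranked = sorted(others.values(), key=lambda r: cosine(target, r), reverse=True)[:k]
    scores = {}
    for ratings in ranked:
        sim = cosine(target, ratings)
        for item, rating in ratings.items():
            if item not in target:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical 1-5 ratings of classes by past students.
others = {
    "ann": {"DB101": 5, "AI201": 4, "NET150": 2},
    "bob": {"DB101": 4, "AI201": 5},
    "cat": {"NET150": 5, "DB101": 1},
}
me = {"DB101": 5, "AI201": 5}
recommendations = recommend(me, others)  # NET150 is the only class unseen by "me"
```

Students with rating histories similar to the target's count for more, which is exactly the "people like you liked this class" recommendation the article describes.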

  5. Managing Approach Plate Information Study (MAPLIST): An Information Requirements Analysis of Approach Chart Use

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.; Jonnson, Jon E.; Barry, John S.

    1996-01-01

    Adequately presenting all necessary information on an approach chart represents a challenge for cartographers. Since many tasks associated with using approach charts are cognitive (e.g., planning the approach and monitoring its progress), and since the characteristic of a successful interface is one that conforms to the users' mental models, understanding pilots' underlying models of approach chart information would greatly assist cartographers. To provide such information, a new methodology was developed for this study that enhances traditional information requirements analyses by combining psychometric scaling techniques with a simulation task to provide quantifiable links between pilots' cognitive representations of approach information and their use of approach information. Results of this study should augment previous information requirements analyses by identifying what information is acquired, when it is acquired, and what presentation concepts might facilitate its efficient use by better matching the pilots' cognitive model of the information. The primary finding in this study indicated that pilots mentally organize approach chart information into ten primary categories: communications, geography, validation, obstructions, navigation, missed approach, final items, other runways, visibility requirement, and navigation aids. These similarity categories were found to underlie the pilots' information acquisitions, other mental models, and higher level cognitive processes that are used to accomplish their approach and landing tasks.

  6. Socio-Pedagogical Priorities of the Educational Process at the University: The Didactic Aspect of Information Technology

    ERIC Educational Resources Information Center

    Rassolov, Ilya M.; Sidyacheva, Natalya V.; Zotova, Larisa E.; Salitova, Feride Sch.; Konyushenko, Svetlana M.; Gzhemskaya, Nuriya Kh.

    2016-01-01

    The relevance of the study is conditioned by intensive introduction of information technologies in the educational process of the University. Analysis of practical activities of University groups shows that in the absence of science-based approaches to the implementation of information technologies in the educational process, there are increasing…

  7. Mission informed needed information: discoverable, available sensing sources (MINI-DASS): the operators and process flows the magic rabbits must negotiate

    NASA Astrophysics Data System (ADS)

    Kolodny, Michael A.

    2017-05-01

Today's battlefield space is extremely complex, dealing with an enemy that is neither well-defined nor well-understood. Adversaries are comprised of widely-distributed, loosely-networked groups engaging in nefarious activities. Situational understanding is needed by decision makers; understanding of adversarial capabilities and intent is essential. Information needed at any time is dependent on the mission/task at hand. Information sources potentially providing mission-relevant information are disparate and numerous; they include sensors, social networks, fusion engines, internet, etc. Management of these multi-dimensional informational sources is critical. This paper will present a new approach being undertaken to answer the challenge of enhancing battlefield understanding by optimizing the utilization of available informational sources (means) to required missions/tasks as well as determining the "goodness" of the information acquired in meeting the capabilities needed. Requirements are usually expressed in terms of a presumed technology solution (e.g., imagery). A metaphor of the "magic rabbits" was conceived to remove presumed technology solutions from requirements by claiming the "required" technology is obsolete. Instead, intelligent "magic rabbits" are used to provide needed information. The question then becomes: "WHAT INFORMATION DO YOU NEED THE RABBITS TO PROVIDE YOU?" This paper will describe a new approach called Mission-Informed Needed Information - Discoverable, Available Sensing Sources (MINI-DASS) that designs a process that builds information acquisition missions and determines what the "magic rabbits" need to provide in a manner that is machine understandable. Also described is the Missions and Means Framework (MMF) model used, the process flow utilized, the approach to developing an ontology of information source means and the approach for determining the value of the information acquired.

  8. A Social Information Processing Approach to Job Attitudes and Task Design

    ERIC Educational Resources Information Center

    Salancik, Gerald R.; Pfeffer, Jeffrey

    1978-01-01

    In comparison with need-satisfaction and expectancy models of job attitudes and motivation, the social information processing perspective emphasizes the effects of context and the consequences of past choices, rather than individual predispositions and rational decision-making processes. (Author)

  9. A Qualitative Case Study Approach To Examine Information Resources Management. (Utilisation d'une Approche Qualitative par Methode de cas pour Etudier la Gestion des Ressources D'information).

    ERIC Educational Resources Information Center

    Bergeron, Pierrette

    1997-01-01

    Illustrates how a qualitative approach was used to study the complex and poorly defined concept of information resources management. Explains the general approach to data collection, its advantages and limitations, and the process used to analyze the data. Presents results, along with lessons learned through using method. (Author/AEF)

  10. Understanding Language: An Information-Processing Analysis of Speech Perception, Reading, and Psycholinguistics.

    ERIC Educational Resources Information Center

    Massaro, Dominic W., Ed.

    In an information-processing approach to language processing, language processing is viewed as a sequence of psychological stages that occur between the initial presentation of the language stimulus and the meaning in the mind of the language processor. This book defines each of the processes and structures involved, explains how each of them…

  11. An Ontological Informatics Framework for Pharmaceutical Product Development: Milling as a Case Study

    ERIC Educational Resources Information Center

    Akkisetty, Venkata Sai Pavan Kumar

    2009-01-01

    Pharmaceutical product development is an expensive, time consuming and information intensive process. Providing the right information at the right time is of great importance in pharmaceutical industry. To achieve this, knowledge management is the approach to deal with the humongous quantity of information. Ontological approach proposed in Venkat…

  12. Electrochemical Probing through a Redox Capacitor To Acquire Chemical Information on Biothiols

    PubMed Central

    2016-01-01

    The acquisition of chemical information is a critical need for medical diagnostics, food/environmental monitoring, and national security. Here, we report an electrochemical information processing approach that integrates (i) complex electrical inputs/outputs, (ii) mediators to transduce the electrical I/O into redox signals that can actively probe the chemical environment, and (iii) a redox capacitor that manipulates signals for information extraction. We demonstrate the capabilities of this chemical information processing strategy using biothiols because of the emerging importance of these molecules in medicine and because their distinct chemical properties allow evaluation of hypothesis-driven information probing. We show that input sequences can be tailored to probe for chemical information both qualitatively (step inputs probe for thiol-specific signatures) and quantitatively. Specifically, we observed picomolar limits of detection and linear responses to concentrations over 5 orders of magnitude (1 pM–0.1 μM). This approach allows the capabilities of signal processing to be extended for rapid, robust, and on-site analysis of chemical information. PMID:27385047

  13. Electrochemical Probing through a Redox Capacitor To Acquire Chemical Information on Biothiols.

    PubMed

    Liu, Zhengchun; Liu, Yi; Kim, Eunkyoung; Bentley, William E; Payne, Gregory F

    2016-07-19

    The acquisition of chemical information is a critical need for medical diagnostics, food/environmental monitoring, and national security. Here, we report an electrochemical information processing approach that integrates (i) complex electrical inputs/outputs, (ii) mediators to transduce the electrical I/O into redox signals that can actively probe the chemical environment, and (iii) a redox capacitor that manipulates signals for information extraction. We demonstrate the capabilities of this chemical information processing strategy using biothiols because of the emerging importance of these molecules in medicine and because their distinct chemical properties allow evaluation of hypothesis-driven information probing. We show that input sequences can be tailored to probe for chemical information both qualitatively (step inputs probe for thiol-specific signatures) and quantitatively. Specifically, we observed picomolar limits of detection and linear responses to concentrations over 5 orders of magnitude (1 pM-0.1 μM). This approach allows the capabilities of signal processing to be extended for rapid, robust, and on-site analysis of chemical information.

  14. Modeling the dynamics of multipartite quantum systems created departing from two-level systems using general local and non-local interactions

    NASA Astrophysics Data System (ADS)

    Delgado, Francisco

    2017-12-01

Quantum information is an emergent area merging physics, mathematics, computer science and engineering. To reach its technological goals, it requires adequate approaches to understanding how to combine physical restrictions, computational approaches and technological requirements to obtain functional, universal quantum information processing. This work presents the modeling and analysis of a certain general type of Hamiltonian representing several physical systems used in quantum information, and establishes a dynamics reduction in a natural grammar for bipartite processing based on entangled states.
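As a concrete, simplified instance of bipartite dynamics built from two-level systems, the sketch below evolves a two-qubit state under an XY (exchange) interaction, which acts only on the {|01>, |10>} subspace. This is an illustration of the kind of system such Hamiltonians describe, not the general Hamiltonian analyzed in the paper.

```python
import math

# Exchange (XY) interaction between two two-level systems:
# H = J (sigma_x ⊗ sigma_x + sigma_y ⊗ sigma_y) / 2 acts only on the {|01>, |10>}
# subspace, where it coherently swaps excitation amplitude between the qubits.
def evolve(state, J, t):
    """Exact evolution of a 2-qubit state [c00, c01, c10, c11] under the XY Hamiltonian."""
    c00, c01, c10, c11 = state
    c, s = math.cos(J * t), math.sin(J * t)
    return [c00,
            c * c01 - 1j * s * c10,
            c * c10 - 1j * s * c01,
            c11]

psi0 = [0, 1, 0, 0]                        # excitation on the second qubit: |01>
half = evolve(psi0, J=1.0, t=math.pi / 4)  # "half swap": a maximally entangled state
full = evolve(psi0, J=1.0, t=math.pi / 2)  # full swap: excitation moved to |10> (up to phase)
```

At the half-swap time the amplitudes on |01> and |10> are equal in magnitude, which is the entangling resource the abstract's "bipartite processing based on entangled states" refers to.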

  15. Approaching the Affective Factors of Information Seeking: The Viewpoint of the Information Search Process Model

    ERIC Educational Resources Information Center

    Savolainen, Reijo

    2015-01-01

    Introduction: The article contributes to the conceptual studies of affective factors in information seeking by examining Kuhlthau's information search process model. Method: This random-digit dial telephone survey of 253 people (75% female) living in a rural, medically under-serviced area of Ontario, Canada, follows-up a previous interview study…

  16. Quantum Approach to Informatics

    NASA Astrophysics Data System (ADS)

    Stenholm, Stig; Suominen, Kalle-Antti

    2005-08-01

An essential overview of quantum information. Information, whether inscribed as a mark on a stone tablet or encoded as a magnetic domain on a hard drive, must be stored in a physical object and thus made subject to the laws of physics. Traditionally, information processing such as computation occurred in a framework governed by laws of classical physics. However, information can also be stored and processed using the states of matter described by non-classical quantum theory. Understanding this quantum information, a fundamentally different type of information, has been a major project of physicists and information theorists in recent years, and recent experimental research has started to yield promising results. Quantum Approach to Informatics fills the need for a concise introduction to this burgeoning new field, offering an intuitive approach for readers in both the physics and information science communities, as well as in related fields. Only a basic background in quantum theory is required, and the text keeps the focus on bringing this theory to bear on contemporary informatics. Instead of proofs and other highly formal structures, detailed examples present the material, making this a uniquely accessible introduction to quantum informatics. Topics covered include:
* An introduction to quantum information and the qubit
* Concepts and methods of quantum theory important for informatics
* The application of information concepts to quantum physics
* Quantum information processing and computing
* Quantum gates
* Error correction using quantum-based methods
* Physical realizations of quantum computing circuits
A helpful and economical resource for understanding this exciting new application of quantum theory to informatics, Quantum Approach to Informatics provides students and researchers in physics and information science, as well as other interested readers with some scientific background, with an essential overview of the field.

  17. PROCRU: A model for analyzing crew procedures in approach to landing

    NASA Technical Reports Server (NTRS)

    Baron, S.; Muralidharan, R.; Lancraft, R.; Zacharias, G.

    1980-01-01

A model for analyzing crew procedures in approach to landing is developed. The model employs the information processing structure used in the optimal control model and in recent models for monitoring and failure detection. Mechanisms are added to this basic structure to model crew decision making in this multitask environment. Decisions are based on probability assessments and potential mission impact (or gain). Submodels for procedural activities are included. The model distinguishes among external visual, instrument visual, and auditory sources of information. The external visual scene perception models incorporate limitations in obtaining information. The auditory information channel contains a buffer to allow for storage in memory until that information can be processed.

  18. Reshaping the Enterprise through an Information Architecture and Process Reengineering.

    ERIC Educational Resources Information Center

    Laudato, Nicholas C.; DeSantis, Dennis J.

    1995-01-01

    The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…

  19. A Physiological Approach to the Study of Human Information Processing.

    ERIC Educational Resources Information Center

    Fletcher, James E.

Soviet neuropsychologist Sokolov's notions of tonic and phasic orienting responses and of defense responses are examined for relevance to individual information processing. The phasic orienting response provides an index to attention and to information demands generated by the cerebral cortex. The sum of orienting responses elicited by a message…

  20. Informations in Models of Evolutionary Dynamics

    NASA Astrophysics Data System (ADS)

    Rivoire, Olivier

    2016-03-01

    Biological organisms adapt to changes by processing informations from different sources, most notably from their ancestors and from their environment. We review an approach to quantify these informations by analyzing mathematical models of evolutionary dynamics and show how explicit results are obtained for a solvable subclass of these models. In several limits, the results coincide with those obtained in studies of information processing for communication, gambling or thermodynamics. In the most general case, however, information processing by biological populations shows unique features that motivate the analysis of specific models.
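The link to gambling mentioned in this abstract can be made concrete with a classic worked example: for a bet-hedging population with all-or-nothing payoffs, the long-term growth-rate gain from a perfect environmental cue equals the entropy of the environment, i.e. the mutual information between cue and environment (Kelly-style proportional betting). The environment probabilities below are illustrative.

```python
import math

def log2(x):
    return math.log(x, 2)

def growth(strategy, env_probs):
    """Long-term growth rate (in bits per generation) of a bet-hedging population
    that puts a fraction strategy[e] of offspring into the phenotype matched to
    environment e; only matched individuals survive (all-or-nothing payoff)."""
    return sum(p * log2(strategy[e]) for e, p in env_probs.items())

env = {"wet": 0.7, "dry": 0.3}

# No cue: the optimal strategy bets proportionally on the prior (Kelly betting).
g_blind = growth({"wet": 0.7, "dry": 0.3}, env)

# Perfect cue: bet everything on the announced environment each generation.
g_cued = sum(p * log2(1.0) for e, p in env.items())  # = 0

# Growth-rate gain from the cue equals the environment's entropy,
# which for a perfect cue is exactly I(cue; env).
gain = g_cued - g_blind
H = -(0.7 * log2(0.7) + 0.3 * log2(0.3))
```

This is one of the solvable limits the review alludes to; with noisy cues or partial payoffs the gain is bounded by, but no longer equal to, the mutual information.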

  1. Organisation of biotechnological information into knowledge.

    PubMed

    Boh, B

    1996-09-01

    The success of biotechnological research, development and marketing depends to a large extent on the international transfer of information and on the ability to organise biotechnology information into knowledge. To increase the efficiency of information-based approaches, an information strategy has been developed and consists of the following stages: definition of the problem, its structure and sub-problems; acquisition of data by targeted processing of computer-supported bibliographic, numeric, textual and graphic databases; analysis of data and building of specialized in-house information systems; information processing for structuring data into systems, recognition of trends and patterns of knowledge, particularly by information synthesis using the concept of information density; design of research hypotheses; testing hypotheses in the laboratory and/or pilot plant; repeated evaluation and optimization of hypotheses by information methods and testing them by further laboratory work. The information approaches are illustrated by examples from the university-industry joint projects in biotechnology, biochemistry and agriculture.

  2. Aligning observed and modelled behaviour based on workflow decomposition

    NASA Astrophysics Data System (ADS)

    Wang, Lu; Du, YuYue; Liu, Wei

    2017-09-01

When business processes are mostly supported by information systems, the availability of event logs generated from these systems increases, as does the need for appropriate process models. Business processes can be discovered, monitored and enhanced by extracting process-related information. However, some events cannot be correctly identified because of the explosion in the volume of event logs. Therefore, a new process mining technique is proposed in this paper, based on a workflow decomposition method. Petri nets (PNs) are used to describe business processes, and conformance checking of event logs against process models is then investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently, while an alignment approach based on a state-equation method from PN theory enhances the performance of conformance checking. Both approaches are implemented in the process mining framework ProM. The correctness and effectiveness of the proposed methods are illustrated through experiments.
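The token-replay idea behind conformance checking can be sketched in a few lines; this is an illustrative toy, not the decomposition or state-equation alignment the record describes, and the net and activity names are invented:

```python
# Minimal token-replay conformance check on a Petri net (illustrative sketch;
# the paper's actual method uses decomposition and a state-equation alignment).

# A transition consumes one token from each input place, produces one in each output.
NET = {  # transition -> (input places, output places)
    "register": ({"start"}, {"p1"}),
    "check":    ({"p1"},    {"p2"}),
    "decide":   ({"p2"},    {"end"}),
}

def replay(trace, net, initial={"start"}, final={"end"}):
    """Replay a trace; count tokens that had to be created ('missing')."""
    marking = {p: 1 for p in initial}
    missing = 0
    for event in trace:
        ins, outs = net[event]
        for p in ins:
            if marking.get(p, 0) > 0:
                marking[p] -= 1
            else:
                missing += 1          # deviation: token forced into place p
        for p in outs:
            marking[p] = marking.get(p, 0) + 1
    fits = missing == 0 and all(marking.get(p, 0) == 1 for p in final)
    return fits, missing

print(replay(["register", "check", "decide"], NET))   # → (True, 0)
print(replay(["register", "decide"], NET))            # → (False, 1): 'check' skipped
```

A trace fits when it replays without forcing tokens and ends in the final marking; the missing-token count is a crude deviation measure.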

  3. Trait-Treatment Interactions (TTI), Cognitive Processes and Research on Communication Media.

    ERIC Educational Resources Information Center

    Di Vesta, Francis J.

The Trait Treatment Interaction (TTI) Process approach is particularly adapted to the study of information-processing by receivers of information presented in the media. Differences in people's experiences do lead to different cognitive structures. Different people use the same machinery of perceiving, coding, storing, and retrieving. Nevertheless,…

  4. Information Processing and Dynamics in Minimally Cognitive Agents

    ERIC Educational Resources Information Center

    Beer, Randall D.; Williams, Paul L.

    2015-01-01

    There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we…

  5. From open source communications to knowledge

    NASA Astrophysics Data System (ADS)

    Preece, Alun; Roberts, Colin; Rogers, David; Webberley, Will; Innes, Martin; Braines, Dave

    2016-05-01

Rapid processing and exploitation of open source information, including social media sources, in order to shorten decision-making cycles, has emerged as an important issue in intelligence analysis in recent years. Through a series of case studies and natural experiments, focussed primarily upon policing and counter-terrorism scenarios, we have developed an approach to information foraging and framing to inform decision making, drawing upon open source intelligence, in particular Twitter, due to its real-time focus and frequent use as a carrier for links to other media. Our work uses a combination of natural language (NL) and controlled natural language (CNL) processing to support information collection from human sensors, linking and schematising of collected information, and the framing of situational pictures. We illustrate the approach through a series of vignettes, highlighting (1) how relatively lightweight and reusable knowledge models (schemas) can rapidly be developed to add context to collected social media data, (2) how information from open sources can be combined with reports from trusted observers, for corroboration or to identify conflicting information; and (3) how the approach supports users operating at or near the tactical edge, to rapidly task information collection and inform decision-making. The approach is supported by bespoke software tools for social media analytics and knowledge management.

  6. Frontiers in Human Information Processing Conference

    DTIC Science & Technology

    2008-02-25

Frontiers in Human Information Processing - Vision, Attention, Memory, and Applications: A Tribute to George Sperling, a Festschrift. We are grateful...with focus on the formal, computational, and mathematical approaches that unify the areas of vision, attention, and memory. The conference also...Information Processing Conference Final Report AFOSR GRANT # FA9550-07-1-0346 The AFOSR Grant # FA9550-07-1-0346 provided partial support for the Conference

  7. Approach--avoidance motivation and information processing: a cross-cultural analysis.

    PubMed

    Hamamura, Takeshi; Meijer, Zita; Heine, Steven J; Kamaya, Kengo; Hori, Izumi

    2009-04-01

Much recent research suggests that North Americans more frequently experience approach motivations and East Asians more frequently experience avoidance motivations. The current research explores some cognitive implications of this cultural difference. North Americans should be more attentive to approach-oriented information, whereas East Asians should be more attentive to avoidance-oriented information. Three studies confirmed this hypothesis. When asked to recall information framed in either approach or avoidance terms, a predicted interaction between culture and information frame was observed (Studies 1 and 2). Moreover, analyses of consumer book reviews found that among reviews rated as helpful, approach-focused content was more prevalent in American reviews, whereas avoidance-focused content was more prevalent in Japanese reviews (Study 3). Findings from the current research add to the growing literature of cross-cultural research on approach-avoidance motivations.

  8. 75 FR 45112 - Call for Information: Information on Greenhouse Gas Emissions Associated With Bioenergy and Other...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-02

    ... information and viewpoints from interested parties on approaches to accounting for greenhouse gas emissions... (BACT) review process under PSD? In addition, the first full sentence of the third bulleted item in... is: ``The Clean Air Act (CAA) provisions typically apply at the unit, process, or facility scale...

  9. Using High Performance Computing to Examine the Processes of Neurogenesis Underlying Pattern Separation and Completion of Episodic Information.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aimone, James Bradley; Bernard, Michael Lewis; Vineyard, Craig Michael

    2014-10-01

Adult neurogenesis in the hippocampus region of the brain is a neurobiological process that is believed to contribute to the brain's advanced abilities in complex pattern recognition and cognition. Here, we describe how realistic scale simulations of the neurogenesis process can offer both a unique perspective on the biological relevance of this process and confer computational insights that are suggestive of novel machine learning techniques. First, supercomputer based scaling studies of the neurogenesis process demonstrate how a small fraction of adult-born neurons have a uniquely larger impact in biologically realistic scaled networks. Second, we describe a novel technical approach by which the information content of ensembles of neurons can be estimated. Finally, we illustrate several examples of broader algorithmic impact of neurogenesis, including both extending existing machine learning approaches and novel approaches for intelligent sensing.

  10. Effectiveness of Information Processing Strategy Training on Academic Task Performance in Children with Learning Disabilities: A Pilot Study.

    PubMed

    Juntorn, Sutinun; Sriphetcharawut, Sarinya; Munkhetvit, Peeraya

    2017-01-01

Learning disabilities (LD) can be associated with problems in the four stages of information processing used in learning: input, throughput, output, and feedback. These problems affect the child's ability to learn and perform activities in daily life, especially during academic activities. This study is a pilot study aimed at investigating the effectiveness of information processing strategy training using a combination of two approaches that address the ability to apply processing strategies during academic activities in children with LD. The two approaches are the Perceive, Recall, Plan, and Perform (PRPP) System of Intervention, which is a strategy training intervention, and the Four-Quadrant Model (4QM) of Facilitated Learning approach, which is a systematic facilitator technique. Twenty children with LD were assigned to two groups: the experimental group (n = 10) and the control group (n = 10). Children in the experimental group received the intervention twice a week for 6 consecutive weeks. Each treatment session took approximately 50 minutes. Children in the control group received traditional intervention twice a week for 6 consecutive weeks. The results indicated that the combination of the PRPP System of Intervention and the 4QM may improve the participants' ability to apply information processing strategies during academic activities.

  11. Data Entities and Information System Matrix for Integrated Agriculture Information System (IAIS)

    NASA Astrophysics Data System (ADS)

    Budi Santoso, Halim; Delima, Rosa

    2018-03-01

Integrated Agriculture Information System is a system developed to process data, information, and knowledge in the agriculture sector. It provides farmers with valuable information on: (1) fertilizer prices; (2) agricultural techniques and practices; (3) pest management; (4) cultivation; (5) irrigation; (6) post-harvest processing; (7) innovation in agricultural processing. The Integrated Agriculture Information System contains 9 subsystems. To deliver integrated information to users and stakeholders, an integrated database approach is needed. Thus, the researchers describe the data entities and their matrix in relation to the subsystems of the Integrated Agriculture Information System (IAIS). As a result, 47 data entities are identified for the single, integrated database.

  12. Parallel approach to incorporating face image information into dialogue processing

    NASA Astrophysics Data System (ADS)

    Ren, Fuji

    2000-10-01

There are many kinds of so-called irregular expressions in natural dialogues. Even if the words of a conversation are the same, different meanings can be conveyed depending on the speaker's feelings or facial expression. For a good understanding of dialogues, a flexible dialogue processing system must infer the speaker's view properly. However, it is difficult to obtain the meaning of the speaker's sentences in various scenes using traditional methods. In this paper, a new approach to dialogue processing that incorporates information from the speaker's face is presented. We first divide conversation statements into several simple tasks. Second, we process each simple task using an independent processor. Third, we employ information from the speaker's face to estimate the speaker's view and resolve ambiguities in dialogues. The approach presented in this paper works efficiently because the independent processors run in parallel, writing partial results to a shared memory, incorporating partial results at appropriate points, and complementing each other. A parallel algorithm and a method for employing facial information in dialogue machine translation are discussed, and some results are included in this paper.

  13. Processing Coordinated Verb Phrases: The Relevance of Lexical-Semantic, Conceptual, and Contextual Information towards Establishing Verbal Parallelism

    ERIC Educational Resources Information Center

    Tutunjian, Damon A.

    2010-01-01

    This dissertation examines the influence of lexical-semantic representations, conceptual similarity, and contextual fit on the processing of coordinated verb phrases. The study integrates information gleaned from current linguistic theory with current psycholinguistic approaches to examining the processing of coordinated verb phrases. It has…

  14. Information Design: A New Approach to Teaching Technical Writing Service Courses

    ERIC Educational Resources Information Center

    McKee, Candie DeLane

    2012-01-01

    This study used a needs assessment, process analysis, process design, and textbook design to develop a new process and new textbook, based on Cargile-Cook's layered literacies, Quesenbery's five qualities of usability, and Carliner's information design theories, for use in technical writing service learning courses. The needs assessment was based…

  15. Validating commercial remote sensing and spatial information (CRS&SI) technologies for streamlining environmental and planning processes in transportation projects.

    DOT National Transportation Integrated Search

    2010-03-01

Transportation corridor-planning processes are well understood, and consensus exists among practitioners about common practices for stages and tasks included in traditional EIS approaches. However, traditional approaches do not typically employ f...

  16. Real-Time Nonlinear Optical Information Processing.

    DTIC Science & Technology

    1979-06-01

operations are presented. One approach realizes the halftone method of nonlinear optical processing in real time by replacing the conventional...photographic recording medium with a real-time image transducer. In the second approach halftoning is eliminated and the real-time device is used directly

  17. Design and Implementation of Hydrologic Process Knowledge-base Ontology: A case study for the Infiltration Process

    NASA Astrophysics Data System (ADS)

    Elag, M.; Goodall, J. L.

    2013-12-01

Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods, rather than shared standards, for organizing and sharing information about a hydrologic process. A knowledge-base ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using the Formal Concept Approach (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology.
Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
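The Green-Ampt method cited in the record reduces to an implicit equation for cumulative infiltration, F = Kt + ψΔθ ln(1 + F/(ψΔθ)), which a short fixed-point iteration can solve; the soil parameter values below are illustrative, not taken from the record:

```python
import math

def green_ampt_cumulative(t, K, psi, dtheta, tol=1e-8):
    """Cumulative infiltration F(t) from the implicit Green-Ampt relation
    F = K*t + psi*dtheta*ln(1 + F/(psi*dtheta)), by fixed-point iteration."""
    s = psi * dtheta                      # suction head * moisture deficit
    F = max(K * t, tol)                   # initial guess
    for _ in range(200):
        F_new = K * t + s * math.log(1.0 + F / s)
        if abs(F_new - F) < tol:
            break
        F = F_new
    return F

def green_ampt_rate(F, K, psi, dtheta):
    """Potential infiltration rate f = K*(1 + psi*dtheta/F)."""
    return K * (1.0 + psi * dtheta / F)

# Illustrative parameters (cm and hours): K=0.65, psi=16.7, dtheta=0.34
F = green_ampt_cumulative(t=1.0, K=0.65, psi=16.7, dtheta=0.34)
print(round(F, 3), round(green_ampt_rate(F, 0.65, 16.7, 0.34), 3))
```

The iteration converges because the right-hand side is a contraction in F for F > 0; the rate always exceeds the saturated conductivity K while ponding persists.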

  18. Clinical modeling--a critical analysis.

    PubMed

    Blobel, Bernd; Goossen, William; Brochhausen, Mathias

    2014-01-01

Modeling clinical processes (and their informational representation) is a prerequisite for optimally enabling and supporting high quality and safe care through information and communication technology and meaningful use of gathered information. The paper investigates existing approaches to clinical modeling, systematically analyzing the underlying principles, the consistency with and opportunities for integration into other existing or emerging projects, as well as the correctness of representing the reality of health and health services. The analysis is performed using an architectural framework for modeling real-world systems. In addition, fundamental work on the representation of facts, relations, and processes in the clinical domain by ontologies is applied, including the integration of advanced methodologies such as translational and system medicine. The paper demonstrates fundamental weaknesses and different maturity as well as evolutionary potential in the approaches considered. It offers a development process starting with the business domain and its ontologies, continuing with the Reference Model-Open Distributed Processing (RM-ODP) related conceptual models in the ICT ontology space, the information and the computational view, and concluding with the implementation details represented as engineering and technology view, respectively. The existing approaches reflect the clinical domain at different levels and focus on different phases of the development process instead of first establishing a representation of the real business process; they therefore enable domain experts' involvement to varying and partially limited degrees. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  19. Medication incident reporting in residential aged care facilities: Limitations and risks to residents’ safety

    PubMed Central

    2012-01-01

Background Medication incident reporting (MIR) is a key safety critical care process in residential aged care facilities (RACFs). Retrospective studies of medication incident reports in aged care have identified the inability of existing MIR processes to generate information that can be used to enhance residents' safety. However, there is little existing research that investigates the limitations of the existing information exchange process that underpins MIR, despite the considerable resources that RACFs devote to the MIR process. The aim of this study was to undertake an in-depth exploration of the information exchange process involved in MIR and identify factors that inhibit the collection of meaningful information in RACFs. Methods The study was undertaken in three RACFs (part of a large non-profit organisation) in NSW, Australia. A total of 23 semi-structured interviews and 62 hours of observation sessions were conducted between May and July 2011. The qualitative data were iteratively analysed using a grounded theory approach. Results The findings highlight significant gaps in the design of the MIR artefacts as well as information exchange issues in MIR process execution. Study results emphasized the need to: a) design MIR artefacts that facilitate identification of the root causes of medication incidents, b) integrate the MIR process within existing information systems to overcome key gaps in information exchange execution, and c) support exchange of information that can facilitate a multi-disciplinary approach to medication incident management in RACFs. Conclusions This study highlights the advantages of viewing the MIR process holistically rather than as segregated tasks, as a means to identify gaps in information exchange that need to be addressed in practice to improve safety critical processes. PMID:23122411

  20. An object-oriented software approach for a distributed human tracking motion system

    NASA Astrophysics Data System (ADS)

    Micucci, Daniela L.

    2003-06-01

Tracking is a composite job involving the co-operation of autonomous activities which exploit a complex information model and rely on a distributed architecture. Both information and activities must be classified and related in several dimensions: abstraction levels (what is modelled and how information is processed); topology (where the modelled entities are); time (when entities exist); strategy (why something happens); responsibilities (who is in charge of processing the information). A proper Object-Oriented analysis and design approach leads to a modular architecture where information about conceptual entities is modelled at each abstraction level via classes and intra-level associations, whereas inter-level associations between classes model the abstraction process. Both information and computation are partitioned according to level-specific topological models. They are also placed in a temporal framework modelled by suitable abstractions. Domain-specific strategies control the execution of the computations. Computational components perform both intra-level processing and inter-level information conversion. The paper overviews the phases of the analysis and design process, presents major concepts at each abstraction level, and shows how the resulting design turns into a modular, flexible and adaptive architecture. Finally, the paper sketches how the conceptual architecture can be deployed into a concrete distributed architecture by relying on an experimental framework.

  1. An information-based approach to change-point analysis with applications to biophysics and cell biology.

    PubMed

    Wiggins, Paul A

    2015-07-21

    This article describes the application of a change-point algorithm to the analysis of stochastic signals in biological systems whose underlying state dynamics consist of transitions between discrete states. Applications of this analysis include molecular-motor stepping, fluorophore bleaching, electrophysiology, particle and cell tracking, detection of copy number variation by sequencing, tethered-particle motion, etc. We present a unified approach to the analysis of processes whose noise can be modeled by Gaussian, Wiener, or Ornstein-Uhlenbeck processes. To fit the model, we exploit explicit, closed-form algebraic expressions for maximum-likelihood estimators of model parameters and estimated information loss of the generalized noise model, which can be computed extremely efficiently. We implement change-point detection using the frequentist information criterion (which, to our knowledge, is a new information criterion). The frequentist information criterion specifies a single, information-based statistical test that is free from ad hoc parameters and requires no prior probability distribution. We demonstrate this information-based approach in the analysis of simulated and experimental tethered-particle-motion data. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
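The segment-likelihood machinery behind such change-point methods can be sketched for the simplest case, a single mean shift in unit-variance Gaussian noise; the penalty here is a BIC-style stand-in, since the record's frequentist information criterion defines its own:

```python
import math, random

def best_change_point(x, penalty=None):
    """Single change-point search for a mean shift in unit-variance Gaussian
    noise. Accept the split only if the log-likelihood gain exceeds an
    information penalty (BIC-style stand-in; the paper's frequentist
    information criterion defines its own penalty)."""
    n = len(x)
    if penalty is None:
        penalty = 2.0 * math.log(n)          # cost of the extra parameters
    def sse(seg):                            # sum of squared deviations from mean
        m = sum(seg) / len(seg)
        return sum((v - m) ** 2 for v in seg)
    total = sse(x)
    best_k, best_gain = None, 0.0
    for k in range(2, n - 1):
        gain = 0.5 * (total - sse(x[:k]) - sse(x[k:]))   # logL gain, unit variance
        if gain > best_gain:
            best_k, best_gain = k, gain
    return (best_k, best_gain) if best_gain > penalty else (None, best_gain)

random.seed(1)
signal = [random.gauss(0, 1) for _ in range(100)] + [random.gauss(3, 1) for _ in range(100)]
k, gain = best_change_point(signal)
print(k, round(gain, 1))   # split detected near index 100
```

Closed-form maximum-likelihood fits per segment keep the scan cheap; the real method generalizes this to Wiener and Ornstein-Uhlenbeck noise and to many change points.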

  2. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

This paper describes the first scalable implementation of a text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as Pubmed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.

  3. Toward an integrative understanding of narrative and emotion processes in Emotion-focused therapy of depression: implications for theory, research and practice.

    PubMed

    Angus, Lynne

    2012-01-01

    This paper addresses the fundamental contributions of client narrative disclosure in psychotherapy and its importance for the elaboration of new emotional meanings and self understanding in the context of Emotion-focused therapy (EFT) of depression. An overview of the multi-methodological steps undertaken to empirically investigate the contributions of client story telling, emotional differentiation and meaning-making processes (Narrative Processes Coding System; Angus et al., 1999) in EFT treatments of depression is provided, followed by a summary of key research findings that informed the development of a narrative-informed approach to Emotion-focused therapy of depression (Angus & Greenberg, 2011). Finally, the clinical practice and training implications of adopting a research-informed approach to working with narrative and emotion processes in EFT are described, and future research directions discussed.

  4. Estimating the decomposition of predictive information in multivariate systems

    NASA Astrophysics Data System (ADS)

    Faes, Luca; Kugiumtzis, Dimitris; Nollo, Giandomenico; Jurysta, Fabrice; Marinazzo, Daniele

    2015-03-01

In the study of complex systems from observed multivariate time series, the evolution of one system under investigation may be explained by the information storage of the system itself and the information transfer from other interacting systems. We present a framework for the model-free estimation of information storage and information transfer, computed as the terms composing the predictive information about the target of a multivariate dynamical process. The approach tackles the curse of dimensionality by employing a nonuniform embedding scheme that selects progressively, among the past components of the multivariate process, only those that contribute most, in terms of conditional mutual information, to the present target process. Moreover, it computes all information-theoretic quantities using a nearest-neighbor technique designed to compensate for the bias due to the different dimensionality of individual entropy terms. The resulting estimators of prediction entropy, storage entropy, transfer entropy, and partial transfer entropy are tested on simulations of coupled linear stochastic and nonlinear deterministic dynamic processes, demonstrating the superiority of the proposed approach over traditional estimators based on uniform embedding. The framework is then applied to multivariate physiologic time series, resulting in physiologically well-interpretable information decompositions of cardiovascular and cardiorespiratory interactions during head-up tilt and of joint brain-heart dynamics during sleep.
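The transfer-entropy term in such decompositions can be illustrated with a crude plug-in estimator on discretized series (a stand-in for the record's bias-compensated nearest-neighbor technique, using only a one-step embedding):

```python
import math, random
from collections import Counter

def cmi(xs, ys, zs):
    """Conditional mutual information I(X;Y|Z) in bits, plug-in estimate
    from discrete samples."""
    n = len(xs)
    pxyz = Counter(zip(xs, ys, zs)); pxz = Counter(zip(xs, zs))
    pyz = Counter(zip(ys, zs));      pz  = Counter(zs)
    total = 0.0
    for (x, y, z), c in pxyz.items():
        total += (c / n) * math.log2((c * pz[z]) / (pxz[(x, z)] * pyz[(y, z)]))
    return total

def transfer_entropy(src, dst):
    """TE(src -> dst) = I(dst_t ; src_{t-1} | dst_{t-1}), one-step embedding."""
    return cmi(dst[1:], src[:-1], dst[:-1])

random.seed(0)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]                 # y copies x with a one-step delay
print(round(transfer_entropy(x, y), 2),
      round(transfer_entropy(y, x), 2))   # TE(x->y) near 1 bit, TE(y->x) near 0
```

The asymmetry of the estimate on this toy coupled pair is the directional signature that distinguishes transfer from storage.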

  5. Utilization of a Multi-Disciplinary Approach to Building Effective Command Centers: Process and Products

    DTIC Science & Technology

    2005-06-01

cognitive task analysis, organizational information dissemination and interaction, systems engineering, collaboration and communications processes, decision-making processes, and data collection and organization. By blending these diverse disciplines, command centers can be designed to support decision-making, cognitive analysis, information technology, and the human factors engineering aspects of Command and Control (C2). This model can then be used as a baseline when dealing with work in areas of business processes, workflow engineering, information management,

  6. Overview of Computer Security Certification and Accreditation. Final Report.

    ERIC Educational Resources Information Center

    Ruthberg, Zella G.; Neugent, William

    Primarily intended to familiarize ADP (automatic data processing) policy and information resource managers with the approach to computer security certification and accreditation found in "Guideline to Computer Security Certification and Accreditation," Federal Information Processing Standards Publications (FIPS-PUB) 102, this overview…

  7. Toward an Information-Processing Theory of Client Change in Counseling.

    ERIC Educational Resources Information Center

    Martin, Jack

    1985-01-01

    Information-processing models of client-centered and rational-emotive counseling are constructed that relate counseling skills and strategies employed in these approaches to hypothesized client cognitive changes. An integrated view of client cognitive change in counseling also is presented. (Author/BL)

  8. Patterns-Based IS Change Management in SMEs

    NASA Astrophysics Data System (ADS)

    Makna, Janis; Kirikova, Marite

The majority of information systems change management guidelines and standards are either too abstract or too bureaucratic to be easily applicable in small enterprises. This chapter proposes an approach, a method, and a prototype designed especially for information systems change management in small and medium enterprises. The approach is based on proven patterns of changes in the set of information systems elements. The set of elements was obtained by theoretical analysis of information systems and business process definitions and enterprise architectures. The patterns were evolved from a number of information systems theories and tested in 48 information systems change management projects. The prototype presents and helps to handle three basic change patterns, which help to anticipate the overall scope of changes related to particular elementary changes in an enterprise information system. The use of the prototype requires only basic knowledge of organizational business processes and information management.

  9. Teaching Business Process Management with Simulation in Graduate Business Programs: An Integrative Approach

    ERIC Educational Resources Information Center

    Saraswat, Satya Prakash; Anderson, Dennis M.; Chircu, Alina M.

    2014-01-01

    This paper describes the development and evaluation of a graduate level Business Process Management (BPM) course with process modeling and simulation as its integral component, being offered at an accredited business university in the Northeastern U.S. Our approach is similar to that found in other Information Systems (IS) education papers, and…

10. Using fuzzy fractal features of digital images for the material surface analysis

    NASA Astrophysics Data System (ADS)

    Privezentsev, D. G.; Zhiznyakov, A. L.; Astafiev, A. V.; Pugin, E. V.

    2018-01-01

Edge detection is an important task in image processing. There are many approaches in this area: the Sobel and Canny operators, among others. One of the promising techniques in image processing is the use of fuzzy logic and fuzzy sets theory, which can increase processing quality by representing information in its fuzzy form. Most existing fuzzy image processing methods switch to fuzzy sets at very late stages, which leads to the loss of some useful information. In this paper, a novel method of edge detection based on fuzzy image representation and fuzzy pixels is proposed. With this approach, we convert the image to fuzzy form in the first step. Different approaches to this conversion are described. Several membership functions for fuzzy pixel description, and requirements for their form, are given. A novel approach to edge detection based on the Sobel operator and fuzzy image representation is proposed. Experimental testing of the developed method was performed on remote sensing images.
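The core idea, a Sobel gradient mapped to a fuzzy edge membership, can be sketched as follows; the sigmoid membership function and its parameters are illustrative choices, not the paper's:

```python
import math

def sobel_fuzzy_edges(img, midpoint=2.0, slope=1.5):
    """Sobel gradient magnitude mapped to a fuzzy edge membership in [0, 1]
    via a sigmoid (one possible membership function; the paper leaves the
    choice of membership function open)."""
    h, w = len(img), len(img[0])
    mu = [[0.0] * w for _ in range(h)]        # borders stay at membership 0
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            gx = (img[i-1][j+1] + 2*img[i][j+1] + img[i+1][j+1]
                  - img[i-1][j-1] - 2*img[i][j-1] - img[i+1][j-1])
            gy = (img[i+1][j-1] + 2*img[i+1][j] + img[i+1][j+1]
                  - img[i-1][j-1] - 2*img[i-1][j] - img[i-1][j+1])
            g = math.hypot(gx, gy)
            mu[i][j] = 1.0 / (1.0 + math.exp(-slope * (g - midpoint)))
    return mu

# Vertical step edge between columns 2 and 3:
image = [[0, 0, 0, 9, 9, 9] for _ in range(5)]
membership = sobel_fuzzy_edges(image)
print([round(v, 2) for v in membership[2]])   # → [0.0, 0.05, 1.0, 1.0, 0.05, 0.0]
```

The soft [0, 1] output is what distinguishes the fuzzy formulation from a thresholded binary edge map.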

  11. Closed-Loop Estimation of Retinal Network Sensitivity by Local Empirical Linearization

    PubMed Central

    2018-01-01

Understanding how sensory systems process information depends crucially on identifying which features of the stimulus drive the response of sensory neurons, and which ones leave their response invariant. This task is made difficult by the many nonlinearities that shape sensory processing. Here, we present a novel perturbative approach to understand information processing by sensory neurons, where we linearize their collective response locally in stimulus space. We added small perturbations to reference stimuli and tested if they triggered visible changes in the responses, adapting their amplitude according to the previous responses with closed-loop experiments. We developed a local linear model that accurately predicts the sensitivity of the neural responses to these perturbations. Applying this approach to the rat retina, we estimated the optimal performance of a neural decoder and showed that the nonlinear sensitivity of the retina is consistent with an efficient encoding of stimulus information. Our approach can be used to characterize experimentally the sensitivity of neural systems to external stimuli locally, quantify experimentally the capacity of neural networks to encode sensory information, and relate their activity to behavior. PMID:29379871
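The underlying perturbative idea, a local linearization of a nonlinear response around a reference stimulus, can be illustrated with a finite-difference Jacobian on a toy response function (the function itself is invented for illustration; the experiments estimate this sensitivity from closed-loop recordings, not from an analytic model):

```python
def response(stimulus):
    # Toy nonlinear "neural" response to a 2-D stimulus (illustrative stand-in
    # for the measured collective response of the recorded neurons).
    s0, s1 = stimulus
    return (s0 * s0 + 0.5 * s1, s0 - s1 * s1)

def local_jacobian(f, ref, eps=1e-6):
    """Finite-difference Jacobian of f at the reference stimulus: the local
    linear model of sensitivity to small perturbations."""
    base = f(ref)
    cols = []
    for i in range(len(ref)):
        pert = list(ref)
        pert[i] += eps                      # perturb one stimulus dimension
        out = f(pert)
        cols.append([(o - b) / eps for o, b in zip(out, base)])
    return [list(row) for row in zip(*cols)]   # rows = outputs, cols = inputs

J = local_jacobian(response, [1.0, 2.0])
print([[round(v, 3) for v in row] for row in J])   # → [[2.0, 0.5], [1.0, -4.0]]
```

Rows of J say how strongly each response dimension reacts to each stimulus perturbation; directions with near-zero columns are the locally invariant features.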

  12. Worklist handling in workflow-enabled radiological application systems

    NASA Astrophysics Data System (ADS)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens

    2000-05-01

    For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase in efficiency and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which provide workflow participants automatically with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part for the end-users of an information system using a workflow management approach. Appropriate worklist design and implementation will influence the user friendliness of a system and will largely influence work efficiency. Technically, in current imaging department information system environments (modality-PACS-RIS installations), a data-driven approach has been taken: worklists -- if present at all -- are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide us with an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.

  13. Improving the claims process with EDI.

    PubMed

    Moynihan, J J

    1993-01-01

    Electronic data interchange (EDI) is redefining the healthcare claims process. The traditional managerial approach to claims processing emphasizes information flow within the patient accounting department and between patient accounting and other departments. EDI enlarges the scope of the claims process to include information exchange between providers and payers. Using EDI to improve both external and internal information exchange makes the claims process more efficient and less expensive. This article is excerpted from "The Healthcare Financial Manager's Guide to Healthcare EDI," by James J. Moynihan, published by the Healthcare Financial Management Association.

  14. Social Information Processing Patterns, Social Skills, and School Readiness in Preschool Children

    PubMed Central

    Ziv, Yair

    2012-01-01

    The links between social information processing, social competence, and school readiness were examined in this short-term longitudinal study with a sample of 198 preschool children. Data on social information processing were obtained via child interview, data on child social competence were obtained via teacher report, and data on school readiness were obtained via child assessment (early literacy skills) and teacher report (approaches to learning). Findings provided support for our hypothesis that both social information processing and social competence are related to school readiness. Social competence also partially mediated the link between social information processing and school readiness, thus supporting our hypothesis about an indirect path in which mental processes are translated into social skills and then translated into school readiness. PMID:23046690

  15. Creating ISO/EN 13606 archetypes based on clinical information needs.

    PubMed

    Rinner, Christoph; Kohler, Michael; Hübner-Bloder, Gudrun; Saboor, Samrend; Ammenwerth, Elske; Duftschmid, Georg

    2011-01-01

    Archetypes model individual EHR contents and build the basis of the dual-model approach used in the ISO/EN 13606 EHR architecture. We present an approach to create archetypes using an iterative development process. It includes automated generation of electronic case report forms from archetypes. We evaluated our approach by developing 128 archetypes which represent 446 clinical information items from the diabetes domain.

  16. Information Retrieval Using Hadoop Big Data Analysis

    NASA Astrophysics Data System (ADS)

    Motwani, Deepak; Madan, Madan Lal

    This paper concerns big data analysis, the cognitive operation of probing huge amounts of information in an attempt to uncover hidden patterns. Through big data analytics, organizations in both the public and private sectors have made a strategic decision to turn big data into competitive advantage. The primary task of extracting value from big data gives rise to a process for pulling information from multiple different sources; this process is known as extract, transform and load (ETL). The paper's approach extracts information from log files and research papers, reducing the effort required for pattern finding and summarization of documents from several sources. The work helps to better understand basic Hadoop concepts and to improve the user experience for research. In this paper, we propose an approach for analyzing log files with Hadoop to find concise information that is useful and time saving. Our proposed approach will be applied to research papers in a specific domain to produce summarized content for further improvement and new content creation.
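    The log analysis described above follows the classic MapReduce pattern that Hadoop implements. A minimal sketch of that pattern in plain Python, with invented log lines and counting by severity standing in for the paper's pattern finding (a real Hadoop streaming job would read the lines from HDFS and run the phases on a cluster):

```python
from collections import defaultdict

# Hypothetical log lines; in a real Hadoop job these come from HDFS.
LOG_LINES = [
    "2023-01-01 10:00:01 ERROR disk full",
    "2023-01-01 10:00:02 INFO job started",
    "2023-01-01 10:00:03 ERROR disk full",
    "2023-01-01 10:00:04 WARN slow response",
]

def mapper(line):
    """Emit (severity, 1) pairs, mimicking a Hadoop streaming mapper."""
    parts = line.split()
    if len(parts) >= 3:
        yield parts[2], 1

def reducer(pairs):
    """Sum counts per key, mimicking the reduce phase."""
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

summary = reducer(kv for line in LOG_LINES for kv in mapper(line))
print(summary)  # → {'ERROR': 2, 'INFO': 1, 'WARN': 1}
```

    The same map/shuffle/reduce shape scales to the multi-source extraction the paper describes; only the mapper's parsing logic changes per source.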

  17. The Effect of Visual Information on the Manual Approach and Landing

    NASA Technical Reports Server (NTRS)

    Wewerinke, P. H.

    1982-01-01

    The effect of visual information in combination with basic display information on approach performance was investigated. A pre-experimental model analysis was performed in terms of the optimal control model. The resulting aircraft approach performance predictions were compared with the results of a moving-base simulator program. The results illustrate that the model provides a meaningful description of the visual (scene) perception process involved in the complex (multi-variable, time-varying) manual approach task, with a useful predictive capability. The theoretical framework was shown to allow a straightforward investigation of the complex interaction of a variety of task variables.

  18. Measuring the Return on Information Technology: A Knowledge-Based Approach for Revenue Allocation at the Process and Firm Level

    DTIC Science & Technology

    2005-07-01

    approach for measuring the return on Information Technology (IT) investments. A review of existing methods suggests the difficulty in adequately...measuring the returns of IT at various levels of analysis (e.g., firm or process level). To address this issue, this study aims to develop a method for...view (KBV), this paper proposes an analytic method for measuring the historical revenue and cost of IT investments by estimating the amount of

  19. Information integration and diagnosis analysis of equipment status and production quality for machining process

    NASA Astrophysics Data System (ADS)

    Zan, Tao; Wang, Min; Hu, Jianzhong

    2010-12-01

    Machining status monitoring by multiple sensors can acquire and analyze machining process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal fluctuations through statistical methods. In this paper, by comparing the advantages and disadvantages of the two methods, the necessity and feasibility of their integration and fusion are introduced. An approach is then brought forward that integrates multi-sensor status monitoring and statistical process control based on artificial intelligence, internet and database techniques. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and AE signal information of the wheel dressing process, the reason for machining quality fluctuation has been obtained. The experimental results indicate that the approach is suitable for status monitoring and analysis of the machining process.
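    The statistical process control side of the proposed fusion can be illustrated with a simple Shewhart-style chart: estimate control limits from in-control data, then flag new measurements that fall outside them. The measurement values below are invented for illustration:

```python
import statistics

def control_limits(samples, k=3.0):
    """Center line and k-sigma control limits (Shewhart chart)."""
    mean = statistics.mean(samples)
    sd = statistics.stdev(samples)
    return mean - k * sd, mean, mean + k * sd

def out_of_control(samples, new_points, k=3.0):
    """Flag new measurements outside the control limits -- the
    'abnormal fluctuations' an SPC monitor would report."""
    lcl, _, ucl = control_limits(samples, k)
    return [x for x in new_points if x < lcl or x > ucl]

# Hypothetical roughness measurements from a stable grinding process.
baseline = [0.82, 0.80, 0.79, 0.81, 0.83, 0.80, 0.82, 0.81]
print(out_of_control(baseline, [0.81, 0.95, 0.80]))  # → [0.95]
```

    In the integrated approach of the paper, such SPC alarms would be fused with multi-sensor signals (e.g. AE) rather than acting alone.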

  20. How to (and how not to) think about top-down influences on visual perception.

    PubMed

    Teufel, Christoph; Nanay, Bence

    2017-01-01

    The question of whether cognition can influence perception has a long history in neuroscience and philosophy. Here, we outline a novel approach to this issue, arguing that it should be viewed within the framework of top-down information-processing. This approach leads to a reversal of the standard explanatory order of the cognitive penetration debate: we suggest studying top-down processing at various levels without preconceptions of perception or cognition. Once a clear picture has emerged about which processes have influences on those at lower levels, we can re-address the extent to which they should be considered perceptual or cognitive. Using top-down processing within the visual system as a model for higher-level influences, we argue that the current evidence indicates clear constraints on top-down influences at all stages of information processing; it does not, however, support the notion of a boundary between specific types of information-processing as proposed by the cognitive impenetrability hypothesis. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. An Approach for Implementation of Project Management Information Systems

    NASA Astrophysics Data System (ADS)

    Běrziša, Solvita; Grabis, Jānis

    Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is used to automatically configure the project management information system by applying appropriate transformation mechanisms. Development of the standardized framework is based on analysis of typical project management concepts and processes and on existing XML-based representations of project management. A demonstration example of a project management information system's configuration is provided.

  2. [Modality specific systems of representation and processing of information. Superfluous images, useful representations, necessary evil or inevitable consequences of optimal stimulus processing].

    PubMed

    Zimmer, H D

    1993-01-01

    We discuss what underlies the assumption of modality-specific processing systems and representations. Starting from the information processing approach, relevant aspects of mental representations and their physiological realizations are discussed. Three different forms of modality-specific systems are then distinguished: stimulus-specific processing, specific informational formats, and modular subsystems. In parallel, three senses of "analogue" systems are differentiated: holding an analogue relation, having a specific informational format, and obeying a set of specific processing constraints. These different aspects of the assumption of modality-specific systems are demonstrated with the example of visual and spatial information processing. It is concluded that postulating information-specific systems is not a superfluous assumption but a necessary one, and even more likely an inevitable consequence of optimizing stimulus processing.

  3. Modeling Business Processes in Public Administration

    NASA Astrophysics Data System (ADS)

    Repa, Vaclav

    During more than 10 years of its existence, business process modeling has become a regular part of organization management practice. It is mostly regarded as a part of information system development, or even as a way to implement some supporting technology (for instance, a workflow system). Although I do not agree with such a reduction of the real meaning of a business process, it is necessary to admit that information technologies play an essential role in business processes (see [1] for more information). Consequently, an information system is inseparable from a business process itself, because it is a cornerstone of the general basic infrastructure of a business. This fact impacts all dimensions of business process management. One of these dimensions is methodology, which postulates that information systems development provide business process management with exact methods and tools for modeling business processes. The methodology underlying the approach presented in this paper also has its roots in information systems development methodology.

  4. Using Bayesian networks to support decision-focused information retrieval

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehner, P.; Elsaesser, C.; Seligman, L.

    This paper describes an approach to controlling the process of pulling data/information from distributed databases in a way that is specific to a person's decision-making context. Our prototype implementation of this approach uses a knowledge-based planner to generate a plan; an automatically constructed Bayesian network to evaluate the plan; specialized processing of the network to derive key information items that would substantially impact the evaluation of the plan (e.g., determine that replanning is needed); and automated construction of Standing Requests for Information (SRIs), which are automated functions that monitor changes and trends in distributed databases that are relevant to the key information items. The emphasis of this paper is on how the Bayesian networks are used.
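    The plan-evaluation step can be illustrated in miniature: a two-node Bayesian network queried by exact enumeration, the same kind of computation a network-based plan evaluator performs at much larger scale. The variables and all probabilities below are invented for illustration:

```python
# Hypothetical network: Rain -> Success. Observing a plan failure,
# we ask how strongly it implicates rain -- the sort of query that
# flags a key information item worth monitoring.

P_rain = 0.3                               # prior P(Rain)
P_success_given = {True: 0.4, False: 0.9}  # P(Success | Rain)

def posterior_rain_given_failure():
    """P(Rain | Success = False) by Bayes' rule over the two cases."""
    joint = {r: (P_rain if r else 1 - P_rain) * (1 - P_success_given[r])
             for r in (True, False)}
    return joint[True] / (joint[True] + joint[False])

print(round(posterior_rain_given_failure(), 3))  # → 0.72
```

    In the prototype, items like `Rain` whose value would substantially change the plan's evaluation are exactly the ones turned into Standing Requests for Information.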

  5. An applications-oriented approach to the development of virtual environments

    NASA Technical Reports Server (NTRS)

    Crowe, Michael X.

    1994-01-01

    The field of Virtual Reality (VR) is diverse, ranging in scope from research into fundamental enabling technologies to the building of full-scale entertainment facilities. However, the concept of virtual reality means many things to many people. Ideally, a definition of VR should derive from how it can provide solutions to existing challenges in building advanced human computer interfaces. The measure of success for VR lies in its ability to enhance the assimilation of complex information, whether to aid in difficult decision making processes, or to recreate real experiences in a compelling way. This philosophy is described using an example from a VR-based advertising project. The common and unique elements of this example are explained, though the fundamental development process is the same for all virtual environments that support information transfer. In short, this development approach is an applications oriented approach that begins by establishing and prioritizing user requirements and seeks to add value to the information transfer process through the appropriate use of VR technology.

  6. A KPI framework for process-based benchmarking of hospital information systems.

    PubMed

    Jahn, Franziska; Winter, Alfred

    2011-01-01

    Benchmarking is a major topic for monitoring, directing and elucidating the performance of hospital information systems (HIS). Current approaches neglect the outcome of the processes that are supported by the HIS and their contribution to the hospital's strategic goals. We suggest to benchmark HIS based on clinical documentation processes and their outcome. A framework consisting of a general process model and outcome criteria for clinical documentation processes is introduced.

  7. Teaching Information & Technology Skills: The Big6[TM] in Elementary Schools. Professional Growth Series.

    ERIC Educational Resources Information Center

    Eisenberg, Michael B.; Berkowitz, Robert E.

    This book about using the Big6 information problem solving process model in elementary schools is organized into two parts. Providing an overview of the Big6 approach, Part 1 includes the following chapters: "Introduction: The Need," including the information problem, the Big6 and other process models, and teaching/learning the Big6;…

  8. Meaningful Informed Consent with Young Children: Looking Forward through an Interactive Narrative Approach

    ERIC Educational Resources Information Center

    Mayne, Fiona; Howitt, Christine; Rennie, Léonie

    2016-01-01

    Ideas about ethical research with young children are evolving at a rapid rate. Not only can young children participate in the informed consent process, but researchers now also recognize that the process must be meaningful for them. As part of a larger study, this article reviews children's rights and informed consent literature as the foundation…

  9. The hippocampus and exploration: dynamically evolving behavior and neural representations

    PubMed Central

    Johnson, Adam; Varberg, Zachary; Benhardus, James; Maahs, Anthony; Schrater, Paul

    2012-01-01

    We develop a normative statistical approach to exploratory behavior called information foraging. Information foraging highlights the specific processes that contribute to active, rather than passive, exploration and learning. We hypothesize that the hippocampus plays a critical role in active exploration through directed information foraging by supporting a set of processes that allow an individual to determine where to sample. By examining these processes, we show how directed information foraging provides a formal theoretical explanation for the common hippocampal substrates of constructive memory, vicarious trial and error behavior, schema-based facilitation of memory performance, and memory consolidation. PMID:22848196

  10. Extracting important information from Chinese Operation Notes with natural language processing methods.

    PubMed

    Wang, Hui; Zhang, Weide; Zeng, Qiang; Li, Zuofeng; Feng, Kaiyan; Liu, Lei

    2014-04-01

    Extracting information from unstructured clinical narratives is valuable for many clinical applications. Although Natural Language Processing (NLP) methods have been profoundly studied in electronic medical records (EMR), few studies have explored NLP for extracting information from Chinese clinical narratives. In this study, we report the development and evaluation of methods for extracting tumor-related information from operation notes of hepatic carcinomas written in Chinese. Using 86 operation notes manually annotated by physicians as the training set, we explored both rule-based and supervised machine-learning approaches. Evaluated on 29 unseen operation notes, our best approach yielded 69.6% precision, 58.3% recall and a 63.5% F-score. Copyright © 2014 Elsevier Inc. All rights reserved.
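    The reported figures are internally consistent: the F-score is the harmonic mean of precision and recall, which can be checked directly from the two reported numbers:

```python
def f_score(precision, recall):
    """F1: harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Reproduce the F-score reported for the best approach on the
# 29 held-out operation notes.
print(round(f_score(0.696, 0.583) * 100, 1))  # → 63.5
```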

  11. Improving Informed Consent with Minority Participants: Results from Researcher and Community Surveys

    PubMed Central

    Quinn, Sandra Crouse; Garza, Mary A.; Butler, James; Fryer, Craig S.; Casper, Erica T.; Thomas, Stephen B.; Barnard, David; Kim, Kevin H.

    2013-01-01

    Strengthening the informed consent process is one avenue for improving recruitment of minorities into research. This study examines that process from two different perspectives, that of researchers and that of African American and Latino community members. Through the use of two separate surveys, we compared strategies used by researchers with the preferences and attitudes of community members during the informed consent process. Our data suggest that researchers can improve the informed consent process by incorporating methods preferred by the community members along with methods shown in the literature for increasing comprehension. With this approach, the informed consent process may increase both participants’ comprehension of the material and overall satisfaction, fostering greater trust in research and openness to future research opportunities. PMID:23324203

  12. Understanding Self-Assessment as an Informed Process: Residents' Use of External Information for Self-Assessment of Performance in Simulated Resuscitations

    ERIC Educational Resources Information Center

    Plant, Jennifer L.; Corden, Mark; Mourad, Michelle; O'Brien, Bridget C.; van Schaik, Sandrijn M.

    2013-01-01

    Self-directed learning requires self-assessment of learning needs and performance, a complex process that requires collecting and interpreting data from various sources. Learners' approaches to self-assessment likely vary depending on the learner and the context. The aim of this study was to gain insight into how learners process external…

  13. Discovery of Information Diffusion Process in Social Networks

    NASA Astrophysics Data System (ADS)

    Kim, Kwanho; Jung, Jae-Yoon; Park, Jonghun

    Information diffusion analysis in social networks is significant because it enables us to deeply understand dynamic social interactions among users. In this paper, we introduce approaches to discovering the information diffusion process in social networks based on process mining. Process mining techniques are applied from three perspectives: social network analysis, process discovery and community recognition. We then present experimental results using real-life social network data. The proposed techniques are expected to serve as new analytical tools for online social media such as blogs and wikis, for company marketers, politicians, news reporters and online writers.

  14. A Fifteen-Year Forecast of Information-Processing Technology. Final Report.

    ERIC Educational Resources Information Center

    Bernstein, George B.

    This study developed a variation of the DELPHI approach, a polling technique for systematically soliciting opinions from experts, to produce a technological forecast of developments in the information-processing industry. SEER (System for Event Evaluation and Review) combines the more desirable elements of existing techniques: (1) intuitive…

  15. Depth of Information Processing and Memory for Medical Facts.

    ERIC Educational Resources Information Center

    Slade, Peter D.; Onion, Carl W. R.

    1995-01-01

    The current emphasis in medical education is on engaging learners in deep processing of information to achieve better understanding of the subject matter. Traditional approaches aimed for memorization of medical facts; however, a good memory for medical facts is still essential in clinical practice. This study demonstrates that deep information…

  16. Harnessing Biomedical Natural Language Processing Tools to Identify Medicinal Plant Knowledge from Historical Texts.

    PubMed

    Sharma, Vivekanand; Law, Wayne; Balick, Michael J; Sarkar, Indra Neil

    2017-01-01

    The growing amount of data describing historical medicinal uses of plants from digitization efforts provides the opportunity to develop systematic approaches for identifying potential plant-based therapies. However, cataloguing plant use information from natural language text is a challenging task for ethnobotanists. To date, there has been only limited adoption of informatics approaches for supporting the identification of ethnobotanical information associated with medicinal uses. This study explored the feasibility of using biomedical terminologies and natural language processing approaches to extract relevant plant-associated therapeutic use information from the historical biodiversity literature collection available from the Biodiversity Heritage Library. The results from this preliminary study suggest that informatics methods have potential utility for identifying medicinal plant knowledge from digitized resources, and highlight opportunities for improvement.

  17. Harnessing Biomedical Natural Language Processing Tools to Identify Medicinal Plant Knowledge from Historical Texts

    PubMed Central

    Sharma, Vivekanand; Law, Wayne; Balick, Michael J.; Sarkar, Indra Neil

    2017-01-01

    The growing amount of data describing historical medicinal uses of plants from digitization efforts provides the opportunity to develop systematic approaches for identifying potential plant-based therapies. However, cataloguing plant use information from natural language text is a challenging task for ethnobotanists. To date, there has been only limited adoption of informatics approaches for supporting the identification of ethnobotanical information associated with medicinal uses. This study explored the feasibility of using biomedical terminologies and natural language processing approaches to extract relevant plant-associated therapeutic use information from the historical biodiversity literature collection available from the Biodiversity Heritage Library. The results from this preliminary study suggest that informatics methods have potential utility for identifying medicinal plant knowledge from digitized resources, and highlight opportunities for improvement. PMID:29854223

  18. Information-driven self-organization: the dynamical system approach to autonomous robot behavior.

    PubMed

    Ay, Nihat; Bernigau, Holger; Der, Ralf; Prokopenko, Mikhail

    2012-09-01

    In recent years, information theory has come into the focus of researchers interested in the sensorimotor dynamics of both robots and living beings. One root of these approaches is the idea that living beings are information processing systems and that the optimization of these processes should be an evolutionary advantage. Apart from these more fundamental questions, there has recently been much interest in the question of how a robot can be equipped with an internal drive for innovation or curiosity that may serve as a drive for open-ended, self-determined development. The success of these approaches depends essentially on the choice of a convenient measure of information. This article studies in some detail the use of the predictive information (PI), also called excess entropy or effective measure complexity, of the sensorimotor process. The PI of a process quantifies the total information of past experience that can be used for predicting future events. However, the application of information-theoretic measures in robotics is mostly restricted to the case of a finite, discrete state-action space. This article aims at applying the PI in the dynamical systems approach to robot control. We study linear systems as a first step and derive exact results for the PI together with explicit learning rules for the parameters of the controller. Interestingly, these learning rules are of Hebbian nature and local in the sense that the synaptic update is given by the product of activities available directly at the pertinent synaptic ports. The general findings are exemplified by a number of case studies. In particular, in a two-dimensional system designed to mimic embodied systems with latent oscillatory locomotion patterns, it is shown that maximizing the PI means recognizing and amplifying the latent modes of the robotic system. This and many other examples show that the learning rules derived from the maximum PI principle are a versatile tool for the self-organization of behavior in complex robotic systems.
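    The local Hebbian character of such rules can be sketched generically: each weight changes in proportion to the product of the activities at its own synaptic ports. The rule below is the textbook Hebbian form with illustrative numbers, not the paper's exact PI-derived rule:

```python
# One local Hebbian update: dw_ij = eta * post_i * pre_j.
# Learning rate and activities are invented for illustration.

def hebbian_step(weights, pre, post, eta=0.1):
    """Update every weight using only the pre/post activities
    available at that synapse (locality)."""
    return [[w + eta * post[i] * pre[j] for j, w in enumerate(row)]
            for i, row in enumerate(weights)]

w = [[0.0, 0.0], [0.0, 0.0]]
w = hebbian_step(w, pre=[1.0, -1.0], post=[0.5, 0.2])
print(w)  # each entry is eta * post_i * pre_j
```

    The point of locality is that no global error signal is needed: the update for `w[i][j]` reads nothing but `pre[j]` and `post[i]`.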

  19. Specifying process requirements for holistic care.

    PubMed

    Poulymenopoulou, M; Malamateniou, F; Vassilacopoulos, G

    2013-09-01

    Holistic (health and social) care aims at providing comprehensive care to the community, especially to elderly people and people with multiple illnesses. In turn, this requires using health and social care resources more efficiently through enhanced collaboration and coordination among the corresponding organizations and delivering care closer to patient needs and preferences. This paper takes a patient-centered, process view of holistic care delivery and focuses on requirements elicitation for supporting holistic care processes and enabling authorized users to access integrated patient information at the point of care when needed. To this end, an approach to holistic care process-support requirements elicitation is presented which is based on business process modeling and places particular emphasis on empowering collaboration, coordination and information sharing among health and social care organizations by actively involving users and by providing insights for alternative process designs. The approach provides a means for integrating diverse legacy applications in a process-oriented environment using a service-oriented architecture as an appropriate solution for supporting and automating holistic care processes. The approach is applied in the context of emergency medical care aiming at streamlining and providing support technology to cross-organizational health and social care processes to address global patient needs.

  20. The Design of an Information Management Program for Headquarters, Department of the Army. Phase 2. Management Summary.

    DTIC Science & Technology

    1980-02-26

    months estimated to be required in some areas), and more direct involvement of information users in long-range planning of information requirements (with...most people, there is a definite need to educate the members of the organization as to the implications of the IRM approach. Emphasis should be placed...from information sharing and a coordinated approach. Such an educational process has already begun with the execution of this study, but more must be

  1. Quality of service management framework for dynamic chaining of geographic information services

    NASA Astrophysics Data System (ADS)

    Onchaga, Richard

    2006-06-01

    Dynamic chaining of geographic information services (geo-services) is gaining popularity as a new paradigm for evolving flexible geo-information systems and for providing on-demand access to geo-information. In dynamic chaining, disparate geo-services are discovered and composed at run time to yield more elaborate functionality and create value-added geo-information. Common approaches to service chaining discover and compose disparate geo-services based on the functional capability of individual geo-services. The primary concern of common approaches is thus the emergent behavior of the resulting composite geo-service. However, as geo-services become mundane and take on a greater and more strategic role in mission critical processes, deliverable quality of service (QoS) becomes an important concern. QoS concerns operational characteristics of a service that determine its utility in an application context. To address pertinent QoS requirements, a new approach to service chaining becomes necessary. In this paper we propose a QoS-aware chaining approach in which geo-services are discovered, composed and executed considering both functional and QoS requirements. We prescribe a QoS management framework that defines fundamental principles, concepts and mechanisms which can be applied to evolve an effective distributed computing platform for QoS-aware chaining of geo-services - the so-called geo-service infrastructure. The paper also defines an extensible QoS model for services delivered by dynamic compositions of geo-services. The process of orthophoto generation is used to demonstrate the applicability of the prescribed framework to service-oriented geographic information processing.
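    QoS-aware chaining replaces purely functional matchmaking with selection over both capability and QoS attributes. A toy sketch of that selection, with an invented two-step chain and invented latency/quality figures, picking the lowest-latency composition that meets a quality threshold:

```python
from itertools import product

# Hypothetical catalogue: candidate geo-services per chain step,
# each as (name, latency_ms, quality). Names and numbers invented.
CANDIDATES = {
    "reproject": [("fast_proj", 40, 0.90), ("precise_proj", 120, 0.99)],
    "mosaic":    [("mosaic_a", 80, 0.95), ("mosaic_b", 60, 0.85)],
}

def best_chain(candidates, min_quality=0.8):
    """Enumerate all compositions, keep those whose combined quality
    meets the threshold, and return the lowest-latency one."""
    best = None
    for combo in product(*candidates.values()):
        latency = sum(s[1] for s in combo)
        quality = 1.0
        for s in combo:
            quality *= s[2]
        if quality >= min_quality and (best is None or latency < best[1]):
            best = ([s[0] for s in combo], latency, quality)
    return best

print(best_chain(CANDIDATES))  # cheapest chain meeting the threshold
```

    A real geo-service infrastructure would do this at discovery time against advertised QoS metadata, and would also monitor delivered QoS at execution time; exhaustive enumeration is only viable for short chains.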

  2. The predictive influence of family and neighborhood assets on fighting and weapon carrying from mid- to late adolescence.

    PubMed

    Haegerich, Tamara M; Oman, Roy F; Vesely, Sara K; Aspy, Cheryl B; Tolma, Eleni L

    2014-08-01

    Using a developmental, social-ecological approach to understand the etiology of health-risk behavior and inform primary prevention efforts, we assess the predictive effects of family and neighborhood social processes on youth physical fighting and weapon carrying. Specifically, we focus on relationships among youth and their parents, family communication, parental monitoring, as well as sense of community and neighborhood informal social control, support, concerns, and disorder. This study advances knowledge through its investigation of family and neighborhood structural factors and social processes together, employment of longitudinal models that estimate effects over adolescent development, and use of self-report and observational measures. Data from 1,093 youth/parent pairs were analyzed from the Youth Assets Study using a Generalized Estimating Equation approach; family and neighborhood assets and risks were analyzed as time varying and lagged. Similar family assets affected physical fighting and weapon carrying, whereas different neighborhood social processes influenced the two forms of youth violence. Study findings have implications for the primary prevention of youth violence, including the use of family-based approaches that build relationships and parental monitoring skills and community-level change approaches that promote informal social control and reduce neighborhood concerns about safety.

  3. The Role of Human Factors/Ergonomics in the Science of Security: Decision Making and Action Selection in Cyberspace.

    PubMed

    Proctor, Robert W; Chen, Jing

    2015-08-01

    The overarching goal is to convey the concept of science of security and the contributions that a scientifically based, human factors approach can make to this interdisciplinary field. Rather than a piecemeal approach to solving cybersecurity problems as they arise, the U.S. government is mounting a systematic effort to develop an approach grounded in science. Because humans play a central role in security measures, research on security-related decisions and actions grounded in principles of human information-processing and decision-making is crucial to this interdisciplinary effort. We describe the science of security and the role that human factors can play in it, and use two examples of research in cybersecurity--detection of phishing attacks and selection of mobile applications--to illustrate the contribution of a scientific, human factors approach. In these research areas, we show that systematic information-processing analyses of the decisions that users make and the actions they take provide a basis for integrating the human component of security science. Human factors specialists should utilize their foundation in the science of applied information processing and decision making to contribute to the science of cybersecurity. © 2015, Human Factors and Ergonomics Society.

  4. Use of Information: Getting to the Heart of the Matter

    ERIC Educational Resources Information Center

    Eisenberg, Michael B.

    2005-01-01

    The Big6 approach to information problem solving is widely used by students in the US. Use of Information is the 4th stage and marks a shift in focus from selecting and accessing information sources to using the information itself in a process that involves "critical thinking."

  5. An Overview of NASA's IM&S Verification and Validation Process Plan and Specification for Space Exploration

    NASA Technical Reports Server (NTRS)

    Gravitz, Robert M.; Hale, Joseph

    2006-01-01

NASA's Exploration Systems Mission Directorate (ESMD) is implementing a management approach for modeling and simulation (M&S) that will provide decision-makers with information on a model's fidelity, credibility, and quality. This information will allow the decision-maker to understand the risks involved in using a model's results in the decision-making process. This presentation will discuss NASA's approach for verification and validation (V&V) of its models and simulations supporting space exploration. It will describe NASA's V&V process and the associated M&S V&V activities required to support the decision-making process. The M&S V&V Plan and V&V Report templates for ESMD will also be illustrated.

  6. Model-centric approaches for the development of health information systems.

    PubMed

    Tuomainen, Mika; Mykkänen, Juha; Luostarinen, Heli; Pöyhölä, Assi; Paakkanen, Esa

    2007-01-01

Modeling is used increasingly in healthcare to increase shared knowledge, to improve processes, and to document the requirements of solutions related to health information systems (HIS). There are numerous modeling approaches which aim to support these goals, but a careful assessment of their strengths, weaknesses and deficiencies is needed. In this paper, we compare three model-centric approaches in the context of HIS development: the Model-Driven Architecture, Business Process Modeling with BPMN and BPEL, and the HL7 Development Framework. The comparison reveals that all these approaches are viable candidates for the development of HIS. However, they have distinct strengths and abstraction levels, they require local and project-specific adaptation, and they offer varying levels of automation. In addition, the illustration of solutions to end users must be improved.

  7. [Cognitive functions, their development and modern diagnostic methods].

    PubMed

    Klasik, Adam; Janas-Kozik, Małgorzata; Krupka-Matuszczyk, Irena; Augustyniak, Ewa

    2006-01-01

Cognitive psychology is an interdisciplinary field whose main aim is to study the thinking mechanisms that lead humans to cognizance. The concept of human cognitive processes therefore encompasses the knowledge of the mechanisms which determine how humans acquire information from the environment and utilize their knowledge and experience. Three basic processes need to be distinguished when discussing the development of human perception: acquiring sensations, perceptiveness and attention. Acquiring sensations refers to the experience arising from the stimulation of a single sense organ, i.e. the detection and differentiation of sensory information. Perceptiveness stands for the interpretation of sensations and may include the recognition and identification of sensory information. The attention process relates to the selectivity of perception. The higher-order mental processes used in cognition, through which humans try to understand the world and adapt to it, doubtlessly include memory, reasoning, learning and problem solving. Human cognitive functioning differs greatly at different stages of life (from infancy to adulthood), both quantitatively and qualitatively. There are three main approaches to the development of human cognitive functioning: Jean Piaget's approach, the information processing approach and the psychometric approach. Piaget's ideas continue to form the groundwork of child cognitive psychology. Piaget identified four developmental stages of child cognition: 1. Sensorimotor stage (birth to 2 years); 2. Preoperational stage (ages 2-7); 3. Concrete operations (ages 7-11); 4. Formal operations (ages 11 and up). The supporters of the information processing approach use a computer metaphor to model the functioning of human cognitive processes. Three important mechanisms are involved: coding, automation and strategy designing, and they often operate together. The psychometric approach concentrates on studying differences in intelligence. Its aim is to measure intelligence by means of standardized tests (e.g. WISC-R, WAIS-R) used to show individual differences among humans. Human cognitive functions determine an individual's adaptation capabilities; disturbances in this area indicate a number of psychopathological changes and are a symptom that makes it possible to differentiate or diagnose disorders. That is why the psychological assessment of cognitive functions is an important part of patient diagnosis. Contemporary neuropsychological studies are to a great extent based on computer tests. The use of computer methods has a number of measurement-related advantages: it provides a standardized testing environment, thereby increasing reliability, and it standardizes the patient assessment process. Special attention should be paid to the neuropsychological tests included in the Vienna Test System (Cognitron, SIGNAL, RT, VIGIL, DAUF), which are used to assess operational memory span, learning processes, reaction time, the selective function of attention, continuity of attention and resistance to interference. It also seems justified to present the CPT (Continuous Performance Test) as well as the Free Recall test. The CPT is a diagnostic tool used to assess the selective function of attention, continuity of attention, resistance to interference and alertness. The Free Recall test is used in the diagnostics of memory processes to assess patients' operational memory and the degree of information organization in operational memory. The above-mentioned neuropsychological tests are tools used in the clinical assessment of cognitive function disorders.

  8. China’s Comprehensive Approach: Refining the U.S. Targeting Process to Inform U.S. Strategy

    DTIC Science & Technology

    2018-04-20

    control demonstrated by China, the subject matter expertise required to generate a comprehensive approach like China’s does exist. However, due to a vast...

  9. Evaluation of soil erosion risk using Analytic Network Process and GIS: a case study from Spanish mountain olive plantations.

    PubMed

    Nekhay, Olexandr; Arriaza, Manuel; Boerboom, Luc

    2009-07-01

    The study presents an approach that combines objective information, such as sampling or experimental data, with subjective information, such as expert opinions. This combined approach is based on the Analytic Network Process (ANP) method. It was applied to evaluate soil erosion risk and overcomes one of the drawbacks of the USLE/RUSLE soil erosion models, namely that they do not consider interactions among soil erosion factors. Another advantage of the method is that it can be used when experimental data are insufficient: the lack of experimental data can be compensated for through expert evaluations. As an example of the proposed approach, soil erosion risk was evaluated in olive groves in Southern Spain, showing the potential of the ANP method for modelling a complex physical process like soil erosion.
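The core computation behind ANP (as in AHP, from which it derives) is turning a matrix of pairwise expert judgements into a priority vector via its principal eigenvector. A minimal sketch with a made-up comparison matrix for three hypothetical erosion factors, not the study's actual data:

```python
# Illustrative priority-weighting step shared by AHP/ANP: expert pairwise
# importance ratios -> principal eigenvector -> relative factor weights.
def priority_vector(matrix, iters=100):
    """Approximate the principal eigenvector by power iteration, normalised to sum to 1."""
    n = len(matrix)
    v = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        v = [x / total for x in w]
    return v

# Hypothetical judgements for three factors: slope, rainfall, vegetation cover.
comparisons = [
    [1.0, 3.0, 5.0],   # slope judged 3x as important as rainfall, 5x as cover
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]
weights = priority_vector(comparisons)  # roughly [0.65, 0.23, 0.12]
```

In the full ANP a supermatrix of such local priority vectors captures the interactions among factors that USLE/RUSLE ignores; the eigenvector step above is the common building block.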

  10. Description of the AILS Alerting Algorithm

    NASA Technical Reports Server (NTRS)

    Samanant, Paul; Jackson, Mike

    2000-01-01

    This document provides a complete description of the Airborne Information for Lateral Spacing (AILS) alerting algorithms. The purpose of AILS is to provide separation assurance between aircraft during simultaneous approaches to closely spaced parallel runways. AILS will allow independent approaches to be flown in situations where dependent approaches were previously required (typically under Instrument Meteorological Conditions (IMC)). This is achieved by providing multiple levels of alerting for pairs of aircraft in parallel approach situations. This document's scope is comprehensive and covers everything from general overviews, definitions, and concepts down to algorithmic elements and equations. The entire algorithm is presented in complete and detailed pseudo-code format, which software programmers can use to implement AILS in a programming language. Additional supporting information is provided in the form of coordinate frame definitions, data requirements, and calling requirements, as well as all necessary pre-processing and post-processing requirements. This is important and required information for the implementation of AILS in an analysis, a simulation, or a real-time system.

  11. An Ontology-Based Approach to Incorporate User-Generated Geo-Content Into Sdi

    NASA Astrophysics Data System (ADS)

    Deng, D.-P.; Lemmens, R.

    2011-08-01

    The Web is changing the way people share and communicate information because of the emergence of various Web technologies that enable people to contribute information online. User-Generated Geo-Content (UGGC) is a potential resource of geographic information. Due to its different production methods, UGGC often does not fit the formal geographic information model; there is a semantic gap between UGGC and formal geographic information. To integrate UGGC into geographic information, this study applies an ontology-based process to bridge this semantic gap. The process includes five steps: Collection, Extraction, Formalization, Mapping, and Deployment. The study implements this process on Twitter messages relevant to the Japan earthquake disaster: using the process, we extract disaster relief information from Twitter messages and develop a knowledge base supporting GeoSPARQL queries over disaster relief information.
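The five-step process can be caricatured as a pipeline of small functions. This is a toy sketch under invented names and an invented vocabulary, not the paper's actual implementation:

```python
# Toy rendering of the five-step pipeline (all names and vocabulary invented):
# collect raw tweets, extract candidate terms, formalise them as triples,
# map informal terms onto ontology concepts, and deploy a queryable store.
import re

VOCAB = {"water": "relief:Water", "shelter": "relief:Shelter"}

def collect():                        # 1. Collection
    return ["Need water in Sendai", "Shelter available in Tokyo"]

def extract(tweet):                   # 2. Extraction: pull a relief keyword
    m = re.search(r"water|shelter", tweet, re.I)
    return m.group(0).lower() if m else None

def formalize(term, tweet):           # 3. Formalization: subject-predicate-object triple
    return (tweet, "relief:mentions", term)

def map_term(triple):                 # 4. Mapping: informal term -> ontology concept
    subject, predicate, term = triple
    return (subject, predicate, VOCAB.get(term, "relief:Unknown"))

def deploy(triples):                  # 5. Deployment: here, a simple in-memory store
    return set(triples)

triples = []
for tweet in collect():
    term = extract(tweet)
    if term is not None:
        triples.append(map_term(formalize(term, tweet)))
store = deploy(triples)
```

In the study the deployment target is a triple store answering GeoSPARQL queries; the in-memory set here only stands in for that last step.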

  12. Beyond Competence: An Essay on a Process Approach to Organising and Enacting Vocational Education

    ERIC Educational Resources Information Center

    Billett, Stephen

    2016-01-01

    The competency-based approach to vocational education is premised on narrow and dated conceptions of human functioning, performance and development. Its adoption is more driven by administrative concerns about measurable outcomes than educational processes and outcomes. Informed by educational science and earlier debates, this article discusses…

  13. Analytic Steering: Inserting Context into the Information Dialog

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bohn, Shawn J.; Calapristi, Augustin J.; Brown, Shyretha D.

    2011-10-23

    An analyst’s intrinsic domain knowledge is a primary asset in almost any analysis task. Unstructured text analysis systems that apply unsupervised content analysis approaches can be more effective if they leverage this domain knowledge in a manner that augments the information discovery process without obfuscating new or unexpected content. Current unsupervised approaches rely upon the prowess of the analyst to submit the right queries or to observe generalized document and term relationships from ranked or visual results. We propose a new approach which allows the user to control, or steer, the analytic view within the unsupervised space. This steering is controlled through the data characterization process via user-supplied context in the form of a collection of key terms. We show that steering with an appropriate choice of key terms can provide better relevance to the analytic domain while still enabling the analyst to uncover unexpected relationships. This paper discusses cases where various analytic steering approaches can provide enhanced analysis results, and cases where analytic steering can have a negative impact on the analysis process.
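One plausible reading of the steering mechanism (not the authors' actual implementation) is re-weighting a bag-of-words representation so that user-supplied key terms count more during ranking. Document contents and term weights below are invented:

```python
# Sketch of analytic steering as term re-weighting: context terms are boosted
# so domain-relevant documents rank higher, while all other terms still
# contribute and can surface unexpected material.
def steer_rank(docs, context_terms, boost=3.0):
    """Rank documents by term-weight sum; context terms count `boost` times more."""
    def score(terms):
        return sum(w * (boost if t in context_terms else 1.0)
                   for t, w in terms.items())
    return sorted(docs, key=lambda d: score(d["terms"]), reverse=True)

docs = [
    {"id": "d1", "terms": {"reactor": 2, "budget": 5}},
    {"id": "d2", "terms": {"reactor": 4, "tritium": 1}},
]
unsteered = steer_rank(docs, set())                  # d1 first: highest raw weight
steered = steer_rank(docs, {"reactor", "tritium"})   # d2 first: matches the context
```

The flip between the two rankings is the steering effect; an ill-chosen key-term set would bias the view the same way, which matches the paper's caution about negative impacts.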

  14. New Challenges in Information Integration

    NASA Astrophysics Data System (ADS)

    Haas, Laura M.; Soffer, Aya

    Information integration is the cornerstone of modern business informatics. It is a pervasive problem; rarely is a new application built without an initial phase of gathering and integrating information. Information integration comes in a wide variety of forms. Historically, two major approaches were recognized: data federation and data warehousing. Today, we need new approaches, as information integration becomes more dynamic, while coping with growing volumes of increasingly dirty and diverse data. At the same time, information integration must be coupled more tightly with the applications and the analytics that will leverage the integrated results, to make the integration process more tractable and the results more consumable.

  15. Towards the understanding of network information processing in biology

    NASA Astrophysics Data System (ADS)

    Singh, Vijay

    Living organisms perform incredibly well in detecting a signal present in the environment. This information processing is achieved near optimally and quite reliably, even though the sources of signals are highly variable and complex. The work in the last few decades has given us a fair understanding of how individual signal processing units like neurons and cell receptors process signals, but the principles of collective information processing on biological networks are far from clear. Information processing in biological networks, like the brain, metabolic circuits, cellular-signaling circuits, etc., involves complex interactions among a large number of units (neurons, receptors). The combinatorially large number of states such a system can exist in makes it impossible to study these systems from the first principles, starting from the interactions between the basic units. The principles of collective information processing on such complex networks can be identified using coarse graining approaches. This could provide insights into the organization and function of complex biological networks. Here I study models of biological networks using continuum dynamics, renormalization, maximum likelihood estimation and information theory. Such coarse graining approaches identify features that are essential for certain processes performed by underlying biological networks. We find that long-range connections in the brain allow for global scale feature detection in a signal. These also suppress the noise and remove any gaps present in the signal. Hierarchical organization with long-range connections leads to large-scale connectivity at low synapse numbers. Time delays can be utilized to separate a mixture of signals with temporal scales. Our observations indicate that the rules in multivariate signal processing are quite different from traditional single unit signal processing.

  16. Complex vestibular macular anatomical relationships need a synthetic approach

    NASA Technical Reports Server (NTRS)

    Ross, M. D.

    2001-01-01

    Mammalian vestibular maculae are anatomically organized for complex parallel processing of linear acceleration information. Anatomical findings in rat maculae are provided in order to underscore this complexity, which is little understood functionally. This report emphasizes that a synthetic approach is critical to understanding how maculae function and the kind of information they conduct to the brain.

  17. Using a logical information model-driven design process in healthcare.

    PubMed

    Cheong, Yu Chye; Bird, Linda; Tun, Nwe Ni; Brooks, Colleen

    2011-01-01

    A hybrid standards-based approach has been adopted in Singapore to develop a Logical Information Model (LIM) for healthcare information exchange. The Singapore LIM uses a combination of international standards, including ISO13606-1 (a reference model for electronic health record communication), ISO21090 (healthcare datatypes), SNOMED CT (healthcare terminology) and HL7 v2 (healthcare messaging). This logic-based design approach also incorporates mechanisms for achieving bi-directional semantic interoperability.

  18. Electronic Data Interchange in Procurement

    DTIC Science & Technology

    1990-04-01

    contract management and order processing systems. This conversion of automated information to paper and back to automated form is not only slow and...automated purchasing computer and the contractor’s order processing computer through telephone lines, as illustrated in Figure 1-1. Computer-to-computer...into the contractor’s order processing or contract management system. This approach - converting automated information to paper and back to automated

  19. Cognition-Based Approaches for High-Precision Text Mining

    ERIC Educational Resources Information Center

    Shannon, George John

    2017-01-01

    This research improves the precision of information extraction from free-form text via the use of cognitive-based approaches to natural language processing (NLP). Cognitive-based approaches are an important, and relatively new, area of research in NLP and search, as well as linguistics. Cognitive approaches enable significant improvements in both…

  20. A standard-driven approach for electronic submission to pharmaceutical regulatory authorities.

    PubMed

    Lin, Ching-Heng; Chou, Hsin-I; Yang, Ueng-Cheng

    2018-03-01

    Using standards is not only useful for data interchange during the process of a clinical trial, but also useful for analyzing data in a review process. Any step which speeds up the approval of new drugs may benefit patients. As a result, adopting standards for regulatory submission has become mandatory in some countries. However, preparing standard-compliant documents, such as an annotated case report form (aCRF), requires a great deal of knowledge and experience. The process is complex and labor-intensive, so there is a need to use information technology to facilitate it. Instead of standardizing data after the completion of a clinical trial, this study proposed a standard-driven approach, achieved by implementing a computer-assisted "standard-driven pipeline" (SDP) in an existing clinical data management system. The SDP used CDISC standards to drive all processes of a clinical trial, such as design, data acquisition and tabulation. A completed phase I/II trial was used to prove the concept and to evaluate the effects of this approach. By using the CDISC-compliant question library, aCRFs were generated automatically when the eCRFs were completed. For comparison purposes, the data collection process was simulated and the collected data were transformed by the SDP. The new approach reduced the number of missing data fields from sixty-two to eight and the number of controlled-term mismatch fields from eight to zero during data tabulation. The standard-driven approach accelerated CRF annotation and assured data tabulation integrity. Its benefits include improved use of standards during the clinical trial and a reduction in missing and unexpected data during tabulation. The standard-driven approach is an advanced design idea that can be used for future clinical information system development. Copyright © 2018 Elsevier Inc. All rights reserved.

  1. Children's Rights and Research Processes: Assisting Children to (In)formed Views

    ERIC Educational Resources Information Center

    Lundy, Laura; McEvoy, Lesley

    2012-01-01

    Acknowledging children as rights-holders has significant implications for research processes. What is distinctive about a children's rights informed approach to research is a focus not only on safe, inclusive and engaging opportunities for children to express their views but also on deliberate strategies to assist children in the formation of…

  2. A Multidirectional Model for Assessing Learning Disabled Students' Intelligence: An Information-Processing Framework.

    ERIC Educational Resources Information Center

    Swanson, H. Lee

    1982-01-01

    An information processing approach to the assessment of learning disabled students' intellectual performance is presented. The model is based on the assumption that intelligent behavior is comprised of a variety of problem- solving strategies. An account of child problem solving is explained and illustrated with a "thinking aloud" protocol.…

  3. Foreign Language Methods and an Information Processing Model of Memory.

    ERIC Educational Resources Information Center

    Willebrand, Julia

    The major approaches to language teaching (audiolingual method, generative grammar, Community Language Learning and Silent Way) are investigated to discover whether or not they are compatible in structure with an information-processing model of memory (IPM). The model of memory used was described by Roberta Klatzky in "Human Memory:…

  4. A multidimensional evaluation of a nursing information-literacy program.

    PubMed Central

    Fox, L M; Richter, J M; White, N E

    1996-01-01

    The goal of an information-literacy program is to develop student skills in locating, evaluating, and applying information for use in critical thinking and problem solving. This paper describes a multidimensional evaluation process for determining nursing students' growth in cognitive and affective domains. Results indicate improvement in student skills as a result of a nursing information-literacy program. Multidimensional evaluation produces a well-rounded picture of student progress based on formal measurement as well as informal feedback. Developing new educational programs can be a time-consuming challenge. It is important, when expending so much effort, to ensure that the goals of the new program are achieved and benefits to students demonstrated. A multidimensional approach to evaluation can help to accomplish those ends. In 1988, The University of Northern Colorado School of Nursing began working with a librarian to integrate an information-literacy component, entitled Pathways to Information Literacy, into the curriculum. This article describes the program and discusses how a multidimensional evaluation process was used to assess program effectiveness. The evaluation process not only helped to measure the effectiveness of the program but also allowed the instructors to use several different approaches to evaluation. PMID:8826621

  5. Classification of cognitive systems dedicated to data sharing

    NASA Astrophysics Data System (ADS)

    Ogiela, Lidia; Ogiela, Marek R.

    2017-08-01

    This paper presents a classification of new cognitive information systems dedicated to cryptographic data splitting and sharing processes. Cognitive processes of semantic data analysis and interpretation are used to describe new classes of intelligent information and vision systems. In addition, cryptographic data splitting algorithms and cryptographic threshold schemes are used to improve secure and efficient information management with such cognitive systems. The utility of the proposed cognitive sharing procedures and distributed data sharing algorithms is also presented, along with a few possible applications of cognitive approaches to visual information management and encryption.

  6. User-centered requirements engineering in health information systems: a study in the hemophilia field.

    PubMed

    Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa

    2012-06-01

    The use of sophisticated information and communication technologies (ICTs) in the health care domain is a way to improve the quality of services. However, there are also hazards associated with the introduction of ICTs in this domain, and a great number of projects have failed due to the lack of systematic consideration of human and other non-technology issues throughout the design or implementation process, particularly in the requirements engineering process. This paper presents the methodological approach followed in the design of a web-based information system (WbIS) for managing clinical information in hemophilia care, which integrates the values and practices of user-centered design (UCD) activities into the principles of software engineering, particularly in the phase of requirements engineering (RE). This process followed a paradigm that combines a grounded-theory approach to data collection with an evolutionary design based on constant development and refinement of the generic domain model using three well-known methodological approaches in a triangulated design: (a) object-oriented system analysis; (b) task analysis; and (c) prototyping. This approach seems to be a good solution for the requirements engineering process in this particular case of the health care domain, since the inherent weaknesses of individual methods are reduced and emergent requirements are easier to elicit. Moreover, the requirements triangulation matrix gives the opportunity to look across the results of all the methods used and decide which requirements are critical for the system's success. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  7. The Healthcare Improvement Scotland evidence note rapid review process: providing timely, reliable evidence to inform imperative decisions on healthcare.

    PubMed

    McIntosh, Heather M; Calvert, Julie; Macpherson, Karen J; Thompson, Lorna

    2016-06-01

    Rapid review has become widely adopted by health technology assessment agencies in response to demand for evidence-based information to support imperative decisions. Concern about the credibility of rapid reviews and the reliability of their findings has prompted a call for wider publication of their methods. In publishing this overview of the accredited rapid review process developed by Healthcare Improvement Scotland, we aim to raise awareness of our methods and advance the discourse on best practice. Healthcare Improvement Scotland produces rapid reviews called evidence notes using a process that has achieved external accreditation through the National Institute for Health and Care Excellence. Key components include a structured approach to topic selection, initial scoping, considered stakeholder involvement, streamlined systematic review, internal quality assurance, external peer review and updating. The process was introduced in 2010 and continues to be refined over time in response to user feedback and operational experience. Decision-makers value the responsiveness of the process and perceive it as being a credible source of unbiased evidence-based information supporting advice for NHSScotland. Many agencies undertaking rapid reviews are striving to balance efficiency with methodological rigour. We agree that there is a need for methodological guidance and that it should be informed by better understanding of current approaches and the consequences of different approaches to streamlining systematic review methods. Greater transparency in the reporting of rapid review methods is essential to enable that to happen.

  8. Challenges in Extracting Information From Large Hydrogeophysical-monitoring Datasets

    NASA Astrophysics Data System (ADS)

    Day-Lewis, F. D.; Slater, L. D.; Johnson, T.

    2012-12-01

    Over the last decade, new automated geophysical data-acquisition systems have enabled collection of increasingly large and information-rich geophysical datasets. Concurrent advances in field instrumentation, web services, and high-performance computing have made real-time processing, inversion, and visualization of large three-dimensional tomographic datasets practical. Geophysical-monitoring datasets have provided high-resolution insights into diverse hydrologic processes including groundwater/surface-water exchange, infiltration, solute transport, and bioremediation. Despite the high information content of such datasets, extraction of quantitative or diagnostic hydrologic information is challenging. Visual inspection and interpretation for specific hydrologic processes is difficult for datasets that are large, complex, and (or) affected by forcings (e.g., seasonal variations) unrelated to the target hydrologic process. New strategies are needed to identify salient features in spatially distributed time-series data and to relate temporal changes in geophysical properties to hydrologic processes of interest while effectively filtering unrelated changes. Here, we review recent work using time-series and digital-signal-processing approaches in hydrogeophysics. Examples include applications of cross-correlation, spectral, and time-frequency (e.g., wavelet and Stockwell transforms) approaches to (1) identify salient features in large geophysical time series; (2) examine correlation or coherence between geophysical and hydrologic signals, even in the presence of non-stationarity; and (3) condense large datasets while preserving information of interest. Examples demonstrate analysis of large time-lapse electrical tomography and fiber-optic temperature datasets to extract information about groundwater/surface-water exchange and contaminant transport.
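One of the reviewed techniques, lagged cross-correlation between a hydrologic forcing and a geophysical response, can be sketched on synthetic signals (the seven-sample delay and the noise level below are arbitrary choices, not values from any dataset in the review):

```python
# Sketch: find the lag at which a geophysical response best correlates with a
# hydrologic forcing, the basic move behind relating e.g. resistivity change
# to river stage. Signals here are synthetic.
import numpy as np

def best_lag(x, y, max_lag):
    """Lag (in samples) maximising the correlation of y against x shifted by that lag."""
    def corr_at(k):
        if k >= 0:
            a, b = x[:len(x) - k], y[k:]
        else:
            a, b = x[-k:], y[:len(y) + k]
        return np.corrcoef(a, b)[0, 1]
    return max(range(-max_lag, max_lag + 1), key=corr_at)

rng = np.random.default_rng(0)
forcing = rng.standard_normal(500)                               # e.g. river stage
response = np.roll(forcing, 7) + 0.1 * rng.standard_normal(500)  # delayed, noisy copy
lag = best_lag(forcing, response, max_lag=20)                    # recovers the 7-sample delay
```

Spectral, wavelet and Stockwell-transform analyses mentioned in the abstract generalise this idea to non-stationary signals, where a single global lag is no longer meaningful.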

  9. Audit of the informed consent process as a part of a clinical research quality assurance program.

    PubMed

    Lad, Pramod M; Dahl, Rebecca

    2014-06-01

Audits of the informed consent process are a key element of a clinical research quality assurance program. A systematic approach to such audits has not been described in the literature. In this paper we describe two components of the audit. The first is the audit of the informed consent document to verify adherence to federal regulations. The second comprises the audit of the informed consent conference, with emphasis on a real-time review of the appropriate communication of the key elements of informed consent. Quality measures may include preparation of an informed consent history log, notes to accompany the informed consent, the use of an informed consent feedback tool, and the use of institutional surveys to assess comprehension of the informed consent process.

  10. Natural language processing systems for capturing and standardizing unstructured clinical information: A systematic review.

    PubMed

    Kreimeyer, Kory; Foster, Matthew; Pandey, Abhishek; Arya, Nina; Halford, Gwendolyn; Jones, Sandra F; Forshee, Richard; Walderhaug, Mark; Botsis, Taxiarchis

    2017-09-01

    We followed a systematic approach based on the Preferred Reporting Items for Systematic Reviews and Meta-Analyses to identify existing clinical natural language processing (NLP) systems that generate structured information from unstructured free text. Seven literature databases were searched with a query combining the concepts of natural language processing and structured data capture. Two reviewers screened all records for relevance during two screening phases, and information about clinical NLP systems was collected from the final set of papers. A total of 7149 records (after removing duplicates) were retrieved and screened, and 86 were determined to fit the review criteria. These papers contained information about 71 different clinical NLP systems, which were then analyzed. The NLP systems address a wide variety of important clinical and research tasks. Certain tasks are well addressed by the existing systems, while others remain as open challenges that only a small number of systems attempt, such as extraction of temporal information or normalization of concepts to standard terminologies. This review has identified many NLP systems capable of processing clinical free text and generating structured output, and the information collected and evaluated here will be important for prioritizing development of new approaches for clinical NLP. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. A quantum informational approach for dissecting chemical reactions

    NASA Astrophysics Data System (ADS)

    Duperrouzel, Corinne; Tecmer, Paweł; Boguslawski, Katharina; Barcza, Gergely; Legeza, Örs; Ayers, Paul W.

    2015-02-01

We present a conceptually different approach to dissect bond-formation processes in metal-driven catalysis using concepts from quantum information theory. Our method uses the entanglement and correlation among molecular orbitals to analyze changes in electronic structure that accompany chemical processes. As a proof-of-principle example, the evolution of nickel-ethene bond formation is dissected, which allows us to monitor the interplay of back-bonding and π-donation along the reaction coordinate. Furthermore, the reaction pathway of nickel-ethene complexation is analyzed using quantum chemistry methods, revealing the presence of a transition state. Our study supports the crucial role of metal-to-ligand back-donation in the bond-forming process of nickel-ethene.

  12. Mathematics and Information Retrieval.

    ERIC Educational Resources Information Center

    Salton, Gerald

    1979-01-01

    Examines the main mathematical approaches to information retrieval, including both algebraic and probabilistic models, and describes difficulties which impede formalization of information retrieval processes. A number of developments are covered where new theoretical understandings have directly led to improved retrieval techniques and operations.…
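The algebraic (vector space) model surveyed here reduces, at its simplest, to comparing term-frequency vectors by the angle between them; the `cosine_sim` helper below is a hypothetical toy, not code from the article:

```python
import math
from collections import Counter

def cosine_sim(doc_a: str, doc_b: str) -> float:
    """Cosine similarity between raw term-frequency vectors, the core of
    the algebraic (vector space) retrieval model. Toy sketch only: no
    stemming, stop-word removal, or tf-idf weighting."""
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(a[t] * b[t] for t in a)
    norm = math.hypot(*a.values()) * math.hypot(*b.values())
    return dot / norm if norm else 0.0

# Two of three terms shared, so similarity is 2/3.
print(round(cosine_sim("information retrieval models",
                       "probabilistic retrieval models"), 3))  # 0.667
```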

  13. Advanced Computing Architectures for Cognitive Processing

    DTIC Science & Technology

    2009-07-01

[Figure list fragments: "Logic diagram smart block-based neuron"; "Naive Grid Potential Kernel"] …processing would be helpful for Air Force systems acquisition. Specific cognitive processing approaches addressed herein include global information grid…

  14. A situation-response model for intelligent pilot aiding

    NASA Technical Reports Server (NTRS)

    Schudy, Robert; Corker, Kevin

    1987-01-01

    An intelligent pilot aiding system needs models of the pilot information processing to provide the computational basis for successful cooperation between the pilot and the aiding system. By combining artificial intelligence concepts with the human information processing model of Rasmussen, an abstraction hierarchy of states of knowledge, processing functions, and shortcuts are developed, which is useful for characterizing the information processing both of the pilot and of the aiding system. This approach is used in the conceptual design of a real time intelligent aiding system for flight crews of transport aircraft. One promising result was the tentative identification of a particular class of information processing shortcuts, from situation characterizations to appropriate responses, as the most important reliable pathway for dealing with complex time critical situations.

  15. On Cognition, Structured Sequence Processing, and Adaptive Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Petersson, Karl Magnus

    2008-11-01

Cognitive neuroscience approaches the brain as a cognitive system: a system that functionally is conceptualized in terms of information processing. We outline some aspects of this concept and consider a physical system to be an information processing device when a subclass of its physical states can be viewed as representational/cognitive and transitions between these can be conceptualized as a process operating on these states by implementing operations on the corresponding representational structures. We identify a generic and fundamental problem in cognition: sequentially organized structured processing. Structured sequence processing provides the brain, in an essential sense, with its processing logic. In an approach addressing this problem, we illustrate how to integrate levels of analysis within a framework of adaptive dynamical systems. We note that the dynamical system framework lends itself to a description of asynchronous event-driven devices, which is likely to be important in cognition because the brain appears to be an asynchronous processing system. We use the human language faculty and natural language processing as a concrete example throughout.

  16. Learning classification with auxiliary probabilistic information

    PubMed Central

    Nguyen, Quang; Valizadegan, Hamed; Hauskrecht, Milos

    2012-01-01

Finding ways of incorporating auxiliary information or auxiliary data into the learning process has been a topic of active data mining and machine learning research in recent years. In this work we study and develop a new framework for the classification learning problem in which, in addition to class labels, the learner is provided with auxiliary (probabilistic) information that reflects how strongly the expert feels about the class label. This approach can be extremely useful for many practical classification tasks that rely on subjective label assessment and where the cost of acquiring additional auxiliary information is negligible compared to the cost of example analysis and labelling. We develop classification algorithms capable of using the auxiliary information to make the learning process more efficient in terms of sample complexity. We demonstrate the benefit of the approach on a number of synthetic and real-world data sets by comparing it to learning with class labels only. PMID:25309141
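The general idea of learning from probabilistic expert confidences rather than hard labels can be sketched with a soft-label logistic regression; `fit_soft_label_logreg` and the toy data below are illustrative assumptions, not the authors' algorithms:

```python
import numpy as np

def fit_soft_label_logreg(X, p, lr=0.5, iters=500):
    """Logistic regression trained on probabilistic labels p in [0, 1]
    (expert confidence) instead of hard 0/1 classes.

    A sketch of the general idea only: the cross-entropy gradient is
    unchanged when the targets are soft probabilities.
    """
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        pred = 1.0 / (1.0 + np.exp(-X @ w))      # sigmoid
        w -= lr * X.T @ (pred - p) / len(p)      # gradient step
    return w

# Toy data: confident positives at x=+1, confident negatives at x=-1,
# plus one borderline example the expert labels as 60% positive.
X = np.array([[1.0], [1.0], [-1.0], [-1.0], [0.2]])
p = np.array([0.95, 0.90, 0.05, 0.10, 0.60])
w = fit_soft_label_logreg(X, p)
print(w[0] > 0)  # True: larger x means higher class-1 probability
```

The soft targets let the borderline example pull the decision boundary less strongly than a forced hard label would, which is the sample-efficiency benefit the abstract points to.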

  17. Splash, pop, sizzle: Information processing with phononic computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sklan, Sophia R.

    2015-05-15

Phonons, the quanta of mechanical vibration, are important to the transport of heat and sound in solid materials. Recent advances in the fundamental control of phonons (phononics) have brought into prominence the potential role of phonons in information processing. In this review, the many directions of realizing phononic computing and information processing are examined. Given the relative similarity of vibrational transport at different length scales, the related fields of acoustic, phononic, and thermal information processing are all included, as are quantum and classical computer implementations. Connections are made between the fundamental questions in phonon transport and phononic control and the device-level approach to diodes, transistors, memory, and logic.

  18. Modern Techniques in Acoustical Signal and Image Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Candy, J V

    2002-04-04

Acoustical signal processing problems can lead to some complex and intricate techniques to extract the desired information from noisy, sometimes inadequate, measurements. The challenge is to formulate a meaningful strategy aimed at performing the required processing even in the face of uncertainties. This strategy can be as simple as a transformation of the measured data to another domain for analysis or as complex as embedding a full-scale propagation model into the processor. The aims of both approaches are the same: to extract the desired information and reject the extraneous, that is, to develop a signal processing scheme that achieves this goal. In this paper, we briefly discuss this underlying philosophy from a "bottom-up" approach, enabling the problem to dictate the solution rather than vice versa.
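The "transformation of the measured data to another domain" strategy can be illustrated with a small frequency-domain sketch; `dominant_frequency` and the synthetic tone below are invented for illustration, not drawn from the paper:

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Strongest frequency component of a noisy measurement, found by
    transforming to the frequency domain. Skips the DC bin so a constant
    offset does not mask the tone."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1]

# A 60 Hz tone buried in noise, sampled at 1 kHz for one second.
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(1)
noisy = np.sin(2 * np.pi * 60.0 * t) + 0.5 * rng.standard_normal(t.size)
print(dominant_frequency(noisy, fs))  # 60.0
```

A tone that is hard to see in the raw waveform stands out as a single sharp peak in the spectrum, which is exactly the point of moving the analysis to another domain.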

  19. Process-Based Governance in Public Administrations Using Activity-Based Costing

    NASA Astrophysics Data System (ADS)

    Becker, Jörg; Bergener, Philipp; Räckers, Michael

Decision- and policy-makers in public administrations currently lack the relevant information needed for sufficient governance. In Germany, the introduction of New Public Management and double-entry accounting gives public administrations the opportunity to use cost-centered accounting mechanisms to establish new governance mechanisms. Process modelling can be a useful instrument here, helping decision- and policy-makers in public administrations to structure their activities and capture relevant information. In combination with approaches like Activity-Based Costing, higher management levels can be supported with a reasonable data base for fruitful governance approaches. The aim of this article is therefore to combine the public-sector-specific process modelling method PICTURE with the concept of Activity-Based Costing to support public administrations in process-based governance.

  20. Global Precipitation Measurement: GPM Microwave Imager (GMI) Algorithm Development Approach

    NASA Technical Reports Server (NTRS)

    Stocker, Erich Franz

    2009-01-01

    This slide presentation reviews the approach to the development of the Global Precipitation Measurement algorithm. This presentation includes information about the responsibilities for the development of the algorithm, and the calibration. Also included is information about the orbit, and the sun angle. The test of the algorithm code will be done with synthetic data generated from the Precipitation Processing System (PPS).

  1. The Impact of Hierarchy and Group Structure on Information Processing in Decision Making: Application of a Networks/Systems Approach.

    ERIC Educational Resources Information Center

    Ford, David L., Jr.

    When one engages in organizational diagnosis, it has been suggested that greater understanding of the organization can come through: (1) an identification of all the channels conveying material and information, and (2) a description of the means by which this communication influences the behavior of the organization. A networks/system approach is…

  2. Using the FAN Approach to Deepen Trauma-Informed Care for Infants, Toddlers, and Families

    ERIC Educational Resources Information Center

    Heffron, Mary Claire; Gilkerson, Linda; Cosgrove, Kimberly; Heller, Sherryl Scott; Imberger, Jaci; Leviton, Audrey; Mueller, Mary; Norris-Shortle, Carole; Phillips, Caroline; Spielman, Eda; Wasserman, Kate

    2016-01-01

    Erikson Institute Fussy Baby Network® (FBN) leaders from around the country have been considering the importance of building trauma-informed service programs. In this article, they discuss ways that the Facilitating Attuned Interaction (FAN) approach and the core processes used by the FAN can be helpful both when trauma is an unexpected presence…

  3. Examining the Utility of the Schoolwide Expectations Survey for Specific Settings (SESSS): A Data-Informed Approach to Developing Expectation Matrices

    ERIC Educational Resources Information Center

    Royer, David James

    2017-01-01

    To best support all students' academic, behavioral, and social needs, an integrated systems approach is necessary. In such systems, all faculty and staff ideally recognize student success is a shared responsibility and collaborate in a data-informed process to define common student behavioral expectations to facilitate success academically,…

  4. Spatial vision processes: From the optical image to the symbolic structures of contour information

    NASA Technical Reports Server (NTRS)

    Jobson, Daniel J.

    1988-01-01

    The significance of machine and natural vision is discussed together with the need for a general approach to image acquisition and processing aimed at recognition. An exploratory scheme is proposed which encompasses the definition of spatial primitives, intrinsic image properties and sampling, 2-D edge detection at the smallest scale, the construction of spatial primitives from edges, and the isolation of contour information from textural information. Concepts drawn from or suggested by natural vision at both perceptual and physiological levels are relied upon heavily to guide the development of the overall scheme. The scheme is intended to provide a larger context in which to place the emerging technology of detector array focal-plane processors. The approach differs from many recent efforts in edge detection and image coding by emphasizing smallest scale edge detection as a foundation for multi-scale symbolic processing while diminishing somewhat the importance of image convolutions with multi-scale edge operators. Cursory treatments of information theory illustrate that the direct application of this theory to structural information in images could not be realized.

  5. Knowledge-based expert systems and a proof-of-concept case study for multiple sequence alignment construction and analysis.

    PubMed

    Aniba, Mohamed Radhouene; Siguenza, Sophie; Friedrich, Anne; Plewniak, Frédéric; Poch, Olivier; Marchler-Bauer, Aron; Thompson, Julie Dawn

    2009-01-01

    The traditional approach to bioinformatics analyses relies on independent task-specific services and applications, using different input and output formats, often idiosyncratic, and frequently not designed to inter-operate. In general, such analyses were performed by experts who manually verified the results obtained at each step in the process. Today, the amount of bioinformatics information continuously being produced means that handling the various applications used to study this information presents a major data management and analysis challenge to researchers. It is now impossible to manually analyse all this information and new approaches are needed that are capable of processing the large-scale heterogeneous data in order to extract the pertinent information. We review the recent use of integrated expert systems aimed at providing more efficient knowledge extraction for bioinformatics research. A general methodology for building knowledge-based expert systems is described, focusing on the unstructured information management architecture, UIMA, which provides facilities for both data and process management. A case study involving a multiple alignment expert system prototype called AlexSys is also presented.

  6. Knowledge-based expert systems and a proof-of-concept case study for multiple sequence alignment construction and analysis

    PubMed Central

    Aniba, Mohamed Radhouene; Siguenza, Sophie; Friedrich, Anne; Plewniak, Frédéric; Poch, Olivier; Marchler-Bauer, Aron

    2009-01-01

    The traditional approach to bioinformatics analyses relies on independent task-specific services and applications, using different input and output formats, often idiosyncratic, and frequently not designed to inter-operate. In general, such analyses were performed by experts who manually verified the results obtained at each step in the process. Today, the amount of bioinformatics information continuously being produced means that handling the various applications used to study this information presents a major data management and analysis challenge to researchers. It is now impossible to manually analyse all this information and new approaches are needed that are capable of processing the large-scale heterogeneous data in order to extract the pertinent information. We review the recent use of integrated expert systems aimed at providing more efficient knowledge extraction for bioinformatics research. A general methodology for building knowledge-based expert systems is described, focusing on the unstructured information management architecture, UIMA, which provides facilities for both data and process management. A case study involving a multiple alignment expert system prototype called AlexSys is also presented. PMID:18971242

  7. Challenges of Obtaining Informed Consent in Emergency Ward: A Qualitative Study in One Iranian Hospital

    PubMed Central

    Davoudi, Nayyereh; Nayeri, Nahid Dehghan; Zokaei, Mohammad Saeed; Fazeli, Nematallah

    2017-01-01

Background and Objective: The emergency ward has unique characteristics that affect informed consent processes by creating specific challenges. It therefore seems necessary to identify the process and challenges of informed consent in the emergency ward through a qualitative study, to understand patients' and health care providers' actual experiences, beliefs, values, and feelings about informed consent there. Through such studies, new insight can be gained into the process of informed consent and its challenges, with the hope that the resulting knowledge will enable the promotion of ethical, legal, and effective health services to patients in the emergency ward. Method: In this qualitative study, the research field was one of the emergency wards of the educational, public hospitals in Iran. Field work and participant observation were carried out for 515 hours from June 2014 to March 2016. Conversations and semi-structured interviews based on the observations were also conducted. The participants of the study were nurses and physicians working in the emergency ward, as well as patients and their attendants who were involved in the process of obtaining informed consent. Results: Three main categories were extracted from the data: a sense of frustration; reverse protection; and a culture of paternalism in the consent process. Conclusion: The findings of this study can be used to correct the structures and processes of obtaining informed consent and to promote patients' ethical and legal care in the emergency ward. In this way, the approach to the consent process can change from paternalism to patient-centered care that protects patient autonomy. PMID:29399235

  8. The RiverFish Approach to Business Process Modeling: Linking Business Steps to Control-Flow Patterns

    NASA Astrophysics Data System (ADS)

    Zuliane, Devanir; Oikawa, Marcio K.; Malkowski, Simon; Alcazar, José Perez; Ferreira, João Eduardo

    Despite the recent advances in the area of Business Process Management (BPM), today’s business processes have largely been implemented without clearly defined conceptual modeling. This results in growing difficulties for identification, maintenance, and reuse of rules, processes, and control-flow patterns. To mitigate these problems in future implementations, we propose a new approach to business process modeling using conceptual schemas, which represent hierarchies of concepts for rules and processes shared among collaborating information systems. This methodology bridges the gap between conceptual model description and identification of actual control-flow patterns for workflow implementation. We identify modeling guidelines that are characterized by clear phase separation, step-by-step execution, and process building through diagrams and tables. The separation of business process modeling in seven mutually exclusive phases clearly delimits information technology from business expertise. The sequential execution of these phases leads to the step-by-step creation of complex control-flow graphs. The process model is refined through intuitive table and diagram generation in each phase. Not only does the rigorous application of our modeling framework minimize the impact of rule and process changes, but it also facilitates the identification and maintenance of control-flow patterns in BPM-based information system architectures.

  9. Radiology information system: a workflow-based approach.

    PubMed

    Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; van der Aalst, W M P

    2009-09-01

Introducing workflow management technology into healthcare seems promising for dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study the method of developing a workflow-based information system in a radiology department as a use case. First, a workflow model of a typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. Legacy systems could be taken as special components, which also corresponded to tasks and were integrated by transferring non-workflow-aware interfaces to standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes and that it enhanced process management in the department. It can also provide a more workflow-aware integration method compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing a radiology information system with more flexibility, more process management functionality, and more workflow-aware integration. The work of this paper is an initial endeavor toward introducing workflow management technology in healthcare.

  10. A contemporary view of systems engineering. [definition of system and discussion of systems approach

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1974-01-01

    The concept of a 'system' is defined, and the 'systems approach' is discussed. Four contemporary examples of the systems approach are presented: an operations research project, the planning-programming-budgeting system, an information processing system, and aerospace programs.

  11. Developing the skills required for evidence-based practice.

    PubMed

    French, B

    1998-01-01

The current health care environment requires practitioners with the skills to find and apply the best currently available evidence for effective health care, to contribute to the development of evidence-based practice protocols, and to evaluate the impact of utilizing validated research findings in practice. Current approaches to teaching research are based mainly on gaining skills by participation in the research process. Emphasis on the requirement for rigour in the process of creating new knowledge is assumed to lead to skill in the process of using research information created by others. This article reflects upon the requirements for evidence-based practice, and the degree to which current approaches to teaching research prepare practitioners who are able to find, evaluate and best use currently available research information. The potential for using the principles of systematic review as a teaching and learning strategy for research is explored, and some of the possible strengths and weaknesses of this approach are highlighted.

  12. Developing "My Asthma Diary": a process exemplar of a patient-driven arts-based knowledge translation tool.

    PubMed

    Archibald, Mandy M; Hartling, Lisa; Ali, Samina; Caine, Vera; Scott, Shannon D

    2018-06-05

    Although it is well established that family-centered education is critical to managing childhood asthma, the information needs of parents of children with asthma are not being met through current educational approaches. Patient-driven educational materials that leverage the power of the storytelling and the arts show promise in communicating health information and assisting in illness self-management. However, such arts-based knowledge translation approaches are in their infancy, and little is known about how to develop such tools for parents. This paper reports on the development of "My Asthma Diary" - an innovative knowledge translation tool based on rigorous research evidence and tailored to parents' asthma-related information needs. We used a multi-stage process to develop four eBook prototypes of "My Asthma Diary." We conducted formative research on parents' information needs and identified high quality research evidence on childhood asthma, and used these data to inform the development of the asthma eBooks. We established interdisciplinary consulting teams with health researchers, practitioners, and artists to help iteratively create the knowledge translation tools. We describe the iterative, transdisciplinary process of developing asthma eBooks which incorporates: (I) parents' preferences and information needs on childhood asthma, (II) quality evidence on childhood asthma and its management, and (III) the engaging and informative powers of storytelling and visual art as methods to communicate complex health information to parents. We identified four dominant methodological and procedural challenges encountered during this process: (I) working within an inter-disciplinary team, (II) quantity and ordering of information, (III) creating a composite narrative, and (IV) balancing actual and ideal management scenarios. 
We describe a replicable and rigorous multi-staged approach to developing a patient-driven, creative knowledge translation tool, which can be adapted for use with different populations and contexts. We identified specific procedural and methodological challenges that others conducting comparable work should consider, particularly as creative, patient-driven knowledge translation strategies continue to emerge across health disciplines.

  13. Effects of digital altimetry on pilot workload

    NASA Technical Reports Server (NTRS)

    Harris, R. L., Sr.; Glover, B. J.

    1985-01-01

A series of VOR-DME instrument landing approaches was flown in the DC-9 full-workload simulator to compare pilot performance, scan behavior, and workload when using a counter-drum-pointer altimeter (CDPA) and a digital altimeter (DA). Six pilots executed two sets of instrument landing approaches, with a CDPA on one set and a DA on the other set. Pilot scanning parameters, flight performance, and subjective opinion data were evaluated. It is found that the processes of gathering information from the CDPA and the DA are different. The DA requires a higher mental workload than the CDPA for a VOR-DME type landing approach. Mental processing of altitude information after transitioning back to the attitude indicator is more evident with the DA than with the CDPA.

  14. A pivotal-based approach for enterprise business process and IS integration

    NASA Astrophysics Data System (ADS)

    Ulmer, Jean-Stéphane; Belaud, Jean-Pierre; Le Lann, Jean-Marc

    2013-02-01

A company must be able to describe and react to any endogenous or exogenous event. Such flexibility can be achieved through business process management (BPM). Nevertheless, a BPM approach highlights complex relations between the business and IT domains. A non-alignment is exposed between heterogeneous models: this is the 'business-IT gap' described in the literature. Through concepts from business engineering and information systems driven by models and IT, we define a generic approach ensuring multi-view consistency. Its role is to maintain and provide all information related to the structure and semantics of models. By allowing the full return of a transformed model, in the sense of reverse engineering, our platform enables synchronisation between the analysis model and the implementation model.

  15. Transforming Higher Education and Student Engagement through Collaborative Review to Inform Educational Design

    ERIC Educational Resources Information Center

    von Konsky, Brian R.; Martin, Romana; Bolt, Susan; Broadley, Tania; Ostashewski, Nathaniel

    2014-01-01

    This paper reports on staff perceptions arising from a review process designed to assist staff in making informed decisions regarding educational design, approaches to engage students in learning, and the technology to support engagement in the classroom and across multiple locations and delivery modes. The aim of the review process was to…

  16. Examination of the Nonlinear Dynamic Systems Associated with Science Student Cognition While Engaging in Science Information Processing

    ERIC Educational Resources Information Center

    Lamb, Richard; Cavagnetto, Andy; Akmal, Tariq

    2016-01-01

    A critical problem with the examination of learning in education is that there is an underlying assumption that the dynamic systems associated with student information processing can be measured using static linear assessments. This static linear approach does not provide sufficient ability to characterize learning. Much of the modern research…

  17. A Systems Approach to the Design and Operation of Effective Marketing Programs in Community Colleges.

    ERIC Educational Resources Information Center

    Scigliano, John A.

    1983-01-01

    Presents a research-based marketing model consisting of an environmental scanning process, a series of marketing audits, and an information-processing scheme. Views the essential elements of college marketing as information flow; high-level, long-term commitment; diverse strategies; innovation; and a broad view of marketing. Includes a marketing…

  18. An Efficient Workflow Environment to Support the Collaborative Development of Actionable Climate Information Using the NCAR Climate Risk Management Engine (CRMe)

    NASA Astrophysics Data System (ADS)

    Ammann, C. M.; Vigh, J. L.; Lee, J. A.

    2016-12-01

    Society's growing needs for robust and relevant climate information have fostered an explosion in tools and frameworks for processing climate projections. Many top-down workflows might be employed to generate sets of pre-computed data and plots, frequently served in a "loading-dock style" through a metadata-enabled search and discovery engine. Despite these increasing resources, the diverse needs of applications-driven projects often result in data processing workflow requirements that cannot be fully satisfied using past approaches. In parallel to the data processing challenges, the provision of climate information to users in a form that is also usable represents a formidable challenge of its own. Finally, many users have neither the time nor the desire to synthesize and distill massive volumes of climate information to find the relevant information for their particular application. All of these considerations call for new approaches to developing actionable climate information. CRMe seeks to bridge the gap between the diversity and richness of the bottom-up needs of practitioners and the discrete, structured top-down workflows typically implemented for rapid delivery. Additionally, CRMe has implemented web-based data services capable of providing focused climate information in usable form for a given location, or as spatially aggregated information for entire regions or countries, according to the needs of users and sectors. Making climate data actionable also involves summarizing and presenting it in concise and approachable ways. CRMe is developing the concept of dashboards, co-developed with the users, to condense the key information into a quick summary of the most relevant, curated climate data for a given discipline, application, or location, while still enabling users to efficiently conduct deeper discovery into rich datasets on an as-needed basis.

  19. Extracting features of Gaussian self-similar stochastic processes via the Bandt-Pompe approach.

    PubMed

    Rosso, O A; Zunino, L; Pérez, D G; Figliola, A; Larrondo, H A; Garavaglia, M; Martín, M T; Plastino, A

    2007-12-01

    By recourse to appropriate information-theoretic quantifiers (the normalized Shannon entropy and the Martín-Plastino-Rosso intensive statistical complexity measure), we revisit the characterization of Gaussian self-similar stochastic processes from a Bandt-Pompe viewpoint. We show that the ensuing approach exhibits considerable advantages with respect to other treatments. In particular, clear quantifier gaps are found in the transition between the continuous processes and their associated noises.
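
    The Bandt-Pompe approach evaluates such quantifiers over the distribution of ordinal (permutation) patterns of a time series. Below is a minimal sketch of the normalized permutation entropy; the function names and the default embedding dimension are illustrative, not taken from the paper:

```python
import math
from itertools import permutations

def ordinal_pattern_distribution(series, d=3):
    """Probability of each ordinal (permutation) pattern of embedding dimension d."""
    patterns = {p: 0 for p in permutations(range(d))}
    n = len(series) - d + 1
    for i in range(n):
        window = series[i:i + d]
        # Rank the d values; ties are broken by order of appearance
        # (the usual Bandt-Pompe convention).
        pattern = tuple(sorted(range(d), key=lambda k: window[k]))
        patterns[pattern] += 1
    return {p: c / n for p, c in patterns.items()}

def normalized_permutation_entropy(series, d=3):
    """Shannon entropy of the ordinal-pattern distribution, normalized to [0, 1]."""
    probs = ordinal_pattern_distribution(series, d)
    h = -sum(p * math.log(p) for p in probs.values() if p > 0)
    return h / math.log(math.factorial(d))

# A monotone series realizes a single ordinal pattern, so its entropy is 0;
# an irregular series spreads probability over many patterns, pushing it toward 1.
print(normalized_permutation_entropy(list(range(100))))  # → 0.0
```

    The "quantifier gaps" mentioned in the abstract would then appear as jumps in such entropy (and complexity) values across the family of processes.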

  20. Remediation of information processing following traumatic brain injury: a community-based rehabilitation approach.

    PubMed

    Ashley, Mark J; Ashley, Jessica; Kreber, Lisa

    2012-01-01

    Traumatic brain injury (TBI) results in disruption of information processing via damage to primary, secondary, and tertiary cortical regions, as well as subcortical pathways supporting information flow within and between cortical structures. TBI predominantly affects the anterior frontal poles, anterior temporal poles, white matter tracts and medial temporal structures. Fundamental information processing skills such as attention, perceptual processing, categorization and cognitive distance are concentrated within these same regions and are frequently disrupted following injury. Information processing skills improve in accordance with the extent to which residual frontal and temporal neurons can be encouraged to recruit and bias neuronal networks, or the degree to which the functional connectivity of neural networks can be re-established, resulting in the re-emergence or regeneration of specific cognitive skills. Higher-order cognitive processes, i.e., memory, reasoning, problem solving and other executive functions, are dependent upon the integrity of attention, perceptual processing, categorization, and cognitive distance. A therapeutic construct for the treatment of attention, perceptual processing, categorization and cognitive distance deficits is presented, along with an interventional model to encourage the re-emergence or regeneration of these fundamental information processing skills.

  1. The economic value of drought information: Application to water resources management decisions in Spain

    NASA Astrophysics Data System (ADS)

    Garrote, Luis; Sordo, Alvaro; Iglesias, Ana

    2016-04-01

    Information is valuable when it improves decision-making (e.g., actions can be adjusted to better suit the situation at hand) and enables the mitigation of damage. However, quantifying the value of information is often difficult. Here we explore a general approach to understanding the economic value of drought information for water managers, framing our approach within the precautionary principle, which reminds us that uncertainty is not a reason to postpone or avoid action. We explore how decision making can disregard uncertain effects, taking a short-term approach and focusing instead on the certain costs and benefits of taking action. Two main questions arise: How do we know that advanced drought information is actually helping decisions? And what is the value of information in the decision process? The approach is applied to several regulated water resources systems in Spain. It first views drought information as a factor in the decision process that can be used by water managers to reduce uncertainty. Second, the value of drought information is the expected gain in a decision outcome (utility) from using additional information. Finally, the gains of improved information are compared with the information collection costs. Here we estimate the value by taking into account the accuracy of the drought information, the subjective probabilities about the value, analyzed as Bayesian probabilities, and the ability or skill of the stakeholders to apply the drought information to modify their actions. Since information may be considered a public good (non-rival and non-excludable), its provision may justify public policy, considering social costs and benefits.
The application of the framework to the Spanish case studies shows that information benefits exceed costs when drought frequency is 20-40% above normal values; below these values, uncertainty in the decisions dominates the results; above these values, management decisions are limited even with perfect information.
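
    The "expected gain in a decision outcome from using additional information" can be sketched as a toy Bayesian decision problem. All payoffs, prior probabilities, and the forecast accuracy below are invented for illustration; they are not values from the study:

```python
# Hypothetical setup: decide whether to impose water restrictions ahead of a drought.
prior_drought = 0.3   # assumed prior probability of drought
accuracy = 0.8        # assumed P(forecast "drought" | drought) = P("normal" | normal)

# utility[action][state]: illustrative payoffs of each action under each state
utility = {
    "restrict":    {"drought": -10,  "normal": -20},
    "no_restrict": {"drought": -100, "normal": 0},
}

def expected_utility(action, p_drought):
    u = utility[action]
    return p_drought * u["drought"] + (1 - p_drought) * u["normal"]

def best_eu(p_drought):
    """Utility of the best action given a belief about drought probability."""
    return max(expected_utility(a, p_drought) for a in utility)

# Without the forecast: act on the prior alone.
eu_prior = best_eu(prior_drought)

# With the forecast: Bayes-update on each possible forecast, weighted by its probability.
p_says_drought = accuracy * prior_drought + (1 - accuracy) * (1 - prior_drought)
post_if_drought = accuracy * prior_drought / p_says_drought
post_if_normal = (1 - accuracy) * prior_drought / (1 - p_says_drought)

eu_info = (p_says_drought * best_eu(post_if_drought)
           + (1 - p_says_drought) * best_eu(post_if_normal))

# The (expected) value of information is the gain from deciding after the forecast.
value_of_information = eu_info - eu_prior
print(value_of_information)
```

    Comparing `value_of_information` against the cost of collecting the forecast then mirrors the final step of the framework.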

  2. Author’s response: A universal approach to modeling visual word recognition and reading: not only possible, but also inevitable.

    PubMed

    Frost, Ram

    2012-10-01

    I have argued that orthographic processing cannot be understood and modeled without considering the manner in which orthographic structure represents phonological, semantic, and morphological information in a given writing system. A reading theory, therefore, must be a theory of the interaction of the reader with his/her linguistic environment. This outlines a novel approach to studying and modeling visual word recognition, an approach that focuses on the common cognitive principles involved in processing printed words across different writing systems. These claims were challenged by several commentaries that contested the merits of my general theoretical agenda, the relevance of the evolution of writing systems, and the plausibility of finding commonalities in reading across orthographies. Other commentaries extended the scope of the debate by bringing into the discussion additional perspectives. My response addresses all these issues. By considering the constraints of neurobiology on modeling reading, developmental data, and a large scope of cross-linguistic evidence, I argue that front-end implementations of orthographic processing that do not stem from a comprehensive theory of the complex information conveyed by writing systems do not present a viable approach for understanding reading. The common principles by which writing systems have evolved to represent orthographic, phonological, and semantic information in a language reveal the critical distributional characteristics of orthographic structure that govern reading behavior. Models of reading should thus be learning models, primarily constrained by cross-linguistic developmental evidence that describes how the statistical properties of writing systems shape the characteristics of orthographic processing. When this approach is adopted, a universal model of reading is possible.

  3. The distributed agent-based approach in the e-manufacturing environment

    NASA Astrophysics Data System (ADS)

    Sękala, A.; Kost, G.; Dobrzańska-Danikiewicz, A.; Banaś, W.; Foit, K.

    2015-11-01

    The lack of a coherent flow of information from a production department causes unplanned downtime and failures of machines and their equipment, which in turn means that production planning is based on incorrect and out-of-date information. All of these factors entail, as a consequence, additional difficulties in the decision-making process. They concern, among others, the coordination of the components of a distributed system and the provision of access to the required information, thereby generating unnecessary costs. The use of agent technology significantly speeds up the flow of information within the virtual enterprise. This paper proposes a multi-agent approach for the integration of processes within the virtual enterprise concept. The presented concept was elaborated to investigate possible ways of transmitting information in the production system, taking into account the self-organization of its constituent components. It thus links the concept of a multi-agent system with a system for managing production information, based on the idea of e-manufacturing. The paper presents the resulting scheme, which should serve as the basis for an informatics model of the target virtual system. The computer system itself is intended to be developed next.

  4. The general entity of life: a cybernetic approach.

    PubMed

    Bielecki, Andrzej

    2015-06-01

    Life is considered in this paper not only in the well-known context of biochemical metabolism but also in the context of hypothetical life synthesized in the laboratory or possibly found on other planets. The three-component information-energetic-structural irreducible processing in autonomous systems is the core of the proposed approach. The cybernetic organization of a general entity of life, the alivon, is postulated. The crucial properties of life and evolution are derived from the proposed approach. Information encoded in biological structures is also studied.

  5. The Predictive Influence of Family and Neighborhood Assets on Fighting and Weapon Carrying from Mid- to Late-Adolescence

    PubMed Central

    Haegerich, Tamara M.; Oman, Roy F.; Vesely, Sara K.; Aspy, Cheryl B.; Tolma, Eleni L.

    2015-01-01

    Using a developmental, social-ecological approach to understand the etiology of health risk behavior and inform primary prevention efforts, we assess the predictive effects of family and neighborhood social processes on youth physical fighting and weapon carrying. Specifically, we focus on relationships among youth and their parents, family communication, and parental monitoring, as well as sense of community and neighborhood informal social control, support, concerns, and disorder. This study advances knowledge through its investigation of family and neighborhood structural factors and social processes together, employment of longitudinal models that estimate effects over adolescent development, and use of self-report and observational measures. Data from 1,093 youth/parent pairs were analyzed from the Youth Assets Study using a Generalized Estimating Equation (GEE) approach; family and neighborhood assets and risks were analyzed as time-varying and lagged. Similar family assets affected physical fighting and weapon carrying, whereas different neighborhood social processes influenced the two forms of youth violence. Study findings have implications for the primary prevention of youth violence, including the use of family-based approaches that build relationships and parental monitoring skills, and community-level change approaches that promote informal social control and reduce neighborhood concerns about safety. PMID:23677457

  6. Culturally Competent Informed-Consent Process to Evaluate a Social Policy for Older Persons With Low Literacy: The Mexican Case

    PubMed Central

    Aguila, Emma; Weidmer, Beverly A.; Illingworth, Alfonso Rivera; Martinez, Homero

    2017-01-01

    The informed-consent process seeks to provide complete information to participants about a research project and to protect personal information they may disclose. In this article, we present an informed-consent process that we piloted and improved to obtain consent from older adults in Yucatan, Mexico. Respondents had limited fluency in Spanish, spoke the local Mayan language, and had some physical limitations due to their age. We describe how we adapted the informed-consent process to comply with U.S. and Mexican regulations, while simplifying the forms and providing them in Spanish and Mayan. We present the challenges and lessons learned when dealing with low-literacy older populations, some with diminished autonomy, in a bilingual context and a binational approach to the legal framework. PMID:28824826

  7. Five Faces of Cognition: Theoretical Influences on Approaches to Learning Disabilities.

    ERIC Educational Resources Information Center

    Hresko, Wayne P.; Reid, D. Kim

    1981-01-01

    The label "cognitive" has been used to designate five substantially different approaches to the study of learning disabilities: information processing, metacognition, genetic epistemology, cognitive behavior modification, and the specific abilities model. (Author)

  8. Combining cognitive engineering and information fusion architectures to build effective joint systems

    NASA Astrophysics Data System (ADS)

    Sliva, Amy L.; Gorman, Joe; Voshell, Martin; Tittle, James; Bowman, Christopher

    2016-05-01

    The Dual Node Decision Wheels (DNDW) architecture concept was previously described as a novel approach toward integrating analytic and decision-making processes in joint human/automation systems in highly complex sociotechnical settings. In this paper, we extend the DNDW construct with a description of components in this framework, combining structures of the Dual Node Network (DNN) for Information Fusion and Resource Management with extensions on Rasmussen's Decision Ladder (DL) to provide guidance on constructing information systems that better serve decision-making support requirements. The DNN takes a component-centered approach to system design, decomposing each asset in terms of data inputs and outputs according to their roles and interactions in a fusion network. However, to ensure relevancy to and organizational fit within command and control (C2) processes, principles from cognitive systems engineering emphasize that system design must take a human-centered systems view, integrating information needs and decision-making requirements to drive the architecture design and capabilities of network assets. In the current work, we present an approach for structuring and assessing DNDW systems that uses a unique hybrid of DNN top-down system design with human-centered process design, combining DNN node decomposition with artifacts from cognitive analysis (i.e., system abstraction decomposition models, decision ladders) to provide work-domain and task-level insights at different levels in an example intelligence, surveillance, and reconnaissance (ISR) system setting. This DNDW structure will ensure not only that the information fusion technologies and processes are structured effectively, but also that the resulting information products will align with the requirements of human decision makers and be adaptable to different work settings.

  9. A model-driven approach to information security compliance

    NASA Astrophysics Data System (ADS)

    Correia, Anacleto; Gonçalves, António; Teodoro, M. Filomena

    2017-06-01

    The availability, integrity and confidentiality of information are fundamental to the long-term survival of any organization. Information security is a complex issue that must be holistically approached, combining assets that support corporate systems in an extended network of business partners, vendors, customers and other stakeholders. This paper addresses the conception and implementation of information security systems, conforming to the ISO/IEC 27000 set of standards, using the model-driven approach. The process begins with the conception of a domain-level model (computation independent model) based on the information security vocabulary present in the ISO/IEC 27001 standard. Based on this model, after embedding in the model mandatory rules for attaining ISO/IEC 27001 conformance, a platform-independent model is derived. Finally, a platform-specific model serves as the basis for testing the compliance of information security systems with the ISO/IEC 27000 set of standards.

  10. Multiscale analysis of information dynamics for linear multivariate processes.

    PubMed

    Faes, Luca; Montalto, Alessandro; Stramaglia, Sebastiano; Nollo, Giandomenico; Marinazzo, Daniele

    2016-08-01

    In the study of complex physical and physiological systems represented by multivariate time series, an issue of great interest is the description of the system dynamics over a range of different temporal scales. While information-theoretic approaches to the multiscale analysis of complex dynamics are being increasingly used, the theoretical properties of the applied measures are poorly understood. This study introduces for the first time a framework for the analytical computation of information dynamics for linear multivariate stochastic processes explored at different time scales. After showing that the multiscale processing of a vector autoregressive (VAR) process introduces a moving average (MA) component, we describe how to represent the resulting VARMA process using state-space (SS) models and how to exploit the SS model parameters to compute analytical measures of information storage and information transfer for the original and rescaled processes. The framework is then used to quantify multiscale information dynamics for simulated unidirectionally and bidirectionally coupled VAR processes, showing that rescaling may lead to insightful patterns of information storage and transfer but also to potentially misleading behaviors.
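
    The coarse-graining step underlying such multiscale analysis can be illustrated with a simple proxy: average an AR(1) series over non-overlapping windows and watch its lag-1 autocorrelation (a crude stand-in for information storage) shrink with scale. This is only a sketch under invented parameters; it does not implement the paper's analytical state-space computation:

```python
import random

def ar1(n, phi=0.8, seed=42):
    """Simulate a first-order autoregressive process x_t = phi * x_{t-1} + noise."""
    rng = random.Random(seed)
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + rng.gauss(0, 1)
        out.append(x)
    return out

def coarse_grain(series, scale):
    """Average non-overlapping windows of length `scale` (the usual multiscale step)."""
    return [sum(series[i:i + scale]) / scale
            for i in range(0, len(series) - scale + 1, scale)]

def lag1_autocorr(series):
    n = len(series)
    mean = sum(series) / n
    var = sum((v - mean) ** 2 for v in series)
    cov = sum((series[i] - mean) * (series[i + 1] - mean) for i in range(n - 1))
    return cov / var

x = ar1(20000)
for scale in (1, 2, 5, 10):
    # Autocorrelation of the rescaled series decreases as the scale grows.
    print(scale, round(lag1_autocorr(coarse_grain(x, scale)), 3))
```

    The abstract's point is that this rescaling also changes the model class (VAR becomes VARMA), which is why naive multiscale estimates can mislead without the analytical treatment.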

  11. An investigation into social information processing in young people with Asperger syndrome.

    PubMed

    Flood, Andrea Mary; Julian Hare, Dougal; Wallis, Paul

    2011-09-01

    Deficits in social functioning are a core feature of autistic spectrum disorders (ASD), being linked to various cognitive and developmental factors, but there has been little attempt to draw on normative models of social cognition to understand social behaviour in ASD. The current study explored the utility of Crick and Dodge's (1994) information processing model for studying social cognition in ASD, and examined associations between social information processing patterns, theory of mind skills and social functioning. A matched-group design compared young people with Asperger syndrome with typically developing peers, using a social information processing interview previously designed for this purpose. The Asperger syndrome group showed significantly different patterns of information processing at the intent attribution, response generation and response evaluation stages of the information processing model. Theory of mind skills were found to be significantly associated with parental ratings of peer problems in the Asperger syndrome group but not with parental ratings of pro-social behaviour, with only limited evidence of an association between social information processing and measures of theory of mind and social functioning. Overall, the study supports the use of normative social information processing approaches to understanding social functioning in ASD.

  12. Managing Information Resources: New Directions in State Government.

    ERIC Educational Resources Information Center

    Caudle, Sharon L.; Marchand, Donald A.

    1990-01-01

    Describes a national survey of management policies and practices applied to information and information technology in state government. Management approaches and trends are discussed in the areas of data processing, telecommunications, office automation, records management, state library services, policy formation, budgeting and accounting,…

  13. A conceptual model of the automated credibility assessment of the volunteered geographic information

    NASA Astrophysics Data System (ADS)

    Idris, N. H.; Jackson, M. J.; Ishak, M. H. I.

    2014-02-01

    The use of Volunteered Geographic Information (VGI) for collecting, sharing and disseminating geospatially referenced information on the Web is increasingly common. The potential of this localized and collective information has been seen to complement the maintenance of authoritative mapping data sources and to support the development of Digital Earth. The main barrier to using these data in this bottom-up approach is the credibility (trust), completeness, accuracy, and quality of both the data input and the outputs generated. The only feasible way to assess these data is to rely on an automated process. This paper describes a conceptual model of indicators (parameters) and practical approaches to automatically assess the credibility of information contributed through VGI, including map mashups, Geo Web and crowd-sourced applications. Two main components are proposed for assessment in the conceptual model: metadata and data. The metadata component comprises indicators of the hosting (websites) and the sources of the data/information. The data component comprises indicators assessing absolute and relative data positioning, attribute, thematic, temporal and geometric correctness and consistency. This paper suggests approaches to assess these components. To assess the metadata component, automated text categorization using supervised machine learning is proposed. To assess correctness and consistency in the data component, we suggest a matching validation approach using emerging technologies from Linked Data infrastructures and third-party review validation. This study contributes to the research domain that focuses on the credibility, trust and quality of data contributed by citizen web providers.
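
    The automated text categorization proposed for the metadata component could, for instance, be a simple supervised classifier. The sketch below uses a naive Bayes model; the training snippets, labels, and class names are invented for illustration and are not the paper's actual indicators or data:

```python
import math
from collections import Counter, defaultdict

# Toy training data: hosting/source descriptions labeled by assumed credibility.
train = [
    ("official municipal open data portal with documented sources", "credible"),
    ("government survey agency publishes verified boundaries", "credible"),
    ("anonymous blog post claims secret map leaked", "dubious"),
    ("unverified forum rumor about road closure", "dubious"),
]

class NaiveBayes:
    def fit(self, samples):
        self.word_counts = defaultdict(Counter)
        self.label_counts = Counter()
        for text, label in samples:
            self.label_counts[label] += 1
            self.word_counts[label].update(text.split())
        self.vocab = {w for counts in self.word_counts.values() for w in counts}
        return self

    def predict(self, text):
        def log_prob(label):
            total = sum(self.word_counts[label].values())
            lp = math.log(self.label_counts[label] / sum(self.label_counts.values()))
            for w in text.split():
                # Laplace smoothing over the shared vocabulary.
                lp += math.log((self.word_counts[label][w] + 1) / (total + len(self.vocab)))
            return lp
        return max(self.label_counts, key=log_prob)

clf = NaiveBayes().fit(train)
print(clf.predict("official portal with verified documented sources"))  # → credible
```

    In practice the features would be the model's metadata indicators rather than raw words, and the training labels would come from curated assessments.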

  14. SUPPORT Tools for evidence-informed health Policymaking (STP) 1: What is evidence-informed policymaking?

    PubMed Central

    2009-01-01

    This article is part of a series written for people responsible for making decisions about health policies and programmes and for those who support these decision makers. In this article, we discuss the following three questions: What is evidence? What is the role of research evidence in informing health policy decisions? What is evidence-informed policymaking? Evidence-informed health policymaking is an approach to policy decisions that aims to ensure that decision making is well-informed by the best available research evidence. It is characterised by the systematic and transparent access to, and appraisal of, evidence as an input into the policymaking process. The overall process of policymaking is not assumed to be systematic and transparent. However, within the overall process of policymaking, systematic processes are used to ensure that relevant research is identified, appraised and used appropriately. These processes are transparent in order to ensure that others can examine what research evidence was used to inform policy decisions, as well as the judgements made about the evidence and its implications. Evidence-informed policymaking helps policymakers gain an understanding of these processes. PMID:20018099

  15. Industrial process system assessment: bridging process engineering and life cycle assessment through multiscale modeling.

    EPA Science Inventory

    The Industrial Process System Assessment (IPSA) methodology is a multiple step allocation approach for connecting information from the production line level up to the facility level and vice versa using a multiscale model of process systems. The allocation procedure assigns inpu...

  16. From a physical approach to earthquake prediction, towards long and short term warnings ahead of large earthquakes

    NASA Astrophysics Data System (ADS)

    Stefansson, R.; Bonafede, M.

    2012-04-01

    For 20 years the South Iceland Seismic Zone (SISZ) was a test site for multinational earthquake prediction research, partly bridging the gap between laboratory test samples and the huge transform zones of the Earth. The approach was to explore the physics of processes leading up to large earthquakes. The book Advances in Earthquake Prediction, Research and Risk Mitigation, by R. Stefansson (2011), published by Springer/PRAXIS, and an article in the August issue of the BSSA by Stefansson, M. Bonafede and G. Gudmundsson (2011) contain a good overview of the findings and further references, as well as examples of partially successful long- and short-term warnings based on such an approach. Significant findings are: Earthquakes that occurred hundreds of years ago left scars in the crust, expressed in volumes of heterogeneity that demonstrate the size of their faults. Rheology and stress heterogeneity within these volumes are significantly variable in time and space. Crustal processes in and near such faults may be observed through microearthquake information decades before the sudden onset of a new large earthquake. High-pressure fluids of mantle origin may, in response to strain, especially near plate boundaries, migrate upward into the brittle/elastic crust and play a significant role in modifying crustal conditions over the long and short term. Preparatory processes of various earthquakes cannot be expected to be the same. We learn about an impending earthquake by observing long-term preparatory processes at the fault, finding a constitutive relationship that governs the processes, and then extrapolating that relationship into nearby space and the future. This is a deterministic approach to earthquake prediction research. Such extrapolations contain many uncertainties. However, the long-term pattern of observations of the pre-earthquake fault process will help us to put probability constraints on our extrapolations and our warnings.
The approach described differs from the usual statistical approach based on universal precursors or stress levels. It is more related to failure physics, studying the ongoing failure, but it requires watching and relevant modeling for years, even decades. Useful information on the fault process and warnings can be issued along the way, starting when we discover a fault showing signs of preparatory processes and continuing up to the time of the earthquake. Such information and warnings could be issued by government agencies, in cooperation with scientists, to the local Civil Protection committee closest to the fault, with information about how to prepare, including directives about enhanced watching. For such a warning service we need a continuously operating geo-watching system, applying modern computing technology to the multidisciplinary data, and a rule-based schedule to prepare adequate warnings.

  17. Intelligent alarming

    NASA Technical Reports Server (NTRS)

    Braden, W. B.

    1992-01-01

    This talk discusses the importance of providing a process operator with concise information about a process fault, including a root-cause diagnosis of the problem, a suggested best action for correcting the fault, and prioritization of the problem set. A decision tree approach is used to illustrate one type of approach for determining the root cause of a problem. Fault detection in several different types of scenarios is addressed, including pump malfunctions and pipeline leaks. The talk stresses the need for a good data rectification strategy and good process models, along with a method for presenting the findings to the process operator in a focused and understandable way. A real-time expert system is discussed as an effective tool to help provide operators with this type of information. The use of expert systems in the analysis of actual versus predicted results from neural networks and other types of process models is discussed.
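
    A decision-tree diagnosis of the kind described can be sketched as a small rule tree mapping process readings to a root cause, a suggested best action, and a priority. All thresholds, reading names, and rules below are hypothetical, not from the talk:

```python
def diagnose(readings):
    """Walk a small decision tree from process readings to (cause, action, priority)."""
    if readings["flow"] < 0.5 * readings["flow_setpoint"]:
        if readings["pump_current"] < 0.2 * readings["pump_current_rated"]:
            return ("pump not running", "check power supply and restart pump", "high")
        if readings["discharge_pressure"] < readings["suction_pressure"]:
            return ("possible pipeline leak", "isolate segment and dispatch inspection", "high")
        return ("partial blockage suspected", "inspect strainer and valves", "medium")
    return ("no fault detected", "continue monitoring", "low")

cause, action, priority = diagnose({
    "flow": 10, "flow_setpoint": 100,
    "pump_current": 2, "pump_current_rated": 50,
    "discharge_pressure": 5, "suction_pressure": 3,
})
print(cause, "|", action, "|", priority)  # → pump not running | check power supply and restart pump | high
```

    In a real system the readings would first pass through data rectification and model-based prediction, with an expert system choosing and prioritizing among many such rule trees.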

  18. The New Approaches to Organization of Students' Individual Work in Foreign Language Learning in Ukraine and Abroad

    ERIC Educational Resources Information Center

    Lysak, Halyna; Martynyuk, Olena

    2017-01-01

    Different approaches to organization of students' individual work using information technologies in Ukraine and abroad have been presented in the paper. The authors have analyzed the concept and role of students' individual work in the language learning process. It has been revealed that students' individual work is a rather flexible process and…

  19. Approaches to evaluating climate change impacts on species: A guide to initiating the adaptation planning process

    Treesearch

    Erika L. Rowland; Jennifer E. Davison; Lisa J. Graumlich

    2011-01-01

    Assessing the impact of climate change on species and associated management objectives is a critical initial step for engaging in the adaptation planning process. Multiple approaches are available. While all possess limitations to their application associated with the uncertainties inherent in the data and models that inform their results, conducting and incorporating...

  20. Transforming Information Literacy Conversations to Enhance Student Learning: New Curriculum Dialogues

    ERIC Educational Resources Information Center

    Salisbury, Fiona A.; Karasmanis, Sharon; Robertson, Tracy; Corbin, Jenny; Hulett, Heather; Peseta, Tai L.

    2012-01-01

    Information literacy is an essential component of the La Trobe University inquiry/research graduate capability and it provides the skill set needed for students to take their first steps on the path to engaging with academic information and scholarly communication processes. A deep learning approach to information literacy can be achieved if…

  1. Information security management system planning for CBRN facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lenaeu, Joseph D.; O'Neil, Lori Ross; Leitch, Rosalyn M.

    The focus of this document is to provide guidance for the development of information security management system planning documents at chemical, biological, radiological, or nuclear (CBRN) facilities. It describes a risk-based approach for planning information security programs based on the sensitivity of the data developed, processed, communicated, and stored on facility information systems.

  2. A Teacher's Window into the Child's Mind and Papers from the Institute for Neuro-Physiological Psychology. A Non-Invasive Approach to Solving Learning and Behavior Problems.

    ERIC Educational Resources Information Center

    Goddard, Sally

    This book describes a neuro-developmental approach to learning difficulty assessment and remediation through assessment of a student's reception of information through the sensory channels, processing of sensory information in the brain, and repertoire of responses for expression. Chapter 1, "Reflexes: Their Impact on Success or Failure in…

  3. An adaptive semantic based mediation system for data interoperability among Health Information Systems.

    PubMed

    Khan, Wajahat Ali; Khattak, Asad Masood; Hussain, Maqbool; Amin, Muhammad Bilal; Afzal, Muhammad; Nugent, Christopher; Lee, Sungyoung

    2014-08-01

    Heterogeneity in the management of complex medical data obstructs the attainment of data-level interoperability among Health Information Systems (HIS). This diversity stems from the compliance of HISs with different healthcare standards. Its solution demands a mediation system that accurately interprets data in different heterogeneous formats to achieve data interoperability. We propose an adaptive mediation system, the AdapteR Interoperability ENgine (ARIEN), that arbitrates between HISs compliant with different healthcare standards for accurate and seamless information exchange. ARIEN stores the semantic mapping information between different standards in the Mediation Bridge Ontology (MBO) using ontology matching techniques. These mappings are provided by our System for Parallel Heterogeneity (SPHeRe) matching system and the Personalized-Detailed Clinical Model (P-DCM) approach to guarantee mapping accuracy. The effectiveness of the mappings stored in the MBO is realized by evaluating the accuracy of the transformation process between different standard formats. We evaluated our proposed system on the transformation of medical records between the Clinical Document Architecture (CDA) and Virtual Medical Record (vMR) standards. The transformation process achieved over 90% accuracy in conversion between the CDA and vMR standards using a pattern-oriented approach drawing on the MBO. The proposed mediation system improves the overall communication process between HISs and provides accurate, seamless medical information exchange to ensure data interoperability and timely healthcare services to patients.
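    A toy sketch of the mediation idea (all field names and mapping entries below are invented for illustration; ARIEN's actual MBO-based, pattern-oriented transformation is far richer than a flat rename):

```python
# Toy mediation between two hypothetical record formats, in the spirit of
# the semantic mappings stored in a mediation bridge ontology.

CDA_TO_VMR = {
    "patient/name": "subject/fullName",
    "patient/birthTime": "subject/dateOfBirth",
    "observation/code": "finding/code",
}

def transform(record, mapping):
    """Rewrite a flat record into the target standard's field names;
    fields without a known mapping are dropped, mirroring the idea that
    only mapped concepts can be exchanged losslessly."""
    return {mapping[k]: v for k, v in record.items() if k in mapping}

cda_record = {"patient/name": "J. Doe", "patient/birthTime": "1970-01-01"}
vmr_record = transform(cda_record, CDA_TO_VMR)
# {'subject/fullName': 'J. Doe', 'subject/dateOfBirth': '1970-01-01'}
```

    Evaluating such a system then amounts to checking how often round-trip transformations preserve the record's content, which is the accuracy figure the abstract reports.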

  4. Persuasion and attitude change in science education

    NASA Astrophysics Data System (ADS)

    Koballa, Thomas R., Jr.

    Many strategies used to induce the occurrence of desirable science-related beliefs, attitudes, and behaviors involve the use of persuasive messages. Science educators need to become acquainted with persuasion in the context of social influence and learning theory to be able to evaluate its usefulness in the science education milieu. Persuasion is the conscious attempt to bring about a jointly developed mental state common to both source and receiver through the use of symbolic cues, and it can be distinguished from other forms of social influence. Propaganda is a type of persuasion directed toward a mass audience. Coercion relies on reinforcement control, whereas persuasion is prompted by information. Brainwashing involves coercive techniques used to obtain cooperation and compliance. Persuasion and instruction are much alike; both require conscious cognitive activity by the recipient and involve communication which includes giving arguments and evidence for the purpose of getting someone to do something or to believe something. Persuasion research is anchored in learning theory. Early efforts were based on information processing. Studies following an information-processing approach focused on the effect of the variables harbored within the question "Who says what in which channel to whom with what effect?" on belief and attitude change. Cognitive processing and social exchange approaches to persuasion represent extensions of information processing. Cognitive processing is concerned specifically with how people personally process the arguments presented in a persuasive message. Social exchange emphasizes the interchange that takes place between the message source and recipient. These approaches seem to be fruitful areas for future persuasion research in science education. Science educators' unfamiliarity with persuasion research stems from the fact that it is largely reported in the social psychology literature and has not been integrated into a framework familiar to educators.

  5. New opportunities of real-world data from clinical routine settings in life-cycle management of drugs: example of an integrative approach in multiple sclerosis.

    PubMed

    Rothenbacher, Dietrich; Capkun, Gorana; Uenal, Hatice; Tumani, Hayrettin; Geissbühler, Yvonne; Tilson, Hugh

    2015-05-01

    The assessment and demonstration of a positive benefit-risk balance of a drug is a life-long process and includes specific data from preclinical, clinical development and post-launch experience. However, new integrative approaches are needed to enrich evidence from clinical trials and sponsor-initiated observational studies with information from multiple additional sources, including registry information and other existing observational data and, more recently, health-related administrative claims and medical records databases. To illustrate the value of this approach, this paper exemplifies such a cross-package approach in the area of multiple sclerosis, also exploring possible analytic strategies when using these multiple sources of information.

  6. Social Information Processing in Preschool Children: Relations to Sociodemographic Risk and Problem Behavior

    PubMed Central

    Ziv, Yair; Sorongon, Alberto

    2011-01-01

    Using a multi-component, process-oriented approach, the links between Social Information Processing in the preschool years and a) sociodemographic risk, and b) behavior problems in preschool, were examined in a community sample of 196 children. Findings provided support for our initial hypotheses that aspects of social information processing in preschool are related to both sociodemographic risk and to behavior problems in preschool. Response evaluation, and in particular, the positive evaluation of an aggressive response, were related to both sociodemographic risk and children’s aggressive behavior and partially mediated the links between sociodemographic risk and aggressive behavior in preschool. PMID:21420102

  7. Mining of relations between proteins over biomedical scientific literature using a deep-linguistic approach.

    PubMed

    Rinaldi, Fabio; Schneider, Gerold; Kaljurand, Kaarel; Hess, Michael; Andronis, Christos; Konstandi, Ourania; Persidis, Andreas

    2007-02-01

    The amount of new discoveries (as published in the scientific literature) in the biomedical area is growing at an exponential rate. This growth makes it very difficult to filter the most relevant results, and thus the extraction of the core information becomes very expensive. Therefore, there is a growing interest in text processing approaches that can deliver selected information from scientific publications, limiting the amount of human intervention normally needed to gather those results. This paper presents and evaluates an approach aimed at automating the process of extracting functional relations (e.g. interactions between genes and proteins) from scientific literature in the biomedical domain. The approach, using a novel dependency-based parser, is based on a complete syntactic analysis of the corpus. We have implemented a state-of-the-art text mining system for biomedical literature, based on a deep-linguistic, full-parsing approach. The results are validated on two different corpora: the manually annotated genomics information access (GENIA) corpus and the automatically annotated Arabidopsis thaliana circadian rhythms (ATCR) corpus. We show how a deep-linguistic approach (contrary to common belief) can be used in a real-world text mining application, offering high-precision relation extraction while at the same time retaining sufficient recall.
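    A toy of the underlying idea, reading relations off a dependency parse, with a hard-coded parse (a real system obtains the parse from a syntactic parser; the verb list and dependency-label names here are merely illustrative conventions):

```python
# Toy dependency-based relation extraction: find subject-verb-object
# triples whose verb denotes a biomedical interaction.

INTERACTION_VERBS = {"activates", "inhibits", "binds"}

def extract_relations(tokens):
    """tokens: list of (index, word, head_index, dep_label).
    Returns (subject, verb, object) triples for interaction verbs."""
    relations = []
    for i, word, _head, _dep in tokens:
        if word in INTERACTION_VERBS:
            subj = next((w for _, w, h, d in tokens
                         if h == i and d == "nsubj"), None)
            obj = next((w for _, w, h, d in tokens
                        if h == i and d == "obj"), None)
            if subj and obj:
                relations.append((subj, word, obj))
    return relations

# Parse of "ProteinA activates GeneB" (head index -1 marks the root):
parse = [(0, "ProteinA", 1, "nsubj"),
         (1, "activates", -1, "root"),
         (2, "GeneB", 1, "obj")]
# extract_relations(parse) -> [("ProteinA", "activates", "GeneB")]
```

    The deep-linguistic approach in the paper goes well beyond this: full parses let the extractor follow long dependency paths (passives, coordination, nominalizations) that shallow pattern matching misses.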

  8. Giftedness and Cultural Accumulation: An Information Processing Perspective

    ERIC Educational Resources Information Center

    Woolcott, Geoff

    2013-01-01

    There appear to be differing approaches, in modern education, to the identification and development of gifted students, but researchers are beginning to find some cohesiveness through approaches that examine giftedness from within broad views of human cognition and behavior. This paper takes such an approach by considering learning and memory as…

  9. 76 FR 72220 - Incorporation of Risk Management Concepts in Regulatory Programs

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-22

    ... and support the adoption of improved designs or processes. \\1\\ A deterministic approach to regulation... longstanding goal to move toward more risk-informed, performance- based approaches in its regulatory programs... regulatory approach that would continue to ensure the safe and secure use of nuclear material. As part of...

  10. Cross-Evaluation of Degree Programmes in Higher Education

    ERIC Educational Resources Information Center

    Kettunen, Juha

    2010-01-01

    Purpose: This study seeks to develop and describe the benchmarking approach of enhancement-led evaluation in higher education and to present a cross-evaluation process for degree programmes. Design/methodology/approach: The benchmarking approach produces useful information for the development of degree programmes based on self-evaluation,…

  11. Performance and cost evaluation of health information systems using micro-costing and discrete-event simulation.

    PubMed

    Rejeb, Olfa; Pilet, Claire; Hamana, Sabri; Xie, Xiaolan; Durand, Thierry; Aloui, Saber; Doly, Anne; Biron, Pierre; Perrier, Lionel; Augusto, Vincent

    2018-06-01

    Innovation and health-care funding reforms have contributed to the deployment of Information and Communication Technology (ICT) to improve patient care. Many health-care organizations consider the application of ICT a crucial key to enhancing health-care management. The purpose of this paper is to provide a methodology to assess the organizational impact of a high-level Health Information System (HIS) on the patient pathway. We propose an integrated HIS performance evaluation approach combining formal modeling using Architecture of Integrated Information Systems (ARIS) models, a micro-costing approach for cost evaluation, and a Discrete-Event Simulation (DES) approach. The methodology is applied to the consultation process for cancer treatment. Simulation scenarios are established to draw conclusions about the impact of the HIS on the patient pathway. We demonstrate that although a high-level HIS lengthens the consultation, the occupation rate of oncologists is lower and the quality of service is higher (through the amount of information accessible during the consultation to formulate the diagnosis). The method also allows determination of the most cost-effective ICT elements for improving care process quality while minimizing costs. The methodology is flexible enough to be applied to other health-care systems.
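    The DES component can be pictured with a minimal single-server queue model (illustrative only; the arrival and service times are invented, and the paper's simulation of the consultation pathway is far more detailed):

```python
# Minimal discrete-event sketch: one clinician serving consultations in
# arrival order, tracking completion times and cumulative busy time.

def simulate(arrivals, durations):
    """arrivals: sorted patient arrival times; durations: consultation
    lengths. Returns (completion_times, total_busy_time)."""
    free_at = 0      # time at which the clinician next becomes free
    busy = 0         # cumulative time spent in consultations
    completions = []
    for arrive, length in zip(arrivals, durations):
        start = max(arrive, free_at)   # wait if the clinician is busy
        free_at = start + length
        busy += length
        completions.append(free_at)
    return completions, busy

done, busy = simulate([0, 5, 10], [4, 4, 4])
occupation_rate = busy / done[-1]   # fraction of the makespan spent consulting
```

    Comparing runs with longer `durations` (a consultation enriched by HIS-provided information) against shorter ones reproduces, in miniature, the occupation-rate comparison described in the abstract.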

  12. A development framework for semantically interoperable health information systems.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state of the art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported in formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  13. COM3/369: Knowledge-based Information Systems: A new approach for the representation and retrieval of medical information

    PubMed Central

    Mann, G; Birkmann, C; Schmidt, T; Schaeffler, V

    1999-01-01

    Introduction Present solutions for the representation and retrieval of medical information from online sources are not very satisfying. Either the retrieval process lacks precision and completeness, or the representation does not support the update and maintenance of the represented information. Most efforts are currently put into improving the combination of search engines and HTML-based documents. However, due to the current shortcomings of methods for natural language understanding, there are clear limitations to this approach. Furthermore, this approach does not solve the maintenance problem. At least medical information exceeding a certain complexity seems to demand approaches that rely on structured knowledge representation and corresponding retrieval mechanisms. Methods Knowledge-based information systems are based on the following fundamental ideas. The representation of information is based on ontologies that define the structure of the domain's concepts and their relations. Views on domain models are defined and represented as retrieval schemata. Retrieval schemata can be interpreted as canonical query types focussing on specific aspects of the provided information (e.g. diagnosis- or therapy-centred views). Based on these retrieval schemata it can be decided which parts of the information in the domain model must be represented explicitly and formalised to support the retrieval process. Propositional logic is used as the representation language. All other information can be represented in a structured but informal way using text, images etc. Layout schemata are used to assign layout information to retrieved domain concepts. Depending on the target environment, HTML or XML can be used. Results Based on this approach, two knowledge-based information systems have been developed. The 'Ophthalmologic Knowledge-based Information System for Diabetic Retinopathy' (OKIS-DR) provides information on diagnoses, findings, examinations, guidelines, and reference images related to diabetic retinopathy. OKIS-DR uses combinations of findings to specify the information that must be retrieved. The second system focuses on nutrition-related allergies and intolerances. Information on a patient's allergies and intolerances is used to retrieve general information on the specified combination of allergies and intolerances. As a special feature, the system generates tables showing food types and products that are or are not tolerated by patients. Evaluation by external experts and user groups showed that the described approach of knowledge-based information systems increases the precision and completeness of knowledge retrieval. Due to the structured and non-redundant representation of information, maintenance and updating of the information are simplified. Both systems are available as WWW-based online knowledge bases and CD-ROMs (cf. http://mta.gsf.de topic: products).
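    A toy sketch of retrieval driven by combinations of findings, in the spirit of OKIS-DR (the finding names and stored texts are invented; the real system works over an ontology with explicit retrieval schemata, not a flat lookup table):

```python
# Knowledge entries indexed by the finding combination that makes them
# relevant; retrieval returns every entry whose combination is present.

KB = {
    frozenset({"microaneurysms"}):
        "info for mild non-proliferative retinopathy",
    frozenset({"microaneurysms", "neovascularization"}):
        "info for proliferative retinopathy",
}

def retrieve(findings):
    """Return all entries whose required finding combination is a subset
    of the findings observed for this patient."""
    present = set(findings)
    return sorted(info for required, info in KB.items()
                  if required <= present)
```

    Because relevance is decided by explicit finding combinations rather than keyword matching, retrieval stays precise and complete by construction, which is the property the evaluation in the abstract highlights.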

  14. A systems engineering perspective on the human-centered design of health information systems.

    PubMed

    Samaras, George M; Horst, Richard L

    2005-02-01

    The discipline of systems engineering, over the past five decades, has used a structured systematic approach to managing the "cradle to grave" development of products and processes. While elements of this approach are typically used to guide the development of information systems that instantiate a significant user interface, it appears to be rare for the entire process to be implemented. In fact, a number of authors have put forth development lifecycle models that are subsets of the classical systems engineering method, but fail to include steps such as incremental hazard analysis and post-deployment corrective and preventative actions. In that most health information systems have safety implications, we argue that the design and development of such systems would benefit by implementing this systems engineering approach in full. Particularly with regard to bringing a human-centered perspective to the formulation of system requirements and the configuration of effective user interfaces, this classical systems engineering method provides an excellent framework for incorporating human factors (ergonomics) knowledge and integrating ergonomists in the interdisciplinary development of health information systems.

  15. Gradient-based multiresolution image fusion.

    PubMed

    Petrović, Vladimir S; Xydeas, Costas S

    2004-02-01

    A novel approach to multiresolution signal-level image fusion is presented for accurately transferring visual information from any number of input image signals, into a single fused image without loss of information or the introduction of distortion. The proposed system uses a "fuse-then-decompose" technique realized through a novel, fusion/decomposition system architecture. In particular, information fusion is performed on a multiresolution gradient map representation domain of image signal information. At each resolution, input images are represented as gradient maps and combined to produce new, fused gradient maps. Fused gradient map signals are processed, using gradient filters derived from high-pass quadrature mirror filters to yield a fused multiresolution pyramid representation. The fused output image is obtained by applying, on the fused pyramid, a reconstruction process that is analogous to that of conventional discrete wavelet transform. This new gradient fusion significantly reduces the amount of distortion artefacts and the loss of contrast information usually observed in fused images obtained from conventional multiresolution fusion schemes. This is because fusion in the gradient map domain significantly improves the reliability of the feature selection and information fusion processes. Fusion performance is evaluated through informal visual inspection and subjective psychometric preference tests, as well as objective fusion performance measurements. Results clearly demonstrate the superiority of this new approach when compared to conventional fusion systems.
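    A 1-D toy of the gradient-domain idea behind this fusion scheme (not the authors' multiresolution pipeline; this sketch keeps only the core step of selecting the stronger gradient and integrating back):

```python
# Gradient-domain fusion in one dimension: difference, select the
# larger-magnitude gradient at each position, then cumulatively sum.

def gradients(signal):
    """Forward differences of a 1-D signal."""
    return [signal[i + 1] - signal[i] for i in range(len(signal) - 1)]

def fuse(signal_a, signal_b):
    """Fuse two equal-length signals by max-magnitude gradient selection."""
    ga, gb = gradients(signal_a), gradients(signal_b)
    fused = [a if abs(a) >= abs(b) else b for a, b in zip(ga, gb)]
    out = [signal_a[0]]            # anchor reconstruction at signal_a's start
    for g in fused:
        out.append(out[-1] + g)    # cumulative sum inverts the differencing
    return out

# Edges from both inputs survive in the fused result:
# fuse([0, 0, 5, 5], [0, 3, 3, 3]) -> [0, 3, 8, 8]
```

    Selecting in the gradient domain preserves the edges (contrast changes) of both inputs, which is why the paper reports fewer artefacts than pixel- or coefficient-domain selection.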

  16. Relatedness-based Multi-Entity Summarization

    PubMed Central

    Gunaratna, Kalpa; Yazdavar, Amir Hossein; Thirunarayan, Krishnaprasad; Sheth, Amit; Cheng, Gong

    2017-01-01

    Representing world knowledge in a machine processable format is important as entities and their descriptions have fueled tremendous growth in knowledge-rich information processing platforms, services, and systems. Prominent applications of knowledge graphs include search engines (e.g., Google Search and Microsoft Bing), email clients (e.g., Gmail), and intelligent personal assistants (e.g., Google Now, Amazon Echo, and Apple’s Siri). In this paper, we present an approach that can summarize facts about a collection of entities by analyzing their relatedness in preference to summarizing each entity in isolation. Specifically, we generate informative entity summaries by selecting: (i) inter-entity facts that are similar and (ii) intra-entity facts that are important and diverse. We employ a constrained knapsack problem solving approach to efficiently compute entity summaries. We perform both qualitative and quantitative experiments and demonstrate that our approach yields promising results compared to two other stand-alone state-of-the-art entity summarization approaches. PMID:29051696
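    The fact-selection step can be pictured as a small 0/1 knapsack. The sketch below uses invented fact names, importance scores, and costs, and omits the paper's relatedness and diversity terms, keeping only the budgeted selection core:

```python
# 0/1-knapsack selection of entity facts: maximize total importance
# subject to a total-cost budget (dynamic programming over budgets).

def select_facts(facts, budget):
    """facts: list of (name, importance, cost); returns the set of fact
    names with maximum total importance whose total cost fits the budget."""
    dp = [(0, ())] * (budget + 1)      # dp[c] = (best value, chosen facts)
    for name, importance, cost in facts:
        for c in range(budget, cost - 1, -1):   # downward: 0/1, not unbounded
            value = dp[c - cost][0] + importance
            if value > dp[c][0]:
                dp[c] = (value, dp[c - cost][1] + (name,))
    return set(dp[budget][1])

facts = [("type", 3, 2), ("spouse", 2, 1), ("height", 1, 2)]
chosen = select_facts(facts, budget=3)   # {"type", "spouse"}
```

    In the paper's constrained variant, the importance scores themselves are shaped by inter-entity similarity and intra-entity diversity, so the knapsack jointly summarizes the whole entity collection rather than each entity in isolation.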

  17. Rapid identification of kidney cyst mutations by whole exome sequencing in zebrafish

    PubMed Central

    Ryan, Sean; Willer, Jason; Marjoram, Lindsay; Bagwell, Jennifer; Mankiewicz, Jamie; Leshchiner, Ignaty; Goessling, Wolfram; Bagnat, Michel; Katsanis, Nicholas

    2013-01-01

    Forward genetic approaches in zebrafish have provided invaluable information about developmental processes. However, the relative difficulty of mapping and isolating mutations has limited the number of new genetic screens. Recent improvements in the annotation of the zebrafish genome coupled to a reduction in sequencing costs prompted the development of whole genome and RNA sequencing approaches for gene discovery. Here we describe a whole exome sequencing (WES) approach that allows rapid and cost-effective identification of mutations. We used our WES methodology to isolate four mutations that cause kidney cysts; we identified novel alleles in two ciliary genes as well as two novel mutants. The WES approach described here does not require specialized infrastructure or training and is therefore widely accessible. This methodology should thus help facilitate genetic screens and expedite the identification of mutants that can inform basic biological processes and the causality of genetic disorders in humans. PMID:24130329

  18. Ultrasonic inspection of carbon fiber reinforced plastic by means of sample-recognition methods

    NASA Technical Reports Server (NTRS)

    Bilgram, R.

    1985-01-01

    In the case of carbon fiber reinforced plastic (CFRP), it has not yet been possible to detect nonlocal defects and aging-related material degradation with the aid of nondestructive inspection methods. An approach for overcoming these difficulties involves extending the ultrasonic inspection procedure through the use of signal processing and sample recognition methods. The basic concept is the realization that the ultrasonic signal contains information about the medium which is not utilized in conventional ultrasonic inspection. However, the analytical study of the physical processes involved is very complex. For this reason, an empirical approach is employed to make use of this previously unutilized information. This approach uses reference signals obtained from material specimens of different quality. The implementation of these concepts for the ultrasonic inspection of CFRP laminates is discussed.

  19. Using templates and linguistic patterns to define process performance indicators

    NASA Astrophysics Data System (ADS)

    del-Río-Ortega, Adela; Resinas, Manuel; Durán, Amador; Ruiz-Cortés, Antonio

    2016-02-01

    Process performance management (PPM) aims at measuring, monitoring and analysing the performance of business processes (BPs), in order to check the achievement of strategic and operational goals and to support decision-making for their optimisation. PPM is based on process performance indicators (PPIs), so having an appropriate definition of them is crucial. One of the main problems of PPIs definition is to express them in an unambiguous, complete, understandable, traceable and verifiable manner. In practice, PPIs are defined informally - usually in ad hoc, natural language, with its well-known problems - or they are defined from an implementation perspective, hardly understandable to non-technical people. In order to solve this problem, in this article we propose a novel approach to improve the definition of PPIs using templates and linguistic patterns. This approach promotes reuse, reduces both ambiguities and missing information, is understandable to all stakeholders and maintains traceability with the process model. Furthermore, it enables the automated processing of PPI definitions by its straightforward translation into the PPINOT metamodel, allowing the gathering of the required information for their computation as well as the analysis of the relationships between them and with BP elements.
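    A minimal sketch of the template idea (the field set below is an invented subset of what such templates carry, and the rendered sentence is only one possible linguistic pattern):

```python
from dataclasses import dataclass

# A template-based PPI definition: fixed fields replace free-form prose,
# and a fixed linguistic pattern renders them uniformly.

@dataclass
class PPI:
    identifier: str
    goal: str        # strategic/operational goal the PPI is linked to
    measure: str     # how the value is computed over process instances
    target: str      # the value that counts as achievement
    scope: str       # which process instances are considered

def render(ppi: PPI) -> str:
    """Emit the definition through a fixed linguistic pattern, keeping
    wording uniform and hence less ambiguous than ad hoc text."""
    return (f"The PPI {ppi.identifier} is defined as {ppi.measure}, "
            f"with target {ppi.target}.")

ppi = PPI("PPI-1", "shorten incident handling",
          "the average of (closed - opened) over all incidents",
          "<= 2 working days", "incidents registered in 2023")
```

    Because every PPI carries the same named fields, definitions become machine-processable, which is exactly what enables the straightforward translation into the PPINOT metamodel mentioned above.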

  20. Process Mining-Based Method of Designing and Optimizing the Layouts of Emergency Departments in Hospitals.

    PubMed

    Rismanchian, Farhood; Lee, Young Hoon

    2017-07-01

    This article proposes an approach to help designers analyze complex care processes and identify the optimal layout of an emergency department (ED) considering several objectives simultaneously. These objectives include minimizing the distances traveled by patients, maximizing design preferences, and minimizing the relocation costs. Rising demand for healthcare services leads to increasing demand for new hospital buildings as well as renovating existing ones. Operations management techniques have been successfully applied in both manufacturing and service industries to design more efficient layouts. However, high complexity of healthcare processes makes it challenging to apply these techniques in healthcare environments. Process mining techniques were applied to address the problem of complexity and to enhance healthcare process analysis. Process-related information, such as information about the clinical pathways, was extracted from the information system of an ED. A goal programming approach was then employed to find a single layout that would simultaneously satisfy several objectives. The layout identified using the proposed method improved the distances traveled by noncritical and critical patients by 42.2% and 47.6%, respectively, and minimized the relocation costs. This study has shown that an efficient placement of the clinical units yields remarkable improvements in the distances traveled by patients.
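    The layout step can be pictured as assigning clinical units to locations so as to minimize flow-weighted travel distance. The brute-force toy below uses invented units, flows, and distances; the paper instead solves a goal program over several objectives, with the flows mined from the ED's information system:

```python
from itertools import permutations

# Exhaustive unit-to-slot assignment minimizing total weighted travel
# distance (feasible only for tiny instances, but shows the objective).

def best_layout(units, slots, flow, dist):
    """flow: {(unit_a, unit_b): patients moving between them};
    dist: matrix of distances between slots.
    Returns (assignment, cost) for the cheapest assignment found."""
    best, best_cost = None, float("inf")
    for perm in permutations(slots):
        pos = dict(zip(units, perm))
        cost = sum(w * dist[pos[a]][pos[b]] for (a, b), w in flow.items())
        if cost < best_cost:
            best, best_cost = pos, cost
    return best, best_cost

# Heavy triage->xray traffic pulls those two units together:
layout, cost = best_layout(
    ["triage", "xray", "lab"], [0, 1, 2],
    {("triage", "xray"): 10, ("triage", "lab"): 1},
    [[0, 1, 2], [1, 0, 1], [2, 1, 0]])
```

    Goal programming generalizes this single objective: each goal (distance, design preference, relocation cost) gets a target and a deviation variable, and the solver minimizes the weighted deviations instead of one raw cost.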

  1. Negation in Context: A Functional Approach to Suppression

    ERIC Educational Resources Information Center

    Giora, Rachel; Fein, Ofer; Aschkenazi, Keren; Alkabets-Zlozover, Inbar

    2007-01-01

    Three experiments show that, contrary to the current view, comprehenders do not unconditionally deactivate information marked by negation. Instead, they discard negated information when it is functionally motivated. In Experiment 1, comprehenders discarded negated concepts when cued by a topic shift to dampen recently processed information.…

  2. Measuring Information Technology Performance: Operational Efficiency and Operational Effectiveness

    ERIC Educational Resources Information Center

    Moore, Annette G.

    2012-01-01

    This dissertation provides a practical approach for measuring operational efficiency and operational effectiveness for IT organizations introducing the ITIL process framework. The intent of the study was to assist Chief Information Officers (CIOs) in explaining the impact of introducing the Information Technology Infrastructure Library (ITIL)…

  3. A Multi-Scale, Integrated Approach to Representing Watershed Systems

    NASA Astrophysics Data System (ADS)

    Ivanov, Valeriy; Kim, Jongho; Fatichi, Simone; Katopodes, Nikolaos

    2014-05-01

    Understanding and predicting process dynamics across a range of scales are fundamental challenges for basic hydrologic research and practical applications. This is particularly true when larger-spatial-scale processes, such as surface-subsurface flow and precipitation, need to be translated to fine space-time-scale dynamics of processes, such as channel hydraulics and sediment transport, that are often of primary interest. Inferring characteristics of fine-scale processes from uncertain coarse-scale climate projection information poses additional challenges. We have developed an integrated model simulating hydrological processes, flow dynamics, erosion, and sediment transport, tRIBS+VEGGIE-FEaST. The model aims to take advantage of the current wealth of data representing watershed topography, vegetation, soil, and land use, and to explore the hydrological effects of physical factors and their feedback mechanisms over a range of scales. We illustrate how the modeling system connects the precipitation-runoff partitioning process to the dynamics of flow, erosion, and sedimentation, and how the soil's substrate condition can impact the latter processes, resulting in a non-unique response. We further illustrate an approach to using downscaled climate-change information with a process-based model to infer the moments of hydrologic variables under future climate conditions and to explore the impact of climate-information uncertainty.

  4. A Corpus-Based Discourse Information Analysis of Chinese EFL Learners' Autonomy in Legal Case Brief Writing

    ERIC Educational Resources Information Center

    Chen, Jinshi

    2017-01-01

    Legal case brief writing is pedagogically important yet insufficiently discussed for Chinese EFL learners majoring in law. Based on process genre approach and discourse information theory (DIT), the present study designs a corpus-based analytical model for Chinese EFL learners' autonomy in legal case brief writing and explores the process of case…

  5. Examining Factors Associated with (In)Stability in Social Information Processing among Urban School Children: A Latent Transition Analytic Approach

    ERIC Educational Resources Information Center

    Goldweber, Asha; Bradshaw, Catherine P.; Goodman, Kimberly; Monahan, Kathryn; Cooley-Strickland, Michele

    2011-01-01

    There is compelling evidence for the role of social information processing (SIP) in aggressive behavior. However, less is known about factors that influence stability versus instability in patterns of SIP over time. Latent transition analysis was used to identify SIP patterns over one year and examine how community violence exposure, aggressive…

  6. Information systems - Issues in global habitability

    NASA Technical Reports Server (NTRS)

    Norman, S. D.; Brass, J. A.; Jones, H.; Morse, D. R.

    1984-01-01

    The present investigation is concerned with fundamental information-related issues that arise in an interdisciplinary approach to questions of global habitability. Information system problems and issues are illustrated with the aid of an example involving biochemical cycling and biochemical productivity. The estimation of net primary production (NPP) is discussed as an important consideration in the overall global habitability issue. The NPP model requires three types of data, related to meteorological information, a land surface inventory, and the vegetation structure. Approaches for obtaining and processing these data are discussed. Attention is given to user requirements, information system requirements, workstations, network communications, hardware/software access, and data management.

  7. Toward theoretical understanding of the fertility preservation decision-making process: examining information processing among young women with cancer.

    PubMed

    Hershberger, Patricia E; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer

    2013-01-01

    Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. The purpose of this article is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Using a grounded theory approach, 27 women with cancer participated in individual, semistructured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by 5 dimensions within the Contemplate phase of the decision-making process framework. In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Better understanding of theoretical underpinnings surrounding women's information processes can facilitate decision support and improve clinical care.

  8. Method Engineering: A Service-Oriented Approach

    NASA Astrophysics Data System (ADS)

    Cauvet, Corine

    In the past, a large variety of methods have been published, ranging from very generic frameworks to methods for specific information systems. Method Engineering has emerged as a research discipline for designing, constructing and adapting methods for Information Systems development. Several approaches have been proposed as paradigms in method engineering: the meta-modeling approach provides means for building methods by instantiation, while the component-based approach supports the development of methods using modularization constructs such as method fragments, method chunks and method components. This chapter presents an approach (SO2M) for method engineering based on the service paradigm. We consider services as autonomous computational entities that are self-describing, self-configuring and self-adapting. They can be described, published, discovered and dynamically composed for processing a consumer's demand (a developer's requirement). The method service concept is proposed to capture a development process fragment for achieving a goal. Goal orientation in service specification and the principle of service dynamic composition support method construction and method adaptation to different development contexts.

  9. An Analysis of Machine- and Human-Analytics in Classification.

    PubMed

    Tam, Gary K L; Kothari, Vivek; Chen, Min

    2017-01-01

    In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications to a common theoretic model of soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.
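
    The information-theoretic criterion underlying the decision-tree models in both case studies can be illustrated with a toy information-gain computation. This is a generic sketch of the standard criterion, not the authors' model; the class labels and splits are hypothetical:

```python
import math

def entropy(labels):
    """Shannon entropy of a label multiset, in bits."""
    n = len(labels)
    counts = {}
    for c in labels:
        counts[c] = counts.get(c, 0) + 1
    return -sum((k / n) * math.log2(k / n) for k in counts.values())

def information_gain(parent, left, right):
    """Entropy reduction from splitting `parent` into `left`/`right`:
    the quantity a machine-learning approach maximises explicitly, and
    one a visual-analytics user judges implicitly on parallel coordinates."""
    n = len(parent)
    weighted = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - weighted

parent = ['cat'] * 4 + ['dog'] * 4
pure_split = information_gain(parent, ['cat'] * 4, ['dog'] * 4)   # 1.0 bit
bad_split = information_gain(parent, ['cat', 'cat', 'dog', 'dog'],
                             ['cat', 'cat', 'dog', 'dog'])        # 0.0 bits
```

    A split that perfectly separates the classes recovers the full bit of parent entropy; a split that preserves the class mixture gains nothing.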

  10. Medicare Part D Beneficiaries' Plan Switching Decisions and Information Processing.

    PubMed

    Han, Jayoung; Urmie, Julie

    2017-03-01

    Medicare Part D beneficiaries tend not to switch plans despite the government's efforts to engage beneficiaries in the plan switching process. Understanding current and alternative plan features is a necessary step to make informed plan switching decisions. This study explored beneficiaries' plan switching using a mixed-methods approach, with a focus on the concept of information processing. We found large variation in beneficiary comprehension of plan information among both switchers and nonswitchers. Knowledge about alternative plans was especially poor, with only about half of switchers and 2 in 10 nonswitchers being well informed about plans other than their current plan. We also found that helpers had a prominent role in plan decision making: nearly twice as many switchers as nonswitchers worked with helpers for their plan selection. Our study suggests that easier access to helpers as well as helpers' extensive involvement in the decision-making process promote informed plan switching decisions.

  11. Providing Enterprise Information Services for Multinational Interoperability - The EIM Approach

    DTIC Science & Technology

    2005-06-01

    Application-layer focus areas: enterprise content management, workflow, business processes, and federated search, with EAI components at the integration layer linking the information silos of multiple nations. [Only slide fragments of this abstract survive, listing federated search, document processing and archiving, and workflow processing, plus a truncated reference to "Perspective on Multinational Information Sharing".]

  12. FPGA implementation of sparse matrix algorithm for information retrieval

    NASA Astrophysics Data System (ADS)

    Bojanic, Slobodan; Jevtic, Ruzica; Nieto-Taladriz, Octavio

    2005-06-01

    Information text data retrieval requires a tremendous amount of processing time because of the size of the data and the complexity of information retrieval algorithms. In this paper a solution to this problem is proposed via hardware-supported information retrieval algorithms. Reconfigurable computing accommodates frequent hardware modifications through its tailorable hardware and exploits parallelism for a given application through reconfigurable and flexible hardware units; the degree of parallelism can be tuned to the data. In this work we implemented the standard BLAS (basic linear algebra subprogram) sparse matrix algorithm named Compressed Sparse Row (CSR), which has been shown to be more efficient in storage space and query-processing time than other sparse matrix algorithms for information retrieval applications. Although the inverted index has been treated as the de facto standard for information retrieval for years, an alternative approach that stores the index of a text collection in a sparse matrix structure has been gaining attention. This approach performs query processing using sparse matrix-vector multiplication and, due to parallelization, achieves substantial efficiency gains over the sequential inverted index. The parallel implementations of the information retrieval kernel presented in this work target the Virtex II Field Programmable Gate Array (FPGA) board from Xilinx. A recent development in scientific applications is the use of FPGAs to achieve high-performance results. Computational results are compared to implementations on other platforms. The design achieves a high level of parallelism for the overall function while retaining highly optimised hardware within the processing unit.
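
    The CSR storage scheme and the sparse matrix-vector product at the heart of the query-processing kernel can be sketched in pure Python. This is a software illustration of the data structure only, not the FPGA design; the term-document matrix is hypothetical:

```python
# CSR stores only nonzeros (values), their column indices (col_idx),
# and per-row offsets into those arrays (row_ptr).

def dense_to_csr(matrix):
    """Convert a dense row-major matrix to (values, col_idx, row_ptr)."""
    values, col_idx, row_ptr = [], [], [0]
    for row in matrix:
        for j, v in enumerate(row):
            if v != 0:
                values.append(v)
                col_idx.append(j)
        row_ptr.append(len(values))
    return values, col_idx, row_ptr

def csr_matvec(values, col_idx, row_ptr, x):
    """y = A @ x, touching only the stored nonzeros of each row."""
    y = []
    for i in range(len(row_ptr) - 1):
        s = 0
        for k in range(row_ptr[i], row_ptr[i + 1]):
            s += values[k] * x[col_idx[k]]
        y.append(s)
    return y

# Hypothetical term-document matrix (rows: documents, columns: terms).
A = [[1, 0, 2],
     [0, 0, 3],
     [4, 5, 0]]
query = [1, 0, 1]  # terms present in the query
vals, cols, ptr = dense_to_csr(A)
scores = csr_matvec(vals, cols, ptr, query)  # -> [3, 3, 4]
```

    Each row's dot product is independent of the others, which is the parallelism a hardware implementation can exploit.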

  13. Efficient terrestrial laser scan segmentation exploiting data structure

    NASA Astrophysics Data System (ADS)

    Mahmoudabadi, Hamid; Olsen, Michael J.; Todorovic, Sinisa

    2016-09-01

    New technologies such as lidar enable the rapid collection of massive datasets to model a 3D scene as a point cloud. However, while hardware technology continues to advance, processing 3D point clouds into informative models remains complex and time consuming. A common approach to increase processing efficiency is to segment the point cloud into smaller sections. This paper proposes a novel approach for point cloud segmentation using computer vision algorithms to analyze panoramic representations of individual laser scans. These panoramas can be quickly created using an inherent neighborhood structure established during the scanning process, which scans at fixed angular increments in a cylindrical or spherical coordinate system. In the proposed approach, a selected image segmentation algorithm is applied to several input layers exploiting this angular structure, including laser intensity, range, normal vectors, and color information. These segments are then mapped back to the 3D point cloud so that modeling can be completed more efficiently. This approach does not depend on pre-defined mathematical models or on setting parameters for them. Unlike common geometric point cloud segmentation methods, the proposed method employs the colorimetric and intensity data as an additional source of information. The proposed algorithm is demonstrated on several datasets encompassing a variety of scenes and objects. Results show a very high perceptual (visual) quality of segmentation and thereby the feasibility of the proposed algorithm. The proposed method is also more efficient than Random Sample Consensus (RANSAC), a common approach for point cloud segmentation.
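
    The panoramic representation the method relies on amounts to projecting each point through the scanner's spherical geometry onto an image grid (azimuth to column, elevation to row). A minimal sketch under the assumption of fixed angular increments covering the full sphere; the resolution and the choice of a range layer are hypothetical:

```python
import math

def point_to_pixel(x, y, z, width, height):
    """Map a 3D point to panorama pixel coordinates via spherical angles."""
    azimuth = math.atan2(y, x)                    # [-pi, pi] -> column
    elevation = math.atan2(z, math.hypot(x, y))   # [-pi/2, pi/2] -> row
    col = int((azimuth + math.pi) / (2 * math.pi) * (width - 1))
    row = int((math.pi / 2 - elevation) / math.pi * (height - 1))
    return row, col

def make_range_panorama(points, width=360, height=180):
    """Rasterise a point cloud into a range image; 2D segmentation then
    runs on this layer (and analogous intensity/colour/normal layers)."""
    panorama = [[0.0] * width for _ in range(height)]
    for x, y, z in points:
        row, col = point_to_pixel(x, y, z, width, height)
        panorama[row][col] = math.sqrt(x * x + y * y + z * z)  # range
    return panorama

pts = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
pano = make_range_panorama(pts)
```

    In a real scan the row/column indices come directly from the instrument's angular increments, so no projection search is needed; image segments found in `pano` are mapped back to the points that filled each pixel.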

  14. Management Information System Based on the Balanced Scorecard

    ERIC Educational Resources Information Center

    Kettunen, Juha; Kantola, Ismo

    2005-01-01

    Purpose: This study seeks to describe the planning and implementation in Finland of a campus-wide management information system using a rigorous planning methodology. Design/methodology/approach: The structure of the management information system is planned on the basis of the management process, where strategic management and the balanced…

  15. Biochemistry of the Envenomation Response--A Generator Theme for Interdisciplinary Integration

    ERIC Educational Resources Information Center

    Montagna, Erik; Guerreiro, Juliano R.; Torres, Bayardo B.

    2010-01-01

    The understanding of complex physiological processes requires information from many different areas of knowledge. To meet this interdisciplinary scenario, the ability of integrating and articulating information is demanded. The difficulty of such approach arises because, more often than not, information is fragmented through under graduation…

  16. A Computer-Assisted Approach for Conducting Information Technology Applied Instructions

    ERIC Educational Resources Information Center

    Chu, Hui-Chun; Hwang, Gwo-Jen; Tsai, Pei Jin; Yang, Tzu-Chi

    2009-01-01

    The growing popularity of computer and network technologies has attracted researchers to investigate the strategies and the effects of information technology applied instructions. Previous research has not only demonstrated the benefits of applying information technologies to the learning process, but has also revealed the difficulty of applying…

  17. The methodology of database design in organization management systems

    NASA Astrophysics Data System (ADS)

    Chudinov, I. L.; Osipova, V. V.; Bobrova, Y. V.

    2017-01-01

    The paper describes a unified methodology of database design for management information systems. Designing the conceptual information model for the domain area is the most important and labor-intensive stage in database design. Based on the proposed integrated approach to design, the main principles of developing the conceptual information model and relational databases are provided, and users' information needs are considered. According to the methodology, the process of designing the conceptual information model includes three basic stages, which are defined in detail. Finally, the article describes the process of applying the results of analyzing users' information needs and the rationale for the use of classifiers.

  18. Neural Information Processing in Cognition: We Start to Understand the Orchestra, but Where is the Conductor?

    PubMed Central

    Palm, Günther

    2016-01-01

    Research in neural information processing has been successful in the past, providing useful approaches both to practical problems in computer science and to computational models in neuroscience. Recent developments in the area of cognitive neuroscience present new challenges for a computational or theoretical understanding asking for neural information processing models that fulfill criteria or constraints from cognitive psychology, neuroscience and computational efficiency. The most important of these criteria for the evaluation of present and future contributions to this new emerging field are listed at the end of this article. PMID:26858632

  19. Bayesian or Laplacien inference, entropy and information theory and information geometry in data and signal processing

    NASA Astrophysics Data System (ADS)

    Mohammad-Djafari, Ali

    2015-01-01

    The main object of this tutorial article is first to review the main inference tools using Bayesian approach, Entropy, Information theory and their corresponding geometries. This review is focused mainly on the ways these tools have been used in data, signal and image processing. After a short introduction of the different quantities related to the Bayes rule, the entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, Fisher information, we will study their use in different fields of data and signal processing such as: entropy in source separation, Fisher information in model order selection, different Maximum Entropy based methods in time series spectral estimation and finally, general linear inverse problems.
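
    For discrete distributions, the quantities the tutorial reviews (entropy, the Maximum Entropy Principle, and the Kullback-Leibler divergence) reduce to a few lines. A minimal sketch with hypothetical example distributions:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (in nats)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence KL(p || q) = sum p_i log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

uniform = [0.25] * 4            # the maximum-entropy distribution on 4 outcomes
peaked = [0.7, 0.1, 0.1, 0.1]   # hypothetical, more informative distribution

h_max = entropy(uniform)                 # log(4), the maximum for 4 outcomes
kl = kl_divergence(peaked, uniform)      # > 0; zero iff the two are equal
```

    The Maximum Entropy Principle picks, among all distributions consistent with known constraints, the one maximising `entropy`; with no constraints beyond normalisation that is `uniform`.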

  20. Patent information retrieval: approaching a method and analysing nanotechnology patent collaborations.

    PubMed

    Ozcan, Sercan; Islam, Nazrul

    2017-01-01

    Many challenges still remain in the processing of explicit technological knowledge documents such as patents. Given the limitations and drawbacks of the existing approaches, this research sets out to develop an improved method for searching patent databases and extracting patent information, to increase the efficiency and reliability of the nanotechnology patent information retrieval process, and to empirically analyse patent collaboration. A tech-mining method was applied and the subsequent analysis was performed using Thomson data analyser software. The findings show that nations such as Korea and Japan are highly collaborative in sharing technological knowledge across academic and corporate organisations within their national boundaries, and China presents, in some cases, a great illustration of effective patent collaboration and co-inventorship. This study also analyses key patent strengths by country, organisation and technology.

  1. Representation and Exchange of Knowledge as a Basis of Information Processes. Proceedings of the International Research Forum in Information Science (5th, Heidelberg, West Germany, September 5-7, 1983).

    ERIC Educational Resources Information Center

    Dietschmann, Hans, Ed.

    This 22-paper collection addresses a variety of issues related to representation and transfer of knowledge. Individual papers include an explanation of the usefulness of general scientific models versus case-specific approaches and a discussion of different empirical approaches to the general problem of knowledge representation for information…

  2. To what extent information technology can be really useful in education?

    NASA Astrophysics Data System (ADS)

    Kalashnikov, N. P.; Olchak, A. S.; Scherbachev, O. V.

    2017-01-01

    The authors consider particular cases in which the generally beneficial introduction of information technologies into the educational process runs up against certain psychological limitations, turning its benefits into losses. The evolution of the approach to education, from traditional to IT-based, is traced. Examples are provided in which an exaggerated IT component of the educational process leads to evident losses in both the professional education and the general cultural background of students. The authors discuss compromise solutions between conservative and modernistic educational approaches. In the authors' opinion, a healthy portion of traditional, conservative educational technologies may bring only benefits to the newer generations of the globalized IT society.

  3. Arts-Based Learning: A New Approach to Nursing Education Using Andragogy.

    PubMed

    Nguyen, Megan; Miranda, Joyal; Lapum, Jennifer; Donald, Faith

    2016-07-01

    Learner-oriented strategies focusing on learning processes are needed to prepare nursing students for complex practice situations. An arts-based learning approach uses art to nurture cognitive and emotional learning. Knowles' theory of andragogy aims to develop the skill of learning and can inform the process of implementing arts-based learning. This article explores the use and evaluation of andragogy-informed arts-based learning for teaching nursing theory at the undergraduate level. Arts-based learning activities were implemented and then evaluated by students and instructors using anonymous questionnaires. Most students reported that the activities promoted learning. All instructors indicated an interest in integrating arts-based learning into the curricula. Facilitators and barriers to mainstreaming arts-based learning were highlighted. Findings stimulate implications for prospective research and education. Findings suggest that arts-based learning approaches enhance learning by supporting deep inquiry and different learning styles. Further exploration of andragogy-informed arts-based learning in nursing and other disciplines is warranted. [J Nurs Educ. 2016;55(7):407-410.]. Copyright 2016, SLACK Incorporated.

  4. Approaches to informed consent for hypothesis-testing and hypothesis-generating clinical genomics research.

    PubMed

    Facio, Flavia M; Sapp, Julie C; Linn, Amy; Biesecker, Leslie G

    2012-10-10

    Massively-parallel sequencing (MPS) technologies create challenges for informed consent of research participants given the enormous scale of the data and the wide range of potential results. We propose that the consent process in these studies be based on whether they use MPS to test a hypothesis or to generate hypotheses. To demonstrate the differences in these approaches to informed consent, we describe the consent processes for two MPS studies. The purpose of our hypothesis-testing study is to elucidate the etiology of rare phenotypes using MPS. The purpose of our hypothesis-generating study is to test the feasibility of using MPS to generate clinical hypotheses, and to approach the return of results as an experimental manipulation. Issues to consider in both designs include: volume and nature of the potential results, primary versus secondary results, return of individual results, duty to warn, length of interaction, target population, and privacy and confidentiality. The categorization of MPS studies as hypothesis-testing versus hypothesis-generating can help to clarify the issue of so-called incidental or secondary results for the consent process, and aid the communication of the research goals to study participants.

  5. Image gathering, coding, and processing: End-to-end optimization for efficient and robust acquisition of visual information

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.

    1990-01-01

    Researchers are concerned with the end-to-end performance of image gathering, coding, and processing. The applications range from high-resolution television to vision-based robotics, wherever the resolution, efficiency and robustness of visual information acquisition and processing are critical. For the presentation at this workshop, it is convenient to divide research activities into the following two overlapping areas: The first is the development of focal-plane processing techniques and technology to effectively combine image gathering with coding, with an emphasis on low-level vision processing akin to the retinal processing in human vision. The approach includes the familiar Laplacian pyramid, the new intensity-dependent spatial summation, and parallel sensing/processing networks. Three-dimensional image gathering is attained by combining laser ranging with sensor-array imaging. The second is the rigorous extension of information theory and optimal filtering to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing.
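
    The Laplacian pyramid mentioned above decomposes a signal into a coarse approximation plus a detail residual, with exact reconstruction. A 1D toy sketch for one pyramid level; the [1, 2, 1]/4 smoothing filter and the test signal are illustrative choices, not the focal-plane hardware:

```python
def downsample(signal):
    """Blur with a [1, 2, 1]/4 kernel, then keep every other sample."""
    n = len(signal)
    blurred = []
    for i in range(n):
        left = signal[max(0, i - 1)]
        right = signal[min(n - 1, i + 1)]
        blurred.append((left + 2 * signal[i] + right) / 4)
    return blurred[::2]

def upsample(signal, n):
    """Linearly interpolate the coarse signal back to length n."""
    out = []
    for i in range(n):
        pos = i / 2
        lo = min(int(pos), len(signal) - 1)
        hi = min(lo + 1, len(signal) - 1)
        frac = pos - int(pos)
        out.append((1 - frac) * signal[lo] + frac * signal[hi])
    return out

def laplacian_level(signal):
    """One pyramid level: coarse approximation + detail residual.
    By construction, signal == detail + upsample(coarse)."""
    coarse = downsample(signal)
    detail = [s - u for s, u in zip(signal, upsample(coarse, len(signal)))]
    return coarse, detail

sig = [0.0, 1.0, 4.0, 9.0, 16.0, 25.0, 36.0, 49.0]
coarse, detail = laplacian_level(sig)
reconstructed = [d + u for d, u in zip(detail, upsample(coarse, len(sig)))]
```

    Recursing on `coarse` yields the full pyramid; the detail bands carry the low-level, retina-like edge information at each scale.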

  6. 23 CFR 630.1008 - State-level processes and procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., data and information resources, training, and periodic evaluation enable a systematic approach for... management procedures. States should develop and implement systematic procedures to assess work zone impacts... practices and State processes and procedures. (e) Process review. In order to assess the effectiveness of...

  7. Target Information Processing: A Joint Decision and Estimation Approach

    DTIC Science & Technology

    2012-03-29

    Ground targets (track-before-detect) using computer cluster and graphics processing unit. Estimation and filtering theory is one of the most important...

  8. Demodulation processes in auditory perception

    NASA Astrophysics Data System (ADS)

    Feth, Lawrence L.

    1994-08-01

    The long range goal of this project is the understanding of human auditory processing of information conveyed by complex, time-varying signals such as speech, music or important environmental sounds. Our work is guided by the assumption that human auditory communication is a 'modulation-demodulation' process. That is, we assume that sound sources produce a complex stream of sound pressure waves with information encoded as variations (modulations) of the signal amplitude and frequency. The listener's task is then one of demodulation. Much of past psychoacoustics work has been based on what we characterize as 'spectrum picture processing.' Complex sounds are Fourier analyzed to produce an amplitude-by-frequency 'picture' and the perception process is modeled as if the listener were analyzing the spectral picture. This approach leads to studies such as 'profile analysis' and the power-spectrum model of masking. Our approach leads us to investigate time-varying, complex sounds. We refer to them as dynamic signals and we have developed auditory signal processing models to help guide our experimental work.
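
    The modulation-demodulation view can be made concrete with a toy amplitude-demodulation example: rectify and smooth to recover a slowly varying envelope from a modulated carrier. All parameters are hypothetical, and the envelope detector is a crude stand-in, not the authors' auditory model:

```python
import math

def am_signal(duration=1.0, fs=2000, fc=200.0, fm=5.0):
    """Hypothetical amplitude-modulated tone: a 200 Hz carrier whose
    envelope varies at 5 Hz, standing in for a 'dynamic' signal."""
    n = int(duration * fs)
    t = [i / fs for i in range(n)]
    env = [1.0 + 0.5 * math.sin(2 * math.pi * fm * ti) for ti in t]
    sig = [e * math.sin(2 * math.pi * fc * ti) for e, ti in zip(env, t)]
    return sig, env

def demodulate(signal, window=21):
    """Crude envelope detector: full-wave rectify, then smooth with a
    moving average whose window spans several carrier cycles."""
    rect = [abs(s) for s in signal]
    half = window // 2
    out = []
    for i in range(len(rect)):
        lo, hi = max(0, i - half), min(len(rect), i + half + 1)
        out.append(sum(rect[lo:hi]) / (hi - lo))
    return out

sig, true_env = am_signal()
recovered = demodulate(sig)
```

    The recovered trace rises and falls with the true 5 Hz envelope (scaled by the mean of the rectified carrier, 2/pi), even though a short-time spectral 'picture' of the carrier alone would look nearly constant.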

  9. Information accountability and usability: are there any connections?

    PubMed

    Sahama, Tony; Kushniruk, Andre; Kuwata, Shigeki

    2013-01-01

    Availability of health information is rapidly increasing and the expansion and proliferation of health information is inevitable. The Electronic Healthcare Record, Electronic Medical Record and Personal Health Record are at the core of this trend and are required for appropriate and practicable exchange and sharing of health information. However, it is becoming increasingly recognized that it is essential to preserve patient privacy and information security when utilising sensitive information for clinical, management and administrative processes. Furthermore, the usability of emerging healthcare applications is also becoming a growing concern. This paper proposes a novel approach for integrating consideration of information accountability with a perspective from usability engineering that can be applied when developing healthcare information technology applications. A social networking user case in the healthcare information exchange will be presented in the context of our approach.

  10. A comparative psychophysical approach to visual perception in primates.

    PubMed

    Matsuno, Toyomi; Fujita, Kazuo

    2009-04-01

    Studies on the visual processing of primates, which have well developed visual systems, provide essential information about the perceptual bases of their higher-order cognitive abilities. Although the mechanisms underlying visual processing are largely shared between human and nonhuman primates, differences have also been reported. In this article, we review psychophysical investigations comparing the basic visual processing that operates in human and nonhuman species, and discuss the future contributions potentially deriving from such comparative psychophysical approaches to primate minds.

  11. Complete information acquisition in scanning probe microscopy

    DOE PAGES

    Belianinov, Alex; Kalinin, Sergei V.; Jesse, Stephen

    2015-03-13

    In the last three decades, scanning probe microscopy (SPM) has emerged as a primary tool for exploring and controlling the nanoworld. A critical part of the SPM measurements is the information transfer from the tip-surface junction to a macroscopic measurement system. This process reduces the many degrees of freedom of a vibrating cantilever to relatively few parameters recorded as images. Similarly, the details of dynamic cantilever response at sub-microsecond time scales of transients, higher-order eigenmodes and harmonics are averaged out by transitioning to the millisecond time scale of pixel acquisition. Hence, the amount of information available to the external observer is severely limited, and its selection is biased by the chosen data processing method. Here, we report a fundamentally new approach for SPM imaging based on information theory-type analysis of the data stream from the detector. This approach allows full exploration of complex tip-surface interactions, spatial mapping of the multidimensional variability of material properties and their mutual interactions, and SPM imaging at the information channel capacity limit.

  12. A Diagnosis-Prognosis Feedback Loop for Improved Performance Under Uncertainties

    NASA Technical Reports Server (NTRS)

    Leser, Patrick E.; Warner, James E.

    2017-01-01

    The feed-forward relationship between diagnosis and prognosis is the foundation of both aircraft structural health management and the digital twin concept. Measurements of structural response are obtained either in-situ with mounted sensor networks or offline using more traditional techniques (e.g., nondestructive evaluation). Diagnosis algorithms process this information to detect and quantify damage and then feed this data forward to a prognostic framework. A prognosis of the structure's future operational readiness (e.g., remaining useful life or residual strength) is then made and is used to inform mission-critical decision-making. Years of research have been devoted to improving the elements of this process, but the process itself has not changed significantly. Here, a new approach is proposed in which prognosis information is not only fed forward for decision-making, but it is also fed back to the forthcoming diagnosis. In this way, diagnosis algorithms can take advantage of a priori information about the expected state of health, rather than operating in an uninformed condition. As a feasibility test, a diagnosis-prognosis feedback loop of this manner is demonstrated. The approach is applied to a numerical example in which fatigue crack growth is simulated in a simple aluminum alloy test specimen. A prognosis was derived from a set of diagnoses which provided feedback to a subsequent set of diagnoses. Improvements in accuracy and a reduction in uncertainty in the prognosis-informed diagnoses were observed when compared with an uninformed diagnostic approach.
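
    The benefit of feeding a prognosis back as a prior for diagnosis can be sketched as a discrete Bayesian update. All numbers are hypothetical and this is not the paper's fatigue-crack model, only the general mechanism:

```python
# Discrete grid of candidate crack lengths, a noisy measurement
# likelihood, and two priors: flat (uninformed) vs prognosis-based.

def normalise(p):
    s = sum(p)
    return [pi / s for pi in p]

def bayes_update(prior, likelihood):
    """Posterior is proportional to prior times likelihood on the grid."""
    return normalise([pr * li for pr, li in zip(prior, likelihood)])

crack_mm = [1, 2, 3, 4, 5]                  # candidate crack lengths
likelihood = [0.1, 0.2, 0.4, 0.2, 0.1]      # noisy measurement, peak at 3 mm

flat_prior = [0.2] * 5                      # uninformed diagnosis
prog_prior = [0.05, 0.15, 0.6, 0.15, 0.05]  # prognosis expects ~3 mm

uninformed = bayes_update(flat_prior, likelihood)
informed = bayes_update(prog_prior, likelihood)
# The informed posterior concentrates more mass on the expected state,
# i.e. lower uncertainty, mirroring the improvement reported above.
```

    When the prognosis is accurate, the informed posterior is sharper than the uninformed one; a poor prognosis would instead bias the diagnosis, which is why the feedback loop was tested for feasibility rather than assumed.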

  13. How Qualitative Methods Can be Used to Inform Model Development.

    PubMed

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  14. Use of the self-organising map network (SOMNet) as a decision support system for regional mental health planning.

    PubMed

    Chung, Younjin; Salvador-Carulla, Luis; Salinas-Pérez, José A; Uriarte-Uriarte, Jose J; Iruin-Sanz, Alvaro; García-Alonso, Carlos R

    2018-04-25

    Decision-making in mental health systems should be supported by the evidence-informed knowledge transfer of data. Since mental health systems are inherently complex, involving interactions between its structures, processes and outcomes, decision support systems (DSS) need to be developed using advanced computational methods and visual tools to allow full system analysis, whilst incorporating domain experts in the analysis process. In this study, we use a DSS model developed for interactive data mining and domain expert collaboration in the analysis of complex mental health systems to improve system knowledge and evidence-informed policy planning. We combine an interactive visual data mining approach, the self-organising map network (SOMNet), with an operational expert knowledge approach, expert-based collaborative analysis (EbCA), to develop a DSS model. The SOMNet was applied to the analysis of healthcare patterns and indicators of three different regional mental health systems in Spain, comprising 106 small catchment areas and providing healthcare for over 9 million inhabitants. Based on the EbCA, the domain experts in the development team guided and evaluated the analytical processes and results. Another group of 13 domain experts in mental health systems planning and research evaluated the model based on the analytical information of the SOMNet approach for processing information and discovering knowledge in a real-world context. Through the evaluation, the domain experts assessed the feasibility and technology readiness level (TRL) of the DSS model. The SOMNet, combined with the EbCA, effectively processed evidence-based information when analysing system outliers, explaining global and local patterns, and refining key performance indicators with their analytical interpretations. The evaluation results showed that the DSS model was judged feasible by the domain experts and reached level 7 of the TRL (system prototype demonstration in operational environment). This study supports the benefits of combining health systems engineering (SOMNet) and expert knowledge (EbCA) to analyse the complexity of health systems research. The use of the SOMNet approach contributes to the demonstration of DSS for mental health planning in practice.
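
    The core self-organising map update behind a SOMNet-style analysis can be sketched in a few lines. This is a generic 1D toy, not the SOMNet implementation; the grid size, learning rate and neighbourhood width are hypothetical:

```python
import math
import random

def som_step(weights, x, learning_rate=0.5, sigma=1.0):
    """One SOM update: find the best-matching unit (BMU) for input x,
    then pull every unit toward x, weighted by a Gaussian neighbourhood
    around the BMU on a 1D grid of units."""
    dists = [sum((wi - xi) ** 2 for wi, xi in zip(w, x)) for w in weights]
    bmu = dists.index(min(dists))
    for i, w in enumerate(weights):
        h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))  # neighbourhood
        for d in range(len(w)):
            w[d] += learning_rate * h * (x[d] - w[d])
    return bmu

random.seed(0)
weights = [[random.random(), random.random()] for _ in range(5)]  # 5 units, 2D data
for _ in range(100):
    bmu = som_step(weights, [1.0, 0.0])  # repeatedly present one input
```

    Repeated presentation of the catchment-area indicator vectors organises the grid so that similar areas map to nearby units, which is what makes the map useful for spotting outliers and local patterns.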

  15. Systematic process synthesis and design methods for cost effective waste minimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biegler, L.T.; Grossman, I.E.; Westerberg, A.W.

    We present progress on our work to develop synthesis methods that aid in the design of cost-effective approaches to waste minimization. Work continues on combining the hierarchical approach of Douglas and coworkers with that of Grossmann and coworkers, in which bounding information allows the hierarchy to fit within a mixed-integer programming framework. We continue work on the synthesis of reactors and of flexible separation processes. In the first instance, we strive for methods we can use to reduce the production of potential pollutants, while in the second we look for ways to recover and recycle solvents.

  16. Quality and Certification of Electronic Health Records

    PubMed Central

    Hoerbst, A.; Ammenwerth, E.

    2010-01-01

    Background Numerous projects, initiatives, and programs are dedicated to the development of Electronic Health Records (EHR) worldwide. Increasingly, these plans are being brought from the scientific environment into real-life applications. In this context, quality is a crucial factor with regard to the acceptance and utility of Electronic Health Records. However, the dissemination of the existing quality approaches is often rather limited. Objectives The present paper aims at the description and comparison of the current major quality certification approaches to EHRs. Methods A literature analysis was carried out in order to identify the relevant publications with regard to EHR quality certification. PubMed, ACM Digital Library, IEEE Xplore, CiteSeer, and Google (Scholar) were used to collect relevant sources. The documents obtained were analyzed using techniques of qualitative content analysis. Results The analysis discusses and compares the quality approaches of CCHIT, EuroRec, IHE, openEHR, and EN13606. These approaches differ with regard to their focus, support of service-oriented EHRs, process of (re-)certification and testing, number of systems certified and tested, supporting organizations, and regional relevance. Discussion The analyzed approaches show differences with regard to their structure and processes. System vendors can use these approaches to improve and certify their information systems. Health care organizations can use these approaches to support selection processes or to assess the quality of their own information systems. PMID:23616834

  17. Subjective Age Bias: A Motivational and Information Processing Approach

    ERIC Educational Resources Information Center

    Teuscher, Ursina

    2009-01-01

    There is broad empirical evidence, but still a lack of theoretical explanations, for the phenomenon that most older people feel considerably younger than their real age. In this article, a measurement model of subjective age is assessed, and two independent theoretical approaches are proposed: (1) a motivational approach assuming that the age…

  18. Dispositional Flow as a Mediator of the Relationships between Attentional Control and Approaches to Studying during Academic Examination Preparation

    ERIC Educational Resources Information Center

    Cermakova, Lucie; Moneta, Giovanni B.; Spada, Marcantonio M.

    2010-01-01

    This study investigated how attentional control and study-related dispositional flow influence students' approaches to studying when preparing for academic examinations. Based on information-processing theories, it was hypothesised that attentional control would be positively associated with deep and strategic approaches to studying, and…

  19. Using Q Methodology in the Literature Review Process: A Mixed Research Approach

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Frels, Rebecca K.

    2015-01-01

    Given the mixed research-based nature of literature reviews, it is surprising that insufficient information has been provided as to how reviewers can incorporate mixed research approaches into their literature reviews. Thus, in this article, we provide a mixed methods research approach--Q methodology--for analyzing information…

  20. A Strategic Planning Approach to Technology Integration: Critical Success Factors.

    ERIC Educational Resources Information Center

    Shaw, Sam; Zabudsky, Jeff

    Within most institutions of higher learning, the typical approach to the integration of new information and communications technologies into the teaching and learning process has involved a heavy reliance on early adopters. This path of least resistance approach has provided organizations with the opportunity to quickly claim a presence in the…

  1. The Teaching-Learning Environment, an Information-Dynamic Approach

    ERIC Educational Resources Information Center

    De Blasio, Cataldo; Järvinen, Mika

    2014-01-01

    In the present study a generalized approach is given for the description of acquisition procedures, with a particular focus on the knowledge acquisition process. The learning progression is given as an example, allowing the theory to be applied to different situations. An analytical approach is performed starting from the generalized fundamental…

  2. SUPPORT Tools for evidence-informed health Policymaking (STP) 3: Setting priorities for supporting evidence-informed policymaking

    PubMed Central

    2009-01-01

    This article is part of a series written for people responsible for making decisions about health policies and programmes and for those who support these decision makers. Policymakers have limited resources for developing – or supporting the development of – evidence-informed policies and programmes. These required resources include staff time, staff infrastructural needs (such as access to a librarian or journal article purchasing), and ongoing professional development. They may therefore prefer instead to contract out such work to independent units with more suitably skilled staff and appropriate infrastructure. However, policymakers may only have limited financial resources to do so. Regardless of whether the support for evidence-informed policymaking is provided in-house or contracted out, or whether it is centralised or decentralised, resources always need to be used wisely in order to maximise their impact. Examples of undesirable practices in a priority-setting approach include timelines to support evidence-informed policymaking being negotiated on a case-by-case basis (instead of having clear norms about the level of support that can be provided for each timeline), implicit (rather than explicit) criteria for setting priorities, ad hoc (rather than systematic and explicit) priority-setting process, and the absence of both a communications plan and a monitoring and evaluation plan. In this article, we suggest questions that can guide those setting priorities for finding and using research evidence to support evidence-informed policymaking. These are: 1. Does the approach to prioritisation make clear the timelines that have been set for addressing high-priority issues in different ways? 2. Does the approach incorporate explicit criteria for determining priorities? 3. Does the approach incorporate an explicit process for determining priorities? 4. Does the approach incorporate a communications strategy and a monitoring and evaluation plan? PMID:20018110

  3. SUPPORT Tools for evidence-informed health Policymaking (STP) 3: Setting priorities for supporting evidence-informed policymaking.

    PubMed

    Lavis, John N; Oxman, Andrew D; Lewin, Simon; Fretheim, Atle

    2009-12-16

    This article is part of a series written for people responsible for making decisions about health policies and programmes and for those who support these decision makers. Policymakers have limited resources for developing--or supporting the development of--evidence-informed policies and programmes. These required resources include staff time, staff infrastructural needs (such as access to a librarian or journal article purchasing), and ongoing professional development. They may therefore prefer instead to contract out such work to independent units with more suitably skilled staff and appropriate infrastructure. However, policymakers may only have limited financial resources to do so. Regardless of whether the support for evidence-informed policymaking is provided in-house or contracted out, or whether it is centralised or decentralised, resources always need to be used wisely in order to maximise their impact. Examples of undesirable practices in a priority-setting approach include timelines to support evidence-informed policymaking being negotiated on a case-by-case basis (instead of having clear norms about the level of support that can be provided for each timeline), implicit (rather than explicit) criteria for setting priorities, ad hoc (rather than systematic and explicit) priority-setting process, and the absence of both a communications plan and a monitoring and evaluation plan. In this article, we suggest questions that can guide those setting priorities for finding and using research evidence to support evidence-informed policymaking. These are: 1. Does the approach to prioritisation make clear the timelines that have been set for addressing high-priority issues in different ways? 2. Does the approach incorporate explicit criteria for determining priorities? 3. Does the approach incorporate an explicit process for determining priorities? 4. Does the approach incorporate a communications strategy and a monitoring and evaluation plan?

  4. A multi-method approach to evaluate health information systems.

    PubMed

    Yu, Ping

    2010-01-01

    Systematic evaluation of the introduction and impact of health information systems (HIS) is a challenging task. As implementation is a dynamic process, with diverse issues emerging at various stages of system introduction, it is a challenge to weigh the contribution of various factors and differentiate the critical ones. A conceptual framework is helpful in guiding the evaluation effort; otherwise data collection may not be comprehensive and accurate, which may in turn lead to inadequate interpretation of the phenomena under study. Based on comprehensive literature research and the author's own practice of evaluating health information systems, the author proposes a multi-method approach that incorporates both quantitative and qualitative measurement and is centered on the DeLone and McLean Information System Success Model. This approach aims to quantify the performance of HIS and its impact, and to provide comprehensive and accurate explanations of the causal relationships among the different factors. It will provide decision makers with accurate and actionable information for improving the performance of the introduced HIS.

  5. The Effects of a Cognitive Information Processing Career Intervention on the Dysfunctional Career Thoughts, Locus of Control, and Career Decision Self-Efficacy of Underprepared College Students

    ERIC Educational Resources Information Center

    Henderson, Kristina M.

    2009-01-01

    This study investigated the impact of a seven-session career intervention in a First Year Experience course on the dysfunctional career thoughts, locus of control, and career decision self-efficacy of underprepared college students. The career intervention was based on the cognitive information processing approach to career decision making…

  6. Process improvement for the safe delivery of multidisciplinary-executed treatments-A case in Y-90 microspheres therapy.

    PubMed

    Cai, Bin; Altman, Michael B; Garcia-Ramirez, Jose; LaBrash, Jason; Goddu, S Murty; Mutic, Sasa; Parikh, Parag J; Olsen, Jeffrey R; Saad, Nael; Zoberi, Jacqueline E

    To develop a safe and robust workflow for yttrium-90 (Y-90) radioembolization procedures in a multidisciplinary team environment. A generalized Define-Measure-Analyze-Improve-Control (DMAIC)-based approach to process improvement was applied to a Y-90 radioembolization workflow. In the first DMAIC cycle, events with the Y-90 workflow were defined and analyzed. To improve the workflow, a web-based interactive electronic white board (EWB) system was adopted as the central communication platform and information processing hub. The EWB-based Y-90 workflow then underwent a second DMAIC cycle. Out of 245 treatments, three misses that went undetected until treatment initiation were recorded over a period of 21 months, and root-cause analysis was performed to determine the causes of each incident and opportunities for improvement. The EWB-based Y-90 process was further improved via new rules to define reliable sources of information as inputs into the planning process, as well as new check points to ensure this information was communicated correctly throughout the process flow. After implementation of the revised EWB-based Y-90 workflow, following two DMAIC-like cycles, there were zero misses out of 153 patient treatments in 1 year. The DMAIC-based approach adopted here allowed the iterative development of a robust workflow to achieve an adaptable, event-minimizing planning process despite a complex setting that requires the participation of multiple teams for Y-90 microspheres therapy. Implementation of such a workflow using the EWB or a similar platform with a DMAIC-based process improvement approach could be expanded to other treatment procedures, especially those requiring multidisciplinary management. Copyright © 2016 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  7. Personality and self-regulation: trait and information-processing perspectives.

    PubMed

    Hoyle, Rick H

    2006-12-01

    This article introduces the special issue of Journal of Personality on personality and self-regulation. The goal of the issue is to illustrate and inspire research that integrates personality and process-oriented accounts of self-regulation. The article begins by discussing the trait perspective on self-regulation--distinguishing between temperament and personality accounts--and the information-processing perspective. Three approaches to integrating these perspectives are then presented. These range from methodological approaches, in which constructs representing the two perspectives are examined in integrated statistical models, to conceptual approaches, in which the two perspectives are unified in a holistic theoretical model of self-regulation. The article concludes with an overview of the special issue contributions, which are organized in four sections: broad, integrative models of personality and self-regulation; models that examine the developmental origins of self-regulation and self-regulatory styles; focused programs of research that concern specific aspects or applications of self-regulation; and strategies for increasing the efficiency and effectiveness of self-regulation.

  8. Integrating Human Factors Engineering and Information Processing Approaches to Facilitate Evaluations in Criminal Justice Technology Research.

    PubMed

    Salvemini, Anthony V; Piza, Eric L; Carter, Jeremy G; Grommon, Eric L; Merritt, Nancy

    2015-06-01

    Evaluations are routinely conducted by government agencies and research organizations to assess the effectiveness of technology in criminal justice. Interdisciplinary research methods are salient to this effort. Technology evaluations face a number of challenges, including (1) the need to facilitate effective communication between social science researchers, technology specialists, and practitioners, (2) the need to better understand procedural and contextual aspects of a given technology, and (3) the need to generate findings that can be readily used for decision making and policy recommendations. Process and outcome evaluations of technology can be enhanced by integrating concepts from human factors engineering and information processing. This systemic approach, which focuses on the interaction between humans, technology, and information, enables researchers to better assess how a given technology is used in practice. Examples are drawn from complex technologies currently deployed within the criminal justice system where traditional evaluations have primarily focused on outcome metrics. Although this evidence-based approach has significant value, it can fail to fully account for the human and structural complexities that compose technology operations. Guiding principles for technology evaluations are described for identifying and defining key study metrics, facilitating communication within an interdisciplinary research team, and understanding the interaction between users, technology, and information. The approach posited here can also enable researchers to better assess factors that may facilitate or degrade the operational impact of the technology and answer fundamental questions concerning whether the technology works as intended, at what level, and at what cost. © The Author(s) 2015.

  9. Assessing the impact of case sensitivity and term information gain on biomedical concept recognition.

    PubMed

    Groza, Tudor; Verspoor, Karin

    2015-01-01

    Concept recognition (CR) is a foundational task in the biomedical domain. It supports the important process of transforming unstructured resources into structured knowledge. To date, several CR approaches have been proposed, most of which focus on a particular set of biomedical ontologies. Their underlying mechanisms vary from shallow natural language processing and dictionary lookup to specialized machine learning modules. However, no prior approach considers the effect of the case-sensitivity characteristics and the term distribution of the underlying ontology on the CR process. This article proposes a framework that models the CR process as an information retrieval task in which both case sensitivity and the information gain associated with tokens in lexical representations (e.g., term labels, synonyms) are central components of a strategy for generating term variants. The case sensitivity of a given ontology is assessed based on the distribution of so-called case-sensitive tokens in its terms, while information gain is modelled using a combination of divergence from randomness and mutual information. An extensive evaluation has been carried out using the CRAFT corpus. Experimental results show that case sensitivity awareness leads to an increase of up to 0.07 F1 against a non-case-sensitive baseline on the Protein Ontology and GO Cellular Component. Similarly, the use of information gain leads to an increase of up to 0.06 F1 against a standard baseline in the case of GO Biological Process and Molecular Function and GO Cellular Component. Overall, subject to the underlying token distribution, these methods lead to valid complementary strategies for augmenting term label sets to improve concept recognition.
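As a simplified illustration of the term-distribution idea, the sketch below scores tokens by their self-information across an ontology's labels. This is plain surprisal, a simplification of the paper's combination of divergence from randomness and mutual information, and the labels are invented:

```python
import math
from collections import Counter

def token_information(term_labels):
    """Score each token by its self-information (in bits) across a set of
    term labels: rarer tokens are more discriminative, so they should be
    preserved when generating term variants."""
    docfreq = Counter()
    for label in term_labels:
        docfreq.update(set(label.lower().split()))
    n = len(term_labels)
    return {tok: -math.log2(df / n) for tok, df in docfreq.items()}

# Toy ontology fragment (labels invented for illustration).
labels = [
    "mitochondrial membrane",
    "plasma membrane",
    "mitochondrial matrix",
    "nuclear envelope",
]
scores = token_information(labels)
# "membrane" occurs in 2 of 4 labels -> 1 bit; "nuclear" in 1 of 4 -> 2 bits
```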

  10. Translating building information modeling to building energy modeling using model view definition.

    PubMed

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.

  11. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    PubMed Central

    Kim, Jong Bum; Clayton, Mark J.; Haberl, Jeff S.

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process. PMID:25309954

  12. Healthcare information system approaches based on middleware concepts.

    PubMed

    Holena, M; Blobel, B

    1997-01-01

    To meet the challenges of efficiency and high quality, health care systems must implement the "Shared Care" paradigm of distributed co-operating systems. To this end, both newly developed and legacy applications must be fully integrated into the care process. These requirements can be fulfilled by information systems based on middleware concepts. In the paper, the middleware approaches HL7, DHE, and CORBA are described. The relevance of those approaches to the healthcare domain is documented. The description presented here is complemented by two other papers in this volume, which concentrate on the evaluation of the approaches and on their security threats and solutions.

  13. Enhancing Health-Care Services with Mixed Reality Systems

    NASA Astrophysics Data System (ADS)

    Stantchev, Vladimir

    This work presents a development approach for mixed reality systems in health care. Although health-care service costs account for 5-15% of GDP in developed countries, the sector has been remarkably resistant to the introduction of technology-supported optimizations. Digitalization of data storage and processing in the form of electronic patient records (EPR) and hospital information systems (HIS) is a first necessary step. In contrast to typical business functions (e.g., accounting or CRM), a health-care service is characterized by a knowledge-intensive decision process and the usage of specialized devices ranging from stethoscopes to complex surgical systems. Mixed reality systems can help fill the gap between highly patient-specific health-care services that need a variety of technical resources on the one side and the streamlined process flow that typical process-supporting information systems expect on the other side. To achieve this task, we present a development approach that includes an evaluation of existing tasks and processes within the health-care service and the information systems that currently support the service, as well as identification of decision paths and actions that can benefit from mixed reality systems. The result is a mixed reality system that allows a clinician to monitor the elements of the physical world and to blend them with virtual information provided by the systems. He or she can also plan and schedule treatments and operations in the digital world depending on status information from this mixed reality.

  14. MT+, integrating magnetotellurics to determine earth structure, physical state, and processes

    USGS Publications Warehouse

    Bedrosian, P.A.

    2007-01-01

    As one of the few deep-earth imaging techniques, magnetotellurics provides information on both the structure and physical state of the crust and upper mantle. Magnetotellurics is sensitive to electrical conductivity, which varies within the earth by many orders of magnitude and is modified by a range of earth processes. As with all geophysical techniques, magnetotellurics has a non-unique inverse problem and has limitations in resolution and sensitivity. As such, an integrated approach, either via the joint interpretation of independent geophysical models, or through the simultaneous inversion of independent data sets is valuable, and at times essential to an accurate interpretation. Magnetotelluric data and models are increasingly integrated with geological, geophysical and geochemical information. This review considers recent studies that illustrate the ways in which such information is combined, from qualitative comparisons to statistical correlation studies to multi-property inversions. Also emphasized are the range of problems addressed by these integrated approaches, and their value in elucidating earth structure, physical state, and processes. © Springer Science+Business Media B.V. 2007.

  15. a Continual Engagement Approach Through Gis-Mcda Conflict Resolution of Loggerhead Sea Turtle Bycatch in Mexico

    NASA Astrophysics Data System (ADS)

    Bojórquez-Tapia, L. A.

    2015-12-01

    Continual engagement is an approach that emphasizes uninterrupted interaction with stakeholders, with the purpose of fully integrating their knowledge into the policymaking process. It focuses on the creation of hybrid scientific-local knowledge highly relevant to the needs of communities and policymakers, while balancing the power asymmetries among stakeholders. Hence, it presupposes a capacity for continuous revision and adjustment of the analyses that support the policymaking process. While continual engagement implies a capacity for enabling effective communication, translation, and mediation of knowledge among the diverse stakeholders, experts, and policymakers, it also means keeping a close eye on how knowledge evolves and how new data and information are introduced along the policymaking process. Through a case study, loggerhead sea turtle (Caretta caretta) fishing bycatch in Mexico, a geographical information system-multicriteria decision analysis (GIS-MCDA) approach is presented to address the challenges of implementing continual engagement in conflict-resolution processes. The GIS-MCDA approach combined the analytic hierarchy process (AHP) and compromise programming (CP) to generate consensus regarding the spatial pattern of conflicts. The AHP was fundamental for synthesizing the different sources of knowledge into a geospatial model; in particular, it enabled the assessment of the salience, legitimacy, and credibility of the information produced for all involved. The results enabled the development of specific policies based on an assessment of the risk to the loggerhead population from different levels of fishing bycatch and of the needs of the fishing communities in the region.
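The AHP step of such a GIS-MCDA analysis derives criterion weights from a pairwise-comparison matrix. A minimal sketch, with hypothetical judgements not taken from the study:

```python
def ahp_priorities(matrix, iters=100):
    """Priority weights of an AHP pairwise-comparison matrix, taken as the
    principal eigenvector and computed by power iteration."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Hypothetical 3-criterion reciprocal matrix: bycatch risk is judged 3x as
# important as fishing livelihood and 5x as important as monitoring cost.
M = [
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_priorities(M)  # roughly [0.65, 0.23, 0.12]
```

The resulting weights then multiply the criterion layers of the geospatial model; compromise programming ranks alternatives by their weighted distance from an ideal point.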

  16. Reorienting health services in the Northern Territory of Australia: a conceptual model for building health promotion capacity in the workforce.

    PubMed

    Judd, Jenni; Keleher, Helen

    2013-06-01

    Reorienting work practices to include health promotion and prevention is complex and requires specific strategies and interventions. This paper presents original research that used 'real-world' practice to demonstrate that knowledge gathered from practice is relevant for the development of practice-based evidence. The paper shows how practitioners can inform and influence improvements in health promotion practice. Practitioner-informed evidence necessarily incorporates qualitative research to capture the richness of their reflective experiences. Using a participatory action research (PAR) approach, the research question asked 'what are the core dimensions of building health promotion capacity in a primary health care workforce in a real-world setting?' PAR is a method in which the researcher operates in full collaboration with members of the organisation being studied for the purposes of achieving some kind of change, in this case to increase the amount of health promotion and prevention practice within this community health setting. The PAR process involved six reflection and action cycles over two years. Data collection processes included: survey; in-depth interviews; a training intervention; observations of practice; workplace diaries; and two nominal groups. The listen/reflect/act process enabled lessons from practice to inform future capacity-building processes. This research strengthened and supported the development of health promotion to inform 'better health' practices through respectful change processes based on research, practitioner-informed evidence, and capacity-building strategies. A conceptual model for building health promotion capacity in the primary health care workforce was informed by the PAR processes and recognised the importance of the determinants approach. Practitioner-informed evidence is the missing link in the evidence debate and provides the links between evidence and its translation to practice. 
New models of health promotion service delivery can be developed in community settings recognising the importance of involving practitioners themselves in these processes.

  17. PaFlexPepDock: parallel ab-initio docking of peptides onto their receptors with full flexibility based on Rosetta.

    PubMed

    Li, Haiou; Lu, Liyao; Chen, Rong; Quan, Lijun; Xia, Xiaoyan; Lü, Qiang

    2014-01-01

    Structural information related to protein-peptide complexes can be very useful for novel drug discovery and design. The computational docking of proteins and peptides can supplement the structural information on protein-peptide interactions obtained by experimental means. The protein-peptide docking in this paper can be described as three processes that occur in parallel: ab-initio peptide folding, docking of the peptide with its receptor, and refinement of flexible areas of the receptor as the peptide approaches. Several existing methods have been used to sample the degrees of freedom in the three processes, which are usually triggered in an organized sequential scheme. In this paper, we propose a parallel approach that combines all three processes during the docking of a folding peptide with a flexible receptor. This approach mimics the actual protein-peptide docking process in a parallel way and is expected to deliver better performance than sequential approaches. We used 22 unbound protein-peptide docking examples to evaluate our method. Our analysis of the results showed that the explicit refinement of the flexible areas of the receptor facilitated more accurate modeling of the interfaces of the complexes, while combining all of the moves in parallel helped construct energy funnels for prediction.
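The flavour of interleaving several move types in one trajectory, rather than triggering them sequentially, can be conveyed by a toy Metropolis sampler. The energy function and move sizes are invented for illustration and bear no relation to Rosetta's force field:

```python
import math
import random

def metropolis_dock(steps=5000, beta=5.0, seed=1):
    """Toy Metropolis sampler that interleaves three move types in a single
    trajectory, mimicking the parallel scheme: peptide torsions, rigid-body
    placement, and receptor side-chain moves share one Monte Carlo loop."""
    rng = random.Random(seed)
    # Blocks of a toy state vector; the energy is minimised at the origin.
    state = {"peptide": [2.0, -1.5], "rigid_body": [3.0], "receptor": [-2.0]}

    def energy(s):
        return sum(x * x for block in s.values() for x in block)

    e = energy(state)
    for _ in range(steps):
        block = rng.choice(list(state))         # pick a move type at random
        i = rng.randrange(len(state[block]))
        old = state[block][i]
        state[block][i] += rng.gauss(0, 0.3)    # propose a perturbation
        e_new = energy(state)
        # Metropolis criterion: always accept downhill, sometimes uphill.
        if e_new > e and rng.random() > math.exp(-beta * (e_new - e)):
            state[block][i] = old               # reject the move
        else:
            e = e_new
    return e, state
```

Because every step may touch any block, the peptide can fold, dock, and induce receptor rearrangement within one trajectory instead of in fixed stages.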

  18. Developing Emotion-Based Case Formulations: A Research-Informed Method.

    PubMed

    Pascual-Leone, Antonio; Kramer, Ueli

    2017-01-01

    New research-informed methods for case conceptualization that cut across traditional therapy approaches are increasingly popular. This paper presents a trans-theoretical approach to case formulation based on research observations of emotion. The sequential model of emotional processing (Pascual-Leone & Greenberg, 2007) is a process research model that provides concrete markers for therapists to observe the emerging emotional development of their clients. We illustrate how this model can be used by clinicians to track change; it provides a 'clinical map' by which therapists may orient themselves in-session and plan treatment interventions. Emotional processing offers a trans-theoretical framework for therapists who wish to conduct emotion-based case formulations. First, we present criteria for why this research model translates well into practice. Second, two contrasting case studies are presented to demonstrate the method. The model bridges research with practice by using client emotion as an axis of integration. Key Practitioner Message Process research on emotion can offer a template for therapists to make case formulations while using a range of treatment approaches. The sequential model of emotional processing provides a 'process map' of concrete markers for therapists to (1) observe the emerging emotional development of their clients, and (2) develop a treatment plan. Copyright © 2016 John Wiley & Sons, Ltd.

  19. Information Processing and Retrieval in Arab Countries: Traditional Approaches and Modern Potentials.

    ERIC Educational Resources Information Center

    Madkour, M. A. K.

    1980-01-01

    Discusses underlying assumptions and prerequisites for information development in Arab countries. Administrative and environmental impediments which hinder the optimum utilization of available resources and suggestions for improvements are outlined. A brief bibliography is provided. (Author/RAA)

  20. Partial information decomposition as a unified approach to the specification of neural goal functions.

    PubMed

    Wibral, Michael; Priesemann, Viola; Kay, Jim W; Lizier, Joseph T; Phillips, William A

    2017-03-01

    In many neural systems anatomical motifs are present repeatedly, but despite their structural similarity they can serve very different tasks. A prime example of such a motif is the canonical microcircuit of six-layered neo-cortex, which is repeated across cortical areas, and is involved in a number of different tasks (e.g. sensory, cognitive, or motor tasks). This observation has spawned interest in finding a common underlying principle, a 'goal function', of information processing implemented in this structure. By definition such a goal function, if universal, cannot be cast in processing-domain specific language (e.g. 'edge filtering', 'working memory'). Thus, to formulate such a principle, we have to use a domain-independent framework. Information theory offers such a framework. However, while the classical framework of information theory focuses on the relation between one input and one output (Shannon's mutual information), we argue that neural information processing crucially depends on the combination of multiple inputs to create the output of a processor. To account for this, we use a very recent extension of Shannon information theory, called partial information decomposition (PID). PID makes it possible to quantify the information that several inputs provide individually (unique information), redundantly (shared information) or only jointly (synergistic information) about the output. First, we review the framework of PID. Then we apply it to reevaluate and analyze several earlier proposals of information theoretic neural goal functions (predictive coding, infomax and coherent infomax, efficient coding). We find that PID allows these goal functions to be compared in a common framework, and also provides a versatile approach to designing new goal functions from first principles. Building on this, we design and analyze a novel goal function, called 'coding with synergy', which combines external input and prior knowledge in a synergistic manner. 
We suggest that this novel goal function may be highly useful in neural information processing. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
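
    A minimal illustration of why the joint treatment of inputs matters (a toy sketch of ours, not part of the PID framework's formal definitions) is binary XOR: neither input alone carries any mutual information about the output, yet together they determine it completely, so the single bit of output information is purely synergistic.

```python
from collections import Counter
from itertools import product
from math import log2

def mutual_information(pairs):
    """I(X;Y) in bits from a list of equally weighted (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# XOR truth table, each row equally likely: ((x1, x2), y) with y = x1 ^ x2.
samples = [((x1, x2), x1 ^ x2) for x1, x2 in product((0, 1), repeat=2)]

i_joint = mutual_information(samples)                            # I(X1,X2;Y)
i_x1 = mutual_information([(x1, y) for (x1, _), y in samples])   # I(X1;Y)
i_x2 = mutual_information([(x2, y) for (_, x2), y in samples])   # I(X2;Y)
```

    Here i_joint comes out to 1 bit while i_x1 and i_x2 are both 0, so classical pairwise mutual information misses the information entirely; a full PID additionally separates unique and shared contributions, which requires more than the pairwise quantities computed above.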

  1. A Structured, Yet Agile Approach to Designing C2 Operating Environments

    DTIC Science & Technology

    2012-06-01

    ...organization’s mission effectiveness. Lastly, he identifies the mechanisms for C2 agility, enabled by people, processes, information, systems...operations, controls forces, and coordinates operational activities and/or a facility that is organized to gather, process, analyze, dispatch, and

  2. Video image processing to create a speed sensor

    DOT National Transportation Integrated Search

    1999-11-01

    Image processing has been applied to traffic analysis in recent years, with different goals. In the report, a new approach is presented for extracting vehicular speed information, given a sequence of real-time traffic images. We extract moving edges ...

  3. Toward theoretical understanding of the fertility preservation decision-making process: Examining information processing among young women with cancer

    PubMed Central

    Hershberger, Patricia E.; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer

    2014-01-01

    Background Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. Objective The purpose of this paper is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Methods Using a grounded theory approach, 27 women with cancer participated in individual, semi-structured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by five dimensions within the Contemplate phase of the decision-making process framework. Results In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Conclusion Better understanding of theoretical underpinnings surrounding women’s information processes can facilitate decision support and improve clinical care. PMID:24552086

  4. Data update in a land information network

    NASA Astrophysics Data System (ADS)

    Mullin, Robin C.

    1988-01-01

    The on-going update of data exchanged in a land information network is examined. In the past, major developments have been undertaken to enable the exchange of data between land information systems. A model of a land information network and the data update process have been developed. Based on these, a functional description of the database and software to perform data updating is presented. A prototype of the data update process was implemented using the ARC/INFO geographic information system. This was used to test four approaches to data updating, i.e., bulk, block, incremental, and alert updates. A bulk update is performed by replacing a complete file with an updated file. A block update requires that the data set be partitioned into blocks. When an update occurs, only the blocks which are affected need to be transferred. An incremental update approach records each feature which is added or deleted and transmits only the features needed to update the copy of the file. An alert is a marker indicating that an update has occurred. It can be placed in a file to warn a user that if he is active in an area containing markers, updated data is available. The four approaches have been tested using a cadastral data set.
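
    The incremental approach can be sketched as a change log that records each added or deleted feature and replays only those entries at the copy, instead of re-sending the whole file. The sketch below is our own minimal illustration; the feature identifiers and the plain-set representation of a data set are assumptions.

```python
# Master data set and a remote copy, initially in sync.
master = {"parcel-1", "parcel-2"}
copy = set(master)
change_log = []          # each entry records one add or delete of a feature

def add_feature(fid):
    master.add(fid)
    change_log.append(("add", fid))

def delete_feature(fid):
    master.discard(fid)
    change_log.append(("delete", fid))

def apply_updates(target, log):
    """Replay only the logged changes to bring a copy up to date."""
    for op, fid in log:
        (target.add if op == "add" else target.discard)(fid)
    return target

# Edits to the master are captured as log entries...
add_feature("parcel-3")
delete_feature("parcel-1")

# ...and only the log, not the full file, is transmitted to the copy.
copy = apply_updates(copy, change_log)
assert copy == master
```

    By contrast, the bulk approach would transmit the complete updated file, and the block approach would transmit every block touched by a change; the alert approach would transmit only a marker telling the user that fresher data exists.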

  5. Modeling of ETL-Processes and Processed Information in Clinical Data Warehousing.

    PubMed

    Tute, Erik; Steiner, Jochen

    2018-01-01

    Background Literature describes a large potential for the reuse of clinical patient data; a clinical data warehouse (CDWH) is a means to that end. Objective To support the management and maintenance of the processes that extract, transform and load (ETL) data into CDWHs, and to ease the reuse of metadata between regular IT management, the CDWH and secondary data users, by providing a modeling approach. Methods An expert survey and a literature review were conducted to identify requirements and existing modeling techniques; an ETL modeling technique was then developed by extending existing techniques, and evaluated by exemplarily modeling an existing ETL process and by a second expert survey. Results Nine experts participated in the first survey. The literature review yielded 15 included publications, in which six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated; seven experts participated in the evaluation. Conclusion The developed approach can help in the management and maintenance of ETL processes and could serve as an interface between regular IT management, the CDWH and secondary data users.

  6. An analytical approach to customer requirement information processing

    NASA Astrophysics Data System (ADS)

    Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong

    2013-11-01

    'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.

  7. Mental Status Documentation: Information Quality and Data Processes

    PubMed Central

    Weir, Charlene; Gibson, Bryan; Taft, Teresa; Slager, Stacey; Lewis, Lacey; Staggers, Nancy

    2016-01-01

    Delirium is a fluctuating disturbance of cognition and/or consciousness associated with poor outcomes. Caring for patients with delirium requires integration of disparate information across clinicians, settings and time. The goal of this project was to characterize the information processes involved in nurses’ assessment, documentation, decision-making and communication regarding patients’ mental status in the inpatient setting. VA nurse managers of medical wards (n=18) were systematically selected across the US. A semi-structured telephone interview focused on current assessment, documentation, and communication processes, as well as clinical and administrative decision-making, was conducted, audio-recorded and transcribed. A thematic analytic approach was used. Five themes emerged: 1) Fuzzy Concepts, 2) Grey Data, 3) Process Variability, 4) Context is Critical, and 5) Goal Conflict. This project describes the vague and variable information processes related to delirium and mental status that undermine effective risk, prevention, identification, communication and mitigation of harm. PMID:28269919

  8. Mental Status Documentation: Information Quality and Data Processes.

    PubMed

    Weir, Charlene; Gibson, Bryan; Taft, Teresa; Slager, Stacey; Lewis, Lacey; Staggers, Nancy

    2016-01-01

    Delirium is a fluctuating disturbance of cognition and/or consciousness associated with poor outcomes. Caring for patients with delirium requires integration of disparate information across clinicians, settings and time. The goal of this project was to characterize the information processes involved in nurses' assessment, documentation, decision-making and communication regarding patients' mental status in the inpatient setting. VA nurse managers of medical wards (n=18) were systematically selected across the US. A semi-structured telephone interview focused on current assessment, documentation, and communication processes, as well as clinical and administrative decision-making, was conducted, audio-recorded and transcribed. A thematic analytic approach was used. Five themes emerged: 1) Fuzzy Concepts, 2) Grey Data, 3) Process Variability, 4) Context is Critical, and 5) Goal Conflict. This project describes the vague and variable information processes related to delirium and mental status that undermine effective risk, prevention, identification, communication and mitigation of harm.

  9. Incorporating Learning Characteristics into Automatic Essay Scoring Models: What Individual Differences and Linguistic Features Tell Us about Writing Quality

    ERIC Educational Resources Information Center

    Crossley, Scott A.; Allen, Laura K.; Snow, Erica L.; McNamara, Danielle S.

    2016-01-01

    This study investigates a novel approach to automatically assessing essay quality that combines natural language processing approaches that assess text features with approaches that assess individual differences in writers such as demographic information, standardized test scores, and survey results. The results demonstrate that combining text…

  10. Multiresponse imaging system design for improved resolution

    NASA Technical Reports Server (NTRS)

    Alter-Gartenberg, Rachel; Fales, Carl L.; Huck, Friedrich O.; Rahman, Zia-Ur; Reichenbach, Stephen E.

    1991-01-01

    Multiresponse imaging is a process that acquires A images, each with a different optical response, and reassembles them into a single image with an improved resolution that can approach 1/√A times the photodetector-array sampling lattice. Our goals are to optimize the performance of this process in terms of the resolution and fidelity of the restored image and to assess the amount of information required to do so. The theoretical approach is based on the extension of both image restoration and rate-distortion theories from their traditional realm of signal processing to image processing, which includes image gathering and display.

  11. Results of our national survey. Current formulary decision-making strategies and new factors influencing the process.

    PubMed

    1995-08-01

    Formulary recently conducted a survey of 2,000 of its readers to uncover what forces are at play in their formulary decision-making processes. Topics included general philosophies toward formulary decision making, philosophies toward adding and deleting products, influences on the process, trends related to product reviews, formulary management strategies, drug information educational strategies, and new approaches to the formulary decision-making process. Some 295 surveys (14.75%) were returned. Highlights and analyses of the survey findings are presented for your review and comparison with your practice setting's approaches.

  12. BPMN as a Communication Language for the Process- and Event-Oriented Perspectives in Fact-Oriented Conceptual Models

    NASA Astrophysics Data System (ADS)

    Bollen, Peter

    In this paper we will show how the OMG specification of BPMN (Business Process Modeling Notation) can be used to model the process- and event-oriented perspectives of an application subject area. We will illustrate how the fact-oriented conceptual models for the information-, process- and event perspectives can be used in a 'bottom-up' approach for creating a BPMN model in combination with other approaches, e.g. the use of a textual description. We will use the common doctor's office example as a running example in this article.

  13. Supporting Active Patient and Health Care Collaboration: A Prototype for Future Health Care Information Systems.

    PubMed

    Åhlfeldt, Rose-Mharie; Persson, Anne; Rexhepi, Hanife; Wåhlander, Kalle

    2016-12-01

    This article presents and illustrates the main features of a proposed process-oriented approach for patient information distribution in future health care information systems, by using a prototype of a process support system. The development of the prototype was based on the Visuera method, which includes five defined steps. The results indicate that a visualized prototype is a suitable tool for illustrating both the opportunities and constraints of future ideas and solutions in e-Health. The main challenges for developing and implementing a fully functional process support system concern both technical and organizational/management aspects. © The Author(s) 2015.

  14. Earth Sciences Data and Information System (ESDIS) program planning and evaluation methodology development

    NASA Technical Reports Server (NTRS)

    Dickinson, William B.

    1995-01-01

    An Earth Sciences Data and Information System (ESDIS) Project Management Plan (PMP) is prepared. An ESDIS Project Systems Engineering Management Plan (SEMP) consistent with the developed PMP is also prepared. ESDIS and related EOS program requirements development, management, and analysis processes are evaluated. Opportunities to improve the effectiveness of these processes and program/project responsiveness to requirements are identified. Overall ESDIS cost estimation processes are evaluated, and recommendations to improve cost estimating and modeling techniques are developed. ESDIS schedules and scheduling tools are evaluated. Risk assessment, risk mitigation strategies and approaches, and use of risk information in management decision-making are addressed.

  15. Information security governance: a risk assessment approach to health information systems protection.

    PubMed

    Williams, Patricia A H

    2013-01-01

    It is no small task to manage the protection of healthcare data and healthcare information systems. In an environment that demands adaptation to change for all information collection, storage and retrieval systems, including those for e-health and information systems, it is imperative that good information security governance is in place. This includes understanding and meeting legislative and regulatory requirements. This chapter provides three models to educate and guide organisations in this complex area, and to simplify the process of information security governance and ensure appropriate and effective measures are put in place. The approach is risk based, adapted and contextualized for healthcare. In addition, specific considerations of the impact of cloud services, secondary use of data, big data and mobile health are discussed.

  16. Could a neuroscientist understand a microprocessor?

    DOE PAGES

    Jonas, Eric; Kording, Konrad Paul; Diedrichsen, Jorn

    2017-01-12

    There is a popular belief in neuroscience that we are primarily data limited, and that producing large, multimodal, and complex datasets will, with the help of advanced data analysis algorithms, lead to fundamental insights into the way the brain processes information. These datasets do not yet exist, and if they did we would have no way of evaluating whether or not the algorithmically-generated insights were sufficient or even correct. To address this, here we take a classical microprocessor as a model organism, and use our ability to perform arbitrary experiments on it to see if popular data analysis methods from neuroscience can elucidate the way it processes information. Microprocessors are among those artificial information processing systems that are both complex and that we understand at all levels, from the overall logical flow, via logical gates, to the dynamics of transistors. We show that the approaches reveal interesting structure in the data but do not meaningfully describe the hierarchy of information processing in the microprocessor. This suggests current analytic approaches in neuroscience may fall short of producing meaningful understanding of neural systems, regardless of the amount of data. Furthermore, we argue for scientists using complex non-linear dynamical systems with known ground truth, such as the microprocessor as a validation platform for time-series and structure discovery methods.

  17. Could a Neuroscientist Understand a Microprocessor?

    PubMed Central

    Kording, Konrad Paul

    2017-01-01

    There is a popular belief in neuroscience that we are primarily data limited, and that producing large, multimodal, and complex datasets will, with the help of advanced data analysis algorithms, lead to fundamental insights into the way the brain processes information. These datasets do not yet exist, and if they did we would have no way of evaluating whether or not the algorithmically-generated insights were sufficient or even correct. To address this, here we take a classical microprocessor as a model organism, and use our ability to perform arbitrary experiments on it to see if popular data analysis methods from neuroscience can elucidate the way it processes information. Microprocessors are among those artificial information processing systems that are both complex and that we understand at all levels, from the overall logical flow, via logical gates, to the dynamics of transistors. We show that the approaches reveal interesting structure in the data but do not meaningfully describe the hierarchy of information processing in the microprocessor. This suggests current analytic approaches in neuroscience may fall short of producing meaningful understanding of neural systems, regardless of the amount of data. Additionally, we argue for scientists using complex non-linear dynamical systems with known ground truth, such as the microprocessor as a validation platform for time-series and structure discovery methods. PMID:28081141

  18. Could a Neuroscientist Understand a Microprocessor?

    PubMed

    Jonas, Eric; Kording, Konrad Paul

    2017-01-01

    There is a popular belief in neuroscience that we are primarily data limited, and that producing large, multimodal, and complex datasets will, with the help of advanced data analysis algorithms, lead to fundamental insights into the way the brain processes information. These datasets do not yet exist, and if they did we would have no way of evaluating whether or not the algorithmically-generated insights were sufficient or even correct. To address this, here we take a classical microprocessor as a model organism, and use our ability to perform arbitrary experiments on it to see if popular data analysis methods from neuroscience can elucidate the way it processes information. Microprocessors are among those artificial information processing systems that are both complex and that we understand at all levels, from the overall logical flow, via logical gates, to the dynamics of transistors. We show that the approaches reveal interesting structure in the data but do not meaningfully describe the hierarchy of information processing in the microprocessor. This suggests current analytic approaches in neuroscience may fall short of producing meaningful understanding of neural systems, regardless of the amount of data. Additionally, we argue for scientists using complex non-linear dynamical systems with known ground truth, such as the microprocessor as a validation platform for time-series and structure discovery methods.

  19. Could a neuroscientist understand a microprocessor?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jonas, Eric; Kording, Konrad Paul; Diedrichsen, Jorn

    There is a popular belief in neuroscience that we are primarily data limited, and that producing large, multimodal, and complex datasets will, with the help of advanced data analysis algorithms, lead to fundamental insights into the way the brain processes information. These datasets do not yet exist, and if they did we would have no way of evaluating whether or not the algorithmically-generated insights were sufficient or even correct. To address this, here we take a classical microprocessor as a model organism, and use our ability to perform arbitrary experiments on it to see if popular data analysis methods from neuroscience can elucidate the way it processes information. Microprocessors are among those artificial information processing systems that are both complex and that we understand at all levels, from the overall logical flow, via logical gates, to the dynamics of transistors. We show that the approaches reveal interesting structure in the data but do not meaningfully describe the hierarchy of information processing in the microprocessor. This suggests current analytic approaches in neuroscience may fall short of producing meaningful understanding of neural systems, regardless of the amount of data. Furthermore, we argue for scientists using complex non-linear dynamical systems with known ground truth, such as the microprocessor as a validation platform for time-series and structure discovery methods.

  20. Heuristic-based information acquisition and decision making among pilots.

    PubMed

    Wiggins, Mark W; Bollwerk, Sandra

    2006-01-01

    This research was designed to examine the impact of heuristic-based approaches to the acquisition of task-related information on the selection of an optimal alternative during simulated in-flight decision making. The work integrated features of naturalistic and normative decision making and strategies of information acquisition within a computer-based, decision support framework. The study comprised two phases, the first of which involved familiarizing pilots with three different heuristic-based strategies of information acquisition: frequency, elimination by aspects, and majority of confirming decisions. The second stage enabled participants to choose one of the three strategies of information acquisition to resolve a fourth (choice) scenario. The results indicated that task-oriented experience, rather than the information acquisition strategies, predicted the selection of the optimal alternative. It was also evident that of the three strategies available, the elimination by aspects information acquisition strategy was preferred by most participants. It was concluded that task-oriented experience, rather than the process of information acquisition, predicted task accuracy during the decision-making task. It was also concluded that pilots have a preference for one particular approach to information acquisition. Applications of outcomes of this research include the development of decision support systems that adapt to the information-processing capabilities and preferences of users.
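
    Of the three strategies, the preferred one, elimination by aspects, can be sketched as follows: attributes are considered in order of importance, and any alternative that fails a threshold on the current attribute is dropped until one remains. This is an illustrative reconstruction, not the study's software; the alternatives, attributes, thresholds, and values are all hypothetical.

```python
def eliminate_by_aspects(alternatives, aspects):
    """alternatives: {name: {attribute: value}};
    aspects: list of (attribute, minimum) in descending order of importance."""
    remaining = dict(alternatives)
    for attribute, minimum in aspects:
        if len(remaining) == 1:
            break                      # a single alternative survives; stop
        survivors = {name: attrs for name, attrs in remaining.items()
                     if attrs[attribute] >= minimum}
        if survivors:                  # never eliminate every alternative
            remaining = survivors
    return list(remaining)

# Hypothetical diversion-airport choice with made-up attribute values:
airports = {
    "A": {"runway_m": 2200, "weather": 3, "fuel": 1},
    "B": {"runway_m": 1500, "weather": 5, "fuel": 1},
    "C": {"runway_m": 2400, "weather": 4, "fuel": 0},
}
choice = eliminate_by_aspects(
    airports, [("runway_m", 1800), ("fuel", 1), ("weather", 3)])
```

    In this toy run, the runway threshold eliminates B, the fuel threshold eliminates C, and A remains, so later aspects are never consulted, which is what makes the heuristic cheap to apply under time pressure.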

  1. Validating archetypes for the Multiple Sclerosis Functional Composite.

    PubMed

    Braun, Michael; Brandt, Alexander Ulrich; Schulz, Stefan; Boeker, Martin

    2014-08-03

    Numerous information models for electronic health records, such as openEHR archetypes, are available. The quality of such clinical models is important to guarantee standardised semantics and to facilitate their interoperability. However, validation aspects are not regarded sufficiently yet. The objective of this report is to investigate the feasibility of archetype development and its community-based validation process, presuming that this review process is a practical way to ensure high-quality information models amending the formal reference model definitions. A standard archetype development approach was applied to a case set of three clinical tests for multiple sclerosis assessment: After an analysis of the tests, the obtained data elements were organised and structured. The appropriate archetype class was selected and the data elements were implemented in an iterative refinement process. Clinical and information modelling experts validated the models in a structured review process. Four new archetypes were developed and publicly deployed in the openEHR Clinical Knowledge Manager, an online platform provided by the openEHR Foundation. Afterwards, these four archetypes were validated by domain experts in a team review. The review was a formalised process, organised in the Clinical Knowledge Manager. Both the development and the review process turned out to be time-consuming tasks, mostly due to difficult selection processes between alternative modelling approaches. The archetype review was a straightforward team process with the goal to validate archetypes pragmatically. The quality of medical information models is crucial to guarantee standardised semantic representation in order to improve interoperability. 
The validation process is a practical way to better harmonise models that diverge due to necessary flexibility left open by the underlying formal reference model definitions.This case study provides evidence that both community- and tool-enabled review processes, structured in the Clinical Knowledge Manager, ensure archetype quality. It offers a pragmatic but feasible way to reduce variation in the representation of clinical information models towards a more unified and interoperable model.

  2. Validating archetypes for the Multiple Sclerosis Functional Composite

    PubMed Central

    2014-01-01

    Background Numerous information models for electronic health records, such as openEHR archetypes, are available. The quality of such clinical models is important to guarantee standardised semantics and to facilitate their interoperability. However, validation aspects are not regarded sufficiently yet. The objective of this report is to investigate the feasibility of archetype development and its community-based validation process, presuming that this review process is a practical way to ensure high-quality information models amending the formal reference model definitions. Methods A standard archetype development approach was applied to a case set of three clinical tests for multiple sclerosis assessment: After an analysis of the tests, the obtained data elements were organised and structured. The appropriate archetype class was selected and the data elements were implemented in an iterative refinement process. Clinical and information modelling experts validated the models in a structured review process. Results Four new archetypes were developed and publicly deployed in the openEHR Clinical Knowledge Manager, an online platform provided by the openEHR Foundation. Afterwards, these four archetypes were validated by domain experts in a team review. The review was a formalised process, organised in the Clinical Knowledge Manager. Both the development and the review process turned out to be time-consuming tasks, mostly due to difficult selection processes between alternative modelling approaches. The archetype review was a straightforward team process with the goal to validate archetypes pragmatically. Conclusions The quality of medical information models is crucial to guarantee standardised semantic representation in order to improve interoperability. The validation process is a practical way to better harmonise models that diverge due to necessary flexibility left open by the underlying formal reference model definitions. 
This case study provides evidence that both community- and tool-enabled review processes, structured in the Clinical Knowledge Manager, ensure archetype quality. It offers a pragmatic but feasible way to reduce variation in the representation of clinical information models towards a more unified and interoperable model. PMID:25087081

  3. Bayesian hierarchical functional data analysis via contaminated informative priors.

    PubMed

    Scarpa, Bruno; Dunson, David B

    2009-09-01

    A variety of flexible approaches have been proposed for functional data analysis, allowing both the mean curve and the distribution about the mean to be unknown. Such methods are most useful when there is limited prior information. Motivated by applications to modeling of temperature curves in the menstrual cycle, this article proposes a flexible approach for incorporating prior information in semiparametric Bayesian analyses of hierarchical functional data. The proposed approach is based on specifying the distribution of functions as a mixture of a parametric hierarchical model and a nonparametric contamination. The parametric component is chosen based on prior knowledge, while the contamination is characterized as a functional Dirichlet process. In the motivating application, the contamination component allows unanticipated curve shapes in unhealthy menstrual cycles. Methods are developed for posterior computation, and the approach is applied to data from a European fecundability study.

  4. A Rural Community's Involvement in the Design and Usability Testing of a Computer-Based Informed Consent Process for the Personalized Medicine Research Project

    PubMed Central

    Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A

    2014-01-01

    Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research on multimedia consent has shown mixed success in improving participants’ understanding, satisfaction, and retention. This failure may be due to the lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to easily complete the consent, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, such as consents for genetic clinical trials. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants.
The next step will involve contacting participants to measure whether understanding of what they consented to is retained over time. PMID:24273095

  5. A rural community's involvement in the design and usability testing of a computer-based informed consent process for the Personalized Medicine Research Project.

    PubMed

    Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A

    2014-01-01

    Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research in multimedia consent is mixed in terms of success for improving participants' understanding, satisfaction, and retention. This failure may be due to a lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to easily complete the consent, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, including those for trials involving treatment of genetic disorders. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants. 
The next step will involve contacting participants to measure whether understanding of what they consented to is retained over time. © 2013 Wiley Periodicals, Inc.

  6. Community dialogues for child health: results from a qualitative process evaluation in three countries.

    PubMed

    Martin, Sandrine; Leitão, Jordana; Muhangi, Denis; Nuwa, Anthony; Magul, Dieterio; Counihan, Helen

    2017-06-05

    Across the developing world, countries are increasingly adopting the integrated community case management of childhood illnesses (iCCM) strategy in efforts to reduce child mortality. This intervention's effectiveness depends on community adoption and changes in care-seeking practices. We assessed the implementation process of a theory-driven community dialogue (CD) intervention specifically designed to strengthen the support and uptake of the newly introduced iCCM services and related behaviours in three African countries. A qualitative process evaluation methodology was chosen, using secondary project data and primary data collected in two districts of each of the three countries, in purposefully sampled communities. The final data set included 67 focus group discussions and 57 key informant interviews, totalling 642 respondents, including caregivers, CD facilitators, community leaders, and trainers. Thematic analysis of the data followed the 'Framework Approach', utilising both deductive and inductive processes. Results show that CDs contribute to triggering community uptake of and support for iCCM services by filling health information gaps and building cooperation within communities. We found it to be an effective approach for addressing social norms around child care practices. This approach was embraced by communities for its flexibility and value in planning individual and collective change. Regular CDs can contribute to the formation of new habits, particularly in relation to seeking timely care in cases of child sickness. This study also confirms the value of process evaluation for unwrapping the mechanisms of community mobilisation approaches in context and provides key insights for improving the CD approach.

  7. How can knowledge discovery methods uncover spatio-temporal patterns in environmental data?

    NASA Astrophysics Data System (ADS)

    Wachowicz, Monica

    2000-04-01

    This paper proposes the integration of KDD, GVis and STDB as a long-term strategy, which will allow users to apply knowledge discovery methods for uncovering spatio-temporal patterns in environmental data. The main goal is to combine innovative techniques and associated tools for exploring very large environmental data sets in order to arrive at valid, novel, potentially useful, and ultimately understandable spatio-temporal patterns. The GeoInsight approach is described using the principles and key developments in the research domains of KDD, GVis, and STDB. The GeoInsight approach aims at the integration of these research domains in order to provide tools for performing information retrieval, exploration, analysis, and visualization. The result is a knowledge-based design, which involves visual thinking (perceptual-cognitive process) and automated information processing (computer-analytical process).

  8. Sizing the science data processing requirements for EOS

    NASA Technical Reports Server (NTRS)

    Wharton, Stephen W.; Chang, Hyo D.; Krupp, Brian; Lu, Yun-Chi

    1991-01-01

    The methodology used in the compilation and synthesis of baseline science requirements associated with the 30+ EOS (Earth Observing System) instruments and over 2,400 EOS data products (both output and required input) proposed by EOS investigators is discussed. A brief background on EOS and the EOS Data and Information System (EOSDIS) is presented, and the approach is outlined in terms of a multilayer model. The methodology used to compile, synthesize, and tabulate requirements within the model is described. The principal benefit of this approach is the reduction of effort needed to update the analysis and maintain the accuracy of the science data processing requirements in response to changes in EOS platforms, instruments, data products, processing center allocations, or other model input parameters. The spreadsheets used in the model provide a compact representation, thereby facilitating review and presentation of the information content.

  9. Input-Based Approaches to Teaching Grammar: A Review of Classroom-Oriented Research.

    ERIC Educational Resources Information Center

    Ellis, Rod

    1999-01-01

    Examines the theoretical rationales (universal grammar, information-processing theories, skill-learning theories) for input-based grammar teaching and reviews classroom-oriented research (i.e., enriched-input studies, input-processing studies) that has integrated this option. (Author/VWL)

  10. Speech Communication Behavior; Perspectives and Principles.

    ERIC Educational Resources Information Center

    Barker, Larry L., Ed.; Kibler, Robert J., Ed.

    Readings are included on seven topics: 1) theories and models of communication processes, 2) acquisition and performance of communication behaviors, 3) human information processing and diffusion, 4) persuasion and attitude change, 5) psychophysiological approaches to studying communication, 6) interpersonal communication within transracial…

  11. Safeguards Approaches for Black Box Processes or Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diaz-Marcano, Helly; Gitau, Ernest TN; Hockert, John

    2013-09-25

    The objective of this study is to determine whether a safeguards approach can be developed for “black box” processes or facilities. These are facilities where a State or operator may limit IAEA access to specific processes or portions of a facility; in other cases, the IAEA may be denied access to the entire facility. The determination of whether a black box process or facility is safeguardable is dependent upon the details of the process type, design, and layout; the specific limitations on inspector access; and the restrictions placed upon the design information that can be provided to the IAEA. This analysis identified the necessary conditions for safeguardability of black box processes and facilities.

  12. Healthcare delivery systems: designing quality into health information systems.

    PubMed

    Joyce, Phil; Green, Rosamund; Winch, Graham

    2007-01-01

    To ensure that quality is 'engineered in', a holistic, integrated quality approach is required, and Total Quality Management (TQM) principles are the obvious foundation for this. This paper describes a novel approach to viewing the operations of a healthcare provider in which electronic means could be used to distribute information (including electronic fund settlements), built around the Full Service Provider core. Specifically, an approach called the "triple pair flow" model is used to provide a view of healthcare delivery that is integrated, yet detailed, and that combines the strategic enterprise view with a business process view.

  13. Signal Detection Theory-Based Information Processing for the Detection of Breast Cancer at Microwave Frequencies

    DTIC Science & Technology

    2002-08-01

    the measurement noise, as well as the physical model of the forward scattered electric field. The Bayesian algorithms for the Uncertain Permittivity...received at multiple sensors. In this research project a tissue-model-based signal-detection theory approach for the detection of mammary tumors in the...oriented information processors. In this research project a tissue-model-based signal detection theory approach for the detection of mammary tumors in the

  14. A Lean Approach to Improving SE Visibility in Large Operational Systems Evolution

    DTIC Science & Technology

    2013-06-01

    large health care system of systems. To enhance both visibility and flow, the approach utilizes visualization techniques, pull-scheduling processes...development processes. This paper describes an example implementation of the concept in a large health care system of systems. To enhance both visibility...and then provides the results to the requestor as soon as available. Hospital System Information Support Development The health care SoS is a set

  15. How Implementation of TQM and the Development of a Process Improvement Model, Within a Forward Support Battalion, Can Improve Preparation of the Material Condition Status Report (DA Form 2406)

    DTIC Science & Technology

    1990-12-01

    studies for the continuing education of managers new to the TQM approach, for informing vendors of their responsibilities under a changed process, and...Department of Defense (DoD) is adopting a management approach known as Total Quality Management (TQM) in an effort to improve quality and productivity...individuals selected be highly knowledgeable about the operations in their shop or unit. The main function of PATs is to collect and summarize process data for

  16. Health technology funding decision-making processes around the world: the same, yet different.

    PubMed

    Stafinski, Tania; Menon, Devidas; Philippon, Donald J; McCabe, Christopher

    2011-06-01

    All healthcare systems routinely make resource allocation decisions that trade off potential health gains to different patient populations. However, when such trade-offs relate to the introduction of new, promising health technologies, perceived 'winners' and 'losers' are more apparent. In recent years, public scrutiny over such decisions has intensified, raising the need to better understand how they are currently made and how they might be improved. The objective of this paper is to critically review and compare current processes for making health technology funding decisions at the regional, state/provincial and national level in 20 countries. A comprehensive search for published, peer-reviewed and grey literature describing actual national, state/provincial and regional/institutional technology decision-making processes was conducted. Information was extracted by two independent reviewers and tabulated to facilitate qualitative comparative analyses. To identify strengths and weaknesses of processes identified, websites of corresponding organizations were searched for commissioned reviews/evaluations, which were subsequently analysed using standard qualitative methods. A total of 21 national, four provincial/state and six regional/institutional-level processes were found. Although information on each one varied, they could be grouped into four sequential categories: (i) identification of the decision problem; (ii) information inputs; (iii) elements of the decision-making process; and (iv) public accountability and decision implementation. While information requirements of all processes appeared substantial and decision-making factors comprehensive, the way in which they were utilized was often unclear, as were approaches used to incorporate social values or equity arguments into decisions. 
A comprehensive inventory of approaches to implementing the four main components of all technology funding decision-making processes was compiled, from which areas for future work or research aimed at improving the acceptability of decisions were identified. They include the explication of decision criteria and social values underpinning processes.

  17. Students' Approaches to the Evaluation of Digital Information: Insights from Their Trust Judgments

    ERIC Educational Resources Information Center

    Johnson, Frances; Sbaffi, Laura; Rowley, Jennifer

    2016-01-01

    This study contributes to an understanding of the role of experience in the evaluation phase of the information search process. A questionnaire-based survey collected data from 1st and 3rd-year undergraduate students regarding the factors that influence their judgment of the trustworthiness of online health information. Exploratory and…

  18. A New Integrated Approach for the Transfer of Knowledge

    ERIC Educational Resources Information Center

    Lazanas, P.

    2006-01-01

    One of the purposes of knowledge generation at the higher education level is the creation of expertise. However, the mental structures that an expert uses to process information are not generally considered. Instead, information alone is presented to the learner and it is hoped that he or she will somehow integrate this information into knowledge…

  19. Implementation of Systematic Review Tools in IRIS

    EPA Pesticide Factsheets

    Currently, the number of chemicals present in the environment exceeds the ability of public health scientists to efficiently screen the available data in order to produce well-informed human health risk assessments in a timely manner. For this reason, the US EPA’s Integrated Risk Information System (IRIS) program has started implementing new software tools into the hazard characterization workflow. These automated tools aid in multiple phases of the systematic review process, including scoping and problem formulation, literature search, and identification and screening of available published studies. The increased availability of these tools lays the foundation for automating or semi-automating multiple phases of the systematic review process. Some of these software tools include modules to facilitate a structured approach to study quality evaluation of human and animal data, although approaches are generally lacking for assessing complex mechanistic information, in particular “omics”-based evidence; tools are starting to become available to evaluate these types of studies. We will highlight how new software programs, online tools, and approaches for assessing study quality can be better integrated to allow for a more efficient and transparent workflow of the risk assessment process, as well as identify tool gaps that would benefit future risk assessments. Disclaimer: The views expressed here are those of the authors and do not necessarily represent the view

  20. Situation Awareness Implications of Adaptive Automation of Air Traffic Controller Information Processing Functions

    NASA Technical Reports Server (NTRS)

    Kaber, David B.; McClernon, Christopher K.; Perry, Carlene M.; Segall, Noa

    2004-01-01

    The goal of this research was to define a measure of situation awareness (SA) in an air traffic control (ATC) task and to assess the influence of adaptive automation (AA) of various information processing functions on controller perception, comprehension and projection. The measure was also to serve as a basis for defining and developing an approach to triggering dynamic control allocations, as part of AA, based on controller SA. To achieve these objectives, an enhanced version of an ATC simulation (Multitask©) was developed for use in two human factors experiments. The simulation captured the basic functions of Terminal Radar Approach Control (TRACON) and was capable of presenting to operators four different modes of control, including information acquisition, information analysis, decision making and action implementation automation, as well as a completely manual control mode. The SA measure that was developed as part of the research was based on the Situation Awareness Global Assessment Technique (SAGAT), previous goal-directed task analyses of enroute control and TRACON, and a separate cognitive task analysis on the ATC simulation. The results of the analysis on Multitask were used as a basis for formulating SA queries as part of the SAGAT-based approach to measuring controller SA, which was used in the experiments. A total of 16 subjects were recruited for both experiments. Half the subjects were used in Experiment #1, which focused on assessing the sensitivity and reliability of the SA measurement approach in the ATC simulation. Comparisons were made of manual versus automated control. The remaining subjects were used in the second experiment, which was intended to more completely describe the SA implications of AA applied to specific controller information processing functions, and to describe how the measure could ultimately serve as a trigger of dynamic function allocations in the application of AA to ATC. 
Comparisons were made of the sensitivity of the SA measure to automation manipulations impacting both higher-order information processing functions, such as information analysis and decision making, versus lower-order functions, including information acquisition and action implementation. All subjects were exposed to all forms of AA of the ATC task and the manual control condition. The approach to AA used in both experiments was to match operator workload, assessed using a secondary task, to dynamic control allocations in the primary task. In total, the subjects in each experiment participated in 10 trials with each lasting between 45 minutes and 1 hour. In both experiments, ATC performance was measured in terms of aircraft cleared, conflicting, and collided. Secondary task (gauge monitoring) performance was assessed in terms of a hit-to-signal ratio. As part of the SA measure, three simulation freezes were conducted during each trial to administer queries on Level 1, 2, and 3 SA.
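
    The workload-matched triggering of dynamic control allocations described above can be sketched as a simple hysteresis rule on the secondary-task hit-to-signal ratio; the thresholds, mode names, and function below are illustrative assumptions, not the study's actual implementation.

```python
# The five control modes named in the study: manual control plus adaptive
# automation (AA) of four information processing functions.
MODES = ("manual", "information_acquisition", "information_analysis",
         "decision_making", "action_implementation")

def choose_allocation(hit_to_signal_ratio, current_mode, low=0.6, high=0.8):
    """Hysteresis rule: automate a function when secondary-task performance
    (a workload proxy) degrades; return to manual once it recovers."""
    if hit_to_signal_ratio < low and current_mode == "manual":
        return "information_analysis"   # hand a higher-order function to AA
    if hit_to_signal_ratio > high and current_mode != "manual":
        return "manual"
    return current_mode

mode = "manual"
trace = []
for ratio in (0.9, 0.7, 0.5, 0.55, 0.85):
    mode = choose_allocation(ratio, mode)
    trace.append(mode)
# trace → ['manual', 'manual', 'information_analysis',
#          'information_analysis', 'manual']
```

    The hysteresis band (distinct low/high thresholds) prevents the allocation from oscillating when workload hovers near a single cutoff.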

  1. Elaboration and formalization of current scientific knowledge of risks and preventive measures illustrated by colorectal cancer.

    PubMed

    Giorgi, R; Gouvernet, J; Dufour, J; Degoulet, P; Laugier, R; Quilichini, F; Fieschi, M

    2001-01-01

    Present the method used to elaborate and formalize current scientific knowledge to provide physicians with tools, available on the Internet, that enable them to evaluate individual patient risk and give personalized preventive recommendations or early screening measures. The approach suggested in this article is in line with medical procedures based on levels of evidence (Evidence-based Medicine). A cyclical process for developing recommendations allows us to quickly incorporate current scientific information. At each phase, the analysis is reevaluated by experts in the field collaborating on the project. The information is formalized through the use of levels of evidence and grades of recommendations. The GLIF model is used to implement recommendations for clinical practice guidelines. The most current scientific evidence is incorporated in a cyclical process that includes several steps: critical analysis according to the Evidence-based Medicine method; identification of predictive factors; setting up risk levels; identification of prevention measures; and elaboration of personalized recommendations. The information technology implementation of the clinical practice guideline enables physicians to quickly obtain personalized information for their patients. The case of colorectal cancer prevention illustrates our approach. Integration of current scientific knowledge is an important process. The delay between the moment new information arrives and the moment the practitioner applies it is thus reduced.

  2. Towards a Unified Approach to Information Integration - A review paper on data/information fusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitney, Paul D.; Posse, Christian; Lei, Xingye C.

    2005-10-14

    Fusion of information or data from different sources is ubiquitous in many applications, from epidemiology, medicine, biology, politics, and intelligence to military applications. Data fusion involves integration of spectral, imaging, text, and many other sensor data. For example, in epidemiology, information is often obtained from many studies conducted by different researchers in different regions with different protocols. In the medical field, the diagnosis of a disease is often based on imaging (MRI, X-Ray, CT), clinical examination, and lab results. In the biological field, information is obtained from studies conducted on many different species. In the military field, information is obtained from radar sensors, text messages, chemical and biological sensors, acoustic sensors, optical warning systems, and many other sources. Many methodologies are used in the data integration process, from classical and Bayesian methods to evidence-based expert systems. The implementation of the data integration ranges from pure software design to a mixture of software and hardware. In this review we summarize the methodologies and implementations of the data fusion process, and illustrate in more detail the methodologies involved in three examples. We propose a unified multi-stage and multi-path mapping approach to the data fusion process, and point out future prospects and challenges.
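
    As a concrete instance of the classical Bayesian methodologies the review surveys, independent Gaussian estimates of the same quantity can be fused by precision weighting; the sensor values below are made up for illustration, and this is a minimal sketch rather than the review's multi-stage, multi-path approach.

```python
import numpy as np

def fuse_gaussian(estimates, variances):
    """Fuse independent Gaussian estimates: the fused mean is the
    precision-weighted average, and the fused variance is the inverse
    of the summed precisions."""
    w = 1.0 / np.asarray(variances, dtype=float)   # precisions
    fused_mean = float(np.sum(w * np.asarray(estimates, dtype=float)) / w.sum())
    fused_var = float(1.0 / w.sum())
    return fused_mean, fused_var

# e.g. a radar range estimate and a less precise acoustic estimate
mean, var = fuse_gaussian([10.0, 12.0], [1.0, 4.0])
# mean → 10.4, var → 0.8: the result leans toward the more precise sensor
```

    The fused variance is always smaller than either input variance, which is the formal sense in which combining sources adds information.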

  3. Application of self-organizing maps to the study of U-Zr-Ti-Nb distribution in sandstone-hosted uranium ores

    NASA Astrophysics Data System (ADS)

    Klus, Jakub; Pořízka, Pavel; Prochazka, David; Mikysek, Petr; Novotný, Jan; Novotný, Karel; Slobodník, Marek; Kaiser, Jozef

    2017-05-01

    This paper presents a novel approach for processing the spectral information obtained from high-resolution elemental mapping performed by means of Laser-Induced Breakdown Spectroscopy. The proposed methodology is aimed at the description of possible elemental associations within a heterogeneous sample. High-resolution elemental mapping provides a large number of measurements. Moreover, a typical laser-induced plasma spectrum consists of several thousand spectral variables. Analysis of heterogeneous samples, where valuable information is hidden in a limited fraction of the sample mass, requires special treatment. The sample under study is a sandstone-hosted uranium ore that shows irregular distribution of ore elements such as zirconium, titanium, uranium and niobium. The presented processing methodology shows a way to reduce the dimensionality of the data while retaining the spectral information by utilizing self-organizing maps (SOM). The spectral information from the SOM is processed further to detect either simultaneous or isolated presence of elements. Conclusions suggested by the SOM are in good agreement with geological studies of mineralization phases performed at the deposit. Deeper investigation of the SOM results enables discrimination of interesting measurements and reveals new possibilities in the visualization of chemical mapping information. The suggested approach improves the description of elemental associations in mineral phases, which is crucial for the mining industry.
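
    A minimal sketch of the SOM step described above, assuming a small rectangular grid and synthetic stand-in "spectra"; the grid size, learning schedule, and data are illustrative, not the authors' settings.

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=30, lr0=0.5, sigma0=1.5, seed=0):
    """Minimal self-organizing map: each grid node holds a prototype
    spectrum; the best-matching unit (BMU) and its grid neighbours are
    pulled toward each training sample."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.normal(size=(rows * cols, data.shape[1]))
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)])
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)               # decaying learning rate
        sigma = sigma0 * (1.0 - epoch / epochs) + 1e-3  # shrinking neighbourhood
        for x in data:
            bmu = int(np.argmin(((weights - x) ** 2).sum(axis=1)))
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2.0 * sigma ** 2))        # neighbourhood kernel
            weights += lr * h[:, None] * (x - weights)
    return weights, coords

# two well-separated synthetic "spectral" populations, 8 variables each
data = np.vstack([np.random.default_rng(1).normal(0.0, 0.1, (20, 8)),
                  np.random.default_rng(2).normal(3.0, 0.1, (20, 8))])
weights, coords = train_som(data)
```

    After training, samples from the two populations map to different regions of the grid, which is what makes the map useful for spotting co-occurring versus isolated elemental signals.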

  4. Bush Encroachment Mapping for Africa - Multi-Scale Analysis with Remote Sensing and GIS

    NASA Astrophysics Data System (ADS)

    Graw, V. A. M.; Oldenburg, C.; Dubovyk, O.

    2015-12-01

    Bush encroachment describes a global problem that especially affects the savanna ecosystems of Africa. Livestock is directly affected by shrinking grasslands and the spread of inedible invasive species, the process that defines bush encroachment. For many small-scale farmers in developing countries, livestock represents a type of insurance in times of crop failure or drought. Beyond that, bush encroachment is also a problem for crop production. Studies on the mapping of bush encroachment have so far focused on small scales using high-resolution data and rarely provide information beyond the national level. Therefore, a process chain was developed using a multi-scale approach to detect bush encroachment for the whole of Africa. The bush encroachment map is calibrated with ground-truth data provided by experts in Southern, Eastern and Western Africa. By up-scaling location-specific information across different levels of remote sensing imagery - 30m with Landsat images and 250m with MODIS data - a map is created showing potential and actual areas of bush encroachment on the African continent, thereby providing an innovative approach to mapping bush encroachment at the regional scale. A classification approach links location data, based on GPS information from experts, to the respective pixels in the remote sensing imagery. Supervised classification is used, with actual bush encroachment observations serving as the training samples for the up-scaling. The classification technique is based on Random Forests and regression trees, a machine-learning classification approach. Working on multiple scales and with the help of field data, an innovative approach can be presented showing areas affected by bush encroachment on the African continent. This information can help to prevent further grassland decrease and identify those regions where land management strategies are of high importance to sustain livestock keeping and thereby also secure livelihoods in rural areas.
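
    The Random Forest idea behind the supervised up-scaling can be reduced to a toy sketch: bootstrap resampling plus a majority vote over weak learners. Decision stumps thresholded at the feature mean stand in for full trees, and the two-class synthetic data are illustrative, not the study's GPS-labelled pixels.

```python
import numpy as np

def fit_stump(X, y, rng):
    """Fit one weak learner on a bootstrap sample: the best single-feature
    mean-threshold split (a toy stand-in for a full decision tree)."""
    idx = rng.integers(0, len(X), len(X))              # bootstrap resample
    Xb, yb = X[idx], y[idx]
    best = None
    for f in range(X.shape[1]):
        t = Xb[:, f].mean()
        left, right = yb[Xb[:, f] <= t], yb[Xb[:, f] > t]
        if len(left) == 0 or len(right) == 0:
            continue
        ll, rl = int(round(left.mean())), int(round(right.mean()))
        acc = float((np.where(Xb[:, f] <= t, ll, rl) == yb).mean())
        if best is None or acc > best[0]:
            best = (acc, f, t, ll, rl)
    return best[1:]                                    # (feature, threshold, labels)

def predict(forest, X):
    """Majority vote across the ensemble."""
    votes = np.array([np.where(X[:, f] <= t, ll, rl) for f, t, ll, rl in forest])
    return (votes.mean(axis=0) > 0.5).astype(int)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (30, 3)),          # e.g. non-encroached pixels
               rng.normal(4.0, 1.0, (30, 3))])         # e.g. encroached pixels
y = np.array([0] * 30 + [1] * 30)
forest = [fit_stump(X, y, rng) for _ in range(15)]
```

    Each stump sees a different bootstrap sample, so the ensemble vote is more robust than any single split, which is the core of the bagging idea that Random Forests build on.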

  5. FROM ASSESSMENT TO POLICY--LESSONS LEARNED FROM THE U.S. NATIONAL ASSESSMENT (Journal Article)

    EPA Science Inventory

    The process of translating scientific information into timely and useful insights that inform policy and resource management decisions, despite the existence of uncertainties, is a difficult and challenging task. Policy-focused assessment is one approach to achieving this end. ...

  6. FROM ASSESSMENT TO POLICY: LESSONS LEARNED FROM THE U.S. NATIONAL ASSESSMENT

    EPA Science Inventory

    The process of translating scientific information into timely and useful insights that inform policy and resource management decisions, despite the existence of uncertainties, is a difficult and challenging task. Policy-focused assessment is one approach to achieving this end. I...

  7. A Menagerie of Tracks at Maryland: HARD, Enterprise, QA, and Genomics, Oh My!

    DTIC Science & Technology

    2006-01-01

    mutually agreeable search strategy for acquiring the desired information. Like information need negotiation in a reference interview, clarification...answer key to identify relevant nuggets in system responses. The obvious downside of this approach is that the process requires human intervention

  8. Neurocounseling: Brain-Based Clinical Approaches

    ERIC Educational Resources Information Center

    Field, Thomas A., Ed.; Jones, Laura K., Ed.; Russell-Chapin, Lori A.

    2017-01-01

    This text presents current, accessible information on enhancing the counseling process using a brain-based paradigm. Leading experts provide guidelines and insights for becoming a skillful neuroscience-informed counselor, making direct connections between the material covered and clinical practice. In this much-needed resource, the first to address…

  9. Development of a framework to improve the process of recruitment to randomised controlled trials (RCTs): the SEAR (Screened, Eligible, Approached, Randomised) framework.

    PubMed

    Wilson, Caroline; Rooshenas, Leila; Paramasivan, Sangeetha; Elliott, Daisy; Jepson, Marcus; Strong, Sean; Birtle, Alison; Beard, David J; Halliday, Alison; Hamdy, Freddie C; Lewis, Rebecca; Metcalfe, Chris; Rogers, Chris A; Stein, Robert C; Blazeby, Jane M; Donovan, Jenny L

    2018-01-19

    Research has shown that recruitment to trials is a process that stretches from identifying potentially eligible patients, through eligibility assessment, to obtaining informed consent. The length and complexity of this pathway means that many patients do not have the opportunity to consider participation. This article presents the development of a simple framework to document, understand and improve the process of trial recruitment. Eight RCTs integrated a QuinteT Recruitment Intervention (QRI) into the main trial, feasibility or pilot study. Part of the QRI required mapping the patient recruitment pathway using trial-specific screening and recruitment logs. A content analysis compared the logs to identify aspects of the recruitment pathway and process that were useful in monitoring and improving recruitment. Findings were synthesised to develop an optimised simple framework that can be used in a wide range of RCTs. The eight trials recorded basic information about patients screened for trial participation and randomisation outcome. Three trials systematically recorded reasons why an individual was not enrolled in the trial, and further details why they were not eligible or approached, or declined randomisation. A framework to facilitate clearer recording of the recruitment process and reasons for non-participation was developed: SEAR - Screening, to identify potentially eligible trial participants; Eligibility, assessed against the trial protocol inclusion/exclusion criteria; Approach, the provision of oral and written information and invitation to participate in the trial, and Randomised or not, with the outcome of randomisation or treatment received. The SEAR framework encourages the collection of information to identify recruitment obstacles and facilitate improvements to the recruitment process. SEAR can be adapted to monitor recruitment to most RCTs, but is likely to add most value in trials where recruitment problems are anticipated or evident. 
Further work to test it more widely is recommended.
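
    The four SEAR stages lend themselves to a simple structured log. The sketch below is a hypothetical Python illustration (the class and field names are our own, not from the article) of how a trial team might record each patient's last completed stage and the reason for non-participation:

    ```python
    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional

    class Stage(Enum):
        """The four SEAR stages described in the abstract."""
        SCREENED = "Screened"
        ELIGIBLE = "Eligible"
        APPROACHED = "Approached"
        RANDOMISED = "Randomised"

    @dataclass
    class SEARRecord:
        """One patient's progress through the recruitment pathway.

        `reason` captures why a patient left the pathway at `last_stage`
        (e.g. ineligible, not approached, declined randomisation).
        """
        patient_id: str
        last_stage: Stage
        reason: Optional[str] = None

    def dropout_summary(log):
        """Count non-randomised patients by the stage at which they left."""
        counts = {}
        for rec in log:
            if rec.last_stage is not Stage.RANDOMISED:
                counts[rec.last_stage] = counts.get(rec.last_stage, 0) + 1
        return counts
    ```

    A summary like `dropout_summary` is the kind of output that would pinpoint where in the pathway recruitment is being lost.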

  10. Developing a systematic approach to safer medication use during pregnancy: summary of a Centers for Disease Control and Prevention--convened meeting.

    PubMed

    Broussard, Cheryl S; Frey, Meghan T; Hernandez-Diaz, Sonia; Greene, Michael F; Chambers, Christina D; Sahin, Leyla; Collins Sharp, Beth A; Honein, Margaret A

    2014-09-01

    To address information gaps that limit informed clinical decisions on medication use in pregnancy, the Centers for Disease Control and Prevention (CDC) solicited expert input on a draft prototype outlining a systematic approach to evaluating the quality and strength of existing evidence for associated risks. The draft prototype outlined a process for the systematic review of available evidence and deliberations by a panel of experts to inform clinical decision making for managing health conditions in pregnancy. At an expert meeting convened by the CDC in January 2013, participants divided into working groups discussed decision points within the prototype. This report summarizes their discussions of best practices for formulating an expert review process, developing evidence summaries and treatment guidance, and disseminating information. There is clear recognition of current knowledge gaps and a strong collaboration of federal partners, academic experts, and professional organizations willing to work together toward safer medication use during pregnancy. Published by Elsevier Inc.

  11. Developing a systematic approach to safer medication use during pregnancy: summary of a Centers for Disease Control and Prevention—convened meeting

    PubMed Central

    Broussard, Cheryl S.; Frey, Meghan T.; Hernandez-Diaz, Sonia; Greene, Michael F.; Chambers, Christina D.; Sahin, Leyla; Collins Sharp, Beth A.; Honein, Margaret A.

    2015-01-01

    To address information gaps that limit informed clinical decisions on medication use in pregnancy, the Centers for Disease Control and Prevention (CDC) solicited expert input on a draft prototype outlining a systematic approach to evaluating the quality and strength of existing evidence for associated risks. The draft prototype outlined a process for the systematic review of available evidence and deliberations by a panel of experts to inform clinical decision making for managing health conditions in pregnancy. At an expert meeting convened by the CDC in January 2013, participants divided into working groups discussed decision points within the prototype. This report summarizes their discussions of best practices for formulating an expert review process, developing evidence summaries and treatment guidance, and disseminating information. There is clear recognition of current knowledge gaps and a strong collaboration of federal partners, academic experts, and professional organizations willing to work together toward safer medication use during pregnancy. PMID:24881821

  12. Restoration of color in a remote sensing image and its quality evaluation

    NASA Astrophysics Data System (ADS)

    Zhang, Zuxun; Li, Zhijiang; Zhang, Jianqing; Wang, Zhihe

    2003-09-01

    This paper focuses on the restoration of color remote sensing images (including airborne photographs). A complete approach is presented. It proposes that two main aspects must be addressed in restoring a remote sensing image: restoration of spatial information and restoration of photometric information. In this proposal, spatial information is restored by using the modulation transfer function (MTF) as the degradation function, where the MTF is obtained by measuring the edge curve of the original image. Photometric information is restored by an improved local maximum entropy algorithm. Furthermore, a practical approach to processing color remote sensing images is recommended: the color image is split into three monochromatic images corresponding to the three visible light bands, the three images are processed separately, and they are then synthesized under psychological color vision constraints. Finally, three novel evaluation variables based on image restoration are introduced to assess restoration quality in both the spatial and photometric domains. An evaluation is provided at last.
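
    The band-splitting strategy can be sketched as follows. Note that `restore_channel` here is only a stand-in (a plain contrast stretch); the paper's actual MTF deconvolution and local maximum-entropy steps are not reproduced:

    ```python
    import numpy as np

    def restore_channel(channel):
        """Placeholder for the per-band restoration step (in the paper,
        MTF-based deconvolution plus a local maximum-entropy algorithm);
        here simply a contrast stretch for illustration."""
        lo, hi = channel.min(), channel.max()
        if hi == lo:
            return np.zeros_like(channel, dtype=float)
        return (channel - lo) / (hi - lo)

    def restore_color(image):
        """Split an RGB image into its three bands, restore each band
        separately, and re-synthesise the color image."""
        bands = [restore_channel(image[..., i]) for i in range(3)]
        return np.stack(bands, axis=-1)
    ```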

  13. Preface to QoIS 2009

    NASA Astrophysics Data System (ADS)

    Comyn-Wattiau, Isabelle; Thalheim, Bernhard

    Quality assurance is a growing research domain within the Information Systems (IS) and Conceptual Modeling (CM) disciplines. Ongoing research on quality in IS and CM is highly diverse and encompasses theoretical aspects, including quality definition and quality models, and practical/empirical aspects, such as the development of methods, approaches and tools for quality measurement and improvement. Current research on quality also includes quality characteristics definitions, validation instruments, methodological and development approaches to quality assurance during software and information systems development, quality monitors, quality assurance during information systems development processes and practices, quality assurance both for data and (meta)schemata, quality support for information systems data import and export, quality of query answering, and cost/benefit analysis of quality assurance processes. Quality assurance also depends on the application area and on the specific requirements of applications in sectors such as health, logistics, the public sector, finance, manufacturing, services, e-commerce, and software. Furthermore, quality assurance must also be supported for data aggregation, ETL processes, web content management and other multi-layered applications. Quality assurance typically requires resources and therefore entails, besides its benefits, a computational and economic trade-off. It is thus also based on a compromise between the value of quality data and the cost of quality assurance.

  14. The Effects of Probe Similarity on Retrieval and Comparison Processes in Associative Recognition.

    PubMed

    Zhang, Qiong; Walsh, Matthew M; Anderson, John R

    2017-02-01

    In this study, we investigated the information processing stages underlying associative recognition. We recorded EEG data while participants performed a task that involved deciding whether a probe word triple matched any previously studied triple. We varied the similarity between probes and studied triples. According to a model of associative recognition developed in the Adaptive Control of Thought-Rational cognitive architecture, probe similarity affects the duration of the retrieval stage: Retrieval is fastest when the probe is similar to a studied triple. This effect may be obscured, however, by the duration of the comparison stage, which is fastest when the probe is not similar to the retrieved triple. Owing to the opposing effects of probe similarity on retrieval and comparison, overall RTs provide little information about each stage's duration. As such, we evaluated the model using a novel approach that decomposes the EEG signal into a sequence of latent states and provides information about the durations of the underlying information processing stages. The approach uses a hidden semi-Markov model to identify brief sinusoidal peaks (called bumps) that mark the onsets of distinct cognitive stages. The analysis confirmed that probe type has opposite effects on retrieval and comparison stages.
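
    The opposing stage effects can be made concrete with illustrative (entirely made-up) durations: similarity shortens retrieval but lengthens comparison, so the overall response times barely differ even though the underlying stages do:

    ```python
    # Hypothetical stage durations in milliseconds, chosen only to
    # illustrate the cancellation described in the abstract.
    stages = {
        "similar probe":    {"retrieval": 200, "comparison": 350},
        "dissimilar probe": {"retrieval": 320, "comparison": 240},
    }

    # Retrieval differs by 120 ms and comparison by 110 ms in the
    # opposite direction, so the summed RTs are nearly identical.
    totals = {probe: d["retrieval"] + d["comparison"]
              for probe, d in stages.items()}
    ```

    This is exactly why overall RTs are uninformative here and why a stage-resolved EEG decomposition is needed.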

  15. WPS mediation: An approach to process geospatial data on different computing backends

    NASA Astrophysics Data System (ADS)

    Giuliani, Gregory; Nativi, Stefano; Lehmann, Anthony; Ray, Nicolas

    2012-10-01

    The OGC Web Processing Service (WPS) specification allows generating information by processing distributed geospatial data made available through Spatial Data Infrastructures (SDIs). However, current SDIs have limited analytical capacities, and various problems emerge when trying to use them in data- and computing-intensive domains such as environmental sciences. These problems are usually not, or only partially, solvable using single computing resources. Therefore, the Geographic Information (GI) community is trying to benefit from the superior storage and computing capabilities offered by distributed-computing methods and technologies (e.g., Grids, Clouds). Currently, there is no commonly agreed approach to grid-enable WPS. No implementation allows one to seamlessly execute a geoprocessing calculation following user requirements on different computing backends, ranging from a stand-alone GIS server up to computer clusters and large Grid infrastructures. Considering this issue, this paper presents a proof of concept by mediating different geospatial and Grid software packages, and by proposing an extension of the WPS specification through two optional parameters. The applicability of this approach will be demonstrated using a Normalized Difference Vegetation Index (NDVI) mediated WPS process, highlighting benefits and issues that need to be further investigated to improve performance.
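
    The NDVI itself is a standard band ratio; a minimal version of the underlying computation, independent of any WPS backend, looks like this:

    ```python
    import numpy as np

    def ndvi(nir, red, eps=1e-12):
        """Normalized Difference Vegetation Index:
        NDVI = (NIR - Red) / (NIR + Red), with values in [-1, 1].
        eps guards against division by zero on dark pixels."""
        nir = np.asarray(nir, dtype=float)
        red = np.asarray(red, dtype=float)
        return (nir - red) / (nir + red + eps)
    ```

    In the mediated-WPS setting, a computation like this would be wrapped as a WPS process and dispatched to whichever backend the mediator selects.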

  16. Nested polynomial trends for the improvement of Gaussian process-based predictors

    NASA Astrophysics Data System (ADS)

    Perrin, G.; Soize, C.; Marque-Pucheu, S.; Garnier, J.

    2017-10-01

    The role of simulation keeps increasing for the sensitivity analysis and uncertainty quantification of complex systems. Such numerical procedures are generally based on the processing of a huge number of code evaluations. When the computational cost associated with one particular evaluation of the code is high, such direct approaches, based on the computer code only, are not affordable. Surrogate models therefore have to be introduced to interpolate the information given by a fixed set of code evaluations to the whole input space. When confronted with deterministic mappings, Gaussian process regression (GPR), or kriging, presents a good compromise between complexity, efficiency and error control. Such a method considers the quantity of interest of the system as a particular realization of a Gaussian stochastic process, whose mean and covariance functions have to be identified from the available code evaluations. In this context, this work proposes an innovative parametrization of the mean function, based on the composition of two polynomials. This approach is particularly relevant for the approximation of strongly nonlinear quantities of interest from very little information. After presenting the theoretical basis of this method, this work compares its efficiency to that of alternative approaches on a series of examples.
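
    A minimal sketch of the idea, assuming a 1-D input, a squared-exponential covariance, and fixed (not identified) polynomial coefficients: the trend is the composition of two polynomials, and the kriging predictor corrects it with the observed residuals.

    ```python
    import numpy as np

    def rbf(a, b, ell=1.0):
        """Squared-exponential covariance between 1-D input vectors."""
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

    def nested_trend(x, inner, outer):
        """Trend m(x) = p_outer(p_inner(x)): the composition of two
        polynomials (coefficients here are fixed for illustration,
        whereas the paper identifies them from the data)."""
        return np.polyval(outer, np.polyval(inner, x))

    def gpr_predict(x_train, y_train, x_test, inner, outer,
                    ell=0.5, noise=1e-8):
        """Kriging predictor with the nested-polynomial mean:
        mu(x*) = m(x*) + k(x*, X) K^{-1} (y - m(X))."""
        K = rbf(x_train, x_train, ell) + noise * np.eye(len(x_train))
        k_star = rbf(x_test, x_train, ell)
        resid = y_train - nested_trend(x_train, inner, outer)
        return nested_trend(x_test, inner, outer) + k_star @ np.linalg.solve(K, resid)
    ```

    With trend coefficients in `numpy.polyval` order (highest power first), `inner=[1.0, 0.0]` and `outer=[1.0, 0.0]` reduce to an identity trend, recovering ordinary kriging.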

  17. Five Faces of Cognition: Theoretical Influences on Approaches to Learning Disabilities.

    ERIC Educational Resources Information Center

    Hresko, Wayne P.; Reid, D. Kim

    1988-01-01

    The article points out that the label "cognitive" has been used to designate five substantially different approaches to learning disabilities: information processing, metacognition, genetic epistemology, cognitive behavior modification, and the specific-abilities model. Despite the similar label, the instructional interventions of these approaches…

  18. A Systematic Approach to Subgroup Classification in Intellectual Disability

    ERIC Educational Resources Information Center

    Schalock, Robert L.; Luckasson, Ruth

    2015-01-01

    This article describes a systematic approach to subgroup classification based on a classification framework and sequential steps involved in the subgrouping process. The sequential steps are stating the purpose of the classification, identifying the classification elements, using relevant information, and using clearly stated and purposeful…

  19. Cognitive Effects of Mindfulness Training: Results of a Pilot Study Based on a Theory Driven Approach

    PubMed Central

    Wimmer, Lena; Bellingrath, Silja; von Stockhausen, Lisa

    2016-01-01

    The present paper reports a pilot study which tested cognitive effects of mindfulness practice in a theory-driven approach. Thirty-four fifth graders received either a mindfulness training which was based on the mindfulness-based stress reduction approach (experimental group), a concentration training (active control group), or no treatment (passive control group). Based on the operational definition of mindfulness by Bishop et al. (2004), effects on sustained attention, cognitive flexibility, cognitive inhibition, and data-driven as opposed to schema-based information processing were predicted. These abilities were assessed in a pre-post design by means of a vigilance test, a reversible figures test, the Wisconsin Card Sorting Test, a Stroop test, a visual search task, and a recognition task of prototypical faces. Results suggest that the mindfulness training specifically improved cognitive inhibition and data-driven information processing. PMID:27462287

  20. Cognitive Effects of Mindfulness Training: Results of a Pilot Study Based on a Theory Driven Approach.

    PubMed

    Wimmer, Lena; Bellingrath, Silja; von Stockhausen, Lisa

    2016-01-01

    The present paper reports a pilot study which tested cognitive effects of mindfulness practice in a theory-driven approach. Thirty-four fifth graders received either a mindfulness training which was based on the mindfulness-based stress reduction approach (experimental group), a concentration training (active control group), or no treatment (passive control group). Based on the operational definition of mindfulness by Bishop et al. (2004), effects on sustained attention, cognitive flexibility, cognitive inhibition, and data-driven as opposed to schema-based information processing were predicted. These abilities were assessed in a pre-post design by means of a vigilance test, a reversible figures test, the Wisconsin Card Sorting Test, a Stroop test, a visual search task, and a recognition task of prototypical faces. Results suggest that the mindfulness training specifically improved cognitive inhibition and data-driven information processing.

  1. Experimental Study Comparing a Traditional Approach to Performance Appraisal Training to a Whole-Brain Training Method at C.B. Fleet Laboratories

    ERIC Educational Resources Information Center

    Selden, Sally; Sherrier, Tom; Wooters, Robert

    2012-01-01

    The purpose of this study is to examine the effects of a new approach to performance appraisal training. Motivated by split-brain theory and existing studies of cognitive information processing and performance appraisals, this exploratory study examined the effects of a whole-brain approach to training managers for implementing performance…

  2. Participatory Design, User Involvement and Health IT Evaluation.

    PubMed

    Kushniruk, Andre; Nøhr, Christian

    2016-01-01

    End user involvement and input into the design and evaluation of information systems has been recognized as being a critical success factor in the adoption of information systems. Nowhere is this need more critical than in the design of health information systems. Consistent with evidence from the general software engineering literature, the degree of user input into design of complex systems has been identified as one of the most important factors in the success or failure of complex information systems. The participatory approach goes beyond user-centered design and co-operative design approaches to include end users as more active participants in design ideas and decision making. Proponents of participatory approaches argue for greater end user participation in both design and evaluative processes. Evidence regarding the effectiveness of increased user involvement in design is explored in this contribution in the context of health IT. The contribution will discuss several approaches to including users in design and evaluation. Challenges in IT evaluation during participatory design will be described and explored along with several case studies.

  3. A method of demand-driven and data-centric Web service configuration for flexible business process implementation

    NASA Astrophysics Data System (ADS)

    Xu, Boyi; Xu, Li Da; Fei, Xiang; Jiang, Lihong; Cai, Hongming; Wang, Shuai

    2017-08-01

    Facing rapidly changing business environments, implementation of flexible business processes is crucial but difficult, especially in data-intensive application areas. This study aims to provide scalable and easily accessible information resources to leverage business process management. In this article, with a resource-oriented approach, enterprise data resources are represented as data-centric Web services, grouped on demand according to business requirements, and configured dynamically to adapt to changing business processes. First, a configurable architecture, CIRPA, involving an information resource pool is proposed to act as a scalable and dynamic platform for virtualising enterprise information resources as data-centric Web services. By exposing data-centric resources as REST services at larger granularities, tenant-isolated information resources can be accessed during business process execution. Second, a dynamic information resource pool is designed to fulfil configurable, on-demand data access in business process execution. CIRPA also isolates transaction data from business processes while supporting the composition of diverse business processes. Finally, a case study applying our method to a logistics application shows that CIRPA provides enhanced performance in both static service encapsulation and dynamic service execution in a cloud computing environment.
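
    The pooling idea can be sketched in a few lines of Python. This is a hypothetical illustration in the spirit of the abstract, not the paper's CIRPA implementation: resources are registered once and then grouped on demand for a particular process execution.

    ```python
    class ResourcePool:
        """Toy information-resource pool: data resources are registered
        under REST-style names and composed on demand."""

        def __init__(self):
            self._resources = {}

        def register(self, name, fetch):
            """Expose a data resource under a name."""
            self._resources[name] = fetch

        def compose(self, names):
            """Group the named resources for one process execution."""
            return {n: self._resources[n]() for n in names}

    # Illustrative logistics-style resources (made-up data).
    pool = ResourcePool()
    pool.register("orders", lambda: [{"id": 1, "status": "shipped"}])
    pool.register("fleet", lambda: [{"truck": "T-7", "free": True}])
    view = pool.compose(["orders", "fleet"])
    ```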

  4. Human Factors of CC-130 Operations. Volume 5: Human Factors in Decision Making

    DTIC Science & Technology

    1998-02-01

    ... known about human information processing and decision making. Topics for HFDM training come directly from this theoretical framework. ... The proposed training can be distinguished from other approaches with similar goals (either explicit or implicit) by its base within a theoretical framework of human information processing. The differences lie less in the content than in the way the material is organized and shaped by theory. ...

  5. NASA Systems Engineering Handbook

    NASA Technical Reports Server (NTRS)

    Hirshorn, Steven R.; Voss, Linda D.; Bromley, Linda K.

    2017-01-01

    The update of this handbook continues the methodology of the previous revision: a top-down compatibility with higher level Agency policy and a bottom-up infusion of guidance from the NASA practitioners in the field. This approach provides the opportunity to obtain best practices from across NASA and bridge the information to the established NASA systems engineering processes and to communicate principles of good practice as well as alternative approaches rather than specify a particular way to accomplish a task. The result embodied in this handbook is a top-level implementation approach on the practice of systems engineering unique to NASA. Material used for updating this handbook has been drawn from many sources, including NPRs, Center systems engineering handbooks and processes, other Agency best practices, and external systems engineering textbooks and guides. This handbook consists of six chapters: (1) an introduction, (2) a systems engineering fundamentals discussion, (3) the NASA program project life cycles, (4) systems engineering processes to get from a concept to a design, (5) systems engineering processes to get from a design to a final product, and (6) crosscutting management processes in systems engineering. The chapters are supplemented by appendices that provide outlines, examples, and further information to illustrate topics in the chapters. The handbook makes extensive use of boxes and figures to define, refine, illustrate, and extend concepts in the chapters.

  6. Information processing and dynamics in minimally cognitive agents.

    PubMed

    Beer, Randall D; Williams, Paul L

    2015-01-01

    There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we separately analyze the operation of this agent using the mathematical tools of information theory and dynamical systems theory. Information-theoretic analysis reveals how task-relevant information flows through the system to be combined into a categorization decision. Dynamical analysis reveals the key geometrical and temporal interrelationships underlying the categorization decision. Finally, we propose a framework for directly relating these two different styles of explanation and discuss the possible implications of our analysis for some of the ongoing debates in cognitive science. Copyright © 2014 Cognitive Science Society, Inc.

  7. Digital Image Processing in Private Industry.

    ERIC Educational Resources Information Center

    Moore, Connie

    1986-01-01

    Examines various types of private industry optical disk installations in terms of business requirements for digital image systems in five areas: records management; transaction processing; engineering/manufacturing; information distribution; and office automation. Approaches for implementing image systems are addressed as well as key success…

  8. Global Persistent Attack: A Systems Architecture, Process Modeling, and Risk Analysis Approach

    DTIC Science & Technology

    2008-06-01

    ... develop an analysis process for quantifying risk associated with the limitations presented by a fiscally constrained environment. The second step ... previous independent analysis of each force structure provided information for quantifying risk associated with the given force presentations ...

  9. The impact of teachers' approaches to teaching and students' learning styles on students' approaches to learning in college online biology courses

    NASA Astrophysics Data System (ADS)

    Hong, Yuh-Fong

    With the rapid growth of online courses in higher education institutions, research on the quality of learning in online courses is needed. However, there is a notable lack of research in the cited literature providing evidence that online distance education promotes the quality of independent learning to which it aspires. Previous studies focused on academic outcomes and technology applications, which do not monitor students' learning processes, such as their approaches to learning. Understanding students' learning processes and the factors influencing quality of learning will provide valuable information for instructors and institutions in providing quality online courses and programs. The purpose of this study was to identify and investigate college biology teachers' approaches to teaching and students' learning styles, and to examine the impact of approaches to teaching and learning styles on students' approaches to learning via online instruction. Data collection included eighty-seven participants from five online biology courses at a community college in the southern area of Texas. Data analysis showed the following results. First, there were significant differences in approaches to learning among students with different learning styles. Second, there was a significant difference in students' approaches to learning between classes using different approaches to teaching. Third, the impact of learning styles on students' approaches to learning was not influenced by instructors' approaches to teaching. Two conclusions were obtained from the results. First, individuals with the ability to perceive information abstractly might be more likely to adopt deep approaches to learning than those preferring to perceive information through concrete experience in online learning environments. Second, the Teaching Approach Inventory might not be suitable for measuring approaches to teaching in online biology courses due to online instructional design and technology limitations. 
Based on the findings and conclusions of this study, implications for distance education and future research are described.

  10. An Estimate of the Total DNA in the Biosphere

    PubMed Central

    Landenmark, Hanna K. E.; Forgan, Duncan H.; Cockell, Charles S.

    2015-01-01

    Modern whole-organism genome analysis, in combination with biomass estimates, allows us to estimate a lower bound on the total information content in the biosphere: 5.3 × 10^31 (±3.6 × 10^31) megabases (Mb) of DNA. Given conservative estimates regarding DNA transcription rates, this information content suggests biosphere processing speeds exceeding yottaNOPS values (10^24 Nucleotide Operations Per Second). Although prokaryotes evolved at least 3 billion years before plants and animals, we find that the information content of prokaryotes is similar to plants and animals at the present day. This information-based approach offers a new way to quantify anthropogenic and natural processes in the biosphere and its information diversity over time. PMID:26066900

  11. An Estimate of the Total DNA in the Biosphere.

    PubMed

    Landenmark, Hanna K E; Forgan, Duncan H; Cockell, Charles S

    2015-06-01

    Modern whole-organism genome analysis, in combination with biomass estimates, allows us to estimate a lower bound on the total information content in the biosphere: 5.3 × 10^31 (±3.6 × 10^31) megabases (Mb) of DNA. Given conservative estimates regarding DNA transcription rates, this information content suggests biosphere processing speeds exceeding yottaNOPS values (10^24 Nucleotide Operations Per Second). Although prokaryotes evolved at least 3 billion years before plants and animals, we find that the information content of prokaryotes is similar to plants and animals at the present day. This information-based approach offers a new way to quantify anthropogenic and natural processes in the biosphere and its information diversity over time.

  12. Combination of uncertainty theories and decision-aiding methods for natural risk management in a context of imperfect information

    NASA Astrophysics Data System (ADS)

    Tacnet, Jean-Marc; Dupouy, Guillaume; Carladous, Simon; Dezert, Jean; Batton-Hubert, Mireille

    2017-04-01

    In mountain areas, natural phenomena such as snow avalanches, debris flows and rock falls put people and objects at risk, sometimes with dramatic consequences. Risk is classically considered a combination of hazard (itself a combination of the intensity and frequency of the phenomenon) and vulnerability, which corresponds to the consequences of the phenomenon for exposed people and material assets. Risk management consists of identifying the risk level and choosing the best strategies for risk prevention, i.e. mitigation. In the context of natural phenomena in mountainous areas, technical and scientific knowledge is often lacking. Risk management decisions are therefore based on imperfect information. This information comes from more or less reliable sources, ranging from historical data and expert assessments to numerical simulations. Risk management decisions are thus the result of complex knowledge management and reasoning processes. Tracing information and propagating information quality from data acquisition to decisions are therefore important steps in the decision-making process. One major goal today is to assist decision-making while considering the availability, quality and reliability of information content and sources. A global integrated framework is proposed to improve the risk management process in a context of information imperfection provided by more or less reliable sources: uncertainty as well as imprecision, inconsistency and incompleteness are considered. Several methods are used and combined in an original way: sequential decision context description, development of specific multi-criteria decision-making methods, imperfection propagation in numerical modeling, and information fusion. This framework not only assists in decision-making but also traces the process and evaluates the impact of information quality on decision-making. We focus on two main developments. 
    The first relates to uncertainty and imprecision propagation in numerical modeling, using both the classical Monte Carlo probabilistic approach and the so-called hybrid approach based on possibility theory. The second deals with new multi-criteria decision-making methods that consider information imperfection, source reliability, importance and conflict, using fuzzy sets as well as possibility and belief function theories. The implemented methods handle imperfection propagation and information fusion in total aggregation methods such as AHP (Saaty, 1980), in partial aggregation methods such as the Electre outranking method (see Soft Electre Tri), and in decisions in certain as well as risky or uncertain contexts (see the new COWA-ER and FOWA-ER, Cautious and Fuzzy Ordered Weighted Averaging-Evidential Reasoning). For example, the ER-MCDA methodology treats expert assessment as a multi-criteria decision process based on imperfect information provided by more or less heterogeneous, reliable and conflicting sources: it combines AHP, fuzzy sets theory, possibility theory and belief function theory within the DSmT (Dezert-Smarandache Theory) framework, which provides powerful fusion rules.
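
    The classical Monte Carlo side of the first development can be sketched in a few lines: sample the imperfectly known inputs from assumed distributions, push the samples through the numerical model, and summarise the output spread. The model and distributions below are purely illustrative, not those used in the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def runout_model(volume, slope):
        """Stand-in for a numerical phenomenon model (a made-up form;
        actual debris-flow models are far more involved)."""
        return 0.5 * volume ** 0.6 * np.tan(slope)

    # Monte Carlo propagation: sample the uncertain inputs from
    # assumed distributions, then read the spread of the output.
    volume = rng.lognormal(mean=8.0, sigma=0.3, size=10_000)   # m^3
    slope = rng.uniform(0.3, 0.5, size=10_000)                 # rad
    out = runout_model(volume, slope)

    low, high = np.percentile(out, [5, 95])   # 90% output interval
    ```

    The hybrid approach mentioned above replaces some of these probability distributions with possibility distributions when the available information is too imprecise to justify a single probabilistic model.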

  13. IT strategic planning in hospitals: from theory to practice.

    PubMed

    Jaana, Mirou; Teitelbaum, Mari; Roffey, Tyson

    2014-07-01

    To date, IT strategic planning has been mostly theory-based, with limited information on "best practices" in this area. This study presents the process and outcomes of IT strategic planning undertaken at a pediatric hospital (PH) in Canada. A five-stage, sequential and incremental process was adopted. Various tools and approaches were used, including a review of existing documentation, an internal survey (n = 111), fifteen interviews, and twelve workshops. IT strategic planning was informed by 230 individuals (12 percent of the hospital community) and revealed consistency in the themes and concerns raised by participants (e.g., slow IT project delivery rate, lack of understanding of IT priorities, strained communication with IT staff). Mobile and remote access to patients' information and an integrated EMR were identified as top priorities. The methodology and approach used proved effective, improved internal relationships, and ensured commitment to the final IT strategic plan. Several lessons were learned, including: maintaining a dynamic approach capable of adapting to fast technology evolution; involving stakeholders and ensuring continuous communication; using effective research tools to support strategic planning; and grounding the process and final product in existing models. This study contributes to the development of "best practices" in IT strategic planning and illustrates "how" to apply the theoretical principles in this area. This is especially important as IT leaders are encouraged to integrate evidence-based management into their decision making and practices. The methodology and lessons learned may inform practitioners in other hospitals planning to engage in IT strategic planning in the future.

  14. Collaborating with Youth to Inform and Develop Tools for Psychotropic Decision Making

    PubMed Central

    Murphy, Andrea; Gardner, David; Kutcher, Stan; Davidson, Simon; Manion, Ian

    2010-01-01

    Introduction: Youth-oriented and youth-informed resources designed to support psychopharmacotherapeutic decision-making are essentially unavailable. This article outlines the approach taken to design such resources, the product that resulted from that approach, and the lessons learned from the process. Methods: A project team with psychopharmacology expertise was assembled. The project team reviewed best practices regarding medication educational materials and related tools to support decisions. Collaboration with key stakeholders, considered the primary end-users and target groups, occurred. A graphic designer and a plain language consultant were also retained. Results: Through an iterative and collaborative process over approximately 6 months, Med Ed and Med Ed Passport were developed. Literature and input from key stakeholders, in particular youth, were instrumental to the development of the tools and materials within Med Ed. A training program utilizing a train-the-trainer model was developed to facilitate the implementation of Med Ed in Ontario, which is currently ongoing. Conclusion: An evidence-informed process that includes youth and key stakeholder engagement is required for developing tools to support psychopharmacotherapeutic decision-making. The development process fostered an environment of reciprocity between the project team and key stakeholders. PMID:21037916

  15. Investigation of signal models and methods for evaluating structures of processing telecommunication information exchange systems under acoustic noise conditions

    NASA Astrophysics Data System (ADS)

    Kropotov, Y. A.; Belov, A. A.; Proskuryakov, A. Y.; Kolpakov, A. A.

    2018-05-01

    The paper considers models and methods for estimating signals during the transmission of information messages in telecommunication systems for audio exchange. One-dimensional probability distribution functions that can be used to separate useful signals from acoustic noise interference are presented. An approach to estimating the correlation and spectral functions of the parameters of acoustic signals is proposed, based on a parametric representation of the acoustic signals and of the noise components. The paper suggests an approach to improving the efficiency of interference cancellation and extracting the necessary information when processing signals from telecommunications systems. In this case, the suppression of acoustic noise is based on methods of adaptive filtering and adaptive compensation. The work also describes models of echo signals and the structure of subscriber devices in operational command telecommunications systems.
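    The adaptive filtering and adaptive compensation mentioned in this abstract are commonly realized with the least-mean-squares (LMS) algorithm. The sketch below is not the authors' implementation: it is a minimal pure-Python noise canceller with an illustrative filter length, step size, and synthetic signals, assuming a reference microphone that picks up only the interference.

```python
import math

def lms_cancel(primary, reference, n_taps=8, mu=0.01):
    """Adaptive noise cancellation with the LMS algorithm (illustrative sketch).

    primary   -- speech plus noise from the main microphone
    reference -- noise-only signal from a reference microphone
    Returns the error signal, which approximates the clean speech.
    """
    w = [0.0] * n_taps  # adaptive filter weights
    out = []
    for n in range(len(primary)):
        # most recent n_taps reference samples, zero-padded at the start
        x = [reference[n - k] if n - k >= 0 else 0.0 for k in range(n_taps)]
        y = sum(wk * xk for wk, xk in zip(w, x))  # current noise estimate
        e = primary[n] - y                        # speech estimate (error signal)
        for k in range(n_taps):                   # LMS weight update
            w[k] += 2 * mu * e * x[k]
        out.append(e)
    return out

# Synthetic demo: a slow "speech" tone buried in a stronger interfering tone.
N = 2000
speech = [0.5 * math.sin(2 * math.pi * 0.01 * n) for n in range(N)]
noise = [math.sin(2 * math.pi * 0.07 * n) for n in range(N)]
primary = [s + v for s, v in zip(speech, noise)]
cleaned = lms_cancel(primary, noise)
```

    After the filter converges, the residual interference in `cleaned` falls well below the interference level in `primary`, while the speech component passes through because it is uncorrelated with the reference.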

  16. Imagining roles for epigenetics in health promotion research.

    PubMed

    McBride, Colleen M; Koehly, Laura M

    2017-04-01

    Discoveries from the Human Genome Project have invigorated discussions of the effects of epigenetics (modifiable chemical processes that influence DNA's ability to give instructions to turn gene expression on or off) on health outcomes. We suggest three domains in which new understandings of epigenetics could inform innovations in health promotion research: (1) increase the motivational potency of health communications (e.g., explaining individual differences in health outcomes to interrupt optimistic biases about health exposures); (2) illuminate new approaches to targeted and tailored health promotion interventions (e.g., relapse prevention targeted to epigenetic responses to intervention participation); and (3) inform more sensitive measures of intervention impact (e.g., replacing or augmenting self-reported adherence). We suggest a three-step process for using epigenetics in health promotion research that emphasizes integrating epigenetic mechanisms into conceptual model development, which then informs the selection of intervention approaches and outcomes. Lastly, we pose examples of relevant scientific questions worth exploring.

  17. Computer-Aided TRIZ Ideality and Level of Invention Estimation Using Natural Language Processing and Machine Learning

    NASA Astrophysics Data System (ADS)

    Adams, Christopher; Tate, Derrick

    Patent textual descriptions provide a wealth of information that can be used to understand the underlying design approaches that result in the generation of novel and innovative technology. This article will discuss a new approach for estimating Degree of Ideality and Level of Invention metrics from the theory of inventive problem solving (TRIZ) using patent textual information. Patent text includes information that can be used to model both the functions performed by a design and the associated costs and problems that affect a design’s value. The motivation of this research is to use patent data with calculation of TRIZ metrics to help designers understand which combinations of system components and functions result in creative and innovative design solutions. This article will discuss in detail methods to estimate these TRIZ metrics using natural language processing and machine learning with the use of neural networks.

  18. Video Analysis and Remote Digital Ethnography: Approaches to understanding user perspectives and processes involving healthcare information technology.

    PubMed

    Kushniruk, Andre W; Borycki, Elizabeth M

    2015-01-01

    Innovations in healthcare information systems promise to revolutionize and streamline healthcare processes worldwide. However, the complexity of these systems and the need to better understand issues related to human-computer interaction have slowed progress in this area. In this chapter the authors describe their work in using methods adapted from usability engineering, video ethnography and analysis of digital log files for improving our understanding of complex real-world healthcare interactions between humans and technology. The approaches taken are cost-effective and practical and can provide detailed ethnographic data on issues health professionals and consumers encounter while using systems as well as potential safety problems. The work is important in that it can be used in techno-anthropology to characterize complex user interactions with technologies and also to provide feedback into redesign and optimization of improved healthcare information systems.

  19. Mixture and odorant processing in the olfactory systems of insects: a comparative perspective.

    PubMed

    Clifford, Marie R; Riffell, Jeffrey A

    2013-11-01

    Natural olfactory stimuli are often complex mixtures of volatiles, of which the identities and ratios of constituents are important for odor-mediated behaviors. Despite this importance, the mechanism by which the olfactory system processes this complex information remains an area of active study. In this review, we describe recent progress in how odorants and mixtures are processed in the brain of insects. We use a comparative approach toward contrasting olfactory coding and the behavioral efficacy of mixtures in different insect species, and organize these topics around four sections: (1) Examples of the behavioral efficacy of odor mixtures and the olfactory environment; (2) mixture processing in the periphery; (3) mixture coding in the antennal lobe; and (4) evolutionary implications and adaptations for olfactory processing. We also include pertinent background information about the processing of individual odorants and comparative differences in wiring and anatomy, as these topics have been richly investigated and inform the processing of mixtures in the insect olfactory system. Finally, we describe exciting studies that have begun to elucidate the role of the processing of complex olfactory information in evolution and speciation.

  20. Comprehensive process model of clinical information interaction in primary care: results of a "best-fit" framework synthesis.

    PubMed

    Veinot, Tiffany C; Senteio, Charles R; Hanauer, David; Lowery, Julie C

    2018-06-01

    To describe a new, comprehensive process model of clinical information interaction in primary care (Clinical Information Interaction Model, or CIIM) based on a systematic synthesis of published research. We used the "best fit" framework synthesis approach. Searches were performed in PubMed, Embase, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), PsycINFO, Library and Information Science Abstracts, Library, Information Science and Technology Abstracts, and Engineering Village. Two authors reviewed articles according to inclusion and exclusion criteria. Data abstraction and content analysis of 443 published papers were used to create a model in which every element was supported by empirical research. The CIIM documents how primary care clinicians interact with information as they make point-of-care clinical decisions. The model highlights 3 major process components: (1) context, (2) activity (usual and contingent), and (3) influence. Usual activities include information processing, source-user interaction, information evaluation, selection of information, information use, clinical reasoning, and clinical decisions. Clinician characteristics, patient behaviors, and other professionals influence the process. The CIIM depicts the complete process of information interaction, enabling a grasp of relationships previously difficult to discern. The CIIM suggests potentially helpful functionality for clinical decision support systems (CDSSs) to support primary care, including a greater focus on information processing and use. The CIIM also documents the role of influence in clinical information interaction; influencers may affect the success of CDSS implementations. The CIIM offers a new framework for achieving CDSS workflow integration and new directions for CDSS design that can support the work of diverse primary care clinicians.

  1. Principles of operating room organization.

    PubMed

    Watkins, W D

    1997-01-01

    The changing health care climate has triggered important changes in the management of high-cost components of acute care facilities. By integrating and better managing various elements of the surgical process, health care institutions are able to rationally trim costs while maintaining high-quality services. The leadership that physicians can provide is crucial to the success of this undertaking (1). The importance of the use of primary data related to patient throughput and related resources should be strongly emphasized, for only when such data are converted to INFORMATION of functional value can participating healthcare personnel be reasonably expected to anticipate and respond to varying clinical demands with ever-limited resources. Despite the claims of specific commercial vendors, no single product will likely be sufficient to significantly change the perioperative process to the degree or for the duration demanded by healthcare reform. The most effective approach to achieving safety, cost-effectiveness, and predictable process in the realm of Surgical Services will occur by appropriate application of the "best of breed" contributions of: (a) medical/patient safety practice/oversight; (b) information technology; (c) contemporary management; and (d) innovative and functional cost-accounting methodology. A "modified activity-based cost accounting method" can serve as the basis for acquiring true direct-cost information related to the perioperative process. The proposed overall management strategy emphasizes process and feedback, rather than specific product, and although imposing initial demands and change on the traditional hospital setting, can advance the strongest competitive position in perioperative services. This comprehensive approach comprises a functional basis for important benchmarking activities among multiple surgical services. An active, comparative process of this type is of paramount importance in emphasizing patient care and safety as the highest priority while changing the process and cost of perioperative care. Additionally, this approach objectively defines the surgical process in terms by which the impact of new treatments, drugs, devices and process changes can be assessed rationally.

  2. Retrospective: lessons learned from the Santa Barbara project and their implications for health information exchange.

    PubMed

    Frohlich, Jonah; Karp, Sam; Smith, Mark D; Sujansky, Walter

    2007-01-01

    Despite its closure in December 2006, the Santa Barbara County Care Data Exchange helped focus national attention on the value of health information exchange (HIE). This in turn led to the federal government's plan to establish regional health information organizations (RHIOs). During its existence, the project pioneered innovative approaches, including certification of health information technology vendors, a community-wide governance model, and deployment of a peer-to-peer technical model now in wider use. RHIO efforts will benefit from the project's lessons about the need for an incremental development approach, rigorous implementation processes, early attention to privacy and liability concerns, and planning for a sustainable business model.

  3. Formalization of software requirements for information systems using fuzzy logic

    NASA Astrophysics Data System (ADS)

    Yegorov, Y. S.; Milov, V. R.; Kvasov, A. S.; Sorokoumova, S. N.; Suvorova, O. V.

    2018-05-01

    The paper considers an approach to the design of information systems based on flexible software development methodologies. The possibility of improving life-cycle management of information systems by assessing the functional relationship between requirements and business objectives is described. An approach is proposed to establish the relationship between the degree of achievement of business objectives and the fulfillment of requirements for the projected information system. It describes solutions that allow one to formalize the process of forming functional and non-functional requirements with the help of the fuzzy logic apparatus. The form of the objective function is derived from expert knowledge and refined via learning from a very small data set.
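    One way to read the proposal is as a fuzzy mapping from per-requirement fulfillment to business-objective achievement. The sketch below is an assumption-laden illustration, not the authors' method: the linguistic terms, rule outputs, requirement names, and weights are all hypothetical, and the aggregation is a simple zero-order Sugeno scheme.

```python
def triangular(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic terms for "degree of requirement fulfillment" in [0, 1].
TERMS = {
    "low": lambda x: triangular(x, -0.001, 0.0, 0.5),
    "medium": lambda x: triangular(x, 0.0, 0.5, 1.0),
    "high": lambda x: triangular(x, 0.5, 1.0, 1.001),
}

# Zero-order Sugeno rules: each term contributes a crisp achievement level.
RULE_OUTPUT = {"low": 0.1, "medium": 0.5, "high": 0.9}

def objective_achievement(fulfillments, weights):
    """Aggregate fuzzy per-requirement fulfillment into one business-objective score."""
    total, score = 0.0, 0.0
    for req, x in fulfillments.items():
        w = weights.get(req, 1.0)  # expert-assigned importance of the requirement
        num = sum(mu(x) * RULE_OUTPUT[t] for t, mu in TERMS.items())
        den = sum(mu(x) for mu in TERMS.values())
        total += w
        score += w * (num / den if den else 0.0)
    return score / total if total else 0.0

# Hypothetical requirements with measured fulfillment degrees and expert weights.
reqs = {"auth_response_time": 0.9, "report_export": 0.4, "audit_trail": 0.7}
wts = {"auth_response_time": 3.0, "report_export": 1.0, "audit_trail": 2.0}
print(round(objective_achievement(reqs, wts), 3))  # → 0.7
```

    In practice the rule outputs and weights would be elicited from experts and then tuned against the small data set the abstract mentions.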

  4. A generalized model via random walks for information filtering

    NASA Astrophysics Data System (ADS)

    Ren, Zhuo-Ming; Kong, Yixiu; Shang, Ming-Sheng; Zhang, Yi-Cheng

    2016-08-01

    There could exist a simple general mechanism lurking beneath the collaborative filtering and interdisciplinary physics approaches that have been successfully applied to online E-commerce platforms. Motivated by this idea, we propose a generalized model employing the dynamics of a random walk on bipartite networks. By taking degree information into account, the generalized model can recover collaborative filtering, the interdisciplinary physics approaches, and even broad extensions of them. Furthermore, we analyze the generalized model with single and hybrid degree information in the random-walk process on bipartite networks, and propose a possible strategy that applies hybrid degree information to objects of different popularity to achieve promising recommendation precision.
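    The degree-weighted random walk described here can be made concrete. The following pure-Python function implements the well-known degree-hybrid of mass diffusion (ProbS) and heat conduction (HeatS) on a user-item bipartite graph; the dataset is a made-up toy example, and this is an illustration of the family of methods the abstract generalizes, not the paper's exact model.

```python
def hybrid_scores(ratings, target, lam=0.5):
    """Degree-hybrid random-walk recommendation on a user-item bipartite graph.

    ratings -- dict mapping user -> set of collected items
    lam     -- 1.0 gives mass diffusion (ProbS); 0.0 gives heat conduction (HeatS)
    Returns {item: score} for items the target user has not collected.
    """
    k_item = {}  # item degrees
    for items in ratings.values():
        for it in items:
            k_item[it] = k_item.get(it, 0) + 1
    collected = ratings[target]
    scores = {}
    for other, items in ratings.items():
        if other == target:
            continue
        overlap = collected & items
        if not overlap:
            continue
        k_user = len(items)
        # resource flows collected item -> common user -> candidate item,
        # normalized by a tunable mix of the two item degrees
        for src in overlap:
            for dst in items - collected:
                w = 1.0 / (k_item[src] ** lam * k_item[dst] ** (1 - lam) * k_user)
                scores[dst] = scores.get(dst, 0.0) + w
    return scores

# Toy data: u1 should be recommended "d" (popular among similar users) over "e".
ratings = {
    "u1": {"a", "b", "c"},
    "u2": {"a", "b", "d"},
    "u3": {"c", "d"},
    "u4": {"b", "d", "e"},
}
scores = hybrid_scores(ratings, "u1", lam=1.0)  # pure mass diffusion
print(max(scores, key=scores.get))  # → d
```

    Sweeping `lam` between 0 and 1 reproduces the hybrid family the abstract alludes to, trading accuracy against recommendation diversity.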

  5. Expanding the Political Philosophy Dimension of the RISP Model: Examining the Conditional Indirect Effects of Cultural Cognition.

    PubMed

    Hmielowski, Jay D; Wang, Meredith Y; Donaway, Rebecca R

    2018-04-25

    This article attempts to connect literatures from the Risk Information Seeking and Processing (RISP) model and cultural cognition theory. We do this by assessing the relationship between the two prominent cultural cognition variables (i.e., group and grid) and risk perceptions. We then examine whether these risk perceptions are associated with three outcomes important to the RISP model: information seeking, systematic processing, and heuristic processing, through a serial mediation model. We used 2015 data collected from 10 communities across the United States to test our hypotheses. Our results show that people high on group and low on grid (egalitarian communitarians) show greater risk perceptions regarding water quality issues. Moreover, these higher levels of perceived risk translate into increased information seeking, systematic processing of information, and lower heuristic processing through intervening variables from the RISP model (e.g., negative emotions and information insufficiency). These results extend the extant literature by expanding on the treatment of political ideology within the RISP model literature and taking a more nuanced approach to political beliefs in accordance with the cultural cognitions literature. Our article also expands on the RISP literature by looking at information-processing variables. © 2018 Society for Risk Analysis.

  6. A Framework for Preliminary Design of Aircraft Structures Based on Process Information. Part 1

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    1998-01-01

    This report discusses the general framework and development of a computational tool for preliminary design of aircraft structures based on process information. The described methodology is suitable for multidisciplinary design optimization (MDO) activities associated with integrated product and process development (IPPD). The framework consists of three parts: (1) product and process definitions; (2) engineering synthesis; and (3) optimization. The product and process definitions are part of the input information provided by the design team. The backbone of the system is its ability to analyze a given structural design for performance as well as manufacturability and cost assessment. The system uses a database on material systems and manufacturing processes. Based on the identified set of design variables and an objective function, the system is capable of performing optimization subject to manufacturability, cost, and performance constraints. The accuracy of the manufacturability measures and cost models discussed here depends largely on the available data on specific methods of manufacture and assembly and associated labor requirements. As such, our focus in this research has been on the methodology itself and not so much on its accurate implementation in an industrial setting. A three-tier approach is presented for an IPPD-MDO based design of aircraft structures. The variable-complexity cost estimation methodology and an approach for integrating manufacturing cost assessment into the design process are also discussed. This report is presented in two parts. In the first part, the design methodology is presented and the computational design tool is described. In the second part, a prototype model of the preliminary design Tool for Aircraft Structures based on Process Information (TASPI) is described. Part two also contains an example problem that applies the methodology described here to the evaluation of six different design concepts for a wing spar.
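    The optimization step (minimizing a structural objective subject to manufacturability, cost, and performance constraints) can be sketched with a simple penalty formulation. The numbers below are invented placeholders, not TASPI data; the sketch only shows the shape of such a constrained sizing problem.

```python
def penalized_objective(t, penalty=1e3):
    """Toy spar sizing: minimize weight over thickness t, with quadratic
    penalties for violated stress and manufacturing-cost constraints.
    All coefficients are illustrative placeholders."""
    weight = 50.0 * t          # structural weight grows with thickness
    stress = 12.0 / t          # stress falls as thickness grows
    cost = 20.0 + 30.0 * t     # manufacturing cost grows with thickness
    p = 0.0
    if stress > 8.0:           # performance constraint: stress <= 8
        p += (stress - 8.0) ** 2
    if cost > 80.0:            # cost constraint: cost <= 80
        p += (cost - 80.0) ** 2
    return weight + penalty * p

# Simple one-dimensional search over candidate thicknesses.
candidates = [i / 100 for i in range(100, 301)]  # 1.00 .. 3.00
best_t = min(candidates, key=penalized_objective)
print(best_t)  # → 1.5 (lightest design satisfying both constraints)
```

    A real IPPD-MDO tool would replace the closed-form weight, stress, and cost expressions with calls to its analysis modules and database, but the constrained structure of the search is the same.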

  7. Application of a responsive evaluation approach in medical education.

    PubMed

    Curran, Vernon; Christopher, Jeanette; Lemire, Francine; Collins, Alice; Barrett, Brendan

    2003-03-01

    This paper reports on the usefulness of a responsive evaluation model in evaluating the clinical skills assessment and training (CSAT) programme at the Faculty of Medicine, Memorial University of Newfoundland, Canada. The purpose of this paper is to introduce the responsive evaluation approach, ascertain its utility, feasibility, propriety and accuracy in a medical education context, and discuss its applicability as a model for medical education programme evaluation. Robert Stake's original 12-step responsive evaluation model was modified and reduced to five steps, including: (1) stakeholder audience identification, consultation and issues exploration; (2) stakeholder concerns and issues analysis; (3) identification of evaluative standards and criteria; (4) design and implementation of evaluation methodology; and (5) data analysis and reporting. This modified responsive evaluation process was applied to the CSAT programme and a meta-evaluation was conducted to evaluate the effectiveness of the approach. The responsive evaluation approach was useful in identifying the concerns and issues of programme stakeholders, solidifying the standards and criteria for measuring the success of the CSAT programme, and gathering rich and descriptive evaluative information about educational processes. The evaluation was perceived to be human resource dependent in nature, yet was deemed to have been practical, efficient and effective in uncovering meaningful and useful information for stakeholder decision-making. Responsive evaluation is derived from the naturalistic paradigm and concentrates on examining the educational process rather than predefined outcomes of the process. Responsive evaluation results are perceived as having more relevance to stakeholder concerns and issues, and therefore more likely to be acted upon. 
Conducting an evaluation that is responsive to the needs of stakeholder groups will ensure that evaluative information is meaningful and more likely to be used for programme enhancement and improvement.

  8. First and Second Graders Writing Informational Text

    ERIC Educational Resources Information Center

    Read, Sylvia

    2005-01-01

    Process approaches to writing instruction in primary-grade classrooms have become widespread due to the influence of Graves (1983), Calkins (1986), Avery (1993), and others. Their work emphasizes expressive writing, particularly personal narrative, more than expository or informational writing. As a consequence, expressive writing is what children…

  9. Using rapid reviews: an example from a study conducted to inform policy-making.

    PubMed

    O'Leary, Denise F; Casey, Mary; O'Connor, Laserina; Stokes, Diarmuid; Fealy, Gerard M; O'Brien, Denise; Smith, Rita; McNamara, Martin S; Egan, Claire

    2017-03-01

    A discussion of the potential use of rapid review approaches in nursing and midwifery research which presents a worked example from a study conducted to inform policy decision-making. Rapid reviews, which can be defined as outputs of a knowledge synthesis approach that involves modifying or omitting elements of a systematic review process due to limited time or resources, are becoming increasingly popular in health research. This paper provides guidance on how a rapid review can be undertaken and discusses the strengths and challenges of the approach. Data from a rapid review of the literature undertaken in 2015 is used as a worked example to highlight one method of undertaking a rapid review. Seeking evidence to inform health policy-making or evidence based practice is a process that can be limited by time constraints, making it difficult to conduct comprehensive systematic reviews. Rapid reviews provide a solution as they are a systematic method of synthesizing evidence quickly. There is no single best way to conduct a rapid review but researchers can ensure they are adhering to best practice by being systematic, having subject and methodological expertise on the review team, reporting the details of the approach they took, highlighting the limitations of the approach, engaging in good evidence synthesis and communicating regularly with end users, other team members and experts. © 2016 John Wiley & Sons Ltd.

  10. Sensitivity Analysis and Optimization of Enclosure Radiation with Applications to Crystal Growth

    NASA Technical Reports Server (NTRS)

    Tiller, Michael M.

    1995-01-01

    In engineering, simulation software is often used as a convenient means for carrying out experiments to evaluate physical systems. The benefit of using simulations as 'numerical' experiments is that the experimental conditions can be easily modified and repeated at much lower cost than the comparable physical experiment. The goal of these experiments is to 'improve' the process or result of the experiment. In most cases, the computational experiments employ the same trial and error approach as their physical counterparts. When using this approach for complex systems, the cause and effect relationship of the system may never be fully understood and efficient strategies for improvement never utilized. However, it is possible when running simulations to accurately and efficiently determine the sensitivity of the system results with respect to simulation parameters (e.g., initial conditions, boundary conditions, and material properties) by manipulating the underlying computations. This results in a better understanding of the system dynamics and gives us efficient means to improve processing conditions. We begin by discussing the steps involved in performing simulations. Then we consider how sensitivity information about simulation results can be obtained and ways this information may be used to improve the process or result of the experiment. Next, we discuss optimization and the efficient algorithms which use sensitivity information. We draw on all this information to propose a generalized approach for integrating simulation and optimization, with an emphasis on software programming issues. After discussing our approach to simulation and optimization we consider an application involving crystal growth. This application is interesting because it includes radiative heat transfer. We discuss the computation of radiative view factors and the impact this mode of heat transfer has on our approach. Finally, we will demonstrate the results of our optimization.
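    The sensitivity-then-optimize loop described above can be approximated from outside any simulation with central differences, after which the sensitivities drive a gradient-based improvement of the processing conditions. The "simulation" below is a stand-in quadratic with invented parameters, not the report's crystal-growth model.

```python
def simulate(params):
    """Stand-in for an expensive crystal-growth simulation: returns a scalar
    quality measure to be maximized (numbers are purely illustrative)."""
    t_melt, pull_rate = params
    return -((t_melt - 1500.0) ** 2) / 100.0 - 50.0 * (pull_rate - 2.0) ** 2

def sensitivities(f, params, h=1e-4):
    """Central-difference sensitivity of the result w.r.t. each parameter."""
    grads = []
    for i in range(len(params)):
        up = list(params); up[i] += h
        dn = list(params); dn[i] -= h
        grads.append((f(up) - f(dn)) / (2 * h))
    return grads

# Gradient ascent driven by sensitivity information; per-parameter step sizes
# reflect the very different scales of the two parameters.
p = [1480.0, 2.3]
steps = [5.0, 0.005]
for _ in range(200):
    g = sensitivities(simulate, p)
    p = [pi + si * gi for pi, si, gi in zip(p, steps, g)]
print([round(x, 2) for x in p])  # → [1500.0, 2.0]
```

    The report's point is that differentiating the underlying computations directly is far cheaper and more accurate than this black-box finite-difference approach, but the role sensitivity information plays in the optimization loop is the same.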

  11. Approach to implementing a DICOM network: incorporate both economics and workflow adaptation

    NASA Astrophysics Data System (ADS)

    Beaver, S. Merritt; Sippel-Schmidt, Teresa M.

    1995-05-01

    This paper describes an approach to aid in the decision-making process for the justification and design of a digital image and information management system. It identifies key technical and clinical issues that need to be addressed by a healthcare institution during this process. Some of the issues identified here are quite controversial, and it may take a department months or years to determine solutions that meet its specific staffing, financial, and technical needs.

  12. Factory approach can streamline patient accounting.

    PubMed

    Rands, J; Muench, M

    1991-08-01

    Although they may seem fundamentally different, similarities exist between operations of factories and healthcare organizations' business offices. As a result, a patient accounting approach based on manufacturing firms' management techniques may help smooth healthcare business processes. Receivables performance management incorporates the Japanese techniques of "just-in-time" and total quality management to reduce unbilled accounts and information backlog and accelerate payment. A preliminary diagnostic assessment of a patient accounting process helps identify bottlenecks and set priorities for work flow.

  13. Dynamic Stimuli And Active Processing In Human Visual Perception

    NASA Astrophysics Data System (ADS)

    Haber, Ralph N.

    1990-03-01

    Theories of visual perception traditionally have considered a static retinal image to be the starting point for processing, and have considered processing to be both passive and a literal translation of that frozen, two-dimensional, pictorial image. This paper considers five problem areas in the analysis of human visually guided locomotion, in which the traditional approach is contrasted with newer ones that utilize dynamic definitions of stimulation and an active perceiver: (1) differentiation between object motion and self motion, and among the various kinds of self motion (e.g., eyes only, head only, whole body, and their combinations); (2) the sources and contents of visual information that guide movement; (3) the acquisition and performance of perceptual motor skills; (4) the nature of spatial representations, percepts, and the perceived layout of space; and (5) why the retinal image is a poor starting point for perceptual processing. These newer approaches argue that stimuli must be considered as dynamic: humans process the systematic changes in patterned light when objects move and when they themselves move. Furthermore, the processing of visual stimuli must be active and interactive, so that perceivers can construct panoramic and stable percepts from an interaction of stimulus information and expectancies of what is contained in the visual environment. These developments all suggest a very different approach to the computational analyses of object location and identification, and of the visual guidance of locomotion.

  14. Low-Income Low-Qualified Employees' Access to Workplace Learning

    ERIC Educational Resources Information Center

    McPherson, Rebecca; Wang, Jia

    2014-01-01

    Purpose: The purpose of this paper was to investigate the embedded process that enables or constrains low-income low-qualified employees' access to workplace learning in small organizations. Design/methodology/approach: Informed by the sociomaterial approach and cultural historical activity theory, this study adopted a qualitative cross-case study…

  15. Science Adventures with Children's Literature: A Thematic Approach.

    ERIC Educational Resources Information Center

    Fredericks, Anthony D.

    This guide provides background information on the development and implementation of thematic units that focus on a hands-on approach, process orientation, integrated curriculum, cooperative learning, and critical thinking. Topics of the thematic units and mini-units include wild animals, dinosaurs, rainforests, the human body, earth science,…

  16. Research on Adult Learning and Memory: Retrospect and Prospect.

    ERIC Educational Resources Information Center

    Hultsch, David F.; Pentz, C. A.

    1980-01-01

    Descriptions of cognitive development are determined by the metamodel on which theories and data are based. The associative and information processing approaches have generated much of the research on adult learning and memory. A contextual approach, emphasizing perceiving, comprehending, and remembering, is emerging in the present historical…

  17. Education for Sustainable Consumption through Mindfulness Training: Development of a Consumption-Specific Intervention

    ERIC Educational Resources Information Center

    Stanszus, Laura; Fischer, Daniel; Böhme, Tina; Frank, Pascal; Fritzsche, Jacomo; Geiger, Sonja; Harfensteller, Julia; Grossman, Paul; Schrader, Ulf

    2017-01-01

    Several widespread approaches to Education for Sustainable Consumption (ESC) have emerged from the tradition of consumer information. A major shortcoming of such cognitive-focused approaches is their limited capacity to facilitate reflection on the affective processes underpinning people's engagement with consumption. More holistic pedagogies are…

  18. A Method to Assess Climate-Relevant Decisions: Application in the Chesapeake Bay (2010 External Review Draft)

    EPA Science Inventory

    The goal of this study is to formalize an approach to inventory and analyze management decisions in order to produce useful information targeted toward effective adaptation to climate change. The approach uses as its starting point ongoing planning processes and decisions geared ...

  19. Reconstruction method for data protection in telemedicine systems

    NASA Astrophysics Data System (ADS)

    Buldakova, T. I.; Suyatinov, S. I.

    2015-03-01

    The report offers an approach to protecting transmitted data by creating paired symmetric keys for the sensor and the receiver. Since biosignals are unique to each person, appropriate processing of them yields the information needed to create cryptographic keys. Processing is based on reconstruction of the mathematical model generating time series that are diagnostically equivalent to the initial biosignals. Information about the model is transmitted to the receiver, where restoration of the physiological time series is performed using the reconstructed model. Thus, information about the structure and parameters of the biosystem model obtained in the reconstruction process can be used not only for diagnostics, but also for protecting transmitted data in telemedicine complexes.
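    The key-generation idea (both ends derive the same symmetric key from the parameters of a reconstructed signal model) can be sketched as follows. The AR(2) model, the synthetic "biosignal", and the quantization precision are illustrative assumptions, not the authors' actual reconstruction procedure.

```python
import hashlib
import math

def ar2_fit(x):
    """Least-squares fit of an AR(2) model x[n] = a1*x[n-1] + a2*x[n-2] + e[n]."""
    n_range = range(2, len(x))
    s11 = sum(x[n - 1] * x[n - 1] for n in n_range)
    s12 = sum(x[n - 1] * x[n - 2] for n in n_range)
    s22 = sum(x[n - 2] * x[n - 2] for n in n_range)
    b1 = sum(x[n] * x[n - 1] for n in n_range)
    b2 = sum(x[n] * x[n - 2] for n in n_range)
    det = s11 * s22 - s12 * s12  # solve the 2x2 normal equations
    return (b1 * s22 - b2 * s12) / det, (b2 * s11 - b1 * s12) / det

def key_from_model(coeffs, precision=3):
    """Derive a 256-bit symmetric key by hashing quantized model parameters;
    quantization lets both sides agree despite small numerical differences."""
    canonical = ",".join(f"{c:.{precision}f}" for c in coeffs)
    return hashlib.sha256(canonical.encode()).hexdigest()

# Demo: sensor and receiver fit the same model to diagnostically equivalent
# series and therefore derive the same key.
signal = [math.sin(0.2 * n) for n in range(500)]
key = key_from_model(ar2_fit(signal))
print(len(key))  # → 64 hex characters (256 bits)
```

    A deployed scheme would of course need a far richer biosignal model and an agreed protocol for parameter quantization, but the sketch shows how model reconstruction can double as key material.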

  20. Quality of Information Approach to Improving Source Selection in Tactical Networks

    DTIC Science & Technology

    2017-02-01

    consider the performance of this process based on metrics relating to quality of information: accuracy, timeliness, completeness and reliability. These... that are indicators that the network is meeting these quality requirements. We study effective data rate, social distance, link integrity and the... utility of information as metrics within a multi-genre network to determine the quality of information of its available sources. This paper proposes a

  1. Visual communications with side information via distributed printing channels: extended multimedia and security perspectives

    NASA Astrophysics Data System (ADS)

    Voloshynovskiy, Sviatoslav V.; Koval, Oleksiy; Deguillaume, Frederic; Pun, Thierry

    2004-06-01

    In this paper we address visual communications via printing channels from an information-theoretic point of view as communications with side information. The solution to this problem addresses important aspects of multimedia data processing, security and management, since printed documents are still the most common form of visual information representation. Two practical approaches to side information communications for printed documents are analyzed in the paper. The first approach represents a layered joint source-channel coding for printed documents. This approach is based on a self-embedding concept where information is first encoded assuming a Wyner-Ziv set-up and then embedded into the original data using a Gel'fand-Pinsker construction and taking into account properties of printing channels. The second approach is based on Wyner-Ziv and Berger-Flynn-Gray set-ups and assumes two separated communications channels where an appropriate distributed coding should be elaborated. The first printing channel is considered to be a direct visual channel for images ("analog" channel with degradations). The second "digital channel" with constrained capacity is considered to be an appropriate auxiliary channel. We demonstrate both theoretically and practically how one can benefit from this sort of "distributed paper communications".

  2. Improvement of radiology services based on the process management approach.

    PubMed

    Amaral, Creusa Sayuri Tahara; Rozenfeld, Henrique; Costa, Janaina Mascarenhas Hornos; Magon, Maria de Fátima de Andrade; Mascarenhas, Yvone Maria

    2011-06-01

    The health sector requires continuous investments to ensure the improvement of products and services from a technological standpoint, the use of new materials, equipment and tools, and the application of process management methods. Methods associated with the process management approach, such as the development of reference models of business processes, can provide significant innovations in the health sector and respond to the current market trend for modern management in this sector (Gunderman et al. (2008) [4]). This article proposes a process model for diagnostic medical X-ray imaging, from which it derives a primary reference model and describes how this information leads to gains in quality and improvements. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  3. Connecting Biology to Electronics: Molecular Communication via Redox Modality.

    PubMed

    Liu, Yi; Li, Jinyang; Tschirhart, Tanya; Terrell, Jessica L; Kim, Eunkyoung; Tsao, Chen-Yu; Kelly, Deanna L; Bentley, William E; Payne, Gregory F

    2017-12-01

    Biology and electronics are both expert at accessing, analyzing, and responding to information. Biology uses ions, small molecules, and macromolecules to receive, analyze, store, and transmit information, whereas electronic devices receive input in the form of electromagnetic radiation, process the information using electrons, and then transmit output as electromagnetic waves. Generating the capabilities to connect biological and electronic modalities offers exciting opportunities to shape the future of biosensors, point-of-care medicine, and wearable/implantable devices. Redox reactions offer unique opportunities for bio-device communication that spans the molecular modalities of biology and the electrical modality of devices. Here, an approach is adopted that searches for redox information through interactive electrochemical probing analogous to sonar. The capabilities of this approach to access global chemical information as well as information about specific redox-active chemical entities are illustrated using recent examples. An example of the use of synthetic biology to recognize external molecular information, process this information through intracellular signal transduction pathways, and generate output responses that can be detected by electrical modalities is also provided. Finally, exciting results on the use of redox reactions to actuate biology are provided to illustrate that synthetic biology offers the potential to guide biological responses through electrical cues. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Analysis and Lessons Learned from an Online, Consultative Dialogue between Community Leaders and Climate Experts

    NASA Astrophysics Data System (ADS)

    Sylak-Glassman, E.; Clavin, C.

    2016-12-01

    Common approaches to climate resilience planning in the United States rely upon participatory planning approaches and dialogues between decision-makers, science translators, and subject matter experts. In an effort to explore alternative approaches to support community climate resilience planning, a pilot of a public-private collaboration called the Resilience Dialogues was held in February and March of 2016. The Resilience Dialogues pilot was an online, asynchronous conversation between community leaders and climate experts, designed to help communities begin the process of climate resilience planning. In order to identify lessons learned from the pilot, we analyzed the discourse of the facilitated dialogues, administered surveys, and conducted interviews with participants. Our analysis of the pilot suggests that participating community leaders found value in the consultative dialogue with climate experts, despite limited community-originated requests for climate information. Community leaders most often asked for advice regarding adaptation planning, including specific engineering guidance and advice on how to engage community members around the topic of resilience. Community leaders who had access to downscaled climate data asked experts how to incorporate the data into their existing planning processes. The guidance sought by community leaders during the pilot shows the large range of hurdles that communities face in using climate information to inform their decision-making processes. Having a forum that connects community leaders with relevant experts and with other community leaders familiar with both climate impacts and municipal planning processes would likely help communities accelerate their resilience efforts.

  5. The contract process: a methodology for negotiation. Part I.

    PubMed

    Kleinschmidt, W M

    1990-05-01

    This is the first of a three-part series on the contract process for acquiring a hospital information system product. Part One addresses negotiation methodology: points that will facilitate effective negotiation. Part Two will cover contract contents, focusing on those topics which must be included in a good contract. Part Three will discuss contract philosophy and contract management: subjects that are critical to the good rapport buyers and vendors want. The adversarial approach to the contract process is not the best approach. Rather, the process should be treated as a step in building a partnership and relationship in which both parties win.

  6. Screening for Child Sexual Exploitation in Online Sexual Health Services: An Exploratory Study of Expert Views.

    PubMed

    Spencer-Hughes, Victoria; Syred, Jonathan; Allison, Alison; Holdsworth, Gillian; Baraitser, Paula

    2017-02-14

    Sexual health services routinely screen for child sexual exploitation (CSE). Although sexual health services are increasingly provided online, there has been no research on the translation of the safeguarding function to online services. We studied expert practitioner views on safeguarding in this context. The aim was to document expert practitioner views on safeguarding in the context of an online sexual health service. We conducted semistructured interviews with lead professionals purposively sampled from local, regional, or national organizations with a direct influence over CSE protocols, child protection policies, and sexual health services. Interviews were analyzed by three researchers using a matrix-based analytic method. Our respondents described two different approaches to safeguarding. The "information-providing" approach considers that young people experiencing CSE will ask for help when they are ready from someone they trust. The primary function of the service is to provide information, provoke reflection, generate trust, and respond reliably to disclosure. The approach values online services as an anonymous space to test out disclosure without commitment. The "information-gathering" approach considers that young people may withhold information about exploitation. Therefore, services should seek out information to assess risk and initiate disclosure. This approach values face-to-face opportunities for individualized questioning and immediate referral. The information-providing approach is associated with confidential telephone support lines and the information-gathering approach with clinical services. The approach adopted online will depend on ethos and the range of services provided. Effective transition from online to clinic services after disclosure is an essential element of this process and further research is needed to understand and support this transition. 
©Victoria Spencer-Hughes, Jonathan Syred, Alison Allison, Gillian Holdsworth, Paula Baraitser. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 14.02.2017.

  7. Diffusion processes in tumors: A nuclear medicine approach

    NASA Astrophysics Data System (ADS)

    Amaya, Helman

    2016-07-01

    The number of counts used in nuclear medicine imaging techniques only provides physical information about the disintegration of the nuclei present in the radiotracer molecules taken up in a particular anatomical region; that information is not truly metabolic. For this reason, a mathematical method was used to find a correlation between the number of counts and 18F-FDG mass concentration. This correlation allows a better interpretation of the results obtained in the study of diffusive processes in an agar phantom, and based on it, an image from the PETCETIX DICOM sample image set from the OsiriX-viewer software was processed. PET-CT gradient-magnitude and Laplacian images can show direct information on diffusive processes for radiopharmaceuticals that enter cells by simple diffusion. In the case of the radiopharmaceutical 18F-FDG, it is necessary to include pharmacokinetic models to make a correct interpretation of the gradient-magnitude and Laplacian count images.
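
    The gradient-magnitude and Laplacian images mentioned in this abstract are straightforward to compute with central finite differences. The toy Gaussian "counts" image below is an assumption standing in for a PET slice; a negative Laplacian marks a local tracer concentration maximum, as at a diffusing source.

```python
# Illustrative sketch: gradient-magnitude and Laplacian maps of a 2D
# "counts" image via central finite differences. The toy image and
# unit grid spacing are assumptions, not the paper's data.
import math

def gradient_magnitude(img):
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y][x + 1] - img[y][x - 1]) / 2.0
            gy = (img[y + 1][x] - img[y - 1][x]) / 2.0
            out[y][x] = math.hypot(gx, gy)
    return out

def laplacian(img):
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = (img[y][x + 1] + img[y][x - 1]
                         + img[y + 1][x] + img[y - 1][x]
                         - 4.0 * img[y][x])
    return out

# A radially decaying Gaussian blob mimics tracer spreading from a source.
size = 9
img = [[math.exp(-((x - 4) ** 2 + (y - 4) ** 2) / 4.0)
        for x in range(size)] for y in range(size)]
grad = gradient_magnitude(img)
lap = laplacian(img)
print(grad[4][4], lap[4][4])  # flat peak: zero gradient, negative Laplacian
```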

  8. Activity-based costing via an information system: an application created for a breast imaging center.

    PubMed

    Hawkins, H; Langer, J; Padua, E; Reaves, J

    2001-06-01

    Activity-based costing (ABC) is a process that enables the estimation of the cost of producing a product or service. More accurate than traditional charge-based approaches, it emphasizes analysis of processes, and more specific identification of both direct and indirect costs. This accuracy is essential in today's healthcare environment, in which managed care organizations necessitate responsible and accountable costing. However, to be successfully utilized, it requires time, effort, expertise, and support. Data collection can be tedious and expensive. By integrating ABC with information management (IM) and systems (IS), organizations can take advantage of the process orientation of both, extend and improve ABC, and decrease resource utilization for ABC projects. In our case study, we have examined the process of a multidisciplinary breast center. We have mapped the constituent activities and established cost drivers. This information has been structured and included in our information system database for subsequent analysis.
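
    The core ABC arithmetic the abstract relies on is small: divide each activity's cost pool by its driver volume to get a rate, then charge each service for the driver units it consumes. The activities, pools, volumes, and the mammogram example below are invented for illustration, not taken from the case study.

```python
# Hedged sketch of the activity-based costing (ABC) calculation:
# assign activity cost pools to a service via cost drivers.
activities = {
    # activity: (cost pool in $, total driver volume, driver unit)
    "image acquisition": (120_000, 4_000, "exams"),
    "radiologist reading": (300_000, 5_000, "reports"),
    "scheduling": (50_000, 10_000, "calls"),
}

def driver_rate(activity):
    pool, volume, _unit = activities[activity]
    return pool / volume  # cost per driver unit

def service_cost(consumption):
    """Cost of one service given its driver consumption per activity."""
    return sum(driver_rate(a) * qty for a, qty in consumption.items())

# One screening mammogram consumes these driver quantities (assumed).
mammogram = {"image acquisition": 1, "radiologist reading": 1, "scheduling": 2}
cost = service_cost(mammogram)
print(round(cost, 2))  # 30.0 + 60.0 + 10.0 = 100.0
```

    Storing the activity table and driver consumption in the information system database, as the abstract describes, reduces this per-service costing to a query rather than a manual data-collection exercise.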

  9. Evidence development and publication planning: strategic process.

    PubMed

    Wittek, Michael R; Jo Williams, Mary; Carlson, Angeline M

    2009-11-01

    A number of decisions in the health care field rely heavily on published clinical evidence. A systematic approach to evidence development and publication planning is required to develop a portfolio of evidence that includes at minimum information on efficacy, safety, durability of effect, quality of life, and economic outcomes. The approach requires a critical assessment of available literature, identification of gaps in the literature, and a strategic plan to fill the gaps to ensure the availability of evidence demanded for clinical decisions, coverage/payment decisions and health technology assessments. The purpose of this manuscript is to offer a six-step strategic process leading to a portfolio of evidence that meets the informational needs of providers, payers, and governmental agencies concerning patient access to a therapy.

  10. Population-based imaging biobanks as source of big data.

    PubMed

    Gatidis, Sergios; Heber, Sophia D; Storz, Corinna; Bamberg, Fabian

    2017-06-01

    Advances of computational sciences over the last decades have enabled the introduction of novel methodological approaches in biomedical research. Acquiring extensive and comprehensive data about a research subject and subsequently extracting significant information has opened new possibilities in gaining insight into biological and medical processes. This so-called big data approach has recently found entrance into medical imaging and numerous epidemiological studies have been implementing advanced imaging to identify imaging biomarkers that provide information about physiological processes, including normal development and aging but also on the development of pathological disease states. The purpose of this article is to present existing epidemiological imaging studies and to discuss opportunities, methodological and organizational aspects, and challenges that population imaging poses to the field of big data research.

  11. Mass Spectrometry: A Technique of Many Faces

    PubMed Central

    Olshina, Maya A.; Sharon, Michal

    2016-01-01

    Protein complexes form the critical foundation for a wide range of biological processes; however, understanding the intricate details of their activities is often challenging. In this review we describe how mass spectrometry plays a key role in the analysis of protein assemblies and the cellular pathways in which they are involved. Specifically, we discuss how the versatility of mass spectrometric approaches provides unprecedented information on multiple levels. We demonstrate this on the ubiquitin-proteasome proteolytic pathway, a process that is responsible for protein turnover. We follow the various steps of this degradation route and illustrate the different mass spectrometry workflows that were applied for elucidating molecular information. Overall, this review aims to stimulate the integrated use of multiple mass spectrometry approaches for analyzing complex biological systems. PMID:28100928

  12. Using trauma informed care as a nursing model of care in an acute inpatient mental health unit: A practice development process.

    PubMed

    Isobel, Sophie; Edwards, Clair

    2017-02-01

    Without agreeing on an explicit approach to care, mental health nurses may resort to problem focused, task oriented practice. Defining a model of care is important but there is also a need to consider the philosophical basis of any model. The use of Trauma Informed Care as a guiding philosophy provides a robust framework from which to review nursing practice. This paper describes a nursing workforce practice development process to implement Trauma Informed Care as an inpatient model of mental health nursing care. Trauma Informed Care is an evidence-based approach to care delivery that is applicable to mental health inpatient units; while there are differing strategies for implementation, there is scope for mental health nurses to take on Trauma Informed Care as a guiding philosophy, a model of care or a practice development project within all of their roles and settings in order to ensure that it has considered, relevant and meaningful implementation. The principles of Trauma Informed Care may also offer guidance for managing workforce stress and distress associated with practice change. © 2016 Australian College of Mental Health Nurses Inc.

  13. Implementation of Building Information Modeling (BIM) in Construction: A Comparative Case Study

    NASA Astrophysics Data System (ADS)

    Rowlinson, Steve; Collins, Ronan; Tuuli, Martin M.; Jia, Yunyan

    2010-05-01

    The Building Information Modeling (BIM) approach is increasingly adopted in the coordination of construction projects, with a number of parties providing BIM services and software solutions. However, the empirical impact of BIM on the construction industry has yet to be investigated. This paper explores the interaction between BIM and the construction industry during its implementation, with a specific focus on the empirical impacts of BIM on the design and construction processes and on professional roles during the process. Two cases were selected from recent construction projects coordinated with BIM systems: the Venetian Casino project in Macau and the Cathay Pacific Cargo Terminal project in Hong Kong. The former case illustrates how conflicts that emerged during the design process and procurement were addressed by adopting a BIM approach. The latter demonstrates how the adoption of BIM altered the roles of the architect, contractor, and sub-contractors involved in the project. The impacts of BIM were critically reviewed and discussed.

  14. USING FORMATIVE RESEARCH TO DEVELOP A CONTEXT-SPECIFIC APPROACH TO INFORMED CONSENT FOR CLINICAL TRIALS

    PubMed Central

    Corneli, Amy L.; Bentley, Margaret E.; Sorenson, James R.; Henderson, Gail E.; van der Horst, Charles; Moses, Agnes; Nkhoma, Jacqueline; Tenthani, Lyson; Ahmed, Yusuf; Heilig, Charles M.; Jamieson, Denise J.

    2009-01-01

    Participant understanding is of particular concern when obtaining informed consent. Recommendations for improving understanding include disclosing information using culturally appropriate and innovative approaches. To increase the effectiveness of the consent process for a clinical trial in Malawi on interventions to prevent mother-to-child transmission of HIV during breastfeeding, formative research was conducted to explore the community’s understanding of medical research as well as how to explain research through local terms and meanings. Contextual analogies and other approaches were identified to explain consent information. Guided by theory, strategies for developing culturally appropriate interventions, and recommendations from the literature, we demonstrate how the formative data were used to develop culturally appropriate counseling cards specifically for the trial in Malawi. With appropriate contextual modifications, the steps outlined here could be applied in other clinical trials conducted elsewhere, as well as in other types of research. PMID:19385837

  15. Synergistic Information Processing Encrypts Strategic Reasoning in Poker.

    PubMed

    Frey, Seth; Albino, Dominic K; Williams, Paul L

    2018-06-14

    There is a tendency in decision-making research to treat uncertainty only as a problem to be overcome. But it is also a feature that can be leveraged, particularly in social interaction. Comparing the behavior of profitable and unprofitable poker players, we reveal a strategic use of information processing that keeps decision makers unpredictable. To win at poker, a player must exploit public signals from others. But using public inputs makes it easier for an observer to reconstruct that player's strategy and predict his or her behavior. How should players trade off between exploiting profitable opportunities and remaining unexploitable themselves? Using a recent multivariate approach to information theoretic data analysis and 1.75 million hands of online two-player No-Limit Texas Hold'em, we find that the important difference between winning and losing players is not in the amount of information they process, but how they process it. In particular, winning players are better at integrative information processing: creating new information from the interaction between their cards and their opponents' signals. We argue that integrative information processing does not just produce better decisions, it makes decision-making harder for others to reverse engineer, as an expert poker player's cards act like the private key in public-key cryptography. Poker players encrypt their reasoning with the way they process information. The encryption function of integrative information processing makes it possible for players to exploit others while remaining unexploitable. By recognizing the act of information processing as a strategic behavior in its own right, we offer a detailed account of how experts use endemic uncertainty to conceal their intentions in high-stakes competitive environments, and we highlight new opportunities between cognitive science, information theory, and game theory. Copyright © 2018 Cognitive Science Society, Inc.
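
    A minimal sketch of what "integrative" (synergistic) information means, using interaction information as a crude proxy rather than the partial information decomposition the authors actually use: in an XOR relationship, neither signal alone carries any information about the outcome, yet the two together determine it completely.

```python
# Crude illustration of synergy: for XOR, I(X1,X2;Y) = 1 bit even
# though I(X1;Y) = I(X2;Y) = 0. This proxy is an assumption, far
# simpler than the paper's multivariate decomposition.
import math
from collections import Counter
from itertools import product

def mutual_information(pairs):
    """I(A;B) in bits from a list of (a, b) samples."""
    n = len(pairs)
    p_ab = Counter(pairs)
    p_a = Counter(a for a, _ in pairs)
    p_b = Counter(b for _, b in pairs)
    # p(a,b) log2[ p(a,b) / (p(a) p(b)) ], rewritten with raw counts.
    return sum(c / n * math.log2(c * n / (p_a[a] * p_b[b]))
               for (a, b), c in p_ab.items())

# All four equally likely joint states of two fair binary "signals".
samples = [(x1, x2, x1 ^ x2) for x1, x2 in product([0, 1], repeat=2)]
i_joint = mutual_information([((x1, x2), y) for x1, x2, y in samples])
i_x1 = mutual_information([(x1, y) for x1, _, y in samples])
i_x2 = mutual_information([(x2, y) for _, x2, y in samples])
print(i_joint, i_x1, i_x2)  # 1.0 bit jointly, 0.0 from either alone
```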

  16. Information theoretic analysis of linear shift-invariant edge-detection operators

    NASA Astrophysics Data System (ADS)

    Jiang, Bo; Rahman, Zia-ur

    2012-06-01

    Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the influence of the image gathering process. However, experiments show that the image gathering process has a profound impact on the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, in which the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. We perform an end-to-end information-theoretic system analysis to assess linear shift-invariant edge-detection algorithms. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and of the parameters, such as sampling and additive noise, that define the image gathering system. The edge-detection algorithm is regarded as having high performance only if the information rate from the scene to the edge image approaches its maximum possible value. This goal can be achieved only by jointly optimizing all processes. Our information-theoretic assessment provides a new tool that allows us to compare different linear shift-invariant edge detectors in a common environment.
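
    As a toy stand-in for the end-to-end channel analysis described here (an assumption on our part, not the authors' figure of merit), one can score a linear shift-invariant operator on a known scene by the Shannon entropy of its quantized response histogram:

```python
# Toy sketch: apply a Sobel-x operator to a step-edge scene and
# measure the entropy of the quantized response. The 8-bin histogram
# and the scene itself are illustrative assumptions.
import math
from collections import Counter

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]

def convolve(img, kernel):
    h, w = len(img), len(img[0])
    out = []
    for y in range(1, h - 1):
        row = []
        for x in range(1, w - 1):
            row.append(sum(kernel[j][i] * img[y - 1 + j][x - 1 + i]
                           for j in range(3) for i in range(3)))
        out.append(row)
    return out

def histogram_entropy(values, bins=8):
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0
    counts = Counter(min(int((v - lo) / width), bins - 1) for v in values)
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Step edge: left half dark, right half bright.
img = [[0] * 4 + [10] * 4 for _ in range(8)]
response = [v for row in convolve(img, SOBEL_X) for v in row]
print(round(histogram_entropy(response), 3))
```

    The paper's criterion is richer: it measures the information rate from the scene through gathering, noise, and processing, so sampling and noise parameters enter the score as well.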

  17. Systems Biology and Biomarkers of Early Effects for Occupational Exposure Limit Setting

    PubMed Central

    DeBord, D. Gayle; Burgoon, Lyle; Edwards, Stephen W.; Haber, Lynne T.; Kanitz, M. Helen; Kuempel, Eileen; Thomas, Russell S.; Yucesoy, Berran

    2015-01-01

    In a recent National Research Council document, new strategies for risk assessment were described to enable more accurate and quicker assessments.(1) This report suggested that evaluating individual responses through increased use of bio-monitoring could improve dose-response estimations. Identification of specific biomarkers may be useful for diagnostics or risk prediction as they have the potential to improve exposure assessments. This paper discusses systems biology, biomarkers of effect, and computational toxicology approaches and their relevance to the occupational exposure limit setting process. The systems biology approach evaluates the integration of biological processes and how disruption of these processes by chemicals or other hazards affects disease outcomes. This type of approach could provide information used in delineating the mode of action of the response or toxicity, and may be useful to define the low adverse and no adverse effect levels. Biomarkers of effect are changes measured in biological systems and are considered to be preclinical in nature. Advances in computational methods and experimental -omics methods that allow the simultaneous measurement of families of macromolecules such as DNA, RNA, and proteins in a single analysis have made these systems approaches feasible for broad application. The utility of the information for risk assessments from -omics approaches has shown promise and can provide information on mode of action and dose-response relationships. As these techniques evolve, estimation of internal dose and response biomarkers will be a critical test of these new technologies for application in risk assessment strategies. While proof of concept studies have been conducted that provide evidence of their value, challenges with standardization and harmonization still need to be overcome before these methods are used routinely. PMID:26132979

  18. Systems Biology and Biomarkers of Early Effects for Occupational Exposure Limit Setting.

    PubMed

    DeBord, D Gayle; Burgoon, Lyle; Edwards, Stephen W; Haber, Lynne T; Kanitz, M Helen; Kuempel, Eileen; Thomas, Russell S; Yucesoy, Berran

    2015-01-01

    In a recent National Research Council document, new strategies for risk assessment were described to enable more accurate and quicker assessments. This report suggested that evaluating individual responses through increased use of bio-monitoring could improve dose-response estimations. Identification of specific biomarkers may be useful for diagnostics or risk prediction as they have the potential to improve exposure assessments. This paper discusses systems biology, biomarkers of effect, and computational toxicology approaches and their relevance to the occupational exposure limit setting process. The systems biology approach evaluates the integration of biological processes and how disruption of these processes by chemicals or other hazards affects disease outcomes. This type of approach could provide information used in delineating the mode of action of the response or toxicity, and may be useful to define the low adverse and no adverse effect levels. Biomarkers of effect are changes measured in biological systems and are considered to be preclinical in nature. Advances in computational methods and experimental -omics methods that allow the simultaneous measurement of families of macromolecules such as DNA, RNA, and proteins in a single analysis have made these systems approaches feasible for broad application. The utility of the information for risk assessments from -omics approaches has shown promise and can provide information on mode of action and dose-response relationships. As these techniques evolve, estimation of internal dose and response biomarkers will be a critical test of these new technologies for application in risk assessment strategies. While proof of concept studies have been conducted that provide evidence of their value, challenges with standardization and harmonization still need to be overcome before these methods are used routinely.

  19. An Interdisciplinary Network Making Progress on Climate Change Communication

    NASA Astrophysics Data System (ADS)

    Spitzer, W.; Anderson, J. C.; Bales, S.; Fraser, J.; Yoder, J. A.

    2012-12-01

    Public understanding of climate change continues to lag far behind the scientific consensus not merely because the public lacks information, but because there is in fact too much complex and contradictory information available. Fortunately, we can now (1) build on careful empirical cognitive and social science research to understand what people already value, believe, and understand; and then (2) design and test strategies for translating complex science so that people can examine evidence, make well-informed inferences, and embrace science-based solutions. Informal science education institutions can help bridge the gap between climate scientists and the public. In the US, more than 1,500 informal science venues (science centers, museums, aquariums, zoos, nature centers, national parks, etc.) are visited annually by 61% of the population. Extensive research shows that these visitors are receptive to learning about climate change and trust these institutions as reliable sources. Ultimately, we need to take a strategic approach to the way climate change is communicated. An interdisciplinary approach is needed to bring together three key areas of expertise (as recommended by Pidgeon and Fischhoff, 2011): 1. Climate and decision science experts - who can summarize and explain what is known, characterize risks, and describe appropriate mitigation and adaptation strategies; 2. Social scientists - who can bring to bear research, theory, and best practices from cognitive, communication, knowledge acquisition, and social learning theory; and 3. Informal educators and program designers - who bring a practitioner perspective and can exponentially facilitate a learning process for additional interpreters. With support from an NSF CCEP Phase I grant, we have tested this approach, bringing together interdisciplinary teams of colleagues for five-month "study circles" to develop skills to communicate climate change based on research in the social and cognitive sciences.
In 2011, social scientists, Ph.D. students studying oceanography, and staff from more than 20 institutions that teach science to the public came together in these learning groups. Most participants were motivated to create new or revised training or public programs based on lessons learned together. The success of this program rests on a twofold approach that combines collaborative learning with a cognitive and social sciences research-based approach to communications. The learning process facilitated trust and experimentation among co-learners to practice applications for communications, which has continued beyond the study circle experience through the networks established during the process. Examples drawn from the study circle outputs suggest that this approach could have a transformative impact on informal science education on a broad scale. Ultimately, we envision informal science interpreters as "vectors" for effective science communication, ocean and climate scientists with enhanced communication skills, and increased public demand for explanation and dialogue about global issues.

  20. Characterizing and Assessing a Large-Scale Software Maintenance Organization

    NASA Technical Reports Server (NTRS)

    Briand, Lionel; Melo, Walcelio; Seaman, Carolyn; Basili, Victor

    1995-01-01

    One important component of a software process is the organizational context in which the process is enacted. This component is often missing or incomplete in current process modeling approaches. One technique for modeling this perspective is the Actor-Dependency (AD) Model. This paper reports on a case study which used this approach to analyze and assess a large software maintenance organization. Our goal was to identify the approach's strengths and weaknesses while providing practical recommendations for improvement and research directions. The AD model was found to be very useful in capturing the important properties of the organizational context of the maintenance process, and aided in the understanding of the flaws found in this process. However, a number of opportunities for extending and improving the AD model were identified. Among others, there is a need to incorporate quantitative information to complement the qualitative model.

  1. Approaches to Research in a Digital Environment--Who Are the New Researchers?

    ERIC Educational Resources Information Center

    Orr, Michael; Fankhauser, Rae

    The research process has been a constant feature of the curriculum in primary and secondary schools for many years. The purpose of this process has traditionally been to develop student research skills and to enhance their knowledge within a particular area. The Information Process diagram, developed by the Australian School Library Association in…

  2. A Dual-Process Approach to Health Risk Decision Making: The Prototype Willingness Model

    ERIC Educational Resources Information Center

    Gerrard, Meg; Gibbons, Frederick X.; Houlihan, Amy E.; Stock, Michelle L.; Pomery, Elizabeth A.

    2008-01-01

    Although dual-process models in cognitive, personality, and social psychology have stimulated a large body of research about analytic and heuristic modes of decision making, these models have seldom been applied to the study of adolescent risk behaviors. In addition, the developmental course of these two kinds of information processing, and their…

  3. Information Acquisition, Analysis and Integration

    DTIC Science & Technology

    2016-08-03

of sensing and processing, theory, applications, signal processing, image and video processing, machine learning, technology transfer. ... Solved elegantly old problems like image and video deblurring, introducing new revolutionary approaches. ... Polatkan, G. Sapiro, D. Blei, D. B. Dunson, and L. Carin, "Deep learning with hierarchical convolutional factor analysis," IEEE ...

  4. An innovative approach to capability-based emergency operations planning

    PubMed Central

    Keim, Mark E

    2013-01-01

This paper describes the innovative use of information technology to assist disaster planners with an easily accessible method for writing and improving evidence-based emergency operations plans. The process is used to identify all key objectives of the emergency response according to the capabilities of the institution, community, or society. The approach then uses a standardized, objective-based format, along with a consensus-based method, for drafting capability-based operational-level plans. This information is then integrated within a relational database to allow ease of access and enhanced functionality to search, sort, and filter an emergency operations plan according to user need and technological capacity. This integrated approach is offered as an effective option for integrating best practices of planning with the efficiency, scalability and flexibility of modern information and communication technology. PMID:28228987

  5. An innovative approach to capability-based emergency operations planning.

    PubMed

    Keim, Mark E

    2013-01-01

This paper describes the innovative use of information technology to assist disaster planners with an easily accessible method for writing and improving evidence-based emergency operations plans. The process is used to identify all key objectives of the emergency response according to the capabilities of the institution, community, or society. The approach then uses a standardized, objective-based format, along with a consensus-based method, for drafting capability-based operational-level plans. This information is then integrated within a relational database to allow ease of access and enhanced functionality to search, sort, and filter an emergency operations plan according to user need and technological capacity. This integrated approach is offered as an effective option for integrating best practices of planning with the efficiency, scalability and flexibility of modern information and communication technology.

  6. Non-extensitivity vs. informative moments for financial models —A unifying framework and empirical results

    NASA Astrophysics Data System (ADS)

    Herrmann, K.

    2009-11-01

    Information-theoretic approaches still play a minor role in financial market analysis. Nonetheless, there have been two very similar approaches evolving during the last years, one in the so-called econophysics and the other in econometrics. Both generalize the notion of GARCH processes in an information-theoretic sense and are able to capture kurtosis better than traditional models. In this article we present both approaches in a more general framework. The latter allows the derivation of a wide range of new models. We choose a third model using an entropy measure suggested by Kapur. In an application to financial market data, we find that all considered models - with similar flexibility in terms of skewness and kurtosis - lead to very similar results.

  7. Determining climate change management priorities: A case study from Wisconsin

    USGS Publications Warehouse

    LeDee, Olivia E.; Ribic, Christine

    2015-01-01

    A burgeoning dialogue exists regarding how to allocate resources to maximize the likelihood of long-term biodiversity conservation within the context of climate change. To make effective decisions in natural resource management, an iterative, collaborative, and learning-based decision process may be more successful than a strictly consultative approach. One important, early step in a decision process is to identify priority species or systems. Although this promotes the conservation of select species or systems, it may inadvertently alter the future of non-target species and systems. We describe a process to screen terrestrial wildlife for potential sensitivity to climate change and then use the results to engage natural resource professionals in a process of identifying priorities for monitoring, research, and adaptation strategy implementation. We demonstrate this approach using a case study from Wisconsin. In Wisconsin, experts identified 23 out of 353 species with sufficient empirical research and management understanding to inform targeted action. Habitat management and management of hydrological conditions were the common strategies for targeted action. Although there may be an interest in adaptation strategy implementation for many species and systems, experts considered existing information inadequate to inform targeted action. According to experts, 40% of the vertebrate species in Wisconsin will require near-term intervention for climate adaptation. These results will inform state-wide conservation planning as well as regional efforts.

  8. A Rules-Based Service for Suggesting Visualizations to Analyze Earth Science Phenomena.

    NASA Astrophysics Data System (ADS)

    Prabhu, A.; Zednik, S.; Fox, P. A.; Ramachandran, R.; Maskey, M.; Shie, C. L.; Shen, S.

    2016-12-01

Current Earth Science Information Systems lack support for new or interdisciplinary researchers, who may be unfamiliar with the domain vocabulary or the breadth of relevant data available. We need to evolve the current information systems to reduce the time required for data preparation, processing and analysis. This can be done by effectively salvaging the "dark" resources in Earth Science. We assert that Earth science metadata assets are dark resources: information resources that organizations collect, process, and store for regular business or operational activities but fail to utilize for other purposes. In order to effectively use these dark resources, especially for data processing and visualization, we need a combination of domain, data product and processing knowledge, i.e. a knowledge base from which specific data operations can be performed. In this presentation, we describe a semantic, rules-based approach to provide a service that visualizes Earth Science phenomena, based on the data variables extracted using the "dark" metadata resources. We use Jena rules to make assertions about compatibility between a phenomenon and various visualizations based on multiple factors. We created separate orthogonal rulesets to map each of these factors to the various phenomena. Some of the factors we have considered include measurements, spatial resolution and time intervals. This approach enables easy additions and deletions based on newly obtained domain knowledge or phenomenon-related information, thus improving the overall accuracy of the rules service.
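    The factor-to-visualization mapping described above can be sketched as a small rules table. The factor names, predicates, and suggested visualizations below are illustrative assumptions, not the authors' Jena ruleset; a real deployment would express such rules in Jena's rule syntax over RDF metadata.

```python
# Toy rules-based visualization suggester, loosely inspired by the
# Jena-rules approach described above. All rule contents are invented
# for illustration.

RULES = [
    # (factor, predicate over the factor's value, suggested visualization)
    ("measurement", lambda v: v == "precipitation", "time-averaged map"),
    ("spatial_resolution_km", lambda v: v <= 25, "high-resolution contour plot"),
    ("time_interval", lambda v: v == "daily", "time series animation"),
]

def suggest_visualizations(phenomenon):
    """Fire each orthogonal rule against the phenomenon's attributes."""
    suggestions = []
    for factor, predicate, viz in RULES:
        value = phenomenon.get(factor)
        if value is not None and predicate(value):
            suggestions.append(viz)
    return suggestions

# Hypothetical phenomenon record assembled from "dark" metadata.
hurricane = {"measurement": "precipitation",
             "spatial_resolution_km": 10,
             "time_interval": "daily"}
print(suggest_visualizations(hurricane))
```

    Because the rulesets are orthogonal, adding or retiring a rule is a local change that does not disturb the others, which mirrors the easy-additions-and-deletions property claimed above.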

  9. The Role of the Health Information Manager in a Research-Based Information Technology Project.

    PubMed

    Freyne, Alice

    2009-06-01

    Information technology advances in healthcare provide many and varied opportunities for the Health Information Manager. Here is one example involving a Melbourne-based research project and an innovative approach to patient information delivery. The research project's area of study is multimedia content delivery in the following applications: as an adjunct to the surgical informed consent process, patient information or instruction presentation, and clinical education. The objective is to develop evidence-based, effective and accessible information and knowledge resources for patients and health care providers.

  10. 75 FR 20346 - Notice of Proposed Information Collection Requests

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-19

    ... Federal law, or substantially interfere with any agency's ability to perform its statutory obligations... Department; (2) will this information be processed and used in a timely manner; (3) is the estimate of burden... models, approaches, and strategies adopted and implemented by a subset of schools receiving federal...

  11. Non-Procedural Languages for Information Resource Management.

    ERIC Educational Resources Information Center

    Bearley, William L.

    The future of information resources management requires new approaches to implementing systems which will include a type of data base management that frees users to solve data processing problems logically by telling the system what they want, together with powerful non-procedural languages that will permit communication in simple, concise…

  12. 76 FR 3060 - Call for Information: Information Related to the Development of Emission-Estimating Methodologies...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-19

    ... approach that incorporates ``mass balance'' constraints to determine emissions from AFOs. Unfortunately... ventilation rate of the monitored confinement structure. Nitrogen content of process inputs and outputs (e.g., feed, water, bedding, eggs, milk). Nitrogen content of manure excreted. Description of any control...

  13. Overcoming the Grammar Deficit: The Role of Information Technology in Teaching German Grammar to Undergraduates.

    ERIC Educational Resources Information Center

    Hall, Christopher

    1998-01-01

    Examines how application of computer-assisted language learning (CALL) and information technology can be used to overcome "grammar deficit" seen in many British undergraduate German students. A combination of explicit, implicit, and exploratory grammar teaching approaches uses diverse resources, including word processing packages,…

  14. Incorporating Trauma-Informed Care into School-Based Programs

    ERIC Educational Resources Information Center

    Martin, Sandra L.; Ashley, Olivia Silber; White, LeBretia; Axelson, Sarah; Clark, Marc; Burrus, Barri

    2017-01-01

    Background: This article provides an overview of the rationale and process for incorporating trauma-informed approaches into US school-based programs, using school-based adolescent pregnancy prevention programs as an example. Methods: Research literature is reviewed on the prevalence and outcomes of childhood trauma, including the links between…

  15. The Internet and Technical Services: A Point Break Approach.

    ERIC Educational Resources Information Center

    McCombs, Gillian M.

    1994-01-01

    Discusses implications of using the Internet for library technical services. Topics addressed include creative uses of the Internet; three basic applications on the Internet, i.e., electronic mail, remote log-in to another computer, and file transfer; electronic processing of information; electronic access to information; and electronic processing…

  16. Social Networking on the Semantic Web

    ERIC Educational Resources Information Center

    Finin, Tim; Ding, Li; Zhou, Lina; Joshi, Anupam

    2005-01-01

    Purpose: Aims to investigate the way that the semantic web is being used to represent and process social network information. Design/methodology/approach: The Swoogle semantic web search engine was used to construct several large data sets of Resource Description Framework (RDF) documents with social network information that were encoded using the…

  17. Feedback in Information Retrieval.

    ERIC Educational Resources Information Center

    Spink, Amanda; Losee, Robert M.

    1996-01-01

    As Information Retrieval (IR) has evolved, it has become a highly interactive process, rooted in cognitive and situational contexts. Consequently the traditional cybernetic-based IR model does not suffice for interactive IR or the human approach to IR. Reviews different views of feedback in IR and their relationship to cybernetic and social…

  18. An Approach toward the Development of a Functional Encoding Model of Short Term Memory during Reading.

    ERIC Educational Resources Information Center

    Herndon, Mary Anne

    1978-01-01

    In a model of the functioning of short term memory, the encoding of information for subsequent storage in long term memory is simulated. In the encoding process, semantically equivalent paragraphs are detected for recombination into a macro information unit. (HOD)

  19. Information Competency and Creative Initiative of Personality and Their Manifestation in Activity

    ERIC Educational Resources Information Center

    Tabachuk, Natalia P.; Ledovskikh, Irina A.; Shulika, Nadezhda A.; Karpova, Irina V.; Kazinets, Victor A.; Polichka, Anatolii E.

    2018-01-01

    The relevance of the research is due to the global trends of development of the information society that are associated with the rapid advancement of civilization (IT penetration, increased computer availability, variability) and innovation processes in the sphere of education (competency-based approach, humanization and humanitarization). These…

  20. Informal and Formal Learning of General Practitioners

    ERIC Educational Resources Information Center

    Spaan, Nadia Roos; Dekker, Anne R. J.; van der Velden, Alike W.; de Groot, Esther

    2016-01-01

    Purpose: The purpose of this study is to understand the influence of formal learning from a web-based training and informal (workplace) learning afterwards on the behaviour of general practitioners (GPs) with respect to prescription of antibiotics. Design/methodology/approach: To obtain insight in various learning processes, semi-structured…

  1. A clinical decision support system for integrating tuberculosis and HIV care in Kenya: a human-centered design approach.

    PubMed

    Catalani, Caricia; Green, Eric; Owiti, Philip; Keny, Aggrey; Diero, Lameck; Yeung, Ada; Israelski, Dennis; Biondich, Paul

    2014-01-01

    With the aim of integrating HIV and tuberculosis care in rural Kenya, a team of researchers, clinicians, and technologists used the human-centered design approach to facilitate design, development, and deployment processes of a new patient-specific TB clinical decision support system for medical providers. In Kenya, approximately 1.6 million people are living with HIV and have a 20-times higher risk of dying of tuberculosis. Although tuberculosis prevention and treatment medication is widely available, proven to save lives, and prioritized by the World Health Organization, ensuring that it reaches the most vulnerable communities remains challenging. Human-centered design, used in the fields of industrial design and information technology for decades, is an approach to improving the effectiveness and impact of innovations that has been scarcely used in the health field. Using this approach, our team followed a 3-step process, involving mixed methods assessment to (1) understand the situation through the collection and analysis of site observation sessions and key informant interviews; (2) develop a new clinical decision support system through iterative prototyping, end-user engagement, and usability testing; and, (3) implement and evaluate the system across 24 clinics in rural West Kenya. Through the application of this approach, we found that human-centered design facilitated the process of digital innovation in a complex and resource-constrained context.

  2. A Clinical Decision Support System for Integrating Tuberculosis and HIV Care in Kenya: A Human-Centered Design Approach

    PubMed Central

    Catalani, Caricia; Green, Eric; Owiti, Philip; Keny, Aggrey; Diero, Lameck; Yeung, Ada; Israelski, Dennis; Biondich, Paul

    2014-01-01

    With the aim of integrating HIV and tuberculosis care in rural Kenya, a team of researchers, clinicians, and technologists used the human-centered design approach to facilitate design, development, and deployment processes of a new patient-specific TB clinical decision support system for medical providers. In Kenya, approximately 1.6 million people are living with HIV and have a 20-times higher risk of dying of tuberculosis. Although tuberculosis prevention and treatment medication is widely available, proven to save lives, and prioritized by the World Health Organization, ensuring that it reaches the most vulnerable communities remains challenging. Human-centered design, used in the fields of industrial design and information technology for decades, is an approach to improving the effectiveness and impact of innovations that has been scarcely used in the health field. Using this approach, our team followed a 3-step process, involving mixed methods assessment to (1) understand the situation through the collection and analysis of site observation sessions and key informant interviews; (2) develop a new clinical decision support system through iterative prototyping, end-user engagement, and usability testing; and, (3) implement and evaluate the system across 24 clinics in rural West Kenya. Through the application of this approach, we found that human-centered design facilitated the process of digital innovation in a complex and resource-constrained context. PMID:25170939

  3. Optimizing the stimulus presentation paradigm design for the P300-based brain-computer interface using performance prediction

    NASA Astrophysics Data System (ADS)

    Mainsah, B. O.; Reeves, G.; Collins, L. M.; Throckmorton, C. S.

    2017-08-01

    Objective. The role of a brain-computer interface (BCI) is to discern a user’s intended message or action by extracting and decoding relevant information from brain signals. Stimulus-driven BCIs, such as the P300 speller, rely on detecting event-related potentials (ERPs) in response to a user attending to relevant or target stimulus events. However, this process is error-prone because the ERPs are embedded in noisy electroencephalography (EEG) data, representing a fundamental problem in communication of the uncertainty in the information that is received during noisy transmission. A BCI can be modeled as a noisy communication system and an information-theoretic approach can be exploited to design a stimulus presentation paradigm to maximize the information content that is presented to the user. However, previous methods that focused on designing error-correcting codes failed to provide significant performance improvements due to underestimating the effects of psycho-physiological factors on the P300 ERP elicitation process and a limited ability to predict online performance with their proposed methods. Maximizing the information rate favors the selection of stimulus presentation patterns with increased target presentation frequency, which exacerbates refractory effects and negatively impacts performance within the context of an oddball paradigm. An information-theoretic approach that seeks to understand the fundamental trade-off between information rate and reliability is desirable. Approach. We developed a performance-based paradigm (PBP) by tuning specific parameters of the stimulus presentation paradigm to maximize performance while minimizing refractory effects. We used a probabilistic-based performance prediction method as an evaluation criterion to select a final configuration of the PBP. Main results. With our PBP, we demonstrate statistically significant improvements in online performance, both in accuracy and spelling rate, compared to the conventional row-column paradigm. Significance. By accounting for refractory effects, an information-theoretic approach can be exploited to significantly improve BCI performance across a wide range of performance levels.
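    The noisy-channel framing above can be illustrated with the textbook capacity formula for a binary symmetric channel. This is a generic sketch, not the authors' probabilistic performance-prediction method; the error rate here stands in for the chance of misclassifying a single stimulus response.

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def bsc_capacity(error_rate):
    """Bits conveyed per channel use for a binary symmetric channel: C = 1 - H(p)."""
    return 1.0 - binary_entropy(error_rate)

def effective_bit_rate(symbols_per_second, error_rate):
    """Throughput after accounting for classification errors (illustrative)."""
    return symbols_per_second * bsc_capacity(error_rate)

# Faster stimulation with refractory-inflated errors vs. slower, cleaner responses.
print(effective_bit_rate(4.0, 0.05), effective_bit_rate(8.0, 0.35))
```

    Pushing stimuli faster raises raw symbol throughput, but if refractory effects drive the error rate up, the effective bit rate can fall; that is the rate-reliability trade-off the PBP tunes.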

  4. Granger causal time-dependent source connectivity in the somatosensory network

    NASA Astrophysics Data System (ADS)

    Gao, Lin; Sommerlade, Linda; Coffman, Brian; Zhang, Tongsheng; Stephen, Julia M.; Li, Dichen; Wang, Jue; Grebogi, Celso; Schelter, Bjoern

    2015-05-01

    Exploration of transient Granger causal interactions in neural sources of electrophysiological activities provides deeper insights into brain information processing mechanisms. However, the underlying neural patterns are confounded by time-dependent dynamics, non-stationarity and observational noise contamination. Here we investigate transient Granger causal interactions using source time-series of somatosensory evoked magnetoencephalographic (MEG) responses elicited by air-puff stimulation of the right index finger and recorded using 306-channel MEG from 21 healthy subjects. A new time-varying connectivity approach, combining renormalised partial directed coherence with state space modelling, is employed to estimate fast-changing information flow among the sources. Source analysis confirmed that the somatosensory evoked MEG was mainly generated from the contralateral primary somatosensory cortex (SI) and bilateral secondary somatosensory cortices (SII). Transient Granger causality shows a serial processing of somatosensory information, 1) from contralateral SI to contralateral SII, 2) from contralateral SI to ipsilateral SII, 3) from contralateral SII to contralateral SI, and 4) from contralateral SII to ipsilateral SII. These results are consistent with established anatomical connectivity between somatosensory regions and previous source modeling results, thereby providing empirical validation of the time-varying connectivity analysis. We argue that the suggested approach provides novel information regarding transient cortical dynamic connectivity, which previous approaches could not assess.

  5. Spatial information semantic query based on SPARQL

    NASA Astrophysics Data System (ADS)

    Xiao, Zhifeng; Huang, Lei; Zhai, Xiaofang

    2009-10-01

    How can the efficiency of spatial information queries be enhanced in today's fast-growing information age? We are rich in geospatial data but poor in up-to-date geospatial information and knowledge that are ready to be accessed by public users. This paper adopts an approach for querying spatial semantics by building a Web Ontology Language (OWL) ontology and introducing the SPARQL Protocol and RDF Query Language (SPARQL) to search spatial semantic relations. It is important to establish spatial semantics that support effective spatial reasoning when performing semantic queries. Compared to earlier keyword-based information retrieval techniques that rely on syntax, we use semantic approaches in our spatial query system. Semantic approaches need to be developed through ontology, so we use OWL to describe spatial information extracted from the large-scale map of Wuhan. Spatial information expressed in an ontology with formal semantics is available to machines for processing and to people for understanding. The approach is illustrated by a case study using SPARQL to query geospatial ontology instances of Wuhan. The paper shows that using SPARQL to search OWL ontology instances can ensure the accuracy and applicability of results. The result also indicates that constructing a geospatial semantic query system has a positive effect on spatial query and retrieval.
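    The triple-pattern matching at the heart of a SPARQL basic graph pattern can be illustrated with a toy in-memory store. The triples and predicate names below are invented for illustration; a real system would load the OWL ontology into a SPARQL engine rather than hand-roll the matching.

```python
# Toy triple store with SPARQL-style pattern matching. Variables begin with "?".
# The data is a made-up miniature of a Wuhan geospatial ontology.

TRIPLES = {
    ("WuhanUniversity", "locatedIn", "Wuchang"),
    ("Wuchang", "districtOf", "Wuhan"),
    ("YellowCraneTower", "locatedIn", "Wuchang"),
}

def match(pattern, triples=TRIPLES):
    """Return one binding dict per triple that satisfies the pattern."""
    results = []
    for triple in triples:
        binding = {}
        for p, t in zip(pattern, triple):
            if p.startswith("?"):
                if binding.get(p, t) != t:   # repeated variable must rebind equal
                    break
                binding[p] = t
            elif p != t:                     # constant must match exactly
                break
        else:
            results.append(binding)
    return results

# Analogue of: SELECT ?s WHERE { ?s :locatedIn :Wuchang }
print(match(("?s", "locatedIn", "Wuchang")))
```

    A full SPARQL engine additionally joins bindings across multiple patterns, but the per-triple matching step is essentially this.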

  6. A synoptic description of coal basins via image processing

    NASA Technical Reports Server (NTRS)

    Farrell, K. W., Jr.; Wherry, D. B.

    1978-01-01

    An existing image processing system is adapted to describe the geologic attributes of a regional coal basin. This scheme handles a map as if it were a matrix, in contrast to more conventional approaches which represent map information in terms of linked polygons. The utility of the image processing approach is demonstrated by a multiattribute analysis of the Herrin No. 6 coal seam in Illinois. Findings include the location of a resource and estimation of tonnage corresponding to constraints on seam thickness, overburden, and Btu value, which are illustrative of the need for new mining technology.
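    The map-as-matrix idea can be sketched by overlaying attribute matrices cell by cell, the way an image processing system overlays bands. The cell values, cutoffs, and per-cell tonnage factor below are invented numbers, not data from the Herrin No. 6 analysis.

```python
# Each attribute of the coal basin is a matrix over the same grid of map cells.
# All values below are hypothetical.

thickness = [[1.2, 2.1, 0.8],
             [2.4, 2.0, 1.9]]      # seam thickness, m
overburden = [[40, 55, 30],
              [70, 45, 50]]        # overburden depth, m
btu = [[11000, 11800, 10200],
       [12100, 11500, 11900]]      # heating value, Btu/lb

TONNES_PER_CELL_PER_M = 1000.0     # assumed tonnage per metre of seam per cell

def estimate_tonnage(min_thickness=1.5, max_overburden=60, min_btu=11000):
    """Sum tonnage over cells that satisfy every constraint, band by band."""
    total = 0.0
    for i in range(len(thickness)):
        for j in range(len(thickness[0])):
            if (thickness[i][j] >= min_thickness
                    and overburden[i][j] <= max_overburden
                    and btu[i][j] >= min_btu):
                total += thickness[i][j] * TONNES_PER_CELL_PER_M
    return total

print(estimate_tonnage())
```

    Tightening any constraint shrinks the recoverable estimate, which is how sensitivity to seam thickness, overburden, and Btu cutoffs can be explored on the matrix representation.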

  7. Model prototype utilization in the analysis of fault tolerant control and data processing systems

    NASA Astrophysics Data System (ADS)

    Kovalev, I. V.; Tsarev, R. Yu; Gruzenkin, D. V.; Prokopenko, A. V.; Knyazkov, A. N.; Laptenok, V. D.

    2016-04-01

    This paper presents a procedure for assessing the profit of implementing a control and data processing system. The reasonability of creating and analyzing a model prototype follows from implementing the approach of fault-tolerance provision through the inclusion of structural and software redundancy. The developed procedure allows finding the best ratio between the cost of developing and analyzing the model prototype and the earnings from its utilization and the information produced. The suggested approach is illustrated by a model example of profit assessment and analysis of a control and data processing system.

  8. A conceptual framework for intelligent real-time information processing

    NASA Technical Reports Server (NTRS)

    Schudy, Robert

    1987-01-01

    By combining artificial intelligence concepts with the human information processing model of Rasmussen, a conceptual framework was developed for real time artificial intelligence systems which provides a foundation for system organization, control and validation. The approach is based on describing system processing in terms of an abstraction hierarchy of states of knowledge. The states of knowledge are organized along one dimension which corresponds to the extent to which the concepts are expressed in terms of the system inputs or in terms of the system response. Thus organized, the useful states form a generally triangular shape with the sensors and effectors forming the lower two vertices and the fully evaluated set of courses of action the apex. Within the triangle boundaries are numerous processing paths which shortcut the detailed processing, by connecting incomplete levels of analysis to partially defined responses. Shortcuts at different levels of abstraction include reflexes, sensory motor control, rule based behavior, and satisficing. This approach was used in the design of a real time tactical decision aiding system, and in defining an intelligent aiding system for transport pilots.

  9. On Roles of Models in Information Systems

    NASA Astrophysics Data System (ADS)

    Sølvberg, Arne

    The increasing penetration of computers into all aspects of human activity makes it desirable that the interplay among software, data and the domains where computers are applied is made more transparent. An approach to this end is to explicitly relate the modeling concepts of the domains, e.g., natural science, technology and business, to the modeling concepts of software and data. This may make it simpler to build comprehensible integrated models of the interactions between computers and non-computers, e.g., interaction among computers, people, physical processes, biological processes, and administrative processes. This chapter contains an analysis of various facets of the modeling environment for information systems engineering. The lack of satisfactory conceptual modeling tools seems to be central to the unsatisfactory state-of-the-art in establishing information systems. The chapter contains a proposal for defining a concept of information that is relevant to information systems engineering.

  10. Online games: a novel approach to explore how partial information influences human random searches

    NASA Astrophysics Data System (ADS)

    Martínez-García, Ricardo; Calabrese, Justin M.; López, Cristóbal

    2017-01-01

    Many natural processes rely on optimizing the success ratio of a search process. We use an experimental setup consisting of a simple online game in which players have to find a target hidden on a board, to investigate how the rounds are influenced by the detection of cues. We focus on the search duration and the statistics of the trajectories traced on the board. The experimental data are explained by a family of random-walk-based models and probabilistic analytical approximations. If no initial information is given to the players, the search is optimized for cues that cover an intermediate spatial scale. In addition, initial information about the extension of the cues results, in general, in faster searches. Finally, strategies used by informed players turn into non-stationary processes in which the length of each displacement evolves to show a well-defined characteristic scale that is not found in non-informed searches.
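    The cue-driven search described above can be approximated with a minimal random-walk simulation. The board size, cue radius, and biased-step rule are illustrative assumptions, not the parameters of the published game or models.

```python
import random

# Minimal random walk on a square board: the searcher moves at random until
# it enters the cue region around the target, then steps toward the target.

def search(board=20, cue_radius=4, max_steps=100000, rng=None):
    """Return the number of steps taken to reach the target, or None if not found."""
    rng = rng or random.Random(0)          # seeded for reproducibility
    tx, ty = rng.randrange(board), rng.randrange(board)
    x, y = rng.randrange(board), rng.randrange(board)
    for step in range(1, max_steps + 1):
        if (x, y) == (tx, ty):
            return step
        if abs(x - tx) <= cue_radius and abs(y - ty) <= cue_radius:
            # Cue detected: move deterministically toward the target.
            x += (tx > x) - (tx < x)
            y += (ty > y) - (ty < y)
        else:
            # No cue: take a uniform random step, clipped to the board.
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            x = min(board - 1, max(0, x + dx))
            y = min(board - 1, max(0, y + dy))
    return None

print(search())
```

    Varying `cue_radius` over many seeded runs reproduces the qualitative finding that search duration depends strongly on the spatial scale of the cues.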

  11. Informal learning processes in support of clinical service delivery in a service-oriented community pharmacy.

    PubMed

    Patterson, Brandon J; Bakken, Brianne K; Doucette, William R; Urmie, Julie M; McDonough, Randal P

    The evolving health care system necessitates pharmacy organizations' adjustments by delivering new services and establishing inter-organizational relationships. One approach supporting pharmacy organizations in making changes may be informal learning by technicians, pharmacists, and pharmacy owners. Informal learning is characterized by a four-step cycle including intent to learn, action, feedback, and reflection. This framework helps explain individual and organizational factors that influence learning processes within an organization as well as the individual and organizational outcomes of those learning processes. A case study was conducted of an Iowa independent community pharmacy with years of experience in offering patient care services. Nine semi-structured interviews with pharmacy personnel revealed initial evidence in support of the informal learning model in practice. Future research could investigate more fully the informal learning model in delivery of patient care services in community pharmacies. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Online games: a novel approach to explore how partial information influences human random searches.

    PubMed

    Martínez-García, Ricardo; Calabrese, Justin M; López, Cristóbal

    2017-01-06

    Many natural processes rely on optimizing the success ratio of a search process. We use an experimental setup consisting of a simple online game in which players have to find a target hidden on a board, to investigate how the rounds are influenced by the detection of cues. We focus on the search duration and the statistics of the trajectories traced on the board. The experimental data are explained by a family of random-walk-based models and probabilistic analytical approximations. If no initial information is given to the players, the search is optimized for cues that cover an intermediate spatial scale. In addition, initial information about the extension of the cues results, in general, in faster searches. Finally, strategies used by informed players turn into non-stationary processes in which the length of each displacement evolves to show a well-defined characteristic scale that is not found in non-informed searches.

  13. Psychodrama: A Creative Approach for Addressing Parallel Process in Group Supervision

    ERIC Educational Resources Information Center

    Hinkle, Michelle Gimenez

    2008-01-01

    This article provides a model for using psychodrama to address issues of parallel process during group supervision. Information on how to utilize the specific concepts and techniques of psychodrama in relation to group supervision is discussed. A case vignette of the model is provided.

  14. The Demonstration of Short-Term Consolidation.

    ERIC Educational Resources Information Center

    Jolicoeur, Pierre; Dell'Acqua, Roberto

    1998-01-01

    Results of seven experiments involving 112 college students or staff using a dual-task approach provide evidence that encoding information into short-term memory involves a distinct process termed short-term consolidation (STC). Results suggest that STC has limited capacity and that it requires central processing mechanisms. (SLD)

  15. A Theory of Perceptual Learning: Uncertainty Reduction and Reading.

    ERIC Educational Resources Information Center

    Henk, William A.

    Behaviorism cannot adequately explain language processing. A synthesis of the psycholinguistic and information processing approaches of cognitive psychology, however, can provide the basis for a speculative analysis of reading, if this synthesis is tempered by a perceptual learning theory of uncertainty reduction. Theorists of information…

  16. Strategic Positioning of the Web in a Multi-Channel Market Approach.

    ERIC Educational Resources Information Center

    Simons, Luuk P. A.; Steinfield, Charles; Bouwman, Harry

    2002-01-01

    Discusses channel economics in retail activities and trends toward unbundling due to the emergence of the Web channel. Highlights include sales processes and physical distribution processes; transaction costs; hybrid electronic commerce strategies; channel management and customer support; information economics, thing economics, and service…

  17. Judging nursing information on the WWW: a theoretical understanding.

    PubMed

    Cader, Raffik; Campbell, Steve; Watson, Don

    2009-09-01

    This paper is a report of a study of the judgement processes nurses use when evaluating World Wide Web information related to nursing practice. The World Wide Web has increased the global accessibility of online health information. However, the variable nature of the quality of World Wide Web information and its perceived level of reliability may lead to misinformation. This makes demands on healthcare professionals, and on nurses in particular, to ensure that health information of reliable quality is selected for use in practice. A grounded theory approach was adopted. Semi-structured interviews and focus groups were used to collect data, between 2004 and 2005, from 20 nurses undertaking a postqualification graduate course at a university and 13 nurses from a local hospital in the United Kingdom. A theoretical framework emerged that gave insight into the judgement process nurses use when evaluating World Wide Web information. Participants broke the judgement process down into specific tasks. In addition, they used tacit, process and propositional knowledge and intuition, quasi-rational cognition and analysis to undertake these tasks. World Wide Web information cues, time available and nurses' critical skills were influencing factors in their judgement process. Addressing the issue of quality and reliability associated with World Wide Web information is a global challenge. This theoretical framework could contribute towards meeting this challenge.

  18. Enhancing evidence informed policymaking in complex health systems: lessons from multi-site collaborative approaches.

    PubMed

    Langlois, Etienne V; Becerril Montekio, Victor; Young, Taryn; Song, Kayla; Alcalde-Rabanal, Jacqueline; Tran, Nhan

    2016-03-17

    There is an increasing interest worldwide to ensure evidence-informed health policymaking as a means to improve health systems performance. There is a need to engage policymakers in collaborative approaches to generate and use knowledge in real world settings. To address this gap, we implemented two interventions based on iterative exchanges between researchers and policymakers/implementers. This article aims to reflect on the implementation and impact of these multi-site evidence-to-policy approaches implemented in low-resource settings. The first approach was implemented in Mexico and Nicaragua and focused on implementation research facilitated by communities of practice (CoP) among maternal health stakeholders. We conducted a process evaluation of the CoPs and assessed the professionals' abilities to acquire, analyse, adapt and apply research. The second approach, called the Policy BUilding Demand for evidence in Decision making through Interaction and Enhancing Skills (Policy BUDDIES), was implemented in South Africa and Cameroon. The intervention put forth a 'buddying' process to enhance demand and use of systematic reviews by sub-national policymakers. The Policy BUDDIES initiative was assessed using a mixed-methods realist evaluation design. In Mexico, the implementation research supported by CoPs triggered monitoring by local health organizations of the quality of maternal healthcare programs. Health programme personnel involved in CoPs in Mexico and Nicaragua reported improved capacities to identify and use evidence in solving implementation problems. In South Africa, Policy BUDDIES informed a policy framework for medication adherence for chronic diseases, including both HIV and non-communicable diseases. Policymakers engaged in the buddying process reported an enhanced recognition of the value of research, and greater demand for policy-relevant knowledge. 
The collaborative evidence-to-policy approaches underline the importance of iterations and continuity in the engagement of researchers and policymakers/programme managers, in order to account for swift evolutions in health policy planning and implementation. In developing and supporting evidence-to-policy interventions, due consideration should be given to fit-for-purpose approaches, as different needs in policymaking cycles require adapted processes and knowledge. Greater consideration should be provided to approaches embedding the use of research in real-world policymaking, better suited to the complex adaptive nature of health systems.

  19. Synchronization and information processing by an on-off coupling

    NASA Astrophysics Data System (ADS)

    Wei, G. W.; Zhao, Shan

    2002-05-01

    This paper proposes an on-off coupling process for chaos synchronization and information processing. An in-depth analysis of the net effect of a conventional coupling is performed, and the stability of the process is studied. We show that the proposed controlled coupling process can locally optimize the smoothness and the fidelity of dynamical data. A digital filter expression for the on-off coupling process is derived, and a connection is made to the Hanning filter. The utility and robustness of the proposed approach are demonstrated by chaos synchronization in Duffing oscillators, the spatiotemporal synchronization of noisy nonlinear oscillators, the estimation of the trend of a time series, and the restoration of the contaminated solution of the nonlinear Schrödinger equation.
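
    The connection to the Hanning filter can be illustrated with the classic three-point Hann smoothing weights (1/4, 1/2, 1/4). This is a generic sketch of that filter, not the specific on-off coupling expression derived in the paper:

```python
import numpy as np

def hanning_filter(x):
    """Three-point Hann smoothing with weights 1/4, 1/2, 1/4 (a generic sketch,
    not the specific on-off coupling filter derived in the paper)."""
    padded = np.pad(np.asarray(x, dtype=float), 1, mode="edge")  # repeat endpoints
    return 0.25 * padded[:-2] + 0.5 * padded[1:-1] + 0.25 * padded[2:]

# Smoothing a noisy sine: the filter damps high-frequency noise while
# preserving the slowly varying trend
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200)
smooth = hanning_filter(np.sin(t) + 0.1 * rng.standard_normal(t.size))
```

Because the weights sum to one, a constant signal passes through unchanged while alternating (high-frequency) components are strongly attenuated.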

  20. Workflow management systems in radiology

    NASA Astrophysics Data System (ADS)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim

    1998-07-01

    In a situation of shrinking health care budgets, increasing cost pressure and growing demands to increase the efficiency and the quality of medical services, health care enterprises are forced to optimize or completely redesign their processes. Although information technology is widely agreed to have the potential to contribute to cost reduction and efficiency improvement, the real success factors are the redefinition and automation of processes: Business Process Re-engineering and Workflow Management. In this paper we discuss architectures for the use of workflow management systems in radiology. We propose to move forward from information systems in radiology (RIS, PACS) to Radiology Management Systems, in which workflow functionality (process definitions and process automation) is implemented through autonomous workflow management systems (WfMS). In a workflow-oriented architecture, an autonomous workflow enactment service communicates with workflow client applications via standardized interfaces. We discuss the need for and the benefits of such an approach, emphasizing the separation of workflow management systems from application systems and the consequences this separation has for the architecture of workflow-oriented information systems, including an appropriate workflow terminology and the definition of standard interfaces for workflow-aware application systems. Workflow studies in various institutions have shown that most of the processes in radiology are well structured and suited to a workflow management approach. Numerous commercially available Workflow Management Systems (WfMS) were investigated, and some of them, which are process-oriented and application independent, appear suitable for use in radiology.

  1. Making health care quality reports easier to use.

    PubMed

    Hibbard, J H; Peters, E; Slovic, P; Finucane, M L; Tusler, M

    2001-11-01

    Although there is evidence that consumers want comparative quality information, most studies indicate that consumers make limited use of the data in decision making. The reasons for the limited use appear to be the complexity of the information and the difficulty of processing and using the amount of information in reports. The purpose of this investigation was to determine whether there are approaches to reporting comparative information that make it easier for consumers to comprehend the information. Further, the degree to which consumers who have a low level of skill can accurately use that information when it is presented in a format that is easier to use was examined. The study used an experimental design to examine how different presentation approaches affect the use of information. Participants were randomly assigned to different conditions and were asked to review information and complete a decision task related to using comparative information and making health plan selections. Two separate convenience samples were used in the study: an elderly Medicare sample (N = 253), and a nonelderly sample (N = 239). The findings indicate that there are data presentation approaches that help consumers who have lower skills use information more accurately. Some of these presentation strategies (for example, relative stars) improve comprehension among the lower skilled, and other strategies (for example, evaluative labels) appear to aid those in the midrange of comprehension skill. Using these approaches in reporting would likely increase the use of the comparative information and increase the efficacy of reporting efforts.

  2. Maxwell's demon in biochemical signal transduction with feedback loop

    PubMed Central

    Ito, Sosuke; Sagawa, Takahiro

    2015-01-01

    Signal transduction in living cells is vital to maintaining life itself, and information transfer in a noisy environment plays a significant role in it. In a rather different context, recent intensive research on 'Maxwell's demon', a feedback controller that utilizes information about individual molecules, has led to a unified theory of information and thermodynamics. Here we combine these two streams of research, and show that the second law of thermodynamics with information reveals the fundamental limit of the robustness of signal transduction against environmental fluctuations. In particular, we find that the degree of robustness is quantitatively characterized by an informational quantity called transfer entropy. Our information-thermodynamic approach is applicable to biological communication inside cells, in which there is no explicit channel coding, in contrast to artificial communication. Our result could open up a novel biophysical approach to understanding information processing in living systems on the basis of the fundamental information-thermodynamics link. PMID:26099556

  3. Safety of Rural Nursing Home-to-Emergency Department Transfers: Improving Communication and Patient Information Sharing Across Settings.

    PubMed

    Tupper, Judith B; Gray, Carolyn E; Pearson, Karen B; Coburn, Andrew F

    2015-01-01

    The "siloed" approach to healthcare delivery contributes to communication challenges and to potential patient harm when patients transfer between settings. This article reports on the evaluation of a demonstration in 10 rural communities to improve the safety of nursing facility (NF) transfers to hospital emergency departments by forming interprofessional teams of hospital, emergency medical service, and NF staff to develop and implement tools and protocols for standardizing critical interfacility communication pathways and information sharing. We worked with each of the 10 teams to document current communication processes and information sharing tools and to design, implement, and evaluate strategies/tools to increase effective communication and sharing of patient information across settings. A mixed methods approach was used to evaluate changes from baseline in documentation of patient information shared across settings during the transfer process. Study findings showed significant improvement in key areas across the three settings, including infection status and baseline mental functioning. Improvement strategies and performance varied across settings; however, accurate and consistent information sharing of advance directives and medication lists remains a challenge. Study results demonstrate that with neutral facilitation and technical support, collaborative interfacility teams can assess and effectively address communication and information sharing problems that threaten patient safety.

  4. Single-Atom Demonstration of the Quantum Landauer Principle

    NASA Astrophysics Data System (ADS)

    Yan, L. L.; Xiong, T. P.; Rehan, K.; Zhou, F.; Liang, D. F.; Chen, L.; Zhang, J. Q.; Yang, W. L.; Ma, Z. H.; Feng, M.

    2018-05-01

    One of the outstanding challenges to information processing is the efficient suppression of energy consumption in the execution of logic operations. The Landauer principle sets an energy constraint on the deletion of a classical bit of information. Although some attempts have been made to experimentally approach the fundamental limit restricted by this principle, exploring the Landauer principle in a purely quantum mechanical fashion is still an open question. Employing a trapped ultracold ion, we experimentally demonstrate a quantum version of the Landauer principle, i.e., an equality associated with the energy cost of information erasure in conjunction with the entropy change of the associated quantized environment. Our experimental investigation substantiates an intimate link between information thermodynamics and quantum candidate systems for information processing.
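
    The classical constraint referred to here is the familiar Landauer bound, Q >= k_B T ln 2 of heat per erased bit. A quick numerical sketch (the 300 K room temperature is an illustrative choice, not a condition of the experiment):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact, 2019 SI definition)

def landauer_bound(temperature_kelvin):
    """Minimum heat dissipated to erase one classical bit: Q >= k_B * T * ln 2."""
    return K_B * temperature_kelvin * math.log(2)

# At an illustrative room temperature of 300 K the bound is ~2.87e-21 J,
# roughly 0.018 eV per erased bit
q_min = landauer_bound(300.0)
```

The bound scales linearly with temperature, which is why cold systems such as trapped ultracold ions can probe the limit at correspondingly smaller energy scales.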

  5. Temporal characteristics of audiovisual information processing.

    PubMed

    Fuhrmann Alpert, Galit; Hein, Grit; Tsai, Nancy; Naumer, Marcus J; Knight, Robert T

    2008-05-14

    In complex natural environments, auditory and visual information often have to be processed simultaneously. Previous functional magnetic resonance imaging (fMRI) studies focused on the spatial localization of brain areas involved in audiovisual (AV) information processing, but the temporal characteristics of AV information flow in these regions remained unclear. In this study, we used fMRI and a novel information-theoretic approach to study the flow of AV sensory information. Subjects passively perceived sounds and images of objects presented either alone or simultaneously. Applying the measure of mutual information, we computed for each voxel the latency in which the blood oxygenation level-dependent signal had the highest information content about the preceding stimulus. The results indicate that, after AV stimulation, the earliest informative activity occurs in right Heschl's gyrus, left primary visual cortex, and the posterior portion of the superior temporal gyrus, which is known as a region involved in object-related AV integration. Informative activity in the anterior portion of superior temporal gyrus, middle temporal gyrus, right occipital cortex, and inferior frontal cortex was found at a later latency. Moreover, AV presentation resulted in shorter latencies in multiple cortical areas compared with isolated auditory or visual presentation. The results provide evidence for bottom-up processing from primary sensory areas into higher association areas during AV integration in humans and suggest that AV presentation shortens processing time in early sensory cortices.
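
    The latency analysis described above can be sketched generically: compute the mutual information between the stimulus and the signal at each candidate latency, then take the latency with the highest information content. This toy version uses histogram-based discrete mutual information on synthetic data; it is not the authors' fMRI pipeline:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Discrete mutual information (in bits) between two samples, via a 2D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # skip empty cells to avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

def most_informative_latency(stimulus, responses):
    """Index k at which responses[k] carries the most information about the stimulus."""
    return int(np.argmax([mutual_information(stimulus, r) for r in responses]))

# Toy data: the stimulus is echoed in the response only at latency 2
rng = np.random.default_rng(1)
stim = rng.standard_normal(500)
responses = [rng.standard_normal(500) for _ in range(5)]
responses[2] = stim + 0.1 * rng.standard_normal(500)
```

In the study's setting the analogue of `responses[k]` would be the voxel's signal sampled k time steps after stimulus onset, evaluated voxel by voxel.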

  6. Improving informed consent: Stakeholder views

    PubMed Central

    Anderson, Emily E.; Newman, Susan B.; Matthews, Alicia K.

    2017-01-01

    Purpose Innovation will be required to improve the informed consent process in research. We aimed to obtain input from key stakeholders—research participants and those responsible for obtaining informed consent—to inform potential development of a multimedia informed consent “app.” Methods This descriptive study used a mixed-methods approach. Five 90-minute focus groups were conducted with volunteer samples of former research participants and researchers/research staff responsible for obtaining informed consent. Participants also completed a brief survey that measured background information and knowledge and attitudes regarding research and the use of technology. Established qualitative methods were used to conduct the focus groups and data analysis. Results We conducted five focus groups with 41 total participants: three groups with former research participants (total n = 22), and two groups with researchers and research coordinators (total n = 19). Overall, individuals who had previously participated in research had positive views regarding their experiences. However, further discussion elicited that the informed consent process often did not meet its intended objectives. Findings from both groups are presented according to three primary themes: content of consent forms, experience of the informed consent process, and the potential of technology to improve the informed consent process. A fourth theme, need for lay input on informed consent, emerged from the researcher groups. Conclusions Our findings add to previous research that suggests that the use of interactive technology has the potential to improve the process of informed consent. However, our focus-group findings provide additional insight that technology cannot replace the human connection that is central to the informed consent process. 
More research that incorporates the views of key stakeholders is needed to ensure that multimedia consent processes do not repeat the mistakes of paper-based consent forms. PMID:28949896

  7. Improving informed consent: Stakeholder views.

    PubMed

    Anderson, Emily E; Newman, Susan B; Matthews, Alicia K

    2017-01-01

    Innovation will be required to improve the informed consent process in research. We aimed to obtain input from key stakeholders-research participants and those responsible for obtaining informed consent-to inform potential development of a multimedia informed consent "app." This descriptive study used a mixed-methods approach. Five 90-minute focus groups were conducted with volunteer samples of former research participants and researchers/research staff responsible for obtaining informed consent. Participants also completed a brief survey that measured background information and knowledge and attitudes regarding research and the use of technology. Established qualitative methods were used to conduct the focus groups and data analysis. We conducted five focus groups with 41 total participants: three groups with former research participants (total n = 22), and two groups with researchers and research coordinators (total n = 19). Overall, individuals who had previously participated in research had positive views regarding their experiences. However, further discussion elicited that the informed consent process often did not meet its intended objectives. Findings from both groups are presented according to three primary themes: content of consent forms, experience of the informed consent process, and the potential of technology to improve the informed consent process. A fourth theme, need for lay input on informed consent, emerged from the researcher groups. Our findings add to previous research that suggests that the use of interactive technology has the potential to improve the process of informed consent. However, our focus-group findings provide additional insight that technology cannot replace the human connection that is central to the informed consent process. More research that incorporates the views of key stakeholders is needed to ensure that multimedia consent processes do not repeat the mistakes of paper-based consent forms.

  8. Research on Design Information Management System for Leather Goods

    NASA Astrophysics Data System (ADS)

    Lu, Lei; Peng, Wen-li

    A design information management system for leather goods was proposed to solve the problems existing in the current information management of leather goods. The working principles of the design information management system for leather goods were analyzed in detail. Firstly, the approach to acquiring design information for leather goods was introduced. Secondly, the methods for processing design information were introduced. Thirdly, the management of design information in the database was studied. Finally, the application of the system was discussed, taking shoe products as an example.

  9. Using health outcomes data to inform decision-making: formulary committee perspective.

    PubMed

    Janknegt, R

    2001-01-01

    When healthcare resources are limited, decisions about the treatments to fund can be complex and difficult to make, involving the careful balancing of multiple factors. The decisions taken may have far-reaching consequences affecting many people. Clearly, decisions such as the choice of products on a formulary must be taken using a selection process that is fully transparent and that can be justified to all parties concerned. Although everyone would agree that drug selection should be a rational process that follows the guidelines of evidence-based medicine, many other factors may play a role in decision-making. Although some of these are explicit and rational, others are less clearly defined, and decision-makers may be unaware of the influence exerted by some of these factors. In order to facilitate transparent decision-making that makes rational use of health outcomes information, the System of Objectified Judgement Analysis (SOJA) has been developed by the author. SOJA includes interactive software that combines the quality advantages of the 'top-down' approach to drug selection, based on a thorough literature review, with the compliance advantages of a 'bottom-up' approach, where the final decision is made by the individual formulary committee and not by the authors of the review. The SOJA method, based on decision-making processes in economics, ensures that health outcomes information is given appropriate weight. Such approaches are valuable tools in discussions about product selection for formularies.

  10. Doing the Right Thing: One University's Approach to Digital Accessibility

    ERIC Educational Resources Information Center

    Sieben-Schneider, Jill A.; Hamilton-Brodie, Valerie A.

    2016-01-01

    This article describes the approach employed by one university to address a complaint filed by students with disabilities with the Department of Justice (DOJ) regarding the inaccessibility of information and communication technology (ICT). Prior to the DOJ complaint, the university did not have a process in place to address ICT accessibility.…

  11. Parental Involvement in Child Assessment: A Dynamic Approach.

    ERIC Educational Resources Information Center

    SeokHoon, Alice Seng

    This paper examines the status of parents in the developmental assessment process and considers how involving parents jointly with the professional to assess their young child may yield more accurate and valuable information. The paper explores the use of a mediated learning experience (MLE) approach as a framework for increasing support for…

  12. Enterprise Education Needs Enterprising Educators: A Case Study on Teacher Training Provision

    ERIC Educational Resources Information Center

    Penaluna, Kathryn; Penaluna, Andy; Usei, Caroline; Griffiths, Dinah

    2015-01-01

    Purpose: The purpose of this paper is to reflect upon the process that underpinned and informed the development and delivery of a "creativity-led" credit-bearing teacher training provision and to illuminate key factors of influence for the approaches to teaching and learning. Design/methodology/approach: Based on the assumption that…

  13. Learning in the Liminal Space: A Semiotic Approach to Threshold Concepts

    ERIC Educational Resources Information Center

    Land, Ray; Rattray, Julie; Vivian, Peter

    2014-01-01

    The threshold concepts approach to student learning and curriculum design now informs an empirical research base comprising over 170 disciplinary and professional contexts. It draws extensively on the notion of troublesomeness in a "liminal" space of learning. The latter is a transformative state in the process of learning in which there…

  14. A Self-Regulated Learning Approach for Children with Learning/Behavior Disorders

    ERIC Educational Resources Information Center

    Benevento, Joan A.

    2004-01-01

    This book is designed to be an intervention model based on the concepts of Piaget's study of constructivism. The application of this approach will help children with learning/ behavioral disorders actively participate in a fuller integration of their own psychomotor, affective, and cognitive information processing skills and adaptation. The work…

  15. Assessment for One-Shot Library Instruction: A Conceptual Approach

    ERIC Educational Resources Information Center

    Wang, Rui

    2016-01-01

    The purpose of this study is to explore a conceptual approach to assessment for one-shot library instruction. This study develops a new assessment instrument based on Carol Kuhlthau's information search process (ISP) model. The new instrument focuses on measuring and identifying changes in student readiness to do research along three…

  16. Strategic Planning Within Weapon System Program Offices at Aeronautical Systems Division.

    DTIC Science & Technology

    1985-09-01

    ... Department of the Air Force. The Air Force Budget Process. AFP 172-4. Washington: HQ USAF, 1 October 1984. ... Ansoff, Igor H. ... Although the informal approach is subjective, it has been successful for some managers and should not be considered an ineffective approach. According to Ansoff...

  17. Research on Mathematical Literacy in Schools--Aim, Approach and Attention

    ERIC Educational Resources Information Center

    Haara, Frode Olav; Bolstad, Oda Heidi; Jenssen, Eirik S.

    2017-01-01

    The development of mathematical literacy in schools is of significant concern at the policy level, and research is an important source of information in this process. This review article focuses on areas of research interest identified in empirical projects on mathematical literacy, and how mathematical literacy in schools is approached by…

  18. The Effect of Teaching with Stories on Associate Degree Nursing Students' Approach to Learning and Reflective Practice

    ERIC Educational Resources Information Center

    Bradshaw, Vicki

    2012-01-01

    This action research study is the culmination of several action cycles investigating cognitive information processing and learning strategies based on students approach to learning theory and assessing students' meta-cognitive learning, motivation, and reflective development suggestive of deep learning. The study introduces a reading…

  19. Progress and opportunities in EELS and EDS tomography.

    PubMed

    Collins, Sean M; Midgley, Paul A

    2017-09-01

    Electron tomography using energy loss and X-ray spectroscopy in the electron microscope continues to develop in rapidly evolving and diverse directions, enabling new insight into the three-dimensional chemistry and physics of nanoscale volumes. Progress has been made recently in improving reconstructions from EELS and EDS signals in electron tomography by applying compressed sensing methods, characterizing new detector technologies in detail, deriving improved models of signal generation, and exploring machine learning approaches to signal processing. These disparate threads can be brought together in a cohesive framework in terms of a model-based approach to analytical electron tomography. Models incorporate information on signal generation and detection as well as prior knowledge of structures in the spectrum image data. Many recent examples illustrate the flexibility of this approach and its feasibility for addressing challenges in non-linear or limited signals in EELS and EDS tomography. Further work in combining multiple imaging and spectroscopy modalities, developing synergistic data acquisition, processing, and reconstruction approaches, and improving the precision of quantitative spectroscopic tomography will expand the frontiers of spatial resolution, dose limits, and maximal information recovery. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Virtual k-Space Modulation Optical Microscopy

    NASA Astrophysics Data System (ADS)

    Kuang, Cuifang; Ma, Ye; Zhou, Renjie; Zheng, Guoan; Fang, Yue; Xu, Yingke; Liu, Xu; So, Peter T. C.

    2016-07-01

    We report a novel superresolution microscopy approach for imaging fluorescence samples. The reported approach, termed virtual k-space modulation optical microscopy (VIKMOM), is able to improve the lateral resolution by a factor of 2, reduce the background level, improve the optical sectioning effect and correct for unknown optical aberrations. In the acquisition process of VIKMOM, we used a scanning confocal microscope setup with a 2D detector array to capture sample information at each scanned x-y position. In the recovery process of VIKMOM, we first modulated the captured data by virtual k-space coding and then employed a ptychography-inspired procedure to recover the sample information and correct for unknown optical aberrations. We demonstrated the performance of the reported approach by imaging fluorescent beads, fixed bovine pulmonary artery endothelial (BPAE) cells, and living human astrocytes (HA). As the VIKMOM approach is fully compatible with conventional confocal microscope setups, it may provide a turn-key solution for imaging biological samples with ~100 nm lateral resolution, in two or three dimensions, with improved optical sectioning capabilities and aberration correction.

  1. Student Technology Use in the Information-Seeking and Information-Gathering Process: A Critical Incident Approach for Benchmarking Performance

    ERIC Educational Resources Information Center

    Cordes, Sean

    2012-01-01

    This article is an exploratory study of student behavior using online tools to do project-based work for a library science course at a mid-sized Midwestern public university. The population was 22 net generation students aged 18-24, who were enrolled in an Introduction to Information Resources course. The study was designed to better understand…

  2. An Evaluation of Understandability of Patient Journey Models in Mental Health.

    PubMed

    Percival, Jennifer; McGregor, Carolyn

    2016-07-28

    There is a significant trend toward implementing health information technology to reduce administrative costs and improve patient care. Unfortunately, little awareness exists of the challenges of integrating information systems with existing clinical practice. The systematic integration of clinical processes with information system and health information technology can benefit the patients, staff, and the delivery of care. This paper presents a comparison of the degree of understandability of patient journey models. In particular, the authors demonstrate the value of a relatively new patient journey modeling technique called the Patient Journey Modeling Architecture (PaJMa) when compared with traditional manufacturing based process modeling tools. The paper also presents results from a small pilot case study that compared the usability of 5 modeling approaches in a mental health care environment. Five business process modeling techniques were used to represent a selected patient journey. A mix of both qualitative and quantitative methods was used to evaluate these models. Techniques included a focus group and survey to measure usability of the various models. The preliminary evaluation of the usability of the 5 modeling techniques has shown increased staff understanding of the representation of their processes and activities when presented with the models. Improved individual role identification throughout the models was also observed. The extended version of the PaJMa methodology provided the most clarity of information flows for clinicians. The extended version of PaJMa provided a significant improvement in the ease of interpretation for clinicians and increased the engagement with the modeling process. The use of color and its effectiveness in distinguishing the representation of roles was a key feature of the framework not present in other modeling approaches. 
Future research should focus on extending the pilot case study to a more diversified group of clinicians and health care support workers.

  3. "Chemical transformers" from nanoparticle ensembles operated with logic.

    PubMed

    Motornov, Mikhail; Zhou, Jian; Pita, Marcos; Gopishetty, Venkateshwarlu; Tokarev, Ihor; Katz, Evgeny; Minko, Sergiy

    2008-09-01

    pH-responsive nanoparticles were coupled with information-processing enzyme-based systems to yield "smart" signal-responsive hybrid systems with built-in Boolean logic. The enzyme systems performed AND/OR logic operations, transducing biochemical input signals into reversible structural changes (signal-directed self-assembly) of the nanoparticle assemblies, thus resulting in the processing and amplification of the biochemical signals. The hybrid system mimics biological systems in effectively processing complex biochemical information, resulting in reversible changes of the self-assembled structures of the nanoparticles. The bioinspired approach to the nanostructured morphing materials could be used in future self-assembled molecular robotic systems.

  4. Distributed Processing with a Mainframe-Based Hospital Information System: A Generalized Solution

    PubMed Central

    Kirby, J. David; Pickett, Michael P.; Boyarsky, M. William; Stead, William W.

    1987-01-01

    Over the last two years the Medical Center Information Systems Department at Duke University Medical Center has been developing a systematic approach to distributing the processing and data involved in computerized applications at DUMC. The resulting system has been named MAPS, the Micro-ADS Processing System. A key characteristic of MAPS is that it makes it easy to execute any existing mainframe ADS application with a request from a PC. This extends the functionality of the mainframe application set to the PC without compromising the maintainability of the PC or mainframe systems.

  5. Informed spectral analysis: audio signal parameter estimation using side information

    NASA Astrophysics Data System (ADS)

    Fourer, Dominique; Marchand, Sylvain

    2013-12-01

    Parametric models are of great interest for representing and manipulating sounds. However, the quality of the resulting signals depends on the precision of the parameters. When the signals are available, these parameters can be estimated, but the presence of noise decreases the resulting precision of the estimation. Furthermore, the Cramér-Rao bound shows the minimal error reachable with the best estimator, which can be insufficient for demanding applications. These limitations can be overcome by using the coding approach, which consists of directly transmitting the parameters with the best precision using the minimal bitrate. However, this approach does not take advantage of the information provided by the estimation from the signal, and may require a larger bitrate and entail a loss of compatibility with existing file formats. The purpose of this article is to propose a compromise approach, called the 'informed approach,' which combines analysis with (coded) side information in order to increase the precision of parameter estimation using a lower bitrate than pure coding approaches, the audio signal being known. Thus, the analysis problem is presented in a coder/decoder configuration where the side information is computed and inaudibly embedded into the mixture signal at the coder. At the decoder, the extra information is extracted and is used to assist the analysis process. This study proposes applying this approach to audio spectral analysis using sinusoidal modeling, which is a well-known model with practical applications and where theoretical bounds have been calculated. This work aims at uncovering new approaches for audio quality-based applications. It provides a solution for challenging problems like active listening of music, source separation, and realistic sound transformations.
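
    The trade-off the authors describe, estimation limited by noise versus coding limited by bitrate, can be illustrated with a toy sinusoidal example: the decoder's own FFT estimate is refined by a small coded residual rather than transmitting the full parameter. The signal, the quantizer step, and the way the residual is conveyed below are simplified assumptions; the paper embeds the side information inaudibly in the audio itself.

```python
import numpy as np

rng = np.random.default_rng(1)
n, fs = 1024, 1000.0
f_true = 123.4                       # hypothetical sinusoid frequency (Hz)
t = np.arange(n) / fs
signal = np.sin(2 * np.pi * f_true * t) + 0.5 * rng.standard_normal(n)

# Plain analysis: FFT peak picking, limited to the bin resolution fs/n.
spectrum = np.abs(np.fft.rfft(signal))
peak_bin = np.argmax(spectrum[1:]) + 1          # skip the DC bin
f_est = peak_bin * fs / n

# Informed analysis: the coder, which knows f_true, transmits only the
# small residual quantized to a coarse step, instead of the full value.
step = 0.01                          # quantizer step (Hz), an assumption
residual = round((f_true - f_est) / step) * step
f_informed = f_est + residual

print(abs(f_true - f_est), abs(f_true - f_informed))
```

    The informed estimate is accurate to half the quantizer step, while the side information costs only the few bits needed for the residual rather than a full-precision frequency value.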

  6. Intentional and automatic processing of numerical information in mathematical anxiety: testing the influence of emotional priming.

    PubMed

    Ashkenazi, Sarit

    2018-02-05

    Current theoretical approaches suggest that mathematical anxiety (MA) manifests itself as a weakness in quantity manipulations. This study is the first to examine automatic versus intentional processing of numerical information using the numerical Stroop paradigm in participants with high MA. To manipulate anxiety levels, we combined the numerical Stroop task with an affective priming paradigm. We took a group of college students with high MA and compared their performance to a group of participants with low MA. Under low anxiety conditions (neutral priming), participants with high MA showed relatively intact number processing abilities. However, under high anxiety conditions (mathematical priming), participants with high MA showed (1) increased processing of the irrelevant non-numerical information, which aligns with the theoretical view regarding deficits in selective attention in anxiety and (2) an abnormal numerical distance effect. These results demonstrate that abnormal, basic numerical processing in MA is context related.

  7. Parametric boundary reconstruction algorithm for industrial CT metrology application.

    PubMed

    Yin, Zhye; Khare, Kedar; De Man, Bruno

    2009-01-01

    High-energy X-ray computed tomography (CT) systems have been recently used to produce high-resolution images in various nondestructive testing and evaluation (NDT/NDE) applications. The accuracy of the dimensional information extracted from CT images is rapidly approaching the accuracy achieved with a coordinate measuring machine (CMM), the conventional approach to acquire the metrology information directly. On the other hand, CT systems generate the sinogram which is transformed mathematically to the pixel-based images. The dimensional information of the scanned object is extracted later by performing edge detection on reconstructed CT images. The dimensional accuracy of this approach is limited by the grid size of the pixel-based representation of CT images since the edge detection is performed on the pixel grid. Moreover, reconstructed CT images usually display various artifacts due to the underlying physical process and resulting object boundaries from the edge detection fail to represent the true boundaries of the scanned object. In this paper, a novel algorithm to reconstruct the boundaries of an object with uniform material composition and uniform density is presented. There are three major benefits in the proposed approach. First, since the boundary parameters are reconstructed instead of image pixels, the complexity of the reconstruction algorithm is significantly reduced. The iterative approach, which can be computationally intensive, will be practical with the parametric boundary reconstruction. Second, the object of interest in metrology can be represented more directly and accurately by the boundary parameters instead of the image pixels. By eliminating the extra edge detection step, the overall dimensional accuracy and process time can be improved. 
Third, since the parametric reconstruction approach shares the boundary representation with other conventional metrology modalities such as CMM, boundary information from other modalities can be directly incorporated as prior knowledge to improve the convergence of an iterative approach. In this paper, the feasibility of parametric boundary reconstruction algorithm is demonstrated with both simple and complex simulated objects. Finally, the proposed algorithm is applied to the experimental industrial CT system data.
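
    The contrast between pixel-grid and parametric boundary representations can be illustrated with a toy least-squares fit: three parameters (center and radius) describe a circular boundary to sub-pixel precision, whereas an edge map is limited by the grid. This sketch fits image-domain points and is not the authors' sinogram-domain algorithm; the points and noise level are assumptions.

```python
import numpy as np

def fit_circle(points):
    """Least-squares circle fit: a boundary described by three parameters
    (cx, cy, r) instead of a grid of pixels. Uses the algebraic form
    x^2 + y^2 = 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2)."""
    pts = np.asarray(points, float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx**2 + cy**2)
    return cx, cy, r

rng = np.random.default_rng(2)
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
# Hypothetical noisy boundary samples of a circle at (1, -2), radius 5.
pts = np.column_stack([1 + 5 * np.cos(theta), -2 + 5 * np.sin(theta)])
pts += 0.02 * rng.standard_normal(pts.shape)

cx, cy, r = fit_circle(pts)
print(round(cx, 2), round(cy, 2), round(r, 2))   # approx. 1.0 -2.0 5.0
```

    Because the unknowns are a handful of shape parameters rather than thousands of pixels, the least-squares system stays tiny, which is the complexity benefit the abstract points to.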

  8. The effects of viewpoint on the virtual space of pictures

    NASA Technical Reports Server (NTRS)

    Sedgwick, H. A.

    1989-01-01

    Pictorial displays whose primary purpose is to convey accurate information about the 3-D spatial layout of an environment are discussed. How, and how well, pictures can convey such information is discussed. It is suggested that picture perception is not best approached as a unitary, indivisible process. Rather, it is a complex process depending on multiple, partially redundant, interacting sources of visual information for both the real surface of the picture and the virtual space beyond. Each picture must be assessed for the particular information that it makes available. This will determine how accurately the virtual space represented by the picture is seen, as well as how it is distorted when seen from the wrong viewpoint.

  9. Context-Aware Recommender Systems

    NASA Astrophysics Data System (ADS)

    Adomavicius, Gediminas; Tuzhilin, Alexander

    The importance of contextual information has been recognized by researchers and practitioners in many disciplines, including e-commerce personalization, information retrieval, ubiquitous and mobile computing, data mining, marketing, and management. While a substantial amount of research has already been performed in the area of recommender systems, most existing approaches focus on recommending the most relevant items to users without taking into account any additional contextual information, such as time, location, or the company of other people (e.g., for watching movies or dining out). In this chapter we argue that relevant contextual information does matter in recommender systems and that it is important to take this information into account when providing recommendations. We discuss the general notion of context and how it can be modeled in recommender systems. Furthermore, we introduce three different algorithmic paradigms - contextual pre-filtering, post-filtering, and modeling - for incorporating contextual information into the recommendation process, discuss the possibilities of combining several context-aware recommendation techniques into a single unifying approach, and provide a case study of one such combined approach. Finally, we present additional capabilities for context-aware recommenders and discuss important and promising directions for future research.
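
    The contextual pre-filtering paradigm mentioned above is straightforward to sketch: restrict the ratings data to the target context, then apply any conventional recommender to the reduced set. The following minimal illustration (all names and ratings are hypothetical, not from the chapter) uses a simple item-average recommender:

```python
from collections import defaultdict

# Hypothetical ratings: (user, item, rating, context) tuples.
ratings = [
    ("alice", "movie_a", 5, "weekend"),
    ("alice", "movie_b", 2, "weekday"),
    ("bob",   "movie_a", 4, "weekend"),
    ("bob",   "movie_c", 5, "weekend"),
    ("carol", "movie_c", 3, "weekday"),
]

def recommend(target_user, context, ratings, k=1):
    """Contextual pre-filtering: keep only ratings matching the target
    context, then rank unseen items by their mean rating in that context."""
    in_context = [r for r in ratings if r[3] == context]
    seen = {item for user, item, _, _ in in_context if user == target_user}
    sums, counts = defaultdict(float), defaultdict(int)
    for _, item, score, _ in in_context:
        sums[item] += score
        counts[item] += 1
    candidates = {i: sums[i] / counts[i] for i in sums if i not in seen}
    return sorted(candidates, key=candidates.get, reverse=True)[:k]

print(recommend("alice", "weekend", ratings))   # ['movie_c']
```

    Post-filtering would instead recommend on the full data and adjust the ranked list for context afterwards; contextual modeling builds context directly into the prediction function.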

  10. Motivation, Cognitive Processing and Achievement in Higher Education

    ERIC Educational Resources Information Center

    Bruinsma, Marjon

    2004-01-01

    This study investigated the question of whether a student's expectancy, values and negative affect influenced their deep information processing approach and achievement at the end of the first and second academic year. Five hundred and sixty-five first-year students completed a self-report questionnaire on three different occasions. The…

  11. Illustrations as Adjuncts to Prose: A Text-Appropriate Processing Approach.

    ERIC Educational Resources Information Center

    Waddill, Paula J.; And Others

    1988-01-01

    The effects of pictorial illustrations on memory for text were studied in 144 college students. Two experiments indicated that illustrations serve a supplementary function; adjunct pictures alone, without special processing instructions, do not help learners encode information that is not normally encoded in the first place. (SLD)

  12. An Operational System for Subject Switching between Controlled Vocabularies: A Computational Linguistics Approach.

    ERIC Educational Resources Information Center

    Silvester, June P.; And Others

    This report describes a new automated process that pioneers full-scale operational use of subject switching by the NASA (National Aeronautics and Space Administration) Scientific and Technical Information (STI) Facility. The subject switching process routinely translates machine-readable subject terms from one controlled vocabulary into the…

  13. A Subjective and Objective Process for Athletic Training Student Selection

    ERIC Educational Resources Information Center

    Hawkins, Jeremy R.; McLoda, Todd A.; Stanek, Justin M.

    2015-01-01

    Context: Admission decisions are made annually concerning whom to accept into athletic training programs. Objective: To present an approach used to make admissions decisions at an undergraduate athletic training program and to corroborate this information by comparing each aspect to nursing program admission processes. Background: Annually,…

  14. "Meta-Talk" as a Composition Tool: Promoting Reflective Dialogue during the Drafting Process

    ERIC Educational Resources Information Center

    Song, Ah-Young

    2017-01-01

    This article argues for expanded opportunities for metalinguistic dialogue and written response rounds in order to better understand students' needs. Encouraging students to reflect on their compositions can invite multiple stylistic approaches and inform a more participatory composition process. The writing explores theoretical underpinnings,…

  15. A Comparative Approach to Educational Forms and Learning Processes.

    ERIC Educational Resources Information Center

    Lave, Jean

    1982-01-01

    Study of processes by which Liberian apprentice tailors learn their craft is the basis for questioning the traditional dichotomy of "formal" and "informal" education. Used as an analogy to demonstrate that anthropologists need not leave the study of learning to the psychologists, but can make valuable contributions by pursuing…

  16. Teaching Information Literacy and Scientific Process Skills: An Integrated Approach.

    ERIC Educational Resources Information Center

    Souchek, Russell; Meier, Marjorie

    1997-01-01

    Describes an online searching and scientific process component taught as part of the laboratory for a general zoology course. The activities were designed to be gradually more challenging, culminating in a student-developed final research project. Student evaluations were positive, and faculty indicated that student research skills transferred to…

  17. Ethnographic process evaluation in primary care: explaining the complexity of implementation.

    PubMed

    Bunce, Arwen E; Gold, Rachel; Davis, James V; McMullen, Carmit K; Jaworski, Victoria; Mercer, MaryBeth; Nelson, Christine

    2014-12-05

    The recent growth of implementation research in care delivery systems has led to a renewed interest in methodological approaches that deliver not only intervention outcome data but also deep understanding of the complex dynamics underlying the implementation process. We suggest that an ethnographic approach to process evaluation, when informed by and integrated with quantitative data, can provide this nuanced insight into intervention outcomes. The specific methods used in such ethnographic process evaluations are rarely presented in detail; our objective is to stimulate a conversation around the successes and challenges of specific data collection methods in health care settings. We use the example of a translational clinical trial among 11 community clinics in Portland, OR that are implementing an evidence-based, health-information technology (HIT)-based intervention focused on patients with diabetes. Our ethnographic process evaluation employed weekly diaries by clinic-based study employees, observation, informal and formal interviews, document review, surveys, and group discussions to identify barriers and facilitators to implementation success, provide insight into the quantitative study outcomes, and uncover lessons potentially transferable to other implementation projects. These methods captured the depth and breadth of factors contributing to intervention uptake, while minimizing disruption to clinic work and supporting mid-stream shifts in implementation strategies. A major challenge is the amount of dedicated researcher time required. The deep understanding of the 'how' and 'why' behind intervention outcomes that can be gained through an ethnographic approach improves the credibility and transferability of study findings. We encourage others to share their own experiences with ethnography in implementation evaluation and health services research, and to consider adapting the methods and tools described here for their own research.

  18. Multi-template image matching using alpha-rooted biquaternion phase correlation with application to logo recognition

    NASA Astrophysics Data System (ADS)

    DelMarco, Stephen

    2011-06-01

    Hypercomplex approaches are seeing increased application to signal and image processing problems. The use of multicomponent hypercomplex numbers, such as quaternions, enables the simultaneous co-processing of multiple signal or image components. This joint processing capability can provide improved exploitation of the information contained in the data, thereby leading to improved performance in detection and recognition problems. In this paper, we apply hypercomplex processing techniques to the logo image recognition problem. Specifically, we develop an image matcher by generalizing classical phase correlation to the biquaternion case. We further incorporate biquaternion Fourier domain alpha-rooting enhancement to create Alpha-Rooted Biquaternion Phase Correlation (ARBPC). We present the mathematical properties which justify use of ARBPC as an image matcher. We present numerical performance results of a logo verification problem using real-world logo data, demonstrating the performance improvement obtained using the hypercomplex approach. We compare results of the hypercomplex approach to standard multi-template matching approaches.
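
    The biquaternion generalization itself is beyond a short example, but the classical phase correlation that ARBPC extends can be sketched in a few lines of NumPy: the peak of the normalized cross-power spectrum gives the cyclic shift between two images. This is a minimal sketch of the classical baseline, not the authors' implementation.

```python
import numpy as np

def phase_correlation(a, b):
    """Classical phase correlation: the inverse FFT of the normalized
    cross-power spectrum peaks at the relative (cyclic) translation."""
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = Fa * np.conj(Fb)
    cross /= np.abs(cross) + 1e-12        # keep phase information only
    corr = np.fft.ifft2(cross).real
    return np.unravel_index(np.argmax(corr), corr.shape)

rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, shift=(5, 9), axis=(0, 1))

dy, dx = phase_correlation(shifted, img)
print(int(dy), int(dx))                   # recovers the applied shift: 5 9
```

    ARBPC replaces the complex FFT with a biquaternion Fourier transform, so all color or multi-template components are co-processed in one correlation, with alpha-rooting applied in the transform domain for enhancement.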

  19. Pre-processing liquid chromatography/high-resolution mass spectrometry data: extracting pure mass spectra by deconvolution from the invariance of isotopic distribution.

    PubMed

    Krishnan, Shaji; Verheij, Elwin E R; Bas, Richard C; Hendriks, Margriet W B; Hankemeier, Thomas; Thissen, Uwe; Coulier, Leon

    2013-05-15

    Mass spectra obtained by deconvolution of liquid chromatography/high-resolution mass spectrometry (LC/HRMS) data can be impaired by non-informative mass-to-charge (m/z) channels. This impairment of mass spectra can have significant negative influence on further post-processing, like quantification and identification. A metric derived from the knowledge of errors in isotopic distribution patterns, and the quality of the signal within a pre-defined mass chromatogram block, has been developed to pre-select all informative m/z channels. This procedure results in the clean-up of deconvoluted mass spectra by maintaining the intensity counts from m/z channels that originate from a specific compound, for example, the molecular ion, adducts, (13)C-isotopes, and multiply charged ions, and removing all m/z channels that are not related to the specific peak. The methodology has been successfully demonstrated for two sets of high-resolution LC/MS data. The approach described is therefore thought to be a useful tool in the automatic processing of LC/HRMS data. It clearly shows the advantages compared to other approaches like peak picking and de-isotoping in the sense that all information is retained while non-informative data is removed automatically. Copyright © 2013 John Wiley & Sons, Ltd.
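
    The underlying idea of screening m/z channels against the invariance of the isotopic distribution can be shown with a toy example. The pattern, tolerance, and scoring function below are illustrative assumptions, not the published metric:

```python
import numpy as np

def isotope_pattern_error(observed, theoretical):
    """Total absolute deviation between observed isotope intensities and
    the theoretical distribution, both normalized to unit sum."""
    obs = np.asarray(observed, float)
    theo = np.asarray(theoretical, float)
    obs /= obs.sum()
    theo /= theo.sum()
    return float(np.abs(obs - theo).sum())

# Hypothetical theoretical M, M+1, M+2 pattern for a small molecule.
theoretical = [0.90, 0.09, 0.01]

informative = isotope_pattern_error([0.89, 0.10, 0.01], theoretical)
noise = isotope_pattern_error([0.50, 0.30, 0.20], theoretical)

# Channels whose deviation exceeds a chosen tolerance are discarded.
print(informative < 0.05 < noise)         # True
```

    A channel whose intensities track the expected pattern is kept as compound-related; a channel that deviates strongly is treated as non-informative and removed before further post-processing.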

  20. Technical Potential Assessment for the Renewable Energy Zone (REZ) Process: A GIS-Based Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Nathan; Roberts, Billy J

    Geographic Information Systems (GIS)-based energy resource and technical potential assessments identify areas capable of supporting high levels of renewable energy (RE) development as part of a Renewable Energy Zone (REZ) Transmission Planning process. This document expands on the REZ Process to aid practitioners in conducting GIS-based RE resource and technical potential assessments. The REZ process is an approach to plan, approve, and build transmission infrastructure that connects REZs - geographic areas that have high-quality RE resources, suitable topography and land-use designations, and demonstrated developer interest - to the power system. The REZ process helps to increase the share of solar photovoltaic (PV), wind, and other resources while also maintaining reliability and economics.
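
    A GIS-based technical potential screening of the kind described reduces, at its core, to boolean exclusion masks over gridded data. The thresholds and the PV power density below are illustrative assumptions, not REZ guidance values:

```python
import numpy as np

# Hypothetical 1 km^2 grid cells for a candidate renewable energy zone.
ghi = np.array([[5.8, 5.6, 4.2, 5.9],
                [5.7, 5.5, 4.1, 5.8],
                [5.9, 4.0, 4.3, 5.7],
                [5.6, 5.8, 5.9, 4.4]])   # solar resource, kWh/m^2/day
slope = np.array([[2, 3, 1, 8],
                  [1, 2, 2, 3],
                  [4, 1, 2, 2],
                  [9, 3, 2, 1]])         # terrain slope, percent
protected = np.zeros((4, 4), dtype=bool)
protected[2, 3] = True                   # excluded land-use designation

# Exclusion-style screening: keep only cells meeting every criterion.
suitable = (ghi >= 5.0) & (slope <= 5.0) & ~protected

power_density = 30.0                     # MW/km^2 for PV, an assumed value
capacity_mw = suitable.sum() * 1.0 * power_density
print(int(suitable.sum()), capacity_mw)  # 8 suitable cells -> 240.0 MW
```

    Real assessments layer many more exclusions (water bodies, urban areas, distance to transmission) over much finer grids, but the mask-and-multiply structure is the same.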

  1. [Multi-criteria decision analysis for health technology resource allocation and assessment: so far and so near?]

    PubMed

    Campolina, Alessandro Gonçalves; Soárez, Patrícia Coelho De; Amaral, Fábio Vieira do; Abe, Jair Minoro

    2017-10-26

    Multi-criteria decision analysis (MCDA) is an emerging tool that allows the integration of relevant factors for health technology assessment (HTA). This study aims to present a summary of the methodological characteristics of MCDA: definitions, approaches, applications, and implementation stages. A case study was conducted in the São Paulo State Cancer Institute (ICESP) in order to understand the perspectives of decision-makers in the process of drafting a recommendation for the incorporation of technology in the Brazilian Unified National Health System (SUS), through a report by the Brazilian National Commission for the Incorporation of Technologies in the SUS (CONITEC). Paraconsistent annotated evidential logic Eτ was the methodological approach adopted in the study, since it can serve as an underlying logic for constructs capable of synthesizing objective information (from the scientific literature) and subjective information (from experts' values and preferences in the area of knowledge). It also allows the incorporation of conflicting information (contradictions), as well as vague and even incomplete information in the valuation process, resulting from imperfection of the available scientific evidence. The method has the advantages of allowing explicit consideration of the criteria that influenced the decision, facilitating follow-up and visualization of process stages, allowing assessment of the contribution of each criterion separately, and in aggregate, to the decision's outcome, facilitating the discussion of diverging perspectives by different stakeholder groups, and increasing the understanding of the resulting recommendations. The use of an explicit MCDA approach should facilitate conflict mediation and optimize participation by different stakeholder groups.

  2. Trust-Based Security Level Evaluation Using Bayesian Belief Networks

    NASA Astrophysics Data System (ADS)

    Houmb, Siv Hilde; Ray, Indrakshi; Ray, Indrajit; Chakraborty, Sudip

    Security is not merely about technical solutions and patching vulnerabilities. Security is about trade-offs and adhering to realistic security needs, employed to support core business processes. Also, modern systems are subject to a highly competitive market, often demanding rapid development cycles, short life-time, short time-to-market, and small budgets. Security evaluation standards, such as the ISO/IEC 15408 Common Criteria and ISO/IEC 27002, are not adequate for evaluating the security of many modern systems because of resource limitations, time-to-market, and other constraints. Towards this end, we propose an alternative time- and cost-effective approach for evaluating the security level of a security solution, system or part thereof. Our approach relies on collecting information from different sources, which are trusted to varying degrees, and on using a trust measure to aggregate the available information when deriving the security level. Our approach is quantitative and implemented as a Bayesian Belief Network (BBN) topology, allowing us to reason over uncertain information and to aggregate seemingly disparate information. We illustrate our approach by deriving the security level of two alternative Denial of Service (DoS) solutions. Our approach can also be used in the context of security solution trade-off analysis.
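
    The full BBN topology is beyond the scope of an abstract, but the core notion of aggregating evidence from sources trusted to varying degrees can be shown with a toy trust-weighted average. The scores, trust values, and aggregation rule are hypothetical illustrations, not the authors' model:

```python
def trust_weighted_level(assessments):
    """Aggregate security-level scores (0-1) from multiple sources,
    weighting each score by how much the evaluator trusts its source."""
    total = sum(trust for _, trust in assessments)
    return sum(score * trust for score, trust in assessments) / total

# Hypothetical evidence about a DoS solution: (score, trust in source).
evidence = [
    (0.9, 0.8),   # internal penetration test, highly trusted
    (0.6, 0.5),   # vendor self-assessment, moderately trusted
    (0.3, 0.2),   # anonymous forum report, weakly trusted
]
level = trust_weighted_level(evidence)
print(round(level, 3))   # 0.72
```

    A BBN refines this by propagating the uncertainty of each source through conditional probability tables instead of collapsing it into a single weight.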

  3. Holistic versus monomeric strategies for hydrological modelling of human-modified hydrosystems

    NASA Astrophysics Data System (ADS)

    Nalbantis, I.; Efstratiadis, A.; Rozos, E.; Kopsiafti, M.; Koutsoyiannis, D.

    2011-03-01

    The modelling of human-modified basins that are inadequately measured constitutes a challenge for hydrological science. Often, models for such systems are detailed and hydraulics-based for only one part of the system while for other parts oversimplified models or rough assumptions are used. This is typically a bottom-up approach, which seeks to exploit knowledge of hydrological processes at the micro-scale at some components of the system. Also, it is a monomeric approach in two ways: first, essential interactions among system components may be poorly represented or even omitted; second, differences in the level of detail of process representation can lead to uncontrolled errors. Additionally, the calibration procedure merely accounts for the reproduction of the observed responses using typical fitting criteria. The paper aims to raise some critical issues, regarding the entire modelling approach for such hydrosystems. For this, two alternative modelling strategies are examined that reflect two modelling approaches or philosophies: a dominant bottom-up approach, which is also monomeric and, very often, based on output information, and a top-down and holistic approach based on generalized information. Critical options are examined, which codify the differences between the two strategies: the representation of surface, groundwater and water management processes, the schematization and parameterization concepts and the parameter estimation methodology. The first strategy is based on stand-alone models for surface and groundwater processes and for water management, which are employed sequentially. For each model, a different (detailed or coarse) parameterization is used, which is dictated by the hydrosystem schematization. The second strategy involves model integration for all processes, parsimonious parameterization and hybrid manual-automatic parameter optimization based on multiple objectives. 
A test case is examined in a hydrosystem in Greece with high complexities, such as extended surface-groundwater interactions, ill-defined boundaries, sinks to the sea and anthropogenic intervention with unmeasured abstractions both from surface water and aquifers. Criteria for comparison are the physical consistency of parameters, the reproduction of runoff hydrographs at multiple sites within the studied basin, the likelihood of uncontrolled model outputs, the required amount of computational effort and the performance within a stochastic simulation setting. Our work allows for investigating the deterioration of model performance in cases where no balanced attention is paid to all components of human-modified hydrosystems and the related information. Also, sources of errors are identified and their combined effect is evaluated.

  4. Combining Community Engagement and Scientific Approaches in Next-Generation Monitor Siting: The Case of the Imperial County Community Air Network.

    PubMed

    Wong, Michelle; Bejarano, Esther; Carvlin, Graeme; Fellows, Katie; King, Galatea; Lugo, Humberto; Jerrett, Michael; Meltzer, Dan; Northcross, Amanda; Olmedo, Luis; Seto, Edmund; Wilkie, Alexa; English, Paul

    2018-03-15

    Air pollution continues to be a global public health threat, and the expanding availability of small, low-cost air sensors has led to increased interest in both personal and crowd-sourced air monitoring. However, to date, few low-cost air monitoring networks have been developed with the scientific rigor or continuity needed to conduct public health surveillance and inform policy. In Imperial County, California, near the U.S./Mexico border, we used a collaborative, community-engaged process to develop a community air monitoring network that attains the scientific rigor required for research, while also achieving community priorities. By engaging community residents in the project design, monitor siting processes, data dissemination, and other key activities, the resulting air monitoring network data are relevant, trusted, understandable, and used by community residents. Integration of spatial analysis and air monitoring best practices into the network development process ensures that the data are reliable and appropriate for use in research activities. This combined approach results in a community air monitoring network that is better able to inform community residents, support research activities, guide public policy, and improve public health. Here we detail the monitor siting process and outline the advantages and challenges of this approach.

  5. Nurse adoption of continuous patient monitoring on acute post-surgical units: managing technology implementation.

    PubMed

    Jeskey, Mary; Card, Elizabeth; Nelson, Donna; Mercaldo, Nathaniel D; Sanders, Neal; Higgins, Michael S; Shi, Yaping; Michaels, Damon; Miller, Anne

    2011-10-01

    To report an exploratory action-research process used during the implementation of continuous patient monitoring in acute post-surgical nursing units. Substantial US Federal funding has been committed to implementing new health care technology, but failure to manage implementation processes may limit successful adoption and the realisation of proposed benefits. Effective approaches for managing barriers to new technology implementation are needed. Continuous patient monitoring was implemented in three of 13 medical/surgical units. An exploratory action-feedback approach, using time-series nurse surveys, was used to identify barriers and develop and evaluate responses. Post-hoc interviews and document analysis were used to describe the change implementation process. Significant differences were identified in night- and dayshift nurses' perceptions of technology benefits. Research nurses facilitated the change process by evolving 'clinical nurse implementation specialist' expertise. Health information technology (HIT)-related patient outcomes are mediated through nurses' acting on new information but HIT designed for critical care may not transfer to acute care settings. Exploratory action-feedback approaches can assist nurse managers in assessing and mitigating the real-world effects of HIT implementations. It is strongly recommended that nurse managers identify stakeholders and develop comprehensive plans for monitoring the effects of HIT in their units. © 2011 Blackwell Publishing Ltd.

  6. Combining Community Engagement and Scientific Approaches in Next-Generation Monitor Siting: The Case of the Imperial County Community Air Network

    PubMed Central

    Wong, Michelle; Bejarano, Esther; Carvlin, Graeme; King, Galatea; Lugo, Humberto; Jerrett, Michael; Northcross, Amanda; Olmedo, Luis; Seto, Edmund; Wilkie, Alexa; English, Paul

    2018-01-01

    Air pollution continues to be a global public health threat, and the expanding availability of small, low-cost air sensors has led to increased interest in both personal and crowd-sourced air monitoring. However, to date, few low-cost air monitoring networks have been developed with the scientific rigor or continuity needed to conduct public health surveillance and inform policy. In Imperial County, California, near the U.S./Mexico border, we used a collaborative, community-engaged process to develop a community air monitoring network that attains the scientific rigor required for research, while also achieving community priorities. By engaging community residents in the project design, monitor siting processes, data dissemination, and other key activities, the resulting air monitoring network data are relevant, trusted, understandable, and used by community residents. Integration of spatial analysis and air monitoring best practices into the network development process ensures that the data are reliable and appropriate for use in research activities. This combined approach results in a community air monitoring network that is better able to inform community residents, support research activities, guide public policy, and improve public health. Here we detail the monitor siting process and outline the advantages and challenges of this approach. PMID:29543726

  7. Rejection of an innovation: health information management training materials in east Africa.

    PubMed

    Gladwin, J; Dixon, R A; Wilson, T D

    2002-12-01

    A shift towards decentralization in many low-income countries has meant more skills are demanded of primary health care managers, including data and information handling at all levels of the health care system. Ministries of Health are changing their central reporting health information systems to health management information systems with emphasis on managers utilizing information at the point of collection. This paper reports on a research study to investigate the introduction of new information management strategies intended to promote an informational approach to management at the operational health service level in low-income countries. It aims to understand the process taking place when externally developed training materials (PHC MAP), which are intended to strengthen health management information systems, are introduced to potential users in an east African country. A case study has been undertaken and this research has demonstrated that the dynamic equilibrium approach to organizational change is applicable to the introduction of new information management strategies and management approaches in low-income countries. Although PHC MAP developers envisaged a technical innovation needing implementation, potential users saw the situation as one of organizational change. Contributions to theory have been made and many implications for introducing new information systems or the informational approach to management are identified. This theoretical framework could also facilitate the introduction of future information management innovations and would allow practitioners to perceive the introduction of information management innovations as one of organizational change that needs to be managed. Consequently, issues that may facilitate or inhibit adoption could be identified in advance.

  8. An Approach for Automatic Generation of Adaptive Hypermedia in Education with Multilingual Knowledge Discovery Techniques

    ERIC Educational Resources Information Center

    Alfonseca, Enrique; Rodriguez, Pilar; Perez, Diana

    2007-01-01

    This work describes a framework that combines techniques from Adaptive Hypermedia and Natural Language processing in order to create, in a fully automated way, on-line information systems from linear texts in electronic format, such as textbooks. The process is divided into two steps: an "off-line" processing step, which analyses the source text,…

  9. An Extended Petri-Net Based Approach for Supply Chain Process Enactment in Resource-Centric Web Service Environment

    NASA Astrophysics Data System (ADS)

    Wang, Xiaodong; Zhang, Xiaoyu; Cai, Hongming; Xu, Boyi

    Enacting a supply-chain process involves various partners and different IT systems. REST has received increasing attention for distributed systems with loosely coupled resources. Nevertheless, resource model incompatibilities and conflicts prevent effective process modeling and deployment in resource-centric Web service environments. In this paper, a Petri-net based framework for supply-chain process integration is proposed. A resource meta-model is constructed to represent the basic information of resources. Based on this resource meta-model, XML schemas and documents are derived, which represent resources and their states in the Petri net. Thereafter, XML-net, a high-level Petri net, is employed for modeling the control and data flow of the process. From the process model in XML-net, RESTful services and choreography descriptions are deduced. Thus, a unified resource representation and RESTful service descriptions are proposed for cross-system integration in a more effective way. A case study is given to illustrate the approach, and its desirable features are discussed.
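    As a rough illustration of the place/transition execution semantics this record builds on (not the authors' XML-net, which carries XML documents as tokens and is considerably richer), a minimal Petri net can be sketched as follows; the supply-chain places and the `ship` transition are hypothetical names chosen for the example:

```python
# Minimal place/transition Petri net: a transition is enabled when every
# input place holds at least one token; firing it consumes one token per
# input place and produces one token per output place.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (list(inputs), list(outputs))

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Hypothetical supply-chain fragment: an order is shipped once stock is
# available, consuming both resource states.
net = PetriNet({"order_received": 1, "stock_available": 1})
net.add_transition("ship", ["order_received", "stock_available"], ["order_shipped"])
if net.enabled("ship"):
    net.fire("ship")
print(net.marking)
```

    In the paper's setting, the marking would track REST resource states rather than plain token counts, and firing a transition would correspond to invoking a RESTful service that moves a resource between states.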

  10. Scenes for Social Information Processing in Adolescence: Item and factor analytic procedures for psychometric appraisal.

    PubMed

    Vagos, Paula; Rijo, Daniel; Santos, Isabel M

    2016-04-01

    Relatively little is known about measures used to investigate the validity and applications of social information processing theory. The Scenes for Social Information Processing in Adolescence includes items built using a participatory approach to evaluate the attribution of intent, emotion intensity, response evaluation, and response decision steps of social information processing. We evaluated a sample of 802 Portuguese adolescents (61.5% female; mean age = 16.44 years) using this instrument. Item analysis and exploratory and confirmatory factor analytic procedures were used for psychometric examination. Two measures for attribution of intent were produced (hostile and neutral), along with 3 emotion measures focused on negative emotional states, 8 response evaluation measures, and 4 response decision measures, including prosocial and impaired social behavior. All of these measures achieved good internal consistency values and fit indicators. Boys tended to favor and choose overt and relational aggression behaviors more often; girls conveyed higher levels of neutral attribution, sadness, assertiveness, and passivity. The Scenes for Social Information Processing in Adolescence achieved adequate psychometric results and seems to be a valuable alternative for evaluating social information processing, even though further investigation of its internal and external validity is essential. (c) 2016 APA, all rights reserved.

  11. Density estimation in tiger populations: combining information for strong inference

    USGS Publications Warehouse

    Gopalaswamy, Arjun M.; Royle, J. Andrew; Delampady, Mohan; Nichols, James D.; Karanth, K. Ullas; Macdonald, David W.

    2012-01-01

    A productive way forward in studies of animal populations is to efficiently make use of all the information available, either as raw data or as published sources, on critical parameters of interest. In this study, we demonstrate two approaches to the use of multiple sources of information on a parameter of fundamental interest to ecologists: animal density. The first approach produces estimates simultaneously from two different sources of data. The second approach was developed for situations in which initial data collection and analysis are followed up by subsequent data collection and prior knowledge is updated with new data using a stepwise process. Both approaches are used to estimate density of a rare and elusive predator, the tiger, by combining photographic and fecal DNA spatial capture–recapture data. The model, which combined information, provided the most precise estimate of density (8.5 ± 1.95 tigers/100 km2 [posterior mean ± SD]) relative to a model that utilized only one data source (photographic, 12.02 ± 3.02 tigers/100 km2 and fecal DNA, 6.65 ± 2.37 tigers/100 km2). Our study demonstrates that, by accounting for multiple sources of available information, estimates of animal density can be significantly improved.
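    The intuition behind the combined estimate above can be sketched with a much simpler device than the paper's Bayesian spatial capture-recapture model: precision-weighted (inverse-variance) pooling of two independent estimates. This is only an illustration of why combining sources tightens the estimate, not the authors' method; the inputs are the two single-source figures quoted in the abstract:

```python
import math

# Inverse-variance pooling: weight each independent estimate by the
# reciprocal of its variance; the pooled variance is the reciprocal of
# the summed weights, so it is always smaller than either input variance.
def pool(est_a, sd_a, est_b, sd_b):
    w_a, w_b = 1.0 / sd_a**2, 1.0 / sd_b**2
    pooled = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    pooled_sd = math.sqrt(1.0 / (w_a + w_b))
    return pooled, pooled_sd

# Single-source estimates from the abstract (tigers/100 km^2):
# photographic 12.02 +/- 3.02, fecal DNA 6.65 +/- 2.37.
est, sd = pool(12.02, 3.02, 6.65, 2.37)
print(round(est, 2), round(sd, 2))
```

    The pooled standard deviation is necessarily below the smaller of the two input standard deviations, mirroring the precision gain the study reports for the combined-information model.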

  12. Density estimation in tiger populations: combining information for strong inference.

    PubMed

    Gopalaswamy, Arjun M; Royle, J Andrew; Delampady, Mohan; Nichols, James D; Karanth, K Ullas; Macdonald, David W

    2012-07-01

    A productive way forward in studies of animal populations is to efficiently make use of all the information available, either as raw data or as published sources, on critical parameters of interest. In this study, we demonstrate two approaches to the use of multiple sources of information on a parameter of fundamental interest to ecologists: animal density. The first approach produces estimates simultaneously from two different sources of data. The second approach was developed for situations in which initial data collection and analysis are followed up by subsequent data collection and prior knowledge is updated with new data using a stepwise process. Both approaches are used to estimate density of a rare and elusive predator, the tiger, by combining photographic and fecal DNA spatial capture-recapture data. The model, which combined information, provided the most precise estimate of density (8.5 +/- 1.95 tigers/100 km2 [posterior mean +/- SD]) relative to a model that utilized only one data source (photographic, 12.02 +/- 3.02 tigers/100 km2 and fecal DNA, 6.65 +/- 2.37 tigers/100 km2). Our study demonstrates that, by accounting for multiple sources of available information, estimates of animal density can be significantly improved.

  13. 76 FR 38187 - International Conference on Harmonisation; Draft Guidance on Q11 Development and Manufacture of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-29

    ...The Food and Drug Administration (FDA) is announcing the availability of a draft guidance entitled ``Q11 Development and Manufacture of Drug Substances.'' The draft guidance was prepared under the auspices of the International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH). The draft guidance describes approaches to developing process and drug substance understanding and provides guidance on what information should be provided in certain sections of the Common Technical Document (CTD). The draft guidance is intended to harmonize the scientific and technical principles relating to the description and justification of the development and manufacturing process of drug substances (both chemical entities and biotechnological/biological entities) to enable a consistent approach for providing and evaluating this information across the three regions.

  14. MedEx/J: A One-Scan Simple and Fast NLP Tool for Japanese Clinical Texts.

    PubMed

    Aramaki, Eiji; Yano, Ken; Wakamiya, Shoko

    2017-01-01

    Because of the recent replacement of physical documents with electronic medical records (EMR), the importance of information processing in the medical field has increased. In light of this trend, we have been developing MedEx/J, which retrieves important Japanese-language information from medical reports. MedEx/J executes two tasks simultaneously: (1) term extraction, and (2) positive and negative event classification. We designate this as a one-scan approach, providing system simplicity and reasonable accuracy. MedEx/J achieves an F1 score of 0.87 on term extraction and 0.63 on positive-negative classification. This paper also discusses the remaining issues in the medical natural language processing field.
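    The F-scores reported in this record are instances of the general F-beta measure with beta = 1, i.e. the harmonic mean of precision and recall. The precision and recall values below are made up purely for illustration; only the formula itself is standard:

```python
# General F-beta score: beta > 1 weights recall more heavily, beta < 1
# weights precision more heavily; beta = 1 gives the familiar F1 score.
def f_beta(precision, recall, beta=1.0):
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Hypothetical precision/recall pair that yields an F1 near the record's
# term-extraction figure of 0.87.
print(round(f_beta(0.90, 0.84), 2))  # -> 0.87 with these example values
```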

  15. Multimedia Approach and Its Effect in Teaching Mathematics for the Prospective Teachers

    ERIC Educational Resources Information Center

    Joan, D. R. Robert; Denisia, S. P.

    2012-01-01

    Multimedia improves the effectiveness of the teaching-learning process in formal and informal settings by utilizing scientific principles. It allows us to sort information, analyse it, and make meaning for conceptualization and application suited to individual learners. The objective of the study was to measure the…

  16. Sex Differences in Object Location Memory: Some Further Methodological Considerations

    ERIC Educational Resources Information Center

    Gallagher, Peter; Neave, Nick; Hamilton, Colin; Gray, John M.

    2006-01-01

    Previously it has been reported that female performance on the recall of objects and their locations in a spatial array is superior to that of males. This may reflect underlying information-processing biases whereby males organize information in a self-referential manner while females adopt a more comprehensive approach. The known female advantage…

  17. Organizational Learning and Power Dynamics: A Study in a Brazilian University

    ERIC Educational Resources Information Center

    Santos, Jane Lucia Silva; Steil, Andrea Valéria

    2015-01-01

    Purpose: This paper aims to describe and analyze organizational learning processes and power dynamics during the adoption and use of an information system (IS) at a Brazilian public organization. Design/methodology/approach: A case study was chosen as the research method. Data were gathered from documents and interviews with key informants.…

  18. Environmental Education and Networking in Mafeteng Primary Schools: A Participatory Approach

    ERIC Educational Resources Information Center

    Bitso, Constance

    2006-01-01

    This paper explores a participatory process of Environmental Education (EE) networking in Mafeteng primary schools. It gives an overview of the existing EE efforts in Lesotho, particularly the models schools of the National Curriculum Development Centre. It also provides information about Lesotho Environmental Information Network as the body that…

  19. The New Improved Big6 Workshop Handbook. Professional Growth Series.

    ERIC Educational Resources Information Center

    Eisenberg, Michael B.; Berkowitz, Robert E.

    This handbook is intended to help classroom teachers, teacher-librarians, technology teachers, administrators, parents, community members, and students to learn about the Big6 Skills approach to information and technology skills, to use the Big6 process in their own activities, and to implement a Big6 information and technology skills program. The…

  20. People and Process: Managing the Human Side of Information Technology Application. Professional Paper Series, #7.

    ERIC Educational Resources Information Center

    Baltzer, Jan A.

    Recognizing that the hard part of making the application of technology successful is the development of appropriate management structures and approaches, this paper reviews the research and writings of several top management and communications professionals and correlates these theories to the information technology environment on campus. Six…
