Sample records for information processing due

  1. Reversal of alcohol-induced effects on response control due to changes in proprioceptive information processing.

    PubMed

    Stock, Ann-Kathrin; Mückschel, Moritz; Beste, Christian

    2017-01-01

    Recent research has drawn attention to the effects of binge drinking on response selection. However, choosing an appropriate response is a complex endeavor that usually requires us to process and integrate several streams of information. One of them is proprioceptive information about the position of limbs. To date, however, it has remained elusive how binge drinking affects the processing of proprioceptive information during response selection and control in healthy individuals. We investigated this question using neurophysiological (EEG) techniques in a response selection task, where we manipulated proprioceptive information. The results show a reversal of alcohol-induced effects on response control due to changes in proprioceptive information processing. The most likely explanation for this finding is that proprioceptive information does not seem to be properly integrated into response selection processes during acute alcohol intoxication as found in binge drinking. The neurophysiological data suggest that processes related to the preparation and execution of the motor response, but not upstream processes related to conflict monitoring and spatial attentional orienting, underlie these binge drinking-dependent modulations. Taken together, the results show that even high doses of alcohol have very specific effects within the cascade of neurophysiological processes underlying response control and the integration of proprioceptive information during this process. © 2015 Society for the Study of Addiction.

  2. Usage of information safety requirements in improving tube bending process

    NASA Astrophysics Data System (ADS)

    Livshitz, I. I.; Kunakov, E.; Lontsikh, P. A.

    2018-05-01

    This article is devoted to improving the analysis of a technological process through the implementation of information security requirements. The aim of this research is to analyze how implementing information technology can increase the competitiveness of aircraft industry enterprises, using the tube bending technological process as an example. The article reviews the kinds of tube bending and current techniques. In addition, a potential risk analysis of the tube bending technological process is carried out in terms of information security.

  3. 45 CFR 303.72 - Requests for collection of past-due support by Federal tax refund offset.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the accuracy of the past-due support amount. If the State IV-D agency has verified this information...) The amount of past-due support owed; (iv) The State codes as contained in the Federal Information Processing Standards (FIPS) publication of the National Bureau of Standards and also promulgated by the...

  4. Justice and Due Process.

    ERIC Educational Resources Information Center

    McKinney-Browning, Mabel C.

    1981-01-01

    Presents a directory of educational materials in the areas of justice and due process. Materials are listed in three categories--films, books, and project-created materials. For each entry, information is presented on title, author, publisher or developer, publication date, price, and annotation. (DB)

  5. Information Technology Project Processes: Understanding the Barriers to Improvement and Adoption

    ERIC Educational Resources Information Center

    Williams, Bernard L.

    2009-01-01

    Every year, organizations lose millions of dollars due to IT (Information Technology) project failures. Over time, organizations have developed processes and procedures to help reduce the incidence of challenged IT projects. Research has shown that IT project processes can work to help reduce the number of challenged projects. The research in this…

  6. Proceduralism and Bureaucracy: Due Process in the School Setting

    ERIC Educational Resources Information Center

    Kirp, David L.

    1976-01-01

    The likely consequences of applying traditional due process standards, especially formal adversary hearings, to the public school are examined. The ruling in Goss v. Lopez suggests that fair treatment can still be expected if the hearings are treated as opportunities for candid and informal exchange rather than prepunishment ceremonies. (LBH)

  7. Mechanism on brain information processing: Energy coding

    NASA Astrophysics Data System (ADS)

    Wang, Rubin; Zhang, Zhikang; Jiao, Xianfa

    2006-09-01

    Building on the experimental finding that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, the authors present a new scientific theory that offers a unique mechanism for brain information processing. They demonstrate that the neural coding produced by the activity of the brain is well described by the theory of energy coding. Because the energy coding model can reveal mechanisms of brain information processing based upon known biophysical properties, they can not only reproduce various experimental results of neuroelectrophysiology but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, they estimate that the theory has very important consequences for quantitative research of cognitive function.

  8. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
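
    A minimal place/transition Petri net sketch may help make the modelling idea above concrete. The class, place names, and toy process below are illustrative assumptions; the XML nets described in the record are a high-level Petri net variant whose tokens carry structured XML data, which this sketch does not reproduce.

      # Minimal place/transition Petri net sketch (illustrative only).
      class PetriNet:
          def __init__(self, places, transitions):
              # places: dict place name -> token count
              # transitions: dict name -> (inputs, outputs), each a dict place -> arc weight
              self.places = dict(places)
              self.transitions = dict(transitions)

          def enabled(self, t):
              inputs, _ = self.transitions[t]
              return all(self.places.get(p, 0) >= w for p, w in inputs.items())

          def fire(self, t):
              if not self.enabled(t):
                  raise ValueError(f"transition {t!r} is not enabled")
              inputs, outputs = self.transitions[t]
              for p, w in inputs.items():
                  self.places[p] -= w
              for p, w in outputs.items():
                  self.places[p] = self.places.get(p, 0) + w

      # Toy business process: an order is received, then checked, then archived.
      net = PetriNet(
          places={"order_received": 1, "order_checked": 0, "order_archived": 0},
          transitions={
              "check":   ({"order_received": 1}, {"order_checked": 1}),
              "archive": ({"order_checked": 1},  {"order_archived": 1}),
          },
      )
      net.fire("check")
      net.fire("archive")
      print(net.places)  # {'order_received': 0, 'order_checked': 0, 'order_archived': 1}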

  9. Meghan Rene, et al., v. Dr. Suellen Reed, et al. "Due Process." Lesson Plans for Secondary School Teachers on the Constitutional Requirement of "Due Process of Law." Courts in the Classroom: Curriculum Concepts and Other Information on Indiana's Courts for the K-12 Educator.

    ERIC Educational Resources Information Center

    Osborn, Elizabeth

    In the Rene v. Reed case, Meghan Rene and other disabled students argued that their due process rights were violated in regard to the Indiana Statewide Testing for Educational Progress (ISTEP) graduation examination. This set of four lesson plans uses the case of Rene v. Reed, which was first argued before the Indiana Supreme Court, to study the…

  10. A Multilevel Comprehensive Assessment of International Accreditation for Business Programmes-Based on AMBA Accreditation of GDUFS

    ERIC Educational Resources Information Center

    Jiang, Yong

    2017-01-01

    Traditional mathematical methods built around exactitude have limitations when applied to the processing of educational information, due to the uncertainty and imperfection of such information. Alternative mathematical methods, such as grey system theory, have been widely applied in processing incomplete information systems and have proven effective in a number of…

  11. An Ontology-Based Approach to Incorporate User-Generated Geo-Content Into Sdi

    NASA Astrophysics Data System (ADS)

    Deng, D.-P.; Lemmens, R.

    2011-08-01

    The Web is changing the way people share and communicate information because of the emergence of various Web technologies, which enable people to contribute information on the Web. User-Generated Geo-Content (UGGC) is a potential resource of geographic information. Due to its different production methods, UGGC often cannot fit into formal geographic information models; there is a semantic gap between UGGC and formal geographic information. To integrate UGGC into geographic information, this study conducts an ontology-based process to bridge this semantic gap. This ontology-based process includes five steps: Collection, Extraction, Formalization, Mapping, and Deployment. In addition, this study applies the process to Twitter messages relevant to the Japan earthquake disaster. By using this process, we extract disaster relief information from Twitter messages and develop a knowledge base for GeoSPARQL queries on disaster relief information.
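
    As a rough sketch of the five-step process named in the abstract (Collection, Extraction, Formalization, Mapping, Deployment), the pipeline skeleton below chains placeholder functions; the sample messages, the reliefOntology vocabulary, and all function bodies are hypothetical illustrations, not the authors' implementation.

      # Skeleton of the five-step UGGC pipeline; every function body is a placeholder.
      def collect(keyword):
          """Collection: gather raw user-generated messages (e.g., tweets) for a keyword."""
          return ["Water needed at shelter near Sendai station", "Road blocked in Ishinomaki"]

      def extract(messages):
          """Extraction: pull out candidate entities such as needs and place names."""
          return [{"need": "water", "place": "Sendai station"},
                  {"event": "road blocked", "place": "Ishinomaki"}]

      def formalize(records):
          """Formalization: express each record as simple subject-predicate-object triples."""
          return [(r.get("place"), "reports", r.get("need") or r.get("event")) for r in records]

      def map_to_ontology(triples):
          """Mapping: align informal terms with concepts of a formal (hypothetical) ontology."""
          vocab = {"water": "reliefOntology:WaterSupplyNeed",
                   "road blocked": "reliefOntology:RoadObstruction"}
          return [(s, p, vocab.get(o, o)) for s, p, o in triples]

      def deploy(triples):
          """Deployment: load the triples into a store that a GeoSPARQL endpoint could query."""
          for t in triples:
              print("INSERT", t)

      deploy(map_to_ontology(formalize(extract(collect("earthquake")))))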

  12. Estimating User Influence in Online Social Networks Subject to Information Overload

    NASA Astrophysics Data System (ADS)

    Li, Pei; Sun, Yunchuan; Chen, Yingwen; Tian, Zhi

    2014-11-01

    Online social networks have attracted remarkable attention since they provide various approaches for hundreds of millions of people to stay connected with their friends. Due to the existence of information overload, the research on diffusion dynamics in epidemiology cannot be adopted directly to that in online social networks. In this paper, we consider diffusion dynamics in online social networks subject to information overload, and model the information-processing process of a user by a queue with a batch arrival and a finite buffer. We use the average number of times a message is processed after it is generated by a given user to characterize the user influence, which is then estimated through theoretical analysis for a given network. We validate the accuracy of our estimation by simulations, and apply the results to study the impacts of different factors on the user influence. Among the observations, we find that the impact of network size on the user influence is marginal while the user influence decreases with assortativity due to information overload, which is particularly interesting.
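
    The following Monte Carlo sketch illustrates the overload idea in the abstract: each user processes at most a fixed number of newly arrived messages per step (a finite buffer) and reshares processed messages with some probability, and a user's influence is estimated as the average number of times one of its messages is processed. The follower graph, parameters, and update rule are illustrative assumptions, not the authors' batch-arrival queueing analysis.

      import random

      def simulate_influence(followers, source, buffer_size=2, forward_prob=0.5,
                             steps=20, runs=2000, seed=1):
          rng = random.Random(seed)
          total_processed = 0
          for _ in range(runs):
              inbox = {u: [] for u in followers}          # messages waiting this step
              for f in followers[source]:
                  inbox[f].append("m")                    # source posts one message
              for _ in range(steps):
                  new_inbox = {u: [] for u in followers}
                  for u, msgs in inbox.items():
                      # finite buffer: only the first buffer_size arrivals get processed
                      for m in msgs[:buffer_size]:
                          total_processed += 1
                          if rng.random() < forward_prob:  # reshare to own followers
                              for f in followers[u]:
                                  new_inbox[f].append(m)
                  inbox = new_inbox
          return total_processed / runs

      # Toy follower graph: user -> list of followers
      g = {0: [1, 2, 3], 1: [2, 4], 2: [4], 3: [4], 4: []}
      print("estimated influence of user 0:", simulate_influence(g, source=0))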

  13. Computer Self-Efficacy among Health Information Students

    ERIC Educational Resources Information Center

    Hendrix, Dorothy Marie

    2011-01-01

    Roles and functions of health information professionals are evolving due to the mandated electronic health record adoption process for healthcare facilities. A knowledgeable workforce with computer information technology skill sets is required for the successful collection of quality patient-care data, improvement of productivity, and…

  14. A Survey of Stemming Algorithms in Information Retrieval

    ERIC Educational Resources Information Center

    Moral, Cristian; de Antonio, Angélica; Imbert, Ricardo; Ramírez, Jaime

    2014-01-01

    Background: During the last fifty years, improved information retrieval techniques have become necessary because of the huge amount of information people have available, which continues to increase rapidly due to the use of new technologies and the Internet. Stemming is one of the processes that can improve information retrieval in terms of…

  15. An Analysis of Informal Reasoning Fallacy and Critical Thinking Dispositions among Malaysian Undergraduates

    ERIC Educational Resources Information Center

    Ramasamy, Shamala

    2011-01-01

    In this information age, the amount of complex information available due to technological advancement would require undergraduates to be extremely competent in processing information systematically. Critical thinking ability of undergraduates has been the focal point among educators, employers and the public at large. One of the dimensions of…

  16. Neural processing of visual information under interocular suppression: a critical review

    PubMed Central

    Sterzer, Philipp; Stein, Timo; Ludwig, Karin; Rothkirch, Marcus; Hesselmann, Guido

    2014-01-01

    When dissimilar stimuli are presented to the two eyes, only one stimulus dominates at a time while the other stimulus is invisible due to interocular suppression. When both stimuli are equally potent in competing for awareness, perception alternates spontaneously between the two stimuli, a phenomenon called binocular rivalry. However, when one stimulus is much stronger, e.g., due to higher contrast, the weaker stimulus can be suppressed for prolonged periods of time. A technique that has recently become very popular for the investigation of unconscious visual processing is continuous flash suppression (CFS): High-contrast dynamic patterns shown to one eye can render a low-contrast stimulus shown to the other eye invisible for up to minutes. Studies using CFS have produced new insights but also controversies regarding the types of visual information that can be processed unconsciously as well as the neural sites and the relevance of such unconscious processing. Here, we review the current state of knowledge in regard to neural processing of interocularly suppressed information. Focusing on recent neuroimaging findings, we discuss whether and to what degree such suppressed visual information is processed at early and more advanced levels of the visual processing hierarchy. We review controversial findings related to the influence of attention on early visual processing under interocular suppression, the putative differential roles of dorsal and ventral areas in unconscious object processing, and evidence suggesting privileged unconscious processing of emotional and other socially relevant information. On a more general note, we discuss methodological and conceptual issues, from practical issues of how unawareness of a stimulus is assessed to the overarching question of what constitutes an adequate operational definition of unawareness. Finally, we propose approaches for future research to resolve current controversies in this exciting research area. PMID:24904469

  17. Culturally Competent Informed-Consent Process to Evaluate a Social Policy for Older Persons With Low Literacy: The Mexican Case

    PubMed Central

    Aguila, Emma; Weidmer, Beverly A.; Illingworth, Alfonso Rivera; Martinez, Homero

    2017-01-01

    The informed-consent process seeks to provide complete information to participants about a research project and to protect personal information they may disclose. In this article, we present an informed-consent process that we piloted and improved to obtain consent from older adults in Yucatan, Mexico. Respondents had limited fluency in Spanish, spoke the local Mayan language, and had some physical limitations due to their age. We describe how we adapted the informed-consent process to comply with U.S. and Mexican regulations, while simplifying the forms and providing them in Spanish and Mayan. We present the challenges and lessons learned when dealing with low-literacy older populations, some with diminished autonomy, in a bilingual context and a binational approach to the legal framework. PMID:28824826

  18. Is the Survival-Processing Memory Advantage Due to Richness of Encoding?

    ERIC Educational Resources Information Center

    Röer, Jan P.; Bell, Raoul; Buchner, Axel

    2013-01-01

    Memory for words rated according to their relevance in a grassland survival context is exceptionally good. According to Nairne, Thompson, and Pandeirada's (2007) evolutionary-based explanation, natural selection processes have tuned the human memory system to prioritize the processing of fitness-relevant information. The survival-processing memory…

  19. Anxiety, anticipation and contextual information: A test of attentional control theory.

    PubMed

    Cocks, Adam J; Jackson, Robin C; Bishop, Daniel T; Williams, A Mark

    2016-09-01

    We tested the assumptions of Attentional Control Theory (ACT) by examining the impact of anxiety on anticipation using a dynamic, time-constrained task. Moreover, we examined the involvement of high- and low-level cognitive processes in anticipation and how their importance may interact with anxiety. Skilled and less-skilled tennis players anticipated the shots of opponents under low- and high-anxiety conditions. Participants viewed three types of video stimuli, each depicting different levels of contextual information. Performance effectiveness (response accuracy) and processing efficiency (response accuracy divided by corresponding mental effort) were measured. Skilled players recorded higher levels of response accuracy and processing efficiency compared to less-skilled counterparts. Processing efficiency significantly decreased under high- compared to low-anxiety conditions. No difference in response accuracy was observed. When reviewing directional errors, anxiety was most detrimental to performance in the condition conveying only contextual information, suggesting that anxiety may have a greater impact on high-level (top-down) cognitive processes, potentially due to a shift in attentional control. Our findings provide partial support for ACT; anxiety elicited greater decrements in processing efficiency than performance effectiveness, possibly due to predominance of the stimulus-driven attentional system.

  20. Autism, Context/Noncontext Information Processing, and Atypical Development

    PubMed Central

    Skoyles, John R.

    2011-01-01

    Autism has been attributed to a deficit in contextual information processing. Attempts to understand autism in terms of such a defect, however, do not include more recent computational work upon context. This work has identified that context information processing depends upon the extraction and use of the information hidden in higher-order (or indirect) associations. Higher-order associations underlie the cognition of context rather than that of situations. This paper starts by examining the differences between higher-order and first-order (or direct) associations. Higher-order associations link entities not directly (as with first-order ones) but indirectly through all the connections they have via other entities. Extracting this information requires the processing of past episodes as a totality. As a result, this extraction depends upon specialised extraction processes separate from cognition. This information is then consolidated. Due to this difference, the extraction/consolidation of higher-order information can be impaired whilst cognition remains intact. Although not directly impaired, cognition will be indirectly impaired by knock on effects such as cognition compensating for absent higher-order information with information extracted from first-order associations. This paper discusses the implications of this for the inflexible, literal/immediate, and inappropriate information processing of autistic individuals. PMID:22937255

  1. Temporal Information Partitioning Networks (TIPNets): A process network approach to infer ecohydrologic shifts

    NASA Astrophysics Data System (ADS)

    Goodwell, Allison E.; Kumar, Praveen

    2017-07-01

    In an ecohydrologic system, components of atmospheric, vegetation, and root-soil subsystems participate in forcing and feedback interactions at varying time scales and intensities. The structure of this network of complex interactions varies in terms of connectivity, strength, and time scale due to perturbations or changing conditions such as rainfall, drought, or land use. However, characterization of these interactions is difficult due to multivariate and weak dependencies in the presence of noise, nonlinearities, and limited data. We introduce a framework for Temporal Information Partitioning Networks (TIPNets), in which time-series variables are viewed as nodes, and lagged multivariate mutual information measures are links. These links are partitioned into synergistic, unique, and redundant information components, where synergy is information provided only jointly, unique information is only provided by a single source, and redundancy is overlapping information. We construct TIPNets from 1 min weather station data over several hour time windows. From a comparison of dry, wet, and rainy conditions, we find that information strengths increase when solar radiation and surface moisture are present, and surface moisture and wind variability are redundant and synergistic influences, respectively. Over a growing season, network trends reveal patterns that vary with vegetation and rainfall patterns. The framework presented here enables us to interpret process connectivity in a multivariate context, which can lead to better inference of behavioral shifts due to perturbations in ecohydrologic systems. This work contributes to more holistic characterizations of system behavior, and can benefit a wide variety of studies of complex systems.
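
    A compact sketch of a two-source information partition in the spirit of the abstract is given below: the joint mutual information I(X,Y;Z) is split into redundant, unique, and synergistic parts, here using the simple convention that redundancy equals min(I(X;Z), I(Y;Z)). The TIPNets framework may use a different redundancy measure and operates on lagged, windowed weather-station series, so this is illustrative only.

      import math
      from collections import Counter

      def mutual_information(pairs):
          """I(A;B) in bits, estimated from a list of (a, b) samples."""
          n = len(pairs)
          pab = Counter(pairs)
          pa = Counter(a for a, _ in pairs)
          pb = Counter(b for _, b in pairs)
          mi = 0.0
          for (a, b), c in pab.items():
              pxy = c / n
              mi += pxy * math.log2(pxy / ((pa[a] / n) * (pb[b] / n)))
          return mi

      def partition(x, y, z):
          i_xz = mutual_information(list(zip(x, z)))
          i_yz = mutual_information(list(zip(y, z)))
          i_xyz = mutual_information(list(zip(zip(x, y), z)))
          redundancy = min(i_xz, i_yz)               # simple "MMI" redundancy convention
          return {"redundant": redundancy,
                  "unique_x": i_xz - redundancy,
                  "unique_y": i_yz - redundancy,
                  "synergistic": i_xyz - i_xz - i_yz + redundancy}

      # XOR target: individually X and Y tell nothing about Z; jointly they determine it.
      x = [0, 0, 1, 1] * 100
      y = [0, 1, 0, 1] * 100
      z = [a ^ b for a, b in zip(x, y)]
      print(partition(x, y, z))  # nearly all information is synergistic (~1 bit)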

  2. Non-Integrated Information and Communication Technologies in the Kidney Transplantation Process in Brazil.

    PubMed

    Peres Penteado, Alissa; Fábio Maciel, Rafael; Erbs, João; Feijó Ortolani, Cristina Lucia; Aguiar Roza, Bartira; Torres Pisa, Ivan

    2015-01-01

    The entire kidney transplantation process in Brazil is defined through laws, decrees, ordinances, and resolutions, but there is no defined theoretical map describing this process. From such a representation it is possible to perform analyses, such as identifying bottlenecks and the information and communication technologies (ICTs) that support the process. The aim of this study was to analyze and represent the kidney transplantation workflow using business process modeling notation (BPMN) and then to identify the ICTs involved in the process. This study was conducted in eight steps, including document analysis and professional evaluation. The results include the BPMN model of the kidney transplantation process in Brazil and the identification of ICTs. We discovered that there are great delays in the process because many different ICTs are involved, which can cause information to be poorly integrated.

  3. 75 FR 68702 - Regulation SHO

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-09

    ... extended compliance period will give industry participants additional time for programming and testing for... time for programming and testing for compliance with the Rule's requirements. We have been informed that there have been some delays in the programming process, due in part to certain information, which...

  4. 75 FR 57304 - Periodic Reporting Proposals

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-20

    .... Proposal Seven would introduce a mailflow-based model of mail processing costs for Standard Mail Parcels... and invites public comment. DATES: Comments are due October 8, 2010. FOR FURTHER INFORMATION CONTACT: Stephen L. Sharfman, General Counsel, [email protected] or 202-789-6820. SUPPLEMENTARY INFORMATION...

  5. The effects of solar incidence angle over digital processing of LANDSAT data

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Novo, E. M. L. M.

    1983-01-01

    A technique to extract the topography modulation component from digital data is described. The enhancement process is based on the fact that the pixel contains two types of information: (1) reflectance variation due to the target; (2) reflectance variation due to the topography. In order to enhance the signal variation due to topography, the technique recommends extracting from the original LANDSAT data the component resulting from target reflectance. Considering that the role of topographic modulation in the pixel information will vary with solar incidence angle, the results of this technique of digital processing will differ from one season to another, mainly in highly dissected topography. In this context, the effects of solar incidence angle on the topographic modulation technique were evaluated. Two sets of MSS/LANDSAT data, with solar elevation angles varying from 22 to 41 deg, were selected to implement the digital processing at the Image-100 System. A secondary watershed (Rio Bocaina) draining into Rio Paraiba do Sul (Sao Paulo State) was selected as a test site. The results showed that the technique used was more appropriate for MSS data acquired under higher Sun elevation angles. Topographic modulation applied to data acquired at low Sun elevation angles lessens rather than enhances topography.

  6. Rational Learning and Information Sampling: On the "Naivety" Assumption in Sampling Explanations of Judgment Biases

    ERIC Educational Resources Information Center

    Le Mens, Gael; Denrell, Jerker

    2011-01-01

    Recent research has argued that several well-known judgment biases may be due to biases in the available information sample rather than to biased information processing. Most of these sample-based explanations assume that decision makers are "naive": They are not aware of the biases in the available information sample and do not correct for them.…

  7. Enhanced intelligence through optimized TCPED concepts for airborne ISR

    NASA Astrophysics Data System (ADS)

    Spitzer, M.; Kappes, E.; Böker, D.

    2012-06-01

    Current multinational operations show an increased demand for high-quality actionable intelligence for different operational levels and users. In order to achieve sufficient availability, quality and reliability of information, various ISR assets are orchestrated within operational theatres. Especially airborne Intelligence, Surveillance and Reconnaissance (ISR) assets provide - due to their endurance, non-intrusiveness, robustness, wide spectrum of sensors and flexibility to mission changes - significant intelligence coverage of areas of interest. An efficient and balanced utilization of airborne ISR assets calls for advanced concepts for the entire ISR process framework including the Tasking, Collection, Processing, Exploitation and Dissemination (TCPED). Beyond this, the employment of current visualization concepts, shared information bases and information customer profiles, as well as an adequate combination of ISR sensors with different information age and dynamic (online) retasking process elements enables the optimization of interlinked TCPED processes towards higher process robustness, shorter process duration, more flexibility between ISR missions and, finally, adequate "entry points" for information requirements by operational users and commands. In addition, relevant trade-offs of distributed and dynamic TCPED processes are examined and future trends are depicted.

  8. Validating archetypes for the Multiple Sclerosis Functional Composite.

    PubMed

    Braun, Michael; Brandt, Alexander Ulrich; Schulz, Stefan; Boeker, Martin

    2014-08-03

    Numerous information models for electronic health records, such as openEHR archetypes, are available. The quality of such clinical models is important to guarantee standardised semantics and to facilitate their interoperability. However, validation aspects are not regarded sufficiently yet. The objective of this report is to investigate the feasibility of archetype development and its community-based validation process, presuming that this review process is a practical way to ensure high-quality information models amending the formal reference model definitions. A standard archetype development approach was applied to a case set of three clinical tests for multiple sclerosis assessment: After an analysis of the tests, the obtained data elements were organised and structured. The appropriate archetype class was selected and the data elements were implemented in an iterative refinement process. Clinical and information modelling experts validated the models in a structured review process. Four new archetypes were developed and publicly deployed in the openEHR Clinical Knowledge Manager, an online platform provided by the openEHR Foundation. Afterwards, these four archetypes were validated by domain experts in a team review. The review was a formalised process, organised in the Clinical Knowledge Manager. Both the development and the review process turned out to be time-consuming tasks, mostly due to difficult selection processes between alternative modelling approaches. The archetype review was a straightforward team process with the goal of validating archetypes pragmatically. The quality of medical information models is crucial to guarantee standardised semantic representation in order to improve interoperability. The validation process is a practical way to better harmonise models that diverge due to necessary flexibility left open by the underlying formal reference model definitions. This case study provides evidence that both community- and tool-enabled review processes, structured in the Clinical Knowledge Manager, ensure archetype quality. It offers a pragmatic but feasible way to reduce variation in the representation of clinical information models towards a more unified and interoperable model.

  9. Validating archetypes for the Multiple Sclerosis Functional Composite

    PubMed Central

    2014-01-01

    Background Numerous information models for electronic health records, such as openEHR archetypes, are available. The quality of such clinical models is important to guarantee standardised semantics and to facilitate their interoperability. However, validation aspects are not regarded sufficiently yet. The objective of this report is to investigate the feasibility of archetype development and its community-based validation process, presuming that this review process is a practical way to ensure high-quality information models amending the formal reference model definitions. Methods A standard archetype development approach was applied to a case set of three clinical tests for multiple sclerosis assessment: After an analysis of the tests, the obtained data elements were organised and structured. The appropriate archetype class was selected and the data elements were implemented in an iterative refinement process. Clinical and information modelling experts validated the models in a structured review process. Results Four new archetypes were developed and publicly deployed in the openEHR Clinical Knowledge Manager, an online platform provided by the openEHR Foundation. Afterwards, these four archetypes were validated by domain experts in a team review. The review was a formalised process, organised in the Clinical Knowledge Manager. Both the development and the review process turned out to be time-consuming tasks, mostly due to difficult selection processes between alternative modelling approaches. The archetype review was a straightforward team process with the goal of validating archetypes pragmatically. Conclusions The quality of medical information models is crucial to guarantee standardised semantic representation in order to improve interoperability. The validation process is a practical way to better harmonise models that diverge due to necessary flexibility left open by the underlying formal reference model definitions. This case study provides evidence that both community- and tool-enabled review processes, structured in the Clinical Knowledge Manager, ensure archetype quality. It offers a pragmatic but feasible way to reduce variation in the representation of clinical information models towards a more unified and interoperable model. PMID:25087081

  10. Methods of Labor Economy Increasing in Educational Organization

    ERIC Educational Resources Information Center

    Dorozhkin, Evgenij M.; Krotov, Yakov E.; Tkacheva, Oksana N.; Kruchkov, Konstantin V.; Korotaev, Ivan S.

    2016-01-01

    The urgency of the problem under investigation is due to the increasing demand for information technology infrastructure development under the current conditions in which educational institutions operate, including the formation of the information-educational environment. The article offers an organizational and economic model of constructing processes for…

  11. Calibration of skill and judgment in driving: development of a conceptual framework and the implications for road safety.

    PubMed

    Horrey, William J; Lesch, Mary F; Mitsopoulos-Rubens, Eve; Lee, John D

    2015-03-01

    Humans often make inflated or erroneous estimates of their own ability or performance. Such errors in calibration can be due to incomplete processing, neglect of available information, or improper weighting or integration of the information, and can impact our decision-making, risk tolerance, and behaviors. In the driving context, these outcomes can have important implications for safety. The current paper discusses the notion of calibration in the context of self-appraisals and self-competence as well as in models of self-regulation in driving. We further develop a conceptual framework for calibration in the driving context, borrowing from earlier models of momentary demand regulation, information processing, and lens models for information selection and utilization. Finally, using the model, we describe the implications of calibration (or, more specifically, errors in calibration) for our understanding of driver distraction, in-vehicle automation and autonomous vehicles, and the training of novice and inexperienced drivers. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. Advocating for a Population-Specific Health Literacy for People With Visual Impairments.

    PubMed

    Harrison, Tracie; Lazard, Allison

    2015-01-01

    Health literacy, the ability to access, process, and understand health information, is enhanced by the visual senses among people who are typically sighted. Emotions, meaning, speed of knowledge transfer, level of attention, and degree of relevance are all manipulated by the visual design of health information when people can see. When consumers of health information are blind or visually impaired, they access, process, and understand their health information through a multitude of methods, using a variety of accommodations depending upon the severity and type of their impairment. They are taught, or they learn how, to accommodate their differences by using alternative sensory experiences and interpretations. In this article, we argue that due to the unique and powerful aspects of visual learning and due to the differences in knowledge creation when people are not visually oriented, health literacy must be considered a unique construct for people with visual impairment, which requires a distinctive theoretical basis for determining the impact of their mind-constructed representations of health.

  13. [Postdonation information: the French fourth hemovigilance sub-process].

    PubMed

    Py, J-Y; Sandid, I; Jbilou, S; Dupuis, M; Adda, R; Narbey, D; Djoudi, R

    2014-11-01

    Postdonation information is knowledge about the donor or his donation that emerges after the donation and that challenges the quality or safety of the blood products stemming from this or other donations. Classical hemovigilance sub-processes concerning donor or recipient adverse events do not cover this topic. France is about to make it official as a fourth sub-process. A less formal management of postdonation information has already been in place for more than ten years. French data for the year 2013 are presented, including the regional notification level and the national reporting level. A significant level of heterogeneity is observed, as for other hemovigilance sub-processes. It is mainly due to subjective rather than objective differences in risk appreciation. Consensus work on this topic is expected in the future. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  14. Energy coding in biological neural networks

    PubMed Central

    Zhang, Zhikang

    2007-01-01

    Building on the experimental finding that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, we present a new scientific theory that offers a unique mechanism for brain information processing. We demonstrate that the neural coding produced by the activity of the brain is well described by our theory of energy coding. Because the energy coding model can reveal mechanisms of brain information processing based upon known biophysical properties, we can not only reproduce various experimental results of neuro-electrophysiology, but also quantitatively explain the recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, we estimate that the theory has very important consequences for quantitative research of cognitive function. PMID:19003513

  15. Holistic processing, contact, and the other-race effect in face recognition.

    PubMed

    Zhao, Mintao; Hayward, William G; Bülthoff, Isabelle

    2014-12-01

    Face recognition, holistic processing, and processing of configural and featural facial information are known to be influenced by face race, with better performance for own- than other-race faces. However, whether these various other-race effects (OREs) arise from the same underlying mechanisms or from different processes remains unclear. The present study addressed this question by measuring the OREs in a set of face recognition tasks, and testing whether these OREs are correlated with each other. Participants performed different tasks probing (1) face recognition, (2) holistic processing, (3) processing of configural information, and (4) processing of featural information for both own- and other-race faces. Their contact with other-race people was also assessed with a questionnaire. The results show significant OREs in tasks testing face memory and processing of configural information, but not in tasks testing either holistic processing or processing of featural information. Importantly, there was no cross-task correlation between any of the measured OREs. Moreover, the level of other-race contact predicted only the OREs obtained in tasks testing face memory and processing of configural information. These results indicate that these various cross-race differences originate from different aspects of face processing, contrary to the view that the ORE in face recognition is due to cross-race differences in terms of holistic processing. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. How multiple social networks affect user awareness: The information diffusion process in multiplex networks

    NASA Astrophysics Data System (ADS)

    Li, Weihua; Tang, Shaoting; Fang, Wenyi; Guo, Quantong; Zhang, Xiao; Zheng, Zhiming

    2015-10-01

    The information diffusion process in single complex networks has been extensively studied, especially for modeling the spreading activities in online social networks. However, individuals usually use multiple social networks at the same time, and can share the information they have learned from one social network to another. This phenomenon gives rise to a new diffusion process on multiplex networks with more than one network layer. In this paper we account for this multiplex network spreading by proposing a model of information diffusion in two-layer multiplex networks. We develop a theoretical framework using bond percolation and cascading failure to describe the intralayer and interlayer diffusion. This allows us to obtain analytical solutions for the fraction of informed individuals as a function of transmissibility T and the interlayer transmission rate θ . Simulation results show that interaction between layers can greatly enhance the information diffusion process. And explosive diffusion can occur even if the transmissibility of the focal layer is under the critical threshold, due to interlayer transmission.
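
    A rough Monte Carlo sketch of the two-layer setting described above is given below: information spreads within each layer with transmissibility T and jumps between a node's two layer copies with probability theta. The random graphs, sizes, and update rule are illustrative assumptions and do not reproduce the paper's bond-percolation and cascading-failure analysis; the qualitative point is that a nonzero theta can sustain diffusion even when T is below the single-layer threshold.

      import random

      def random_graph(n, k, rng):
          """Crude random graph: each node gets at least k random neighbours (undirected)."""
          adj = {i: set() for i in range(n)}
          for i in range(n):
              while len(adj[i]) < k:
                  j = rng.randrange(n)
                  if j != i:
                      adj[i].add(j)
                      adj[j].add(i)
          return adj

      def spread(n=2000, k=4, T=0.15, theta=0.5, runs=20, seed=7):
          rng = random.Random(seed)
          sizes = []
          for _ in range(runs):
              layers = [random_graph(n, k, rng), random_graph(n, k, rng)]
              informed = [set(), set()]        # informed node sets, one per layer
              informed[0].add(0)               # seed node in layer 0
              frontier = [(0, 0)]              # (layer, node) pairs still to process
              while frontier:
                  layer, u = frontier.pop()
                  for v in layers[layer][u]:   # intralayer spreading with probability T
                      if v not in informed[layer] and rng.random() < T:
                          informed[layer].add(v)
                          frontier.append((layer, v))
                  other = 1 - layer            # interlayer spreading with probability theta
                  if u not in informed[other] and rng.random() < theta:
                      informed[other].add(u)
                      frontier.append((other, u))
              sizes.append(len(informed[0] | informed[1]) / n)
          return sum(sizes) / runs

      print("theta=0.0:", spread(theta=0.0))   # single-layer spread only
      print("theta=0.8:", spread(theta=0.8))   # interlayer coupling enhances diffusion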

  17. First and Second Graders Writing Informational Text

    ERIC Educational Resources Information Center

    Read, Sylvia

    2005-01-01

    Process approaches to writing instruction in primary-grade classrooms have become widespread due to the influence of Graves (1983), Calkins (1986), Avery (1993), and others. Their work emphasizes expressive writing, particularly personal narrative, more than expository or informational writing. As a consequence, expressive writing is what children…

  18. Comparative Study on Interaction of Form and Motion Processing Streams by Applying Two Different Classifiers in Mechanism for Recognition of Biological Movement

    PubMed Central

    2014-01-01

    Research in psychophysics, neurophysiology, and functional imaging shows a particular representation of biological movements that involves two pathways. The visual perception of biological movements is formed through the visual system's dorsal and ventral processing streams. The ventral processing stream is associated with the extraction of form information; the dorsal processing stream, on the other hand, provides motion information. The active basic model (ABM), a hierarchical representation of the human form, introduced novelty into the form pathway by applying a Gabor-based supervised object recognition method, which adds biological plausibility while preserving similarity to the original model. A fuzzy inference system is used for motion pattern information in the motion pathway, making the recognition process more robust. The interaction of these pathways is intriguing, and many studies in various fields have considered it. Here, the interaction of the pathways is investigated to obtain more appropriate results. An extreme learning machine (ELM) has been employed for the classification unit of this model, because it retains the main properties of artificial neural networks while substantially reducing the difficulty of long training times. Two different configurations of the pathway interaction, one using a synergetic neural network and one using ELM, are compared in terms of accuracy and compatibility. PMID:25276860
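
    For reference, a minimal extreme learning machine can be sketched as below: a single hidden layer with random, untrained input weights and output weights obtained in closed form by least squares. This is a generic ELM, not the paper's configuration; the Gabor-based form pathway and fuzzy motion pathway that feed the classifier are not reproduced.

      import numpy as np

      class ELM:
          def __init__(self, n_hidden=50, seed=0):
              self.n_hidden = n_hidden
              self.rng = np.random.default_rng(seed)

          def fit(self, X, y):
              n_features = X.shape[1]
              self.W = self.rng.normal(size=(n_features, self.n_hidden))
              self.b = self.rng.normal(size=self.n_hidden)
              H = np.tanh(X @ self.W + self.b)        # random hidden-layer features
              Y = np.eye(len(np.unique(y)))[y]        # one-hot targets
              self.beta = np.linalg.pinv(H) @ Y       # closed-form output weights
              return self

          def predict(self, X):
              H = np.tanh(X @ self.W + self.b)
              return np.argmax(H @ self.beta, axis=1)

      # Toy two-class problem: points inside vs. outside a circle.
      rng = np.random.default_rng(1)
      X = rng.uniform(-1, 1, size=(400, 2))
      y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 0.5).astype(int)
      model = ELM(n_hidden=100).fit(X[:300], y[:300])
      print("test accuracy:", (model.predict(X[300:]) == y[300:]).mean())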

  19. An Informative Interpretation of Decision Theory: The Information Theoretic Basis for Signal-to-Noise Ratio and Log Likelihood Ratio

    DOE PAGES

    Polcari, J.

    2013-08-16

    The signal processing concept of signal-to-noise ratio (SNR), in its role as a performance measure, is recast within the more general context of information theory, leading to a series of useful insights. Establishing generalized SNR (GSNR) as a rigorous information theoretic measure inherent in any set of observations significantly strengthens its quantitative performance pedigree while simultaneously providing a specific definition under general conditions. This directly leads to consideration of the log likelihood ratio (LLR): first, as the simplest possible information-preserving transformation (i.e., signal processing algorithm) and subsequently, as an absolute, comparable measure of information for any specific observation exemplar. Furthermore, the information accounting methodology that results permits practical use of both GSNR and LLR as diagnostic scalar performance measurements, directly comparable across alternative system/algorithm designs, applicable at any tap point within any processing string, in a form that is also comparable with the inherent performance bounds due to information conservation.
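
    For the textbook case of a known signal in additive white Gaussian noise (an assumption chosen only to illustrate the LLR-SNR connection; the report's generalized SNR is defined under much broader conditions):

      \[
        \Lambda(\mathbf{x}) = \ln\frac{p(\mathbf{x}\mid H_1)}{p(\mathbf{x}\mid H_0)}
                            = \frac{\mathbf{s}^{\mathsf T}\mathbf{x}}{\sigma^{2}}
                              - \frac{\lVert\mathbf{s}\rVert^{2}}{2\sigma^{2}},
        \qquad
        H_0:\ \mathbf{x}=\mathbf{n}, \quad
        H_1:\ \mathbf{x}=\mathbf{s}+\mathbf{n}, \quad
        \mathbf{n}\sim\mathcal{N}(0,\sigma^{2}I),
      \]
      \[
        \operatorname{E}[\Lambda\mid H_1]-\operatorname{E}[\Lambda\mid H_0]
        = \frac{\lVert\mathbf{s}\rVert^{2}}{\sigma^{2}} = \mathrm{SNR}.
      \]

    In this setting the LLR is a sufficient statistic, so it preserves all decision-relevant information, and the separation of its means under the two hypotheses reproduces the familiar matched-filter SNR.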

  20. 5 CFR 1303.10 - Access to information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... request is denied, the written notification to the person making the request shall include the names of... engaged in disseminating information; (iii) The loss of substantial due process rights; or (iv) A matter... to grant it and will notify the requester of the decision. If a request for expedited treatment is...

  1. 78 FR 75580 - Information Collection Request; Submission for OMB Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-12

    ... information collection request to the Office of Management and Budget (OMB) for review and approval. The... individuals, including technical and language skills, and availability for Peace Corps service. The Peace... skills and interests. Due to this change in the way applicants are processed and an overall agency effort...

  2. Information Competency and Creative Initiative of Personality and Their Manifestation in Activity

    ERIC Educational Resources Information Center

    Tabachuk, Natalia P.; Ledovskikh, Irina A.; Shulika, Nadezhda A.; Karpova, Irina V.; Kazinets, Victor A.; Polichka, Anatolii E.

    2018-01-01

    The relevance of the research stems from global trends in the development of the information society, associated with the rapid advancement of civilization (IT penetration, increased computer availability, variability), and from innovation processes in the sphere of education (competency-based approach, humanization and humanitarization). These…

  3. Microscopic information processing and communication in crowd dynamics

    NASA Astrophysics Data System (ADS)

    Henein, Colin Marc; White, Tony

    2010-11-01

    Due, perhaps, to the historical division of crowd dynamics research into psychological and engineering approaches, microscopic crowd models have tended toward modelling simple interchangeable particles with an emphasis on the simulation of physical factors. Despite the fact that people have complex (non-panic) behaviours in crowd disasters, important human factors in crowd dynamics such as information discovery and processing, changing goals and communication have not yet been well integrated at the microscopic level. We use our Microscopic Human Factors methodology to fuse a microscopic simulation of these human factors with a popular microscopic crowd model. By tightly integrating human factors with the existing model we can study the effects on the physical domain (movement, force and crowd safety) when human behaviour (information processing and communication) is introduced. In a large-room egress scenario with ample exits, information discovery and processing yields a crowd of non-interchangeable individuals who, despite close proximity, have different goals due to their different beliefs. This crowd heterogeneity leads to complex inter-particle interactions such as jamming transitions in open space; at high crowd energies, we found a freezing by heating effect (reminiscent of the disaster at Central Lenin Stadium in 1982) in which a barrier formation of naïve individuals trying to reach blocked exits prevented knowledgeable ones from exiting. Communication, when introduced, reduced this barrier formation, increasing both exit rates and crowd safety.

  4. Understanding the Budget Battle.

    ERIC Educational Resources Information Center

    Hritz, Townley

    1996-01-01

    Describes Head Start's financial uncertainty for the future due to the government's budget battle. Presents information on the key points in the budget process, how that process got off track in fiscal year 1996, the resulting government shutdowns, and how Head Start can prepare for the 1997 budget debates. (MOK)

  5. Instructional Screencast: A Research Conceptual Framework

    ERIC Educational Resources Information Center

    Abdul Razak, Muhammad Razuan; Mohamad Ali, Ahmad Zamzuri

    2016-01-01

    The literature review indicates that the benefit of screencasts as an instructional medium has not clearly been proven effective for all categories of students. This is due to individual differences in processing information. An inadequate screencast design will place strain on students' cognitive processes, which might impede learning. This…

  6. Strategic Positioning of the Web in a Multi-Channel Market Approach.

    ERIC Educational Resources Information Center

    Simons, Luuk P. A.; Steinfield, Charles; Bouwman, Harry

    2002-01-01

    Discusses channel economics in retail activities and trends toward unbundling due to the emergence of the Web channel. Highlights include sales processes and physical distribution processes; transaction costs; hybrid electronic commerce strategies; channel management and customer support; information economics, thing economics, and service…

  7. Information Diffusion in Facebook-Like Social Networks Under Information Overload

    NASA Astrophysics Data System (ADS)

    Li, Pei; Xing, Kai; Wang, Dapeng; Zhang, Xin; Wang, Hui

    2013-07-01

    Research on social networks has received remarkable attention, since many people use social networks to broadcast information and stay connected with their friends. However, due to the information overload in social networks, it becomes increasingly difficult for users to find useful information. This paper takes Facebook-like social networks into account, and models the process of information diffusion under information overload. The term view scope is introduced to model the user information-processing capability under information overload, and the average number of times a message appears in view scopes after it is generated is proposed to characterize the information diffusion efficiency. Through theoretical analysis, we find that factors such as network structure and view scope number have no impact on the information diffusion efficiency, which is a surprising result. To verify the results, we conduct simulations and provide the simulation results, which agree perfectly with the theoretical analysis.
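
    The toy simulation below illustrates the view scope notion from the abstract: every user sees only the most recent L messages in their feed, and a message's reach is the number of view scopes it still appears in. Posting rates, feed ordering, and graph construction are illustrative assumptions, not the authors' model, but the contrast between light and heavy load shows the overload effect.

      import random

      def view_scope_appearances(n_users=200, n_friends=10, L=5, n_posts=50,
                                 runs=200, seed=3):
          rng = random.Random(seed)
          totals = 0
          for _ in range(runs):
              # friends[u] = the users whose feeds receive u's posts
              friends = {u: rng.sample([v for v in range(n_users) if v != u], n_friends)
                         for u in range(n_users)}
              feeds = {u: [] for u in range(n_users)}   # newest message first
              tracked = (0, 0)                          # (author, post id) we follow
              posts = [(rng.randrange(n_users), i) for i in range(n_posts)]
              posts[0] = tracked                        # tracked post is published first
              for author, pid in posts:
                  for f in friends[author]:
                      feeds[f].insert(0, (author, pid))
              # the message "appears in a view scope" if it is within the top-L of a feed
              totals += sum(1 for msgs in feeds.values() if tracked in msgs[:L])
          return totals / runs

      print("light load :", view_scope_appearances(n_posts=20))    # message still visible
      print("heavy load :", view_scope_appearances(n_posts=500))   # pushed out of view scopes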

  8. 21 CFR 60.2 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... eligibility and regulatory review period length, and (2) Ensure that parties interested in due diligence challenges will have an opportunity to participate in that process, including informal hearings. (b) The...

  9. Comparison of ship dismantling processes in India and the U.S.

    NASA Astrophysics Data System (ADS)

    Ahluwalia, Rashpal S.; Sibal, Pooja; Govindarajulu, Sriram

    2004-03-01

    This paper compares ship-dismantling processes in India and the U.S. The information for India was collected during an informal visit to the ship dismantling sites in Alang, India. The information for the U.S. was obtained from the MARAD report. For a 10,000-ton passenger ship, the Indian contractor makes a profit of about 24%, compared to a loss of about 15% in the U.S. The loss in the U.S. is primarily due to high labor costs, compliance with safety and health regulations, and the lack of a market for used components and scrap metal.

  10. Stochastic and information-thermodynamic structures of population dynamics in a fluctuating environment

    NASA Astrophysics Data System (ADS)

    Kobayashi, Tetsuya J.; Sughiyama, Yuki

    2017-07-01

    Adaptation in a fluctuating environment is a process of fueling environmental information to gain fitness. Living systems have gradually developed strategies for adaptation from random and passive diversification of the phenotype to more proactive decision making, in which environmental information is sensed and exploited more actively and effectively. Understanding the fundamental relation between fitness and information is therefore crucial to clarify the limits and universal properties of adaptation. In this work, we elucidate the underlying stochastic and information-thermodynamic structure in this process, by deriving causal fluctuation relations (FRs) of fitness and information. Combined with a duality between phenotypic and environmental dynamics, the FRs reveal the limit of fitness gain, the relation of time reversibility with the achievability of the limit, and the possibility and condition for gaining excess fitness due to environmental fluctuation. The loss of fitness due to causal constraints and the limited capacity of real organisms is shown to be the difference between time-forward and time-backward path probabilities of phenotypic and environmental dynamics. Furthermore, the FRs generalize the concept of the evolutionary stable state (ESS) for fluctuating environment by giving the probability that the optimal strategy on average can be invaded by a suboptimal one owing to rare environmental fluctuation. These results clarify the information-thermodynamic structures in adaptation and evolution.
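
    Schematically, and only to indicate the kind of relation the abstract describes (the paper's precise path measures and definitions are not reproduced here), a causal fluctuation relation of this type takes an integral form such as

      \[
        \sigma[\mathcal{X},\mathcal{Y}]
          = \ln\frac{\mathbb{P}_{\mathrm{F}}[\mathcal{X},\mathcal{Y}]}
                    {\mathbb{P}_{\mathrm{B}}[\mathcal{X},\mathcal{Y}]},
        \qquad
        \left\langle e^{-\sigma}\right\rangle_{\mathbb{P}_{\mathrm{F}}} = 1
        \;\Longrightarrow\;
        \left\langle \sigma \right\rangle_{\mathbb{P}_{\mathrm{F}}} \ge 0,
      \]

    where P_F and P_B denote time-forward and time-backward path probabilities of the joint phenotypic and environmental trajectory; Jensen's inequality then bounds the average loss term sigma from below by zero, with equality tied to time reversibility.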

  11. 36 CFR 404.4 - Access to information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... to the person making the request shall include the names of the individuals who participated in the...; (iii) The loss of substantial due process rights; or (iv) A matter of widespread and exceptional media... making the request if the request cannot be processed within the time limit specified in paragraph (f) of...

  12. 17 CFR 240.15d-15 - Controls and procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... control framework that is established by a body or group that has followed due-process procedures..., without limitation, controls and procedures designed to ensure that information required to be disclosed... disclosure. (f) The term internal control over financial reporting is defined as a process designed by, or...

  13. 17 CFR 240.15d-15 - Controls and procedures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... control framework that is established by a body or group that has followed due-process procedures..., without limitation, controls and procedures designed to ensure that information required to be disclosed... disclosure. (f) The term internal control over financial reporting is defined as a process designed by, or...

  14. Left Lateralized Enhancement of Orofacial Somatosensory Processing Due to Speech Sounds

    ERIC Educational Resources Information Center

    Ito, Takayuki; Johns, Alexis R.; Ostry, David J.

    2013-01-01

    Purpose: Somatosensory information associated with speech articulatory movements affects the perception of speech sounds and vice versa, suggesting an intimate linkage between speech production and perception systems. However, it is unclear which cortical processes are involved in the interaction between speech sounds and orofacial somatosensory…

  15. The Impartial Hearing Officer: A Procedural Safeguards Training Manual for Utah.

    ERIC Educational Resources Information Center

    Utah State Univ., Salt Lake City. Intermountain Plains Regional Resource Center.

    The manual for due process hearing officers in Utah provides information on prehearing activities, hearing activities, decision making processes, and final reporting on hearings regarding conflicts or disagreements between the parents and the school district concerning the most appropriate educational program for the handicapped child. An…

  16. Modelling information dissemination under privacy concerns in social media

    NASA Astrophysics Data System (ADS)

    Zhu, Hui; Huang, Cheng; Lu, Rongxing; Li, Hui

    2016-05-01

    Social media has recently become an important platform for users to share news, express views, and post messages. However, to preserve user privacy in social media, many privacy setting tools are employed, which inevitably change the patterns and dynamics of information dissemination. In this study, a general stochastic model using dynamic evolution equations was introduced to illustrate how privacy concerns impact the process of information dissemination. Extensive simulations and analyses involving the privacy settings of general users, privileged users, and pure observers were conducted on real-world networks, and the results demonstrated that user privacy settings affect information dissemination differently. Finally, we also studied the process of information diffusion analytically and numerically with different privacy settings using two classic networks.
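
    As a mean-field illustration only (these are not the paper's dynamic evolution equations), the sketch below runs an SIR-style spreading model in which a fraction q of users enable privacy settings that hide their reposts, lowering the effective spreading rate and hence the final reach.

      def simulate(beta=0.4, mu=0.2, q=0.5, steps=500, dt=0.1):
          s, i, r = 0.999, 0.001, 0.0      # unaware, spreading, no longer sharing
          for _ in range(steps):
              beta_eff = beta * (1.0 - q)  # privacy settings suppress visibility
              ds = -beta_eff * s * i
              di = beta_eff * s * i - mu * i
              dr = mu * i
              s, i, r = s + ds * dt, i + di * dt, r + dr * dt
          return r                          # final fraction that was ever informed

      for q in (0.0, 0.3, 0.6, 0.9):
          print(f"privacy-enabled fraction q={q}: final reach ~ {simulate(q=q):.3f}")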

  17. Web-GIS platform for green infrastructure in Bucharest, Romania

    NASA Astrophysics Data System (ADS)

    Sercaianu, Mihai; Petrescu, Florian; Aldea, Mihaela; Oana, Luca; Rotaru, George

    2015-06-01

    In the last decade, reducing urban pollution and improving the quality of public spaces have become increasingly important issues for public administration authorities in Romania. The paper describes the development of a web-GIS solution dedicated to monitoring the green infrastructure in Bucharest, Romania. The system allows urban residents (citizens) to collect and directly report relevant information regarding the current status of the city's green infrastructure. Consequently, the citizens become an active component of the decision-support process within the public administration. Besides the usual technical characteristics of such geo-information processing systems, the complex legal and organizational problems that arise in collecting information directly from citizens required additional analysis concerning, for example, local government involvement, environmental protection agency regulations, or public entity requirements. Designing and implementing the whole information exchange process, based on the active interaction between the citizens and public administration bodies, required the use of the "citizen-sensor" concept deployed with GIS tools. The information collected and reported from the field is related to many factors, which are not always limited to the city level, making it possible to consider the green infrastructure as a whole. The "citizen-request" web-GIS solution for green infrastructure monitoring handles very diverse urban information, because the green infrastructure itself is conditioned by many urban elements, such as urban infrastructures, infrastructure works, and construction density.

  18. Path Analysis: Health Promotion Information Access of Parent Caretaking Pattern through Parenting Education

    ERIC Educational Resources Information Center

    Sunarsih, Tri; Murti, Bhisma; Anantanyu, Sapja; Wijaya, Mahendra

    2016-01-01

    Parents often inhibit the learning process organized by educators due to their lack of knowledge about how to educate a child well. Incapability of dealing with those changes leads to dysfunctional families and problematic children. This research aimed to analyze the health promotion information access pattern of parent caretaking pattern through parenting…

  19. 76 FR 7145 - Notice of Public Information Collection Requirements Submitted to OMB for Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-09

    ... involved parties due to their familiarity with the certification process and salary costs and benefits... documents other than rules or proposed rules that are applicable to the public. Notices of hearings... Public Information Collection Requirements Submitted to OMB for Review SUMMARY: This Federal Register...

  20. Software Design Document MCC CSCI (1). Volume 1 Sections 1.0-2.18

    DTIC Science & Technology

    1991-06-01

    AssociationUserProtocol (/simnet/common/include/protocol/p_assoc.h). Primitive: long, Standard C type... Information. 2.2.1.4.2 ProcessMessage. ProcessMessage processes a message from another process; type describes the message as either one-way, synchronous or... Macintosh Consoles. This is sometimes necessary due to normal clock skew so that operations among the MCC components will remain synchronized. This...

  1. Idle waves in high-performance computing

    NASA Astrophysics Data System (ADS)

    Markidis, Stefano; Vencels, Juris; Peng, Ivy Bo; Akhmetova, Dana; Laure, Erwin; Henri, Pierre

    2015-01-01

    The vast majority of parallel scientific applications distributes computation among processes that are in a busy state when computing and in an idle state when waiting for information from other processes. We identify the propagation of idle waves through the processes of scientific applications that rely on local information exchange between neighboring processes. Idle waves are nondispersive and have a phase velocity inversely proportional to the average busy time. The physical mechanism enabling the propagation of idle waves is the local synchronization between two processes due to remote data dependency. This study provides a description of the large number of processes in parallel scientific applications as a continuous medium. This work is also a step towards an understanding of how localized idle periods can affect remote processes, leading to the degradation of global performance in parallel scientific applications.
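
    As a hedged illustration of the mechanism described above (not the authors' implementation), the sketch below models a 1-D chain of processes that alternate a fixed busy period with a blocking neighbor exchange; a one-off delay injected at rank 0 then travels along the chain at roughly one rank per busy period, i.e. with a phase velocity inversely proportional to the busy time. All parameter names and values are assumptions.

        # Toy model of idle-wave propagation: each rank computes for `busy` time units,
        # then blocks until its neighbors have reached the same point in the schedule.
        import numpy as np

        def simulate_idle_wave(n_ranks=32, n_steps=60, busy=1.0, delay=5.0):
            t = np.zeros(n_ranks)                 # local clock of each rank
            idle = np.zeros((n_steps, n_ranks))   # idle time accumulated per step
            for step in range(n_steps):
                t += busy
                if step == 0:
                    t[0] += delay                 # perturbation that starts the wave
                new_t = t.copy()
                for r in range(n_ranks):          # blocking exchange with neighbors
                    nbrs = [t[r]]
                    if r > 0:
                        nbrs.append(t[r - 1])
                    if r < n_ranks - 1:
                        nbrs.append(t[r + 1])
                    new_t[r] = max(nbrs)          # wait for the slowest neighbor
                    idle[step, r] = new_t[r] - t[r]
                t = new_t
            return idle

        idle = simulate_idle_wave()
        # The rank with maximal idle time advances by ~1 rank per step: a travelling wave.
        print([int(np.argmax(row)) for row in idle[:10]])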

  2. Distributed Information System Development: Review of Some Management Issues

    NASA Astrophysics Data System (ADS)

    Mishra, Deepti; Mishra, Alok

    Due to the proliferation of the Internet and globalization, distributed information system development is becoming popular. In this paper we have reviewed some significant management issues like process management, project management, requirements management and knowledge management issues which have received much attention in distributed development perspective. In this literature review we found that areas like quality and risk management issues could get only scant attention in distributed information system development.

  3. Head-up displays: Effect of information location on the processing of superimposed symbology

    NASA Technical Reports Server (NTRS)

    Sanford, Beverly D.; Foyle, David C.; Mccann, Robert S.; Jordan, Kevin

    1993-01-01

    Head-up display (HUD) symbology superimposes vehicle status information onto the external terrain, providing simultaneous visual access to both sources of information. Relative to a baseline condition in which the superimposed altitude indicator was omitted, altitude maintenance was improved by the presence of the altitude indicator, and this improvement was of the same magnitude regardless of the position of the altitude indicator on the screen. However, a concurrent deficit in heading maintenance was observed only when the altitude indicator was proximal to the path information. These results did not support a model of the concurrent processing deficit based on an inability to attend to multiple locations in parallel. They are consistent with previous claims that the deficit is the product of attentional limits on subjects' ability to process two separate objects (HUD symbology and terrain information) concurrently. The absence of a performance tradeoff when the HUD and the path information were less proximal is attributed to a breaking of attentional tunneling on the HUD, possibly due to eye movements.

  4. Development of a pioneering clinical support system utilizing information technology.

    PubMed

    Hayashi, Doubun; Imai, Yasushi; Morita, Hiroyuki; Fujita, Hideo; Monzen, Koshiro; Harada, Tomohiro; Nojiri, Takefumi; Yamazaki, Tadashi; Yamazaki, Tsutomu; Nagai, Ryozo

    2004-03-01

    Nowadays, evidence-based medicine has entered the mainstream of clinical judgement and the human genome has been completely decoded. Even the concept of individually designed medicine, that is, tailor-made medicine, is now being discussed. Due to the complexity of clinical information, however, methods for managing it have yet to be established. We have conducted a study on a universal technique which, by employing information processing technology, enables one to select or produce clinical findings from the vast quantity of clinical information generated in day-to-day clinical practice, and to share such information and/or the results of analysis between two or more institutions. In this study, clinically useful findings have been successfully obtained by systematizing actual clinical information and genomic information collected and managed by an appropriate method, with due consideration of ethical issues. We report here these medical achievements as well as the technological ones which will play a role in propagating such medical achievements.

  5. 37 CFR 102.6 - Time limits and expedited processing.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... requests. Refusal to reasonably modify the scope of a request or arrange an alternate time frame may affect... imminent threat to the life or physical safety of an individual; (ii) The loss of substantial due process... the Government's integrity that affect public confidence; or (iv) An urgency to inform the public...

  6. New Methodology for Measuring Semantic Functional Similarity Based on Bidirectional Integration

    ERIC Educational Resources Information Center

    Jeong, Jong Cheol

    2013-01-01

    1.2 billion users on Facebook, 17 million articles in Wikipedia, and 190 million tweets per day have demanded a significant increase in information processing over the Internet in recent years. Similarly, life sciences and bioinformatics have also faced issues of processing Big Data due to the explosion of publicly available genomic information…

  7. Firm Size and Skill Formation Processes: An Emerging Debate

    ERIC Educational Resources Information Center

    Bishop, Daniel

    2012-01-01

    Recent research has established that small firms tend to develop and acquire the skills they need in different ways to those employed by larger organisations. More specifically, due to certain characteristics inherent to their small size, small firms generally display greater informality in their learning processes. As such, it is now broadly…

  8. Probing r-Process Production of Nuclei Beyond 209Bi with Gamma Rays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, Y.-Z.; Vogel, P.; Wasserburg, G. J.

    We estimate gamma-ray fluxes due to the decay of nuclei beyond 209Bi from a supernova or a supernova remnant assuming that the r-process occurs in supernovae. We find that a detector with a sensitivity of ~10^-7 γ cm^-2 s^-1 at energies from ~40 keV to ~3 MeV may detect fluxes due to the decay of 226Ra, 229Th, 241Am, 243Am, 249Cf, and 251Cf in the newly discovered supernova remnant near Vela. In addition, such a detector may detect fluxes due to the decay of 227Ac and 228Ra produced in a future supernova at a distance of ~1 kpc. Because nuclei with mass numbers A>209 are produced solely by the r-process, such detections are the best proof for a supernova r-process site. Further, they provide the most direct information on yields of progenitor nuclei with A>209 at r-process freeze-out. Finally, detection of fluxes due to the decay of r-process nuclei over a range of masses from a supernova or a supernova remnant provides the opportunity to compare yields in a single supernova event with the solar r-process abundance pattern. (c) 1999 The American Astronomical Society.
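
    The record reports detectability thresholds but not the underlying yields, so the following is only a back-of-the-envelope sketch of the standard point-source relation F = N * lambda * B / (4 * pi * d^2); the number of nuclei, branching ratio and distance below are placeholders, not values from the paper.

        # Hedged sketch: photon flux at Earth from the decay of a long-lived nucleus
        # treated as a point source.  All inputs in the example are placeholder values.
        import math

        SEC_PER_YR = 3.156e7
        KPC_IN_CM = 3.086e21

        def gamma_flux(n_nuclei, half_life_yr, branching, distance_kpc):
            decay_const = math.log(2) / (half_life_yr * SEC_PER_YR)   # s^-1
            d_cm = distance_kpc * KPC_IN_CM
            return n_nuclei * decay_const * branching / (4.0 * math.pi * d_cm ** 2)

        # Placeholder example: 1e48 nuclei with a 1600-yr half-life (226Ra-like),
        # 10% gamma branching, source at 0.25 kpc.
        print(f"{gamma_flux(1e48, 1600.0, 0.1, 0.25):.2e} photons cm^-2 s^-1")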

  9. [Development method of healthcare information system integration based on business collaboration model].

    PubMed

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is the key to hospital information construction due to the complexity of the healthcare environment. Currently, during the process of healthcare information system integration, the people participating in an integration project usually communicate via free-format documents, which impairs the efficiency and adaptability of integration. This paper proposes a method that utilizes business process model and notation (BPMN) to model integration requirements and automatically transform them into an executable integration configuration. Based on the method, a tool was developed to model integration requirements and transform them into an integration configuration. In addition, an integration case in a radiology scenario was used to verify the method.

  10. Hydrothermal reactions of agricultural and food processing wastes in sub- and supercritical water: a review of fundamentals, mechanisms, and state of research.

    PubMed

    Pavlovič, Irena; Knez, Željko; Škerget, Mojca

    2013-08-28

    Hydrothermal (HT) reactions of agricultural and food-processing waste have been proposed as an alternative to conventional waste treatment technologies because they offer several improvements in process performance as well as energy and economic advantages, especially their ability to process high-moisture biomass waste without prior dewatering. The complex structures of the wastes and the unique properties of water at higher temperatures and pressures enable a variety of physical-chemical reactions and a wide spectrum of products. This paper aims to give extensive information about the fundamentals and mechanisms of HT reactions and to review the state of research on HT conversion of agri-food waste.

  11. Recognizing Value of Educational Collaboration between High Schools and Universities Facilitated by Modern ICT

    ERIC Educational Resources Information Center

    Zielinski, K.; Czekierda, L.; Malawski, F.; Stras, R.; Zielinski, S.

    2017-01-01

    In this paper, we address the problem of an educational gap existing between high schools and universities: many students consider their choice of field of study as inappropriate, mostly due to insufficient information regarding the discipline and the university educational process. To solve this problem, we define an innovative, information and…

  12. 78 FR 70066 - 60-Day Notice of Proposed Information Collection: FHA Stakeholder Feedback for the New FHA Single...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... interested parties on the proposed collection of information. The purpose of this notice is to allow for 60 days of public comment. DATES: Comments Due Date: January 21, 2014. ADDRESSES: Interested persons are... simple, more directive language; aligning the flow of the handbook to the lender/mortgage process. Without...

  13. Effects of flow restoration on mussel growth in a Wild and Scenic North American River

    PubMed Central

    2013-01-01

    Background Freshwater mussels remain among the most imperiled species in North America due primarily to habitat loss or degradation. Understanding how mussels respond to habitat changes can improve conservation efforts. Mussels deposit rings in their shells, from which age and growth information can be read and thus used to evaluate how mussels respond to changes in habitat. However, discrepancies between methodological approaches to obtaining life history information from growth rings have led to considerable uncertainty regarding the life history characteristics of many mussel species. In this study we compared two processing methods, internal and external ring examination, to obtain age and growth information for two populations of mussels in the St. Croix River, MN, and evaluated how mussel growth responded to changes in the operation of a hydroelectric dam. Results External ring counts consistently underestimated internal ring counts by 4 years. Despite this difference, internal and external growth patterns were consistent. In 2000, the hydroelectric dam switched from operating on a peaking schedule to run-of-the-river/partial peaking. Growth patterns between an upstream and a downstream site of the dam were similar both before and after the change in operation. At the downstream site, however, older mussels had higher growth rates after the change in operation than same-sized mussels collected before the change. Conclusions Because growth patterns between internal and external processing methods were consistent, we suggest that external processing is an effective method to obtain growth information despite providing inaccurate age information. External processing is advantageous over internal processing due to its non-destructive nature. Applying this information to analyze the influence of the operation change in the hydroelectric dam, we suggest that changing to run-of-the-river/partial peaking operation has benefited the growth of older mussels below the dam. PMID:23452382

  14. Estimating the decomposition of predictive information in multivariate systems

    NASA Astrophysics Data System (ADS)

    Faes, Luca; Kugiumtzis, Dimitris; Nollo, Giandomenico; Jurysta, Fabrice; Marinazzo, Daniele

    2015-03-01

    In the study of complex systems from observed multivariate time series, the evolution of one system can be explained by the information storage of the system itself and by the information transfer from other interacting systems. We present a framework for the model-free estimation of information storage and information transfer computed as the terms composing the predictive information about the target of a multivariate dynamical process. The approach tackles the curse of dimensionality by employing a nonuniform embedding scheme that selects progressively, among the past components of the multivariate process, only those that contribute most, in terms of conditional mutual information, to the present target process. Moreover, it computes all information-theoretic quantities using a nearest-neighbor technique designed to compensate for the bias due to the different dimensionality of individual entropy terms. The resulting estimators of prediction entropy, storage entropy, transfer entropy, and partial transfer entropy are tested on simulations of coupled linear stochastic and nonlinear deterministic dynamic processes, demonstrating the superiority of the proposed approach over the traditional estimators based on uniform embedding. The framework is then applied to multivariate physiologic time series, resulting in physiologically well-interpretable information decompositions of cardiovascular and cardiorespiratory interactions during head-up tilt and of joint brain-heart dynamics during sleep.
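
    The estimators above rely on nonuniform embedding and nearest-neighbor entropy estimation, which are not reproduced here; the sketch below instead uses the much simpler linear-Gaussian approximation, in which information storage and transfer entropy reduce to log-ratios of residual variances from linear regressions. It only illustrates the decomposition, on assumed toy dynamics.

        # Linear-Gaussian sketch of the predictive-information decomposition.
        import numpy as np

        def _residual_var(y, regressors):
            X = np.column_stack(regressors)
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            return np.var(y - X @ beta)

        def storage_and_transfer(x, y, lag=1):
            """Information storage of y and transfer entropy x -> y (nats), Gaussian approx."""
            yt, yp, xp = y[lag:], y[:-lag], x[:-lag]
            ones = np.ones_like(yp)
            full_var = _residual_var(yt, [ones])            # ~ Var(y_t)
            self_var = _residual_var(yt, [ones, yp])        # conditioned on y's own past
            joint_var = _residual_var(yt, [ones, yp, xp])   # ... and on x's past
            storage = 0.5 * np.log(full_var / self_var)
            transfer = 0.5 * np.log(self_var / joint_var)
            return storage, transfer

        # Toy coupled AR(1) system: x drives y, not vice versa.
        rng = np.random.default_rng(0)
        n = 20000
        x = np.zeros(n); y = np.zeros(n)
        for t in range(1, n):
            x[t] = 0.8 * x[t - 1] + rng.standard_normal()
            y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()
        print("storage(y), TE(x->y):", storage_and_transfer(x, y))
        print("storage(x), TE(y->x):", storage_and_transfer(y, x))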

  15. Inhibition of irrelevant information is not necessary to performance of expert chess players.

    PubMed

    Postal, Virginie

    2012-08-01

    Some studies on expertise have demonstrated that the difference between novices and experts can be partly due to a lack of knowledge about which information is relevant for a given situation. This lack of knowledge seems to be associated with the selection of correct information and with inhibitory processes. However, while the efficiency of inhibitory processes can lead to better performance in the normal population, it seems that experts in chess do not base their performance on this process but rather on an automatic and parallel encoding of information. Two experiments investigated the processes involved in a check detection task. The congruence of the information was manipulated in a Stroop situation similar to Reingold, Charness, Scheltetus, & Stampe (2001). The results showed that the experts did not benefit from cuing with a congruent cue and that they did not show any interference effect by the incongruent cue, contrary to less skilled chess players who benefited from cuing (Exp. 1). An attentional priming procedure confirmed the automatic encoding of chess relations in the more skilled chess players by showing no advantage from the prime in this group (Exp. 2). Taken together, the results indicate that the processing was serial for the less skilled chess players and that it was automatic and parallel for the more expert chess players. The inhibition of irrelevant information does not seem necessary to process information rapidly and efficiently.

  16. Future electro-optical sensors and processing in urban operations

    NASA Astrophysics Data System (ADS)

    Grönwall, Christina; Schwering, Piet B.; Rantakokko, Jouni; Benoist, Koen W.; Kemp, Rob A. W.; Steinvall, Ove; Letalick, Dietmar; Björkert, Stefan

    2013-10-01

    In the electro-optical sensors and processing in urban operations (ESUO) study we pave the way, for the European Defence Agency (EDA) group of Electro-Optics experts (IAP03), towards a common understanding of the optimal distribution of processing functions between the different platforms. Combinations of local, distributed and centralized processing are proposed. In this way one can match processing functionality to the required power, and available communication systems data rates, to obtain the desired reaction times. In the study, three priority scenarios were defined: camp protection, patrol and house search. For these scenarios, present-day and future sensors and signal processing technologies were studied. A method for analyzing information quality in single and multi-sensor systems has been applied. A method for estimating reaction times for transmission of data through the chain of command has been proposed and used. These methods are documented and can be used to modify scenarios, or be applied to other scenarios. Present-day data processing is organized mainly locally. Very limited exchange of information with other platforms is present; this is performed mainly at a high information level. The main issues that arose from the analysis of present-day systems and methodology are the slow reaction time due to the limited field of view of present-day sensors and the lack of robust automated processing. Efficient handover schemes between wide and narrow field of view sensors may, however, reduce the delay times. The main effort in the study was in forecasting the signal processing of EO sensors in the next ten to twenty years. Distributed processing is proposed between hand-held and vehicle-based sensors. This can be accompanied by cloud processing on board several vehicles. Additionally, to perform sensor fusion on sensor data originating from different platforms, and to make full use of UAV imagery, a combination of distributed and centralized processing is essential. There is a central role for sensor fusion of heterogeneous sensors in future processing. The changes that these new technologies will bring to future urban operations are improved quality of information, shorter reaction times, and lower operator load.

  17. Brain white matter structure and information processing speed in healthy older age.

    PubMed

    Kuznetsova, Ksenia A; Maniega, Susana Muñoz; Ritchie, Stuart J; Cox, Simon R; Storkey, Amos J; Starr, John M; Wardlaw, Joanna M; Deary, Ian J; Bastin, Mark E

    2016-07-01

    Cognitive decline, especially the slowing of information processing speed, is associated with normal ageing. This decline may be due to brain cortico-cortical disconnection caused by age-related white matter deterioration. We present results from a large, narrow age range cohort of generally healthy, community-dwelling subjects in their seventies who also had their cognitive ability tested in youth (age 11 years). We investigate associations between older age brain white matter structure, several measures of information processing speed and childhood cognitive ability in 581 subjects. Analysis of diffusion tensor MRI data using Tract-based Spatial Statistics (TBSS) showed that all measures of information processing speed, as well as a general speed factor composed from these tests (g speed), were significantly associated with fractional anisotropy (FA) across the white matter skeleton rather than in specific tracts. Cognitive ability measured at age 11 years was not associated with older age white matter FA, except for the g speed-independent components of several individual processing speed tests. These results indicate that quicker and more efficient information processing requires global connectivity in older age, and that associations between white matter FA and information processing speed (both individual test scores and g speed), unlike some other aspects of later life brain structure, are generally not accounted for by cognitive ability measured in youth.

  18. Particle sizing in rocket motor studies utilizing hologram image processing

    NASA Technical Reports Server (NTRS)

    Netzer, David; Powers, John

    1987-01-01

    A technique of obtaining particle size information from holograms of combustion products is described. The holograms are obtained with a pulsed ruby laser through windows in a combustion chamber. The reconstruction is done with a krypton laser with the real image being viewed through a microscope. The particle size information is measured with a Quantimet 720 image processing system which can discriminate various features and perform measurements of the portions of interest in the image. Various problems that arise in the technique are discussed, especially those that are a consequence of the speckle due to the diffuse illumination used in the recording process.

  19. Information processing in the vertebrate habenula.

    PubMed

    Fore, Stephanie; Palumbo, Fabrizio; Pelgrims, Robbrecht; Yaksi, Emre

    2018-06-01

    The habenula is a brain region that has gained increasing popularity over the recent years due to its role in processing value-related and experience-dependent information with a strong link to depression, addiction, sleep and social interactions. This small diencephalic nucleus is proposed to act as a multimodal hub or a switchboard, where inputs from different brain regions converge. These diverse inputs to the habenula carry information about the sensory world and the animal's internal state, such as reward expectation or mood. However, it is not clear how these diverse habenular inputs interact with each other and how such interactions contribute to the function of habenular circuits in regulating behavioral responses in various tasks and contexts. In this review, we aim to discuss how information processing in habenular circuits, can contribute to specific behavioral programs that are attributed to the habenula. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. The removal of information from working memory.

    PubMed

    Lewis-Peacock, Jarrod A; Kessler, Yoav; Oberauer, Klaus

    2018-05-09

    What happens to goal-relevant information in working memory after it is no longer needed? Here, we review evidence for a selective removal process that operates on outdated information to limit working memory load and hence facilitates the maintenance of goal-relevant information. Removal alters the representations of irrelevant content so as to reduce access to it, thereby improving access to the remaining relevant content and also facilitating the encoding of new information. Both behavioral and neural evidence support the existence of a removal process that is separate from forgetting due to decay or interference. We discuss the potential mechanisms involved in removal and characterize the time course and duration of the process. In doing so, we propose the existence of two forms of removal: one is temporary, and reversible, which modifies working memory content without impacting content-to-context bindings, and another is permanent, which unbinds the content from its context in working memory (without necessarily impacting long-term forgetting). Finally, we discuss limitations on removal and prescribe conditions for evaluating evidence for or against this process. © 2018 New York Academy of Sciences.

  1. Cyber-Informed Engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Robert S.; Benjamin, Jacob; Wright, Virginia L.

    A continuing challenge for engineers who utilize digital systems is to understand the impact of cyber-attacks across the entire product and program lifecycle. This is a challenge due to the evolving nature of cyber threats, which may impact the design, development, deployment, and operational phases of all systems. Cyber-Informed Engineering is the process by which engineers are made aware of how to use their engineering knowledge both to improve the cyber security of the processes by which they architect and design components and to improve the security of the components and services themselves.

  2. Quantum simulator review

    NASA Astrophysics Data System (ADS)

    Bednar, Earl; Drager, Steven L.

    2007-04-01

    Quantum information processing aims to provide revolutionary computing capability by harnessing the paradigm shift offered by quantum computing to solve classically hard and computationally challenging problems. Some of our computationally challenging problems of interest include rapid image processing, rapid optimization of logistics, protecting information, secure distributed simulation, and massively parallel computation. Currently, one important problem with quantum information processing is that the implementation of quantum computers is difficult to realize due to poor scalability and the high prevalence of errors. Therefore, we have supported the development of Quantum eXpress and QuIDD Pro, two quantum computer simulators running on classical computers for the development and testing of new quantum algorithms and processes. This paper examines the different methods used by these two quantum computing simulators. It reviews both simulators, highlighting each simulator's background, interface, and special features. It also demonstrates the implementation of current quantum algorithms on each simulator. It concludes with summary comments on both simulators.

  3. Security Risk Assessment Process for UAS in the NAS CNPC Architecture

    NASA Technical Reports Server (NTRS)

    Iannicca, Dennis C.; Young, Dennis P.; Thadani, Suresh K.; Winter, Gilbert A.

    2013-01-01

    This informational paper discusses the risk assessment process conducted to analyze Control and Non-Payload Communications (CNPC) architectures for integrating civil Unmanned Aircraft Systems (UAS) into the National Airspace System (NAS). The assessment employs the National Institute of Standards and Technology (NIST) Risk Management framework to identify threats, vulnerabilities, and risks to these architectures and recommends corresponding mitigating security controls. This process builds upon earlier work performed by RTCA Special Committee (SC) 203 and the Federal Aviation Administration (FAA) to roadmap the risk assessment methodology and to identify categories of information security risks that pose a significant impact to aeronautical communications systems. A description of the deviations from the typical process is described in regards to this aeronautical communications system. Due to the sensitive nature of the information, data resulting from the risk assessment pertaining to threats, vulnerabilities, and risks is beyond the scope of this paper.

  4. Security Risk Assessment Process for UAS in the NAS CNPC Architecture

    NASA Technical Reports Server (NTRS)

    Iannicca, Dennis Christopher; Young, Daniel Paul; Suresh, Thadhani; Winter, Gilbert A.

    2013-01-01

    This informational paper discusses the risk assessment process conducted to analyze Control and Non-Payload Communications (CNPC) architectures for integrating civil Unmanned Aircraft Systems (UAS) into the National Airspace System (NAS). The assessment employs the National Institute of Standards and Technology (NIST) Risk Management framework to identify threats, vulnerabilities, and risks to these architectures and recommends corresponding mitigating security controls. This process builds upon earlier work performed by RTCA Special Committee (SC) 203 and the Federal Aviation Administration (FAA) to roadmap the risk assessment methodology and to identify categories of information security risks that pose a significant impact to aeronautical communications systems. A description of the deviations from the typical process is described in regards to this aeronautical communications system. Due to the sensitive nature of the information, data resulting from the risk assessment pertaining to threats, vulnerabilities, and risks is beyond the scope of this paper

  5. An Own-Race Advantage for Components as Well as Configurations in Face Recognition

    ERIC Educational Resources Information Center

    Hayward, William G.; Rhodes, Gillian; Schwaninger, Adrian

    2008-01-01

    The own-race advantage in face recognition has been hypothesized as being due to a superiority in the processing of configural information for own-race faces. Here we examined the contributions of both configural and component processing to the own-race advantage. We recruited 48 Caucasian participants in Australia and 48 Chinese participants in…

  6. Rational learning and information sampling: on the "naivety" assumption in sampling explanations of judgment biases.

    PubMed

    Le Mens, Gaël; Denrell, Jerker

    2011-04-01

    Recent research has argued that several well-known judgment biases may be due to biases in the available information sample rather than to biased information processing. Most of these sample-based explanations assume that decision makers are "naive": They are not aware of the biases in the available information sample and do not correct for them. Here, we show that this "naivety" assumption is not necessary. Systematically biased judgments can emerge even when decision makers process available information perfectly and are also aware of how the information sample has been generated. Specifically, we develop a rational analysis of Denrell's (2005) experience sampling model, and we prove that when information search is interested rather than disinterested, even rational information sampling and processing can give rise to systematic patterns of errors in judgments. Our results illustrate that a tendency to favor alternatives for which outcome information is more accessible can be consistent with rational behavior. The model offers a rational explanation for behaviors that had previously been attributed to cognitive and motivational biases, such as the in-group bias or the tendency to prefer popular alternatives. 2011 APA, all rights reserved
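
    As a hedged illustration of the experience-sampling logic discussed above (a simplified variant, not Denrell's exact 2005 model and not the authors' rational analysis), the sketch below shows how an agent that updates its estimate correctly, but resamples an alternative only while that estimate is favorable, ends up with a systematic negative bias. All parameters are assumptions.

        # "Hot stove" style sketch: interested sampling produces biased final estimates
        # even though each individual update is unbiased.
        import numpy as np

        def mean_final_bias(interested=True, true_mean=0.0, noise=1.0,
                            periods=50, agents=5000, seed=0):
            rng = np.random.default_rng(seed)
            total = 0.0
            for _ in range(agents):
                estimate = rng.normal(true_mean, noise)       # first forced sample
                n = 1
                for _ in range(periods):
                    if interested and estimate <= 0.0:
                        continue                              # avoids the alternative
                    x = rng.normal(true_mean, noise)
                    n += 1
                    estimate += (x - estimate) / n            # incremental mean
                total += estimate - true_mean
            return total / agents

        print("interested sampling bias:   ", round(mean_final_bias(True), 3))
        print("disinterested sampling bias:", round(mean_final_bias(False), 3))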

  7. Always look on the broad side of life: happiness increases the breadth of sensory memory.

    PubMed

    Kuhbandner, Christof; Lichtenfeld, Stephanie; Pekrun, Reinhard

    2011-08-01

    Research has shown that positive affect increases the breadth of information processing at several higher stages of information processing, such as attentional selection or knowledge activation. In the present study, we examined whether these affective influences are already present at the level of transiently storing incoming information in sensory memory, before attentional selection takes place. After inducing neutral, happy, or sad affect, participants performed an iconic memory task which measures visual sensory memory. In all conditions, iconic memory performance rapidly decreased with increasing delay between stimulus presentation and test, indicating that affect did not influence the decay of iconic memory. However, positive affect increased the amount of incoming information stored in iconic memory. In particular, our results showed that this occurs due to an elimination of the spatial bias typically observed in iconic memory. Whereas performance did not differ at positions where observers in the neutral and negative conditions showed the highest performance, positive affect enhanced performance at all positions where observers in the neutral and negative conditions were relatively "blind." These findings demonstrate that affect influences the breadth of information processing already at earliest processing stages, suggesting that affect may produce an even more fundamental shift in information processing than previously believed. 2011 APA, all rights reserved

  8. Converting Optically Scanned Regular or Irregular Tables to a Standardised Markup Format to Be Accessible to Vision-Impaired

    ERIC Educational Resources Information Center

    Nazemi, Azadeh; Murray, Iain; Fernaando, Chandrika; McMeekin, David A.

    2016-01-01

    Documents use tables to communicate multidimensional information clearly and to summarise and present data in an easy-to-interpret way. By its nature, tabular information in a scanned PDF is not accessible, without further processing, to vision-impaired people who use assistive technology such as screen readers. The lack of access to table contents…

  9. 76 FR 78083 - Agency Information Collection (Appeal to Board of Veterans' Appeals) Activities Under OMB Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-15

    ... Human Resources and Housing Branch, New Executive Office Building, Room 10235, Washington, DC 20503 (202... appellant and the BVA must be informed so that the appellant's rights may be adequately protected and so... required by basic Constitutional due-process and by Title 38 U.S.C. 7107(b). From time to time, hearing...

  10. Measuring the effect of attention on simple visual search.

    PubMed

    Palmer, J; Ames, C T; Lindsey, D T

    1993-02-01

    Set-size effects in visual search may be due to 1 or more of 3 factors: sensory processes such as lateral masking between stimuli, attentional processes limiting the perception of individual stimuli, or attentional processes affecting the decision rules for combining information from multiple stimuli. These possibilities were evaluated in tasks such as searching for a longer line among shorter lines. To evaluate sensory contributions, display set-size effects were compared with cuing conditions that held sensory phenomena constant. Similar effects for the display and cue manipulations suggested that sensory processes contributed little under the conditions of this experiment. To evaluate the contribution of decision processes, the set-size effects were modeled with signal detection theory. In these models, a decision effect alone was sufficient to predict the set-size effects without any attentional limitation due to perception.
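
    As a hedged illustration of the decision-level account (not the authors' model fits), the Monte Carlo sketch below implements a max-rule signal detection observer: one noisy sample per item, respond "target present" if the largest sample exceeds a criterion. Accuracy falls with set size even though the perception of each individual item is unchanged. All parameter values are assumptions.

        # Max-rule signal detection model of the set-size effect.
        import numpy as np

        def max_rule_accuracy(set_size, d_prime=2.0, criterion=None, trials=200_000, seed=0):
            rng = np.random.default_rng(seed)
            if criterion is None:
                criterion = d_prime / 2.0                     # simple fixed criterion
            # Target-present trials: one item at mean d', the rest are distractors at 0.
            present = rng.standard_normal((trials, set_size))
            present[:, 0] += d_prime
            hits = (present.max(axis=1) > criterion).mean()
            # Target-absent trials: all items are distractors.
            absent = rng.standard_normal((trials, set_size))
            false_alarms = (absent.max(axis=1) > criterion).mean()
            return 0.5 * (hits + (1.0 - false_alarms))        # accuracy, equal priors

        for m in (1, 2, 4, 8):
            print(f"set size {m}: accuracy {max_rule_accuracy(m):.3f}")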

  11. Modeling recent economic debates

    NASA Astrophysics Data System (ADS)

    Skiadas, Christos H.

    The previous years' disaster in the stock markets all over the world and the resulting economic crisis led to serious criticism of the various models used. It was evident that large fluctuations and sudden losses may occur even in a context as well organized and supervised as the European Union appears to be. In order to explain the economic systems, we explore models of interacting and conflicting populations. The populations conflict within the same environment (a stock market or a group of countries such as the EU). Three models were introduced: 1) the Lotka-Volterra model, 2) the Lanchester or Richardson model, and 3) a new model for two conflicting populations. These models assume immediate interaction between the two conflicting populations. This is usually not the case in a stock market or between countries, as delays in the information process arise. The main rules include mutual interaction between adopters and potential adopters, word-of-mouth communication and, of course, the innovation diffusion process. In a previous paper (Skiadas, 2010 [9]) we had proposed and analyzed a model including mutual interaction with delays due to the innovation diffusion process. The model characteristics were expressed by third order terms providing four characteristic symmetric stationary points. In this paper we summarize the previous results and analyze a non-symmetric case where the leading part receives the information immediately while the second part receives the information following a delay mechanism due to the innovation diffusion process (the spread of information), which can be expressed by a third order term. In the latter case the non-symmetric process leads to gains for the leading part while the second part oscillates between gains and losses over time.
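
    The abstract does not give the third-order delayed equations, so the sketch below only illustrates the general idea of two populations conflicting in a common environment, using a plain Lotka-Volterra competition system integrated with an Euler step; all coefficients are assumptions and are not taken from the paper.

        # Two conflicting populations under Lotka-Volterra competition dynamics.
        import numpy as np

        def lotka_volterra_competition(x0=0.6, y0=0.4, r1=1.0, r2=0.8,
                                       a12=0.9, a21=1.1, dt=0.01, steps=5000):
            x, y = x0, y0
            traj = np.empty((steps, 2))
            for i in range(steps):
                dx = r1 * x * (1.0 - x - a12 * y)   # growth limited by self and rival
                dy = r2 * y * (1.0 - y - a21 * x)
                x += dt * dx
                y += dt * dy
                traj[i] = (x, y)
            return traj

        traj = lotka_volterra_competition()
        print("final shares:", traj[-1].round(3))   # the less-pressured population dominates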

  12. Modeling of information diffusion in Twitter-like social networks under information overload.

    PubMed

    Li, Pei; Li, Wei; Wang, Hui; Zhang, Xin

    2014-01-01

    Due to the existence of information overload in social networks, it becomes increasingly difficult for users to find useful information according to their interests. This paper takes Twitter-like social networks into account and proposes models to characterize the process of information diffusion under information overload. Users are classified into different types according to their in-degrees and out-degrees, and user behaviors are generalized into two categories: generating and forwarding. View scope is introduced to model the user information-processing capability under information overload, and the average number of times a message appears in view scopes after it is generated by a given type user is adopted to characterize the information diffusion efficiency, which is calculated theoretically. To verify the accuracy of theoretical analysis results, we conduct simulations and provide the simulation results, which are consistent with the theoretical analysis results perfectly. These results are of importance to understand the diffusion dynamics in social networks, and this analysis framework can be extended to consider more realistic situations.
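
    The paper derives its view-scope metric analytically; the following is only a crude simulation sketch of the same idea, in which each follower keeps a finite buffer of the most recent messages and a tagged message is counted each round it is still visible. Heavier background traffic pushes the message out of view scopes sooner. All parameters are assumptions.

        # View-scope sketch: information overload shortens a message's visible lifetime.
        import random
        from collections import deque

        def appearances(view_scope=10, followers=50, background_rate=5.0,
                        rounds=30, seed=42):
            rng = random.Random(seed)
            feeds = [deque(maxlen=view_scope) for _ in range(followers)]
            for f in feeds:
                f.append("tagged")               # the message is delivered to every feed
            seen = 0
            for _ in range(rounds):
                for f in feeds:
                    # Background messages from other accounts push older items out.
                    for _ in range(rng.randint(0, int(2 * background_rate))):
                        f.append("other")
                    if "tagged" in f:
                        seen += 1                # still inside this user's view scope
            return seen

        for rate in (1.0, 5.0, 20.0):
            print(f"background rate {rate:>4}: tagged message seen "
                  f"{appearances(background_rate=rate)} times across view scopes")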

  13. Modeling of Information Diffusion in Twitter-Like Social Networks under Information Overload

    PubMed Central

    Li, Wei

    2014-01-01

    Due to the existence of information overload in social networks, it becomes increasingly difficult for users to find useful information according to their interests. This paper takes Twitter-like social networks into account and proposes models to characterize the process of information diffusion under information overload. Users are classified into different types according to their in-degrees and out-degrees, and user behaviors are generalized into two categories: generating and forwarding. View scope is introduced to model the user information-processing capability under information overload, and the average number of times a message appears in view scopes after it is generated by a given type user is adopted to characterize the information diffusion efficiency, which is calculated theoretically. To verify the accuracy of theoretical analysis results, we conduct simulations and provide the simulation results, which are consistent with the theoretical analysis results perfectly. These results are of importance to understand the diffusion dynamics in social networks, and this analysis framework can be extended to consider more realistic situations. PMID:24795541

  14. Close coupling of pre- and post-processing vision stations using inexact algorithms

    NASA Astrophysics Data System (ADS)

    Shih, Chi-Hsien V.; Sherkat, Nasser; Thomas, Peter D.

    1996-02-01

    Work has been reported on using lasers to cut deformable materials. Although the use of a laser reduces material deformation, distortion due to mechanical feed misalignment persists. Changes in the lace pattern are also caused by the release of tension in the lace structure as it is cut. To tackle the problem of distortion due to material flexibility, the 2VMethod is developed together with the Piecewise Error Compensation Algorithm, which incorporates the inexact algorithms, i.e., fuzzy logic, neural networks and the neural fuzzy technique. A spring-mounted pen is used to emulate the distortion of the lace pattern caused by tactile cutting and feed misalignment. Using pre- and post-processing vision systems, it is possible to monitor the scalloping process and generate on-line information for the artificial intelligence engines. This overcomes the problems of lace distortion due to the trimming process. Applying the algorithms developed, the system can produce excellent results, much better than a human operator.

  15. Emotional attention for erotic stimuli: Cognitive and brain mechanisms.

    PubMed

    Sennwald, Vanessa; Pool, Eva; Brosch, Tobias; Delplanque, Sylvain; Bianchi-Demicheli, Francesco; Sander, David

    2016-06-01

    It has long been posited that among emotional stimuli, only negative threatening information modulates early shifts of attention. However, in the last few decades there has been an increase in research showing that attention is also involuntarily oriented toward positive rewarding stimuli such as babies, food, and erotic information. Because reproduction-related stimuli have some of the largest effects among positive stimuli on emotional attention, the present work reviews recent literature and proposes that the cognitive and cerebral mechanisms underlying the involuntary attentional orientation toward threat-related information are also sensitive to erotic information. More specifically, the recent research suggests that both types of information involuntarily orient attention due to their concern relevance and that the amygdala plays an important role in detecting concern-relevant stimuli, thereby enhancing perceptual processing and influencing emotional attentional processes. © 2015 Wiley Periodicals, Inc.

  16. Information Dissemination of Public Health Emergency on Social Networks and Intelligent Computation

    PubMed Central

    Hu, Hongzhi; Mao, Huajuan; Hu, Xiaohua; Hu, Feng; Sun, Xuemin; Jing, Zaiping; Duan, Yunsuo

    2015-01-01

    Due to their extensive social influence, public health emergencies have attracted great attention in today's society. Booming social networks are becoming a main information dissemination platform for those events and have raised high concern in emergency management, where a good prediction of information dissemination in social networks is necessary for estimating an event's social impact and devising a proper strategy. However, information dissemination is largely affected by complex interactive activities and group behaviors in social networks; the existing methods and models struggle to achieve a satisfactory prediction result due to the open, changeable social connections and uncertain information processing behaviors. ACP (artificial societies, computational experiments, and parallel execution) provides an effective way to simulate the real situation. In order to obtain better information dissemination prediction in social networks, this paper proposes an intelligent computation method under the framework of TDF (Theory-Data-Feedback) based on an ACP simulation system, which was successfully applied to the analysis of the A (H1N1) Flu emergency. PMID:26609303

  17. Information Dissemination of Public Health Emergency on Social Networks and Intelligent Computation.

    PubMed

    Hu, Hongzhi; Mao, Huajuan; Hu, Xiaohua; Hu, Feng; Sun, Xuemin; Jing, Zaiping; Duan, Yunsuo

    2015-01-01

    Due to their extensive social influence, public health emergencies have attracted great attention in today's society. Booming social networks are becoming a main information dissemination platform for those events and have raised high concern in emergency management, where a good prediction of information dissemination in social networks is necessary for estimating an event's social impact and devising a proper strategy. However, information dissemination is largely affected by complex interactive activities and group behaviors in social networks; the existing methods and models struggle to achieve a satisfactory prediction result due to the open, changeable social connections and uncertain information processing behaviors. ACP (artificial societies, computational experiments, and parallel execution) provides an effective way to simulate the real situation. In order to obtain better information dissemination prediction in social networks, this paper proposes an intelligent computation method under the framework of TDF (Theory-Data-Feedback) based on an ACP simulation system, which was successfully applied to the analysis of the A (H1N1) Flu emergency.

  18. Development of a program logic model and evaluation plan for a participatory ergonomics intervention in construction.

    PubMed

    Jaegers, Lisa; Dale, Ann Marie; Weaver, Nancy; Buchholz, Bryan; Welch, Laura; Evanoff, Bradley

    2014-03-01

    Intervention studies in participatory ergonomics (PE) are often difficult to interpret due to limited descriptions of program planning and evaluation. In an ongoing PE program with floor layers, we developed a logic model to describe our program plan, and process and summative evaluations designed to describe the efficacy of the program. The logic model was a useful tool for describing the program elements and subsequent modifications. The process evaluation measured how well the program was delivered as intended, and revealed the need for program modifications. The summative evaluation provided early measures of the efficacy of the program as delivered. Inadequate information on program delivery may lead to erroneous conclusions about intervention efficacy due to Type III error. A logic model guided the delivery and evaluation of our intervention and provides useful information to aid interpretation of results. © 2013 Wiley Periodicals, Inc.

  19. Development of a Program Logic Model and Evaluation Plan for a Participatory Ergonomics Intervention in Construction

    PubMed Central

    Jaegers, Lisa; Dale, Ann Marie; Weaver, Nancy; Buchholz, Bryan; Welch, Laura; Evanoff, Bradley

    2013-01-01

    Background Intervention studies in participatory ergonomics (PE) are often difficult to interpret due to limited descriptions of program planning and evaluation. Methods In an ongoing PE program with floor layers, we developed a logic model to describe our program plan, and process and summative evaluations designed to describe the efficacy of the program. Results The logic model was a useful tool for describing the program elements and subsequent modifications. The process evaluation measured how well the program was delivered as intended, and revealed the need for program modifications. The summative evaluation provided early measures of the efficacy of the program as delivered. Conclusions Inadequate information on program delivery may lead to erroneous conclusions about intervention efficacy due to Type III error. A logic model guided the delivery and evaluation of our intervention and provides useful information to aid interpretation of results. PMID:24006097

  20. End users transforming experiences into formal information and process models for personalised health interventions.

    PubMed

    Lindgren, Helena; Lundin-Olsson, Lillemor; Pohl, Petra; Sandlund, Marlene

    2014-01-01

    Five physiotherapists organised a user-centric design process of a knowledge-based support system for promoting exercise and preventing falls. The process integrated focus group studies with 17 older adults and prototyping. The transformation of informal medical and rehabilitation expertise and older adults' experiences into formal information and process models during the development was studied. As a tool they used ACKTUS, a development platform for knowledge-based applications. The process became agile and incremental, partly due to the diversity of expectations and preferences among both older adults and physiotherapists, and the participatory approach to design and development. In addition, there was a need to develop the knowledge content alongside the formal models and their presentations, which allowed the participants to test hands-on and evaluate the ideas, content and design. The resulting application is modular, extendable, flexible and adaptable to the individual end user. Moreover, the physiotherapists are able to modify the information and process models, and in this way further develop the application. The main constraint was found to be the lack of support for the initial phase of concept modelling, which led to a redesigned user interface and functionality of ACKTUS.

  1. Customized data container for improved performance in optical cryptosystems

    NASA Astrophysics Data System (ADS)

    Vélez Zea, Alejandro; Fredy Barrera, John; Torroba, Roberto

    2016-12-01

    Coherent optical encryption procedures introduce speckle noise to the output, limiting many practical applications. Until now, the only method available to avoid this noise has been to codify the information to be processed into a container that is encrypted instead of the original data. Although the decrypted container presents the noise due to the optical processing, its features remain recognizable enough to allow decoding, recovering the original information free of any kind of degradation. The first adopted containers were quick response (QR) codes. However, the limitations of optical encryption procedures and the features of QR codes imply that in practice only simple codes containing small amounts of data can be processed without large experimental requirements. In order to overcome this problem, we introduce the first tailor-made container to be processed in optical cryptosystems, ensuring larger noise tolerance and the ability to process more information with fewer experimental requirements. We present both simulations and experimental results to demonstrate the advantages of our proposal.

  2. Prioritization of engineering support requests and advanced technology projects using decision support and industrial engineering models

    NASA Technical Reports Server (NTRS)

    Tavana, Madjid

    1995-01-01

    The evaluation and prioritization of Engineering Support Requests (ESR's) is a particularly difficult task at the Kennedy Space Center (KSC) -- Shuttle Project Engineering Office. This difficulty is due to the complexities inherent in the evaluation process and the lack of structured information. The evaluation process must consider a multitude of relevant pieces of information concerning Safety, Supportability, O&M Cost Savings, Process Enhancement, Reliability, and Implementation. Various analytical and normative models developed in the past have helped decision makers at KSC utilize large volumes of information in the evaluation of ESR's. The purpose of this project is to build on the existing methodologies and develop a multiple criteria decision support system that captures the decision maker's beliefs through a series of sequential, rational, and analytical processes. The model utilizes the Analytic Hierarchy Process (AHP), subjective probabilities, the entropy concept, and the Maximize Agreement Heuristic (MAH) to enhance the decision maker's intuition in evaluating a set of ESR's.
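
    As a hedged sketch of the AHP step only (the full model also uses subjective probabilities, entropy weights and the Maximize Agreement Heuristic, none of which are reproduced here), the code below derives priority weights from a pairwise comparison matrix via the principal eigenvector and reports a consistency ratio. The comparison matrix in the example is hypothetical.

        # AHP priority weights from a pairwise comparison matrix.
        import numpy as np

        RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}  # Saaty's RI

        def ahp_priorities(pairwise):
            A = np.asarray(pairwise, dtype=float)
            eigvals, eigvecs = np.linalg.eig(A)
            k = int(np.argmax(eigvals.real))
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()                                  # priority weights
            n = A.shape[0]
            ci = (eigvals[k].real - n) / (n - 1)          # consistency index
            cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX.get(n) else 0.0
            return w, cr

        # Hypothetical comparison of three criteria, e.g. Safety vs Cost Savings vs Reliability.
        A = [[1,     5,   3],
             [1 / 5, 1,   1 / 2],
             [1 / 3, 2,   1]]
        weights, cr = ahp_priorities(A)
        print("weights:", weights.round(3), "consistency ratio:", round(cr, 3))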

  3. Individual differences in working memory, secondary memory, and fluid intelligence: evidence from the levels-of-processing span task.

    PubMed

    Rose, Nathan S

    2013-12-01

    Individual differences in working memory (WM) are related to performance on secondary memory (SM), and fluid intelligence (gF) tests. However, the source of the relation remains unclear, in part because few studies have controlled for the nature of encoding; therefore, it is unclear whether individual variation is due to encoding, maintenance, or retrieval processes. In the current study, participants performed a WM task (the levels-of-processing span task; Rose, Myerson, Roediger III, & Hale, 2010) and a SM test that tested for both targets and the distracting processing words from the initial WM task. Deeper levels of processing at encoding did not benefit WM, but did benefit subsequent SM, although the amount of benefit was smaller for those with lower WM spans. This result suggests that, despite encoding cues that facilitate retrieval from SM, low spans may have engaged in shallower, maintenance-focused processing to maintain the words in WM. Low spans also recalled fewer targets, more distractors, and more extralist intrusions than high spans, although this was partially due to low spans' poorer recall of targets, which resulted in a greater number of opportunities to commit recall errors. Delayed recall of intrusions and commission of source errors (labeling targets as processing words and vice versa) were significant negative predictors of gF. These results suggest that the ability to use source information to recall relevant information and withhold recall of irrelevant information is a critical source of both individual variation in WM and the relation between WM, SM, and gF. (PsycINFO Database Record (c) 2013 APA, all rights reserved).

  4. Genealogical Properties of Subsamples in Highly Fecund Populations

    NASA Astrophysics Data System (ADS)

    Eldon, Bjarki; Freund, Fabian

    2018-03-01

    We consider some genealogical properties of nested samples. The complete sample is assumed to have been drawn from a natural population characterised by high fecundity and sweepstakes reproduction (abbreviated HFSR). The random gene genealogies of the samples are—due to our assumption of HFSR—modelled by coalescent processes which admit multiple mergers of ancestral lineages looking back in time. Among the genealogical properties we consider are the probability that the most recent common ancestor is shared between the complete sample and the subsample nested within the complete sample; we also compare the lengths of `internal' branches of nested genealogies between different coalescent processes. The results indicate how `informative' a subsample is about the properties of the larger complete sample, how much information is gained by increasing the sample size, and how the `informativeness' of the subsample varies between different coalescent processes.
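
    The paper's focus is on multiple-merger coalescents, which are not implemented here; as a hedged illustration of one quantity of interest, the sketch below simulates the standard Kingman coalescent and estimates the probability that a nested subsample shares its most recent common ancestor with the complete sample. Sample sizes and replicate counts are arbitrary choices.

        # Kingman-coalescent sketch: does the subsample's MRCA coincide with the sample's?
        import random

        def shares_mrca(n, k, rng):
            lineages = [{i} for i in range(n)]       # each lineage = set of leaves below it
            sub = set(range(k))                      # the nested subsample: leaves 0..k-1
            while len(lineages) > 1:
                i, j = rng.sample(range(len(lineages)), 2)   # Kingman: binary mergers
                merged = lineages[i] | lineages[j]
                lineages = [l for idx, l in enumerate(lineages) if idx not in (i, j)]
                if sub <= merged and len(merged) < n:
                    return False                     # subsample found its MRCA earlier
                lineages.append(merged)
            return True                              # subsample MRCA == sample MRCA

        rng = random.Random(3)
        n, k, reps = 20, 5, 20_000
        est = sum(shares_mrca(n, k, rng) for _ in range(reps)) / reps
        print(f"P(subsample of {k} shares MRCA with sample of {n}) ~ {est:.3f}")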

  5. Cognitive control in media multitaskers

    PubMed Central

    Ophir, Eyal; Nass, Clifford; Wagner, Anthony D.

    2009-01-01

    Chronic media multitasking is quickly becoming ubiquitous, although processing multiple incoming streams of information is considered a challenge for human cognition. A series of experiments addressed whether there are systematic differences in information processing styles between chronically heavy and light media multitaskers. A trait media multitasking index was developed to identify groups of heavy and light media multitaskers. These two groups were then compared along established cognitive control dimensions. Results showed that heavy media multitaskers are more susceptible to interference from irrelevant environmental stimuli and from irrelevant representations in memory. This led to the surprising result that heavy media multitaskers performed worse on a test of task-switching ability, likely due to reduced ability to filter out interference from the irrelevant task set. These results demonstrate that media multitasking, a rapidly growing societal trend, is associated with a distinct approach to fundamental information processing. PMID:19706386

  6. Deciding treatment for miscarriage--experiences of women and healthcare professionals.

    PubMed

    Olesen, Mette Linnet; Graungaard, Anette H; Husted, Gitte R

    2015-06-01

    Women experiencing miscarriage are offered a choice of different treatments to terminate their wanted pregnancy at a time when they are often shocked and distressed. Women's and healthcare professionals' experiences of the decision-making process are not well described. We aimed to gain insight into this process and the circumstances that may affect it. A qualitative study using a grounded theory approach. Data were obtained through semi-structured interviews with six women who had chosen and completed either surgical, medical or expectant treatment for miscarriage and five healthcare professionals involved in the decision-making at an emergency gynaecological department in Denmark. An inductive explorative method was chosen due to limited knowledge about the decision-making process, and a theoretical perspective was not applied until the final analysis. Despite information and pretreatment counselling, choice of treatment was often determined by unspoken emotional considerations, including fear of seeing the foetus or fear of anaesthesia. These considerations were not discussed during the decision-making process, which was a time when the women were under time pressure and experienced emotional distress. Healthcare professionals did not explore women's considerations for choosing a particular treatment and prioritised information differently. We found theory about coping and decision-making in stressful situations useful in increasing our understanding of the women's reactions. In relation to theory about informed consent, our findings suggest that women need more understanding of the treatments before making a decision. This study is limited due to a small sample size, but it generates important findings that need to be examined in a larger sample. Frequently, women did not use information provided about treatment pros and cons in their decision-making process. Because of unspoken thoughts, and women's needs being unexplored by healthcare professionals, information did not target women's needs and their reasoning remained unapparent. © 2014 Nordic College of Caring Science.

  7. [Practice report: the process-based indicator dashboard. Visualising quality assurance results in standardised processes].

    PubMed

    Petzold, Thomas; Hertzschuch, Diana; Elchlep, Frank; Eberlein-Gonska, Maria

    2014-01-01

    Process management (PM) is a valuable method for the systematic analysis and structural optimisation of the quality and safety of clinical treatment. PM requires high motivation and a willingness to implement change from both employees and management. Definition of quality indicators is required to systematically measure the quality of the specified processes. One way to represent comparable quality results is the use of quality indicators of the external quality assurance in accordance with Sect. 137 SGB V—a method which the Federal Joint Committee (GBA) and the institutions commissioned by the GBA have employed and consistently enhanced for more than ten years. Information on the quality of inpatient treatment is available for 30 defined subjects throughout Germany. Combining the specified processes with quality indicators is beneficial for keeping employees informed. A process-based indicator dashboard provides essential information about the treatment process, which can be used for process analysis. When these indicator results are reviewed continuously, their values can be tracked and errors remedied quickly. If due consideration is given to these indicators, they can be used for benchmarking to identify potential process improvements. Copyright © 2014. Published by Elsevier GmbH.
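
    To make the idea of a process-based indicator dashboard concrete, the sketch below attaches illustrative indicator results to hypothetical process steps and flags any step that falls below its reference value. The step names, rates and thresholds are invented for illustration; the actual dashboard draws on the external quality assurance indicators under Sect. 137 SGB V.

```python
# Hypothetical process steps with made-up indicator results and reference ranges.
process_indicators = {
    "admission":  {"rate": 0.97, "reference_min": 0.95},
    "surgery":    {"rate": 0.91, "reference_min": 0.90},
    "wound care": {"rate": 0.86, "reference_min": 0.90},
    "discharge":  {"rate": 0.99, "reference_min": 0.95},
}

def dashboard(indicators):
    """Flag every process step whose indicator falls below its reference value."""
    rows = []
    for step, data in indicators.items():
        status = "OK" if data["rate"] >= data["reference_min"] else "REVIEW"
        rows.append(f"{step:<12} {data['rate']:.2f}  (ref >= {data['reference_min']:.2f})  {status}")
    return "\n".join(rows)

print(dashboard(process_indicators))
```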

  8. South Atlantic Anomaly

    Atmospheric Science Data Center

    2013-04-19

    ... This map was created by specially processing MISR "dark" data taken between 3 February and 16 February 2000, while the cover was still ... Individual orbit tracks are visible, and some tracks are missing due to data gaps, missing spacecraft navigation information, or other ...

  9. The National Response Framework: A Cross-Case Analysis

    DTIC Science & Technology

    2014-06-13

    media representatives wanting access to top officials. People with disabilities reported a lack of live captioning and sign language interpreters... use due to an inefficient check-in process. Public information shortfalls included providing inadequate information for people with disabilities, not...numbers, suggesting that the commonly used term “low-probability, high-consequence events” to describe major disasters is misleading. U.S. shores, forests

  10. Constructing Preference from Experience: The Endowment Effect Reflected in External Information Search

    ERIC Educational Resources Information Center

    Pachur, Thorsten; Scheibehenne, Benjamin

    2012-01-01

    People often attach a higher value to an object when they own it (i.e., as seller) compared with when they do not own it (i.e., as buyer)--a phenomenon known as the "endowment effect". According to recent cognitive process accounts of the endowment effect, the effect is due to differences between sellers and buyers in information search.…

  11. How to convince your manager to invest in an HIS preimplementation methodology for appraisal of material, process and human costs and benefits.

    PubMed Central

    Bossard, B.; Renard, J. M.; Capelle, P.; Paradis, P.; Beuscart, M. C.

    2000-01-01

    Investing in information technology has become a crucial process in hospital management today. Medical and administrative managers are faced with difficulties in measuring medical information technology costs and benefits due to the complexity of the domain. This paper proposes a preimplementation methodology for evaluating and appraising material, process and human costs and benefits. Based on the users' needs and organizational process analysis, the methodology provides an evaluative set of financial and non-financial indicators which can be integrated into a decision-making and investment evaluation process. We describe the first results obtained after a few months of operation for the Computer-Based Patient Record (CPR) project. Its full acceptance, in spite of some difficulties, encourages us to diffuse the method for the entire project. PMID:11079851

  12. Quantum Information Processing with Large Nuclear Spins in GaAs Semiconductors

    NASA Astrophysics Data System (ADS)

    Leuenberger, Michael N.; Loss, Daniel; Poggio, M.; Awschalom, D. D.

    2002-10-01

    We propose an implementation for quantum information processing based on coherent manipulations of nuclear spins I=3/2 in GaAs semiconductors. We describe theoretically an NMR method which involves multiphoton transitions and which exploits the nonequidistance of nuclear spin levels due to quadrupolar splittings. Starting from known spin anisotropies we derive effective Hamiltonians in a generalized rotating frame, valid for arbitrary I, which allow us to describe the nonperturbative time evolution of spin states generated by magnetic rf fields. We identify an experimentally observable regime for multiphoton Rabi oscillations. In the nonlinear regime, we find Berry phase interference.
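
    The non-equidistance of the nuclear spin levels, which is what makes individual transitions addressable for the multiphoton scheme, can be illustrated numerically. The sketch below evaluates the level energies of a generic first-order Zeeman-plus-quadrupolar Hamiltonian for I = 3/2; the frequency values are arbitrary illustrative numbers and the Hamiltonian is a textbook form, not necessarily the exact one derived in the paper.

```python
import numpy as np

# Spin-3/2 operator Iz in the |m> basis, m = +3/2 ... -3/2.
Iz = np.diag([1.5, 0.5, -0.5, -1.5])
I = 1.5

omega_0 = 1.0    # Larmor frequency (arbitrary units, illustrative)
omega_Q = 0.1    # quadrupolar coupling (arbitrary units, illustrative)

# First-order Zeeman + axially symmetric quadrupolar Hamiltonian (hbar = 1):
# H = -omega_0 * Iz + omega_Q * (Iz^2 - I(I+1)/3)
H = -omega_0 * Iz + omega_Q * (Iz @ Iz - I * (I + 1) / 3 * np.eye(4))

E = np.sort(np.diag(H))      # H is diagonal in this basis
gaps = np.diff(E)            # spacings between adjacent levels
print("level energies:  ", E)
print("adjacent spacings:", gaps)   # unequal spacings: transitions are individually addressable
```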

  13. The parietal cortex in sensemaking: the dissociation of multiple types of spatial information.

    PubMed

    Sun, Yanlong; Wang, Hongbin

    2013-01-01

    According to the data-frame theory, sensemaking is a macrocognitive process in which people try to make sense of or explain their observations by processing a number of explanatory structures called frames until the observations and frames become congruent. During the sensemaking process, the parietal cortex has been implicated in various cognitive tasks for the functions related to spatial and temporal information processing, mathematical thinking, and spatial attention. In particular, the parietal cortex plays important roles by extracting multiple representations of magnitudes at the early stages of perceptual analysis. By a series of neural network simulations, we demonstrate that the dissociation of different types of spatial information can start early with a rather similar structure (i.e., sensitivity on a common metric), but accurate representations require specific goal-directed top-down controls due to the interference in selective attention. Our results suggest that the roles of the parietal cortex rely on the hierarchical organization of multiple spatial representations and their interactions. The dissociation and interference between different types of spatial information are essentially the result of the competition at different levels of abstraction.

  14. The Parietal Cortex in Sensemaking: The Dissociation of Multiple Types of Spatial Information

    PubMed Central

    Sun, Yanlong; Wang, Hongbin

    2013-01-01

    According to the data-frame theory, sensemaking is a macrocognitive process in which people try to make sense of or explain their observations by processing a number of explanatory structures called frames until the observations and frames become congruent. During the sensemaking process, the parietal cortex has been implicated in various cognitive tasks for the functions related to spatial and temporal information processing, mathematical thinking, and spatial attention. In particular, the parietal cortex plays important roles by extracting multiple representations of magnitudes at the early stages of perceptual analysis. By a series of neural network simulations, we demonstrate that the dissociation of different types of spatial information can start early with a rather similar structure (i.e., sensitivity on a common metric), but accurate representations require specific goal-directed top-down controls due to the interference in selective attention. Our results suggest that the roles of the parietal cortex rely on the hierarchical organization of multiple spatial representations and their interactions. The dissociation and interference between different types of spatial information are essentially the result of the competition at different levels of abstraction. PMID:23710165

  15. The role of vision processing in prosthetic vision.

    PubMed

    Barnes, Nick; He, Xuming; McCarthy, Chris; Horne, Lachlan; Kim, Junae; Scott, Adele; Lieby, Paulette

    2012-01-01

    Prosthetic vision provides vision which is reduced in resolution and dynamic range compared to normal human vision. This comes about both due to residual damage to the visual system from the condition that caused vision loss, and due to limitations of current technology. However, even with limitations, prosthetic vision may still be able to support functional performance which is sufficient for tasks which are key to restoring independent living and quality of life. Here vision processing can play a key role, ensuring that information which is critical to the performance of key tasks is available within the capability of the available prosthetic vision. In this paper, we frame vision processing for prosthetic vision, highlight some key areas which present problems in terms of quality of life, and present examples where vision processing can help achieve better outcomes.
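
    As a rough illustration of the reduced resolution and dynamic range the authors describe, the sketch below downsamples an image to a coarse grid and quantises it to a few brightness levels, the kind of output budget a vision-processing stage has to work within. The grid size and number of levels are arbitrary assumptions, not parameters of any specific implant or of the authors' system.

```python
import numpy as np

def simulate_prosthetic_view(image, grid=(32, 24), levels=8):
    """Reduce an image to a coarse grid with few brightness levels, a rough
    stand-in for the reduced resolution and dynamic range of prosthetic vision.
    Grid size and number of levels are illustrative assumptions only."""
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    gh, gw = grid
    # block-average the image down to the coarse grid
    trimmed = img[:h - h % gh, :w - w % gw]
    blocks = trimmed.reshape(gh, trimmed.shape[0] // gh, gw, trimmed.shape[1] // gw)
    coarse = blocks.mean(axis=(1, 3))
    # quantise to a small number of brightness levels (values assumed in [0, 1])
    return np.round(coarse * (levels - 1)) / (levels - 1)

rng = np.random.default_rng(0)
scene = rng.random((480, 640))
print(simulate_prosthetic_view(scene).shape)
```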

  16. Decision-Making Process Related to Participation in Phase I Clinical Trials: A Nonsystematic Review of the Existing Evidence.

    PubMed

    Gorini, Alessandra; Mazzocco, Ketti; Pravettoni, Gabriella

    2015-01-01

    Due to the lack of other treatment options, patient candidates for participation in phase I clinical trials are considered the most vulnerable, and many ethical concerns have emerged regarding the informed consent process used in the experimental design of such trials. Starting with these considerations, this nonsystematic review is aimed at analyzing the decision-making processes underlying patients' decision about whether to participate (or not) in phase I trials in order to clarify the cognitive and emotional aspects most strongly implicated in this decision. Considering that there is no uniform decision calculus and that many different variables other than the patient-physician relationship (including demographic, clinical, and personal characteristics) may influence patients' preferences for and processing of information, we conclude that patients' informed decision-making can be facilitated by creating a rigorously developed, calibrated, and validated computer tool modeled on each single patient's knowledge, values, and emotional and cognitive decisional skills. Such a tool will also help oncologists to provide tailored medical information that is useful to improve the shared decision-making process, thereby possibly increasing patient participation in clinical trials. © 2015 S. Karger AG, Basel.

  17. Propositional Versus Dual-Process Accounts of Evaluative Conditioning: I. The Effects of Co-Occurrence and Relational Information on Implicit and Explicit Evaluations.

    PubMed

    Hu, Xiaoqing; Gawronski, Bertram; Balas, Robert

    2017-01-01

    Evaluative conditioning (EC) is defined as the change in the evaluation of a conditioned stimulus (CS) due to its pairing with a valenced unconditioned stimulus (US). According to propositional accounts, EC effects should be qualified by the relation between the CS and the US. Dual-process accounts suggest that relational information should qualify EC effects on explicit evaluations, whereas implicit evaluations should reflect the frequency of CS-US co-occurrences. Experiments 1 and 2 showed that, when relational information was provided before the encoding of CS-US pairings, it moderated EC effects on explicit, but not implicit, evaluations. In Experiment 3, relational information moderated EC effects on both explicit and implicit evaluations when it was provided simultaneously with CS-US pairings. Frequency of CS-US pairings had no effect on implicit evaluations. Although the results can be reconciled with both propositional and dual-process accounts, they are more parsimoniously explained by propositional accounts.

  18. Exploring the Notion of Context in Medical Data.

    PubMed

    Mylonas, Phivos

    2017-01-01

    Scientific and technological knowledge and skills are becoming crucial for most data analysis activities. Two rather distinct, but at the same time collaborating, domains are the ones of computer science and medicine; the former offers significant aid towards a more efficient understanding of the latter's research trends. Still, the process of meaningfully analyzing and understanding medical information and data is a tedious one, bound to several challenges. One of them is the efficient utilization of contextual information in the process leading to optimized, context-aware data analysis results. Nowadays, researchers are provided with tools and opportunities to analytically study medical data, but at the same time significant and rather complex computational challenges are yet to be tackled, among others due to the humanistic nature and increased rate of new content and information production imposed by related hardware and applications. So, the ultimate goal of this position paper is to provide interested parties an overview of major contextual information types to be identified within the medical data processing framework.

  19. Use of Facebook in the maternal grief process: An exploratory qualitative study.

    PubMed

    Perluxo, Diana; Francisco, Rita

    2018-02-01

    This study seeks to explore the potential implications of Facebook use in the process of maternal grief. The participants were 11 women who had lost their children due to accidents or prolonged illness. Semistructured interviews were conducted and subjected to thematic analysis. The participants stated that they used Facebook to receive support, to identify with other mothers, to remember the child who died, to access the child's information, to honor him/her, and to express their feelings. The use of Facebook can play a very important role in the initial phase of grieving due to the functions of this social network.

  20. Automation of Survey Data Processing, Documentation and Dissemination: An Application to Large-Scale Self-Reported Educational Survey.

    ERIC Educational Resources Information Center

    Shim, Eunjae; Shim, Minsuk K.; Felner, Robert D.

    Automation of the survey process has proved successful in many industries, yet it is still underused in educational research. This is largely due to the facts (1) that number crunching is usually carried out using software that was developed before information technology existed, and (2) that the educational research is to a great extent trapped…

  1. Micro-evolution due to pollution: possible consequences for ecosystem responses to toxic stress.

    PubMed

    Medina, Matías H; Correa, Juan A; Barata, Carlos

    2007-05-01

    Polluting events can change community structure and ecosystem functioning. Selection of genetically inherited tolerance in exposed populations, here referred to as micro-evolution due to pollution, has been recognized as one of the causes of these changes. However, there is a gap between studies addressing this process and those assessing effects at higher levels of biological organization. In this review we attempt to incorporate these evolutionary considerations into the ecological risk assessment (ERA) of polluting events and to trigger the discussion about the consequences of this process for the ecosystem response to toxic stress. We provide clear evidence that pollution drives micro-evolutionary processes in several species. When this process occurs, populations inhabiting environments that become polluted may persist. However, due to the existence of ecological costs derived from the loss of genetic variability, negative pleiotropy with fitness traits and/or from physiological alterations, micro-evolution due to pollution may alter different properties of the affected populations. Despite the existence of empirical evidence showing that safety margins currently applied in the ERA process may account for pollution-induced genetic changes in tolerance, information regarding long-term ecological consequences at higher levels of biological organization due to ecological costs is not explicitly considered in these procedures. In relation to this, we present four testable hypotheses considering that micro-evolution due to pollution acts upon the variability of functional response traits of the exposed populations and generates changes in their functional effect traits, thereby modifying the way species exploit their ecological niches and participate in the overall ecosystem functioning.

  2. Legal and psychological considerations for obtaining informed consent for reverse total shoulder arthroplasty.

    PubMed

    Blackwood, Craig; Dixon, Jen; Reilly, Peter; Emery, Roger J

    2017-01-01

    This paper seeks to outline recent legal developments and requirements pertinent to obtaining informed consent. We argue that this is of particular relevance to patients considering a reverse total shoulder arthroplasty, due to the high complication rate associated with this procedure. By examining the cognitive processes involved in decision-making, and other clinician-related factors such as delivery of information, gender bias and conflict of interest, we explore some of the barriers that can undermine the processes of shared decision-making and obtaining genuine informed consent. We argue that these issues highlight the importance for surgeons in understanding the cognitive processes and other influential factors involved in patients' comprehension and decision-making. We recommend, based on strong evidence, that decision aids could prove useful in overcoming such challenges and could provide one way of mitigating the ethical, professional and legal consequences of failing to obtain proper informed consent. They are not widely used in orthopaedics at present, although it would be in the interests of both the surgeon and patient for such measures to be explored.

  3. Legal and psychological considerations for obtaining informed consent for reverse total shoulder arthroplasty

    PubMed Central

    Blackwood, Craig; Reilly, Peter; Emery, Roger J

    2016-01-01

    This paper seeks to outline recent legal developments and requirements pertinent to obtaining informed consent. We argue that this is of particular relevance to patients considering a reverse total shoulder arthroplasty, due to the high complication rate associated with this procedure. By examining the cognitive processes involved in decision-making, and other clinician-related factors such as delivery of information, gender bias and conflict of interest, we explore some of the barriers that can undermine the processes of shared decision-making and obtaining genuine informed consent. We argue that these issues highlight the importance for surgeons in understanding the cognitive processes and other influential factors involved in patients’ comprehension and decision-making. We recommend, based on strong evidence, that decision aids could prove useful in overcoming such challenges and could provide one way of mitigating the ethical, professional and legal consequences of failing to obtain proper informed consent. They are not widely used in orthopaedics at present, although it would be in the interests of both the surgeon and patient for such measures to be explored. PMID:28572846

  4. A strategy to improve priority setting in developing countries.

    PubMed

    Kapiriri, Lydia; Martin, Douglas K

    2007-09-01

    Because the demand for health services outstrips the available resources, priority setting is one of the most difficult issues faced by health policy makers, particularly those in developing countries. Priority setting in developing countries is fraught with uncertainty due to lack of credible information, weak priority setting institutions, and unclear priority setting processes. Efforts to improve priority setting in these contexts have focused on providing information and tools. In this paper we argue that priority setting is a value laden and political process, and although important, the available information and tools are not sufficient to address the priority setting challenges in developing countries. Additional complementary efforts are required. Hence, a strategy to improve priority setting in developing countries should also include: (i) capturing current priority setting practices, (ii) improving the legitimacy and capacity of institutions that set priorities, and (iii) developing fair priority setting processes.

  5. Astrocytic control of synaptic function.

    PubMed

    Papouin, Thomas; Dunphy, Jaclyn; Tolman, Michaela; Foley, Jeannine C; Haydon, Philip G

    2017-03-05

    Astrocytes intimately interact with synapses, both morphologically and, as evidenced in the past 20 years, at the functional level. Ultrathin astrocytic processes contact and sometimes enwrap the synaptic elements, sense synaptic transmission and shape or alter the synaptic signal by releasing signalling molecules. Yet, the consequences of such interactions in terms of information processing in the brain remain very elusive. This is largely due to two major constraints: (i) the exquisitely complex, dynamic and ultrathin nature of distal astrocytic processes that renders their investigation highly challenging and (ii) our lack of understanding of how information is encoded by local and global fluctuations of intracellular calcium concentrations in astrocytes. Here, we will review the existing anatomical and functional evidence of local interactions between astrocytes and synapses, and how it underlies a role for astrocytes in the computation of synaptic information. This article is part of the themed issue 'Integrating Hebbian and homeostatic plasticity'. © 2017 The Author(s).

  6. Phone Conversation while Processing Information: Chronometric Analysis of Load Effects in Everyday-media Multitasking

    PubMed Central

    Steinborn, Michael B.; Huestegge, Lynn

    2017-01-01

    This is a pilot study that examined the effect of cell-phone conversation on cognition using a continuous multitasking paradigm. Current theorizing argues that phone conversation affects behavior (e.g., driving) by interfering at a level of cognitive processes (not peripheral activity) and by implying an attentional-failure account. Within the framework of an intermittent spare–utilized capacity threading model, we examined the effect of aspects of (secondary-task) phone conversation on (primary-task) continuous arithmetic performance, asking whether phone use makes components of automatic and controlled information-processing (i.e., easy vs. hard mental arithmetic) run more slowly, or alternatively, makes processing run less reliably albeit with the same processing speed. The results can be summarized as follows: While neither expecting a text message nor expecting an impending phone call had any detrimental effects on performance, active phone conversation was clearly detrimental to primary-task performance. Crucially, the decrement imposed by the secondary task (conversation) was not due to a constant slowdown but is better characterized by an occasional breakdown of information processing, which differentially affected automatic and controlled components of primary-task processing. In conclusion, these findings support the notion that phone conversation makes individuals not constantly slower but more vulnerable to committing attention failures, and in this way, hampers stability of (primary-task) information processing. PMID:28634458

  7. Phone Conversation while Processing Information: Chronometric Analysis of Load Effects in Everyday-media Multitasking.

    PubMed

    Steinborn, Michael B; Huestegge, Lynn

    2017-01-01

    This is a pilot study that examined the effect of cell-phone conversation on cognition using a continuous multitasking paradigm. Current theorizing argues that phone conversation affects behavior (e.g., driving) by interfering at a level of cognitive processes (not peripheral activity) and by implying an attentional-failure account. Within the framework of an intermittent spare-utilized capacity threading model, we examined the effect of aspects of (secondary-task) phone conversation on (primary-task) continuous arithmetic performance, asking whether phone use makes components of automatic and controlled information-processing (i.e., easy vs. hard mental arithmetic) run more slowly, or alternatively, makes processing run less reliably albeit with the same processing speed. The results can be summarized as follows: While neither expecting a text message nor expecting an impending phone call had any detrimental effects on performance, active phone conversation was clearly detrimental to primary-task performance. Crucially, the decrement imposed by the secondary task (conversation) was not due to a constant slowdown but is better characterized by an occasional breakdown of information processing, which differentially affected automatic and controlled components of primary-task processing. In conclusion, these findings support the notion that phone conversation makes individuals not constantly slower but more vulnerable to committing attention failures, and in this way, hampers stability of (primary-task) information processing.

  8. The War Crimes Act: Current Issues

    DTIC Science & Technology

    2006-10-02

    divided into three overlapping categories: (1) defense of entrapment by estoppel, available when a defendant is informed by a government official that...defenses, the defense of entrapment by estoppel stems from the due process notions of fairness, rather than from common law concerning contract, equity

  9. 38 CFR 3.103 - Procedural due process and appellate rights.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... VETERANS AFFAIRS ADJUDICATION Pension, Compensation, and Dependency and Indemnity Compensation...) of this section, no award of compensation, pension or dependency and indemnity compensation shall be... on factual and unambiguous information or statements as to income, net worth, or dependency or...

  10. 38 CFR 3.103 - Procedural due process and appellate rights.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... VETERANS AFFAIRS ADJUDICATION Pension, Compensation, and Dependency and Indemnity Compensation...) of this section, no award of compensation, pension or dependency and indemnity compensation shall be... on factual and unambiguous information or statements as to income, net worth, or dependency or...

  11. 38 CFR 3.103 - Procedural due process and appellate rights.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... VETERANS AFFAIRS ADJUDICATION Pension, Compensation, and Dependency and Indemnity Compensation...) of this section, no award of compensation, pension or dependency and indemnity compensation shall be... on factual and unambiguous information or statements as to income, net worth, or dependency or...

  12. Marine Emissions and Atmospheric Processing Influence Aerosol Mixing States in the Bering Strait and Chukchi Sea

    NASA Astrophysics Data System (ADS)

    Kirpes, R.; Rodriguez, B.; Kim, S.; Park, K.; China, S.; Laskin, A.; Pratt, K.

    2017-12-01

    The Arctic region is rapidly changing due to sea ice loss and increasing oil/gas development and shipping activity. These changes influence aerosol sources and composition, resulting in complex aerosol-cloud-climate feedbacks. Atmospheric particles were collected aboard the R/V Araon in July-August 2016 in the Alaskan Arctic along the Bering Strait and Chukchi Sea. Offline analysis of individual particles by microscopic and spectroscopic techniques provided information on particle size, morphology, and chemical composition. Sea spray aerosol (SSA) and organic aerosol (OA) particles were the most commonly observed particle types, and sulfate was internally mixed with both SSA and OA. Evidence of multiphase sea spray aerosol reactions was observed, with varying degrees of chlorine depletion observed along the cruise. Notably, atmospherically processed SSA, completely depleted in chlorine, and internally mixed organic and sulfate particles, were observed in samples influenced by the central Arctic Ocean. Changes in particle composition due to fog processing were also investigated. Due to the changing aerosol sources and atmospheric processes in the Arctic region, it is crucial to understand aerosol composition in order to predict climate impacts.

  13. Superior memory efficiency of quantum devices for the simulation of continuous-time stochastic processes

    NASA Astrophysics Data System (ADS)

    Elliott, Thomas J.; Gu, Mile

    2018-03-01

    Continuous-time stochastic processes pervade everyday experience, and the simulation of models of these processes is of great utility. Classical models of systems operating in continuous-time must typically track an unbounded amount of information about past behaviour, even for relatively simple models, enforcing limits on precision due to the finite memory of the machine. However, quantum machines can require less information about the past than even their optimal classical counterparts to simulate the future of discrete-time processes, and we demonstrate that this advantage extends to the continuous-time regime. Moreover, we show that this reduction in the memory requirement can be unboundedly large, allowing for arbitrary precision even with a finite quantum memory. We provide a systematic method for finding superior quantum constructions, and a protocol for analogue simulation of continuous-time renewal processes with a quantum machine.
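
    The classical memory burden described here can be made concrete with a toy simulation: to step a renewal process forward, a classical simulator must track the time elapsed since the last event, and the number of distinguishable 'age' states grows as the temporal resolution is refined. The waiting-time distribution, resolution values and function names below are illustrative assumptions; the quantum construction itself is not reproduced.

```python
import random

def simulate_renewal(total_time, shape=2.0, scale=1.0, dt=0.01, seed=0):
    """Simulate a renewal process with gamma(shape, scale) waiting times and
    count how many distinct discretised 'time since last event' states a
    classical simulator would have to distinguish at resolution dt."""
    rng = random.Random(seed)
    t, last_event = 0.0, 0.0
    ages = set()
    next_wait = rng.gammavariate(shape, scale)
    while t < total_time:
        if t >= last_event + next_wait:          # an emission occurs
            last_event = t
            next_wait = rng.gammavariate(shape, scale)
        ages.add(round((t - last_event) / dt))   # discretised age of the current interval
        t += dt
    return len(ages)

# Finer resolution -> more memory states needed by the classical simulator.
for dt in (0.1, 0.01, 0.001):
    print(dt, simulate_renewal(200.0, dt=dt))
```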

  14. Thermal sensors to control polymer forming. Challenge and solutions

    NASA Astrophysics Data System (ADS)

    Lemeunier, F.; Boyard, N.; Sarda, A.; Plot, C.; Lefèvre, N.; Petit, I.; Colomines, G.; Allanic, N.; Bailleul, J. L.

    2017-10-01

    Thermal sensors have been used for many years to better understand and control material forming processes, especially polymer processing. Due to technical constraints (high pressure, sealing, sensor dimensions…) the thermal measurement is often performed in the tool or close to its surface; thus, it only gives partial and disturbed information. Having reliable information about the heat flux exchanged between the tool and the material during the process would be very helpful to improve the control of the process and to favor the development of new materials. In this work, we present several sensors developed in labs to study the molding steps in forming processes. The analysis of the obtained thermal measurements (temperature, heat flux) shows the sensitivity threshold thermal sensors must reach to detect the rate of thermal reaction on-line. Based on these data, we will present new sensor designs which have been patented.

  15. Cause and effect: the linkage between the health information seeking behavior and the online environment--a review.

    PubMed

    Bratucu, R; Gheorghe, I R; Purcarea, R M; Gheorghe, C M; Popa Velea, O; Purcarea, V L

    2014-09-15

    Today, health care consumers are taking more control over their health care problems, investing more time in finding and getting information as well as looking for proper methods in order to investigate more closely the health care information received from their physicians. Unfortunately, in health care consumers' views, the trustworthiness of health authorities and institutions has declined in recent years. So, consumers have found a new solution to their health problems, that is, the Internet. Recently, studies revealed that consumers seeking health information have more options to look for data in comparison to the methods used a few years ago. Therefore, due to the available technology, consumers have more outlets to search for information. For instance, the Internet is a source that has revolutionized the way consumers seek data due to its customized methods of assessing both quantitative and qualitative information which may be achieved with minimal effort and low costs, offering, at the same time, several advantages such as making the decision process more efficient.

  16. Cause and effect: the linkage between the health information seeking behavior and the online environment- a review

    PubMed Central

    Bratucu, R; Gheorghe, IR; Purcarea, RM; Gheorghe, CM; Popa Velea, O; Purcarea, VL

    2014-01-01

    Today, health care consumers are taking more control over their health care problems, investing more time in finding and getting information as well as looking for proper methods in order to investigate more closely the health care information received from their physicians. Unfortunately, in health care consumers’ views, the trustworthiness of health authorities and institutions has declined in recent years. So, consumers have found a new solution to their health problems, that is, the Internet. Recently, studies revealed that consumers seeking health information have more options to look for data in comparison to the methods used a few years ago. Therefore, due to the available technology, consumers have more outlets to search for information. For instance, the Internet is a source that has revolutionized the way consumers seek data due to its customized methods of assessing both quantitative and qualitative information which may be achieved with minimal effort and low costs, offering, at the same time, several advantages such as making the decision process more efficient. PMID:25408746

  17. How visual timing and form information affect speech and non-speech processing.

    PubMed

    Kim, Jeesun; Davis, Chris

    2014-10-01

    Auditory speech processing is facilitated when the talker's face/head movements are seen. This effect is typically explained in terms of visual speech providing form and/or timing information. We determined the effect of both types of information on a speech/non-speech task (non-speech stimuli were spectrally rotated speech). All stimuli were presented paired with the talker's static or moving face. Two types of moving face stimuli were used: full-face versions (both spoken form and timing information available) and modified face versions (only timing information provided by peri-oral motion available). The results showed that the peri-oral timing information facilitated response time for speech and non-speech stimuli compared to a static face. An additional facilitatory effect was found for full-face versions compared to the timing condition; this effect only occurred for speech stimuli. We propose the timing effect was due to cross-modal phase resetting; the form effect to cross-modal priming. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. Base Stock Policy in a Join-Type Production Line with Advanced Demand Information

    NASA Astrophysics Data System (ADS)

    Hiraiwa, Mikihiko; Tsubouchi, Satoshi; Nakade, Koichi

    Production control policies such as the base stock policy, the kanban policy and the constant work-in-process policy in a serial production line have been studied by many researchers. Real production lines, however, usually have fork-type, join-type or network-type configurations. In addition, most previous studies on production control assume that a finished product is required at the same time as demand arrives at the system, whereas in practice demand information is announced before the due date. In this paper a join-type (assembly) production line under base stock control with advanced demand information in discrete time is analyzed. The recursive equations for the work-in-process are derived. A heuristic algorithm for quickly finding appropriate base stock levels for all machines is proposed, and the effect of advanced demand information is examined by simulation with the proposed algorithm. It is shown that the inventory cost can decrease with few backlogs by using an appropriate amount of demand information and setting appropriate base stock levels.
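
    A minimal discrete-time sketch of a base stock policy with advanced demand information is given below, assuming a single stage rather than the join-type line of the paper: the demand of period t becomes known adi periods early, a matching replenishment order is released immediately, and average holding and backlog are tallied. All parameters are illustrative, and this is not the authors' model or heuristic.

```python
import random

def simulate_base_stock(periods=500, base_stock=4, lead_time=2, adi=1, seed=0):
    """Toy single-stage base-stock simulation with advanced demand information:
    the demand of period t is announced adi periods early and a matching order
    is released on announcement, finishing lead_time periods later."""
    rng = random.Random(seed)
    demand = [rng.randint(0, 3) for _ in range(periods)]

    arrivals = [0] * (periods + lead_time + 1)
    for t in range(periods):
        arrivals[max(0, t - adi) + lead_time] += demand[t]   # release on announcement

    on_hand, backlog = base_stock, 0
    holding = backorders = 0
    for t in range(periods):
        on_hand += arrivals[t]               # replenishment arriving now
        need = demand[t] + backlog           # demand due now plus old backlog
        shipped = min(need, on_hand)
        on_hand -= shipped
        backlog = need - shipped
        holding += on_hand
        backorders += backlog
    return holding / periods, backorders / periods

if __name__ == "__main__":
    # More advance information tends to trade backlog against holding stock.
    for adi in (0, 1, 2, 3):
        print("ADI =", adi, "->", simulate_base_stock(adi=adi))
```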

  19. Working memory load eliminates the survival processing effect.

    PubMed

    Kroneisen, Meike; Rummel, Jan; Erdfelder, Edgar

    2014-01-01

    In a series of experiments, Nairne, Thompson, and Pandeirada (2007) demonstrated that words judged for their relevance to a survival scenario are remembered better than words judged for a scenario not relevant on a survival dimension. They explained this survival-processing effect by arguing that nature "tuned" our memory systems to process and remember fitness-relevant information. Kroneisen and Erdfelder (2011) proposed that it may not be survival processing per se that facilitates recall but the richness and distinctiveness with which information is encoded. To further test this account, we investigated how the survival processing effect is affected by cognitive load. If the survival processing effect is due to automatic processes or, alternatively, if survival processing is routinely prioritized in dual-task contexts, we would expect this effect to persist under cognitive load conditions. If the effect relies on cognitively demanding processes like richness and distinctiveness of encoding, however, the survival processing benefit should be hampered by increased cognitive load during encoding. Results were in line with the latter prediction, that is, the survival processing effect vanished under dual-task conditions.

  20. Preparing routine health information systems for immediate health responses to disasters

    PubMed Central

    Aung, Eindra; Whittaker, Maxine

    2013-01-01

    During disaster times, we need specific information to rapidly plan a disaster response, especially in sudden-onset disasters. Due to the inadequate capacity of Routine Health Information Systems (RHIS), many developing countries face a lack of quality pre-disaster health-related data and efficient post-disaster data processes in the immediate aftermath of a disaster. Considering the significance of local capacity during the early stages of disaster response, RHIS at local, provincial/state and national levels need to be strengthened so that they provide relief personnel up-to-date information to plan, organize and monitor immediate relief activities. RHIS professionals should be aware of specific information needs in disaster response (according to the Sphere Project’s Humanitarian Minimum Standards) and requirements in data processes to fulfil those information needs. Preparing RHIS for disasters can be guided by key RHIS-strengthening frameworks; and disaster preparedness must be incorporated into countries’ RHIS. Mechanisms must be established in non-disaster times and maintained between RHIS and information systems of non-health sectors for exchanging disaster-related information and sharing technologies and cost. PMID:23002249

  1. Enhanced and diminished visuo-spatial information processing in autism depends on stimulus complexity.

    PubMed

    Bertone, Armando; Mottron, Laurent; Jelenic, Patricia; Faubert, Jocelyn

    2005-10-01

    Visuo-perceptual processing in autism is characterized by intact or enhanced performance on static spatial tasks and inferior performance on dynamic tasks, suggesting a deficit of dorsal visual stream processing in autism. However, previous findings by Bertone et al. indicate that neuro-integrative mechanisms used to detect complex motion, rather than motion perception per se, may be impaired in autism. We present here the first demonstration of concurrent enhanced and decreased performance in autism on the same visuo-spatial static task, wherein the only factor dichotomizing performance was the neural complexity required to discriminate grating orientation. The ability of persons with autism was found to be superior for identifying the orientation of simple, luminance-defined (or first-order) gratings but inferior for complex, texture-defined (or second-order) gratings. Using a flicker contrast sensitivity task, we demonstrated that this finding is probably not due to abnormal information processing at a sub-cortical level (magnocellular and parvocellular functioning). Together, these findings are interpreted as a clear indication of altered low-level perceptual information processing in autism, and confirm that the deficits and assets observed in autistic visual perception are contingent on the complexity of the neural network required to process a given type of visual stimulus. We suggest that atypical neural connectivity, resulting in enhanced lateral inhibition, may account for both enhanced and decreased low-level information processing in autism.

  2. Top-down modulation of visual and auditory cortical processing in aging.

    PubMed

    Guerreiro, Maria J S; Eck, Judith; Moerel, Michelle; Evers, Elisabeth A T; Van Gerven, Pascal W M

    2015-02-01

    Age-related cognitive decline has been accounted for by an age-related deficit in top-down attentional modulation of sensory cortical processing. In light of recent behavioral findings showing that age-related differences in selective attention are modality dependent, our goal was to investigate the role of sensory modality in age-related differences in top-down modulation of sensory cortical processing. This question was addressed by testing younger and older individuals in several memory tasks while undergoing fMRI. Throughout these tasks, perceptual features were kept constant while attentional instructions were varied, allowing us to devise all combinations of relevant and irrelevant, visual and auditory information. We found no top-down modulation of auditory sensory cortical processing in either age group. In contrast, we found top-down modulation of visual cortical processing in both age groups, and this effect did not differ between age groups. That is, older adults enhanced cortical processing of relevant visual information and suppressed cortical processing of visual distractors during auditory attention to the same extent as younger adults. The present results indicate that older adults are capable of suppressing irrelevant visual information in the context of cross-modal auditory attention, and thereby challenge the view that age-related attentional and cognitive decline is due to a general deficit in the ability to suppress irrelevant information. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Use of evidence in a categorization task: analytic and holistic processing modes.

    PubMed

    Greco, Alberto; Moretti, Stefania

    2017-11-01

    Category learning performance can be influenced by many contextual factors, but the effects of these factors are not the same for all learners. The present study suggests that these differences can be due to the different ways evidence is used, according to two basic modalities of processing information, analytically or holistically. In order to test the impact of the information provided, an inductive rule-based task was designed, in which feature salience and comparison informativeness between examples of two categories were manipulated during the learning phases, by introducing and progressively reducing some perceptual biases. To gather data on processing modalities, we devised the Active Feature Composition task, a production task that does not require classifying new items but reproducing them by combining features. At the end, an explicit rating task was performed, which entailed assessing the accuracy of a set of possible categorization rules. A combined analysis of the data collected with these two different tests enabled profiling participants in regard to the kind of processing modality, the structure of representations and the quality of categorial judgments. Results showed that despite the fact that the information provided was the same for all participants, those who adopted analytic processing better exploited evidence and performed more accurately, whereas with holistic processing categorization is perfectly possible but inaccurate. Finally, the cognitive implications of the proposed procedure, with regard to involved processes and representations, are discussed.

  4. The War Crimes Act: Current Issues

    DTIC Science & Technology

    2009-01-22

    defense of entrapment by estoppel, available when a defendant is informed by a government official that certain conduct is legal, and thereafter...entrapment by estoppel stems from the due process notions of fairness, rather than from common law concerning contract, equity, or agency. United States v

  5. Quantum Information Processing with Large Nuclear Spins in GaAs Semiconductors

    NASA Astrophysics Data System (ADS)

    Leuenberger, Michael N.; Loss, Daniel; Poggio, M.; Awschalom, D. D.

    2003-03-01

    We propose an implementation for quantum information processing based on coherent manipulations of nuclear spins I=3/2 in GaAs semiconductors. We describe theoretically an NMR method which involves multiphoton transitions and which exploits the nonequidistance of nuclear spin levels due to quadrupolar splittings. Starting from known spin anisotropies we derive effective Hamiltonians in a generalized rotating frame, valid for arbitrary I, which allow us to describe the nonperturbative time evolution of spin states generated by magnetic rf fields. We identify an experimentally observable regime for multiphoton Rabi oscillations. In the nonlinear regime, we find Berry phase interference. Ref: PRL 89, 207601 (2002).

  6. Improving Air Force Imagery Reconnaissance Support to Ground Commanders.

    DTIC Science & Technology

    1983-06-03

    reconnaissance support in Southeast Asia due to the long response times of film recovery and processing capabilities and inadequate command and control...reconnaissance is an integral part of the C3I information explosion. Traditional silver halide film products, chemically processed and manually distributed, are being replaced with electronic near-real-time (NRT) imaging sensors. The term "imagery" now includes not only conventional film-based products (black

  7. Estimation of Fine and Oversize Particle Ratio in a Heterogeneous Compound with Acoustic Emissions.

    PubMed

    Nsugbe, Ejay; Ruiz-Carcel, Cristobal; Starr, Andrew; Jennions, Ian

    2018-03-13

    The final phase of powder production typically involves a mixing process where all of the particles are combined and agglomerated with a binder to form a single compound. The traditional means of inspecting the physical properties of the final product involves an inspection of the particle sizes using an offline sieving and weighing process. The main downside of this technique, in addition to being an offline-only measurement procedure, is its inability to characterise large agglomerates of powders due to sieve blockage. This work assesses the feasibility of a real-time monitoring approach using a benchtop test rig and a prototype acoustic-based measurement system to provide information that can be correlated to product quality and provide the opportunity for future process optimisation. Acoustic emission (AE) was chosen as the sensing method due to its low cost, simple setup process, and ease of implementation. The performance of the proposed method was assessed in a series of experiments where the offline quality check results were compared to the AE-based real-time estimations using data acquired from a benchtop powder free flow rig. A purpose-designed time-domain signal processing method was used to extract particle size information from the acquired AE signal, and the results show that this technique is capable of estimating the required ratio in the washing powder compound with an average absolute error of 6%.
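
    The paper describes a bespoke time-domain signal processing method; the sketch below shows the kind of simple time-domain AE features (RMS energy, peak, threshold-crossing counts) that could be extracted from a raw trace and later regressed against the sieve-measured fine/oversize ratio. The feature set, threshold factor and synthetic signal are assumptions for illustration, not the published method.

```python
import numpy as np

def ae_time_features(signal, fs, threshold_factor=3.0):
    """Extract simple time-domain features from an acoustic emission trace.
    Illustrative only: the feature choices here are assumptions, not the
    authors' exact processing chain."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                       # remove DC offset
    rms = np.sqrt(np.mean(x ** 2))         # overall AE energy
    peak = np.max(np.abs(x))
    threshold = threshold_factor * rms
    above = np.abs(x) > threshold
    # "counts": rising crossings of the threshold, a classic AE statistic
    counts = int(np.sum(above[1:] & ~above[:-1]))
    return {"rms": rms, "peak": peak, "crest_factor": peak / rms,
            "counts_per_s": counts * fs / x.size}

# Example with a synthetic burst-like signal
fs = 100_000
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)
signal = rng.normal(0, 0.01, t.size)
signal[::5000] += 1.0                      # sparse impact-like bursts
print(ae_time_features(signal, fs))
```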

  8. China’s Comprehensive Approach: Refining the U.S. Targeting Process to Inform U.S. Strategy

    DTIC Science & Technology

    2018-04-20

    control demonstrated by China, the subject matter expertise required to generate a comprehensive approach like China’s does exist. However, due to a vast...

  9. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems.

    PubMed

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-11-30

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to get information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can capture time information about the process in an unobtrusive way, freeing nurses, allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015.
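
    A minimal example of the process-mining step involved is sketched below: timestamped location events are grouped into per-patient traces and reduced to a directly-follows graph, one of the simplest process-discovery artefacts. The event fields and area names are hypothetical; the tool described in the paper goes well beyond this.

```python
from collections import Counter

# Hypothetical indoor-location events: (case id, area, timestamp in minutes)
events = [
    ("patient-1", "admission", 0), ("patient-1", "pre-op", 20),
    ("patient-1", "operating-room", 55), ("patient-1", "recovery", 140),
    ("patient-2", "admission", 5), ("patient-2", "pre-op", 30),
    ("patient-2", "recovery", 90),
]

def directly_follows(events):
    """Build a directly-follows graph from timestamped location events
    grouped per case (patient)."""
    traces = {}
    for case, area, ts in sorted(events, key=lambda e: (e[0], e[2])):
        traces.setdefault(case, []).append(area)
    graph = Counter()
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):
            graph[(a, b)] += 1
    return graph

for (a, b), n in directly_follows(events).items():
    print(f"{a} -> {b}: {n}")
```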

  10. Process Mining Methodology for Health Process Tracking Using Real-Time Indoor Location Systems

    PubMed Central

    Fernandez-Llatas, Carlos; Lizondo, Aroa; Monton, Eduardo; Benedi, Jose-Miguel; Traver, Vicente

    2015-01-01

    The definition of efficient and accurate health processes in hospitals is crucial for ensuring an adequate quality of service. Knowing and improving the behavior of the surgical processes in a hospital can improve the number of patients that can be operated on using the same resources. However, the measurement of this process is usually made in an obtrusive way, forcing nurses to get information and time data, affecting the proper process and generating inaccurate data due to human errors during the stressful journey of health staff in the operating theater. The use of indoor location systems can capture time information about the process in an unobtrusive way, freeing nurses, allowing them to engage in purely welfare work. However, it is necessary to present these data in an understandable way for health professionals, who cannot deal with large amounts of historical localization log data. The use of process mining techniques can deal with this problem, offering an easily understandable view of the process. In this paper, we present a tool and a process mining-based methodology that, using indoor location systems, enables health staff not only to represent the process, but to know precise information about the deployment of the process in an unobtrusive and transparent way. We have successfully tested this tool in a real surgical area with 3613 patients during February, March and April of 2015. PMID:26633395

  11. Similarity-based interference in a working memory numerical updating task: age-related differences between younger and older adults.

    PubMed

    Pelegrina, Santiago; Borella, Erika; Carretti, Barbara; Lechuga, M Teresa

    2012-01-01

    Similarity among representations held simultaneously in working memory (WM) is a factor which increases interference and hinders performance. The aim of the current study was to investigate age-related differences between younger and older adults in a working memory numerical updating task, in which the similarity between information held in WM was manipulated. Results showed a higher susceptibility of older adults to similarity-based interference when accuracy, and not response times, was considered. It was concluded that older adults' WM difficulties appear to be due to the availability of stored information, which, in turn, might be related to the ability to generate distinctive representations and to the process of binding such representations to their context when similar information has to be processed in WM.

  12. Don't Discount Societal Value in Cost-Effectiveness Comment on "Priority Setting for Universal Health Coverage: We Need Evidence-Informed Deliberative Processes, Not Just More Evidence on Cost-Effectiveness".

    PubMed

    Hall, William

    2017-01-14

    As healthcare resources become increasingly scarce due to growing demand and stagnating budgets, the need for effective priority setting and resource allocation will become ever more critical to providing sustainable care to patients. While societal values should certainly play a part in guiding these processes, the methodology used to capture these values need not necessarily be limited to multi-criterion decision analysis (MCDA)-based processes including 'evidence-informed deliberative processes.' However, if decision-makers intend not only to incorporate the values of the public they serve into decisions but also to have those decisions enacted, consideration should be given to more direct involvement of stakeholders. Based on the examples provided by Baltussen et al, MCDA-based processes like 'evidence-informed deliberative processes' could be one way of achieving this laudable goal. © 2017 The Author(s); Published by Kerman University of Medical Sciences. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

  13. Engineering workstation: Sensor modeling

    NASA Technical Reports Server (NTRS)

    Pavel, M; Sweet, B.

    1993-01-01

    The purpose of the engineering workstation is to provide an environment for rapid prototyping and evaluation of fusion and image processing algorithms. Ideally, the algorithms are designed to optimize the extraction of information that is useful to a pilot for all phases of flight operations. Successful design of effective fusion algorithms depends on the ability to characterize both the information available from the sensors and the information useful to a pilot. The workstation is comprised of subsystems for simulation of sensor-generated images, image processing, image enhancement, and fusion algorithms. As such, the workstation can be used to implement and evaluate both short-term solutions and long-term solutions. The short-term solutions are being developed to enhance a pilot's situational awareness by providing information in addition to his direct vision. The long term solutions are aimed at the development of complete synthetic vision systems. One of the important functions of the engineering workstation is to simulate the images that would be generated by the sensors. The simulation system is designed to use the graphics modeling and rendering capabilities of various workstations manufactured by Silicon Graphics Inc. The workstation simulates various aspects of the sensor-generated images arising from phenomenology of the sensors. In addition, the workstation can be used to simulate a variety of impairments due to mechanical limitations of the sensor placement and due to the motion of the airplane. Although the simulation is currently not performed in real-time, sequences of individual frames can be processed, stored, and recorded in a video format. In that way, it is possible to examine the appearance of different dynamic sensor-generated and fused images.
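
    As a toy stand-in for the fusion algorithms such a workstation is meant to prototype, the sketch below fuses two co-registered synthetic sensor images by pixel-wise weighted averaging. This is purely illustrative and is not the workstation's actual rendering or fusion pipeline; the image sizes and weights are arbitrary assumptions.

```python
import numpy as np

def weighted_fusion(img_a, img_b, w_a=0.5):
    """Pixel-wise weighted-average fusion of two co-registered, equally sized
    sensor images (values assumed in [0, 1]). One of the simplest fusion rules
    a prototyping environment might evaluate; illustrative only."""
    a = np.clip(np.asarray(img_a, dtype=float), 0.0, 1.0)
    b = np.clip(np.asarray(img_b, dtype=float), 0.0, 1.0)
    if a.shape != b.shape:
        raise ValueError("sensor images must be co-registered and equally sized")
    return w_a * a + (1.0 - w_a) * b

# Synthetic example: a 'visible' image and a noisier 'infrared' image
rng = np.random.default_rng(0)
visible = rng.random((64, 64))
infrared = np.clip(visible + rng.normal(0, 0.2, (64, 64)), 0, 1)
fused = weighted_fusion(visible, infrared, w_a=0.7)
print(fused.shape, float(fused.min()), float(fused.max()))
```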

  14. Preparation for Speeded Action as a Psychophysiological Concept

    ERIC Educational Resources Information Center

    Jennings, J. Richard; van der Molen, Maurits W.

    2005-01-01

    Mental preparation aids performance and induces multiple physiological changes that should inform concepts of preparation. To date, however, these changes have been interpreted as being due to a global preparatory process (e.g., attention or alertness). The authors review psychophysiological and performance investigations of preparation. Concepts…

  15. Proteogenomics | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    Proteogenomics, or the integration of proteomics with genomics and transcriptomics, is an emerging approach that promises to advance basic, translational and clinical research.  By combining genomic and proteomic information, leading scientists are gaining new insights due to a more complete and unified understanding of complex biological processes.

  16. NPY2-receptor variation modulates iconic memory processes.

    PubMed

    Arning, Larissa; Stock, Ann-Kathrin; Kloster, Eugen; Epplen, Jörg T; Beste, Christian

    2014-08-01

    Sensory memory systems are modality-specific buffers that comprise information about external stimuli, which represent the earliest stage of information processing. While these systems have been the subject of cognitive neuroscience research for decades, little is known about the neurobiological basis of sensory memory. However, accumulating evidence suggests that the glutamatergic system and systems influencing glutamatergic neural transmission are important. In the current study we examined whether functional promoter variations in neuropeptide Y (NPY) and its receptor gene NPY2R affect iconic memory processes, using a partial report paradigm. We found that iconic memory decayed much faster in individuals carrying the rare NPY2R promoter G allele, which is associated with increased expression of the Y2 receptor. Possibly, this effect is due to altered presynaptic inhibition of glutamate release, known to be modulated by Y2 receptors. Altogether, our results provide evidence that the functionally relevant single nucleotide polymorphism (SNP) in the NPY2R promoter affects circumscribed processes of early sensory processing, i.e. only the stability of information in sensory memory buffers. This leads us to suggest that especially the stability of information in sensory memory buffers depends on glutamatergic neural transmission and factors modulating glutamatergic turnover. Copyright © 2014 Elsevier B.V. and ECNP. All rights reserved.

  17. Neurocognitive mechanisms underlying deceptive hazard evaluation: An event-related potentials investigation.

    PubMed

    Fu, Huijian; Qiu, Wenwei; Ma, Haiying; Ma, Qingguo

    2017-01-01

    Deceptive behavior is common in human social interactions. Researchers have been trying to uncover the cognitive process and neural basis underlying deception due to its theoretical and practical significance. We used Event-related potentials (ERPs) to investigate the neural correlates of deception when the participants completed a hazard judgment task. Pictures conveying or not conveying hazard information were presented to the participants who were then requested to discriminate the hazard content (safe or hazardous) and make a response corresponding to the cues (truthful or deceptive). Behavioral and electrophysiological data were recorded during the entire experiment. Results showed that deceptive responses, compared to truthful responses, were associated with longer reaction time (RT), lower accuracy, increased N2 and reduced late positive potential (LPP), suggesting a cognitively more demanding process to respond deceptively. The decrement in LPP correlated negatively with the increment in RT for deceptive relative to truthful responses, regardless of hazard content. In addition, hazardous information evoked larger N1 and P300 than safe information, reflecting an early processing bias and a later evaluative categorization process based on motivational significance, respectively. Finally, the interaction between honesty (truthful/deceptive) and safety (safe/hazardous) on accuracy and LPP indicated that deceptive responses towards safe information required more effort than deceptive responses towards hazardous information. Overall, these results demonstrate the neurocognitive substrates underlying deception about hazard information.

  18. Neurocognitive mechanisms underlying deceptive hazard evaluation: An event-related potentials investigation

    PubMed Central

    Qiu, Wenwei; Ma, Haiying; Ma, Qingguo

    2017-01-01

    Deceptive behavior is common in human social interactions. Researchers have been trying to uncover the cognitive process and neural basis underlying deception due to its theoretical and practical significance. We used Event-related potentials (ERPs) to investigate the neural correlates of deception when the participants completed a hazard judgment task. Pictures conveying or not conveying hazard information were presented to the participants who were then requested to discriminate the hazard content (safe or hazardous) and make a response corresponding to the cues (truthful or deceptive). Behavioral and electrophysiological data were recorded during the entire experiment. Results showed that deceptive responses, compared to truthful responses, were associated with longer reaction time (RT), lower accuracy, increased N2 and reduced late positive potential (LPP), suggesting a cognitively more demanding process to respond deceptively. The decrement in LPP correlated negatively with the increment in RT for deceptive relative to truthful responses, regardless of hazard content. In addition, hazardous information evoked larger N1 and P300 than safe information, reflecting an early processing bias and a later evaluative categorization process based on motivational significance, respectively. Finally, the interaction between honesty (truthful/deceptive) and safety (safe/hazardous) on accuracy and LPP indicated that deceptive responses towards safe information required more effort than deceptive responses towards hazardous information. Overall, these results demonstrate the neurocognitive substrates underlying deception about hazard information. PMID:28793344

  19. Self-Referential Information Alleviates Retrieval Inhibition of Directed Forgetting Effects-An ERP Evidence of Source Memory.

    PubMed

    Mao, Xinrui; Wang, Yujuan; Wu, Yanhong; Guo, Chunyan

    2017-01-01

    Directed forgetting (DF) assists in preventing outdated information from interfering with cognitive processing. Previous studies indicated that self-referential items alleviated DF effects due to the elaboration of encoding processes. However, the retrieval mechanism of this phenomenon remains unknown. Based on the dual-process framework of recognition, the retrieval of self-referential information involves both familiarity and recollection. Using source memory tasks combined with event-related potential (ERP) recording, our research investigated the retrieval processes underlying the alleviation of DF effects elicited by self-referential information. The FN400 (frontal negativity at 400 ms) is a frontal potential at 300-500 ms related to familiarity, and the late positive complex (LPC) is a later parietal potential at 500-800 ms related to recollection. The FN400 effects of source memory suggested that familiarity processes were promoted by self-referential effects without modulation by the to-be-forgotten (TBF) instruction. The ERP results for DF effects involved the LPCs of source memory, which indexed the retrieval processing of recollection. Other-referential source memory under the TBF instruction showed no LPC effects, while self-referential source memory under the TBF instruction still elicited significant LPC effects. Therefore, our neural findings suggested that self-referential processing improved both familiarity and recollection. Furthermore, the self-referential processing advantage, which arises from autobiographical retrieval, alleviated the retrieval inhibition of DF, supporting the conclusion that self-referential source memory alleviates DF effects.

  20. Apply creative thinking of decision support in electrical nursing record.

    PubMed

    Hao, Angelica Te-Hui; Hsu, Chien-Yeh; Li-Fang, Huang; Jian, Wen-Shan; Wu, Li-Bin; Kao, Ching-Chiu; Lu, Mei-Show; Chang, Her-Kung

    2006-01-01

    The nursing process consists of five interrelated steps: assessment, diagnosis, planning, intervention, and evaluation. In the nursing process, the nurse collects a great deal of data and information. The amount of data and information may exceed the amount the nurse can process efficiently and correctly. Thus, the nurse needs assistance to become proficient in the planning of nursing care, due to the difficulty of simultaneously processing a large set of information. Computer systems are viewed as tools to expand the capabilities of the nurse's mind. Using computer technology to support clinicians' decision making may provide high-quality, patient-centered, and efficient healthcare. Although some existing nursing information systems aid in the nursing process, they only provide the most fundamental decision support--i.e., standard care plans associated with common nursing diagnoses. Such a computerized decision support system helps the nurse develop a care plan step-by-step, but it does not assist the nurse in the decision-making process. The decision process about how to generate nursing diagnoses from data and how to individualize the care plans still remains with the nurse. The purpose of this study is to develop a pilot structure for an electronic nursing record system, integrated with international nursing standards, to improve the proficiency and accuracy of care planning in the clinical pathway process. The proposed pilot system assists not only student nurses and nurses who are novices in nursing practice, but also experts who need to work in a practice area with which they are not familiar.

  1. Self-Referential Information Alleviates Retrieval Inhibition of Directed Forgetting Effects—An ERP Evidence of Source Memory

    PubMed Central

    Mao, Xinrui; Wang, Yujuan; Wu, Yanhong; Guo, Chunyan

    2017-01-01

    Directed forgetting (DF) assists in preventing outdated information from interfering with cognitive processing. Previous studies indicated that self-referential items alleviated DF effects due to the elaboration of encoding processes. However, the retrieval mechanism of this phenomenon remains unknown. Based on the dual-process framework of recognition, the retrieval of self-referential information involves both familiarity and recollection. Using source memory tasks combined with event-related potential (ERP) recording, our research investigated the retrieval processes underlying the alleviation of DF effects elicited by self-referential information. The FN400 (frontal negativity at 400 ms) is a frontal potential at 300-500 ms related to familiarity, and the late positive complex (LPC) is a later parietal potential at 500-800 ms related to recollection. The FN400 effects of source memory suggested that familiarity processes were promoted by self-referential effects without modulation by the to-be-forgotten (TBF) instruction. The ERP results for DF effects involved the LPCs of source memory, which indexed the retrieval processing of recollection. Other-referential source memory under the TBF instruction showed no LPC effects, while self-referential source memory under the TBF instruction still elicited significant LPC effects. Therefore, our neural findings suggested that self-referential processing improved both familiarity and recollection. Furthermore, the self-referential processing advantage, which arises from autobiographical retrieval, alleviated the retrieval inhibition of DF, supporting the conclusion that self-referential source memory alleviates DF effects. PMID:29066962

  2. Forensic hash for multimedia information

    NASA Astrophysics Data System (ADS)

    Lu, Wenjun; Varna, Avinash L.; Wu, Min

    2010-01-01

    Digital multimedia such as images and videos are prevalent on today's internet and cause significant social impact, which can be evidenced by the proliferation of social networking sites with user generated contents. Due to the ease of generating and modifying images and videos, it is critical to establish trustworthiness for online multimedia information. In this paper, we propose novel approaches to perform multimedia forensics using compact side information to reconstruct the processing history of a document. We refer to this as FASHION, standing for Forensic hASH for informatION assurance. Based on the Radon transform and scale space theory, the proposed forensic hash is compact and can effectively estimate the parameters of geometric transforms and detect local tampering that an image may have undergone. Forensic hash is designed to answer a broader range of questions regarding the processing history of multimedia data than the simple binary decision from traditional robust image hashing, and also offers more efficient and accurate forensic analysis than multimedia forensic techniques that do not use any side information.
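
    As a rough illustration of the projection-based idea behind such a forensic hash (a generic sketch, not the authors' FASHION implementation), the snippet below computes coarse Radon-style projections of a grayscale image at a few angles, downsamples and quantizes them into a short byte signature, and compares signatures to flag geometric change or local tampering. All function names and parameters are illustrative.

```python
import numpy as np
from scipy.ndimage import rotate

def projection_hash(image, angles=(0, 45, 90, 135), bins=16):
    """Compact projection-based signature of a grayscale image (illustrative only)."""
    image = np.asarray(image, dtype=float)
    signature = []
    for angle in angles:
        rotated = rotate(image, angle, reshape=False, order=1, mode="constant")
        proj = rotated.sum(axis=0)                       # coarse Radon-style projection
        coarse = np.array([c.mean() for c in np.array_split(proj, bins)])
        spread = coarse.max() - coarse.min()             # normalize, then quantize to 8 bits
        norm = (coarse - coarse.min()) / spread if spread > 0 else np.zeros_like(coarse)
        signature.extend(np.round(norm * 255).astype(np.uint8))
    return np.array(signature, dtype=np.uint8)

def hash_distance(sig_a, sig_b):
    """Mean absolute difference between two signatures (0 means identical)."""
    return float(np.mean(np.abs(sig_a.astype(int) - sig_b.astype(int))))

img = np.random.default_rng(0).random((128, 128))
tampered = img.copy()
tampered[40:60, 40:60] = 1.0                             # simulate local tampering
print(hash_distance(projection_hash(img), projection_hash(img)))       # 0.0
print(hash_distance(projection_hash(img), projection_hash(tampered)))  # > 0
```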

  3. Learning to recognize face shapes through serial exploration.

    PubMed

    Wallraven, Christian; Whittingstall, Lisa; Bülthoff, Heinrich H

    2013-05-01

    Human observers are experts at visual face recognition due to specialized visual mechanisms for face processing that evolve with perceptual expertise. Such expertise has long been attributed to the use of configural processing, enabled by fast, parallel information encoding of the visual information in the face. Here we tested whether participants can learn to efficiently recognize faces that are serially encoded, that is, when only partial visual information about the face is available at any given time. For this, ten participants were trained in gaze-restricted face recognition in which face masks were viewed through a small aperture controlled by the participant. Tests comparing trained with untrained performance revealed (1) a marked improvement in terms of speed and accuracy, (2) a gradual development of configural processing strategies, and (3) participants' ability to rapidly learn and accurately recognize novel exemplars. This performance pattern demonstrates that participants were able to learn new strategies to compensate for the serial nature of information encoding. The results are discussed in terms of expertise acquisition and relevance for other sensory modalities relying on serial encoding.

  4. Environmental impact of mushroom compost production.

    PubMed

    Leiva, Francisco; Saenz-Díez, Juan-Carlos; Martínez, Eduardo; Jiménez, Emilio; Blanco, Julio

    2016-09-01

    This research analyses the environmental impact of the creation of Agaricus bisporus compost packages. The composting process is the intermediate stage of the mushroom production process, subsequent to the mycelium cultivation stage and prior to the fruiting bodies cultivation stage. A full life cycle assessment model of the Agaricus bisporus composting process has been developed through the identification and analysis of the inputs-outputs and energy consumption of the activities involved in the production process. The study has been developed based on data collected from a plant during a 1-year campaign, thereby obtaining accurate information used to analyse the environmental impact of the process. A global analysis of the main stages shows that compost batch preparation has the greatest impact in most categories, due to the increased consumption of energy resources by the machinery that mixes the raw materials to create the batch. Within the in-tunnel composting stage, the activity with the greatest impact in almost all categories studied is the initial stage of composting, due to its higher energy consumption compared with the other stages. © 2015 Society of Chemical Industry.

  5. FPGA implementation of sparse matrix algorithm for information retrieval

    NASA Astrophysics Data System (ADS)

    Bojanic, Slobodan; Jevtic, Ruzica; Nieto-Taladriz, Octavio

    2005-06-01

    Text-based information retrieval requires a tremendous amount of processing time because of the size of the data and the complexity of information retrieval algorithms. In this paper, a solution to this problem is proposed via hardware-supported information retrieval algorithms. Reconfigurable computing accommodates frequent hardware modifications through its tailorable hardware and exploits parallelism for a given application through reconfigurable and flexible hardware units. The degree of parallelism can be tuned to the data. In this work we implemented the standard BLAS (basic linear algebra subprogram) sparse matrix format named Compressed Sparse Row (CSR), which has been shown to be more efficient in storage space requirements and query-processing time than other sparse matrix schemes for information retrieval applications. Although the inverted index has been treated as the de facto standard for information retrieval for years, an alternative approach that stores the index of a text collection in a sparse matrix structure is gaining attention. This approach performs query processing using sparse matrix-vector multiplication and, due to parallelization, achieves substantial efficiency gains over the sequential inverted index. Parallel implementations of the information retrieval kernel are presented in this work, targeting a Virtex II Field Programmable Gate Array (FPGA) board from Xilinx. A recent development in scientific applications is the use of FPGAs to achieve high-performance results. Computational results are compared to implementations on other platforms. The design achieves a high level of parallelism for the overall function while retaining highly optimised hardware within the processing unit.
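
    A minimal NumPy sketch of the CSR scheme the paper maps onto FPGA hardware may help: the term-document matrix is stored as three arrays (values, column indices, row pointers), and a query is scored with one sparse matrix-vector product whose per-row dot products are what the hardware parallelizes. Names and data are illustrative.

```python
import numpy as np

def to_csr(dense):
    """Convert a dense term-document matrix into CSR arrays (values, col_idx, row_ptr)."""
    values, col_idx, row_ptr = [], [], [0]
    for row in dense:
        for j, v in enumerate(row):
            if v != 0:
                values.append(v)
                col_idx.append(j)
        row_ptr.append(len(values))
    return (np.array(values, dtype=float),
            np.array(col_idx, dtype=int),
            np.array(row_ptr, dtype=int))

def csr_matvec(values, col_idx, row_ptr, x):
    """y = A @ x with A stored in CSR; each row's dot product is independent (parallelizable)."""
    y = np.zeros(len(row_ptr) - 1)
    for i in range(len(y)):
        start, end = row_ptr[i], row_ptr[i + 1]
        y[i] = np.dot(values[start:end], x[col_idx[start:end]])
    return y

# Rows = documents, columns = terms; the query is a vector of term weights.
docs = np.array([[0, 2, 0, 1],
                 [1, 0, 0, 0],
                 [0, 3, 1, 0]], dtype=float)
query = np.array([0.0, 1.0, 0.5, 0.0])
vals, cols, ptrs = to_csr(docs)
print(csr_matvec(vals, cols, ptrs, query))   # per-document relevance scores: [2.  0.  3.5]
```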

  6. The Effect of Economic Feedback on Providers’ Prescription Habits: Are Outcomes Improved? Are Institutional Savings Realized?

    DTIC Science & Technology

    2000-02-01

    dependent (Type II) Diabetes Mellitus. The PEC published an informative article on the treatment and cost differences between Glucotrol XL and...studies use. This information, although different from diabetes as a disease-state study, is essentially the same thought process the researcher used...literature of the particular disease. Diabetes is the disease that will be studied due to its relative ease of acuity measures, its large scope with respect

  7. Spatial memory and integration processes in congenital blindness.

    PubMed

    Vecchi, Tomaso; Tinti, Carla; Cornoldi, Cesare

    2004-12-22

    The paper tests the hypothesis that difficulties met by the blind in spatial processing are due to the simultaneous treatment of independent spatial representations. Results showed that lack of vision does not impede the ability to process and transform mental images; however, blind people are significantly poorer in the recall of more than a single spatial pattern at a time than in the recall of the corresponding material integrated into a single pattern. It is concluded that the simultaneous maintenance of different spatial information is affected by congenital blindness, while cognitive processes that may involve sequential manipulation are not.

  8. The Impact of Sleep on Learning and Behavior in Adolescents.

    ERIC Educational Resources Information Center

    Mitru, Georgios; Millrood, Daniel L.; Mateika, Jason H.

    2002-01-01

    Many adolescents experience sleep deprivation due to such factors as academic workload and social and employment opportunities. The ability to effectively interact with peers while learning and processing new information may be diminished in sleep deprived adolescents. Some school districts are changing school start times to allow students more…

  9. 77 FR 10621 - Changes to the In-Bond Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-22

    ... submit in-bond applications electronically using a CBP-approved electronic data interchange (EDI) system... electronically submit the in-bond application to CBP via a CBP-approved EDI system. \\6\\ Due to the unique... as the CBP-approved EDI system for submitting the in-bond application and other information that is...

  10. 15 CFR 971.1006 - Proprietary enforcement information.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REGULATIONS OF THE ENVIRONMENTAL DATA SERVICE DEEP SEABED MINING REGULATIONS FOR COMMERCIAL RECOVERY PERMITS... or maintained under Title III of the Act concerning a person or vessel engaged in commercial recovery... Administrator will, consistent with due process, move to have records sealed, under 15 CFR part 904 subpart C...

  11. NCI-CPTAC DREAM Proteogenomics Challenge (Registration Now Open) | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    Proteogenomics, the integration of proteomics, genomics, and transcriptomics, is an emerging approach that promises to advance basic, translational and clinical research. By combining genomic and proteomic information, leading scientists are gaining new insights due to a more complete and unified understanding of complex biological processes.

  12. Simultaneous determination of multiple soil enzyme activities for soil health-biogeochemical indexes

    USDA-ARS?s Scientific Manuscript database

    Enzyme activities (EAs) are soil health indicators of changes in decomposition processes due to management and the crop(s) affecting the quantity and quality of plant residues and nutrients entering the soil. More commonly assessed soil EAs can provide information of reactions where plant available ...

  13. Multiple-Reason Decision Making Based on Automatic Processing

    ERIC Educational Resources Information Center

    Glockner, Andreas; Betsch, Tilmann

    2008-01-01

    It has been repeatedly shown that in decisions under time constraints, individuals predominantly use noncompensatory strategies rather than complex compensatory ones. The authors argue that these findings might be due not to limitations of cognitive capacity but instead to limitations of information search imposed by the commonly used experimental…

  14. Quantifying sediment provenance using multiple composite fingerprints in a small watershed in Oklahoma

    USDA-ARS?s Scientific Manuscript database

    Quantitative information on sediment provenance is badly needed for calibration and validation of process-based soil erosion models. However, sediment source data are rather limited due to difficulties in direct measurement of various source contributions at a watershed scale. The objectives are t...

  15. 77 FR 47818 - Proposed Information Collection; Comment Request; Socioeconomics of Commercial Fishers and for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-10

    ... on small businesses). Our initial plan, as the first step in the assessment process, was to interview..., and for-hire recreational fishing operations (charter and party/head boat operations)--with questions... and fishing industry interviews were completed. The commercial fisheries interviews were not begun due...

  16. A Handbook for Hearing Officers. Revised.

    ERIC Educational Resources Information Center

    McCarthy, Greg

    The handbook provides information on legislation and litigation pertaining to the education of handicapped pupils in South Carolina, required procedures for ensuring due process, and suggestions for procedures to be followed by Hearing Officers prior to, during, and after a hearing. The opening section on state laws includes definitions of…

  17. SCIMITAR: Scalable Stream-Processing for Sensor Information Brokering

    DTIC Science & Technology

    2013-11-01

    IaaS) cloud frameworks including Amazon Web Services and Eucalyptus. For load testing, we used The Grinder [9], a Java load testing framework that...internal Eucalyptus cluster which we could not scale as large as the Amazon environment due to a lack of computation resources. We recreated our

  18. Health Information Economy: Literature Review.

    PubMed

    Ebrahimi, Kamal; Roudbari, Masoud; Sadoughi, Farahnaz

    2015-04-19

    Health Information Economy (HIE) is one of the broader, more complex, challenging, and yet important topics in the field of health science, and it requires the identification of its dimensions for planning and policy making. The aim of this study was to determine the dimensions of the HIE concept. This paper presents a systematic methodology for analyzing the trends of HIE. For this purpose, the main keywords of this area were identified and searched in the databases, and from among 4775 retrieved sources, 12 sources in the field of HIE were studied. Information Economy (IE) worldwide has passed through four paradigms: the information evaluation perspective, the information technology perspective, the asymmetric information perspective, and the information value perspective. In this research, the fourth perspective on HIE was analyzed. The main findings of this research were categorized into three major groups: the flow of the information process in the field of health (production, collection, processing, and dissemination), information applications in the same field (education, research, health industry, policy, legislation, and decision-making), and the underlying fields. According to the findings, a theoretical and conceptual gap has already developed in HIE which, given its importance, is likely to become one of the main research approaches in health science in the next decade.

  19. Influence of trust in the spreading of information

    NASA Astrophysics Data System (ADS)

    Wu, Hongrun; Arenas, Alex; Gómez, Sergio

    2017-01-01

    The understanding and prediction of information diffusion processes on networks is a major challenge in network theory with many implications in social sciences. Many theoretical advances occurred due to stochastic spreading models. Nevertheless, these stochastic models overlooked the influence of rational decisions on the outcome of the process. For instance, different levels of trust in acquaintances do play a role in information spreading, and actors may change their spreading decisions during the information diffusion process accordingly. Here, we study an information-spreading model in which the decision to transmit or not is based on trust. We explore the interplay between the propagation of information and the trust dynamics happening on a two-layer multiplex network. Actors' trustable or untrustable states are defined as accumulated cooperation or defection behaviors, respectively, in a Prisoner's Dilemma setup, and they are controlled by a memory span. The propagation of information is abstracted as a threshold model on the information-spreading layer, where the threshold depends on the trustability of agents. The analysis of the model is performed using a tree approximation and validated on homogeneous and heterogeneous networks. The results show that the memory of previous actions has a significant effect on the spreading of information. For example, the less memory that is considered, the higher is the diffusion. Information is highly promoted by the emergence of trustable acquaintances. These results provide insight into the effect of plausible biases on spreading dynamics in a multilevel networked system.
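
    A toy sketch of a trust-dependent threshold rule of this kind is given below (assumptions: a single-layer toy network, fixed trust labels, and a fixed threshold; the paper's actual model uses a two-layer multiplex with trust states driven by a Prisoner's Dilemma and a memory span).

```python
# Toy undirected network as an adjacency dict (not the paper's multiplex network).
neighbors = {
    "a": ["b", "c"], "b": ["a", "c", "d"],
    "c": ["a", "b", "d"], "d": ["b", "c", "e"], "e": ["d"],
}
trustable = {"a": True, "b": True, "c": False, "d": True, "e": True}
THRESHOLD = 0.5          # fraction of trusted, informed neighbours needed to adopt

def step(informed):
    """One synchronous update of the trust-dependent threshold spreading rule."""
    newly = set()
    for node in neighbors:
        if node in informed:
            continue
        trusted = [n for n in neighbors[node] if trustable[n]]
        if not trusted:
            continue
        if sum(n in informed for n in trusted) / len(trusted) >= THRESHOLD:
            newly.add(node)
    return informed | newly

informed = {"a"}          # seed spreader
for t in range(4):
    informed = step(informed)
    print(t, sorted(informed))   # the informed set grows until the whole toy network adopts
```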

  20. A 3-Component Mixture of Rayleigh Distributions: Properties and Estimation in Bayesian Framework

    PubMed Central

    Aslam, Muhammad; Tahir, Muhammad; Hussain, Zawar; Al-Zahrani, Bander

    2015-01-01

    To study lifetimes of certain engineering processes, a lifetime model which can accommodate the nature of such processes is desired. The mixture models of underlying lifetime distributions are intuitively more appropriate and appealing to model the heterogeneous nature of a process as compared to simple models. This paper studies a 3-component mixture of Rayleigh distributions from a Bayesian perspective. The censored sampling environment is considered due to its popularity in reliability theory and survival analysis. The expressions for the Bayes estimators and their posterior risks are derived under different scenarios. For the case that little or no prior information is available, elicitation of hyperparameters is given. To examine, numerically, the performance of the Bayes estimators using non-informative and informative priors under different loss functions, we have simulated their statistical properties for different sample sizes and test termination times. In addition, to highlight the practical significance, an illustrative example based on real-life engineering data is also given. PMID:25993475
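
    For reference, a hedged sketch of the density of such a 3-component Rayleigh mixture, with mixing weights p_i and scale parameters sigma_i (the paper's exact parameterization and priors may differ):

```latex
f(x \mid \boldsymbol{p}, \boldsymbol{\sigma})
  = \sum_{i=1}^{3} p_i \,\frac{x}{\sigma_i^{2}}
    \exp\!\left(-\frac{x^{2}}{2\sigma_i^{2}}\right),
\qquad x > 0, \quad p_1 + p_2 + p_3 = 1, \quad p_i \ge 0 .
```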

  1. Enhanced dimension-specific visual working memory in grapheme–color synesthesia☆

    PubMed Central

    Terhune, Devin Blair; Wudarczyk, Olga Anna; Kochuparampil, Priya; Cohen Kadosh, Roi

    2013-01-01

    There is emerging evidence that the encoding of visual information and the maintenance of this information in a temporarily accessible state in working memory rely on the same neural mechanisms. A consequence of this overlap is that atypical forms of perception should influence working memory. We examined this by investigating whether having grapheme–color synesthesia, a condition characterized by the involuntary experience of color photisms when reading or representing graphemes, would confer benefits on working memory. Two competing hypotheses propose that superior memory in synesthesia results from information being coded in two information channels (dual-coding) or from superior dimension-specific visual processing (enhanced processing). We discriminated between these hypotheses in three n-back experiments in which controls and synesthetes viewed inducer and non-inducer graphemes and maintained color or grapheme information in working memory. Synesthetes displayed superior color working memory than controls for both grapheme types, whereas the two groups did not differ in grapheme working memory. Further analyses excluded the possibilities of enhanced working memory among synesthetes being due to greater color discrimination, stimulus color familiarity, or bidirectionality. These results reveal enhanced dimension-specific visual working memory in this population and supply further evidence for a close relationship between sensory processing and the maintenance of sensory information in working memory. PMID:23892185

  2. Multicriterion problem of allocation of resources in the heterogeneous distributed information processing systems

    NASA Astrophysics Data System (ADS)

    Antamoshkin, O. A.; Kilochitskaya, T. R.; Ontuzheva, G. A.; Stupina, A. A.; Tynchenko, V. S.

    2018-05-01

    This study reviews the problem of allocation of resources in heterogeneous distributed information processing systems, which may be formalized as a multicriterion multi-index problem with linear constraints of the transport type. Algorithms for solving this problem involve a search for the entire set of Pareto-optimal solutions. For some classes of hierarchical systems, it is possible to significantly speed up the verification of a system of linear algebraic inequalities for consistency, because they are reducible to stream models or amenable to other solution schemes (for strongly connected structures) that take into account the specifics of the hierarchies under consideration.
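
    One hedged way to write such a multicriterion allocation with transport-type constraints (symbols are illustrative, not taken from the paper): x_{ij} is the share of resource i assigned to consumer j, a_i and b_j are capacities and demands, and several objectives F_1, ..., F_m are minimized simultaneously over the feasible set, the solution concept being the Pareto-optimal set.

```latex
\min_{x}\ \bigl(F_1(x), \dots, F_m(x)\bigr)
\quad \text{s.t.} \quad
\sum_{j} x_{ij} \le a_i \ \ \forall i,
\qquad
\sum_{i} x_{ij} \ge b_j \ \ \forall j,
\qquad
x_{ij} \ge 0 .
```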

  3. Gender differences in the use of external landmarks versus spatial representations updated by self-motion.

    PubMed

    Lambrey, Simon; Berthoz, Alain

    2007-09-01

    Numerous data in the literature provide evidence for gender differences in spatial orientation. In particular, it has been suggested that spatial representations of large-scale environments are more accurate in terms of metric information in men than in women but are richer in landmark information in women than in men. One explanatory hypothesis is that men and women differ in terms of the navigational processes they use in daily life. The present study investigated this hypothesis by distinguishing two navigational processes: spatial updating by self-motion and landmark-based orientation. Subjects were asked to perform a pointing task in three experimental conditions, which differed in terms of reliability of the external landmarks that could be used. Two groups of subjects were distinguished, a mobile group and an immobile group, in which spatial updating of environmental locations did not have the same degree of importance for the correct performance of the pointing task. We found that men readily relied on an internal egocentric representation of where landmarks were expected to be in order to perform the pointing task, a representation that could be updated during self-motion (spatial updating). In contrast, women seemed to take their bearings more readily on the basis of the stable landmarks of the external world. We suggest that this gender difference in spatial orientation is not due to differences in information processing abilities but rather to differences in higher-level strategies.

  4. Implantable electronics: emerging design issues and an ultra light-weight security solution.

    PubMed

    Narasimhan, Seetharam; Wang, Xinmu; Bhunia, Swarup

    2010-01-01

    Implantable systems that monitor biological signals require increasingly complex digital signal processing (DSP) electronics for real-time in-situ analysis and compression of the recorded signals. While it is well-known that such signal processing hardware needs to be implemented under tight area and power constraints, new design requirements emerge with their increasing complexity. Use of nanoscale technology shows tremendous benefits in implementing these advanced circuits due to dramatic improvement in integration density and power dissipation per operation. However, it also brings in new challenges such as reliability and large idle power (due to higher leakage current). In addition, programmability of the device as well as security of the recorded information are rapidly becoming major design considerations of such systems. In this paper, we analyze the emerging issues associated with the design of the DSP unit in an implantable system. Next, we propose a novel ultra light-weight solution to address the information security issue. Unlike conventional information security approaches like data encryption, which come at a large area and power overhead and hence are not well suited to resource-constrained implantable systems, we propose a multilevel key-based scrambling algorithm, which exploits the nature of the biological signal to effectively obfuscate it. Analysis of the proposed algorithm in the context of neural signal processing and its hardware implementation shows that we can achieve a high level of security with ∼ 13X lower power and ∼ 5X lower area overhead than conventional cryptographic solutions.
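
    The sketch below is a deliberately simplified, hypothetical illustration of what a multilevel key-based scrambling of a sampled biosignal might look like (it is not the authors' algorithm): the signal is split into blocks, the block order is permuted under one key, and samples within blocks are permuted under a second key, giving obfuscation at a fraction of the cost of encryption.

```python
import numpy as np

def scramble(signal, key_block, key_sample, block_len=8):
    """Two-level keyed permutation of a 1D signal (illustrative only, not real encryption)."""
    n_blocks = len(signal) // block_len
    blocks = np.asarray(signal[: n_blocks * block_len]).reshape(n_blocks, block_len)
    block_perm = np.random.default_rng(key_block).permutation(n_blocks)      # level 1: block order
    sample_perm = np.random.default_rng(key_sample).permutation(block_len)   # level 2: within-block
    return blocks[block_perm][:, sample_perm].ravel()

def descramble(scrambled, key_block, key_sample, block_len=8):
    """Invert both keyed permutations using the same keys."""
    n_blocks = len(scrambled) // block_len
    blocks = np.asarray(scrambled).reshape(n_blocks, block_len)
    block_perm = np.random.default_rng(key_block).permutation(n_blocks)
    sample_perm = np.random.default_rng(key_sample).permutation(block_len)
    inv_b, inv_s = np.argsort(block_perm), np.argsort(sample_perm)
    return blocks[:, inv_s][inv_b].ravel()

x = np.arange(32.0)                        # stand-in for a recorded neural signal
y = scramble(x, key_block=13, key_sample=7)
assert np.allclose(descramble(y, key_block=13, key_sample=7), x)   # round-trips exactly
```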

  5. Understanding of how older adults with low vision obtain, process, and understand health information and services.

    PubMed

    Kim, Hyung Nam

    2017-10-16

    Twenty-five years after the Americans with Disabilities Act, there has still been a lack of advancement of accessibility in healthcare for people with visual impairments, particularly older adults with low vision. This study aims to advance understanding of how older adults with low vision obtain, process, and use health information and services, and to seek opportunities for information technology to support them. A convenience sample of 10 older adults with low vision participated in semi-structured phone interviews, which were audio-recorded and transcribed verbatim for analysis. Participants shared various concerns in accessing, understanding, and using health information, care services, and multimedia technologies. Two main themes and nine subthemes emerged from the analysis. Due to these concerns, older adults with low vision often failed to obtain the full range of health information and services needed to meet their specific needs. Because those with low vision still rely on residual vision, multimedia-based information can be useful, but it should be designed to ensure its accessibility, usability, and understandability.

  6. Information Security Scheme Based on Computational Temporal Ghost Imaging.

    PubMed

    Jiang, Shan; Wang, Yurong; Long, Tao; Meng, Xiangfeng; Yang, Xiulun; Shu, Rong; Sun, Baoqing

    2017-08-09

    An information security scheme based on computational temporal ghost imaging is proposed. A sequence of independent 2D random binary patterns is used as the encryption key and multiplied with the 1D data stream. The ciphertext is obtained by summing the weighted encryption key. Decryption is realized by a correlation measurement between the encrypted information and the encryption key. Due to the intrinsic high-level randomness of the key, the security of this method is strongly guaranteed. The feasibility of the method and its robustness against both occlusion and additional-noise attacks are demonstrated through simulations.
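
    A minimal NumPy sketch of the correlation-based recovery described above, with 1D random binary key patterns acting on a 1D data stream (the paper uses 2D patterns; dimensions are simplified here for clarity): each ciphertext value is the key-weighted sum of the data, and the data are recovered by correlating the ciphertext with the key patterns, as in ghost-imaging reconstruction.

```python
import numpy as np

rng = np.random.default_rng(42)
N, K = 64, 4000                          # data length and number of random key patterns

data = np.sin(np.linspace(0, 4 * np.pi, N)) + 1.0          # non-negative test data stream
patterns = rng.integers(0, 2, size=(K, N)).astype(float)   # random binary encryption key

# Encryption: each ciphertext value is the key-weighted sum of the data stream.
cipher = patterns @ data                                    # shape (K,)

# Decryption: correlate the ciphertext with the key patterns (ghost-imaging reconstruction);
# the result recovers the data up to scale and offset, improving as K grows.
recovered = (cipher - cipher.mean()) @ (patterns - patterns.mean(axis=0)) / K

norm = lambda v: (v - v.min()) / (v.max() - v.min())
print(np.corrcoef(norm(recovered), norm(data))[0, 1])       # close to 1 for large K
```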

  7. The feasibility of using natural language processing to extract clinical information from breast pathology reports.

    PubMed

    Buckley, Julliette M; Coopey, Suzanne B; Sharko, John; Polubriaginof, Fernanda; Drohan, Brian; Belli, Ahmet K; Kim, Elizabeth M H; Garber, Judy E; Smith, Barbara L; Gadd, Michele A; Specht, Michelle C; Roche, Constance A; Gudewicz, Thomas M; Hughes, Kevin S

    2012-01-01

    The opportunity to integrate clinical decision support systems into clinical practice is limited due to the lack of structured, machine readable data in the current format of the electronic health record. Natural language processing has been designed to convert free text into machine readable data. The aim of the current study was to ascertain the feasibility of using natural language processing to extract clinical information from >76,000 breast pathology reports. APPROACH AND PROCEDURE: Breast pathology reports from three institutions were analyzed using natural language processing software (Clearforest, Waltham, MA) to extract information on a variety of pathologic diagnoses of interest. Data tables were created from the extracted information according to date of surgery, side of surgery, and medical record number. The variety of ways in which each diagnosis could be represented was recorded, as a means of demonstrating the complexity of machine interpretation of free text. There was widespread variation in how pathologists reported common pathologic diagnoses. We report, for example, 124 ways of saying invasive ductal carcinoma and 95 ways of saying invasive lobular carcinoma. There were >4000 ways of saying invasive ductal carcinoma was not present. Natural language processor sensitivity and specificity were 99.1% and 96.5% when compared to expert human coders. We have demonstrated how a large body of free text medical information such as seen in breast pathology reports, can be converted to a machine readable format using natural language processing, and described the inherent complexities of the task.
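
    The toy sketch below (not the Clearforest system used in the study) illustrates why surface-form variation makes this hard and how a rule-based normalizer might map a handful of the many free-text variants onto one structured diagnosis code, with a crude negation check; the variant table and negation cues are illustrative.

```python
import re

# Tiny illustrative variant table; the study found >100 surface forms for
# "invasive ductal carcinoma" alone, so a real system needs far broader coverage.
PATTERNS = {
    "invasive_ductal_carcinoma": [
        r"\binvasive ductal carcinoma\b",
        r"\binfiltrating ductal carcinoma\b",
        r"\bidc\b",
    ],
    "invasive_lobular_carcinoma": [
        r"\binvasive lobular carcinoma\b",
        r"\bilc\b",
    ],
}
NEGATION = re.compile(r"\b(no|negative for|without evidence of)\b")

def extract_diagnoses(report_text):
    """Return canonical diagnoses asserted (and not obviously negated) in a report."""
    text = report_text.lower()
    found = set()
    for code, variants in PATTERNS.items():
        for pattern in variants:
            for match in re.finditer(pattern, text):
                sentence_start = text.rfind(".", 0, match.start()) + 1
                prefix = text[sentence_start:match.start()]
                if not NEGATION.search(prefix):      # crude same-sentence negation check
                    found.add(code)
    return found

print(extract_diagnoses("Final diagnosis: Infiltrating ductal carcinoma, grade 2."))
print(extract_diagnoses("Negative for invasive lobular carcinoma."))   # -> set()
```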

  8. Costing child protective services staff turnover.

    PubMed

    Graef, M I; Hill, E L

    2000-01-01

    This article details the process used in one state to determine the financial costs to the child welfare agency accrued over the course of one year that were directly attributable to CPS staff turnover. The formulas and process for calculating specific cost elements due to separation, replacement and training are provided. The practical considerations inherent in this type of analysis are highlighted, as well as the use of this type of data to inform agency human resource strategies.
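
    The general shape of such a costing is a per-leaver sum of separation, replacement, and training elements; the symbols below are illustrative and do not reproduce the article's exact formulas.

```latex
C_{\text{turnover}}
  = \sum_{i=1}^{N_{\text{leavers}}}
    \bigl( C^{\text{sep}}_{i} + C^{\text{repl}}_{i} + C^{\text{train}}_{i} \bigr),
\qquad
\text{annual turnover rate} = \frac{N_{\text{leavers}}}{\bar{N}_{\text{staff}}} .
```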

  9. A novel architecture for information retrieval system based on semantic web

    NASA Astrophysics Data System (ADS)

    Zhang, Hui

    2011-12-01

    The web has enabled an explosive growth of information sharing (there are currently over 4 billion pages covering most areas of human endeavor), so it now faces the challenge of information overload. The challenge before us is not only to help people locate relevant information precisely but also to access and aggregate a variety of information from different resources automatically. Current web documents are in human-oriented formats suitable for presentation, but machines cannot understand the meaning of a document. To address this issue, Berners-Lee proposed the concept of the semantic web. With semantic web technology, web information can be understood and processed by machines, which provides new possibilities for automatic web information processing. A main problem of semantic web information retrieval is that, when such a retrieval system lacks sufficient knowledge, it returns a large number of meaningless results to users because of the huge volume of information. In this paper, we present the architecture of an information retrieval system based on the semantic web. In addition, our system employs an inference engine to check whether a query should be posed to the keyword-based search engine or to the semantic search engine.
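
    A schematic sketch of that routing decision is given below, under the assumption of a small in-memory ontology: the inference step simply checks whether the query can be resolved against known concepts and, if so, routes it to the semantic search engine, otherwise to keyword search. All names, concepts, and the routing rule are illustrative.

```python
# Illustrative ontology fragment: canonical concept -> known surface forms.
ONTOLOGY = {
    "myocardial infarction": {"heart attack", "cardiac infarction"},
    "hypertension": {"high blood pressure", "htn"},
}

def route_query(query):
    """Decide which engine should handle the query (sketch of the routing idea only)."""
    q = query.lower()
    for concept, synonyms in ONTOLOGY.items():
        if concept in q or any(term in q for term in synonyms):
            return "semantic_search", concept     # enough knowledge: use the semantic engine
    return "keyword_search", None                 # fall back to keyword-based search

print(route_query("latest treatment for heart attack"))  # ('semantic_search', 'myocardial infarction')
print(route_query("weather in A Coruna tomorrow"))        # ('keyword_search', None)
```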

  10. Testing an alternate informed consent process.

    PubMed

    Yates, Bernice C; Dodendorf, Diane; Lane, Judy; LaFramboise, Louise; Pozehl, Bunny; Duncan, Kathleen; Knodel, Kendra

    2009-01-01

    One of the main problems in conducting clinical trials is low participation rate due to potential participants' misunderstanding of the rationale for the clinical trial or perceptions of loss of control over treatment decisions. The objective of this study was to test an alternate informed consent process in cardiac rehabilitation participants that involved the use of a multimedia flip chart to describe a future randomized clinical trial and then asked, hypothetically, if they would participate in the future trial. An attractive and inviting visual presentation of the study was created in the form of a 23-page flip chart that included 24 color photographs displaying information about the purpose of the study, similarities and differences between the two treatment groups, and the data collection process. We tested the flip chart in 35 cardiac rehabilitation participants. Participants were asked if they would participate in this future study on two occasions: immediately after the description of the flip chart and 24 hours later, after reading through the informed consent document. Participants were also asked their perceptions of the flip chart and consent process. Of the 35 participants surveyed, 19 (54%) indicated that they would participate in the future study. No participant changed his or her decision 24 hours later after reading the full consent form. The participation rate improved 145% over that of an earlier feasibility study where the recruitment rate was 22%. Most participants stated that the flip chart was helpful and informative and that the photographs were effective in communicating the purpose of the study. Participation rates could be enhanced in future clinical trials by using a visual presentation to explain and describe the study as part of the informed consent process. More research is needed to test alternate methods of obtaining informed consent.

  11. Synaptic plasticity, neural circuits, and the emerging role of altered short-term information processing in schizophrenia

    PubMed Central

    Crabtree, Gregg W.; Gogos, Joseph A.

    2014-01-01

    Synaptic plasticity alters the strength of information flow between presynaptic and postsynaptic neurons and thus modifies the likelihood that action potentials in a presynaptic neuron will lead to an action potential in a postsynaptic neuron. As such, synaptic plasticity and pathological changes in synaptic plasticity impact the synaptic computation which controls the information flow through the neural microcircuits responsible for the complex information processing necessary to drive adaptive behaviors. As current theories of neuropsychiatric disease suggest that distinct dysfunctions in neural circuit performance may critically underlie the unique symptoms of these diseases, pathological alterations in synaptic plasticity mechanisms may be fundamental to the disease process. Here we consider mechanisms of both short-term and long-term plasticity of synaptic transmission and their possible roles in information processing by neural microcircuits in both health and disease. As paradigms of neuropsychiatric diseases with strongly implicated risk genes, we discuss the findings in schizophrenia and autism and consider the alterations in synaptic plasticity and network function observed in both human studies and genetic mouse models of these diseases. Together these studies have begun to point toward a likely dominant role of short-term synaptic plasticity alterations in schizophrenia while dysfunction in autism spectrum disorders (ASDs) may be due to a combination of both short-term and long-term synaptic plasticity alterations. PMID:25505409

  12. A laboratory information management system for the analysis of tritium (3H) in environmental waters.

    PubMed

    Belachew, Dagnachew Legesse; Terzer-Wassmuth, Stefan; Wassenaar, Leonard I; Klaus, Philipp M; Copia, Lorenzo; Araguás, Luis J Araguás; Aggarwal, Pradeep

    2018-07-01

    Accurate and precise measurements of low levels of tritium (³H) in environmental waters are difficult to attain due to complex steps of sample preparation, electrolytic enrichment, liquid scintillation decay counting, and extensive data processing. We present a Microsoft Access™ relational database application, TRIMS (Tritium Information Management System) to assist with sample and data processing of tritium analysis by managing the processes from sample registration and analysis to reporting and archiving. A complete uncertainty propagation algorithm ensures tritium results are reported with robust uncertainty metrics. TRIMS will help to increase laboratory productivity and improve the accuracy and precision of ³H assays. The software supports several enrichment protocols and LSC counter types. TRIMS is available for download at no cost from the IAEA at www.iaea.org/water. Copyright © 2018 Elsevier Ltd. All rights reserved.
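
    The abstract does not spell out the uncertainty algorithm, but the standard combined-uncertainty propagation rule for a result y = f(x_1, ..., x_n) with independent inputs is the usual starting point for such a module (shown here only as a generic reference, not as TRIMS's exact implementation):

```latex
u_c(y) = \sqrt{\sum_{i=1}^{n} \left(\frac{\partial f}{\partial x_i}\right)^{2} u^{2}(x_i)},
\qquad
\left(\frac{u_c(y)}{y}\right)^{2} = \sum_{i=1}^{n} \left(\frac{u(x_i)}{x_i}\right)^{2}
\ \ \text{for purely multiplicative models.}
```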

  13. [Analysis of judicial sentences issued against traumatologists between 1995 and 2011 as regards medical negligence].

    PubMed

    Cardoso-Cita, Z; Perea-Pérez, B; Albarrán-Juan, M E; Labajo-González, M E; López-Durán, L; Marco-Martínez, F; Santiago-Saéz, A

    2016-01-01

    Traumatology and Orthopaedic Surgery is one of the specialities with the most complaints, due to its scope and complexity. The aim of this study is to determine the characteristics of the complaints made against medical specialists in Traumatology, taking into account those variables that might influence both the presenting of the complaint and the resolution of the process. An analysis was performed on 303 legal judgments (1995-2011) collected in the health legal judgements archive of the Madrid School of Medicine, which is linked to the Westlaw Aranzadi database. Civil jurisdiction was the most frequently used. The specific conditions with the most complaints were bone-joint disorders, followed by vascular-nerve problems and infections. The body region most often involved in claims was the lower limb, particularly the knee. The most frequent general cause of complaint was surgical treatment error, followed by diagnostic error. There was a lack of information in 14.9% of cases. There was sentencing in 49.8% of the cases, with compensation mainly being less than 50,000 euros. Traumatology and Orthopaedic Surgery is a speciality prone to complaints due to malpractice. The number of sentences against traumatologists is high, but compensations are usually less than 50,000 euros. The main reason for sentencing is surgical treatment error; the surgical act itself is therefore where precautions should be maximised. Judgements due to lack of information are frequent, so adequate doctor-patient communication and the correct completion of the informed consent are essential. Copyright © 2014 SECOT. Published by Elsevier Espana. All rights reserved.

  14. Application of image processing to calculate the number of fish seeds using raspberry-pi

    NASA Astrophysics Data System (ADS)

    Rahmadiansah, A.; Kusumawardhani, A.; Duanto, F. N.; Qoonita, F.

    2018-03-01

    Many fish cultivators in Indonesia suffer losses because the number of fish seeds bought and sold does not match the agreed amount. The loss arises because fish seeds are still counted manually. To overcome this problem, this study designed an automatic, real-time fish counting system using image processing on a Raspberry Pi. Image processing was chosen because it can count moving objects and suppress noise. The method used to count moving objects is the virtual loop detector (virtual detector) method, and the approach used is the “double difference image”. The “double difference” approach uses information from the previous frame and the next frame to estimate the shape and position of the object. Using these methods, the results obtained were quite good, with an average error of 1.0% for 300 individuals in a test with a virtual detector width of 96 pixels and a test-plane slope of 1 degree.
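
    A NumPy-only sketch of the double-difference rule described above (threshold, frame data, and the crude virtual-detector counter are all illustrative): a pixel is marked as moving only if it differs from both the previous and the next frame, which suppresses noise that survives a single frame difference.

```python
import numpy as np

def double_difference(prev_frame, curr_frame, next_frame, threshold=25):
    """Binary motion mask: pixels that changed relative to BOTH neighbouring frames."""
    d1 = np.abs(curr_frame.astype(int) - prev_frame.astype(int)) > threshold
    d2 = np.abs(next_frame.astype(int) - curr_frame.astype(int)) > threshold
    return d1 & d2

def count_crossings(mask, detector_rows):
    """Crude 'virtual detector': count connected runs of motion pixels in a horizontal band."""
    band = mask[detector_rows, :].any(axis=0)
    return int(np.sum(band[1:] & ~band[:-1]) + band[0])   # rising edges of the occupancy profile

# Three synthetic 8-bit frames with one bright blob moving across the detector band.
rng = np.random.default_rng(1)
frames = [rng.integers(0, 10, (60, 80), dtype=np.uint8) for _ in range(3)]
for k, col in enumerate((20, 40, 60)):
    frames[k][28:32, col:col + 4] = 200                   # the moving "fish"

mask = double_difference(*frames)
print(count_crossings(mask, detector_rows=slice(26, 34)))  # -> 1 object in the band
```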

  15. 76 FR 39820 - Notice of Funding Availability: Sections 514, 515 and 516 Multi-Family Housing Revitalization...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-07

    ... post office or private mailer does not constitute delivery. Facsimile (FAX) and postage-due pre... a pre-application described in Section VI. This pre-application process is designed to lessen the... reserves the right to post all information submitted as part of the pre-application/application package...

  16. Teaching Science and Mathematics to At Risk Students. ERIC Digest.

    ERIC Educational Resources Information Center

    Schwartz, Wendy

    Traditionally, disadvantaged groups, such as women and minorities, have not excelled in science and math. Often the lack of literacy and achievement in these subjects is due to the following factors: (1) cognitive differences between how the information is presented and how the students process it; (2) lack of familiarity, because of cultural…

  17. Attending to Multiple Visual Streams: Interactions between Location-Based and Category-Based Attentional Selection

    ERIC Educational Resources Information Center

    Fagioli, Sabrina; Macaluso, Emiliano

    2009-01-01

    Behavioral studies indicate that subjects are able to divide attention between multiple streams of information at different locations. However, it is still unclear to what extent the observed costs reflect processes specifically associated with spatial attention, versus more general interference due to the concurrent monitoring of multiple streams of…

  18. Factors Related to Impaired Visual Orienting Behavior in Children with Intellectual Disabilities

    ERIC Educational Resources Information Center

    Boot, F. H.; Pel, J .J. M.; Evenhuis, H. M.; van der Steen, J.

    2012-01-01

    It is generally assumed that children with intellectual disabilities (ID) have an increased risk of impaired visual information processing due to brain damage or brain development disorder. So far little evidence has been presented to support this assumption. Abnormal visual orienting behavior is a sensitive tool to evaluate impaired visual…

  19. The Due-Able Process Could Happen to You! Physical Educators, Handicapped Students, and the Law.

    ERIC Educational Resources Information Center

    Kennedy, Susan O.; And Others

    1989-01-01

    This article presents basic information for regular and special physical educators to help them better understand the procedural rights of parents as well as the schools, and to help them make appropriate judgments for the physical education placement and programing of students with handicaps. (IAH)

  20. Does Ethnicity Matter? The Impact of Stereotypical Expectations on In-Service Teachers' Judgments of Students

    ERIC Educational Resources Information Center

    Glock, Sabine

    2016-01-01

    Ethnic minority students face many disadvantages in school, which might be due in part to teachers' stereotypical expectations and attitudes. Dual process theories of impression and judgment formation specify person information that confirms or disconfirms stereotypical expectations as determinants of how judgments are formed. While…

  1. An approach to determine multiple enzyme activities in the same soil sample for soil health-biogeochemical indexes

    USDA-ARS?s Scientific Manuscript database

    Enzyme activities (EAs) are soil health indicators of changes in decomposition processes due to management and the crop(s) affecting the quantity and quality of plant residues and nutrients entering the soil. More commonly assessed soil EAs can provide information of reactions where plant available ...

  2. Thinking in Action: Some Insights from Cognitive Sport Psychology

    ERIC Educational Resources Information Center

    Moran, Aidan

    2012-01-01

    Historically, cognitive researchers have largely ignored the domain of sport in their quest to understand how the mind works. This neglect is due, in part, to the limitations of the information processing paradigm that dominated cognitive psychology in its formative years. With the emergence of the embodiment approach to cognition, however, sport…

  3. The Case for General Education in Community Colleges.

    ERIC Educational Resources Information Center

    Cohen, Arthur M.

    General education is the process of developing a framework on which to place knowledge stemming from various sources, of learning to think critically, develop values, understand traditions, and respect diverse cultures and opinions. Its rationale is the freedom enjoyed by an informed citizen. General education has had an unstable history due to…

  4. New York Bight Study: Report 5, NY Bight Biological Review Program

    DTIC Science & Technology

    1994-05-01

    final cap has been recolonized due to changes in granulometry and stress from chronic burial. The procedure and information available for examining this...34 Linear oceanographic features: A focus for research on recruitment processes," Australian Journal of Ecology 15, 391-401. Kiorboe, T., Munk, P., Richardson

  5. Managing Democracy in the Workplace for Sustainable Productivity in Nigeria

    ERIC Educational Resources Information Center

    Arikpo, Arikpo B.; Etor, Robert B.; Usang, Ewa

    2007-01-01

    Democracy engenders freedom for all, human rights, participation based on equality, shared values, the rule of law, due process, good governance and transparency. In the workplace, it would include freedom of expression, association, participation, access to available information and the right of workers to understand what goes on where they work.…

  6. An IBM Compatible Participant Data Base System for Outdoor Programs.

    ERIC Educational Resources Information Center

    Watters, Ron

    The process of maintaining mailing lists and other informational files on outdoor program participants is, plainly and simply, a pain in the neck. Mailing list maintenance is particularly difficult for programs that deal with university students, due to their frequent moves. This paper describes a new software program, the Outdoor Program Data…

  7. 48 CFR 1852.245-73 - Financial reporting of NASA property in the custody of contractors.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... due. However, contractors' procedures must document the process for developing these estimates based... shall have formal policies and procedures, which address the validation of NF 1018 data, including data... validation is to ensure that information reported is accurate and in compliance with the NASA FAR Supplement...

  8. 48 CFR 1852.245-73 - Financial reporting of NASA property in the custody of contractors.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... due. However, contractors' procedures must document the process for developing these estimates based... shall have formal policies and procedures, which address the validation of NF 1018 data, including data... validation is to ensure that information reported is accurate and in compliance with the NASA FAR Supplement...

  9. An information extraction framework for cohort identification using electronic health records.

    PubMed

    Liu, Hongfang; Bielinski, Suzette J; Sohn, Sunghwan; Murphy, Sean; Wagholikar, Kavishwar B; Jonnalagadda, Siddhartha R; Ravikumar, K E; Wu, Stephen T; Kullo, Iftikhar J; Chute, Christopher G

    2013-01-01

    Information extraction (IE), a natural language processing (NLP) task that automatically extracts structured or semi-structured information from free text, has become popular in the clinical domain for supporting automated systems at point-of-care and enabling secondary use of electronic health records (EHRs) for clinical and translational research. However, a high performance IE system can be very challenging to construct due to the complexity and dynamic nature of human language. In this paper, we report an IE framework for cohort identification using EHRs that is a knowledge-driven framework developed under the Unstructured Information Management Architecture (UIMA). A system to extract specific information can be developed by subject matter experts through expert knowledge engineering of the externalized knowledge resources used in the framework.
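The record describes a UIMA-based, knowledge-driven framework but gives no implementation details, so the following is only a minimal, hypothetical illustration of dictionary- and regex-based concept extraction with simple negation handling. The concept lexicon, the negation cue list, and the example note are invented for this sketch and are not part of the published framework.

```python
# Minimal sketch (not the UIMA-based framework from the record): a dictionary- and
# regex-driven extractor that flags candidate cohort concepts in free-text notes.
# The concept lexicon and negation cue list below are hypothetical examples.
import re

CONCEPTS = {
    "peripheral arterial disease": ["peripheral arterial disease", "pad", "claudication"],
    "diabetes mellitus": ["diabetes", "dm type 2", "t2dm"],
}
NEGATION_CUES = re.compile(r"\b(no|denies|without|negative for)\b[^.]{0,40}$")

def extract_concepts(note: str):
    """Return (concept, matched term, negated?) tuples found in a clinical note."""
    hits = []
    lowered = note.lower()
    for concept, terms in CONCEPTS.items():
        for term in terms:
            for m in re.finditer(r"\b" + re.escape(term) + r"\b", lowered):
                # Look at the text immediately before the match for a negation cue.
                window = lowered[max(0, m.start() - 40):m.start()]
                negated = bool(NEGATION_CUES.search(window))
                hits.append((concept, term, negated))
    return hits

if __name__ == "__main__":
    note = "Patient denies claudication. History of diabetes, on metformin."
    for concept, term, negated in extract_concepts(note):
        print(f"{concept:30s} via '{term}' negated={negated}")
```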

  10. Integrating medical and research information: a big data approach.

    PubMed

    Tilve Álvarez, Carlos M; Ayora Pais, Alberto; Ruíz Romero, Cristina; Llamas Gómez, Daniel; Carrajo García, Lino; Blanco García, Francisco J; Vázquez González, Guillermo

    2015-01-01

    Most of the information collected in different fields by the Instituto de Investigación Biomédica de A Coruña (INIBIC) is classified as unstructured due to its high volume and heterogeneity. This situation, together with the recent requirement of integrating it with medical information, makes it necessary to put in place specific architectures to collect and organize it before it can be analysed. The purpose of this article is to present the Hadoop framework as a solution to the problem of integrating research information in the Business Intelligence field. This framework can collect, explore, process and structure the aforementioned information, which allows us to develop a function equivalent to that of a data mart in a Business Intelligence system.
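The record names Hadoop but no concrete pipeline, so the following is a minimal Hadoop Streaming-style sketch: a mapper and a reducer that group heterogeneous clinical and research records by a shared patient identifier, as a stand-in for the data-mart-like structure described. The tab-separated input format and field names are assumptions, not INIBIC's actual schema.

```python
# mapper.py -- minimal Hadoop Streaming sketch (not the INIBIC implementation).
# Assumes tab-separated input lines of the hypothetical form:
#   <source>\t<patient_id>\t<payload>
# and emits <patient_id>\t<source>:<payload> so records can be grouped by patient.
import sys

for line in sys.stdin:
    parts = line.rstrip("\n").split("\t")
    if len(parts) != 3:
        continue  # skip malformed records
    source, patient_id, payload = parts
    print(f"{patient_id}\t{source}:{payload}")
```

```python
# reducer.py -- merges all records sharing a patient_id into one output line,
# a stand-in for the "data mart"-like structure mentioned in the record.
import sys
from itertools import groupby

def keyed_lines(stream):
    for line in stream:
        key, _, value = line.rstrip("\n").partition("\t")
        yield key, value

for patient_id, group in groupby(keyed_lines(sys.stdin), key=lambda kv: kv[0]):
    merged = "|".join(value for _, value in group)
    print(f"{patient_id}\t{merged}")
```

Under Hadoop Streaming these two scripts would be wired together via the streaming jar (`-mapper mapper.py -reducer reducer.py`); locally, `cat records.tsv | python mapper.py | sort | python reducer.py` gives the same grouping.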

  11. Retinotopically specific reorganization of visual cortex for tactile pattern recognition

    PubMed Central

    Cheung, Sing-Hang; Fang, Fang; He, Sheng; Legge, Gordon E.

    2009-01-01

    Although previous studies have shown that Braille reading and other tactile-discrimination tasks activate the visual cortex of blind and sighted people [1–5], it is not known whether this kind of cross-modal reorganization is influenced by retinotopic organization. We have addressed this question by studying S, a visually impaired adult with the rare ability to read print visually and Braille by touch. S had normal visual development until age six years, and thereafter severe acuity reduction due to corneal opacification, but no evidence of visual-field loss. Functional magnetic resonance imaging (fMRI) revealed that, in S’s early visual areas, tactile information processing activated what would be the foveal representation for normally-sighted individuals, and visual information processing activated what would be the peripheral representation. Control experiments showed that this activation pattern was not due to visual imagery. S’s high-level visual areas which correspond to shape- and object-selective areas in normally-sighted individuals were activated by both visual and tactile stimuli. The retinotopically specific reorganization in early visual areas suggests an efficient redistribution of neural resources in the visual cortex. PMID:19361999

  12. Which are the best predictors of theory of mind delay in children with specific language impairment?

    PubMed

    Andrés-Roqueta, Clara; Adrian, Juan E; Clemente, Rosa A; Katsos, Napoleon

    2013-01-01

    The relationship between language and theory of mind (ToM) development in participants with specific language impairment (SLI) is far from clear, owing to differences in the design and methodology of previous studies. This research consisted of an in-depth investigation of ToM delay in children with SLI during the typical period of acquisition, and it examined whether linguistic or information-processing variables were the best predictors of this process. It also took into account whether there were differences in ToM competence due to the degree of pragmatic impairment within the SLI group. Thirty-one children with SLI (3;5-7;5 years old) and two control groups (age matched and language matched) were assessed with False Belief (FB) tasks, a wide battery of language measures and additional information-processing measures. The members of the SLI group were less competent than their age-matched peers at solving FB tasks, but they performed similarly to the language-matched group. Regression analysis showed that the overall linguistic skills of children with SLI, and especially their grammatical abilities, were the best predictor of ToM performance. No differences between SLI subgroups were found according to their pragmatic level. A delay in ToM development in children with SLI around the critical period of acquisition is confirmed more comprehensively, and it is shown to be more strongly related to their general linguistic level than to their age and other information-processing faculties. This finding stresses the importance of early educational and clinical programmes aimed at reducing deleterious effects on later development. © 2013 Royal College of Speech and Language Therapists.

  13. User perception and interpretation of tornado probabilistic hazard information: Comparison of four graphical designs.

    PubMed

    Miran, Seyed M; Ling, Chen; James, Joseph J; Gerard, Alan; Rothfusz, Lans

    2017-11-01

    Effective design for presenting severe weather information is important to reduce the devastating consequences of severe weather. The Probabilistic Hazard Information (PHI) system for severe weather is being developed by the NOAA National Severe Storms Laboratory (NSSL) to communicate probabilistic hazardous weather information. This study investigates the effects of four PHI graphical designs for tornado threat, namely "four-color", "red-scale", "grayscale" and "contour", on users' perception, interpretation, and reaction to threat information. PHI is presented on either a map background or a radar background. Analysis showed that accuracy was significantly higher and response time faster when PHI was displayed on the map background as compared to the radar background, due to better contrast. When displayed on a radar background, the "grayscale" design resulted in a higher accuracy of responses. Possibly due to familiarity, participants reported the four-color design as their favorite design, which also resulted in the fastest recognition of probability levels on both backgrounds. Our study shows the importance of using intuitive color-coding and sufficient contrast in conveying probabilistic threat information via graphical design. We also found that users follow a rational perceiving-judging-feeling-acting approach in processing probabilistic hazard information for tornadoes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. An Integrative Literature Review of Organisational Factors Associated with Admission and Discharge Delays in Critical Care

    PubMed Central

    Peltonen, Laura-Maria; McCallum, Louise; Siirala, Eriikka; Haataja, Marjaana; Lundgrén-Laine, Heljä; Salanterä, Sanna; Lin, Frances

    2015-01-01

    The literature shows that delayed admission to the intensive care unit (ICU) and discharge delays from the ICU are associated with increased adverse events and higher costs. Identifying factors related to delays will provide information for practice improvements, which contribute to better patient outcomes. The aim of this integrative review was to explore the incidence of patients' admission and discharge delays in critical care and to identify organisational factors associated with these delays. Seven studies were included. The major findings are as follows: (1) explanatory research about discharge delays is scarce and only one study on admission delays was found, (2) delays are a common problem mostly due to organisational factors, occurring in 38% of admissions and 22–67% of discharges, and (3) redesigning care processes by improving information management and coordination between units and interdisciplinary teams could reduce discharge delays. In conclusion, patient outcomes can be improved through efficient and safe care processes. More exploratory research is needed to identify factors that contribute to admission and discharge delays to provide evidence for clinical practice improvements. Shortening delays requires an interdisciplinary and multifaceted approach to the whole patient flow process. Conclusions should be made with caution due to the limited number of articles included in this review. PMID:26558286

  15. [Imprinting as a mechanism of information memorizing in the adult BALB/c mice].

    PubMed

    Nikol'skaia, K A; Berezhnoĭ, D S

    2011-09-01

    A study of spatial learning in adult BALB/c mice revealed that a short exposure to the environment (from 3 to 8 minutes) could be enough for spatial information to be fixed in long-term memory and to affect the subsequent learning process in a new environment. The control group, learning in the same maze, followed the "shortest path" principle during formation of the optimal food-obtaining habit. Experimental animals, learning in a slightly changed environment, were unable to apply this rule due to persistent coupling of the new spatial information with the old memory traces, which led to constant errors. The effect was observed during the whole learning period and depended neither on the frequency nor on the interval of repetition during the initial information acquisition. The data indicate that memorization in the adult state shares properties with the imprinting process inherent in early ontogeny. Memory fixation at all developmental stages seems to be based on a universal mechanism.

  16. BIM based virtual environment for fire emergency evacuation.

    PubMed

    Wang, Bin; Li, Haijiang; Rezgui, Yacine; Bradley, Alex; Ong, Hoang N

    2014-01-01

    Recent building emergency management research has highlighted the need for the effective utilization of dynamically changing building information. BIM (building information modelling) can play a significant role in this process due to its comprehensive and standardized data format and integrated process. This paper introduces a BIM based virtual environment supported by virtual reality (VR) and a serious game engine to address several key issues for building emergency management, for example, timely two-way information updating and better emergency awareness training. The focus of this paper is on how to utilize BIM as a comprehensive building information provider, working with virtual reality technologies to build an adaptable immersive serious game environment that provides real-time fire evacuation guidance. The innovation lies in the seamless integration between BIM and a serious game based virtual reality (VR) environment, aiming at practical problem solving by leveraging state-of-the-art computing technologies. The system has been tested for its robustness and functionality against the development requirements, and the results showed promising potential to support more effective emergency management.

  17. Fermionic entanglement via quantum walks in quantum dots

    NASA Astrophysics Data System (ADS)

    Melnikov, Alexey A.; Fedichkin, Leonid E.

    2018-02-01

    Quantum walks are fundamentally different from random walks due to the quantum superposition property of quantum objects. The quantum walk process has been found to be very useful for quantum information and quantum computation applications. In this paper we demonstrate how to use quantum walks as a tool to generate high-dimensional two-particle fermionic entanglement. The generated entanglement can survive longer in the presence of depolarizing noise due to the periodicity of quantum walk dynamics. The possibility of creating two distinguishable qudits in a system of tunnel-coupled semiconductor quantum dots is discussed.
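The record concerns continuous-time walks of two fermions in coupled quantum dots; as a simpler, hedged illustration of how a quantum walk differs from a classical random walk, the sketch below simulates a generic single-particle discrete-time coined walk on a ring with NumPy. It is not the two-particle model from the paper; the ring size, step count, and Hadamard coin are illustrative choices.

```python
# Minimal sketch of a discrete-time coined quantum walk on a cycle of N sites.
# A generic single-particle illustration, not the two-fermion continuous-time
# walk in coupled quantum dots discussed in the record.
import numpy as np

N, steps = 31, 40
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard coin

# State indexed as psi[coin, position]; start at the central site with coin |0>.
psi = np.zeros((2, N), dtype=complex)
psi[0, N // 2] = 1.0

for _ in range(steps):
    psi = np.einsum("ij,jp->ip", H, psi)       # coin toss on every site
    up = np.roll(psi[0], 1)                    # coin |0> amplitude moves right
    down = np.roll(psi[1], -1)                 # coin |1> amplitude moves left
    psi = np.stack([up, down])

prob = (np.abs(psi) ** 2).sum(axis=0)          # position distribution
print("total probability:", round(prob.sum(), 6))
print("most likely sites:", np.argsort(prob)[-3:])
```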

  18. Runaway reactions, their courses and the methods to establish safe process conditions

    NASA Astrophysics Data System (ADS)

    Gustin, J. L.

    1991-08-01

    Much of the literature on runaway reactions deals with their consequences, such as mechanical damage and toxic and flammable releases. The DIERS literature provides effective methods for vent sizing, for which experimental information is required. Thermal stability measurements provide information on the onset temperature and kinetic data for chemical reactions. There is less information on the ways in which runaway reactions occur, even though they may have different causes. The purpose of this paper is to describe the various process deviations which can cause a runaway reaction to occur and to discuss the experimental information necessary for risk assessment, the choice of a safe process and the mitigation of the consequences of the runaway reaction. Each possible hazardous process deviation is illustrated by examples from the process industry and/or relevant experimental information obtained from laboratory experiments. The typical hazardous situations to be considered are the following: 1) The homogeneous thermal runaway due to too high a temperature. 2) The homogeneous runaway reaction by unintended introduction of additional reactants or catalyst. 3) The heterogeneous runaway reaction due to too high a local temperature. 4) The heterogeneous runaway reaction caused by slow heat conduction to the outside. 5) The runaway reaction caused by excess residence time at the process temperature (autocatalytic reactions). 6) The runaway reaction caused by reactant accumulation, where the feed rate of the controlling reactant is higher than its consumption rate, for example because the temperature is too low or the catalyst is absent. 7) The runaway reaction due to the pressurization of the enclosure by gaseous oxidizing intermediates (typical of nitric oxidations). 8) The runaway reaction due to phase separation of unstable species (liquids, solids) by loss of mixing or on cooling. 9) The runaway reaction on mixing of fast-reacting chemicals in separate phases. 10) The runaway reaction due to fire or external heating. Considering the various runaway situations, the effectiveness of the following approaches is discussed: - Theoretical and experimental information required for hazard assessment. - Choice of adequate process conditions. - Choice of adequate methods for process control. - Experimental information required for vent sizing.
    [French abstract, translated:] Most of the literature on thermal runaway reactions deals with the consequences of the accident, such as mechanical effects and toxic and flammable emissions. The work published by DIERS provides methods for vent sizing that require experimental determinations. There is less information on how thermal runaways can occur, even though they may have different causes. The purpose of this article is to describe the various process deviations that can lead to a thermal runaway and to determine the experimental information needed for process risk analysis, the choice of safe operating conditions and the reduction of the consequences of the runaway. Each hazardous process deviation is illustrated by examples known in the chemical industry and by experimental data obtained in laboratory tests. The hazardous process conditions considered are the following: 1) Homogeneous thermal runaway due to an excessive temperature; 2) Homogeneous thermal runaway through introduction of a catalyst or of a controlling reactant; 3) Heterogeneous thermal runaway due to an excessive local temperature; 4) Heterogeneous thermal runaway due to poor heat conduction to the outside; 5) Thermal runaway due to an excessive residence time at the process temperature (autocatalytic reactions); 6) Thermal runaway through accumulation of reactants, where the feed rate of a controlling reactant exceeds its consumption rate because the temperature is too low or the catalyst is absent; 7) Thermal runaway due to the pressurization of an enclosure by gaseous oxidizing intermediates (a situation characteristic of nitric oxidations); 8) Thermal runaway due to the separation of phases containing unstable species (liquids, solids) through loss of agitation or on cooling; 9) Thermal runaway through mixing of incompatible products previously held in separate phases; 10) Thermal runaway due to external heating or fire. Considering the various situations leading to a thermal runaway, the value of the following systematic approach is examined: - Theoretical and experimental information needed to determine the process risks. - Choice of adequate operating conditions. - Choice of suitable methods for process control. - Experimental information needed for vent sizing.

  19. Improved Performance and Safety for High Energy Batteries Through Use of Hazard Anticipation and Capacity Prediction

    NASA Technical Reports Server (NTRS)

    Atwater, Terrill

    1993-01-01

    Prediction of the capacity remaining in used high-rate, high-energy batteries provides important information to the user. Knowledge of the capacity remaining in used batteries results in better utilization. This translates into improved readiness and cost savings due to complete, efficient use. High-rate batteries, due to their chemical nature, are highly sensitive to misuse (i.e., over-discharge or very high rate discharge). Battery failure due to misuse or manufacturing defects could be disastrous. Since high-rate, high-energy batteries are expensive and energetic, a reliable method of predicting both failures and remaining energy has been actively sought. Due to concerns over safety, the behavior of lithium/sulphur dioxide cells at different temperatures and current drains was examined. The main thrust of this effort was to determine failure conditions for incorporation in hazard anticipation circuitry. In addition, capacity prediction formulas have been developed from test data. A process that performs continuous, real-time hazard anticipation and capacity prediction was developed. The introduction of this process into microchip technology will enable the production of reliable, safe, and efficient high-energy batteries.

  20. Object Categorization in Finer Levels Relies More on Higher Spatial Frequencies and Takes Longer.

    PubMed

    Ashtiani, Matin N; Kheradpisheh, Saeed R; Masquelier, Timothée; Ganjtabesh, Mohammad

    2017-01-01

    The human visual system contains a hierarchical sequence of modules that take part in visual perception at different levels of abstraction, i.e., superordinate, basic, and subordinate levels. One important question is to identify the "entry" level at which the visual representation is commenced in the process of object recognition. For a long time, it was believed that the basic level had a temporal advantage over two others. This claim has been challenged recently. Here we used a series of psychophysics experiments, based on a rapid presentation paradigm, as well as two computational models, with bandpass filtered images of five object classes to study the processing order of the categorization levels. In these experiments, we investigated the type of visual information required for categorizing objects in each level by varying the spatial frequency bands of the input image. The results of our psychophysics experiments and computational models are consistent. They indicate that the different spatial frequency information had different effects on object categorization in each level. In the absence of high frequency information, subordinate and basic level categorization are performed less accurately, while the superordinate level is performed well. This means that low frequency information is sufficient for superordinate level, but not for the basic and subordinate levels. These finer levels rely more on high frequency information, which appears to take longer to be processed, leading to longer reaction times. Finally, to avoid the ceiling effect, we evaluated the robustness of the results by adding different amounts of noise to the input images and repeating the experiments. As expected, the categorization accuracy decreased and the reaction time increased significantly, but the trends were the same. This shows that our results are not due to a ceiling effect. The compatibility between our psychophysical and computational results suggests that the temporal advantage of the superordinate (resp. basic) level to basic (resp. subordinate) level is mainly due to the computational constraints (the visual system processes higher spatial frequencies more slowly, and categorization in finer levels depends more on these higher spatial frequencies).
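The experiments vary the spatial-frequency content of the stimulus images; as a hedged illustration of how such band-limited stimuli can be produced, the sketch below applies an FFT-based band-pass mask to a grayscale image with NumPy. The cutoff values (in cycles per image) and the random test image are placeholders, not the study's actual bands or stimuli.

```python
# Minimal sketch: isolating a spatial-frequency band of a grayscale image with an
# FFT mask, in the spirit of the bandpass-filtered stimuli described in the record.
import numpy as np

def bandpass(image: np.ndarray, low_cpi: float, high_cpi: float) -> np.ndarray:
    """Keep only spatial frequencies between low_cpi and high_cpi (cycles/image)."""
    f = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    yy, xx = np.mgrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
    radius = np.hypot(yy, xx)                      # radial frequency in cycles/image
    mask = (radius >= low_cpi) & (radius <= high_cpi)
    return np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.standard_normal((128, 128))          # stand-in for an object image
    low_sf = bandpass(img, 0, 8)                   # coarse structure only
    high_sf = bandpass(img, 32, 64)                # fine detail only
    print(low_sf.std(), high_sf.std())
```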

  1. Object Categorization in Finer Levels Relies More on Higher Spatial Frequencies and Takes Longer

    PubMed Central

    Ashtiani, Matin N.; Kheradpisheh, Saeed R.; Masquelier, Timothée; Ganjtabesh, Mohammad

    2017-01-01

    The human visual system contains a hierarchical sequence of modules that take part in visual perception at different levels of abstraction, i.e., superordinate, basic, and subordinate levels. One important question is to identify the “entry” level at which the visual representation is commenced in the process of object recognition. For a long time, it was believed that the basic level had a temporal advantage over two others. This claim has been challenged recently. Here we used a series of psychophysics experiments, based on a rapid presentation paradigm, as well as two computational models, with bandpass filtered images of five object classes to study the processing order of the categorization levels. In these experiments, we investigated the type of visual information required for categorizing objects in each level by varying the spatial frequency bands of the input image. The results of our psychophysics experiments and computational models are consistent. They indicate that the different spatial frequency information had different effects on object categorization in each level. In the absence of high frequency information, subordinate and basic level categorization are performed less accurately, while the superordinate level is performed well. This means that low frequency information is sufficient for superordinate level, but not for the basic and subordinate levels. These finer levels rely more on high frequency information, which appears to take longer to be processed, leading to longer reaction times. Finally, to avoid the ceiling effect, we evaluated the robustness of the results by adding different amounts of noise to the input images and repeating the experiments. As expected, the categorization accuracy decreased and the reaction time increased significantly, but the trends were the same. This shows that our results are not due to a ceiling effect. The compatibility between our psychophysical and computational results suggests that the temporal advantage of the superordinate (resp. basic) level to basic (resp. subordinate) level is mainly due to the computational constraints (the visual system processes higher spatial frequencies more slowly, and categorization in finer levels depends more on these higher spatial frequencies). PMID:28790954

  2. FIBRE AND INTEGRATED OPTICS. OPTICAL PROCESSING OF INFORMATION: Method for optical data processing based on a two-pulse photon echo

    NASA Astrophysics Data System (ADS)

    Zakharov, S. M.; Manykin, Eduard A.

    1995-02-01

    The principles of optical processing based on the dynamic spatio-temporal properties of two-pulse photon echo signals are considered. The properties of a resonant medium as an on-line filter of temporal and spatial frequencies are discussed. These properties are due to the sensitivity of such a medium to the Fourier spectrum of the second exciting pulse. The degeneracy of quantum resonant systems, demonstrated by the dependence of the coherent response on the square of the amplitude of the second pulse, can be used for 'simultaneous' correlation processing of optical 'signals'. Various methods for the processing of the Fourier optical image are discussed.

  3. A Rural Community's Involvement in the Design and Usability Testing of a Computer-Based Informed Consent Process for the Personalized Medicine Research Project

    PubMed Central

    Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A

    2014-01-01

    Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research in multimedia consent is mixed in terms of success for improving participants’ understanding, satisfaction, and retention. This failure may be due to a lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to easily complete the consent, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, such as genetic clinical trials consents. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants. The next step will involve contacting participants to measure whether understanding of what they consented to is retained over time. PMID:24273095

  4. A rural community's involvement in the design and usability testing of a computer-based informed consent process for the Personalized Medicine Research Project.

    PubMed

    Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A

    2014-01-01

    Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research in multimedia consent is mixed in terms of success for improving participants' understanding, satisfaction, and retention. This failure may be due to a lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to easily complete the consent, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, including those for trials involving treatment of genetic disorders. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants. The next step will involve contacting participants to measure whether understanding of what they consented to is retained over time. © 2013 Wiley Periodicals, Inc.

  5. Neuroergonomics: Quantitative Modeling of Individual, Shared, and Team Neurodynamic Information.

    PubMed

    Stevens, Ronald H; Galloway, Trysha L; Willemsen-Dunlap, Ann

    2018-06-01

    The aim of this study was to use the same quantitative measure and scale to directly compare the neurodynamic information/organizations of individual team members with those of the team. Team processes are difficult to separate from those of individual team members due to the lack of quantitative measures that can be applied to both process sets. Second-by-second symbolic representations were created of each team member's electroencephalographic power, and quantitative estimates of their neurodynamic organizations were calculated from the Shannon entropy of the symbolic data streams. The information in the neurodynamic data streams of health care (n = 24), submarine navigation (n = 12), and high school problem-solving (n = 13) dyads was separated into the information of each team member, the information shared by team members, and the overall team information. Most of the team information was the sum of each individual's neurodynamic information. The remaining team information was shared among the team members. This shared information averaged ~15% of the individual information, with momentary levels of 1% to 80%. Continuous quantitative estimates can be made from the shared, individual, and team neurodynamic information about the contributions of different team members to the overall neurodynamic organization of a team and the neurodynamic interdependencies among the team members. Information models provide a generalizable quantitative method for separating a team's neurodynamic organization into that of individual team members and that shared among team members.
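The decomposition described rests on the Shannon entropy of symbolic streams and the information shared between them. The sketch below shows, for two synthetic symbol streams standing in for two team members, how the individual entropies and the shared (mutual) information I(A;B) = H(A) + H(B) - H(A,B) can be estimated; the symbol alphabet and the coupling between streams are invented for illustration, not the study's EEG-derived symbols.

```python
# Minimal sketch: Shannon entropy of two symbolic data streams and the information
# they share (mutual information), echoing the individual/shared decomposition in
# the record. The symbol streams are synthetic stand-ins for EEG-derived symbols.
import numpy as np
from collections import Counter

def entropy(symbols) -> float:
    counts = np.array(list(Counter(symbols).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def shared_information(a, b) -> float:
    """Mutual information I(A;B) = H(A) + H(B) - H(A,B), in bits."""
    joint = list(zip(a, b))
    return entropy(a) + entropy(b) - entropy(joint)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    member_a = rng.integers(0, 3, size=600)                     # 3-symbol stream, member A
    member_b = (member_a + rng.integers(0, 2, size=600)) % 3    # partly coupled to A
    print("H(A) bits/sample:", round(entropy(member_a), 3))
    print("H(B) bits/sample:", round(entropy(member_b), 3))
    print("shared I(A;B):   ", round(shared_information(member_a, member_b), 3))
```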

  6. Reference software implementation for GIFTS ground data processing

    NASA Astrophysics Data System (ADS)

    Garcia, R. K.; Howell, H. B.; Knuteson, R. O.; Martin, G. D.; Olson, E. R.; Smuga-Otto, M. J.

    2006-08-01

    Future satellite weather instruments such as high spectral resolution imaging interferometers pose a challenge to the atmospheric science and software development communities due to the immense data volumes they will generate. An open-source, scalable reference software implementation demonstrating the calibration of radiance products from an imaging interferometer, the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS), is presented. This paper covers essential design principles laid out in summary system diagrams, lessons learned during implementation and preliminary test results from the GIFTS Information Processing System (GIPS) prototype.

  7. Musical ability and non-native speech-sound processing are linked through sensitivity to pitch and spectral information.

    PubMed

    Kempe, Vera; Bublitz, Dennis; Brooks, Patricia J

    2015-05-01

    Is the observed link between musical ability and non-native speech-sound processing due to enhanced sensitivity to acoustic features underlying both musical and linguistic processing? To address this question, native English speakers (N = 118) discriminated Norwegian tonal contrasts and Norwegian vowels. Short tones differing in temporal, pitch, and spectral characteristics were used to measure sensitivity to the various acoustic features implicated in musical and speech processing. Musical ability was measured using Gordon's Advanced Measures of Musical Audiation. Results showed that sensitivity to specific acoustic features played a role in non-native speech-sound processing: Controlling for non-verbal intelligence, prior foreign language-learning experience, and sex, sensitivity to pitch and spectral information partially mediated the link between musical ability and discrimination of non-native vowels and lexical tones. The findings suggest that while sensitivity to certain acoustic features partially mediates the relationship between musical ability and non-native speech-sound processing, complex tests of musical ability also tap into other shared mechanisms. © 2014 The British Psychological Society.

  8. Review of chart recognition in document images

    NASA Astrophysics Data System (ADS)

    Liu, Yan; Lu, Xiaoqing; Qin, Yeyang; Tang, Zhi; Xu, Jianbo

    2013-01-01

    As an effective way of transmitting information, charts are widely used to represent scientific statistics in books, research papers, newspapers, etc. Though textual information is still the major source of data, there has been an increasing trend of introducing graphs, pictures, and figures into the information pool. Text recognition in documents has been accomplished using optical character recognition (OCR) software. Chart recognition techniques, a necessary supplement to OCR for document images, remain an unsolved problem due to the great subjectivity and variety of chart styles. This paper reviews the development of chart recognition techniques over the past decades and presents the focus of current research. The whole process of chart recognition is presented systematically and mainly includes three parts: chart segmentation, chart classification, and chart interpretation. In each part, the latest research work is introduced. Finally, the paper concludes with a summary and promising future research directions.

  9. Improving the delivery of care and reducing healthcare costs with the digitization of information.

    PubMed

    Noffsinger, R; Chin, S

    2000-01-01

    In the coming years, the digitization of information and the Internet will be extremely powerful in reducing healthcare costs while assisting providers in the delivery of care. One example of healthcare inefficiency that can be managed through information digitization is the process of prescription writing. Due to the handwritten and verbal communication surrounding prescription writing, as well as the multiple tiers of authorizations, the prescription drug process causes extensive financial waste as well as medical errors, lost time, and even fatal accidents. Electronic prescription management systems are being designed to address these inefficiencies. By utilizing new electronic prescription systems, physicians not only prescribe more accurately, but also improve formulary compliance thereby reducing pharmacy utilization. These systems expand patient care by presenting proactive alternatives at the point of prescription while reducing costs and providing additional benefits for consumers and healthcare providers.

  10. Tell me more: Can a memory test reduce analogue traumatic intrusions?

    PubMed

    Krans, Julie; Näring, Gérard; Holmes, Emily A; Becker, Eni S

    2009-05-01

    Information processing theories of post-traumatic stress disorder (PTSD) state that intrusive images emerge due to a lack of integration of perceptual trauma representations in autobiographical memory. To test this hypothesis experimentally, participants were shown an aversive film to elicit intrusive images. After viewing, they received a recognition test for just one part of the film. The test contained neutrally formulated items to rehearse information from the film. Participants reported intrusive images for the film in an intrusion diary during one week after viewing. In line with expectations, the number of intrusive images decreased only for the part of the film for which the recognition test was given. Furthermore, deliberate cued-recall memory after one week was selectively enhanced for the film part that was in the recognition test a week before. The findings provide new evidence supporting information processing models of PTSD and have potential implications for early interventions after trauma.

  11. Method of developing all-optical trinary JK, D-type, and T-type flip-flops using semiconductor optical amplifiers.

    PubMed

    Garai, Sisir Kumar

    2012-04-10

    To meet the demand of very fast and agile optical networks, the optical processors in a network system should have a very fast execution rate, large information handling, and large information storage capacities. Multivalued logic operations and multistate optical flip-flops are the basic building blocks for such fast running optical computing and data processing systems. In the past two decades, many methods of implementing all-optical flip-flops have been proposed. Most of these suffer from speed limitations because of the low switching response of active devices. The frequency encoding technique has been used because of its many advantages. It can preserve its identity throughout data communication irrespective of loss of light energy due to reflection, refraction, attenuation, etc. The action of polarization-rotation-based very fast switching of semiconductor optical amplifiers increases processing speed. At the same time, tristate optical flip-flops increase information handling capacity.

  12. Interoperability Matter: Levels of Data Sharing, Starting from a 3d Information Modelling

    NASA Astrophysics Data System (ADS)

    Tommasi, C.; Achille, C.

    2017-02-01

    Nowadays, the adoption of BIM processes in the AEC (Architecture, Engineering and Construction) industry means being oriented towards synergistic workflows based on informative instruments capable of realizing a virtual model of the building. The aim of this article is to address the question of interoperability, approaching the subject through a theoretical part and a practical example in order to show how these notions apply in real situations. In particular, the case study analysed belongs to the Cultural Heritage field, where difficulties can arise in both the modelling and sharing phases due to the complexity of shapes and elements. Focusing on the interoperability between different software packages, the questions are: What kinds of information, and how much, can be shared? Given that this process also leads to a standardization of the modelled parts, is there a risk of losing accuracy?

  13. Health Information Economy: Literature Review

    PubMed Central

    Ebrahimi, Kamal; Roudbari, Masoud; Sadoughi, Farahnaz

    2015-01-01

    Introduction: The Health Information Economy (HIE) is one of the broader, more complex, and more challenging yet important topics in the field of health science, and it requires the identification of its dimensions for planning and policy making. The aim of this study was to determine the dimensions of the HIE concept. Methods: This paper presents a systematic methodology for analyzing trends in HIE. For this purpose, the main keywords of this area were identified and searched in databases, and from among 4775 retrieved sources, 12 sources in the field of HIE were studied. Results: The Information Economy (IE) worldwide has passed through four paradigms: the information evaluation perspective, the information technology perspective, the asymmetric information perspective and the information value perspective. In this research, the fourth perspective on HIE was analyzed. The main findings were categorized into three major groups: the flow of the information process in the field of health (production, collection, processing and dissemination), information applications in the same field (education, research, the health industry, policy, legislation, and decision-making) and the underlying fields. Conclusion: According to the findings, HIE presents a theoretical and conceptual gap that, due to its importance, will be one of the research approaches to health science in the next decade. PMID:26153182

  14. From open source communications to knowledge

    NASA Astrophysics Data System (ADS)

    Preece, Alun; Roberts, Colin; Rogers, David; Webberley, Will; Innes, Martin; Braines, Dave

    2016-05-01

    Rapid processing and exploitation of open source information, including social media sources, in order to shorten decision-making cycles, has emerged as an important issue in intelligence analysis in recent years. Through a series of case studies and natural experiments, focussed primarily upon policing and counter-terrorism scenarios, we have developed an approach to information foraging and framing to inform decision making, drawing upon open source intelligence, in particular Twitter, due to its real-time focus and frequent use as a carrier for links to other media. Our work uses a combination of natural language (NL) and controlled natural language (CNL) processing to support information collection from human sensors, linking and schematising of collected information, and the framing of situational pictures. We illustrate the approach through a series of vignettes, highlighting (1) how relatively lightweight and reusable knowledge models (schemas) can rapidly be developed to add context to collected social media data; (2) how information from open sources can be combined with reports from trusted observers, for corroboration or to identify conflicting information; and (3) how the approach supports users operating at or near the tactical edge, to rapidly task information collection and inform decision-making. The approach is supported by bespoke software tools for social media analytics and knowledge management.

  15. Real-time physiological monitoring with distributed networks of sensors and object-oriented programming techniques

    NASA Astrophysics Data System (ADS)

    Wiesmann, William P.; Pranger, L. Alex; Bogucki, Mary S.

    1998-05-01

    Remote monitoring of physiologic data from individual high-risk workers distributed over time and space is a considerable challenge. This is often due to an inadequate capability to accurately integrate large amounts of data into usable information in real time. In this report, we have used the vertical and horizontal organization of the 'fireground' as a framework to design a distributed network of sensors. In this system, sensor output is linked through a hierarchical object-oriented programming process to accurately interpret physiological data, incorporate these data into a synchronous model and relay processed data, trends and predictions to members of the fire incident command structure. There are several unique aspects to this approach. The first includes a process to account for variability in vital parameter values for each individual's normal physiologic response by including an adaptive network in each data process. This information is used by the model in an iterative process to baseline a 'normal' physiologic response to a given stress for each individual and to detect deviations that indicate dysfunction or a significant insult. The second unique capability of the system orders the information for each user, including the subject, local company officers, medical personnel and the incident commanders. Information can be retrieved and used for training exercises and after-action analysis. Finally, this system can easily be adapted to existing communication and processing links while incorporating the best parts of current models through the use of object-oriented programming techniques. These modern software techniques are well suited to handling multiple data processes independently over time in a distributed network.
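The record's "adaptive network" that baselines each individual's normal response is not specified in detail; the sketch below illustrates the general idea with an exponentially weighted running mean and variance per subject and a z-score flag for deviations. The smoothing factor, warm-up length, and threshold are hypothetical choices, not the fireground system's parameters.

```python
# Minimal sketch of per-individual adaptive baselining: an exponentially weighted
# running mean/variance per subject, flagging readings that fall far from that
# subject's own baseline. Parameter values are illustrative only.

class AdaptiveBaseline:
    """Per-subject running baseline with exponentially weighted mean/variance."""

    def __init__(self, alpha: float = 0.05, z_threshold: float = 4.0, warmup: int = 30):
        self.alpha = alpha                # smoothing factor (hypothetical choice)
        self.z_threshold = z_threshold    # deviation flag threshold (hypothetical)
        self.warmup = warmup              # samples to observe before flagging starts
        self.n = 0
        self.mean = 0.0
        self.var = 0.0

    def update(self, value: float) -> bool:
        """Fold a new reading into the baseline; return True if it looks anomalous."""
        self.n += 1
        if self.n == 1:
            self.mean = value
            return False
        deviation = value - self.mean
        z = abs(deviation) / (self.var ** 0.5) if self.var > 0 else 0.0
        # Update the running statistics so the baseline keeps adapting.
        self.mean += self.alpha * deviation
        self.var = (1 - self.alpha) * (self.var + self.alpha * deviation ** 2)
        return self.n > self.warmup and z > self.z_threshold


if __name__ == "__main__":
    import random
    random.seed(0)
    monitor = AdaptiveBaseline()
    readings = [72 + random.gauss(0, 2) for _ in range(200)] + [140.0]  # heart-rate-like
    flagged = [i for i, r in enumerate(readings) if monitor.update(r)]
    print("anomalous readings at indices:", flagged)
```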

  16. Intrusion errors in visuospatial working memory performance.

    PubMed

    Cornoldi, Cesare; Mammarella, Nicola

    2006-02-01

    This study tested the hypothesis that failure in active visuospatial working memory tasks involves a difficulty in avoiding intrusions due to information that is already activated. Two experiments are described, in which participants were required to process several series of locations on a 4 x 4 matrix and then to produce only the final location of each series. Results revealed a higher number of errors due to already activated locations (intrusions) compared with errors due to new locations (inventions). Moreover, when participants were required to pay extra attention to some irrelevant (non-final) locations by tapping on the table, intrusion errors increased. Results are discussed in terms of current models of working memory functioning.

  17. Enhanced dimension-specific visual working memory in grapheme-color synesthesia.

    PubMed

    Terhune, Devin Blair; Wudarczyk, Olga Anna; Kochuparampil, Priya; Cohen Kadosh, Roi

    2013-10-01

    There is emerging evidence that the encoding of visual information and the maintenance of this information in a temporarily accessible state in working memory rely on the same neural mechanisms. A consequence of this overlap is that atypical forms of perception should influence working memory. We examined this by investigating whether having grapheme-color synesthesia, a condition characterized by the involuntary experience of color photisms when reading or representing graphemes, would confer benefits on working memory. Two competing hypotheses propose that superior memory in synesthesia results from information being coded in two information channels (dual-coding) or from superior dimension-specific visual processing (enhanced processing). We discriminated between these hypotheses in three n-back experiments in which controls and synesthetes viewed inducer and non-inducer graphemes and maintained color or grapheme information in working memory. Synesthetes displayed superior color working memory than controls for both grapheme types, whereas the two groups did not differ in grapheme working memory. Further analyses excluded the possibilities of enhanced working memory among synesthetes being due to greater color discrimination, stimulus color familiarity, or bidirectionality. These results reveal enhanced dimension-specific visual working memory in this population and supply further evidence for a close relationship between sensory processing and the maintenance of sensory information in working memory. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.

  18. A fuzzy-theory-based method for studying the effect of information transmission on nonlinear crowd dispersion dynamics

    NASA Astrophysics Data System (ADS)

    Fu, Libi; Song, Weiguo; Lo, Siuming

    2017-01-01

    Emergencies involved in mass events are related to a variety of factors and processes. An important factor is the transmission of information on danger, which influences nonlinear crowd dynamics during the process of crowd dispersion. Due to the considerable uncertainty in this process, there is an urgent need for a method to investigate this influence. In this paper, a novel fuzzy-theory-based method is presented to study crowd dynamics under the influence of information transmission. Fuzzy functions and rules are designed for the ambiguous description of human states. Reasonable inference is employed to decide the output values of decision making, such as pedestrian movement speeds and directions. Through simulation of four-way pedestrian situations, good crowd dispersion phenomena are achieved. Simulation results under different conditions demonstrate that information transmission cannot always induce successful crowd dispersion in all situations; this depends on whether decision strategies in response to information on danger are unified and effective, especially in dense crowds. Results also suggest that increasing the drift strength at low density, and the percentage of pedestrians who choose one of the unoccupied Von Neumann neighbors furthest from the dangerous source as the drift direction at high density, helps crowd dispersion. Compared with previous work, our comprehensive study improves the in-depth understanding of nonlinear crowd dynamics under the effect of information on danger.
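The paper's fuzzy functions and rules are not reproduced in the record, so the sketch below is only a generic Mamdani-style illustration: triangular memberships for danger level and distance to the hazard, two rules, and centroid defuzzification yielding a movement-speed factor. All membership shapes, ranges, and rules are invented for the example.

```python
# Minimal sketch of the fuzzy-inference idea: triangular memberships, two rules,
# and centroid defuzzification producing a movement-speed factor. Hypothetical
# illustration only, not the paper's model.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def speed_factor(danger: float, distance: float) -> float:
    # Fuzzify inputs (danger in [0,1], distance in metres, output speed factor in [0,1]).
    danger_high = tri(danger, 0.5, 1.0, 1.5)
    danger_low = tri(danger, -0.5, 0.0, 0.5)
    dist_near = tri(distance, -5, 0, 10)
    dist_far = tri(distance, 5, 20, 35)

    speed = np.linspace(0.0, 1.0, 101)            # output universe of discourse
    fast = tri(speed, 0.5, 1.0, 1.5)
    slow = tri(speed, -0.5, 0.0, 0.5)

    # Rule 1: high danger AND near -> move fast.  Rule 2: low danger OR far -> slow.
    r1 = np.minimum(min(danger_high, dist_near), fast)
    r2 = np.minimum(max(danger_low, dist_far), slow)
    aggregated = np.maximum(r1, r2)
    if aggregated.sum() == 0:
        return 0.5                                 # no rule fired; neutral default
    return float((speed * aggregated).sum() / aggregated.sum())  # centroid

if __name__ == "__main__":
    print("near a severe hazard:", round(speed_factor(0.9, 2.0), 2))
    print("far from mild hazard:", round(speed_factor(0.1, 30.0), 2))
```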

  19. Improving financial performance by modeling and analysis of radiology procedure scheduling at a large community hospital.

    PubMed

    Lu, Lingbo; Li, Jingshan; Gisler, Paula

    2011-06-01

    Radiology tests, such as MRI, CT-scan, X-ray and ultrasound, are cost intensive, and insurance pre-approvals are necessary to obtain reimbursement. In some cases, tests may be denied payment by insurance companies due to a lack of pre-approval or to inaccurate or missing information. This can lead to substantial revenue losses for the hospital. In this paper, we present a simulation study of a centralized scheduling process for outpatient radiology tests at a large community hospital (Central Baptist Hospital in Lexington, Kentucky). Based on analysis of the central scheduling process, a simulation model of information flow in the process has been developed. Using such a model, the root causes of financial losses associated with errors and omissions in this process were identified and analyzed, and their impacts were quantified. In addition, "what-if" analysis was conducted to identify potential process improvement strategies in the form of recommendations to the hospital leadership. Such a model provides a quantitative tool for continuous improvement and process control in the radiology outpatient test scheduling process to reduce financial losses associated with process errors. This method of analysis is also applicable to other departments in the hospital.
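The record describes a simulation model of the information flow in central scheduling; as a hedged illustration of the approach (not the hospital's actual model), the sketch below uses the third-party SimPy package to model a scheduling desk where a fraction of requests arrive with missing pre-approval information and require rework. Arrival rates, service times, and the error rate are invented for the example.

```python
# Minimal sketch (assumes the third-party `simpy` package): a toy discrete-event
# model of an outpatient scheduling desk with occasional information-related rework.
# All rates and times are invented; this is not the hospital's actual model.
import random
import simpy

ERROR_RATE = 0.15          # share of requests with missing/incorrect information
rework_minutes = []

def handle_request(env, schedulers):
    with schedulers.request() as slot:
        yield slot
        yield env.timeout(random.expovariate(1 / 8))        # ~8 min to schedule a test
        if random.random() < ERROR_RATE:
            start = env.now
            yield env.timeout(random.expovariate(1 / 25))   # chase missing pre-approval info
            rework_minutes.append(env.now - start)

def arrivals(env, schedulers):
    while True:
        yield env.timeout(random.expovariate(1 / 6))        # a request every ~6 min on average
        env.process(handle_request(env, schedulers))

random.seed(42)
env = simpy.Environment()
schedulers = simpy.Resource(env, capacity=2)
env.process(arrivals(env, schedulers))
env.run(until=8 * 60)                                       # one 8-hour day
print(f"requests needing rework: {len(rework_minutes)}")
print(f"total rework minutes:    {sum(rework_minutes):.0f}")
```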

  20. Improving medical stores management through automation and effective communication.

    PubMed

    Kumar, Ashok; Cariappa, M P; Marwaha, Vishal; Sharma, Mukti; Arora, Manu

    2016-01-01

    Medical stores management in hospitals is a tedious and time consuming chore, with limited resources tasked for the purpose and poor penetration of Information Technology. The process of automation is slow paced due to various inherent factors and is being challenged by increasing inventory loads and escalating budgets for the procurement of drugs. We carried out an in-depth case study at the Medical Stores of a tertiary care health care facility. An iterative six-step Quality Improvement (QI) process was implemented based on the Plan-Do-Study-Act (PDSA) cycle. The QI process was modified as per requirement to fit the medical stores management model. The results were evaluated after six months. After the implementation of the QI process, 55 drugs of the medical store inventory which had expired since 2009 onwards were replaced with fresh stock by the suppliers as a result of effective communication through upgraded database management. Various pending audit objections were dropped due to the streamlined documentation and processes. Inventory management improved drastically due to automation, with disposal orders being initiated four months prior to the expiry of drugs and correct demands being generated two months prior to depletion of stocks. The monthly expense summary of drugs was now being done within ten days of the closing month. Improving communication systems within the hospital with vendor database management and reaching out to clinicians is important. Automation of inventory management needs to be simple and user-friendly, utilizing existing hardware. Physical stores monitoring is indispensable, especially due to the scattered nature of the stores. Staff training and standardized documentation protocols are the other keystones of optimal medical store management.

  1. A novel speech processing algorithm based on harmonicity cues in cochlear implant

    NASA Astrophysics Data System (ADS)

    Wang, Jian; Chen, Yousheng; Zhang, Zongping; Chen, Yan; Zhang, Weifeng

    2017-08-01

    This paper proposes a novel speech processing algorithm for cochlear implants that uses harmonicity cues to enhance tonal information in Mandarin Chinese speech recognition. The input speech was filtered by a 4-channel band-pass filter bank. The frequency ranges for the four bands were: 300-621, 621-1285, 1285-2657, and 2657-5499 Hz. In each pass band, temporal envelope and periodicity cues (TEPCs) below 400 Hz were extracted by full-wave rectification and low-pass filtering. The TEPCs were modulated by a sinusoidal carrier whose frequency was the harmonic of the fundamental frequency (F0) closest to the center frequency of each band. Signals from each band were combined to obtain the output speech. Mandarin tone, word, and sentence recognition in quiet listening conditions were tested for the extensively used continuous interleaved sampling (CIS) strategy and the novel F0-harmonic algorithm. Results showed that the F0-harmonic algorithm performed consistently better than the CIS strategy in Mandarin tone, word, and sentence recognition. In addition, sentence recognition rates were higher than word recognition rates, as a result of contextual information in the sentence. Moreover, tones 3 and 4 were recognized better than tones 1 and 2, due to the more easily identified features of the former. In conclusion, the F0-harmonic algorithm can enhance tonal information in cochlear implant speech processing through the use of harmonicity cues, thereby improving Mandarin tone, word, and sentence recognition. Further study will focus on testing the F0-harmonic algorithm in noisy listening conditions.
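A minimal sketch of the processing chain the abstract describes, assuming SciPy and a fixed F0 purely for illustration (a real implementation would track F0 from the input speech): band-pass filtering into the four stated bands, envelope extraction by full-wave rectification and low-pass filtering below 400 Hz, and remodulation of each envelope onto the F0 harmonic nearest the band centre. This is not the authors' exact implementation.

```python
# Minimal sketch of the described F0-harmonic chain: 4-channel band-pass filter
# bank, envelope extraction (<400 Hz), and remodulation onto the F0 harmonic
# closest to each band centre. A fixed F0 is assumed purely for illustration.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 16000
BANDS = [(300, 621), (621, 1285), (1285, 2657), (2657, 5499)]   # Hz, from the record

def bandpass(x, lo, hi, fs=FS, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def lowpass(x, cutoff=400, fs=FS, order=4):
    b, a = butter(order, cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, x)

def f0_harmonic_vocoder(speech, f0=150.0, fs=FS):
    t = np.arange(len(speech)) / fs
    out = np.zeros_like(speech)
    for lo, hi in BANDS:
        env = lowpass(np.abs(bandpass(speech, lo, hi)))         # TEPCs below 400 Hz
        centre = (lo + hi) / 2
        harmonic = max(1, round(centre / f0)) * f0              # harmonic nearest the centre
        out += env * np.sin(2 * np.pi * harmonic * t)           # remodulate the band
    return out

if __name__ == "__main__":
    t = np.arange(0, 0.5, 1 / FS)
    demo = np.sin(2 * np.pi * 150 * t) + 0.5 * np.sin(2 * np.pi * 900 * t)
    processed = f0_harmonic_vocoder(demo)
    print(processed.shape, float(np.max(np.abs(processed))))
```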

  2. Evolutionary and Cognitive Motivations for Fractal Art in Art and Design Education

    ERIC Educational Resources Information Center

    Joye, Yannick

    2005-01-01

    Humans are endowed with cognitive modules specialised in processing information about the class of natural things. Due to their naturalness, fractal art and design can contribute to developing these modules, and trigger affective responses that are associated with certain natural objects. It is argued that exposure to fractals in an art and design…

  3. FIRE's Guide to Due Process and Fair Procedure on Campus. FIRE's Guides to Student Rights on Campus

    ERIC Educational Resources Information Center

    Silverglate, Harvey A.; Gewolb, Josh

    2003-01-01

    Students should know their rights and liberties, and they need to be better informed and better equipped about how to assert and defend these precious things. The protectors of students' rights and liberties--those faculty, administrators, parents, alumni, friends, citizens, advisors, and attorneys who care about such vital matters--should…

  4. Contractor Past Performance Information (PPI) in Source Selection: A Comparison Study of Public and Private Sector

    DTIC Science & Technology

    2005-05-01

    efficiencies similar to those in the private sector. However, along the way, Government and private sector industry have begun to disagree about how PPI is...double that of the private sector due to an evaluation process that is cumbersome, time-consuming, and lacking the efficiencies enjoyed by private

  5. Pedagogy Redefined: Frameworks of Learning Approaches Prevalent in the Current Digital Information Age

    ERIC Educational Resources Information Center

    AlFuqaha, Isam Najib

    2013-01-01

    This paper attempts to delineate the frameworks of learner-centered vis-à-vis teacher-centered processes of learning prevalent in the second decade of the twenty-first century. It defines the pedagogical changes that have emerged due to the development of delivery technologies, and the interrelations among teachers, students, and knowledge. The…

  6. Analysis of the Implementation of a WebQuest for Learning English in a Secondary School in Spain

    ERIC Educational Resources Information Center

    Renau, Maria Luisa; Pesudo, Marta

    2016-01-01

    In the technological era we live in, the educational landscape is changing rapidly and significantly due to the incorporation of the Internet. Therefore, education should pay special attention to society's needs, incorporating information and communication technologies (ICTs) into the teaching process in order to make students ready for…

  7. Assessing Health Literacy: A New Domain for Collaboration between Language Testers and Health Professionals

    ERIC Educational Resources Information Center

    Elder, Catherine; Barber, Melissa; Staples, Margaret; Osborne, Richard H.; Clerehan, Rosemary; Buchbinder, Rachelle

    2012-01-01

    Health literacy, defined as an individual's capacity to process health information in order to make appropriate health decisions, is the focus of increasing attention in medical fields due to growing awareness that suboptimal health literacy is associated with poorer health outcomes. To explore this issue, a number of instruments, reported to have…

  8. Contrasting spatial patterns in active-fire and fire-suppressed Mediterranean climate old-growth mixed conifer forests

    Treesearch

    Danny L. Fry; Scott L. Stephens; Brandon M. Collins; Malcolm North; Ernesto Franco-Vizcaino; Samantha J. Gill

    2014-01-01

    In Mediterranean environments in western North America, historic fire regimes in frequent-fire conifer forests are highly variable both temporally and spatially. This complexity influenced forest structure and spatial patterns, but some of this diversity has been lost due to anthropogenic disruption of ecosystem processes, including fire. Information from reference...

  9. Letter from House Minority Leader Gerald R. Ford to President Richard M. Nixon. Teaching with Documents.

    ERIC Educational Resources Information Center

    Potter, Lee Ann; Schamel, Wynell

    2001-01-01

    Provides historical background on how President Richard Nixon selected someone as vice president after Spiro T. Agnew resigned due to criminal charges. Provides background information on his choice, Gerald Ford, and discusses the process of how Ford officially became vice president. Includes a document from the Nixon Presidential Materials…

  10. Public Opinion and Capital Punishment: A Close Examination of the Views of Abolitionists and Retentionists.

    ERIC Educational Resources Information Center

    Ellsworth, Phoebe C.; Ross, Lee

    1983-01-01

    Examined the attitudinal and informational bases of people's (N=500) opinions about the death penalty. Results showed 58.8 percent were proponents of capital punishment, 30.8 percent were opponents, and 10.4 percent were undecided. Respondents were generally ignorant on factual issues. Opponents favored due process guarantees more than did…

  11. Visual and Non-Visual Contributions to the Perception of Object Motion during Self-Motion

    PubMed Central

    Fajen, Brett R.; Matthis, Jonathan S.

    2013-01-01

    Many locomotor tasks involve interactions with moving objects. When observer (i.e., self-)motion is accompanied by object motion, the optic flow field includes a component due to self-motion and a component due to object motion. For moving observers to perceive the movement of other objects relative to the stationary environment, the visual system could recover the object-motion component – that is, it could factor out the influence of self-motion. In principle, this could be achieved using visual self-motion information, non-visual self-motion information, or a combination of both. In this study, we report evidence that visual information about the speed (Experiment 1) and direction (Experiment 2) of self-motion plays a role in recovering the object-motion component even when non-visual self-motion information is also available. However, the magnitude of the effect was less than one would expect if subjects relied entirely on visual self-motion information. Taken together with previous studies, we conclude that when self-motion is real and actively generated, both visual and non-visual self-motion information contribute to the perception of object motion. We also consider the possible role of this process in visually guided interception and avoidance of moving objects. PMID:23408983

  12. An industrial ecology approach to municipal solid waste ...

    EPA Pesticide Factsheets

    Municipal solid waste (MSW) can be viewed as a feedstock for industrial ecology-inspired conversions of wastes to valuable products and energy. The industrial ecology principle of symbiotic processes using waste streams to create value-added products is applied to MSW, with examples suggested for various residual streams. A methodology is presented to consider individual waste-to-energy or waste-to-product system synergies, evaluating the economic and environmental issues associated with each system. Steps in the methodology include identifying waste streams, specific waste components of interest, and conversion technologies, plus steps for determining the economic and environmental effects of using wastes and the changes due to transport, administrative handling, and processing. In addition to presenting the methodology, technologies for various MSW input streams are categorized as commercialized or demonstrated, to provide organizations that are considering processes for MSW with summarized information. An organization can also follow the methodology to analyze processes of interest. The paper presents information useful for analyzing the sustainability of alternatives for the management of municipal solid waste.

  13. Horizontal tuning for faces originates in high-level Fusiform Face Area.

    PubMed

    Goffaux, Valerie; Duecker, Felix; Hausfeld, Lars; Schiltz, Christine; Goebel, Rainer

    2016-01-29

    Recent work indicates that the specialization of face visual perception relies on the privileged processing of horizontal angles of facial information. This suggests that stimulus properties assumed to be fully resolved in primary visual cortex (V1; e.g., orientation) in fact determine human vision until high-level stages of processing. To address this hypothesis, the present fMRI study explored the orientation sensitivity of V1 and high-level face-specialized ventral regions such as the Occipital Face Area (OFA) and Fusiform Face Area (FFA) to different angles of face information. Participants viewed face images filtered to retain information at horizontal, vertical or oblique angles. Filtered images were viewed upright, inverted and (phase-)scrambled. FFA responded most strongly to the horizontal range of upright face information; its activation pattern reliably separated horizontal from oblique ranges, but only when faces were upright. Moreover, activation patterns induced in the right FFA and the OFA by upright and inverted faces could only be separated based on horizontal information. This indicates that the specialized processing of upright face information in the OFA and FFA essentially relies on the encoding of horizontal facial cues. This pattern was not passively inherited from V1, which was found to respond less strongly to horizontal than to other orientations, likely due to adaptive whitening. Moreover, we found that orientation decoding accuracy in V1 was impaired for stimuli containing no meaningful shape. By showing that primary coding in V1 is influenced by high-order stimulus structure and that high-level processing is tuned to selective ranges of primary information, the present work suggests that primary and high-level stages of the visual system interact in order to modulate the processing of certain ranges of primary information depending on their relevance with respect to the stimulus and task at hand. Copyright © 2015 Elsevier Ltd. All rights reserved.
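
    The stimulus manipulation described above — filtering face images so that only a chosen range of orientations is retained — is commonly implemented with a wedge-shaped filter in the Fourier domain. The sketch below is one plausible way to do this in Python, not the authors' exact procedure; the Gaussian angular profile, the 20° bandwidth and the function name are assumptions. Note that image structure at a given orientation carries its energy along the orthogonal axis of the spectrum (horizontal stripes produce energy on the vertical frequency axis), so the caller must choose the spectral angle accordingly.

```python
# Minimal sketch of Fourier-domain orientation filtering; the smooth Gaussian
# wedge and the 20-degree bandwidth are assumptions, not parameters from the study.
import numpy as np

def orientation_filter(img, centre_deg, bandwidth_deg=20.0):
    """Keep image energy whose spectral orientation lies near centre_deg (degrees)."""
    h, w = img.shape
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.fftfreq(w)[None, :]
    # orientation of each frequency component, in degrees
    theta = np.rad2deg(np.arctan2(fy, fx))
    # angular distance to the target orientation, wrapped to [0, 90]
    d = np.abs((theta - centre_deg + 90.0) % 180.0 - 90.0)
    mask = np.exp(-0.5 * (d / bandwidth_deg) ** 2)   # smooth angular wedge
    mask[0, 0] = 1.0                                 # keep the DC component (mean luminance)
    spectrum = np.fft.fft2(img)
    return np.real(np.fft.ifft2(spectrum * mask))
```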

  14. An Information Extraction Framework for Cohort Identification Using Electronic Health Records

    PubMed Central

    Liu, Hongfang; Bielinski, Suzette J.; Sohn, Sunghwan; Murphy, Sean; Wagholikar, Kavishwar B.; Jonnalagadda, Siddhartha R.; Ravikumar, K.E.; Wu, Stephen T.; Kullo, Iftikhar J.; Chute, Christopher G

    Information extraction (IE), a natural language processing (NLP) task that automatically extracts structured or semi-structured information from free text, has become popular in the clinical domain for supporting automated systems at point-of-care and enabling secondary use of electronic health records (EHRs) for clinical and translational research. However, a high-performance IE system can be very challenging to construct due to the complexity and dynamic nature of human language. In this paper, we report a knowledge-driven IE framework for cohort identification using EHRs, developed under the Unstructured Information Management Architecture (UIMA). A system to extract specific information can be developed by subject matter experts through expert knowledge engineering of the externalized knowledge resources used in the framework. PMID:24303255

  15. Multiple-stage pure phase encoding with biometric information

    NASA Astrophysics Data System (ADS)

    Chen, Wen

    2018-01-01

    In recent years, many optical systems have been developed for securing information, and optical encryption/encoding has attracted more and more attention due to its marked advantages, such as parallel processing and multi-dimensional characteristics. In this paper, an optical security method is presented based on pure phase encoding with biometric information. Biometric information (such as a fingerprint) is employed as the security key, rather than as the plaintext used in conventional optical security systems, and multiple-stage phase-encoding-based optical systems are designed for generating several phase-only masks with biometric information. Subsequently, the extracted phase-only masks are further used in an optical setup for encoding an input image (i.e., the plaintext). Numerical simulations are conducted to illustrate the validity of the method, and the results demonstrate that high flexibility and high security can be achieved.

  16. Next Generation Global Navigation Satellite Systems (GNSS) Processing at NASA CDDIS

    NASA Astrophysics Data System (ADS)

    Michael, B. P.; Noll, C. E.

    2016-12-01

    The Crustal Dynamics Data Information System (CDDIS) has been providing access to space geodesy and related data sets since 1982, and in particular, Global Navigation Satellite Systems (GNSS) data and derived products since 1992. The CDDIS became one of the Earth Observing System Data and Information System (EOSDIS) archive centers in 2007. As such, CDDIS has evolved to offer a broad range of data ingest services, including data upload, quality control, documentation, metadata extraction, and ancillary information. With a growing understanding of the needs and goals of its science users, CDDIS continues to improve these services. Due to the importance of GNSS data and derived products in scientific studies over the last decade, CDDIS has seen its ingest volume explode to over 30 million files per year, or more than one file per second, from hundreds of simultaneous data providers. In order to accommodate this increase and to streamline operations and fully automate the workflow, CDDIS has recently updated the data submission process and GNSS processing. This poster will cover this new ingest infrastructure, workflow, and the agile techniques applied in its development and current operations.

  17. Image Segmentation Using Minimum Spanning Tree

    NASA Astrophysics Data System (ADS)

    Dewi, M. P.; Armiati, A.; Alvini, S.

    2018-04-01

    This research aims to segment digital images. Segmentation separates the object from the background so that the main object can be processed for other purposes. With the development of digital image processing applications, the segmentation process becomes increasingly necessary. The segmented image, which is the result of the segmentation process, should be accurate because subsequent processing depends on interpreting the information in the image. This article discusses the application of the minimum spanning tree of a graph to the segmentation of digital images. The method separates an object from the background, converting the image into a binary image. Here, the object of interest is shown in white and the background in black, or vice versa.
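
    One common way to realize MST-based segmentation — not necessarily the exact algorithm of this article — is to treat pixels as graph nodes, weight the edges by intensity differences, compute the minimum spanning tree, and cut its heaviest edges so that the remaining components form the segments. The Python sketch below follows that recipe; the 4-connectivity, the epsilon added to edge weights and the function name are assumptions for illustration.

```python
# Illustrative MST-based segmentation of a grayscale image: build a 4-connected
# grid graph weighted by intensity differences, take the minimum spanning tree,
# remove its largest-weight edges and keep the resulting connected components.
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

def mst_segment(img, n_segments=2):
    """Segment a grayscale image by cutting the heaviest edges of its MST."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    idx = np.arange(h * w).reshape(h, w)
    # 4-connected grid edges; a small epsilon keeps zero-difference edges from
    # being dropped by the sparse representation.
    right = (idx[:, :-1].ravel(), idx[:, 1:].ravel(),
             np.abs(img[:, :-1] - img[:, 1:]).ravel() + 1e-6)
    down = (idx[:-1, :].ravel(), idx[1:, :].ravel(),
            np.abs(img[:-1, :] - img[1:, :]).ravel() + 1e-6)
    rows = np.concatenate([right[0], down[0]])
    cols = np.concatenate([right[1], down[1]])
    weights = np.concatenate([right[2], down[2]])
    graph = coo_matrix((weights, (rows, cols)), shape=(h * w, h * w))
    mst = minimum_spanning_tree(graph).tocoo()
    # removing the (n_segments - 1) largest MST edges leaves n_segments components
    keep = np.argsort(mst.data)[: mst.data.size - (n_segments - 1)]
    pruned = coo_matrix((mst.data[keep], (mst.row[keep], mst.col[keep])),
                        shape=mst.shape)
    _, labels = connected_components(pruned, directed=False)
    return labels.reshape(h, w)   # with n_segments=2 this is a binary object/background mask
```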

  18. Abnormal brain processing of affective and sensory pain descriptors in chronic pain patients.

    PubMed

    Sitges, Carolina; García-Herrera, Manuel; Pericás, Miquel; Collado, Dolores; Truyols, Magdalena; Montoya, Pedro

    2007-12-01

    Previous research has suggested that chronic pain patients might be particularly vulnerable to the effects of negative mood during information processing. However, there is little evidence for abnormal brain processing of affective and sensory pain-related information in chronic pain. Behavioral and brain responses to pain descriptors and pleasant words were examined in chronic pain patients and healthy controls during a self-endorsement task. Eighteen patients with fibromyalgia (FM), 18 patients with chronic musculoskeletal pain due to identifiable physical injury (MSK), and 16 healthy controls were asked to decide whether word targets described their current or past experience of pain. The number of self-endorsed words, the elapsed time to endorse the words, and the event-related potentials (ERPs) elicited by the words were recorded. Data revealed that chronic pain patients used more affective and sensory pain descriptors, and were slower in responding to self-endorsed pain descriptors than to pleasant words. In addition, it was found that affective pain descriptors elicited significantly more enhanced positive ERP amplitudes than pleasant words in MSK pain patients, whereas sensory pain descriptors elicited greater positive ERP amplitudes than affective pain words in healthy controls. These data support the notion of abnormal information processing in chronic pain patients, which might be characterized by a lack of dissociation between sensory and affective components of pain-related information, and by an exaggerated rumination over word meaning during the encoding of self-referent information about pain.

  19. Subliminally and consciously induced cognitive conflicts interact at several processing levels.

    PubMed

    Stock, Ann-Kathrin; Friedrich, Julia; Beste, Christian

    2016-12-01

    Controlled behavior is susceptible to conflicts that can emerge from subliminal or consciously processed information. While research suggests that both sources of conflicting information may interact in their modulation of controlled behavior, it has remained unclear which cognitive sub-processes involved in controlled behavior are affected by this interaction; i.e., at which processing level subliminally and consciously induced response conflicts interact in modulating controlled behavior. Moreover, we investigated whether this interaction of subliminally and consciously induced response conflicts was due to a nexus between the two types of conflict, such as a common cognitive process or factor. For this, n = 38 healthy young subjects completed a paradigm which combines subliminal primes and consciously perceived flankers while an electroencephalogram (EEG) was recorded. We show that the interaction of subliminal and conscious sources of conflict is not restricted to the response selection level (N2) but can already be shown at the earliest stages of perceptual and attentional processing (P1). While the degree of early attentional processing of subliminal information seems to depend on the absence of consciously perceived response conflicts, conflicts during the stage of response selection may be either reduced or enhanced by subliminal priming. Moreover, the results showed that even though the two different sources of conflict interact at the response selection level, they clearly originate from two distinct processes that interact before they detrimentally affect cognitive control. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Safe patient care - safety culture and risk management in otorhinolaryngology.

    PubMed

    St Pierre, Michael

    2013-12-13

    Safety culture is positioned at the heart of an organization's vulnerability to error because of its role in framing organizational awareness to risk and in providing and sustaining effective strategies of risk management. Safety related attitudes of leadership and management play a crucial role in the development of a mature safety culture ("top-down process"). A type marker for organizational culture and thus a predictor for an organization's maturity in respect to safety is information flow and in particular an organization's general way of coping with information that suggests anomaly. As all values and beliefs, relationships, learning, and other aspects of organizational safety culture are about sharing and processing information, safety culture has been termed "informed culture". An informed culture is free of blame and open for information provided by incidents. "Incident reporting systems" are the backbone of a reporting culture, where good information flow is likely to support and encourage other kinds of cooperative behavior, such as problem solving, innovation, and inter-departmental bridging. Another facet of an informed culture is the free flow of information during perioperative patient care. The World Health Organization's "safe surgery checklist" is the most prevalent example of a standardized information exchange aimed at preventing patient harm due to information deficit. In routine tasks, mandatory standard operating procedures have gained widespread acceptance in guaranteeing the highest possible process quality. Technical and non-technical skills of healthcare professionals are the decisive human resource for an efficient and safe delivery of patient care and the avoidance of errors. The systematic enhancement of staff qualification by providing training opportunities can be a major investment in patient safety. In recent years several otorhinolaryngology departments have started to incorporate simulation-based team trainings into their curriculum.

  1. [Safe patient care: safety culture and risk management in otorhinolaryngology].

    PubMed

    St Pierre, M

    2013-04-01

    Safety culture is positioned at the heart of an organisation's vulnerability to error because of its role in framing organizational awareness to risk and in providing and sustaining effective strategies of risk management. Safety related attitudes of leadership and management play a crucial role in the development of a mature safety culture ("top-down process"). A type marker for organizational culture and thus a predictor for an organization's maturity in respect to safety is information flow and in particular an organization's general way of coping with information that suggests anomaly. As all values and beliefs, relationships, learning, and other aspects of organizational safety culture are about sharing and processing information, safety culture has been termed "informed culture". An informed culture is free of blame and open for information provided by incidents. "Incident reporting systems" are the backbone of a reporting culture, where good information flow is likely to support and encourage other kinds of cooperative behavior, such as problem solving, innovation, and inter-departmental bridging. Another facet of an informed culture is the free flow of information during perioperative patient care. The World Health Organisation's "safe surgery checklist" is the most prevalent example of a standardized information exchange aimed at preventing patient harm due to information deficit. In routine tasks, mandatory standard operating procedures have gained widespread acceptance in guaranteeing the highest possible process quality. Technical and non-technical skills of healthcare professionals are the decisive human resource for an efficient and safe delivery of patient care and the avoidance of errors. The systematic enhancement of staff qualification by providing training opportunities can be a major investment in patient safety. In recent years several otorhinolaryngology departments have started to incorporate simulation-based team trainings into their curriculum. © Georg Thieme Verlag KG Stuttgart · New York.

  2. Is the Universe Really That Simple?

    NASA Astrophysics Data System (ADS)

    Cirkovic, Milan M.

    2002-07-01

    The intriguing recent suggestion of Tegmark that the universe - contrary to all our experiences and expectations - contains only a small amount of information due to an extremely high degree of internal symmetry is critically examined. It is shown that there are several physical processes, notably Hawking evaporation of black holes and non-zero decoherence time effects described by Plaga, as well as thought experiments of Deutsch and Tegmark himself, which can be construed as arguments against the low-information universe hypothesis. Some ramifications for both quantum mechanics and cosmology are briefly discussed.

  3. Combined holography and thermography in a single sensor through image-plane holography at thermal infrared wavelengths.

    PubMed

    Georges, Marc P; Vandenrijt, Jean-François; Thizy, Cédric; Alexeenko, Igor; Pedrini, Giancarlo; Vollheim, Birgit; Lopez, Ion; Jorge, Iagoba; Rochet, Jonathan; Osten, Wolfgang

    2014-10-20

    Holographic interferometry in the thermal wavelength range, combining a CO2 laser and digital hologram recording with a microbolometer-array-based camera, allows temperature and surface shape information about objects to be captured simultaneously. This is due to the fact that the holograms are affected by the thermal background emitted by objects at room temperature. We explain the setup and the processing of the data, which allows the two types of information to be decoupled. This natural data fusion can be advantageously used in a variety of nondestructive testing applications.

  4. Developing services for climate impact and adaptation baseline information and methodologies for the Andes

    NASA Astrophysics Data System (ADS)

    Huggel, C.

    2012-04-01

    Impacts of climate change are observed and projected across a range of ecosystems and economic sectors, and mountain regions rank among the hotspots of climate change. The Andes are considered particularly vulnerable to climate change, not only due to fragile ecosystems but also due to the high vulnerability of the population. Natural resources such as water systems play a critical role and are observed and projected to be seriously affected. Adaptation to climate change impacts is therefore crucial to contain the negative effects on the population. Adaptation projects require information on the climate and affected socio-environmental systems. There is, however, generally a lack of methodological guidelines on how to generate the necessary scientific information and how to communicate it to the implementing governmental and non-governmental institutions. This is particularly important in view of the international funds for adaptation such as the Green Climate Fund, established and set in motion at the UNFCCC Conferences of the Parties in Cancun in 2010 and Durban in 2011. To facilitate this process, international and regional organizations (the World Bank and the Andean Community) and a consortium of research institutions have joined forces to develop and define comprehensive methodologies for baseline and climate change impact assessments for the Andes, with application potential for other mountain regions (AndesPlus project). The methodologies consider the climatological baseline of a region and the assessment of trends based on ground meteorological stations, reanalysis data, and satellite information. One challenge is the scarcity of climate information in the Andes and the complex climatology of the mountain terrain. A climate data platform has been developed for the southern Peruvian Andes and is a key element for climate data service and exchange. Water resources are among the key livelihood components for the Andean population and the local and national economy, in particular for agriculture and hydropower. The retreat of glaciers, one of the clearest signals of climate change, represents a problem for water supply during the long dry season. Hydrological modeling, using data from the few gauging stations and complemented by satellite precipitation data, is needed to generate baseline and climate impact information. Food security is often considered to be threatened by climate change impacts, in the Andes for instance by droughts and cold spells that seriously affect high-elevation food systems. Finally, methodologies are compiled and developed for analyzing risks from natural hazards and disasters. The vulnerabilities and risks for all types of climate impacts need to be reflected by analyzing the local and regional social, cultural, political and economic context. To provide the necessary references and information, the AndesPlus project has developed a web-based knowledge and information platform. The highly interdisciplinary process of the project should contribute to climate impact and adaptation information services, needed to meet the challenges of adaptation.

  5. Bidding-based autonomous process planning and scheduling

    NASA Astrophysics Data System (ADS)

    Gu, Peihua; Balasubramanian, Sivaram; Norrie, Douglas H.

    1995-08-01

    Improving productivity through computer-integrated manufacturing systems (CIMS) and concurrent engineering requires that the islands of automation in an enterprise be completely integrated. The first step in this direction is to integrate design, process planning, and scheduling. This can be achieved through a bidding-based process planning approach. The product is represented in a STEP model with detailed design and administrative information including design specifications, batch size, and due dates. Upon arrival at the manufacturing facility, the product is registered with the shop floor manager, which is essentially a coordinating agent. The shop floor manager broadcasts the product's requirements to the machines. The shop contains autonomous machines that have knowledge about their functionality, capabilities, tooling, and schedule. Each machine has its own process planner and responds to the product's request in its own way, consistent with its capabilities and capacities. When more than one machine offers certain process(es) for the same requirements, they enter into negotiation. Based on processing time, due date, and cost, one of the machines wins the contract. The successful machine updates its schedule and advises the product to request raw material for processing. The concept was implemented using a multi-agent system, with the task decomposition and planning achieved through contract nets. Examples are included to illustrate the approach.
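
    The coordination scheme summarized above (broadcast of requirements, independent bids from machines, and an award based on processing time, due date, and cost) follows the contract-net pattern and is easy to sketch. The Python fragment below is a minimal illustration under assumed class names and scoring weights; it is not the paper's implementation.

```python
# Minimal contract-net style bid selection in the spirit of the scheme above.
# The Bid fields and the scoring weights are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Bid:
    machine: str
    processing_time_h: float   # hours needed for the requested process
    completion_day: float      # promised completion (days from now)
    cost: float                # quoted cost

def award_contract(bids, due_day, weights=(1.0, 5.0, 0.01)):
    """Pick the winning machine, penalizing long processing, lateness and cost."""
    w_time, w_late, w_cost = weights
    def score(b):
        lateness = max(0.0, b.completion_day - due_day)
        return w_time * b.processing_time_h + w_late * lateness + w_cost * b.cost
    return min(bids, key=score)

# Example: the shop floor manager broadcast a milling requirement and got two offers.
bids = [Bid("mill_A", 3.0, 4.0, 120.0), Bid("mill_B", 2.5, 6.0, 100.0)]
print(award_contract(bids, due_day=5.0).machine)   # -> mill_A (mill_B would be late)
```

    A real shop floor manager would additionally handle re-negotiation and schedule updates after the award.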

  6. Reviewing information support during the Great East Japan Earthquake disaster : From the perspective of a hospital library that received support

    NASA Astrophysics Data System (ADS)

    Terasawa, Motoko

    The Great East Japan Earthquake of March 11, 2011, caused extensive damage over a widespread area. Our hospital library, which is located in the affected area, was no exception. A large collection of books was lost, and some web content was inaccessible due to damage to the network environment. This greatly hindered our efforts to continue providing post-disaster medical information services. Information support, such as free access to databases, journals, and other online content related to the disaster areas, helped us immensely during this time. We were fortunate to have the cooperation of various medical employees and library members via social networks, such as Twitter, during the process of obtaining this information support.

  7. [Words before actions- the significance of counselling in the Praena-Test era].

    PubMed

    Tschudin, Sibil

    2014-04-23

    Due to new options in prenatal diagnostics, pregnant women are forced to make choices. In Switzerland, physicians are obliged to provide information prior to prenatal tests and to obtain informed consent. Considering the complexity of this information and the consequences of a positive result, counselling is challenging, especially in an intercultural context. A questionnaire-based study compared the information processing, test interpretation and emotional response of pregnant women from Switzerland and adjacent countries with those of Turkish women. Knowledge of the latter was significantly lower and they found counselling more unsettling, but their acceptance of prenatal tests was significantly higher. An empathetic approach and the right words are decisive, and counselling will gain even more importance considering the increasing number of options patients are confronted with.

  8. Laboratory testing in primary care: A systematic review of health IT impacts.

    PubMed

    Maillet, Éric; Paré, Guy; Currie, Leanne M; Raymond, Louis; Ortiz de Guinea, Ana; Trudel, Marie-Claude; Marsan, Josianne

    2018-08-01

    Laboratory testing in primary care is a fundamental process that supports patient management and care. Any breakdown in the process may alter clinical information gathering and decision-making activities and can lead to medical errors and potential adverse outcomes for patients. Various information technologies are being used in primary care with the goal to support the process, maximize patient benefits and reduce medical errors. However, the overall impact of health information technologies on laboratory testing processes has not been evaluated. To synthesize the positive and negative impacts resulting from the use of health information technology in each phase of the laboratory 'total testing process' in primary care. We conducted a systematic review. Databases including Medline, PubMed, CINAHL, Web of Science and Google Scholar were searched. Studies eligible for inclusion reported empirical data on: 1) the use of a specific IT system, 2) the impacts of the systems to support the laboratory testing process, and were conducted in 3) primary care settings (including ambulatory care and primary care offices). Our final sample consisted of 22 empirical studies which were mapped to a framework that outlines the phases of the laboratory total testing process, focusing on phases where medical errors may occur. Health information technology systems support several phases of the laboratory testing process, from ordering the test to following-up with patients. This is a growing field of research with most studies focusing on the use of information technology during the final phases of the laboratory total testing process. The findings were largely positive. Positive impacts included easier access to test results by primary care providers, reduced turnaround times, and increased prescribed tests based on best practice guidelines. Negative impacts were reported in several studies: paper-based processes employed in parallel to the electronic process increased the potential for medical errors due to clinicians' cognitive overload; systems deemed not reliable or user-friendly hampered clinicians' performance; and organizational issues arose when results tracking relied on the prescribers' memory. The potential of health information technology lies not only in the exchange of health information, but also in knowledge sharing among clinicians. This review has underscored the important role played by cognitive factors, which are critical in the clinician's decision-making, the selection of the most appropriate tests, correct interpretation of the results and efficient interventions. By providing the right information, at the right time to the right clinician, many IT solutions adequately support the laboratory testing process and help primary care clinicians make better decisions. However, several technological and organizational barriers require more attention to fully support the highly fragmented and error-prone process of laboratory testing. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Temporal factors affecting somatosensory–auditory interactions in speech processing

    PubMed Central

    Ito, Takayuki; Gracco, Vincent L.; Ostry, David J.

    2014-01-01

    Speech perception is known to rely on both auditory and visual information. However, sound-specific somatosensory input has been shown also to influence speech perceptual processing (Ito et al., 2009). In the present study, we further addressed the relationship between somatosensory information and speech perceptual processing by testing the hypothesis that the temporal relationship between orofacial movement and sound processing contributes to somatosensory–auditory interaction in speech perception. We examined the changes in event-related potentials (ERPs) in response to multisensory synchronous (simultaneous) and asynchronous (90 ms lag and lead) somatosensory and auditory stimulation compared to individual unisensory auditory and somatosensory stimulation alone. We used a robotic device to apply facial skin somatosensory deformations that were similar in timing and duration to those experienced in speech production. Following synchronous multisensory stimulation, the amplitude of the ERP was reliably different from the two unisensory potentials. More importantly, the magnitude of the ERP difference varied as a function of the relative timing of the somatosensory–auditory stimulation. Event-related activity change due to stimulus timing was seen between 160 and 220 ms following somatosensory onset, mostly around the parietal area. The results demonstrate a dynamic modulation of somatosensory–auditory convergence and suggest that the contribution of somatosensory information to speech processing depends on the specific temporal order of sensory inputs in speech production. PMID:25452733

  10. Leveraging health information exchange to improve population health reporting processes: lessons in using a collaborative-participatory design process.

    PubMed

    Revere, Debra; Dixon, Brian E; Hills, Rebecca; Williams, Jennifer L; Grannis, Shaun J

    2014-01-01

    Surveillance, or the systematic monitoring of disease within a population, is a cornerstone function of public health. Despite significant investment in information technologies (IT) to improve the public's health, health care providers continue to rely on manual, spontaneous reporting processes that can result in incomplete and delayed surveillance activities. Participatory design principles advocate including real users and stakeholders when designing an information system to ensure high ecological validity of the product, incorporate relevance and context into the design, reduce misconceptions designers can make due to insufficient domain expertise, and ultimately reduce barriers to adoption of the system. This paper focuses on the collaborative and informal participatory design process used to develop enhanced, IT-enabled reporting processes that leverage available electronic health records in a health information exchange to prepopulate notifiable-conditions report forms used by public health authorities. Over nine months, public health stakeholders, technical staff, and informatics researchers were engaged in a multiphase participatory design process that included public health stakeholder focus groups, investigator-engineering team meetings, public health survey and census regarding high-priority data elements, and codesign of exploratory prototypes and final form mock-ups. A number of state-mandated report fields that are not highly used or desirable for disease investigation were eliminated, which allowed engineers to repurpose form space for desired and high-priority data elements and improve the usability of the forms. Our participatory design process ensured that IT development was driven by end user expertise and needs, resulting in significant improvements to the layout and functionality of the reporting forms. In addition to informing report form development, engaging with public health end users and stakeholders through the participatory design process provided new insights into public health workflow and allowed the team to quickly triage user requests while managing user expectations within the realm of engineering possibilities. Engaging public health, engineering staff, and investigators in a shared codesigning process ensured that the new forms will not only meet real-life needs but will also support development of a product that will be adopted and, ultimately, improve communicable and infectious disease reporting by clinicians to public health.

  11. User-centered requirements engineering in health information systems: a study in the hemophilia field.

    PubMed

    Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa

    2012-06-01

    The use of sophisticated information and communication technologies (ICTs) in the health care domain is a way to improve the quality of services. However, there are also hazards associated with the introduction of ICTs in this domain and a great number of projects have failed due to the lack of systematic consideration of human and other non-technology issues throughout the design or implementation process, particularly in the requirements engineering process. This paper presents the methodological approach followed in the design process of a web-based information system (WbIS) for managing the clinical information in hemophilia care, which integrates the values and practices of user-centered design (UCD) activities into the principles of software engineering, particularly in the phase of requirements engineering (RE). This process followed a paradigm that combines a grounded theory for data collection with an evolutionary design based on constant development and refinement of the generic domain model using three well-known methodological approaches: (a) object-oriented system analysis; (b) task analysis; and, (c) prototyping, in a triangulation work. This approach seems to be a good solution for the requirements engineering process in this particular case of the health care domain, since the inherent weaknesses of individual methods are reduced, and emergent requirements are easier to elicit. Moreover, the requirements triangulation matrix gives the opportunity to look across the results of all used methods and decide what requirements are critical for the system success. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  12. Temporal Processing in the Visual Cortex of the Awake and Anesthetized Rat.

    PubMed

    Aasebø, Ida E J; Lepperød, Mikkel E; Stavrinou, Maria; Nøkkevangen, Sandra; Einevoll, Gaute; Hafting, Torkel; Fyhn, Marianne

    2017-01-01

    The activity pattern and temporal dynamics within and between neuron ensembles are essential features of information processing and are believed to be profoundly affected by anesthesia. Much of our general understanding of sensory information processing, including computational models aimed at mathematically simulating sensory information processing, relies on parameters derived from recordings conducted on animals under anesthesia. Due to the high variety of neuronal subtypes in the brain, population-based estimates of the impact of anesthesia may conceal unit- or ensemble-specific effects of the transition between states. Using tetrodes chronically implanted in the primary visual cortex (V1) of rats, we conducted extracellular recordings of single units and followed the same cell ensembles in the awake and anesthetized states. We found that the transition from wakefulness to anesthesia involves unpredictable changes in temporal response characteristics. The latency of single-unit responses to visual stimulation was delayed under anesthesia, with large individual variations between units. Pair-wise correlations between units increased under anesthesia, indicating more synchronized activity. Further, the units within an ensemble show reproducible temporal activity patterns in response to visual stimuli that change between states, suggesting state-dependent sequences of activity. The current dataset, with recordings from the same neural ensembles across states, is well suited for validating and testing computational network models. This can lead to testable predictions, bring a deeper understanding of the experimental findings and improve models of neural information processing. Here, we exemplify such a workflow using a Brunel network model.

  13. Temporal Processing in the Visual Cortex of the Awake and Anesthetized Rat

    PubMed Central

    Aasebø, Ida E. J.; Stavrinou, Maria; Nøkkevangen, Sandra; Einevoll, Gaute

    2017-01-01

    The activity pattern and temporal dynamics within and between neuron ensembles are essential features of information processing and are believed to be profoundly affected by anesthesia. Much of our general understanding of sensory information processing, including computational models aimed at mathematically simulating sensory information processing, relies on parameters derived from recordings conducted on animals under anesthesia. Due to the high variety of neuronal subtypes in the brain, population-based estimates of the impact of anesthesia may conceal unit- or ensemble-specific effects of the transition between states. Using tetrodes chronically implanted in the primary visual cortex (V1) of rats, we conducted extracellular recordings of single units and followed the same cell ensembles in the awake and anesthetized states. We found that the transition from wakefulness to anesthesia involves unpredictable changes in temporal response characteristics. The latency of single-unit responses to visual stimulation was delayed under anesthesia, with large individual variations between units. Pair-wise correlations between units increased under anesthesia, indicating more synchronized activity. Further, the units within an ensemble show reproducible temporal activity patterns in response to visual stimuli that change between states, suggesting state-dependent sequences of activity. The current dataset, with recordings from the same neural ensembles across states, is well suited for validating and testing computational network models. This can lead to testable predictions, bring a deeper understanding of the experimental findings and improve models of neural information processing. Here, we exemplify such a workflow using a Brunel network model. PMID:28791331

  14. Use of multitemporal InSAR data to develop geohazard scenarios for Bandung, Western Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Salvi, Stefano; Tolomei, Cristiano; Duro, Javier; Pezzo, Giuseppe; Koudogbo, Fifamè

    2015-04-01

    The Greater Bandung metropolitan area is the second largest urban area in Indonesia, with a population of 8.6 million. It is subject to a variety of geohazards: volcanic hazards from seven active volcanoes within a radius of 50 km; high flood hazards; seismic hazard due to active crustal faults, the best known being the 30-km-long Lembang fault, 10 km north of the city centre; subsidence hazards due to strong aquifer depletion; and landslide hazard in the surrounding high country. In the framework of the FP7 RASOR project, multitemporal satellite SAR data have been processed over Bandung, Western Java. We used the SBAS InSAR technique (Berardino et al., 2002) to process two ALOS-1 datasets, to investigate the various sources of surface deformation acting in the area in the period 2008-2011. Persistent Scatterer Interferometry (PSI) has also been applied to achieve ground motion measurements with millimetric precision and high accuracy. The PSI processing technique considers a system of points that reflect the radar signal from the satellite continuously through time. It makes use of differential interferometric phase measurements to generate long-term terrain deformation and digital surface model maps. The GlobalSAR™ algorithms developed by Altamira Information are applied to COSMO-SkyMed data acquired to measure ground motion over the area of interest. Strong ground displacements (up to 7 cm/yr) due to groundwater abstraction have been measured in the Bandung basin. The identification of long-wavelength signals from tectonic sources is difficult due to the limited InSAR coherence outside of the urban environment. Limited deformation is also observed at the Tangkuban Perahu volcano to the north. The spatial and temporal distribution of the ground motion is important supporting information for the generation of long-term subsidence and flood hazard scenarios.

  15. Automated sleep stage detection with a classical and a neural learning algorithm--methodological aspects.

    PubMed

    Schwaibold, M; Schöchlin, J; Bolz, A

    2002-01-01

    For classification tasks in biosignal processing, several strategies and algorithms can be used. Knowledge-based systems allow prior knowledge about the decision process to be integrated, both by the developer and by self-learning capabilities. For the classification stages in a sleep stage detection framework, three inference strategies were compared regarding their specific strengths: a classical signal processing approach, artificial neural networks and neuro-fuzzy systems. Methodological aspects were assessed to attain optimum performance and maximum transparency for the user. Due to their effective and robust learning behavior, artificial neural networks could be recommended for pattern recognition, while neuro-fuzzy systems performed best for the processing of contextual information.

  16. Pen-based computers: Computers without keys

    NASA Technical Reports Server (NTRS)

    Conklin, Cheryl L.

    1994-01-01

    The National Space Transportation System (NSTS) comprises many diverse and highly complex systems incorporating the latest technologies. Data collection associated with ground processing of the various Space Shuttle system elements is extremely challenging due to the many separate processing locations where data is generated. This presents a significant problem when the timely collection, transfer, collation, and storage of data is required. This paper describes how new technology, referred to as Pen-Based computers, is being used to transform the data collection process at Kennedy Space Center (KSC). Pen-Based computers have streamlined procedures, increased data accuracy, and now provide more complete information than previous methods. The end result is the elimination of Shuttle processing delays associated with data deficiencies.

  17. Automation technology using Geographic Information System (GIS)

    NASA Technical Reports Server (NTRS)

    Brooks, Cynthia L.

    1994-01-01

    Airport Surface Movement Area is but one of the actions taken to increase the capacity and safety of existing airport facilities. The System Integration Branch (SIB) has designed an integrated system consisting of an electronic moving display in the cockpit, which includes a display of taxi routes and automatically provides controllers and pilots with warning information and the position of other traffic. Although this system has proven accurate and helpful in test simulations, the initial process of obtaining an airport layout of the taxi routes and designing each of them is very tedious and time-consuming. Other methods of preparing the display maps are being researched. One such method is the use of the Geographical Information System (GIS). GIS is an integrated system of computer hardware and software linking topographical, demographic and other resource data that is being referenced. The software can support many areas of work with virtually unlimited information compatibility due to the system's open architecture. GIS will allow us to work faster with increased efficiency and accuracy while providing decision-making capabilities. GIS is currently being used at the Langley Research Center for other applications and has been validated as an accurate system for such tasks. GIS usage for our task will involve digitizing aerial photographs of the topography for each taxi-runway and identifying each position according to its specific spatial coordinates. The information currently being used can be integrated with the GIS system, due to its ability to provide a wide variety of user interfaces. Much more research and data analysis will be needed before this technique can be used; however, we are hopeful it will lead to better use of manpower and technological capabilities in the future.

  18. Impaired emotional memory enhancement on recognition of pictorial stimuli in Alzheimer's disease: no influence of the nature of encoding.

    PubMed

    Chainay, Hanna; Sava, Alexandra; Michael, George A; Landré, Lionel; Versace, Rémy; Krolak-Salmon, Pierre

    2014-01-01

    There is some discrepancy in the results regarding emotional enhancement of memory (EEM) in Alzheimer's disease (AD). Some studies report better retrieval of emotional information, especially positive, than neutral information. This observation is similar to the positivity effect reported in healthy older adults. It was suggested that this effect is due to privileged, deeper and more controlled processing of positive information. One way of testing this is to control both the intention to encode the information and the cognitive resources involved during encoding. Studies investigating EEM in AD patients did not systematically control the nature of encoding. Consequently, the purpose of our study was to examine EEM in AD while manipulating the nature of encoding. Two experiments were conducted. In Experiment 1 the intention to encode stimuli was manipulated by giving or not giving instructions to participants about the subsequent retrieval. In Experiment 2 cognitive resources involved during encoding were varied (low vs high). In both experiments participants performed immediate recognition task of negative, positive and neutral pictures. 41 mild AD patients and 44 older healthy adults participated in Exp. 1, and 17 mild AD patients and 20 older healthy adults participated in Exp. 2. AD patients did not present EEM. Positivity effect, better performance for positive than neutral and negative pictures was observed with older healthy adults. The data suggest that EEM is disturbed in mild AD patients, with respect to both negative and positive stimuli, at least concerning laboratory, not real-life material. They also suggest there is a positivity effect in healthy older adults and lend support to the idea that this effect is due to preferential cognitive processing of positive information in this population. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Satellite Observations and Chemistry Climate Models - A Meandering Path Towards Better Predictions

    NASA Technical Reports Server (NTRS)

    Douglass, Anne R.

    2011-01-01

    Knowledge of the chemical and dynamical processes that control the stratospheric ozone layer has grown rapidly since the 1970s, when the idea that human activity could deplete the ozone layer was put forth. The concept of ozone depletion due to anthropogenic chlorine increase is simple; quantification of the effect is much more difficult. The future of stratospheric ozone is complicated because ozone is expected to increase for two reasons: the slow decrease in anthropogenic chlorine due to the Montreal Protocol and its amendments, and stratospheric cooling caused by increases in carbon dioxide and other greenhouse gases. Prediction of future ozone levels requires three-dimensional models that represent physical, photochemical and radiative processes, i.e., chemistry climate models (CCMs). While laboratory kinetic and photochemical data are necessary inputs for a CCM, atmospheric measurements are needed both to reveal physical and chemical processes and for comparison with simulations to test the conceptual model that CCMs represent. Global measurements are available from various satellites including but not limited to the LIMS and TOMS instruments on Nimbus 7 (1979 - 1993), and various instruments on the Upper Atmosphere Research Satellite (1991 - 2005), Envisat (2002 - ongoing), Sci-Sat (2003 - ongoing) and Aura (2004 - ongoing). Every successful satellite instrument requires a physical concept for the measurement, knowledge of the physical and chemical properties of the molecules to be measured, and stellar engineering to design an instrument that will survive launch and operate for years with no opportunity for repair while providing enough information that trends can be separated from any instrument change. The ongoing challenge is to use observations to decrease uncertainty in prediction. This talk will focus on two applications. The first considers transport diagnostics and implications for prediction of the eventual demise of the Antarctic ozone hole. The second focuses on the upper stratosphere, where ozone is predicted to increase both due to chlorine decrease and due to the temperature decrease expected as a result of increased concentrations of CO2 and other greenhouse gases. Both applications show how diagnostics developed from global observations are being used to explain why the ozone response varies among CCM predictions for stratospheric ozone in the 21st century.

  20. Gaussian random bridges and a geometric model for information equilibrium

    NASA Astrophysics Data System (ADS)

    Mengütürk, Levent Ali

    2018-03-01

    The paper introduces a class of conditioned stochastic processes that we call Gaussian random bridges (GRBs) and proves some of their properties. Due to the anticipative representation of any GRB as the sum of a random variable and a Gaussian (T, 0)-bridge, GRBs can model noisy information processes in partially observed systems. In this spirit, we propose an asset pricing model with respect to what we call information equilibrium in a market with multiple sources of information. The idea is to work on a topological manifold endowed with a metric that enables us to systematically determine an equilibrium point of a stochastic system that can be represented by multiple points on that manifold at each fixed time. In doing so, we formulate GRB-based information diversity over a Riemannian manifold and show that it is pinned to zero over the boundary determined by Dirac measures. We then define an influence factor that controls the dominance of an information source in determining the best estimate of a signal in the L2-sense. When there are two sources, this allows us to construct information equilibrium as a functional of a geodesic-valued stochastic process, which is driven by an equilibrium convergence rate representing the signal-to-noise ratio. This leads us to derive price dynamics under what can be considered as an equilibrium probability measure. We also provide a semimartingale representation of Markovian GRBs associated with Gaussian martingales and a non-anticipative representation of fractional Brownian random bridges that can incorporate degrees of information coupling in a given system via the Hurst exponent.
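
    To make the construction concrete, the simplest example consistent with the "random variable plus Gaussian (T, 0)-bridge" representation mentioned above is a Brownian-bridge information process, written below in LaTeX. The linear signal term and the choice of a standard Brownian bridge are illustrative assumptions; the paper's general definition of a GRB is broader.

```latex
% Illustrative special case (an assumption, not the paper's general definition):
% a noisy information process built from a signal X and a Brownian bridge that
% vanishes at times 0 and T.
\[
  \xi_t \;=\; \sigma\, t\, X \;+\; \beta_{tT}, \qquad 0 \le t \le T,
  \qquad \beta_{0T} = \beta_{TT} = 0,
\]
\[
  \operatorname{Cov}\!\left(\beta_{sT}, \beta_{tT}\right) \;=\; \frac{s\,(T - t)}{T},
  \qquad 0 \le s \le t \le T .
\]
```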

  1. Investigation of Carbon Fiber Architecture in Braided Composites Using X-Ray CT Inspection

    NASA Technical Reports Server (NTRS)

    Rhoads, Daniel J.; Miller, Sandi G.; Roberts, Gary D.; Rauser, Richard W.; Golovaty, Dmitry; Wilber, J. Patrick; Espanol, Malena I.

    2017-01-01

    During the fabrication of braided carbon fiber composite materials, process variations occur which affect the fiber architecture. Quantitative measurements of local and global fiber architecture variations are needed to determine the potential effect of process variations on mechanical properties of the cured composite. Although non-destructive inspection via X-ray CT imaging is a promising approach, difficulties in quantitative analysis of the data arise due to the similar densities of the material constituents. In an effort to gain more quantitative information about features related to fiber architecture, methods have been explored to improve the details that can be captured by X-ray CT imaging. Metal-coated fibers and thin veils are used as inserts to extract detailed information about fiber orientations and inter-ply behavior from X-ray CT images.

  2. A Novel Numerical Method for Fuzzy Boundary Value Problems

    NASA Astrophysics Data System (ADS)

    Can, E.; Bayrak, M. A.; Hicdurmaz

    2016-05-01

    In the present paper, a new numerical method is proposed for solving fuzzy differential equations, which are utilized for modeling problems in science and engineering. The fuzzy approach is selected due to its important applications in processing uncertain or subjective information in mathematical models of physical problems. A second-order fuzzy linear boundary value problem is considered in particular due to its important applications in physics. Moreover, numerical experiments are presented to show the effectiveness of the proposed numerical method on specific physical problems such as heat conduction in an infinite plate and a fin.

  3. Towards an evaluation framework for Laboratory Information Systems.

    PubMed

    Yusof, Maryati M; Arifin, Azila

    Laboratory testing and reporting are error-prone and redundant due to repeated, unnecessary requests and delayed or missed reactions to laboratory reports. Errors that occur may negatively affect the patient treatment process and clinical decision making. Evaluation of laboratory testing and the Laboratory Information System (LIS) may reveal the root causes, helping to improve the testing process and enhance the LIS in supporting that process. This paper discusses a new evaluation framework for LIS that encompasses the laboratory testing cycle and the socio-technical part of LIS. A literature review of discourses, dimensions and evaluation methods of laboratory testing and LIS was conducted. A critical appraisal of the Total Testing Process (TTP) and the human, organization, technology-fit factors (HOT-fit) evaluation frameworks was undertaken in order to identify error incidents, their contributing factors and preventive actions pertinent to the laboratory testing process and LIS. A new evaluation framework for LIS using a comprehensive and socio-technical approach is outlined. A positive relationship between laboratory and clinical staff resulted in a smooth laboratory testing process, reduced errors and increased process efficiency, whilst effective use of the LIS streamlined the testing processes. The TTP-LIS framework could serve as an assessment as well as a problem-solving tool for the laboratory testing process and system. Copyright © 2016 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.

  4. Automatic Assessment of Acquisition and Transmission Losses in Indian Remote Sensing Satellite Data

    NASA Astrophysics Data System (ADS)

    Roy, D.; Purna Kumari, B.; Manju Sarma, M.; Aparna, N.; Gopal Krishna, B.

    2016-06-01

    The quality of Remote Sensing data is an important parameter that defines the extent of its usability in various applications. The data from Remote Sensing satellites is received as raw data frames at the ground station. This data may be corrupted by data losses due to interference during data transmission, data acquisition and sensor anomalies. Thus, it is important to assess the quality of the raw data before product generation for early anomaly detection, faster corrective actions and product rejection minimization. Manual screening of raw images is a time consuming process and not very accurate. In this paper, an automated process for identification and quantification of losses in raw data, such as pixel dropout, line loss and data loss due to sensor anomalies, is discussed. Quality assessment of raw scenes based on these losses is also explained. This process is introduced in the data pre-processing stage and gives crucial data quality information to users at the time of browsing data for product ordering. It has also improved the product generation workflow by enabling faster and more accurate quality estimation.
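
    A hypothetical sketch of this kind of screening step is shown below: it flags pixel dropouts and lost lines in a single raw frame and turns them into a simple quality score. The dropout value, the line-loss threshold and the scoring rule are illustrative assumptions, not the operational algorithm.

    ```python
    import numpy as np

    def assess_raw_frame(frame, dropout_value=0, line_loss_fraction=0.9):
        """Flag pixel dropouts and lost lines in a single raw image frame.

        frame: 2-D array of raw digital numbers. A pixel equal to `dropout_value`
        is counted as a dropout; a row where more than `line_loss_fraction` of the
        pixels are dropouts is counted as a lost line. Thresholds are illustrative.
        """
        dropouts = (frame == dropout_value)
        lost_lines = dropouts.mean(axis=1) > line_loss_fraction
        pixel_loss_pct = 100.0 * dropouts.mean()
        line_loss_pct = 100.0 * lost_lines.mean()
        # Simple illustrative quality score: penalise both kinds of loss.
        quality = max(0.0, 100.0 - pixel_loss_pct - 2.0 * line_loss_pct)
        return {"pixel_loss_pct": pixel_loss_pct,
                "line_loss_pct": line_loss_pct,
                "quality_score": quality}

    frame = np.random.randint(1, 1024, size=(512, 512))
    frame[100:103, :] = 0                      # simulate three lost lines
    print(assess_raw_frame(frame))
    ```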

  5. "How Could They Believe That?": Explaining to Students Why Accusation of Witchcraft Made Good Sense in Seventeenth-Century New England.

    ERIC Educational Resources Information Center

    Godbeer, Richard

    2003-01-01

    Explains that students must understand that, due to the beliefs of the time in New England, accusing people of witchcraft during the seventeenth century was plausible. Provides background information on societal beliefs centered upon witchcraft and the supernatural, as well as the process of accusing people of being witches. (CMK)

  6. Minimizing Input-to-Output Latency in Virtual Environment

    NASA Technical Reports Server (NTRS)

    Adelstein, Bernard D.; Ellis, Stephen R.; Hill, Michael I.

    2009-01-01

    A method and apparatus were developed to minimize latency (time delay) in virtual environment (VE) and other discrete-time computer-based systems that require real-time display in response to sensor inputs. Latency in such systems is due to the sum of the finite time required for information processing and communication within and between sensors, software, and displays.

  7. Developing Academic English Language Proficiency Prototypes for 5th Grade Reading: Psychometric and Linguistic Profiles of Tasks. CSE Technical Report 727

    ERIC Educational Resources Information Center

    Bailey, Alison L.; Huang, Becky H.; Shin, Hye Won; Farnsworth, Tim; Butler, Frances A.

    2007-01-01

    Within an evidentiary framework for operationally defining academic English language proficiency (AELP), linguistic analyses of standards, classroom discourse, and textbooks have led to specifications for assessment of AELP. The test development process described here is novel due to the emphasis on using linguistic profiles to inform the …

  8. Examining the Impact of Culture and Human Elements on OLAP Tools Usefulness

    ERIC Educational Resources Information Center

    Sharoupim, Magdy S.

    2010-01-01

    The purpose of the present study was to examine the impact of culture and human-related elements on the On-line Analytical Processing (OLAP) usability in generating decision-making information. The use of OLAP technology has evolved rapidly and gained momentum, mainly due to the ability of OLAP tools to examine and query large amounts of data sets…

  9. Professional Disclosure Statements and Formal Plans for Supervision: Two Strategies for Minimizing the Risk of Ethical Conflicts in Post-Master's Supervision.

    ERIC Educational Resources Information Center

    Cobia, Debra C.; Boes, Susan R.

    2000-01-01

    Discusses ethical conflicts related to issues of informed consent, due process, competence, confidentiality, and dual relationships in supervision. Proposes two strategies as ways to minimize the potential for ethical conflict in post-master's supervision: the use of professional disclosure statements by supervisors and the development of formal…

  10. Phenotypes, genome wide markers and structured genetic populations; a means to understand economically important traits in beta vulgaris and to inform the process of germplasm enhancement

    USDA-ARS?s Scientific Manuscript database

    Although hybrid seed systems in beet have been widely adopted due to profitability and productivity, the population remains the operational unit of beet improvement and thus characterizing populations in terms of markers and phenotypes is critical for novel trait discovery and eventual deployment of...

  11. A Genetically Informed Study of the Processes Underlying the Association between Parental Marital Instability and Offspring Adjustment

    ERIC Educational Resources Information Center

    D'Onofrio, Brian M.; Turkheimer, Eric; Emery, Robert E.; Slutske, Wendy S.; Heath, Andrew C.; Madden, Pamela A.; Martin, Nicholas G.

    2006-01-01

    Parental divorce is associated with problematic offspring adjustment, but the relation may be due to shared genetic or environmental factors. One way to test for these confounds is to study offspring of twins discordant for divorce. The current analyses used this design to separate the mechanisms responsible for the association between parental…

  12. 36 CFR 1250.54 - General information on fees for NARA operational records.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 11″) photocopy paper, we will copy them on larger paper and will reduce your copy fee by the normal... electronic files) we will provide the equivalent of 100 pages of standard size paper copies for free. (e) We... pay FOIA fees in the past, we will require you to pay your past-due bill before we begin processing...

  13. Consumer protection and managed care: the need for organized consumers.

    PubMed

    Rodwin, M A

    1996-01-01

    Despite its many advantages, managed care creates new problems for consumers. Activists have proposed four types of remedies: (1) increased information and choice; (2) standards for services and marketing; (3) administrative oversight; and (4) procedural due process for complaints. Each approach offers some benefits, but they are insufficient to cope with consumer problems. What is lacking is effective, organized consumer advocacy.

  14. Children with Chromosome 22q11.2 Deletion Syndrome Exhibit Impaired Spatial Working Memory

    ERIC Educational Resources Information Center

    Wong, Ling M.; Riggins, Tracy; Harvey, Danielle; Cabaral, Margarita; Simon, Tony J.

    2014-01-01

    Individuals with chromosome 22q11.2 deletion syndrome (22q11.2DS) have been shown to have impairments in processing spatiotemporal information. The authors examined whether children with 22q11.2DS exhibit impairments in spatial working memory performance due to these weaknesses, even when controlling for maintenance of attention. Children with…

  15. [Constructing a database that can input record of use and product-specific information].

    PubMed

    Kawai, Satoru; Satoh, Kenichi; Yamamoto, Hideo

    2012-01-01

    In Japan, patients were generally infected with the hepatitis C virus through the administration of specific fibrinogen injections. However, it has been difficult to identify patients who were infected as a result of the injections due to the lack of medical records. Maintaining detailed information is still not common practice at a number of medical facilities because manual record keeping is extremely time consuming and subject to human error. For these reasons, the regulator required medical device manufacturers and pharmaceutical companies to attach a bar code called "GS1-128," effective March 28, 2008. Based on this new process, we came up with the idea of constructing a new database whose records can be entered by bar code scanning to ensure data integrity. Upon examining the efficacy of this new data collection process in terms of time efficiency and data accuracy, "GS1-128" proved to significantly reduce both the time required and record keeping mistakes. Patients not only became easily identifiable by a lot number and a serial number when immediate care was required, but "GS1-128" also enhanced the ability to pinpoint manufacturing errors in the event that problems or side effects are reported. This data can be shared with and utilized by the entire medical industry and will help perfect the products and enhance record keeping. I believe this new process is extremely important.
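
    As an illustration of the kind of scanning-based data entry described, the sketch below parses a GS1-128 element string for the Application Identifiers most relevant here (01 = GTIN, 17 = expiry date, 10 = lot number, 21 = serial number). A production parser must handle the full GS1 AI table and FNC1 separators; the simplified rules and the sample scan string are assumptions for illustration.

    ```python
    # Minimal GS1-128 element-string parser for a handful of Application
    # Identifiers (AIs). Fixed-length AIs: 01 = GTIN (14 digits),
    # 17 = expiry date YYMMDD (6 digits). Variable-length AIs: 10 = lot,
    # 21 = serial, terminated by the FNC1/GS separator (ASCII 0x1D) or end of data.
    GS = "\x1d"
    FIXED = {"01": 14, "17": 6}
    VARIABLE = {"10", "21"}

    def parse_gs1_128(data: str) -> dict:
        fields, i = {}, 0
        while i < len(data):
            ai = data[i:i + 2]
            i += 2
            if ai in FIXED:
                fields[ai] = data[i:i + FIXED[ai]]
                i += FIXED[ai]
            elif ai in VARIABLE:
                end = data.find(GS, i)
                end = len(data) if end == -1 else end
                fields[ai] = data[i:end]
                i = end + 1
            else:
                raise ValueError(f"unsupported AI {ai!r}")
        return fields

    # Hypothetical scan: GTIN, expiry 2025-12-31, lot A123, serial 000987.
    print(parse_gs1_128("01049123456789041725123110A123" + GS + "21000987"))
    ```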

  16. Streamflow estimation in ungauged basins using remote sensed hydrological data

    NASA Astrophysics Data System (ADS)

    Vasquez, Nicolas; Vargas, Ximena

    2017-04-01

    In several parts of the world, the scarcity of streamflow gauging stations produces an important deficit of information, and calibrating models for these basins remains a challenge for hydrologists. Improvements in remote sensing have provided significant information about the hydrological cycle, which can be used to calibrate a hydrological model when streamflow information is not available. Several satellite products related to snow, evapotranspiration and soil moisture, among other variables, provide essential information about hydrological processes and can be used to calibrate physically based hydrological models. Despite this useful information, other aspects remain unknown, such as aquifer dimensions or precipitation heterogeneity. We calibrated three snow-driven basins in the Coquimbo Region in Northern Chile, using fSCA from MODIS (MOD10 and MYD10) and NDSI from Landsat. We also considered the MOD16 product to estimate evapotranspiration. Soil moisture from AMSR-E was considered but was not useful due to the spatial resolution of the product and the high heterogeneity of the terrain. The Cold Regional Hydrological Model (CHRM) was selected to represent the hydrological processes due to the importance of snow processes, which are, by far, the most important in this area, where precipitation falls as snow principally in winter (June to August) and the melting period begins in spring (September) and ends in the beginning of summer (December and January). The inputs used in the model are precipitation, temperature, short wave radiation, wind speed and relative humidity. The meteorological information was obtained from stations available in the area and distributed spatially using orographic gradients for wind and precipitation and lapse rates for air temperature and dew point temperature. Short wave radiation was computed and corrected using cloud cover data from MODIS. Streamflow data were available but were not used in the calibration process. The three basins are Cochiguaz river at Peñón (676 km2), Derecho river at Alcohuaz (338 km2) and Toro river at the confluence with La Laguna river (468 km2). These sub-basins are part of the Elqui river basin and are located in the Andes Cordillera, Chile. The mean altitudes are 3508, 3543 and 3625 m a.s.l., respectively. For the calibration period (2002 to 2014), the NSE values of the fSCA are 0.85 and 0.87 for the Cochiguaz and Derecho rivers. The Toro river was separated into two rivers: Vacas Heladas and Malo. The NSE values for these last two basins are 0.77 and 0.78. For ET, the analysis relies on the number of pixels inside each basin, but annually the R2 values are 0.62, 0.43, 0.46 and 0.58 for the four sub-basins. Some biases are noticed when ET is analyzed. For streamflow, the NSE values were 0.64, 0.34 and 0.08 for the Cochiguaz, Derecho and Toro rivers in the calibration period. Additionally, due to the uncertainty about the aquifer dimensions, a sensitivity analysis was performed.
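
    The fSCA and streamflow skill scores quoted above are Nash-Sutcliffe efficiencies. As a reference for how such numbers are computed, the short sketch below implements the standard NSE formula; the sample arrays are illustrative and not the study's data.

    ```python
    import numpy as np

    def nse(observed, simulated):
        """Nash-Sutcliffe efficiency: 1 - sum((obs-sim)^2) / sum((obs-mean(obs))^2).

        NSE = 1 is a perfect fit; NSE = 0 means the model is no better than the
        observed mean; NSE < 0 means it is worse.
        """
        observed = np.asarray(observed, dtype=float)
        simulated = np.asarray(simulated, dtype=float)
        return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
            (observed - observed.mean()) ** 2)

    # Illustrative values only (not the study's data).
    obs = np.array([1.2, 1.5, 2.8, 4.0, 3.1, 2.0])
    sim = np.array([1.0, 1.6, 2.5, 3.8, 3.4, 2.2])
    print(f"NSE = {nse(obs, sim):.2f}")
    ```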

  17. Neural correlates of heart-focused interoception: a functional magnetic resonance imaging meta-analysis

    PubMed Central

    2016-01-01

    Interoception is the ability to perceive one's internal body state including visceral sensations. Heart-focused interoception has received particular attention, in part due to a readily available task for behavioural assessment, but also due to accumulating evidence for a significant role in emotional experience, decision-making and clinical disorders such as anxiety and depression. Improved understanding of the underlying neural correlates is important to promote development of anatomical-functional models and suitable intervention strategies. In the present meta-analysis, nine studies reporting neural activity associated with interoceptive attentiveness (i.e. focused attention to a particular interoceptive signal for a given time interval) to one's heartbeat were submitted to a multilevel kernel density analysis. The findings corroborated an extended network associated with heart-focused interoceptive attentiveness including the posterior right and left insula, right claustrum, precentral gyrus and medial frontal gyrus. Right-hemispheric dominance emphasizes non-verbal information processing with the posterior insula presumably serving as the major gateway for cardioception. Prefrontal neural activity may reflect both top-down attention deployment and processing of feed-forward cardioceptive information, possibly orchestrated via the claustrum. This article is part of the themed issue 'Interoception beyond homeostasis: affect, cognition and mental health'. PMID:28080975

  18. Geometric reduction of dynamical nonlocality in nanoscale quantum circuits.

    PubMed

    Strambini, E; Makarenko, K S; Abulizi, G; de Jong, M P; van der Wiel, W G

    2016-01-06

    Nonlocality is a key feature discriminating quantum and classical physics. Quantum-interference phenomena, such as Young's double slit experiment, are one of the clearest manifestations of nonlocality, recently addressed as dynamical to specify its origin in the quantum equations of motion. It is well known that loss of dynamical nonlocality can occur due to (partial) collapse of the wavefunction due to a measurement, such as which-path detection. However, alternative mechanisms affecting dynamical nonlocality have hardly been considered, although of crucial importance in many schemes for quantum information processing. Here, we present a fundamentally different pathway of losing dynamical nonlocality, demonstrating that the detailed geometry of the detection scheme is crucial to preserve nonlocality. By means of a solid-state quantum-interference experiment we quantify this effect in a diffusive system. We show that interference is not only affected by decoherence, but also by a loss of dynamical nonlocality based on a local reduction of the number of quantum conduction channels of the interferometer. With our measurements and theoretical model we demonstrate that this mechanism is an intrinsic property of quantum dynamics. Understanding the geometrical constraints protecting nonlocality is crucial when designing quantum networks for quantum information processing.

  19. EUCAST breakpoints for antifungals.

    PubMed

    Rodríguez-Tudela, Juan L; Arendrup, Maiken C; Cuenca-Estrella, Manuel; Donnelly, J Peter; Lass-Flörl, Cornelia

    2010-03-01

    Susceptibility testing of fungi and development of interpretative breakpoints have become increasingly important due to the growing incidence of invasive fungal infections, the number and classes of antifungals, and the emerging reports of acquired resistance. The subcommittee on antifungal susceptibility testing of the European Committee on Antimicrobial Susceptibility Testing (EUCAST) has developed standards for susceptibility testing of fermentative yeasts and molds and has proposed breakpoints for fluconazole and voriconazole against Candida. The aim of this work is to describe the EUCAST process of setting breakpoints for antifungals. Five aspects are evaluated during the process of developing breakpoints: 1) the most common dosage used in each European country, 2) the definition of the wild-type population for each target microorganism at the species level and the determination of epidemiological cutoffs, 3) the drug's pharmacokinetics and 4) pharmacodynamics, including Monte Carlo simulations, and 5) the correlation of MICs with clinical outcome of patients treated with the compound. When insufficient data are available (e.g., due to lack of information on the clinical outcome of infections caused by isolates with an elevated MIC), epidemiological cutoff values, rather than breakpoints, are recommended until the necessary information becomes available. Copyright 2010 Prous Science, S.A.U. or its licensors. All rights reserved.

  20. Measuring health care process quality with software quality measures.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2012-01-01

    Existing quality models focus on some specific diseases, clinics or clinical areas. Although they contain structure, process, or output type measures, there is no model which measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally and externally. To address these problems, a new model is developed from software quality measures. We have adopted the ISO/IEC 9126 software quality standard for health care processes. Then, JCIAS (Joint Commission International Accreditation Standards for Hospitals) measurable elements were added to the model scope to unify functional requirements. Assessment (diagnosing) process measurement results are provided in this paper. After the application, it was concluded that the model determines weak and strong aspects of the processes, gives a more detailed picture of the process quality, and provides quantifiable information that hospitals can use to compare their processes with those of multiple organizations.

  1. Design and Implementation of Hydrologic Process Knowledge-base Ontology: A case study for the Infiltration Process

    NASA Astrophysics Data System (ADS)

    Elag, M.; Goodall, J. L.

    2013-12-01

    Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using the Formal Concept Approach (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
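
    To make the infiltration example concrete, the sketch below evaluates the classical Green-Ampt relations that the HP ontology is meant to describe (cumulative infiltration F and infiltration rate f); the soil parameters and the simple fixed-point iteration are illustrative assumptions, not content of the ontology itself.

    ```python
    import math

    def green_ampt(t, K, psi, dtheta, tol=1e-8):
        """Green-Ampt cumulative infiltration F(t) and infiltration rate f(t).

        Solves F = K*t + psi*dtheta*ln(1 + F/(psi*dtheta)) by fixed-point
        iteration, then f = K*(1 + psi*dtheta/F).  Units: K [cm/h], psi [cm],
        dtheta [-], t [h], F [cm], f [cm/h].
        """
        pd = psi * dtheta
        F = K * t if K * t > 0 else 1e-6        # initial guess
        while True:
            F_new = K * t + pd * math.log(1.0 + F / pd)
            if abs(F_new - F) < tol:
                break
            F = F_new
        return F, K * (1.0 + pd / F)

    # Illustrative parameters for a silty soil (hypothetical values).
    F, f = green_ampt(t=2.0, K=0.65, psi=16.7, dtheta=0.34)
    print(f"F(2 h) = {F:.2f} cm, f(2 h) = {f:.2f} cm/h")
    ```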

  2. Error-proofing test system of industrial components based on image processing

    NASA Astrophysics Data System (ADS)

    Huang, Ying; Huang, Tao

    2018-05-01

    Due to the improvement of modern industrial standards and accuracy requirements, conventional manual testing fails to satisfy enterprise test standards, so digital image processing techniques should be utilized to gather and analyze information on the surface of industrial components, so as to achieve the purpose of testing. To test the installation of automotive engine components, this paper employs a camera to capture images of the components. After these images are preprocessed, including denoising, an image processing algorithm relying on flood fill is used to test the installation of the components. The results show that this system has very high test accuracy.
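
    As an illustration of the region-growing step such a system relies on, here is a minimal pure-Python flood fill over a thresholded image, used to check that an expected component is present; the threshold, seed point and area criterion are illustrative assumptions rather than the paper's actual algorithm.

    ```python
    from collections import deque
    import numpy as np

    def flood_fill(binary, seed):
        """Return the 4-connected region of `binary` (bool array) containing `seed`."""
        h, w = binary.shape
        region = np.zeros_like(binary, dtype=bool)
        if not binary[seed]:
            return region
        queue = deque([seed])
        region[seed] = True
        while queue:
            r, c = queue.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < h and 0 <= nc < w and binary[nr, nc] and not region[nr, nc]:
                    region[nr, nc] = True
                    queue.append((nr, nc))
        return region

    # Toy check: a bright component is present iff its filled area is large enough.
    img = np.zeros((64, 64), dtype=np.uint8)
    img[20:30, 20:35] = 200                      # the "component" to verify
    binary = img > 128                           # illustrative threshold
    area = flood_fill(binary, (25, 25)).sum()
    print("component present:", area >= 100)     # illustrative area criterion
    ```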

  3. Multidisciplinary experiment on studying short-period variability of the sedimentary process in the northeastern part of the Black Sea

    NASA Astrophysics Data System (ADS)

    Klyuvitkin, A. A.; Ostrovskii, A. G.; Novigatskii, A. N.; Lisitzin, A. P.

    2016-07-01

    The principal aim of this work is to reveal the regularities of the short-period synoptic variability of vertical flows and of the composition of settling sedimentary material, and to obtain information on the quantitative characteristics of the processes that influence sound-scattering layers in the water layer above the continental slope behind the shelf edge in the northeastern part of the Black Sea. The results were obtained thanks to improvements in the equipment and procedures for performing sea experiments that study physicogeological, biological, and hydrophysical processes in the upper illuminated layer of phytoplankton development.

  4. Handheld tools assess medical necessity at the point of care.

    PubMed

    Pollard, Dan

    2002-01-01

    An emerging strategy to manage financial risk in clinical practice is to involve the physician at the point of care. Using handheld technology, encounter-specific information along with medical necessity policy can be presented to physicians allowing them to integrate it into their medical decision-making process. Three different strategies are discussed: reference books or paper encounter forms, electronic reference tools, and integrated process tools. The electronic reference tool strategy was evaluated and showed a return on investment exceeding 1200% due to reduced overhead costs associated with rework of claim errors.

  5. Impact of degree truncation on the spread of a contagious process on networks.

    PubMed

    Harling, Guy; Onnela, Jukka-Pekka

    2018-03-01

    Understanding how person-to-person contagious processes spread through a population requires accurate information on connections between population members. However, such connectivity data, when collected via interview, is often incomplete due to partial recall, respondent fatigue or study design, e.g., fixed choice designs (FCD) truncate out-degree by limiting the number of contacts each respondent can report. Past research has shown how FCD truncation affects network properties, but its implications for predicted speed and size of spreading processes remain largely unexplored. To study the impact of degree truncation on predictions of spreading process outcomes, we generated collections of synthetic networks containing specific properties (degree distribution, degree-assortativity, clustering), and also used empirical social network data from 75 villages in Karnataka, India. We simulated FCD using various truncation thresholds and ran a susceptible-infectious-recovered (SIR) process on each network. We found that spreading processes propagated on truncated networks resulted in slower and smaller epidemics, with a sudden decrease in prediction accuracy at a level of truncation that varied by network type. Our results have implications beyond FCD to truncation due to any limited sampling from a larger network. We conclude that knowledge of network structure is important for understanding the accuracy of predictions of process spread on degree truncated networks.
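
    A minimal sketch of the kind of experiment described above is given below: build a synthetic network, simulate fixed-choice truncation of reported contacts, and run a discrete-time SIR process on both the full and the truncated network. The network model, truncation rule and epidemic parameters are illustrative assumptions, not the study's configuration.

    ```python
    import random
    import networkx as nx

    def truncate_fcd(G, k, rng):
        """Fixed-choice design: each node reports at most k of its neighbours;
        the observed network is the union of reported ties."""
        H = nx.Graph()
        H.add_nodes_from(G.nodes())
        for u in G.nodes():
            nbrs = list(G.neighbors(u))
            for v in rng.sample(nbrs, min(k, len(nbrs))):
                H.add_edge(u, v)
        return H

    def sir_final_size(G, beta=0.1, gamma=0.05, seeds=5, rng=None):
        """Discrete-time SIR simulation; returns the fraction ever infected."""
        rng = rng or random.Random()
        status = {n: "S" for n in G.nodes()}
        for n in rng.sample(list(G.nodes()), seeds):
            status[n] = "I"
        while any(s == "I" for s in status.values()):
            new = dict(status)
            for n, s in status.items():
                if s == "I":
                    for v in G.neighbors(n):
                        if status[v] == "S" and rng.random() < beta:
                            new[v] = "I"
                    if rng.random() < gamma:
                        new[n] = "R"
            status = new
        return sum(s != "S" for s in status.values()) / G.number_of_nodes()

    rng = random.Random(1)
    G = nx.barabasi_albert_graph(2000, 3, seed=1)
    print("full network :", sir_final_size(G, rng=rng))
    print("FCD, k = 3   :", sir_final_size(truncate_fcd(G, 3, rng), rng=rng))
    ```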

  6. Environmental impacts caused by cemeteries and crematoria, new funeral technologies, and preferences of the Northeastern and Southern Brazilian population as for the funeral process.

    PubMed

    da Cruz, Nicholas Joseph Tavares; Lezana, Álvaro Guillermo Rojas; Freire Dos Santos, Paulo da Cruz; Santana Pinto, Ibsen Mateus Bittencourt; Zancan, Claudio; Silva de Souza, Gustavo Henrique

    2017-11-01

    Cemeteries and crematoria are the main funeral practices used in the world nowadays. This segment has been little studied, particularly regarding its possible environmental impacts, such as those derived from dental amalgam, prostheses, and dioxins, among others. This article aimed to identify the environmental impacts caused by cemeteries and crematoria and to point out new trends in funeral processes, such as freeze-drying and alkaline hydrolysis. The study is justified by the large share of the Brazilian population that does not know the environmental impacts caused by cemeteries and crematoria, and by the need to provide information about the new processes. For that, a survey was carried out with 400 people. The main results show that, among all the funeral processes, the new freeze-drying process was chosen by 33% of the sample. We also identified that the main reasons for choosing a funeral process were less environmental impact (28%), no after-death expenses (grave payment) (16.1%), and the possibility of keeping or scattering the remains wherever desired (14.9%). Finally, the new funeral processes were well accepted by the interviewed Brazilian population due to their benefits.

  7. The Dilution Effect and Information Integration in Perceptual Decision Making

    PubMed Central

    Hotaling, Jared M.; Cohen, Andrew L.; Shiffrin, Richard M.; Busemeyer, Jerome R.

    2015-01-01

    In cognitive science there is a seeming paradox: On the one hand, studies of human judgment and decision making have repeatedly shown that people systematically violate optimal behavior when integrating information from multiple sources. On the other hand, optimal models, often Bayesian, have been successful at accounting for information integration in fields such as categorization, memory, and perception. This apparent conflict could be due, in part, to different materials and designs that lead to differences in the nature of processing. Stimuli that require controlled integration of information, such as the quantitative or linguistic information (commonly found in judgment studies), may lead to suboptimal performance. In contrast, perceptual stimuli may lend themselves to automatic processing, resulting in integration that is closer to optimal. We tested this hypothesis with an experiment in which participants categorized faces based on resemblance to a family patriarch. The amount of evidence contained in the top and bottom halves of each test face was independently manipulated. These data allow us to investigate a canonical example of sub-optimal information integration from the judgment and decision making literature, the dilution effect. Splitting the top and bottom halves of a face, a manipulation meant to encourage controlled integration of information, produced farther from optimal behavior and larger dilution effects. The Multi-component Information Accumulation model, a hybrid optimal/averaging model of information integration, successfully accounts for key accuracy, response time, and dilution effects. PMID:26406323

  8. Consequences of converting graded to action potentials upon neural information coding and energy efficiency.

    PubMed

    Sengupta, Biswa; Laughlin, Simon Barry; Niven, Jeremy Edward

    2014-01-01

    Information is encoded in neural circuits using both graded and action potentials, converting between them within single neurons and successive processing layers. This conversion is accompanied by information loss and a drop in energy efficiency. We investigate the biophysical causes of this loss of information and efficiency by comparing spiking neuron models, containing stochastic voltage-gated Na(+) and K(+) channels, with generator potential and graded potential models lacking voltage-gated Na(+) channels. We identify three causes of information loss in the generator potential that are the by-product of action potential generation: (1) the voltage-gated Na(+) channels necessary for action potential generation increase intrinsic noise and (2) introduce non-linearities, and (3) the finite duration of the action potential creates a 'footprint' in the generator potential that obscures incoming signals. These three processes reduce information rates by ∼50% in generator potentials, to ∼3 times that of spike trains. Both generator potentials and graded potentials consume almost an order of magnitude less energy per second than spike trains. Because of the lower information rates of generator potentials they are substantially less energy efficient than graded potentials. However, both are an order of magnitude more efficient than spike trains due to the higher energy costs and low information content of spikes, emphasizing that there is a two-fold cost of converting analogue to digital; information loss and cost inflation.

  9. Consequences of Converting Graded to Action Potentials upon Neural Information Coding and Energy Efficiency

    PubMed Central

    Sengupta, Biswa; Laughlin, Simon Barry; Niven, Jeremy Edward

    2014-01-01

    Information is encoded in neural circuits using both graded and action potentials, converting between them within single neurons and successive processing layers. This conversion is accompanied by information loss and a drop in energy efficiency. We investigate the biophysical causes of this loss of information and efficiency by comparing spiking neuron models, containing stochastic voltage-gated Na+ and K+ channels, with generator potential and graded potential models lacking voltage-gated Na+ channels. We identify three causes of information loss in the generator potential that are the by-product of action potential generation: (1) the voltage-gated Na+ channels necessary for action potential generation increase intrinsic noise and (2) introduce non-linearities, and (3) the finite duration of the action potential creates a 'footprint' in the generator potential that obscures incoming signals. These three processes reduce information rates by ∼50% in generator potentials, to ∼3 times that of spike trains. Both generator potentials and graded potentials consume almost an order of magnitude less energy per second than spike trains. Because of the lower information rates of generator potentials they are substantially less energy efficient than graded potentials. However, both are an order of magnitude more efficient than spike trains due to the higher energy costs and low information content of spikes, emphasizing that there is a two-fold cost of converting analogue to digital; information loss and cost inflation. PMID:24465197

  10. The Dilution Effect and Information Integration in Perceptual Decision Making.

    PubMed

    Hotaling, Jared M; Cohen, Andrew L; Shiffrin, Richard M; Busemeyer, Jerome R

    2015-01-01

    In cognitive science there is a seeming paradox: On the one hand, studies of human judgment and decision making have repeatedly shown that people systematically violate optimal behavior when integrating information from multiple sources. On the other hand, optimal models, often Bayesian, have been successful at accounting for information integration in fields such as categorization, memory, and perception. This apparent conflict could be due, in part, to different materials and designs that lead to differences in the nature of processing. Stimuli that require controlled integration of information, such as the quantitative or linguistic information (commonly found in judgment studies), may lead to suboptimal performance. In contrast, perceptual stimuli may lend themselves to automatic processing, resulting in integration that is closer to optimal. We tested this hypothesis with an experiment in which participants categorized faces based on resemblance to a family patriarch. The amount of evidence contained in the top and bottom halves of each test face was independently manipulated. These data allow us to investigate a canonical example of sub-optimal information integration from the judgment and decision making literature, the dilution effect. Splitting the top and bottom halves of a face, a manipulation meant to encourage controlled integration of information, produced farther from optimal behavior and larger dilution effects. The Multi-component Information Accumulation model, a hybrid optimal/averaging model of information integration, successfully accounts for key accuracy, response time, and dilution effects.

  11. COM3/369: Knowledge-based Information Systems: A new approach for the representation and retrieval of medical information

    PubMed Central

    Mann, G; Birkmann, C; Schmidt, T; Schaeffler, V

    1999-01-01

    Introduction Present solutions for the representation and retrieval of medical information from online sources are not very satisfying. Either the retrieval process lacks precision and completeness, or the representation does not support the update and maintenance of the represented information. Most efforts are currently put into improving the combination of search engines and HTML-based documents. However, due to the current shortcomings of methods for natural language understanding, there are clear limitations to this approach. Furthermore, this approach does not solve the maintenance problem. At least medical information exceeding a certain complexity seems to call for approaches that rely on structured knowledge representation and corresponding retrieval mechanisms. Methods Knowledge-based information systems are based on the following fundamental ideas. The representation of information is based on ontologies that define the structure of the domain's concepts and their relations. Views on domain models are defined and represented as retrieval schemata. Retrieval schemata can be interpreted as canonical query types focussing on specific aspects of the provided information (e.g. diagnosis or therapy centred views). Based on these retrieval schemata it can be decided which parts of the information in the domain model must be represented explicitly and formalised to support the retrieval process. Propositional logic is used as the representation language. All other information can be represented in a structured but informal way using text, images etc. Layout schemata are used to assign layout information to retrieved domain concepts. Depending on the target environment, HTML or XML can be used. Results Based on this approach two knowledge-based information systems have been developed. The 'Ophthalmologic Knowledge-based Information System for Diabetic Retinopathy' (OKIS-DR) provides information on diagnoses, findings, examinations, guidelines, and reference images related to diabetic retinopathy. OKIS-DR uses combinations of findings to specify the information that must be retrieved. The second system focuses on nutrition-related allergies and intolerances. Information on the allergies and intolerances of a patient is used to retrieve general information on the specified combination of allergies and intolerances. As a special feature, the system generates tables showing food types and products that are tolerated or not tolerated by patients. Evaluation by external experts and user groups showed that the described approach of knowledge-based information systems increases the precision and completeness of knowledge retrieval. Due to the structured and non-redundant representation of information, the maintenance and update of the information can be simplified. Both systems are available as WWW-based online knowledge bases and CD-ROMs (cf. http://mta.gsf.de topic: products).

  12. Improving medical stores management through automation and effective communication

    PubMed Central

    Kumar, Ashok; Cariappa, M.P.; Marwaha, Vishal; Sharma, Mukti; Arora, Manu

    2016-01-01

    Background Medical stores management in hospitals is a tedious and time-consuming chore, with limited resources tasked for the purpose and poor penetration of Information Technology. The process of automation is slow paced due to various inherent factors and is being challenged by increasing inventory loads and escalating budgets for the procurement of drugs. Methods We carried out an in-depth case study at the Medical Stores of a tertiary care health care facility. An iterative six-step Quality Improvement (QI) process was implemented based on the Plan–Do–Study–Act (PDSA) cycle. The QI process was modified as required to fit the medical stores management model. The results were evaluated after six months. Results After the implementation of the QI process, 55 drugs in the medical store inventory that had expired since 2009 were replaced with fresh stock by the suppliers as a result of effective communication through upgraded database management. Various pending audit objections were dropped due to the streamlined documentation and processes. Inventory management improved drastically due to automation, with disposal orders being initiated four months prior to the expiry of drugs and correct demands being generated two months prior to the depletion of stocks. The monthly expense summary of drugs was now being completed within ten days of the closing month. Conclusion Improving communication systems within the hospital, with vendor database management and outreach to clinicians, is important. Automation of inventory management needs to be simple and user-friendly, utilizing existing hardware. Physical monitoring of stores is indispensable, especially due to their scattered nature. Staff training and standardized documentation protocols are the other keystones of optimal medical store management. PMID:26900225

  13. Long-term ERT monitoring of biogeochemical changes of an aged hydrocarbon contamination.

    PubMed

    Caterina, David; Flores Orozco, Adrian; Nguyen, Frédéric

    2017-06-01

    Adequate management of contaminated sites requires information with improved spatio-temporal resolution, in particular to assess bio-geochemical processes, such as the transformation and degradation of contaminants, precipitation of minerals or changes in groundwater geochemistry occurring during and after remediation procedures. Electrical Resistivity Tomography (ERT), a geophysical method sensitive to pore-fluid and pore-geometry properties, makes it possible to gain quasi-continuous information about subsurface properties in real-time and has consequently been widely used for the characterization of hydrocarbon-impacted sediments. However, its application for the long-term monitoring of processes accompanying natural or engineered bioremediation is still difficult due to the poor understanding of the role that biogeochemical processes play in the electrical signatures. For in-situ studies, the task is further complicated by the variable signal-to-noise ratio and the variations of environmental parameters leading to resolution changes in the electrical images. In this work, we present ERT imaging results for data collected over a period of two years on a site affected by a diesel fuel contamination and undergoing bioremediation. We report low electrical resistivity anomalies in areas associated with the highest contaminant concentrations, likely due to transformations of the contaminant by microbial activity and the accompanying release of metabolic products. We also report large seasonal variations of the bulk electrical resistivity in the contaminated areas in correlation with temperature and groundwater level fluctuations. However, the amplitude of bulk electrical resistivity variations largely exceeds the amplitude expected given existing petrophysical models. Our results suggest that the variations in electrical properties are mainly controlled by microbial activity, which in turn depends on soil temperature and hydrogeological conditions. Therefore, ERT can be suggested as a promising tool to track microbial activity during bioremediation even though further research is still needed to completely understand the bio-geochemical processes involved and their impact on electrical signatures. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Long-term ERT monitoring of biogeochemical changes of an aged hydrocarbon contamination

    NASA Astrophysics Data System (ADS)

    Caterina, David; Flores Orozco, Adrian; Nguyen, Frédéric

    2017-06-01

    Adequate management of contaminated sites requires information with improved spatio-temporal resolution, in particular to assess bio-geochemical processes, such as the transformation and degradation of contaminants, precipitation of minerals or changes in groundwater geochemistry occurring during and after remediation procedures. Electrical Resistivity Tomography (ERT), a geophysical method sensitive to pore-fluid and pore-geometry properties, makes it possible to gain quasi-continuous information about subsurface properties in real-time and has consequently been widely used for the characterization of hydrocarbon-impacted sediments. However, its application for the long-term monitoring of processes accompanying natural or engineered bioremediation is still difficult due to the poor understanding of the role that biogeochemical processes play in the electrical signatures. For in-situ studies, the task is further complicated by the variable signal-to-noise ratio and the variations of environmental parameters leading to resolution changes in the electrical images. In this work, we present ERT imaging results for data collected over a period of two years on a site affected by a diesel fuel contamination and undergoing bioremediation. We report low electrical resistivity anomalies in areas associated with the highest contaminant concentrations, likely due to transformations of the contaminant by microbial activity and the accompanying release of metabolic products. We also report large seasonal variations of the bulk electrical resistivity in the contaminated areas in correlation with temperature and groundwater level fluctuations. However, the amplitude of bulk electrical resistivity variations largely exceeds the amplitude expected given existing petrophysical models. Our results suggest that the variations in electrical properties are mainly controlled by microbial activity, which in turn depends on soil temperature and hydrogeological conditions. Therefore, ERT can be suggested as a promising tool to track microbial activity during bioremediation even though further research is still needed to completely understand the bio-geochemical processes involved and their impact on electrical signatures.

  15. Modernization of the NASA scientific and technical information program

    NASA Technical Reports Server (NTRS)

    Cotter, Gladys A.; Hunter, Judy F.; Ostergaard, K.

    1993-01-01

    The NASA Scientific and Technical Information Program utilizes a technology infrastructure assembled in the mid 1960s to late 1970s to process and disseminate its information products. When this infrastructure was developed it placed NASA as a leader in processing STI. The retrieval engine for the STI database was the first of its kind and was used as the basis for developing commercial, other U.S., and foreign government agency retrieval systems. Due to the combination of changes in user requirements and the tremendous increase in technological capabilities readily available in the marketplace, this infrastructure is no longer the most cost-effective or efficient methodology available. Consequently, the NASA STI Program is pursuing a modernization effort that applies new technology to current processes to provide near-term benefits to the user. In conjunction with this activity, we are developing a long-term modernization strategy designed to transition the Program to a multimedia, global 'library without walls.' Critical pieces of the long-term strategy include streamlining access to sources of STI by using advances in computer networking and graphical user interfaces; creating and disseminating technical information in various electronic media including optical disks, video, and full text; and establishing a Technology Focus Group to maintain a current awareness of emerging technology and to plan for the future.

  16. Modeling information diffusion in time-varying community networks

    NASA Astrophysics Data System (ADS)

    Cui, Xuelian; Zhao, Narisa

    2017-12-01

    Social networks are rarely static, and they typically have time-varying network topologies. A great number of studies have modeled temporal networks and explored social contagion processes within these models; however, few of these studies have considered community structure variations. In this paper, we present a study of how the time-varying property of a modular structure influences the information dissemination. First, we propose a continuous-time Markov model of information diffusion where two parameters, mobility rate and community attractiveness, are introduced to address the time-varying nature of the community structure. The basic reproduction number is derived, and the accuracy of this model is evaluated by comparing the simulation and theoretical results. Furthermore, numerical results illustrate that generally both the mobility rate and community attractiveness significantly promote the information diffusion process, especially in the initial outbreak stage. Moreover, the strength of this promotion effect is much stronger when the modularity is higher. Counterintuitively, it is found that when all communities have the same attractiveness, social mobility no longer accelerates the diffusion process. In addition, we show that the local spreading in the advantage group has been greatly enhanced due to the agglomeration effect caused by the social mobility and community attractiveness difference, which thus increases the global spreading.

  17. [Treatment of sensory information in neurodevelopmental disorders].

    PubMed

    Zoenen, D; Delvenne, V

    2018-01-01

    The processing of information coming from the elementary sensory systems conditions the development and fulfilment of a child's abilities. A dysfunction in the processing of sensory stimuli may generate behavioural patterns that might affect a child's learning capacities as well as his relational sphere. The DSM-5 recognizes sensory abnormalities as part of the symptomatology of Autism Spectrum Disorders. However, similar features are observed in other neurodevelopmental disorders. Over the years, these conditions have been the subject of numerous controversies. Nowadays, they are all grouped together under the term of Neurodevelopmental Disorders in the DSM-5. The semiology of these disorders is rich and complex due to the frequent presence of comorbidities and their impact on cognitive, behavioural and sensorimotor organization, but also on a child's personality, as well as his family, school, or social relationships. We carried out a review of the literature on the alterations in the processing of sensory information in ASD, but also in the different neurodevelopmental clinical pictures, in order to show their impact on child development. Atypical sensory profiles have been demonstrated in several neurodevelopmental clinical populations such as Autism Spectrum Disorder, Attention Deficit/Hyperactivity Disorder, Dysphasia and Intellectual Disability. Abnormalities in the processing of sensory information should be systematically evaluated in child developmental disorders.

  18. [The pedunculopontine nucleus. A structure involved in motor and emotional processing].

    PubMed

    Blanco-Lezcano, L; Pavón-Fuentes, N; Serrano-Sánchez, T; Blanco-Lezcano, V; Coro-Grave de Peralta, Y; Joseph-Bouza, Y

    There is currently a growing interest in conducting studies into the electrical and neurochemical activity of the pedunculopontine nucleus (PPN) due to the privileged position occupied by this structure in the flow of information to and from the cortex. This nucleus acts as a relay, not only for the motor information that is processed in the basal ganglia but also for information of an emotional type, whose main centre is the nucleus accumbens. It is also strongly linked with the aspects that determine the mechanisms governing addiction to certain drugs. We conduct a detailed analysis of the main findings from studies of the role played by the PPN in the physiopathology of Parkinsonism, namely the study of metabolic activity, immunohistochemical studies with different tracers, electrophysiological studies that have confirmed the immunohistochemical observations, as well as deep electrical stimulation carried out in non-human primates. We also examine the part played by this structure in the processing of emotional information associated with different learning tasks. Overall, the authors grant the PPN a privileged position in the physiopathology of the axial disorders related to Parkinson's disease; its most important afference, stemming from the subthalamic nucleus, appears to play a key role in the understanding of the part played by the PPN in Parkinsonism.

  19. Real-time nondestructive monitoring of the gas tungsten arc welding (GTAW) process by combined airborne acoustic emission and non-contact ultrasonics

    NASA Astrophysics Data System (ADS)

    Zhang, Lu; Basantes-Defaz, Alexandra-Del-Carmen; Abbasi, Zeynab; Yuhas, Donald; Ozevin, Didem; Indacochea, Ernesto

    2018-03-01

    Welding is a key manufacturing process for many industries and may introduce defects into the welded parts, causing significant negative impacts and potentially ruining high-cost pieces. Therefore, a real-time process monitoring method is important to implement in order to avoid producing a low-quality weld. Due to the high surface temperature and possible contamination of the surface by contact transducers, the welding process should be monitored via non-contact transducers. In this paper, airborne acoustic emission (AE) transducers tuned at 60 kHz and non-contact ultrasonic testing (UT) transducers tuned at 500 kHz are implemented for real-time weld monitoring. AE is a passive nondestructive evaluation method that listens to the process noise and provides information about the uniformity of the manufacturing process. UT provides more quantitative information about weld defects. One of the most common weld defects, burn-through, is investigated. The influences of weld defects on AE signatures (time-driven data) and UT signals (received signal energy, change in peak frequency) are presented. The level of burn-through damage is determined by using a single method or the combined AE/UT methods.
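
    For reference, the two UT features named above (received signal energy and peak frequency) can be computed from a digitised waveform as in the short sketch below; the synthetic tone burst and the sampling rate are illustrative assumptions, not the experimental setup.

    ```python
    import numpy as np

    def ut_features(signal, fs):
        """Received signal energy and peak frequency of a digitised UT waveform."""
        energy = np.sum(np.square(signal))            # sum of squared samples
        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        return energy, freqs[np.argmax(spectrum)]

    # Synthetic 500 kHz tone burst in noise, sampled at 10 MHz (illustrative only).
    fs = 10e6
    t = np.arange(0, 200e-6, 1 / fs)
    burst = np.sin(2 * np.pi * 500e3 * t) * np.exp(-((t - 50e-6) / 20e-6) ** 2)
    signal = burst + 0.05 * np.random.randn(t.size)
    energy, f_peak = ut_features(signal, fs)
    print(f"energy = {energy:.1f}, peak frequency = {f_peak/1e3:.0f} kHz")
    ```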

  20. Front-Line Physicians' Satisfaction with Information Systems in Hospitals.

    PubMed

    Peltonen, Laura-Maria; Junttila, Kristiina; Salanterä, Sanna

    2018-01-01

    Day-to-day operations management in hospital units is difficult due to continuously varying situations, several actors involved and a vast number of information systems in use. The aim of this study was to describe front-line physicians' satisfaction with existing information systems needed to support the day-to-day operations management in hospitals. A cross-sectional survey was used and data chosen with stratified random sampling were collected in nine hospitals. Data were analyzed with descriptive and inferential statistical methods. The response rate was 65 % (n = 111). The physicians reported that information systems support their decision making to some extent, but they do not improve access to information nor are they tailored for physicians. The respondents also reported that they need to use several information systems to support decision making and that they would prefer one information system to access important information. Improved information access would better support physicians' decision making and has the potential to improve the quality of decisions and speed up the decision making process.

  1. IT-benchmarking of clinical workflows: concept, implementation, and evaluation.

    PubMed

    Thye, Johannes; Straede, Matthias-Christopher; Liebe, Jan-David; Hübner, Ursula

    2014-01-01

    Due to the emerging evidence of health IT as an opportunity and a risk for clinical workflows, health IT must undergo continuous measurement of its efficacy and efficiency. IT-benchmarks are a proven means of providing this information. The aim of this study was to enhance the methodology of an existing benchmarking procedure by including, in particular, new indicators of clinical workflows and by proposing new types of visualisation. Drawing on the concept of information logistics, we propose four workflow descriptors that were applied to four clinical processes. General and specific indicators were derived from these descriptors and processes. 199 chief information officers (CIOs) took part in the benchmarking. These hospitals were assigned to reference groups of a similar size and ownership from a total of 259 hospitals. Stepwise and comprehensive feedback was given to the CIOs. Most participants who evaluated the benchmark rated the procedure as very good, good, or rather good (98.4%). Benchmark information was used by CIOs for getting a general overview, advancing IT, preparing negotiations with board members, and arguing for a new IT project.

  2. BIM Based Virtual Environment for Fire Emergency Evacuation

    PubMed Central

    Rezgui, Yacine; Ong, Hoang N.

    2014-01-01

    Recent building emergency management research has highlighted the need for the effective utilization of dynamically changing building information. BIM (building information modelling) can play a significant role in this process due to its comprehensive and standardized data format and integrated process. This paper introduces a BIM based virtual environment supported by virtual reality (VR) and a serious game engine to address several key issues for building emergency management, for example, timely two-way information updating and better emergency awareness training. The focus of this paper is how to utilize BIM as a comprehensive building information provider working with virtual reality technologies to build an adaptable immersive serious game environment that provides real-time fire evacuation guidance. The innovation lies in the seamless integration between BIM and a serious game based virtual reality (VR) environment, aiming at practical problem solving by leveraging state-of-the-art computing technologies. The system has been tested for its robustness and functionality against the development requirements, and the results showed promising potential to support more effective emergency management. PMID:25197704

  3. Camouflage target detection via hyperspectral imaging plus information divergence measurement

    NASA Astrophysics Data System (ADS)

    Chen, Yuheng; Chen, Xinhua; Zhou, Jiankang; Ji, Yiqun; Shen, Weimin

    2016-01-01

    Target detection is one of the most important applications in remote sensing. Accurate discrimination of camouflaged targets now often relies on spectral imaging, owing to its ability to acquire high-resolution spectral and spatial information and the wealth of available data processing methods. In this paper, the hyper-spectral imaging technique, together with the spectral information divergence measure, is used to solve the camouflage target detection problem. A self-developed visible-band hyper-spectral imaging device is used to collect data cubes of an experimental scene, after which spectral information divergences are computed to discriminate camouflage and anomaly from the background. Full-band information divergences are measured to evaluate the detection results both visually and quantitatively. Information divergence measurement proves to be a low-cost and effective tool for target detection and can be extended to other detection applications beyond spectral imaging.
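
    As a rough illustration of the divergence measure mentioned above, the spectral information divergence (SID) between a test spectrum and a reference spectrum can be computed as the sum of two relative entropies over band-normalized spectra. The Python sketch below uses made-up five-band spectra and is not the authors' implementation.

        import numpy as np

        def spectral_information_divergence(x, y, eps=1e-12):
            # Generic SID: normalize each spectrum so its bands form a probability
            # distribution, then sum the two relative entropies D(p||q) + D(q||p).
            p = np.asarray(x, float) / (np.sum(x) + eps)
            q = np.asarray(y, float) / (np.sum(y) + eps)
            p, q = p + eps, q + eps          # avoid log(0)
            return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

        # Usage with hypothetical 5-band spectra: background vs. a camouflaged target.
        background = [0.21, 0.25, 0.30, 0.33, 0.35]
        target     = [0.20, 0.24, 0.45, 0.50, 0.38]
        print(round(spectral_information_divergence(background, target), 4))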

  4. A Concept of Constructing a Common Information Space for High Tech Programs Using Information Analytical Systems

    NASA Astrophysics Data System (ADS)

    Zakharova, Alexandra A.; Kolegova, Olga A.; Nekrasova, Maria E.

    2016-04-01

    The paper deals with the issues in program management used for engineering innovative products. The existing project management tools were analyzed. The aim is to develop a decision support system that takes into account the features of program management used for high-tech products: research intensity, a high level of technical risks, unpredictable results due to the impact of various external factors, availability of several implementing agencies. The need for involving experts and using intelligent techniques for information processing is demonstrated. A conceptual model of common information space to support communication between members of the collaboration on high-tech programs has been developed. The structure and objectives of the information analysis system “Geokhod” were formulated with the purpose to implement the conceptual model of common information space in the program “Development and production of new class mining equipment - “Geokhod”.

  5. Graded, Dynamically Routable Information Processing with Synfire-Gated Synfire Chains.

    PubMed

    Wang, Zhuo; Sornborger, Andrew T; Tao, Louis

    2016-06-01

    Coherent neural spiking and local field potentials are believed to be signatures of the binding and transfer of information in the brain. Coherent activity has now been measured experimentally in many regions of mammalian cortex. Recently experimental evidence has been presented suggesting that neural information is encoded and transferred in packets, i.e., in stereotypical, correlated spiking patterns of neural activity. Due to their relevance to coherent spiking, synfire chains are one of the main theoretical constructs that have been appealed to in order to describe coherent spiking and information transfer phenomena. However, for some time, it has been known that synchronous activity in feedforward networks asymptotically either approaches an attractor with fixed waveform and amplitude, or fails to propagate. This has limited the classical synfire chain's ability to explain graded neuronal responses. Recently, we have shown that pulse-gated synfire chains are capable of propagating graded information coded in mean population current or firing rate amplitudes. In particular, we showed that it is possible to use one synfire chain to provide gating pulses and a second, pulse-gated synfire chain to propagate graded information. We called these circuits synfire-gated synfire chains (SGSCs). Here, we present SGSCs in which graded information can rapidly cascade through a neural circuit, and show a correspondence between this type of transfer and a mean-field model in which gating pulses overlap in time. We show that SGSCs are robust in the presence of variability in population size, pulse timing and synaptic strength. Finally, we demonstrate the computational capabilities of SGSC-based information coding by implementing a self-contained, spike-based, modular neural circuit that is triggered by streaming input, processes the input, then makes a decision based on the processed information and shuts itself down.
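
    A loose cartoon of the pulse-gating idea, with made-up parameters rather than the authors' spiking model, is sketched below: a graded amplitude hops along a chain because a gating pulse lets exactly one layer read its upstream neighbour at each step.

        import numpy as np

        # Cartoon of pulse-gated routing (a sketch with hypothetical parameters, not
        # the authors' spiking model): a graded amplitude hops down a chain because,
        # at each step, a gating pulse lets exactly one layer read its upstream layer.
        n_layers, w = 5, 1.0
        amplitude = np.zeros(n_layers)
        amplitude[0] = 0.37                       # graded value loaded into layer 0

        for t in range(1, n_layers):              # one gating pulse per transfer step
            gate = np.zeros(n_layers, dtype=bool)
            gate[t] = True                        # the pulse selects which layer updates
            upstream = np.roll(amplitude, 1)      # value held by the previous layer
            # gated layer copies (scaled by w); ungated layers are simply cleared here,
            # standing in for the leak/decay of the full rate model
            amplitude = np.where(gate, w * upstream, 0.0)
            print(t, np.round(amplitude, 2))      # the 0.37 packet moves layer by layer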

  6. Changes in Phenolic Acid Content in Maize during Food Product Processing.

    PubMed

    Butts-Wilmsmeyer, Carrie J; Mumm, Rita H; Rausch, Kent D; Kandhola, Gurshagan; Yana, Nicole A; Happ, Mary M; Ostezan, Alexandra; Wasmund, Matthew; Bohn, Martin O

    2018-04-04

    The notion that many nutrients and beneficial phytochemicals in maize are lost due to food product processing is common, but this has not been studied in detail for the phenolic acids. Information regarding changes in phenolic acid content throughout processing is highly valuable because some phenolic acids are chemopreventive agents of aging-related diseases. It is unknown when and why these changes in phenolic acid content might occur during processing, whether some maize genotypes might be more resistant to processing-induced changes in phenolic acid content than other genotypes, or if processing affects the bioavailability of phenolic acids in maize-based food products. For this study, a laboratory-scale processing protocol was developed and used to process whole maize kernels into toasted cornflakes. High-throughput microscale wet-lab analyses were applied to determine the concentrations of soluble and insoluble-bound phenolic acids in samples of grain, three intermediate processing stages, and toasted cornflakes obtained from 12 ex-PVP maize inbreds and seven hybrids. In the grain, insoluble-bound ferulic acid was the most common phenolic acid, followed by insoluble-bound p-coumaric acid and soluble cinnamic acid, a precursor to the phenolic acids. Notably, the ferulic acid content was approximately 1950 μg/g, more than ten times the concentration of many fruits and vegetables. Processing reduced the content of the phenolic acids regardless of the genotype. Most changes occurred during dry milling due to the removal of the bran. The concentration of bioavailable soluble ferulic and p-coumaric acid increased negligibly due to thermal stresses. Therefore, the current dry-milling-based processing techniques used to manufacture many maize-based foods, including breakfast cereals, are not conducive to increasing the content of bioavailable phenolics in processed maize food products. This suggests that while maize is an excellent source of phenolics, alternative or complementary processing methods must be developed before this nutritional resource can be utilized.

  7. More than a memory: Confirmatory visual search is not caused by remembering a visual feature.

    PubMed

    Rajsic, Jason; Pratt, Jay

    2017-10-01

    Previous research has demonstrated a preference for positive over negative information in visual search; asking whether a target object is green biases search towards green objects, even when this entails more perceptual processing than searching non-green objects. The present study investigated whether this confirmatory search bias is due to the presence of one particular (e.g., green) color in memory during search. Across two experiments, we show that this is not the critical factor in generating a confirmation bias in search. Search slowed proportionally to the number of stimuli whose color matched the color held in memory only when the color was remembered as part of the search instructions. These results suggest that biased search for information is due to a particular attentional selection strategy, and not to memory-driven attentional biases. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Automatic generation of pictorial transcripts of video programs

    NASA Astrophysics Data System (ADS)

    Shahraray, Behzad; Gibbon, David C.

    1995-03-01

    An automatic authoring system for the generation of pictorial transcripts of video programs which are accompanied by closed caption information is presented. A number of key frames, each of which represents the visual information in a segment of the video (i.e., a scene), are selected automatically by performing a content-based sampling of the video program. The textual information is recovered from the closed caption signal and is initially segmented based on its implied temporal relationship with the video segments. The text segmentation boundaries are then adjusted, based on lexical analysis and/or caption control information, to account for synchronization errors due to possible delays in the detection of scene boundaries or the transmission of the caption information. The closed caption text is further refined through linguistic processing for conversion to lowercase with correct capitalization. The key frames and the related text generate a compact multimedia presentation of the contents of the video program which lends itself to efficient storage and transmission. This compact representation can be viewed on a computer screen, or used to generate the input to a commercial text processing package to produce a printed version of the program.
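
    The content-based sampling step described above can be imagined, in a much simplified form, as keeping a new key frame whenever the current frame differs enough from the last key frame. The sketch below uses a plain mean-absolute-difference threshold as a stand-in for the paper's actual scene-change detector.

        import numpy as np

        def select_key_frames(frames, threshold=0.25):
            """Content-based sampling sketch (assumed approach, not the paper's exact
            method): start a new scene whenever the mean absolute difference between
            the current frame and the last key frame exceeds a threshold."""
            key_frames = [0]                      # always keep the first frame
            last = frames[0].astype(float)
            for i, frame in enumerate(frames[1:], start=1):
                diff = np.mean(np.abs(frame.astype(float) - last)) / 255.0
                if diff > threshold:              # visual content changed enough
                    key_frames.append(i)
                    last = frame.astype(float)
            return key_frames

        # usage: frames could be a list of grayscale images as uint8 arrays
        frames = [np.full((4, 4), v, dtype=np.uint8) for v in (10, 12, 200, 205, 40)]
        print(select_key_frames(frames))          # -> [0, 2, 4]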

  9. Defective motion processing in children with cerebral visual impairment due to periventricular white matter damage.

    PubMed

    Weinstein, Joel M; Gilmore, Rick O; Shaikh, Sumera M; Kunselman, Allen R; Trescher, William V; Tashima, Lauren M; Boltz, Marianne E; McAuliffe, Matthew B; Cheung, Albert; Fesi, Jeremy D

    2012-07-01

    We sought to characterize visual motion processing in children with cerebral visual impairment (CVI) due to periventricular white matter damage caused by either hydrocephalus (eight individuals) or periventricular leukomalacia (PVL) associated with prematurity (11 individuals). Using steady-state visually evoked potentials (ssVEP), we measured cortical activity related to motion processing for two distinct types of visual stimuli: 'local' motion patterns thought to activate mainly primary visual cortex (V1), and 'global' or coherent patterns thought to activate higher cortical visual association areas (V3, V5, etc.). We studied three groups of children: (1) 19 children with CVI (mean age 9y 6mo [SD 3y 8mo]; 9 male; 10 female); (2) 40 neurologically and visually normal comparison children (mean age 9y 6mo [SD 3y 1mo]; 18 male; 22 female); and (3) because strabismus and amblyopia are common in children with CVI, a group of 41 children without neurological problems who had visual deficits due to amblyopia and/or strabismus (mean age 7y 8mo [SD 2y 8mo]; 28 male; 13 female). We found that the processing of global as opposed to local motion was preferentially impaired in individuals with CVI, especially for slower target velocities (p=0.028). Motion processing is impaired in children with CVI. ssVEP may provide useful and objective information about the development of higher visual function in children at risk for CVI. © The Authors. Journal compilation © Mac Keith Press 2011.

  10. Effects of visual information regarding allocentric processing in haptic parallelity matching.

    PubMed

    Van Mier, Hanneke I

    2013-10-01

    Research has revealed that haptic perception of parallelity deviates from physical reality. Large and systematic deviations have been found in haptic parallelity matching most likely due to the influence of the hand-centered egocentric reference frame. Providing information that increases the influence of allocentric processing has been shown to improve performance on haptic matching. In this study allocentric processing was stimulated by providing informative vision in haptic matching tasks that were performed using hand- and arm-centered reference frames. Twenty blindfolded participants (ten men, ten women) explored the orientation of a reference bar with the non-dominant hand and subsequently matched (task HP) or mirrored (task HM) its orientation on a test bar with the dominant hand. Visual information was provided by means of informative vision with participants having full view of the test bar, while the reference bar was blocked from their view (task VHP). To decrease the egocentric bias of the hands, participants also performed a visual haptic parallelity drawing task (task VHPD) using an arm-centered reference frame, by drawing the orientation of the reference bar. In all tasks, the distance between and orientation of the bars were manipulated. A significant effect of task was found; performance improved from task HP, to VHP to VHPD, and HM. Significant effects of distance were found in the first three tasks, whereas orientation and gender effects were only significant in tasks HP and VHP. The results showed that stimulating allocentric processing by means of informative vision and reducing the egocentric bias by using an arm-centered reference frame led to most accurate performance on parallelity matching. © 2013 Elsevier B.V. All rights reserved.

  11. Memory and comprehension deficits in spatial descriptions of children with non-verbal and reading disabilities.

    PubMed

    Mammarella, Irene C; Meneghetti, Chiara; Pazzaglia, Francesca; Cornoldi, Cesare

    2014-01-01

    The present study investigated the difficulties encountered by children with non-verbal learning disability (NLD) and reading disability (RD) when processing spatial information derived from descriptions, based on the assumption that both groups should find it more difficult than matched controls, but for different reasons, i.e., due to a memory encoding difficulty in cases of RD and to spatial information comprehension problems in cases of NLD. Spatial descriptions from both survey and route perspectives were presented to 9-12-year-old children divided into three groups: NLD (N = 12); RD (N = 12), and typically developing controls (TD; N = 15); then participants completed a sentence verification task and a memory for locations task. The sentence verification task was presented in two conditions: in one the children could refer to the text while answering the questions (i.e., text present condition), and in the other the text was withdrawn (i.e., text absent condition). Results showed that the RD group benefited from the text present condition, but was impaired to the same extent as the NLD group in the text absent condition, suggesting that the NLD children's difficulty is due mainly to their poor comprehension of spatial descriptions, while the RD children's difficulty is due more to a memory encoding problem. These results are discussed in terms of their implications in the neuropsychological profiles of children with NLD or RD, and the processes involved in spatial descriptions.

  12. Brain reflections: A circuit-based framework for understanding information processing and cognitive control.

    PubMed

    Gratton, Gabriele

    2018-03-01

    Here, I propose a view of the architecture of the human information processing system, and of how it can be adapted to changing task demands (which is the hallmark of cognitive control). This view is informed by an interpretation of brain activity as reflecting the excitability level of neural representations, encoding not only stimuli and temporal contexts, but also action plans and task goals. The proposed cognitive architecture includes three types of circuits: open circuits, involved in feed-forward processing such as that connecting stimuli with responses and characterized by brief, transient brain activity; and two types of closed circuits, positive feedback circuits (characterized by sustained, high-frequency oscillatory activity), which help select and maintain representations, and negative feedback circuits (characterized by brief, low-frequency oscillatory bursts), which are instead associated with changes in representations. Feed-forward activity is primarily responsible for the spread of activation along the information processing system. Oscillatory activity, instead, controls this spread. Sustained oscillatory activity due to both local cortical circuits (gamma) and longer corticothalamic circuits (alpha and beta) allows for the selection of individuated representations. Through the interaction of these circuits, it also allows for the preservation of representations across different temporal spans (sensory and working memory) and their spread across the brain. In contrast, brief bursts of oscillatory activity, generated by novel and/or conflicting information, lead to the interruption of sustained oscillatory activity and promote the generation of new representations. I discuss how this framework can account for a number of psychological and behavioral phenomena. © 2017 Society for Psychophysiological Research.

  13. Threat processing in generalized social phobia: an investigation of interpretation biases in ambiguous facial affect.

    PubMed

    Jusyte, Aiste; Schönenberg, Michael

    2014-06-30

    Facial affect is one of the most important information sources during the course of social interactions, but it is susceptible to distortion due to its complex and dynamic nature. Socially anxious individuals have been shown to exhibit alterations in the processing of social information, such as an attentional and interpretative bias toward threatening information. This may be one of the key factors contributing to the development and maintenance of anxious psychopathology. The aim of the current study was to investigate whether a threat-related interpretation bias is evident for ambiguous facial stimuli in a population of individuals with generalized Social Anxiety Disorder (gSAD) as compared to healthy controls. Participants judged ambiguous happy/fearful, angry/fearful and angry/happy blends varying in intensity and rated the predominant affective expression. The results obtained in this study do not indicate that gSAD is associated with a biased interpretation of ambiguous facial affect. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  14. How do illness-anxious individuals process health-threatening information? A systematic review of evidence for the cognitive-behavioral model.

    PubMed

    Leonidou, Chrysanthi; Panayiotou, Georgia

    2018-08-01

    According to the cognitive-behavioral model, illness anxiety is developed and maintained through biased processing of health-threatening information and maladaptive responses to such information. This study is a systematic review of research that attempted to validate central tenets of the cognitive-behavioral model regarding etiological and maintenance mechanisms in illness anxiety. Sixty-two studies, including correlational and experimental designs, were identified through a systematic search of databases and were evaluated for their quality. Outcomes were synthesized following a qualitative thematic approach under categories of theoretically driven mechanisms derived from the cognitive-behavioral model: attention, memory and interpretation biases, perceived awareness and inaccuracy in perception of somatic sensations, negativity bias, emotion dysregulation, and behavioral avoidance. Findings partly support the cognitive-behavioral model, but several of its hypothetical mechanisms only receive weak support due to the scarcity of relevant studies. Directions for future research are suggested based on identified gaps in the existing literature. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. Musical expertise is related to altered functional connectivity during audiovisual integration

    PubMed Central

    Paraskevopoulos, Evangelos; Kraneburg, Anja; Herholz, Sibylle Cornelia; Bamidis, Panagiotis D.; Pantev, Christo

    2015-01-01

    The present study investigated the cortical large-scale functional network underpinning audiovisual integration via magnetoencephalographic recordings. The reorganization of this network related to long-term musical training was investigated by comparing musicians to nonmusicians. Connectivity was calculated on the basis of the estimated mutual information of the sources’ activity, and the corresponding networks were statistically compared. Nonmusicians’ results indicated that the cortical network associated with audiovisual integration supports visuospatial processing and attentional shifting, whereas a sparser network, related to spatial awareness, supports the identification of audiovisual incongruences. In contrast, musicians’ results showed enhanced connectivity in regions related to the identification of auditory pattern violations. Hence, nonmusicians rely on the processing of visual cues for the integration of audiovisual information, whereas musicians rely mostly on the corresponding auditory information. The large-scale cortical network underpinning multisensory integration is reorganized due to expertise in a cognitive domain that largely involves audiovisual integration, indicating long-term training-related neuroplasticity. PMID:26371305
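
    The connectivity measure mentioned above is based on the mutual information between source activities. A generic histogram-based estimator is sketched below on synthetic signals; the study used MEG source time series and its own estimation pipeline.

        import numpy as np

        def mutual_information(x, y, bins=16):
            """Histogram-based mutual information estimate (a generic sketch; the
            study used MEG source activity and its own estimator)."""
            joint, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = joint / joint.sum()                 # joint probability
            px = pxy.sum(axis=1, keepdims=True)       # marginal of x
            py = pxy.sum(axis=0, keepdims=True)       # marginal of y
            nz = pxy > 0                              # avoid log(0)
            return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

        # usage: two correlated synthetic "source" time series
        rng = np.random.default_rng(0)
        a = rng.normal(size=5000)
        b = a + 0.5 * rng.normal(size=5000)
        print(round(mutual_information(a, b), 2))     # > 0 for dependent signals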

  16. An Evaluation of a Natural Language Processing Tool for Identifying and Encoding Allergy Information in Emergency Department Clinical Notes

    PubMed Central

    Goss, Foster R.; Plasek, Joseph M.; Lau, Jason J.; Seger, Diane L.; Chang, Frank Y.; Zhou, Li

    2014-01-01

    Emergency department (ED) visits due to allergic reactions are common. Allergy information is often recorded in free-text provider notes; however, this domain has not yet been widely studied by the natural language processing (NLP) community. We developed an allergy module built on the MTERMS NLP system to identify and encode food, drug, and environmental allergies and allergic reactions. The module included updates to our lexicon using standard terminologies, and novel disambiguation algorithms. We developed an annotation schema and annotated 400 ED notes that served as a gold standard for comparison to MTERMS output. MTERMS achieved an F-measure of 87.6% for the detection of allergen names and no known allergies, 90% for identifying true reactions in each allergy statement where true allergens were also identified, and 69% for linking reactions to their allergen. These preliminary results demonstrate the feasibility of using NLP to extract and encode allergy information from clinical notes. PMID:25954363
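
    The F-measure reported above is the harmonic mean of precision and recall against the gold-standard annotations. A minimal sketch with made-up counts (generic formula, not MTERMS code):

        def f_measure(tp, fp, fn):
            """Harmonic mean of precision and recall, as used to score NLP output
            against a gold-standard annotation (generic formula, not MTERMS code)."""
            precision = tp / (tp + fp)
            recall = tp / (tp + fn)
            return 2 * precision * recall / (precision + recall)

        # usage with made-up counts: 90 true positives, 10 false positives, 15 misses
        print(round(f_measure(90, 10, 15), 3))   # -> 0.878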

  17. Evaluating the Effect of Processing Parameters on Porosity in Electron Beam Melted Ti-6Al-4V via Synchrotron X-ray Microtomography

    NASA Astrophysics Data System (ADS)

    Cunningham, Ross; Narra, Sneha P.; Ozturk, Tugce; Beuth, Jack; Rollett, A. D.

    2016-03-01

    Electron beam melting (EBM) is one of the subsets of direct metal additive manufacturing (AM), an emerging manufacturing method that fabricates metallic parts directly from a three-dimensional (3D) computer model by the successive melting of powder layers. This family of technologies has seen significant growth in recent years due to its potential to manufacture complex components with shorter lead times, reduced material waste and minimal post-processing as a "near-net-shape" process, making it of particular interest to the biomedical and aerospace industries. The popular titanium alloy Ti-6Al-4V has been the focus of multiple studies due to its importance to these two industries, which can be attributed to its high strength to weight ratio and corrosion resistance. While previous research has found that most tensile properties of EBM Ti-6Al-4V meet or exceed conventional manufacturing standards, fatigue properties have been consistently inferior due to a significant presence of porosity. Studies have shown that adjusting processing parameters can reduce overall porosity; however, they frequently utilize methods that give insufficient information to properly characterize the porosity (e.g., Archimedes' method). A more detailed examination of the result of process parameter adjustments on the size and spatial distribution of gas porosity was performed utilizing synchrotron-based x-ray microtomography with a minimum feature resolution of 1.5 µm. Cross-sectional melt pool area was varied systematically via process mapping. Increasing melt pool area through the speed function variable was observed to significantly reduce porosity in the part.

  18. The communication process in clinical settings.

    PubMed

    Mathews, J J

    1983-01-01

    The communication of information in clinical settings is fraught with problems despite avowed common aims of practitioners and patients. Some reasons for the problematic nature of clinical communication are incongruent frames of reference about what information ought to be shared, sociolinguistic differences and social distance between practitioners and patients. Communication between doctors and nurses is also problematic, largely due to differences in ideology between the professions about what ought to be communicated to patients about their illness and who is ratified to give such information. Recent social changes, such as the Patient Bill of Rights and informed consent which assure access to information, and new conceptualizations of the nurse's role, warrant continued study of the communication process especially in regard to what constitutes appropriate and acceptable information about a patient's illness and who ought to give such information to patients. The purpose of this paper is to outline characteristics of communication in clinical settings and to provide a literature review of patient and practitioner interaction studies in order to reflect on why information exchange is problematic in clinical settings. A framework for presentation of the problems employs principles from interaction and role theory to investigate clinical communication from three viewpoints: (1) the level of shared knowledge between participants; (2) the effect of status, role and ideology on transactions; and (3) the regulation of communication imposed by features of the institution.

  19. Space Telemetry for the Energy Industry

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Space telemetry is the process whereby information acquired in orbit is relayed to Earth. In 1981, Bill Sheen, President of Nu-Tech Industries, Inc., saw a need for a better way of monitoring flow, due to high costs of oil and gas, increasing oil field theft and a mounting requirement for more timely information to speed up accounting procedures. Sheen turned to NASA for assistance which was provided by Kerr Industrial Applications Center (KIAC). The system that emerged from two years of research, now in production at Nu-Tech's Fort Worth Texas facility, is known as the Remote Measurement and Control Network.

  20. The impact of common APSE interface set specifications on space station information systems

    NASA Technical Reports Server (NTRS)

    Diaz-Herrera, Jorge L.; Sibley, Edgar H.

    1986-01-01

    Certain types of software facilities are needed in a Space Station Information Systems Environment; the Common APSE (Ada Program Support Environment) Interface Set (CAIS) was proposed as a means of satisfying them. The reasonableness of this is discussed by examining the current CAIS, considering the changes due to the latest Requirements and Criteria (RAC) document, and postulating the effects on the CAIS 2.0. Finally, a few additional comments are made on the problems inherent in the Ada language itself, especially on its deficiencies when used for implementing large distributed processing and data base applications.

  1. Exploring the Complex Pattern of Information Spreading in Online Blog Communities

    PubMed Central

    Pei, Sen; Muchnik, Lev; Tang, Shaoting; Zheng, Zhiming; Makse, Hernán A.

    2015-01-01

    Information spreading in online social communities has attracted tremendous attention due to its practical value in applications. Although several individual-level diffusion datasets have been investigated, we still lack a detailed understanding of the spreading pattern of information. Here, by comparing information flows and social links in a blog community, we find that the diffusion processes are induced by three different spreading mechanisms: social spreading, self-promotion and broadcast. Although numerous previous studies have employed epidemic spreading models to simulate information diffusion, we observe that such models fail to reproduce the realistic diffusion pattern. With respect to user behavior, strikingly, we find that most users stick to one specific diffusion mechanism. Moreover, our observations indicate that social spreading is not only crucial for the structure of diffusion trees, but is also capable of inducing more subsequent individuals to acquire the information. Our findings suggest new directions for the modeling of information diffusion in social systems, and could inform the design of efficient propagation strategies based on user behavior. PMID:25985081

  3. Content-Aware DataGuide with Incremental Index Update using Frequently Used Paths

    NASA Astrophysics Data System (ADS)

    Sharma, A. K.; Duhan, Neelam; Khattar, Priyanka

    2010-11-01

    The size of the WWW is increasing day by day. Due to the absence of structured data on the Web, it is very difficult for information retrieval tools to fully utilize Web information. XML pages address this problem to some extent by providing structural information to users. Without efficient indexes, however, query processing can be quite inefficient due to exhaustive traversal of the XML data. This paper proposes an improved, content-centric variant of the Content-Aware DataGuide, an indexing technique for XML databases, that uses frequently used paths from historical query logs to improve query performance. The index can be updated incrementally according to changes in the query workload, so the overhead of reconstruction is minimized. Frequently used paths are extracted by applying a sequential pattern mining algorithm to successive queries in the query workload, after which the data structures are incrementally updated. This indexing technique proves to be efficient, as partial-match queries can be executed efficiently and users obtain more relevant documents in the results.
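
    The frequently-used-path idea can be pictured with a very small sketch: count how often each location path (and its prefixes) appears in a query log and index the ones above a support threshold. This is an assumed simplification of the sequential pattern mining step, not the proposed system's code.

        from collections import Counter

        def frequent_paths(query_log, min_support=2):
            """Sketch of the path-frequency step (assumed simplification: each query
            is already a '/'-separated location path; real systems would mine
            sequential patterns over full query workloads)."""
            counts = Counter()
            for path in query_log:
                steps = path.strip("/").split("/")
                # count every prefix of the path, e.g. /a, /a/b, /a/b/c
                for i in range(1, len(steps) + 1):
                    counts["/" + "/".join(steps[:i])] += 1
            return {p: c for p, c in counts.items() if c >= min_support}

        log = ["/lib/book/title", "/lib/book/author", "/lib/journal/title"]
        print(frequent_paths(log))
        # -> {'/lib': 3, '/lib/book': 2}: candidate paths to index eagerly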

  4. Computer vision for driver assistance systems

    NASA Astrophysics Data System (ADS)

    Handmann, Uwe; Kalinke, Thomas; Tzomakas, Christos; Werner, Martin; von Seelen, Werner

    1998-07-01

    Systems for automated image analysis are useful for a variety of tasks, and their importance is still increasing due to technological advances and growing social acceptance. Especially in the field of driver assistance systems, scientific progress has reached a level of high performance. Fully or partly autonomously guided vehicles, particularly for road-based traffic, pose high demands on the development of reliable algorithms due to the conditions imposed by natural environments. At the Institut für Neuroinformatik, methods for analyzing driving-relevant scenes by computer vision are developed in cooperation with several partners from the automobile industry. We introduce a system which extracts the important information from an image taken by a CCD camera installed at the rear-view mirror in a car. The approach consists of sequential and parallel sensor and information processing. Three main tasks, namely initial segmentation (object detection), object tracking and object classification, are realized by integration in the sequential branch and by fusion in the parallel branch. The main gain of this approach is the integrative coupling of different algorithms providing partly redundant information.

  5. The role of informatics in patient-centered care and personalized medicine.

    PubMed

    Hanna, Matthew G; Pantanowitz, Liron

    2017-06-01

    The practice of cytopathology has dramatically changed due to advances in genomics and information technology. Cytology laboratories have accordingly become increasingly dependent on pathology informatics support to meet the emerging demands of precision medicine. Pathology informatics deals with information technology in the laboratory, and the impact of this technology on workflow processes and staff who interact with these tools. This article covers the critical role that laboratory information systems, electronic medical records, and digital imaging play in patient-centered personalized medicine. The value of integrated diagnostic reports, clinical decision support, and the use of whole-slide imaging to better evaluate cytology samples destined for molecular testing is discussed. Image analysis that offers more precise and quantitative measurements in cytology is addressed, as well as the role of bioinformatics tools to cope with Big Data from next-generation sequencing. This article also highlights the barriers to the widespread adoption of these disruptive technologies due to regulatory obstacles, limited commercial solutions, poor interoperability, and lack of standardization. Cancer Cytopathol 2017;125(6 suppl):494-501. © 2017 American Cancer Society.

  6. A two-channel, spectrally degenerate polarization entangled source on chip

    NASA Astrophysics Data System (ADS)

    Sansoni, Linda; Luo, Kai Hong; Eigner, Christof; Ricken, Raimund; Quiring, Viktor; Herrmann, Harald; Silberhorn, Christine

    2017-12-01

    Integrated optics provides the platform for the experimental implementation of highly complex and compact circuits for quantum information applications. In this context, integrated waveguide sources represent a powerful resource for the generation of quantum states of light due to their high brightness and stability. However, the confinement of the light in a single spatial mode limits the realization of multi-channel sources. Due to this challenge, one of the most widely adopted sources in quantum information processing, i.e. a source which generates spectrally indistinguishable polarization entangled photons in two different spatial modes, has not yet been realized in a fully integrated platform. Here we overcome this limitation by suitably engineering two periodically poled waveguides and an integrated polarization splitter in lithium niobate. This source produces polarization entangled states with a fidelity of F = 0.973 ± 0.003, and a test of Bell's inequality results in a violation larger than 14 standard deviations. It can work in both pulsed and continuous-wave regimes. This device represents a new step toward the implementation of fully integrated circuits for quantum information applications.

  7. Addressing maternal deaths due to violence: the Illinois experience.

    PubMed

    Koch, Abigail R; Geller, Stacie E

    2017-11-01

    Homicide, suicide, and substance abuse accounted for nearly one fourth of all pregnancy-associated deaths in Illinois from 2002 through 2013. Maternal mortality review in Illinois has been primarily focused on obstetric and medical causes and little is known about the circumstances surrounding deaths due to homicide, suicide, and substance abuse, if they are pregnancy related, and if the deaths are potentially preventable. To address this issue, we implemented a process to form a second statewide maternal mortality review committee for deaths due to violence in late 2014. We convened a stakeholder group to accomplish 3 tasks: (1) identify appropriate committee members; (2) identify potential types and sources of information that would be required for a meaningful review of violent maternal deaths; and (3) revise the Maternal Mortality Review Form. Because homicide, suicide, and substance abuse are closely linked to the social determinants of health, the review committee needed to have a broad membership with expertise in areas not required for obstetric maternal mortality review, including social service and community organizations. Identifying additional sources of information is critical; the state Violent Death Reporting System, case management data, and police and autopsy reports provide contextual information that cannot be found in medical records. The stakeholder group revised the Maternal Mortality Review Form to collect information relevant to violent maternal deaths, including screening history and psychosocial history. The form guides the maternal mortality review committee for deaths due to violence to identify potentially preventable factors relating to the woman, her family, systems of care, the community, the legal system, and the institutional environment. The committee has identified potential opportunities to decrease preventable death requiring cooperation with social service agencies and the criminal justice system in addition to the physical and mental health care systems. Illinois has demonstrated that by engaging appropriate members and expanding the information used, it is possible to conduct meaningful reviews of these deaths and make recommendations to prevent future deaths. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Modeling Dynamic Food Choice Processes to Understand Dietary Intervention Effects.

    PubMed

    Marcum, Christopher Steven; Goldring, Megan R; McBride, Colleen M; Persky, Susan

    2018-02-17

    Meal construction is largely governed by nonconscious and habit-based processes that can be represented as a collection of individual, micro-level food choices that eventually give rise to a final plate. Despite this, dietary behavior intervention research rarely captures these micro-level food choice processes, instead measuring outcomes at aggregated levels. This is due in part to a dearth of analytic techniques to model these dynamic time-series events. The current article addresses this limitation by applying a generalization of the relational event framework to model micro-level food choice behavior following an educational intervention. Relational event modeling was used to model the food choices that 221 mothers made for their child following receipt of an information-based intervention. Participants were randomized to receive either (a) control information; (b) childhood obesity risk information; (c) childhood obesity risk information plus a personalized family history-based risk estimate for their child. Participants then made food choices for their child in a virtual reality-based food buffet simulation. Micro-level aspects of the built environment, such as the ordering of each food in the buffet, were influential. Other dynamic processes such as choice inertia also influenced food selection. Among participants receiving the strongest intervention condition, choice inertia decreased and the overall rate of food selection increased. Modeling food selection processes can elucidate the points at which interventions exert their influence. Researchers can leverage these findings to gain insight into nonconscious and uncontrollable aspects of food selection that influence dietary outcomes, which can ultimately improve the design of dietary interventions.

  9. Access of emotional information to visual awareness in patients with major depressive disorder.

    PubMed

    Sterzer, P; Hilgenfeldt, T; Freudenberg, P; Bermpohl, F; Adli, M

    2011-08-01

    According to cognitive theories of depression, negative biases affect most cognitive processes including perception. Such depressive perception may result not only from biased cognitive appraisal but also from automatic processing biases that influence the access of sensory information to awareness. Twenty patients with major depressive disorder (MDD) and 20 healthy control participants underwent behavioural testing with a variant of binocular rivalry, continuous flash suppression (CFS), to investigate the potency of emotional visual stimuli to gain access to awareness. While a neutral, fearful, happy or sad emotional face was presented to one eye, high-contrast dynamic patterns were presented to the other eye, resulting in initial suppression of the face from awareness. Participants indicated the location of the face with a key press as soon as it became visible. The modulation of suppression time by emotional expression was taken as an index of unconscious emotion processing. We found a significant difference in the emotional modulation of suppression time between MDD patients and controls. This difference was due to relatively shorter suppression of sad faces and, to a lesser degree, to longer suppression of happy faces in MDD. Suppression time modulation by sad expression correlated with change in self-reported severity of depression after 4 weeks. Our finding of preferential access to awareness for mood-congruent stimuli supports the notion that depressive perception may be related to altered sensory information processing even at automatic processing stages. Such perceptual biases towards mood-congruent information may reinforce depressed mood and contribute to negative cognitive biases. © Cambridge University Press 2011

  10. Relations between Short-term Memory Deficits, Semantic Processing, and Executive Function

    PubMed Central

    Allen, Corinne M.; Martin, Randi C.; Martin, Nadine

    2012-01-01

    Background Previous research has suggested separable short-term memory (STM) buffers for the maintenance of phonological and lexical-semantic information, as some patients with aphasia show better ability to retain semantic than phonological information and others show the reverse. Recently, researchers have proposed that deficits to the maintenance of semantic information in STM are related to executive control abilities. Aims The present study investigated the relationship of executive function abilities with semantic and phonological short-term memory (STM) and semantic processing in such patients, as some previous research has suggested that semantic STM deficits and semantic processing abilities are critically related to specific or general executive function deficits. Method and Procedures 20 patients with aphasia and STM deficits were tested on measures of short-term retention, semantic processing, and both complex and simple executive function tasks. Outcome and Results In correlational analyses, we found no relation between semantic STM and performance on simple or complex executive function tasks. In contrast, phonological STM was related to executive function performance in tasks that had a verbal component, suggesting that performance in some executive function tasks depends on maintaining or rehearsing phonological codes. Although semantic STM was not related to executive function ability, performance on semantic processing tasks was related to executive function, perhaps due to similar executive task requirements in both semantic processing and executive function tasks. Conclusions Implications for treatment and interpretations of executive deficits are discussed. PMID:22736889

  11. Estimation of the Past and Future Infrastructure Damage Due the Permafrost Evolution Processes

    NASA Astrophysics Data System (ADS)

    Sergeev, D. O.; Chesnokova, I. V.; Morozova, A. V.

    2015-12-01

    Geocryological processes such as thermokarst, frost heaving and fracturing, icing, and thermal erosion are a source of immediate danger for structures. Economic losses during construction in permafrost areas are also linked with other geological processes that take a specific character in cold regions: swamping, desertification, deflation, flooding, mudflows and landslides. Linear transport structures are the most vulnerable component of regional and national economies. Because of their great length, transport structures have to cross landscapes with different permafrost conditions that react differently to climate change. Climate warming favours thermokarst, whereas frost heaving is linked with climate cooling. As a result, structures end up in circumstances that were not anticipated in the construction project. Local engineering problems during structure operation lead to broader risks for the sustainable development of regions. The authors developed a database of geocryological damage cases on Russian territory over the last twelve years. The spatial data carry an attribute table populated with published information from various permafrost conference proceedings. A preliminary GIS analysis of the gathered data showed a widespread territorial distribution of cases with negative consequences of geocryological process activity. Information about the maximum effect of geocryological processes was validated by detailed field investigations along the railways in the Yamal and Transbaicalia Regions. The authors expect to expand the database with similar data from other sectors of the Arctic, which is important for analyzing regional, temporal and industrial trends in the evolution of geocryological risk. The information obtained could be used in insurance procedures and in decision-support information systems at different management levels. The investigation was completed with financial support from the Russian Foundation for Basic Research (Project #13-05-00462).

  12. Mild extraction methods using aqueous glucose solution for the analysis of natural dyes in textile artefacts dyed with Dyer's madder (Rubia tinctorum L.).

    PubMed

    Ford, Lauren; Henderson, Robert L; Rayner, Christopher M; Blackburn, Richard S

    2017-03-03

    Madder (Rubia tinctorum L.) has been widely used as a red dye throughout history. Acid-sensitive colorants present in madder, such as glycosides (lucidin primeveroside, ruberythric acid, galiosin) and sensitive aglycons (lucidin), are degraded in the textile back extraction process; in previous literature these sensitive molecules are either absent or present in only low concentrations due to the use of acid in typical textile back extraction processes. Anthraquinone aglycons alizarin and purpurin are usually identified in analysis following harsh back extraction methods, such as those using solvent mixtures with concentrated hydrochloric acid at high temperatures. Use of softer extraction techniques potentially allows dye components present in madder to be extracted without degradation, providing more information about the original dye profile, which varies significantly between madder varieties, species and dyeing techniques. Herein, a softer extraction method involving aqueous glucose solution was developed and compared to other back extraction techniques on wool dyed with root extract from different varieties of Rubia tinctorum. Efficiencies of the extraction methods were analysed by HPLC coupled with diode array detection. Acidic literature methods were evaluated and they generally caused hydrolysis and degradation of the dye components, with alizarin, lucidin, and purpurin being the main compounds extracted. In contrast, extraction in aqueous glucose solution provides a highly effective method for extraction of madder dyed wool and is shown to efficiently extract lucidin primeveroside and ruberythric acid without causing hydrolysis, and also to extract aglycons that are present due to hydrolysis during processing of the plant material. Glucose solution is a favourable extraction medium due to its ability to form extensive hydrogen bonding with glycosides present in madder, and to displace them from the fibre. This new glucose method offers an efficient process that preserves these sensitive molecules and is a step-change in the analysis of madder dyed textiles, as it can provide further information about historical dye preparation and dyeing processes that current methods cannot. The method also efficiently extracts glycosides in artificially aged samples, making it applicable for museum textile artefacts. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Domain Specific Language Support for Exascale. Final Project Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baden, Scott

    The project developed a domain-specific translator to enable legacy MPI source code to tolerate communication delays, which are increasing over time due to technological factors. The translator performs source-to-source translation that incorporates semantic information into the translation process. The output of the translator is a C program that runs as a data-driven program and uses an existing runtime to overlap communication with computation automatically.
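
    The project's translator emits C, so the following Python/mpi4py fragment is only a generic illustration of the overlap pattern it targets: post non-blocking communication, perform independent computation, then wait before using the received data. It is not the translator's output.

        import numpy as np
        from mpi4py import MPI

        # Generic communication/computation overlap sketch (illustrative only):
        # post non-blocking sends/receives, do independent work, then wait.
        # run with: mpirun -n 2 python overlap_sketch.py
        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()
        right, left = (rank + 1) % size, (rank - 1) % size

        send_buf = np.full(1000, rank, dtype="d")
        recv_buf = np.empty(1000, dtype="d")

        requests = [comm.Isend(send_buf, dest=right, tag=0),
                    comm.Irecv(recv_buf, source=left, tag=0)]

        local = np.sin(send_buf).sum()        # independent work hides the latency
        MPI.Request.Waitall(requests)         # communication must finish before use
        halo_sum = local + recv_buf.sum()
        print(f"rank {rank}: {halo_sum:.2f}")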

  14. Maximizing Modern Distribution of Complex Anatomical Spatial Information: 3D Reconstruction and Rapid Prototype Production of Anatomical Corrosion Casts of Human Specimens

    ERIC Educational Resources Information Center

    Li, Jianyi; Nie, Lanying; Li, Zeyu; Lin, Lijun; Tang, Lei; Ouyang, Jun

    2012-01-01

    Anatomical corrosion casts of human specimens are useful teaching aids. However, their use is limited due to ethical dilemmas associated with their production, their lack of perfect reproducibility, and their consumption of original specimens in the process of casting. In this study, new approaches with modern distribution of complex anatomical…

  15. Coupled hydrological and biogeochemical processes controlling variability of nitrogen species in streamflow during autumn in an upland forest

    Treesearch

    Stephen D. Sebestyen; James B. Shanley; Elizabeth W. Boyer; Carol Kendall; Daniel H. Doctor

    2014-01-01

    Autumn is a season of dynamic change in forest streams of the northeastern United States due to effects of leaf fall on both hydrology and biogeochemistry. Few studies have explored how interactions of biogeochemical transformations, various nitrogen sources, and catchment flow paths affect stream nitrogen variation during autumn. To provide more information on this...

  16. Developing Academic English Language Proficiency Prototypes for 5th Grade Reading: Psychometric and Linguistic Profiles of Tasks. An Extended Executive Summary. CSE Report 720

    ERIC Educational Resources Information Center

    Bailey, Alison L.; Huang, Becky H.; Shin, Hye Won; Farnsworth, Tim; Butler, Frances A.

    2007-01-01

    Within an evidentiary framework for operationally defining academic English language proficiency (AELP), linguistic analyses of standards, classroom discourse, and textbooks have led to specifications for assessment of AELP. The test development process described here is novel due to the emphasis on using linguistic profiles to inform the …

  17. Texting Styles and Information Change of SMS Text Messages in Filipino

    NASA Astrophysics Data System (ADS)

    Cabatbat, Josephine Jill T.; Tapang, Giovanni A.

    2013-02-01

    We identify the different styles of texting in Filipino short message service (SMS) texts and analyze the change in unigram and bigram frequencies due to these styles. Style preference vectors for sample texts were calculated and used to identify the style combination used by an average individual. The change in Shannon entropy of the SMS text is explained in light of a coding process.
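
    The unigram/bigram statistics mentioned above can be summarized by the Shannon entropy of the n-gram distribution. A generic sketch (not the authors' code), with hypothetical example strings standing in for standard and shortcut texting styles:

        import math
        from collections import Counter

        def shannon_entropy(text, n=1):
            """Entropy (bits/symbol) of the n-gram distribution of a text; a generic
            sketch of the kind of unigram/bigram statistic compared across texting
            styles (not the authors' code)."""
            grams = [text[i:i + n] for i in range(len(text) - n + 1)]
            counts = Counter(grams)
            total = sum(counts.values())
            return -sum((c / total) * math.log2(c / total) for c in counts.values())

        standard = "nasaan ka na"          # example standard spelling
        shortcut = "nsan k n"              # example vowel-dropping texting style
        print(round(shannon_entropy(standard), 2), round(shannon_entropy(shortcut), 2))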

  18. A smart multisensor approach to assist blind people in specific urban navigation tasks.

    PubMed

    Ando, B

    2008-12-01

    Visually impaired people are often discouraged from using electronic aids due to complexity of operation, the large amount of training required, the non-optimized degree of information provided to the user, and high cost. In this paper, a new multisensor architecture is discussed, which would help blind people to perform urban mobility tasks. The device is based on a multisensor strategy and adopts smart signal processing.

  19. Conflict minerals from the Democratic Republic of the Congo: global tantalum processing plants, a critical part of the tantalum supply chain

    USGS Publications Warehouse

    Papp, John F.

    2014-01-01

    Post-beneficiation processing plants (generally called smelters and refineries) for 3TG mineral ores and concentrates were identified by company and industry association representatives as being the link in the 3TG mineral supply chain through which these minerals can be traced to their source of origin (mine). The determination of the source of origin is critical to the development of a complete and transparent conflict-free mineral supply chain. Tungsten processing plants were the subject of the first fact sheet in this series published by USGS NMIC in August 2014. Background information about historical conditions and multinational stakeholders’ voluntary due diligence guidance for minerals from conflict-affected and high-risk areas is presented in the tungsten fact sheet. This fact sheet, the second in a series about 3TG minerals, focuses on the tantalum supply chain by listing selected processors that produced tantalum materials commercially worldwide during 2013–14. It does not provide any information regarding the sources of material processed in these facilities.

  20. Content standards for medical image metadata

    NASA Astrophysics Data System (ADS)

    d'Ornellas, Marcos C.; da Rocha, Rafael P.

    2003-12-01

    Medical images are at the heart of healthcare diagnostic procedures. They provide not only a noninvasive means to view anatomical cross-sections of internal organs but also a means for physicians to evaluate the patient's diagnosis and monitor the effects of treatment. For a medical center, the emphasis may shift from the generation of images to post-processing and data management, since the medical staff may generate even more processed images and other data from the original image after various analyses and post-processing. A medical image data repository for health care information systems is becoming a critical need. This data repository would contain comprehensive patient records, including information such as clinical data, related diagnostic images, and post-processed images. Due to the large volume and complexity of the data as well as the diversified user access requirements, the implementation of a medical image archive system will be a complex and challenging task. This paper discusses content standards for medical image metadata. In addition, it focuses on the evaluation of image metadata content and on metadata quality management.

  1. Time-lapse electrical surveys to locate infiltration zones in weathered hard rock tropical areas

    NASA Astrophysics Data System (ADS)

    Wubda, M.; Descloitres, M.; Yalo, N.; Ribolzi, O.; Vouillamoz, J. M.; Boukari, M.; Hector, B.; Séguis, L.

    2017-07-01

    In West Africa, infiltration and groundwater recharge processes in hard rock areas depend on climatic, surface and subsurface conditions, and are poorly documented. Part of the reason is that identifying, locating and monitoring these processes is still a challenge. Here, we explore the potential of time-lapse electrical surveys to bring additional information on these processes for two different climate situations: a semi-arid Sahelian site (northern Burkina Faso) and a humid Sudanian site (northern Benin), focusing respectively on indirect (localized) and direct (diffuse) recharge processes. The methodology is based on dry-season and rainy-season surveys of a typical pond or gully using Electrical Resistivity Tomography (ERT) and frequency electromagnetic (FEM) apparent conductivity mapping. The results show that in the Sahelian zone indirect recharge occurs as expected, but infiltration to the aquifer does not take place at the center of the pond; it occurs laterally through the banks. In the Sudanian zone, the ERT survey shows a direct recharge process as expected, but also a complicated pattern of groundwater dilution, as well as the role of hardpans in fast infiltration. These processes are confirmed by groundwater monitoring in adjacent observation wells. FEM time-lapse mapping, in contrast, is difficult to interpret quantitatively due to the non-uniqueness of the model, as clearly evidenced by comparing the FEM results with auger-hole monitoring. Finally, we found that time-lapse ERT can be an efficient way to track infiltration processes across ponds and gullies in both climatic settings, with the Sahelian setting providing results that are easier to interpret due to the significant resistivity contrasts between dry and rainy seasons. Both methods can be used for the efficient placement of point sensors for complementary studies. However, FEM time-lapse mapping remains difficult to use without external information, which makes this method less attractive for quantitative interpretation purposes.

  2. Modelling surface water-groundwater interaction with a conceptual approach: model development and application in New Zealand

    NASA Astrophysics Data System (ADS)

    Yang, J.; Zammit, C.; McMillan, H. K.

    2016-12-01

    As in most countries worldwide, water management in lowland areas is a major concern for New Zealand due to its economic importance for water-related human activities. As a result, estimating available water resources in these areas (e.g., for irrigation and water supply purposes) is crucial and often requires an understanding of complex hydrological processes, which are frequently characterized by strong interactions between surface water and groundwater (usually expressed as losing and gaining rivers). These processes are often represented and simulated using integrated, physically based hydrological models. However, models with physically based groundwater modules typically require large amounts of geologic and aquifer information that is not readily available, and they are computationally intensive. Instead, this paper presents a conceptual groundwater model that is fully integrated into New Zealand's national hydrological model TopNet, which is based on TopModel concepts (Beven, 1992). Within this conceptual framework, the integrated model can simulate not only surface processes but also groundwater processes and surface water-groundwater interaction processes (including groundwater flow, river-groundwater interaction, and groundwater exchange with external watersheds). The developed model was applied to two New Zealand catchments with different hydro-geological and climate characteristics (the Pareora catchment in the Canterbury Plains and the Grey catchment on the West Coast). Previous studies have documented strong interactions between the river and groundwater, based on the analysis of a large number of concurrent flow measurements and associated information along the river main stem. Application of the integrated hydrological model indicates that flow simulations during low-flow conditions are significantly improved compared to the original model conceptualisation, and that further insights into local river dynamics are gained. Due to its conceptual nature and low data requirements, the integrated model could be used at local and national scales to improve the simulation of hydrological processes in non-topographically driven areas (where groundwater processes are important) and to assess the impact of climate change on the integrated hydrological cycle in these areas.

  3. Honeybee society destruction by losing control of self-reproduction

    NASA Astrophysics Data System (ADS)

    Zhang, Peipei; Su, Beibei; He, Da-Ren

    2004-03-01

    Recently, the mechanism of the damage caused by the invasion of Apis mellifera capensis honeybees into normal A. m. scutellata colonies has attracted scientific interest, because this mechanism may resemble those of the malignant proliferation of cancer, the spread of some epidemics, and social turbulence induced by destructive groups. We suggest a new hypothesis for this mechanism: the loss of control over self-reproduction disturbs the information structure of the society and throws it into confusion. We also simulate the damage process with a cellular automaton based on this idea. The simulation shows that the process is equivalent to a non-equilibrium percolation phase transition. This discussion reminds us that managing and monitoring the information network between society members may be a more effective way to prevent the spread of the destructive sub-colonies.
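
    Purely as an illustration of the kind of lattice simulation described above (and not the authors' actual model), the sketch below spreads hypothetical "destructor" cells over a 2D grid with an infection probability p; sweeping p and recording the final occupied fraction is one way to expose the percolation-like threshold the abstract mentions. All names and parameters are illustrative.

```python
import numpy as np

def spread_step(grid, p, rng):
    """One update of a toy 2D cellular automaton: each 'destructor' cell (1)
    converts each of its four neighbours to a destructor with probability p."""
    new = grid.copy()
    for i, j in np.argwhere(grid == 1):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = (i + di) % grid.shape[0], (j + dj) % grid.shape[1]
            if grid[ni, nj] == 0 and rng.random() < p:
                new[ni, nj] = 1
    return new

def fraction_destroyed(n=100, p=0.3, steps=200, seed=0):
    rng = np.random.default_rng(seed)
    grid = np.zeros((n, n), dtype=int)
    grid[n // 2, n // 2] = 1          # a single invading cell
    for _ in range(steps):
        grid = spread_step(grid, p, rng)
    return grid.mean()                 # fraction of the colony taken over

if __name__ == "__main__":
    # Sweeping p and plotting fraction_destroyed(p) would show the
    # threshold-like behaviour alluded to in the abstract.
    print(fraction_destroyed(p=0.05), fraction_destroyed(p=0.5))
```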

  4. Scalable quantum information processing with atomic ensembles and flying photons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mei Feng; Yu Yafei; Feng Mang

    2009-10-15

    We present a scheme for scalable quantum information processing with atomic ensembles and flying photons. Using the Rydberg blockade, we encode the qubits in collective atomic states, which can be manipulated quickly and easily due to the enhanced interaction compared to the single-atom case. We demonstrate that our proposed gating can be applied to the generation of two-dimensional cluster states for measurement-based quantum computation. Moreover, the atomic ensembles also function as quantum repeaters, useful for long-distance quantum state transfer. We show that our scheme can work in the bad-cavity or weak-coupling regime, which could greatly relax the experimental requirements. The efficient coherent operations on the ensemble qubits enable our scheme to be switchable between quantum computation and quantum communication using atomic ensembles.

  5. Impaired threat prioritisation after selective bilateral amygdala lesions

    PubMed Central

    Bach, Dominik R.; Hurlemann, Rene; Dolan, Raymond J.

    2015-01-01

    The amygdala is proposed to process threat-related information in non-human animals. In humans, empirical evidence from lesion studies has provided the strongest support for a role in emotional face recognition and social judgement. Here we use a face-in-the-crowd (FITC) task which, in healthy control individuals, reveals prioritised threat processing, evident in faster serial search for angry compared to happy target faces. We investigate AM and BG, two individuals with bilateral amygdala lesions due to Urbach–Wiethe syndrome, and 16 control individuals. In the lesion patients we show a reversal of the threat detection advantage, indicating a profound impairment in prioritising threat information. This is the first direct demonstration that human amygdala lesions impair prioritisation of threatening faces, providing evidence that this structure has a causal role in responding to imminent danger. PMID:25282058

  6. An information theory model for dissipation in open quantum systems

    NASA Astrophysics Data System (ADS)

    Rogers, David M.

    2017-08-01

    This work presents a general model for open quantum systems using an information game along the lines of Jaynes’ original work. It is shown how an energy based reweighting of propagators provides a novel moment generating function at each time point in the process. Derivatives of the generating function give moments of the time derivatives of observables. Aside from the mathematically helpful properties, the ansatz reproduces key physics of stochastic quantum processes. At high temperature, the average density matrix follows the Caldeira-Leggett equation. Its associated Langevin equation clearly demonstrates the emergence of dissipation and decoherence time scales, as well as an additional diffusion due to quantum confinement. A consistent interpretation of these results is that decoherence and wavefunction collapse during measurement are directly related to the degree of environmental noise, and thus occur because of subjective uncertainty of an observer.

  7. Event-triggered distributed filtering over sensor networks with deception attacks and partial measurements

    NASA Astrophysics Data System (ADS)

    Bu, Xianye; Dong, Hongli; Han, Fei; Li, Gongfa

    2018-07-01

    This paper is concerned with the distributed filtering problem for a class of time-varying systems subject to deception attacks and event-triggering protocols. Due to bandwidth limitations, an event-triggered communication strategy is adopted to alleviate the data transmission pressure during algorithm implementation. A partial-nodes-based filtering problem is considered, in which only a subset of nodes can measure the information of the plant. Meanwhile, the measurement information may suffer deception attacks during transmission. Sufficient conditions are established such that the error dynamics satisfies the prescribed average H∞ performance constraints. The parameters of the designed filters can be calculated by solving a series of recursive linear matrix inequalities. A simulation example is presented to demonstrate the effectiveness of the proposed filtering method.
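
    The abstract does not give the triggering condition, but a common send-on-delta template conveys the idea of event-triggered communication: a node transmits only when its current measurement deviates sufficiently from the last transmitted one. The sketch below is such a generic rule, with the threshold delta as a hypothetical tuning parameter, not the paper's actual protocol.

```python
import numpy as np

class EventTrigger:
    """Toy send-on-delta event trigger: a sensor node transmits its
    measurement only when it deviates enough from the last transmitted one.
    The quadratic test ||y - y_last||^2 > delta is a common template; the
    paper's actual triggering condition may differ."""

    def __init__(self, delta):
        self.delta = delta
        self.y_last = None

    def step(self, y):
        y = np.asarray(y, dtype=float)
        if self.y_last is None or np.sum((y - self.y_last) ** 2) > self.delta:
            self.y_last = y           # transmit and update the stored value
            return True, y
        return False, self.y_last     # no transmission; filter reuses old value

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    trig = EventTrigger(delta=0.5)
    sent = sum(trig.step(rng.normal(size=2))[0] for _ in range(100))
    print(f"transmissions: {sent}/100")  # fewer sends => lower bandwidth use
```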

  8. Government Open Systems Interconnection Profile (GOSIP) transition strategy

    NASA Astrophysics Data System (ADS)

    Laxen, Mark R.

    1993-09-01

    This thesis analyzes the Government Open Systems Interconnection Profile (GOSIP) and the requirements of Federal Information Processing Standard (FIPS) Publication 146-1. It begins by examining the International Organization for Standardization (ISO) Open Systems Interconnection (OSI) architecture and protocol suites and the distinctions between GOSIP versions one and two. Additionally, it explores some of the GOSIP protocol details and discusses the process by which standards organizations have developed their recommendations. Implementation considerations from both government and vendor perspectives illustrate the barriers and requirements faced by information systems managers, as well as basic transition strategies. The thesis concludes by recommending a transition strategy built on an extended and coordinated period of coexistence, necessitated by extensive legacy systems and the unavailability of GOSIP products. Recommendations for GOSIP protocol standards to include capabilities outside the OSI model are also presented.

  9. Evaporation of (quantum) black holes and energy conservation

    NASA Astrophysics Data System (ADS)

    Torres, R.; Fayos, F.; Lorente-Espín, O.

    2013-03-01

    We consider Hawking radiation as due to a tunneling process in a black hole where quantum corrections, derived from Quantum Einstein Gravity, are taken into account. The consequent derivation, satisfying conservation laws, leads to a deviation from an exact thermal spectrum. This has consequences for the information loss paradox, since the non-thermal radiation is shown to carry information out of the black hole. Under the appropriate approximation, a quantum-corrected temperature is assigned to the black hole. The evolution of the quantum black hole as it evaporates is then described by taking into account the full implications of energy conservation as well as the backscattered radiation. It is shown that, as a critical mass of the order of the Planck mass is reached, the evaporation process decelerates abruptly while the black hole mass decays towards this critical mass.

  10. Optimization of Visual Information Presentation for Visual Prosthesis.

    PubMed

    Guo, Fei; Yang, Yuan; Gao, Yong

    2018-01-01

    Visual prostheses that apply electrical stimulation to restore visual function for the blind have promising prospects. However, due to the low resolution, limited visual field, and low dynamic range of the elicited visual perception, a large amount of information is lost when presenting daily scenes. The ability to recognize objects in real-life scenarios is therefore severely restricted for prosthetic users. To overcome these limitations, optimizing the visual information in simulated prosthetic vision has been a focus of research. This paper proposes two image processing strategies based on a salient object detection technique. The two processing strategies enable the prosthetic implants to focus on the object of interest and suppress the background clutter. Psychophysical experiments show that techniques such as foreground zooming with background clutter removal and foreground edge detection with background reduction have positive impacts on the task of object recognition in simulated prosthetic vision. By using edge detection and zooming techniques, the two processing strategies significantly improve object recognition accuracy. We conclude that a visual prosthesis using our proposed strategy can help the blind improve their ability to recognize objects. The results will provide effective solutions for the further development of visual prostheses.
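
    As a rough sketch of the two strategies (foreground zooming with background removal, and foreground edge detection), the code below assumes a precomputed saliency mask and uses generic NumPy/SciPy operations; it is not the authors' pipeline, and the low-resolution output grid size is an arbitrary stand-in for a prosthesis' electrode array.

```python
import numpy as np
from scipy import ndimage

def foreground_zoom(image, mask, out_shape=(32, 32)):
    """Crop the bounding box of a (hypothetical) saliency mask and rescale it
    to the low-resolution grid of a simulated prosthesis, discarding the
    background clutter."""
    ys, xs = np.nonzero(mask)
    crop = image[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    factors = (out_shape[0] / crop.shape[0], out_shape[1] / crop.shape[1])
    return ndimage.zoom(crop, factors, order=1)

def foreground_edges(image, mask):
    """Keep only the edges of the salient object; the background is zeroed out."""
    gx = ndimage.sobel(image, axis=0)
    gy = ndimage.sobel(image, axis=1)
    edges = np.hypot(gx, gy)
    return edges * (mask > 0)

if __name__ == "__main__":
    img = np.zeros((64, 64)); img[20:40, 25:45] = 1.0   # toy scene
    msk = img > 0                                        # stand-in saliency mask
    print(foreground_zoom(img, msk).shape, foreground_edges(img, msk).max() > 0)
```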

  11. Optimization of Visual Information Presentation for Visual Prosthesis

    PubMed Central

    Gao, Yong

    2018-01-01

    Visual prostheses that apply electrical stimulation to restore visual function for the blind have promising prospects. However, due to the low resolution, limited visual field, and low dynamic range of the elicited visual perception, a large amount of information is lost when presenting daily scenes. The ability to recognize objects in real-life scenarios is therefore severely restricted for prosthetic users. To overcome these limitations, optimizing the visual information in simulated prosthetic vision has been a focus of research. This paper proposes two image processing strategies based on a salient object detection technique. The two processing strategies enable the prosthetic implants to focus on the object of interest and suppress the background clutter. Psychophysical experiments show that techniques such as foreground zooming with background clutter removal and foreground edge detection with background reduction have positive impacts on the task of object recognition in simulated prosthetic vision. By using edge detection and zooming techniques, the two processing strategies significantly improve object recognition accuracy. We conclude that a visual prosthesis using our proposed strategy can help the blind improve their ability to recognize objects. The results will provide effective solutions for the further development of visual prostheses. PMID:29731769

  12. Non-conscious processes in changing health-related behaviour: a conceptual analysis and framework

    PubMed Central

    Hollands, Gareth J.; Marteau, Theresa M.; Fletcher, Paul C.

    2016-01-01

    Much of the global burden of non-communicable disease is caused by unhealthy behaviours that individuals enact even when informed of their health-harming consequences. A key insight is that these behaviours are not predominantly driven by deliberative conscious decisions, but occur directly in response to environmental cues and without necessary representation of their consequences. Consequently, interventions that target non-conscious rather than conscious processes to change health behaviour may have significant potential, but this important premise remains largely untested. This is in part due to the lack of a practicable conceptual framework that can be applied to better describe and assess these interventions. We propose a framework for describing or categorising interventions to change health behaviour by the degree to which their effects may be considered non-conscious. Potential practical issues with applying such a framework are discussed, as are the implications for further research to inform the testing and development of interventions. A pragmatic means of conceptualising interventions targeted at non-conscious processes is a necessary prelude to testing the potency of such interventions. This can ultimately inform the development of interventions with the potential to shape healthier behaviours across populations. PMID:26745243

  13. Information flow analysis and Petri-net-based modeling for welding flexible manufacturing cell

    NASA Astrophysics Data System (ADS)

    Qiu, T.; Chen, Shanben; Wang, Y. T.; Wu, Lin

    2000-10-01

    Due to the development of advanced manufacturing technology and the introduction of the smart-manufacturing notion in modern industrial production, welding flexible manufacturing systems (WFMS) using robot technology have become the inevitable development direction of welding automation. In the WFMS process, flexibility for different welding products and the realization of corresponding welding parameter control are the guarantees of welding quality. Based on a new intelligent arc-welding flexible manufacturing cell (WFMC), the system structure and control policies are studied in this paper. To handle the different information flows between each subsystem and the central monitoring computer in this WFMC, Petri net theory is introduced into the welding manufacturing process. With its help, a discrete control model of the WFMC has been constructed, in which system states are regarded as places and control processes as transitions. Moreover, based on automation Petri net principles, the judgment and use of information obtained from welding sensors are incorporated into the net structure, which extends traditional Petri net concepts. The control model and policies investigated in this paper establish a foundation for further intelligent real-time control of the WFMC and WFMS.
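
    To make the place/transition idea concrete, here is a minimal Petri net sketch in which a marking assigns tokens to places and a transition fires when all of its input places are marked. The welding-cell place and transition names are hypothetical, chosen only to echo the kind of states and control steps the abstract describes.

```python
class PetriNet:
    """Minimal place/transition net: the marking maps place names to token
    counts; a transition is enabled when every input place holds a token."""

    def __init__(self, marking, transitions):
        self.marking = dict(marking)
        self.transitions = transitions     # name -> (input places, output places)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

if __name__ == "__main__":
    # Hypothetical fragment of a welding cell: a part waits, the robot welds,
    # and sensor data is released for monitoring. Names are illustrative only.
    net = PetriNet(
        marking={"part_waiting": 1, "robot_idle": 1},
        transitions={
            "start_weld": (["part_waiting", "robot_idle"], ["welding"]),
            "finish_weld": (["welding"], ["part_done", "robot_idle", "sensor_data"]),
        },
    )
    net.fire("start_weld")
    net.fire("finish_weld")
    print(net.marking)
```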

  14. Non-conscious processes in changing health-related behaviour: a conceptual analysis and framework.

    PubMed

    Hollands, Gareth J; Marteau, Theresa M; Fletcher, Paul C

    2016-12-01

    Much of the global burden of non-communicable disease is caused by unhealthy behaviours that individuals enact even when informed of their health-harming consequences. A key insight is that these behaviours are not predominantly driven by deliberative conscious decisions, but occur directly in response to environmental cues and without necessary representation of their consequences. Consequently, interventions that target non-conscious rather than conscious processes to change health behaviour may have significant potential, but this important premise remains largely untested. This is in part due to the lack of a practicable conceptual framework that can be applied to better describe and assess these interventions. We propose a framework for describing or categorising interventions to change health behaviour by the degree to which their effects may be considered non-conscious. Potential practical issues with applying such a framework are discussed, as are the implications for further research to inform the testing and development of interventions. A pragmatic means of conceptualising interventions targeted at non-conscious processes is a necessary prelude to testing the potency of such interventions. This can ultimately inform the development of interventions with the potential to shape healthier behaviours across populations.

  15. A cognitive information processing framework for distributed sensor networks

    NASA Astrophysics Data System (ADS)

    Wang, Feiyi; Qi, Hairong

    2004-09-01

    In this paper, we present a cognitive agent framework (CAF) based on swarm intelligence and self-organization principles, and demonstrate it through collaborative processing for target classification in sensor networks. The framework involves integrated designs to provide both cognitive behavior at the organization level, to conquer complexity, and reactive behavior at the individual agent level, to retain simplicity. The design tackles various problems in current information processing systems, including overly complex systems, maintenance difficulties, increasing vulnerability to attack, lack of capability to tolerate faults, and inability to identify and cope with low-frequency patterns. An important point distinguishing the presented work from classical AI research is that the acquired intelligence does not pertain to distinct individuals but to groups. It also deviates from multi-agent systems (MAS) due to the sheer quantity of extremely simple agents we are able to accommodate, to the degree that the loss of some coordination messages and the behavior of faulty or compromised agents will not affect the collective decision made by the group.

  16. Image processing system for the measurement of timber truck loads

    NASA Astrophysics Data System (ADS)

    Carvalho, Fernando D.; Correia, Bento A. B.; Davies, Roger; Rodrigues, Fernando C.; Freitas, Jose C. A.

    1993-01-01

    The paper industry uses wood as its raw material. To know the quantity of wood in the pile of sawn tree trunks, every truck load entering the plant is measured to determine its volume. The objective of this procedure is to know the solid volume of wood stocked in the plant. Weighing the tree trunks has its own problems, due to their high capacity for absorbing water. Image processing techniques were used to evaluate the volume of a truck load of logs of wood. The system is based on a PC equipped with an image processing board using data flow processors. Three cameras allow image acquisition of the sides and rear of the truck. The lateral images contain information about the sectional area of the logs, and the rear image contains information about the length of the logs. The machine vision system and the implemented algorithms are described. The results being obtained with the industrial prototype that is now installed in a paper mill are also presented.
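
    A back-of-the-envelope version of the volume estimate follows: given segmentation masks from the lateral cameras (cross-sectional area) and a log length derived from the rear view, the load volume is approximated as area times length. Calibration constants, masks, and the helper name are assumptions for illustration, not details of the installed system.

```python
import numpy as np

def load_volume(side_masks, log_length_m, m2_per_pixel):
    """Rough solid-volume estimate in the spirit of the system above:
    lateral-view segmentation masks give the cross-sectional area of the load,
    and the log length (here passed in directly) is assumed to come from the
    rear view. The calibration constant and masks come from earlier steps."""
    area_m2 = np.mean([mask.sum() * m2_per_pixel for mask in side_masks])
    return area_m2 * log_length_m

if __name__ == "__main__":
    toy_mask = np.zeros((200, 300), dtype=bool)
    toy_mask[50:150, 40:260] = True                 # segmented log cross-sections
    print(load_volume([toy_mask, toy_mask], log_length_m=5.0, m2_per_pixel=2e-4), "m^3")
```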

  17. Differential growth of wrinkled biofilms

    NASA Astrophysics Data System (ADS)

    Espeso, D. R.; Carpio, A.; Einarsson, B.

    2015-02-01

    Biofilms are antibiotic-resistant bacterial aggregates that grow on moist surfaces and can trigger hospital-acquired infections. They provide a classical example in biology where the dynamics of cellular communities may be observed and studied. Gene expression regulates cell division and differentiation, which affect the biofilm architecture. Mechanical and chemical processes shape the resulting structure. We gain insight into the interplay between cellular and mechanical processes during biofilm development on air-agar interfaces by means of a hybrid model. Cellular behavior is governed by stochastic rules informed by a cascade of concentration fields for nutrients, waste, and autoinducers. Cellular differentiation and death alter the structure and the mechanical properties of the biofilm, which is deformed according to Föppl-Von Kármán equations informed by cellular processes and the interaction with the substratum. Stiffness gradients due to growth and swelling produce wrinkle branching. We are able to reproduce wrinkled structures often formed by biofilms on air-agar interfaces, as well as spatial distributions of differentiated cells commonly observed with B. subtilis.

  18. Coupling of Laser with Plasma Arc to Facilitate Hybrid Welding of Metallic Materials: A Review

    NASA Astrophysics Data System (ADS)

    Zhiyong, Li; Srivatsan, T. S.; Yan, LI; Wenzhao, Zhang

    2013-02-01

    Hybrid laser arc welding combines the advantages of laser welding and arc welding. Ever since its origination in the late 1970s, this technique has gained gradual attention and progressive use due to a combination of high welding speed, better weld bead formation, gap tolerance, and increased penetration coupled with less distortion. In hybrid laser arc welding, one of the reasons for the observed improvement is an interaction, or coupling effect, between the plasma arc, laser beam, droplet transfer, and the weld pool. A few researchers have attempted to study different aspects of the process to facilitate a better understanding. It is difficult to get a thorough understanding of the process if only certain information in a certain field is provided. In this article, an analysis of the coupling effect of the process is carried out based on a careful review of the research work that has been done, which provides useful information from a different perspective.

  19. Sex differences in the processing of flankers.

    PubMed

    Stoet, Gijsbert

    2010-04-01

    The study of sex differences in cognition has often focused on differences in spatial processing. Recently, sex differences in selective attention have been observed by Bayliss, di Pellegrino, and Tipper (2005), showing that women are more influenced than men by irrelevant spatial cues. The current study elaborates on this finding and tests whether sex differences in the processing of irrelevant information also occur in a simpler task, in which there is no need to redirect visual attention and no need to remember multiple spatial stimulus-response associations. Here, attention is studied using a novel combination of a go/no-go task and a flanker task. A total of 80 neurotypical participants were studied, and it was found that responses in women were more strongly affected by flanker information than were responses in men. This suggests that these sex differences were not due to difficulties with spatial reorientation, or remembering spatial stimulus-response relationships. The findings are discussed in the context of the hunter-gatherer theory of sex differences.

  20. Immobilization of pH-sensitive CdTe Quantum Dots in a Poly(acrylate) Hydrogel for Microfluidic Applications

    NASA Astrophysics Data System (ADS)

    Franke, M.; Leubner, S.; Dubavik, A.; George, A.; Savchenko, T.; Pini, C.; Frank, P.; Melnikau, D.; Rakovich, Y.; Gaponik, N.; Eychmüller, A.; Richter, A.

    2017-04-01

    Microfluidic devices present the basis of modern life sciences and chemical information processing. To control the flow and to allow optical readout, a reliable sensor material that can be easily utilized for microfluidic systems is in demand. Here, we present a new optical readout system for pH sensing based on pH sensitive, photoluminescent glutathione capped cadmium telluride quantum dots that are covalently immobilized in a poly(acrylate) hydrogel. For an applicable pH sensing the generated hybrid material is integrated in a microfluidic sensor chip setup. The hybrid material not only allows in situ readout, but also possesses valve properties due to the swelling behavior of the poly(acrylate) hydrogel. In this work, the swelling property of the hybrid material is utilized in a microfluidic valve seat, where a valve opening process is demonstrated by a fluid flow change and in situ monitored by photoluminescence quenching. This discrete photoluminescence detection (ON/OFF) of the fluid flow change (OFF/ON) enables upcoming chemical information processing.

  1. Cortical network architecture for context processing in primate brain

    PubMed Central

    Chao, Zenas C; Nagasaka, Yasuo; Fujii, Naotaka

    2015-01-01

    Context is information linked to a situation that can guide behavior. In the brain, context is encoded by sensory processing and can later be retrieved from memory. How context is communicated within the cortical network in sensory and mnemonic forms is unknown due to the lack of methods for high-resolution, brain-wide neuronal recording and analysis. Here, we report the comprehensive architecture of a cortical network for context processing. Using hemisphere-wide, high-density electrocorticography, we measured large-scale neuronal activity from monkeys observing videos of agents interacting in situations with different contexts. We extracted five context-related network structures including a bottom-up network during encoding and, seconds later, cue-dependent retrieval of the same network with the opposite top-down connectivity. These findings show that context is represented in the cortical network as distributed communication structures with dynamic information flows. This study provides a general methodology for recording and analyzing cortical network neuronal communication during cognition. DOI: http://dx.doi.org/10.7554/eLife.06121.001 PMID:26416139

  2. Mass media in health promotion: an analysis using an extended information-processing model.

    PubMed

    Flay, B R; DiTecco, D; Schlegel, R P

    1980-01-01

    The information-processing model of the attitude and behavior change process was critically examined and extended from six to 12 levels for a better analysis of change due to mass media campaigns. Findings from social psychology and communications research, and from evaluations of mass media health promotion programs, were reviewed to determine how source, message, channel, receiver, and destination variables affect each of the levels of change of major interest (knowledge, beliefs, attitudes, intentions, and behavior). The factors found most likely to induce permanent attitude and behavior change (most important in health promotion) were: presentation and repetition over long time periods, at different times (including "prime" or high-exposure times), by multiple sources, in novel and involving ways, with appeals to multiple motives, development of social support, and provision of appropriate behavioral skills, alternatives, and reinforcement (preferably in ways that elicit the active participation of the audience). Suggestions for evaluation of mass media programs that take account of this complexity were advanced.

  3. A Disorder of Executive Function and Its Role in Language Processing

    PubMed Central

    Martin, Randi C.; Allen, Corinne M.

    2014-01-01

    R. Martin and colleagues have proposed separate stores for the maintenance of phonological and semantic information in short-term memory. Evidence from patients with aphasia has shown that damage to these separable buffers has specific consequences for language comprehension and production, suggesting an interdependence between language and memory systems. This article discusses recent research on aphasic patients with limited-capacity short-term memories (STMs) and reviews evidence suggesting that deficits in retaining semantic information in STM may be caused by a disorder in the executive control process of inhibition, specific to verbal representations. In contrast, a phonological STM deficit may be due to overly rapid decay. In semantic STM deficits, it is hypothesized that the inhibitory deficit produces difficulty inhibiting irrelevant verbal representations, which may lead to excessive interference. In turn, the excessive interference associated with semantic STM deficits has implications for single-word and sentence processing, and it may be the source of the reduced STM capacity shown by these patients. PMID:18720317

  4. Voluntary eyeblinks disrupt iconic memory.

    PubMed

    Thomas, Laura E; Irwin, David E

    2006-04-01

    In the present research, we investigated whether eyeblinks interfere with cognitive processing. In Experiment 1, the participants performed a partial-report iconic memory task in which a letter array was presented for 106 msec, followed 50, 150, or 750 msec later by a tone that cued recall of one row of the array. At a cue delay of 50 msec between array offset and cue onset, letter report accuracy was lower when the participants blinked following array presentation than under no-blink conditions; the participants also made more mislocation errors under blink conditions. This result suggests that blinking interferes with the binding of object identity and object position in iconic memory. Experiment 2 demonstrated that the interference due to blinks was not merely due to changes in light intensity. Experiments 3 and 4 demonstrated that other motor responses did not interfere with iconic memory. We propose a new phenomenon, cognitive blink suppression, in which blinking inhibits cognitive processing. This phenomenon may be due to neural interference: blinks reduce activation in area V1, which may interfere with the representation of information in iconic memory.

  5. Influence of the electron-cation interaction on electron mobility in dye-sensitized ZnO and TiO2 nanocrystals: a study using ultrafast terahertz spectroscopy.

    PubMed

    Nemec, H; Rochford, J; Taratula, O; Galoppini, E; Kuzel, P; Polívka, T; Yartsev, A; Sundström, V

    2010-05-14

    Charge transport and recombination in nanostructured semiconductors are poorly understood key processes in dye-sensitized solar cells. We have employed time-resolved spectroscopies in the terahertz and visible spectral regions supplemented with Monte Carlo simulations to obtain unique information on these processes. Our results show that charge transport in the active solar cell material can be very different from that in nonsensitized semiconductors, due to strong electrostatic interaction between injected electrons and dye cations at the surface of the semiconductor nanoparticle. For ZnO, this leads to formation of an electron-cation complex which causes fast charge recombination and dramatically decreases the electron mobility even after the dissociation of the complex. Sensitized TiO2 does not suffer from this problem due to its high permittivity efficiently screening the charges.

  6. Mutual information identifies spurious Hurst phenomena in resting state EEG and fMRI data

    NASA Astrophysics Data System (ADS)

    von Wegner, Frederic; Laufs, Helmut; Tagliazucchi, Enzo

    2018-02-01

    Long-range memory in time series is often quantified by the Hurst exponent H, a measure of the signal's variance across several time scales. We analyze neurophysiological time series from electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) resting state experiments with two standard Hurst exponent estimators and with the time-lagged mutual information function applied to discretized versions of the signals. A confidence interval for the mutual information function is obtained from surrogate Markov processes with equilibrium distribution and transition matrix identical to the underlying signal. For EEG signals, we construct an additional mutual information confidence interval from a short-range correlated, tenth-order autoregressive model. We reproduce the previously described Hurst phenomenon (H > 0.5) in the analytical amplitude of alpha frequency band oscillations, in EEG microstate sequences, and in fMRI signals, but we show that the Hurst phenomenon occurs without long-range memory in the information-theoretical sense. We find that the mutual information function of neurophysiological data behaves differently from fractional Gaussian noise (fGn), for which the Hurst phenomenon is a sufficient condition to prove long-range memory. Two other well-characterized, short-range correlated stochastic processes (Ornstein-Uhlenbeck, Cox-Ingersoll-Ross) also yield H > 0.5, whereas their mutual information functions lie within the Markovian confidence intervals, similar to neural signals. In these processes, which do not have long-range memory by construction, a spurious Hurst phenomenon occurs due to slow relaxation times and heteroscedasticity (time-varying conditional variance). In summary, we find that mutual information correctly distinguishes long-range from short-range dependence in the theoretical and experimental cases discussed. Our results also suggest that the stationary fGn process is not sufficient to describe neural data, which seem to belong to a more general class of stochastic processes, in which multiscale variance effects produce Hurst phenomena without long-range dependence. In our experimental data, the Hurst phenomenon and long-range memory appear as different system properties that should be estimated and interpreted independently.
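
    For readers who want to reproduce the basic quantity, the sketch below estimates the time-lagged mutual information of a discretized 1-D signal from a joint histogram; applied to a short-range correlated AR(1) surrogate it decays quickly with lag, which is the kind of behaviour the study compares against Markovian confidence intervals. The bin count and the AR(1) test signal are illustrative choices, not the paper's exact estimator settings.

```python
import numpy as np

def lagged_mutual_information(x, lag, n_bins=8):
    """Time-lagged mutual information I(x_t; x_{t+lag}) of a 1-D signal,
    estimated from a joint histogram of the discretized series (in nats)."""
    a, b = x[:-lag], x[lag:]
    joint, _, _ = np.histogram2d(a, b, bins=n_bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nz = p_xy > 0
    return float(np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # AR(1) process: short-range correlated, so the lagged MI should decay fast.
    x = np.zeros(10_000)
    for t in range(1, len(x)):
        x[t] = 0.9 * x[t - 1] + rng.normal()
    print([round(lagged_mutual_information(x, lag), 3) for lag in (1, 10, 100)])
```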

  7. Leveraging Health Information Exchange to Improve Population Health Reporting Processes: Lessons in Using a Collaborative-Participatory Design Process

    PubMed Central

    Revere, Debra; Dixon, Brian E.; Hills, Rebecca; Williams, Jennifer L.; Grannis, Shaun J.

    2014-01-01

    Introduction: Surveillance, or the systematic monitoring of disease within a population, is a cornerstone function of public health. Despite significant investment in information technologies (IT) to improve the public’s health, health care providers continue to rely on manual, spontaneous reporting processes that can result in incomplete and delayed surveillance activities. Background: Participatory design principles advocate including real users and stakeholders when designing an information system to ensure high ecological validity of the product, incorporate relevance and context into the design, reduce misconceptions designers can make due to insufficient domain expertise, and ultimately reduce barriers to adoption of the system. This paper focuses on the collaborative and informal participatory design process used to develop enhanced, IT-enabled reporting processes that leverage available electronic health records in a health information exchange to prepopulate notifiable-conditions report forms used by public health authorities. Methods: Over nine months, public health stakeholders, technical staff, and informatics researchers were engaged in a multiphase participatory design process that included public health stakeholder focus groups, investigator-engineering team meetings, public health survey and census regarding high-priority data elements, and codesign of exploratory prototypes and final form mock-ups. Findings: A number of state-mandated report fields that are not highly used or desirable for disease investigation were eliminated, which allowed engineers to repurpose form space for desired and high-priority data elements and improve the usability of the forms. Our participatory design process ensured that IT development was driven by end user expertise and needs, resulting in significant improvements to the layout and functionality of the reporting forms. Discussion: In addition to informing report form development, engaging with public health end users and stakeholders through the participatory design process provided new insights into public health workflow and allowed the team to quickly triage user requests while managing user expectations within the realm of engineering possibilities. Conclusion: Engaging public health, engineering staff, and investigators in a shared codesigning process ensured that the new forms will not only meet real-life needs but will also support development of a product that will be adopted and, ultimately, improve communicable and infectious disease reporting by clinicians to public health. PMID:25848615

  8. Conflict minerals from the Democratic Republic of the Congo—Tin processing plants, a critical part of the tin supply chain

    USGS Publications Warehouse

    Anderson, Charles

    2015-03-24

    Post-beneficiation processing plants (generally called smelters and refineries) for 3TG mineral ores and concentrates were identified by company and industry association representatives as being a link in the 3TG mineral supply chain through which these minerals can be traced to their source of origin (mine). The determination of the source of origin is critical to the development of a complete and transparent conflict-free mineral supply chain. Tungsten processing plants were the subject of the first fact sheet in this series published by the USGS NMIC in August 2014. Background information about historical conditions and multinational stakeholders’ voluntary due diligence guidance for minerals from conflict-affected and high-risk areas was presented in the tungsten fact sheet. Tantalum processing plants were the subject of the second fact sheet in this series published by the USGS NMIC in December 2014. This fact sheet, the third in the series about 3TG minerals, focuses on the tin supply chain by listing selected processors that produced tin materials commercially worldwide during 2013–14. It does not provide any information regarding the sources of the material processed in these facilities.

  9. Frequency Domain Analysis of Sensor Data for Event Classification in Real-Time Robot Assisted Deburring

    PubMed Central

    Pappachan, Bobby K; Caesarendra, Wahyu; Tjahjowidodo, Tegoeh; Wijaya, Tomi

    2017-01-01

    Process monitoring using indirect methods relies on the use of sensors. Using sensors to acquire vital process-related information also presents the problem of big data management and analysis. Due to uncertainty in the frequency of the events occurring, a higher sampling rate is often used in real-time monitoring applications to increase the chances of capturing and understanding all possible events related to the process. Advanced signal processing methods are used to further extract meaningful information from the acquired data. In this research work, the power spectral density (PSD) of sensor data acquired at sampling rates between 40 and 51.2 kHz was calculated, and the correlation between PSD and the completed number of cycles/passes is presented. Here, the progress in the number of cycles/passes is the event this research work intends to classify, and the algorithm used to compute the PSD is Welch's estimate method. A comparison between Welch's estimate method and statistical methods is also discussed. A clear correlation was observed using Welch's estimate to classify the number of cycles/passes. The paper also succeeds in distinguishing the vibration signal generated by the spindle from the vibration signal acquired during the finishing process. PMID:28556809
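
    A minimal version of the spectral step is shown below: Welch's method (as implemented in SciPy) gives the PSD of a sensor signal, from which the power in a band of interest can be tracked across cycles/passes. The sampling rate, band edges, and synthetic test signal are placeholders rather than the study's actual acquisition settings.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, f_lo, f_hi, nperseg=4096):
    """Welch PSD estimate of a sensor signal, then the power integrated over a
    frequency band of interest. Band edges and segment length are placeholders;
    the study sampled at roughly 40-51.2 kHz."""
    freqs, psd = welch(signal, fs=fs, nperseg=nperseg)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return np.trapz(psd[band], freqs[band])

if __name__ == "__main__":
    fs = 51_200
    t = np.arange(0, 1.0, 1 / fs)
    rng = np.random.default_rng(0)
    # synthetic vibration: a 2 kHz tone buried in noise
    x = np.sin(2 * np.pi * 2_000 * t) + 0.5 * rng.normal(size=t.size)
    print(band_power(x, fs, 1_900, 2_100))
```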

  10. Inferring the phase of the moon from the color of sunset

    NASA Astrophysics Data System (ADS)

    Thiermann, Ryan; Sweeney, Alison; Murugan, Arvind

    We use information theory to investigate whether patterns in the spectral progression of twilight are informative of the lunar phase. Such optical cues have been sought to explain the synchronized spawning of corals and other biological processes that are coupled to the lunar cycle. We first quantify the maximum available information about lunar phase in twilight by combining measurements of twilight spectrum and models of spectral variations due to weather and atmospheric changes. We then quantify the biophysically accessible information by accounting for the spectral resolution of opsin proteins and the temporal resolution with which organisms can track spectral changes. We find that in most climates, relative spectral variation is a more reliable indicator of lunar phase than intensity variation alone since the former is less affected by cloud cover. We also find that organisms can extract most available information with three distinct opsins and reasonable integration times.

  11. Information flow and work productivity through integrated information technology

    NASA Technical Reports Server (NTRS)

    Wigand, R. T.

    1985-01-01

    The work environment surrounding integrated office systems is reviewed. The known effects of automated office technologies are synthesized, and their known impact on work efficiency is reviewed. These effects are explored with regard to their impact on networks, work flow/processes, as well as organizational structure and power. Particular emphasis is given to structural changes due to the introduction of newer information technologies in organizations. The new information technologies have restructured the average organization's middle ranks and, as a consequence, these have shrunk drastically. Organizational pyramids have flattened, with fewer levels, since executives have realized that they can obtain the needed information more quickly and directly via the new technologies and do not have to rely on middle-level managers. Power shifts typically accompany the introduction of these technologies, resulting in the generation of a new form of organizational power.

  12. Social relevance: toward understanding the impact of the individual in an information cascade

    NASA Astrophysics Data System (ADS)

    Hall, Robert T.; White, Joshua S.; Fields, Jeremy

    2016-05-01

    Information Cascades (IC) through a social network occur due to the decision of users to disseminate content. We define this decision process as User Diffusion (UD). IC models typically describe an information cascade by treating a user as a node within a social graph, where a node's reception of an idea is represented by some activation state. The probability of activation then becomes a function of a node's connectedness to other activated nodes as well as, potentially, the history of activation attempts. We enrich this Coarse-Grained User Diffusion (CGUD) model by applying actor type logics to the nodes of the graph. The resulting Fine-Grained User Diffusion (FGUD) model utilizes prior research in actor typing to generate a predictive model regarding the future influence a user will have on an Information Cascade. Furthermore, we introduce a measure of Information Resonance that is used to aid in predictions regarding user behavior.
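
    As a baseline for the coarse-grained (CGUD) picture, the sketch below runs a standard independent-cascade style simulation on an adjacency list, where each newly activated node gets one chance to activate each neighbour with probability p. The fine-grained refinements (actor types, information resonance) described in the abstract are not modelled here.

```python
import random

def independent_cascade(graph, seeds, p, seed=0):
    """Coarse-grained cascade: each newly activated node gets one chance to
    activate each currently inactive neighbour with probability p.
    `graph` is an adjacency list {node: [neighbours]}. This is a CGUD-style
    baseline; the paper's fine-grained model adds actor types on top."""
    rng = random.Random(seed)
    active = set(seeds)
    frontier = list(seeds)
    while frontier:
        newly_active = []
        for node in frontier:
            for nb in graph.get(node, []):
                if nb not in active and rng.random() < p:
                    active.add(nb)
                    newly_active.append(nb)
        frontier = newly_active
    return active

if __name__ == "__main__":
    g = {0: [1, 2], 1: [2, 3], 2: [3, 4], 3: [4], 4: []}
    print(independent_cascade(g, seeds=[0], p=0.6))
```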

  13. Information transduction capacity reduces the uncertainties in annotation-free isoform discovery and quantification

    PubMed Central

    Deng, Yue; Bao, Feng; Yang, Yang; Ji, Xiangyang; Du, Mulong; Zhang, Zhengdong

    2017-01-01

    The automated transcript discovery and quantification of high-throughput RNA sequencing (RNA-seq) data are important tasks of next-generation sequencing (NGS) research. However, these tasks are challenging due to the uncertainties that arise in the inference of complete splicing isoform variants from partially observed short reads. Here, we address this problem by explicitly reducing the inherent uncertainties in a biological system caused by missing information. In our approach, the RNA-seq procedure for transforming transcripts into short reads is considered an information transmission process. Consequently, the data uncertainties are substantially reduced by exploiting the information transduction capacity of information theory. The experimental results obtained from the analyses of simulated datasets and RNA-seq datasets from cell lines and tissues demonstrate the advantages of our method over state-of-the-art competitors. Our algorithm is an open-source implementation of MaxInfo. PMID:28911101

  14. Descriptive survey about causes of illness given by the parents of children with cancer.

    PubMed

    Matteo, Bernardi; Pierluigi, Badon

    2008-04-01

    When a doctor diagnoses a child's illness as cancer, parents very often react by creating wrong and unrealistic theories about the origins of their child's illness, which in turn generates self-blame in the parents, who take responsibility for the disease. To find out what parents believe about the origins of their children's illness. Descriptive study. Seventy-two couples of parents whose children with cancer were under treatment in the paediatric haemato-oncology ward of the Padova hospital, recruited by a non-probabilistic sampling method. A questionnaire based on the current literature was used, which investigates parents' beliefs about the causes of the illness, whether parents seek information about the illness and the origins of cancer, and which information sources they use, in order to establish whether there is a connection between these factors. Eighty-seven percent of the sample group thinks that there is a specific origin of their child's illness: 27% believes the cause is environmental pollution, 26% believes it is due to radiation emissions, 26% believes it is due to genetic factors, and 8% believes it is due to other causes. Eighty-six percent and 70% of the sample search for information about the illness and its causes, respectively; 64% of the parents state that the first meeting with the medical staff, in which the illness is explained and they are informed that there are no known causes that produce it, does not clarify their doubts. The sources most often used to search for more information and explanations are the physicians in the ward, the internet, and medical books. This survey confirms the importance of an "advocacy" role of the nurse in educating the caregiver and the need to create instruments that guide parents in the information process and the search for reliable information. Nurses need to be cognizant that their care is crucial not just for the child, but for the entire family.

  15. Study of amended reports to evaluate and improve surgical pathology processes.

    PubMed

    Meier, Frederick A; Varney, Ruan C; Zarbo, Richard J

    2011-09-01

    Amended surgical pathology reports record defects in the process of transforming tissue specimens into diagnostic information. Systematic study of amended reports tests 2 hypotheses: (a) that tracking amendment frequencies and the distribution of amendment types reveals relevant aspects of quality in surgical pathology's daily transformation of specimens into diagnoses and (b) that such tracking measures the effect, or lack of effect, of efforts to improve surgical pathology processes. We applied a binary definition of altered reports as either amendments or addenda and a taxonomy of defects that caused amendments as misidentifications, specimen defects, misinterpretations, and report defects. During the introduction of a LEAN process improvement approach-the Henry Ford Productions System-we followed trends in amendment rates and defect fractions to (a) evaluate specific interventions, (b) sort case-by-case root causes of misidentifications, specimen defects, and misinterpretations, and (c) audit the ongoing accuracy of the classification of changed reports. LEAN is the management and production system of the Toyota Motor Corporation that promotes continuous improvement; it considers wasted resources expended for purposes other than creating value for end customers and targets such expenditures for elimination. Introduction of real-time editing of amendments saw annual amendment rates increase from 4.8/1000 to 10.1/1000 and then decrease in an incremental manner to 5.6/1000 as Henry Ford Productions System-specific interventions were introduced. Before introduction of HFPS interventions, about a fifth of the amendments were due to misidentifications, a 10th were due to specimen defects, a quarter due to misinterpretation, and almost half were due to report defects. During the period of the initial application of HFPS, the fraction of amendments due to misidentifications decreased as those due to report defects increased, in a statistically linked manner. As HFPS interventions took hold, misidentifications fell from 16% to 9%, specimen defect rates remained variable, ranging between 2% and 11%, and misinterpretations fell from 18% to 3%. Reciprocally, report defects rose from 64% to 83% of all amendment-causing defects. A case-by-case study of misidentifications, specimen defects, and misinterpretations found that (a) intervention at the specimen collection level had disappointingly little effect on patient misidentifications; (b) standardization of specimen accession and gross examination reduced only specimen defects surrounding ancillary testing; but (c) a double review of breast and prostate cases was associated with drastically reduced misinterpretation defects. Finally, audit of both amendments and addenda demonstrated that 10% of the so-called addenda actually qualified as amendments. Monitored by the consistent taxonomy, rates of amended reports first rose, then fell. Examining specific defect categories provided information for evaluating specific LEAN interventions. Tracking the downward trend of amendment rates seemed to document the overall success of surgical pathology quality improvement efforts. Process improvements modestly decreased fractions of misidentifications and markedly decreased misinterpretation fractions. Classification integrity requires real time, independent editing of both amendments (changed reports) and addenda (addition to reports).

  16. Web Based Information System for Job Training Activities Using Personal Extreme Programming (PXP)

    NASA Astrophysics Data System (ADS)

    Asri, S. A.; Sunaya, I. G. A. M.; Rudiastari, E.; Setiawan, W.

    2018-01-01

    Job training is one of the subjects in a university or polytechnic that involves many users and reporting activities. Time and distance became problems for users in reporting and completing their obligatory tasks during job training, due to the location where the job training took place. This research tried to develop a web-based information system for job training to overcome these problems. The system was developed using Personal Extreme Programming (PXP). PXP is an agile method that combines Extreme Programming (XP) and the Personal Software Process (PSP). The information system was developed and tested; 24% of users strongly agree, 74% agree, 1% disagree, and 0% strongly disagree regarding the system's functionality.

  17. Track-to-track association for object matching in an inter-vehicle communication system

    NASA Astrophysics Data System (ADS)

    Yuan, Ting; Roth, Tobias; Chen, Qi; Breu, Jakob; Bogdanovic, Miro; Weiss, Christian A.

    2015-09-01

    Autonomous driving poses unique challenges for vehicle environment perception due to the complex driving environment in which the autonomous vehicle finds itself and from which it must differentiate remote vehicles. Due to the inherent uncertainty of traffic environments and the incomplete knowledge caused by sensor limitations, an autonomous driving system using only local onboard sensor information is generally not sufficient for reliable intelligent driving with guaranteed safety. In order to overcome the limitations of the local (host) vehicle sensing system and to increase the likelihood of correct detections and classifications, collaborative information from cooperative remote vehicles could substantially improve the effectiveness of the vehicle decision-making process. The Dedicated Short Range Communication (DSRC) system provides a powerful inter-vehicle wireless communication channel to enhance the host vehicle's environment perception capability with the aid of information transmitted from remote vehicles. However, there is a major challenge before one can fuse the DSRC-transmitted remote information and the host vehicle's Radar-observed information (in the present case): the remote DSRC data must be correctly associated with the corresponding onboard Radar data; namely, an object matching problem. Direct raw data association (i.e., measurement-to-measurement association, M2MA) is straightforward but error-prone, due to the inherently uncertain nature of the observation data. The uncertainties could lead to serious difficulty in the matching decision, especially with non-stationary data. In this study, we present an object matching algorithm based on track-to-track association (T2TA) and evaluate the proposed approach with prototype vehicles in real traffic scenarios. To fully exploit the potential of the DSRC system, only GPS position data from the remote vehicle are used in the fusion center (at the host vehicle); that is, we try to get what we need from the least amount of information. Additional feature information could help the data association but is not currently considered. Compared to M2MA, the benefits of the T2TA object matching approach are: (i) tracks that take important statistical information into account can provide more reliable inference results; (ii) the smoothed track trajectories can be used for easier shape matching; and (iii) each local vehicle can design its own tracker and send only tracks to the fusion center, alleviating communication constraints. A real traffic study covering different driving environments, based on a statistical hypothesis test, shows promising object matching results with significant practical implications.
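
    One generic building block of T2TA is gating candidate track pairs by Mahalanobis distance and then assigning the best-gated pairs greedily; the sketch below does exactly that for 2-D position tracks. It is offered only as an illustration of track-level association, not as the authors' algorithm, and the chi-square gate value is a textbook choice.

```python
import numpy as np
from itertools import product

def associate_tracks(radar_tracks, dsrc_tracks, gate=9.21):
    """Greedy track-to-track association by Mahalanobis distance between 2-D
    position estimates. Each track is a (mean, covariance) pair; Pr + Pd models
    the combined track uncertainty, and gate=9.21 is the chi-square 99%
    threshold for 2 degrees of freedom. A generic T2TA building block only."""
    candidates = []
    for (i, (xr, Pr)), (j, (xd, Pd)) in product(enumerate(radar_tracks),
                                                enumerate(dsrc_tracks)):
        d = xr - xd
        m2 = float(d @ np.linalg.solve(Pr + Pd, d))
        if m2 <= gate:
            candidates.append((m2, i, j))
    pairs, used_r, used_d = [], set(), set()
    for m2, i, j in sorted(candidates):      # assign the best-gated pairs first
        if i not in used_r and j not in used_d:
            pairs.append((i, j))
            used_r.add(i)
            used_d.add(j)
    return pairs

if __name__ == "__main__":
    radar = [(np.array([10.0, 2.0]), np.eye(2)), (np.array([30.0, -1.0]), np.eye(2))]
    dsrc = [(np.array([10.4, 2.3]), 2 * np.eye(2)), (np.array([55.0, 0.0]), 2 * np.eye(2))]
    print(associate_tracks(radar, dsrc))     # expected: [(0, 0)]
```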

  18. Proceedings of the International Workshop on High-Level Language Computer Architecture, May 26-28, 1980, Fort Lauderdale, Florida

    DTIC Science & Technology

    1980-06-01


  19. Identifying unmet informational needs in the inpatient setting to increase patient and caregiver engagement in the context of pediatric hematopoietic stem cell transplantation.

    PubMed

    Kaziunas, Elizabeth; Hanauer, David A; Ackerman, Mark S; Choi, Sung Won

    2016-01-01

    Patient-centered care has been shown to improve patient outcomes, satisfaction, and engagement. However, there is a paucity of research on patient-centered care in the inpatient setting, including an understanding of unmet informational needs that may be limiting patient engagement. Pediatric hematopoietic stem cell transplantation (HSCT) represents an ideal patient population for elucidating unmet informational needs, due to the procedure's complexity and its requirement for caregiver involvement. We conducted field observations and semi-structured interviews of pediatric HSCT caregivers and patients to identify informational challenges in the inpatient hospital setting. Data were analyzed using a thematic grounded theory approach. Three stages of the caregiving experience that could potentially be supported by a health information technology system, with the goal of enhancing patient/caregiver engagement, were identified: (1) navigating the health system and learning to communicate effectively with the healthcare team, (2) managing daily challenges of caregiving, and (3) transitioning from inpatient care to long-term outpatient management. We provide four practical recommendations to meet the informational needs of pediatric HSCT patients and caregivers: (1) provide patients/caregivers with real-time access to electronic health record data, (2) provide information about the clinical trials in which the patient is enrolled, (3) provide information about the patient's care team, and (4) properly prepare patients and caregivers for hospital discharge. Pediatric HSCT caregivers and patients have multiple informational needs that could be met with a health information technology system that integrates data from several sources, including electronic health records. Meeting these needs could reduce patients' and caregivers' anxiety surrounding the care process; reduce information asymmetry between caregivers/patients and providers; empower patients/caregivers to participate in the care process; and, ultimately, increase patient/caregiver engagement in the care process. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. Harnessing ecosystem models and multi-criteria decision analysis for the support of forest management.

    PubMed

    Wolfslehner, Bernhard; Seidl, Rupert

    2010-12-01

    The decision-making environment in forest management (FM) has changed drastically during the last decades. Forest management planning is facing increasing complexity due to a widening portfolio of forest goods and services, a societal demand for a rational, transparent decision process and rising uncertainties concerning future environmental conditions (e.g., climate change). Methodological responses to these challenges include an intensified use of ecosystem models to provide an enriched, quantitative information base for FM planning. Furthermore, multi-criteria methods are increasingly used to amalgamate information, preferences, expert judgments and value expressions, in support of the participatory and communicative dimensions of modern forestry. Although the potential of combining these two approaches has been demonstrated in a number of studies, methodological aspects in interfacing forest ecosystem models (FEM) and multi-criteria decision analysis (MCDA) are scarcely addressed explicitly. In this contribution we review the state of the art in FEM and MCDA in the context of FM planning and highlight some of the crucial issues when combining ecosystem and preference modeling. We discuss issues and requirements in selecting approaches suitable for supporting FM planning problems from the growing body of FEM and MCDA concepts. We furthermore identify two major challenges in a harmonized application of FEM-MCDA: (i) the design and implementation of an indicator-based analysis framework capturing ecological and social aspects and their interactions relevant for the decision process, and (ii) holistic information management that supports consistent use of different information sources, provides meta-information as well as information on uncertainties throughout the planning process.

  1. Harnessing Ecosystem Models and Multi-Criteria Decision Analysis for the Support of Forest Management

    NASA Astrophysics Data System (ADS)

    Wolfslehner, Bernhard; Seidl, Rupert

    2010-12-01

    The decision-making environment in forest management (FM) has changed drastically during the last decades. Forest management planning is facing increasing complexity due to a widening portfolio of forest goods and services, a societal demand for a rational, transparent decision process and rising uncertainties concerning future environmental conditions (e.g., climate change). Methodological responses to these challenges include an intensified use of ecosystem models to provide an enriched, quantitative information base for FM planning. Furthermore, multi-criteria methods are increasingly used to amalgamate information, preferences, expert judgments and value expressions, in support of the participatory and communicative dimensions of modern forestry. Although the potential of combining these two approaches has been demonstrated in a number of studies, methodological aspects in interfacing forest ecosystem models (FEM) and multi-criteria decision analysis (MCDA) are scarcely addressed explicitly. In this contribution we review the state of the art in FEM and MCDA in the context of FM planning and highlight some of the crucial issues when combining ecosystem and preference modeling. We discuss issues and requirements in selecting approaches suitable for supporting FM planning problems from the growing body of FEM and MCDA concepts. We furthermore identify two major challenges in a harmonized application of FEM-MCDA: (i) the design and implementation of an indicator-based analysis framework capturing ecological and social aspects and their interactions relevant for the decision process, and (ii) holistic information management that supports consistent use of different information sources, provides meta-information as well as information on uncertainties throughout the planning process.
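
    As a toy illustration of how ecosystem-model outputs and stakeholder preferences can be combined in an MCDA step, the sketch below applies simple additive weighting with min-max normalization to a hypothetical indicator matrix. The indicators, alternatives and weights are invented for illustration and are not taken from the review.

```python
import numpy as np

# Hypothetical indicator matrix from a forest ecosystem model:
# rows = management alternatives, columns = indicators.
indicators = np.array([
    [320.0, 0.62, 14.0],   # alternative A: timber yield, biodiversity index, carbon stock
    [280.0, 0.74, 16.5],   # alternative B
    [350.0, 0.55, 12.0],   # alternative C
])
benefit = np.array([True, True, True])   # all indicators treated as "more is better"
weights = np.array([0.4, 0.35, 0.25])    # stakeholder preference weights (sum to 1)

# Min-max normalization per indicator; cost-type indicators would be flipped.
lo, hi = indicators.min(axis=0), indicators.max(axis=0)
norm = (indicators - lo) / (hi - lo)
norm = np.where(benefit, norm, 1.0 - norm)

scores = norm @ weights                  # simple additive weighting
ranking = np.argsort(-scores)            # best alternative first
print(scores, ranking)
```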

  2. Meta-control of combustion performance with a data mining approach

    NASA Astrophysics Data System (ADS)

    Song, Zhe

    Large-scale combustion processes are complex and pose challenges for optimizing their performance. Traditional approaches based on thermodynamics have limitations in finding optimal operating regions due to the time-shifting nature of the process. Recent advances in information technology enable people to collect large volumes of process data easily and continuously. The collected process data contain rich information about the process and, to some extent, represent a digital copy of the process over time. Although large volumes of data exist in industrial combustion processes, they are not fully utilized to the level where the process can be optimized. Data mining is an emerging science which finds patterns or models in large data sets. It has found many successful applications in business marketing, medical and manufacturing domains. The focus of this dissertation is on applying data mining to industrial combustion processes, and ultimately optimizing the combustion performance. However, the philosophy, methods and frameworks discussed in this research can also be applied to other industrial processes. Optimizing an industrial combustion process has two major challenges. One is that the underlying process model changes over time, so obtaining an accurate process model is nontrivial. The other is that a high-fidelity process model is usually highly nonlinear, so solving the optimization problem requires efficient heuristics. This dissertation sets out to solve these two major challenges. The major contribution of this four-year research is a data-driven solution for optimizing the combustion process, in which a process model or knowledge is identified from the process data and optimization is then executed by evolutionary algorithms that search for optimal operating regions.
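
    A minimal sketch of the general data-driven idea described above, under the assumption of a quadratic surrogate model and a toy (mu + lambda) evolutionary search; the synthetic data, variable names and parameter ranges are invented and do not reflect the dissertation's actual models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical historical process data: x = (fuel rate, air/fuel ratio), y = efficiency.
X = rng.uniform([0.5, 10.0], [1.5, 20.0], size=(200, 2))
y = 0.9 - 0.3 * (X[:, 0] - 1.0) ** 2 - 0.02 * (X[:, 1] - 16.0) ** 2 + rng.normal(0, 0.01, 200)

# Step 1: identify a simple quadratic surrogate model from the data (least squares).
def features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)
surrogate = lambda pts: features(np.atleast_2d(pts)) @ coef

# Step 2: search the surrogate with a tiny (mu + lambda) evolutionary strategy.
lo, hi = np.array([0.5, 10.0]), np.array([1.5, 20.0])
pop = rng.uniform(lo, hi, size=(20, 2))
for _ in range(50):
    children = np.clip(pop + rng.normal(0, 0.05, pop.shape) * (hi - lo), lo, hi)
    both = np.vstack([pop, children])
    pop = both[np.argsort(-surrogate(both))][:20]   # keep the fittest 20

print("estimated optimal operating point:", pop[0])
```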

  3. Heterosexual female adolescents' decision-making about sexual intercourse and pregnancy in rural Ontario, Canada.

    PubMed

    Ezer, Paulina; Leipert, Bev; Evans, Marilyn; Regan, Sandra

    2016-01-01

    Rural female adolescents experience unique circumstances to sexual health care and information as compared to urban adolescents. These circumstances are largely due to their more isolated geographical location and rural sociocultural factors. These circumstances may be contributing factors to an incidence of adolescent pregnancy that is higher in rural areas than in urban cities. Thus, this higher incidence of pregnancy may be due to the ways in which rural adolescents make decisions regarding engagement in sexual intercourse. However, the rural female adolescent sexual decision-making process has rarely, if ever, been studied, and further investigation of this process is necessary. Focusing on rural female adolescents aged 16-19 years is especially significant as this age range is used for reporting most pregnancy and birth statistics in Ontario. Charmaz's guidelines for a constructivist grounded theory methodology were used to gain an in-depth understanding of eight Ontario rural female adolescents' decision-making process regarding sexual intercourse and pregnancy, and how they viewed rural factors and circumstances influencing this process. Research participants were obtained through initial sampling (from criteria developed prior to the study) and theoretical sampling (by collecting data that better inform the categories emerging from the data). Eight participants, aged 16-19 years, were invited to each take part in 1-2-hour individual interviews, and four of these participants were interviewed a second time to verify and elaborate on emerging constructed concepts, conceptual relationships, and the developing process. Data collection and analysis included both field notes and individual interviews in person and over the telephone. Data were analyzed for emerging themes to construct a theory to understand the participants' experiences making sexual decisions in a rural environment. The adolescent sexual decision-making process, Prioritizing Influences, that emerged from the analysis was a complex and non-linear process that involved prioritizing four influences within the rural context. The influences that participants of this study described as being part of their sexual decision-making process were personal values and circumstances, family values and expectations, friends' influences, and community influences. When influences coincided, they strengthened participants' sexual decisions, whereas when influences opposed each other, participants felt conflicted and prioritized the influence that had the most effect on their personal lives and future goals. Although these influences may be common to all adolescents, they impact the rural female adolescent sexual decision-making process by influencing and being influenced by geographical and sociocultural factors that make up the rural context. This study reveals important new and preliminary information about rural female adolescents' sexual decision-making process and factors that affect it. Findings improve understanding of how rural female adolescents make choices regarding sexual intercourse and pregnancy and can be used to guide future research projects that could facilitate effective development of sexual health promotion initiatives, inform rural health policy and practices, and enhance existing sexual education programs in rural communities.

  4. Leveraging human oversight and intervention in large-scale parallel processing of open-source data

    NASA Astrophysics Data System (ADS)

    Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.

    2015-05-01

    The popularity of cloud computing, along with the increased availability of cheap storage, has made it necessary to elaborate and transform large volumes of open-source data in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like Map-Reduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security and threat monitoring. Although this mutual participation seems easily exploitable, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is carried out down the chain. Second, the amount of reprocessing often needs to be minimized in order to optimize the use of resources, which are of limited availability. To address these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the parallel processing of large amounts of open-source data.
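
    The sketch below illustrates the minimal-reprocessing idea in a toy map-reduce word count: when a human correction changes one input record, only the reduce keys that record touched (before or after the correction) are recomputed. The data and function names are hypothetical, and the example is far simpler than the Map-Reduce setting discussed in the paper.

```python
from collections import defaultdict

def map_record(text):
    """Map step: per-record word counts."""
    counts = defaultdict(int)
    for w in text.split():
        counts[w] += 1
    return dict(counts)

records = {0: "alpha beta", 1: "beta gamma", 2: "alpha gamma gamma"}
record_keys = {}                      # provenance: record id -> its mapped (key, value) pairs
results = defaultdict(int)

def process(record_ids):
    touched = set()
    for rid in record_ids:
        old = record_keys.get(rid, {})
        new = map_record(records[rid])
        record_keys[rid] = new
        touched |= set(old) | set(new)
    for k in touched:                 # re-reduce only the affected keys
        results[k] = sum(kv.get(k, 0) for kv in record_keys.values())

process(records.keys())               # initial full pass
records[1] = "beta beta delta"        # human correction arrives asynchronously
process([1])                          # minimal reprocessing down the chain
print(dict(results))
```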

  5. The role of information search in seeking alternative treatment for back pain: a qualitative analysis

    PubMed Central

    2014-01-01

    Background Health consumers have moved away from a reliance on medical practitioner advice to more independent decision processes, and their information search processes have subsequently widened. This study examined how persons with back pain searched for alternative treatment types and service providers. That is, what information do they seek and how; what sources do they use and why; and by what means do they search for it? Methods 12 persons with back pain were interviewed. The method used was convergent interviewing. This involved a series of semi-structured questions to obtain open-ended answers. The interviewer analysed the responses and refined the questions after each interview, to converge on the dominant factors influencing decisions about treatment patterns. Results Persons with back pain mainly search their memories and use word of mouth (their doctor and friends) for information about potential treatments and service providers. Their search is generally limited due to personal, provider-related and information-supply reasons. However, they did want in-depth information about the alternative treatments and providers in an attempt to establish a priori their efficacy in treating their specific back problems. They searched different sources depending on the type of information they required. Conclusions The findings differ from previous studies about the types of information health consumers require when searching for information about alternative or mainstream healthcare services. The results identified, for the first time, that limited information availability was only one of three categories of reasons why persons with back pain do not search for more information, particularly from external non-personal sources. PMID:24725300

  6. Characterization of Explosives Processing Waste Decomposition Due to Composting. Phase 2

    DTIC Science & Technology

    1992-11-01

    with Ceriodaphnia (10 replicates, each containing 15 mL of test solution and one neonate). In each temporal block of tests, Ceriodaphnia survival and... neonate per replicate). This reference validated the biological quality of the dilution water, the Ceriodaphnia food, the test conditions (e.g...incubation temperature and photoperiod), and the health of the neonates used to initiate the tests. Information about the leachates, including the

  7. Demons registration for in vivo and deformable laser scanning confocal endomicroscopy.

    PubMed

    Chiew, Wei-Ming; Lin, Feng; Seah, Hock Soon

    2017-09-01

    A critical effect found in noninvasive in vivo endomicroscopic imaging modalities is image distortions due to sporadic movement exhibited by living organisms. In three-dimensional confocal imaging, this effect results in a dataset that is tilted across deeper slices. Apart from that, the sequential flow of the imaging-processing pipeline restricts real-time adjustments due to the unavailability of information obtainable only from subsequent stages. To solve these problems, we propose an approach to render Demons-registered datasets as they are being captured, focusing on the coupling between registration and visualization. To improve the acquisition process, we also propose a real-time visual analytics tool, which complements the imaging pipeline and the Demons registration pipeline with useful visual indicators to provide real-time feedback for immediate adjustments. We highlight the problem of deformation within the visualization pipeline for object-ordered and image-ordered rendering. Visualizations of critical information including registration forces and partial renderings of the captured data are also presented in the analytics system. We demonstrate the advantages of the algorithmic design through experimental results with both synthetically deformed datasets and actual in vivo, time-lapse tissue datasets expressing natural deformations. Remarkably, this algorithm design is for embedded implementation in intelligent biomedical imaging instrumentation with customizable circuitry. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
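
    For orientation, a single Thirion-style demons iteration in 2D can be sketched with NumPy/SciPy as below. This is a textbook-style simplification (no multi-resolution scheme, no coupling with the visualization pipeline) and is not the registration pipeline implemented by the authors.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def demons_step(fixed, moving, u, v, sigma=1.5, eps=1e-6):
    """One Thirion-style demons update for 2D images (simplified sketch).

    fixed, moving: 2D float arrays; (u, v): current displacement field (rows, cols).
    Returns the updated, Gaussian-regularized displacement field.
    """
    rows, cols = np.indices(fixed.shape, dtype=float)
    warped = map_coordinates(moving, [rows + u, cols + v], order=1, mode='nearest')

    gy, gx = np.gradient(fixed)            # gradients of the fixed image (rows, cols)
    diff = warped - fixed
    denom = gx ** 2 + gy ** 2 + diff ** 2 + eps
    du = -diff * gy / denom                # demons force along rows
    dv = -diff * gx / denom                # demons force along columns

    # Regularize by Gaussian smoothing of the updated field.
    u = gaussian_filter(u + du, sigma)
    v = gaussian_filter(v + dv, sigma)
    return u, v

# Usage sketch: register a smooth synthetic frame to a shifted copy of itself.
fixed = gaussian_filter(np.random.rand(64, 64), 3)
moving = np.roll(fixed, 2, axis=0)
u = np.zeros_like(fixed); v = np.zeros_like(fixed)
for _ in range(30):
    u, v = demons_step(fixed, moving, u, v)
```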

  8. US residential building air exchange rates: new perspectives to improve decision making at vapor intrusion sites.

    PubMed

    Reichman, Rivka; Shirazi, Elham; Colliver, Donald G; Pennell, Kelly G

    2017-02-22

    Vapor intrusion (VI) is well-known to be difficult to characterize because indoor air (IA) concentrations exhibit considerable temporal and spatial variability in homes throughout impacted communities. To overcome this and other limitations, most VI science has focused on subsurface processes; however, there is a need to understand the role of aboveground processes, especially building operation, in the context of VI exposure risks. This tutorial review focuses on building air exchange rates (AERs) and reviews the literature on building AERs to inform decision making at VI sites. Commonly referenced AER values used by VI regulators and practitioners do not account for the variability in AER values that have been published in indoor air quality studies. The information presented herein highlights that seasonal differences, short-term weather conditions, home age and air conditioning status, which are well known to influence AERs, are also likely to influence IA concentrations at VI sites. Results of a 3D VI model in combination with relevant AER values reveal that IA concentrations can vary by more than one order of magnitude due to air conditioning status and by one order of magnitude due to house age. Collectively, the data presented strongly support the need to consider AERs when making decisions at VI sites.

  9. A GPU-Accelerated Approach for Feature Tracking in Time-Varying Imagery Datasets.

    PubMed

    Peng, Chao; Sahani, Sandip; Rushing, John

    2017-10-01

    We propose a novel parallel connected component labeling (CCL) algorithm along with efficient out-of-core data management to detect and track feature regions of large time-varying imagery datasets. Our approach contributes to the big data field with parallel algorithms tailored for GPU architectures. We remove the data dependency between frames and achieve pixel-level parallelism. Due to the large size, the entire dataset cannot fit into cached memory. Frames have to be streamed through the memory hierarchy (disk to CPU main memory and then to GPU memory), partitioned, and processed as batches, where each batch is small enough to fit into the GPU. To reconnect the feature regions that are separated due to data partitioning, we present a novel batch merging algorithm to extract the region connection information across multiple batches in a parallel fashion. The information is organized in a memory-efficient structure and supports fast indexing on the GPU. Our experiment uses a commodity workstation equipped with a single GPU. The results show that our approach can efficiently process a weather dataset composed of terabytes of time-varying radar images. The advantages of our approach are demonstrated by comparing to the performance of an efficient CPU cluster implementation which is being used by the weather scientists.
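
    The batch-merging step can be illustrated on the CPU with a union-find over labels that touch across the shared boundary row of two adjacent batches. The sketch below uses scipy.ndimage.label for the per-batch labeling and is only a serial illustration of the idea, not the authors' GPU implementation.

```python
import numpy as np
from scipy.ndimage import label

def merge_batches(batch_a, batch_b):
    """Merge CCL results of two vertically adjacent batches of a binary frame.

    Each batch is labeled independently (as if processed as a separate batch);
    labels touching across the shared boundary row are then unified.
    """
    la, na = label(batch_a)
    lb, nb = label(batch_b)
    lb_offset = np.where(lb > 0, lb + na, 0)        # make label ids globally unique

    parent = list(range(na + nb + 1))               # union-find over global ids
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    def union(x, y):
        parent[find(x)] = find(y)

    top, bottom = la[-1, :], lb_offset[0, :]        # pixels on the shared boundary
    for a_id, b_id in zip(top, bottom):
        if a_id > 0 and b_id > 0:
            union(int(a_id), int(b_id))

    merged = np.vstack([la, lb_offset])
    flat = np.array([find(i) for i in range(na + nb + 1)])
    return flat[merged]                             # relabel with merged ids

frame = (np.random.rand(8, 10) > 0.6).astype(int)
labels = merge_batches(frame[:4], frame[4:])
```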

  10. Demons registration for in vivo and deformable laser scanning confocal endomicroscopy

    NASA Astrophysics Data System (ADS)

    Chiew, Wei Ming; Lin, Feng; Seah, Hock Soon

    2017-09-01

    A critical effect found in noninvasive in vivo endomicroscopic imaging modalities is image distortions due to sporadic movement exhibited by living organisms. In three-dimensional confocal imaging, this effect results in a dataset that is tilted across deeper slices. Apart from that, the sequential flow of the imaging-processing pipeline restricts real-time adjustments due to the unavailability of information obtainable only from subsequent stages. To solve these problems, we propose an approach to render Demons-registered datasets as they are being captured, focusing on the coupling between registration and visualization. To improve the acquisition process, we also propose a real-time visual analytics tool, which complements the imaging pipeline and the Demons registration pipeline with useful visual indicators to provide real-time feedback for immediate adjustments. We highlight the problem of deformation within the visualization pipeline for object-ordered and image-ordered rendering. Visualizations of critical information including registration forces and partial renderings of the captured data are also presented in the analytics system. We demonstrate the advantages of the algorithmic design through experimental results with both synthetically deformed datasets and actual in vivo, time-lapse tissue datasets expressing natural deformations. Remarkably, this algorithm design is for embedded implementation in intelligent biomedical imaging instrumentation with customizable circuitry.

  11. A Weld Position Recognition Method Based on Directional and Structured Light Information Fusion in Multi-Layer/Multi-Pass Welding.

    PubMed

    Zeng, Jinle; Chang, Baohua; Du, Dong; Wang, Li; Chang, Shuhe; Peng, Guodong; Wang, Wenzhu

    2018-01-05

    Multi-layer/multi-pass welding (MLMPW) technology is widely used in the energy industry to join thick components. During automatic welding using robots or other actuators, it is very important to recognize the actual weld pass position using visual methods, which can then be used not only to perform reasonable path planning for actuators, but also to correct any deviations between the welding torch and the weld pass position in real time. However, due to the small geometrical differences between adjacent weld passes, existing weld position recognition technologies such as structured light methods are not suitable for weld position detection in MLMPW. This paper proposes a novel method for weld position detection, which fuses various kinds of information in MLMPW. First, a synchronous acquisition method is developed to obtain various kinds of visual information when directional light and structured light sources are on, respectively. Then, interferences are eliminated by fusing adjacent images. Finally, the information from directional and structured light images is fused to obtain the 3D positions of the weld passes. Experiment results show that each process can be done in 30 ms and the deviation is less than 0.6 mm. The proposed method can be used for automatic path planning and seam tracking in the robotic MLMPW process as well as electron beam freeform fabrication process.

  12. Enterprise resource planning for hospitals.

    PubMed

    van Merode, Godefridus G; Groothuis, Siebren; Hasman, Arie

    2004-06-30

    Integrated hospitals need a central planning and control system to plan patients' processes and the required capacity. Given the changes in healthcare, one can ask what type of information system can best support these healthcare delivery organizations. In this review we focus on the potential of enterprise resource planning (ERP) systems for healthcare delivery organizations. First, ERP systems are explained. An overview is then presented of the characteristics of the planning process in hospital environments. Problems with ERP that are due to the special characteristics of healthcare are presented. The situations in which ERP can or cannot be used are discussed. It is suggested to divide hospitals into a part that is concerned only with deterministic processes and a part that is concerned with non-deterministic processes. ERP can be very useful for planning and controlling the deterministic processes.

  13. [Potential of Information and Communications Technology to Improve Intersectoral Processes of Care: A Case Study of the Specialised Outpatient Palliative Care].

    PubMed

    Meyer-Delpho, C; Schubert, H-J

    2015-09-01

    The added value of information and communications technologies should be demonstrated precisely in those areas of care in which intersectoral and interdisciplinary cooperation is particularly important. As part of the accompanying research on a care concept for palliative care patients, the potential of a digital documentation process was analysed in comparison with the conventional paper-based workflow. Data were collected using a multi-method approach and processed in three stages: (1) development and analysis of a palliative care process with a focus on all relevant documentation steps; (2) questionnaire design and the comparative mapping of specific process times; (3) sampling, selection and analysis of patient records and the insights they yield on process iterations. With the use of ICT, the treatment time per patient was reduced by up to 53%, corresponding to a reduction in costs and workload of up to 901 min. The number of patient contacts increased by up to 213%, allowing higher continuity of care. Although the 16% increase in documentation adherence improves the usability of information documented across teams, it partly increases the workload of individual actors. By using a digital health record, around 31% more patients could be treated with the same staffing ratio. The multi-stage analysis of the palliative care process showed that ICT has a decisive influence on the process dimension of intersectoral cooperation. Given favourable organisational conditions, the pioneering work in palliative care also provides important guidance for the successful use of ICT in the context of innovative forms of care. © Georg Thieme Verlag KG Stuttgart · New York.

  14. A numerical model to simulate foams during devolatilization of polymers

    NASA Astrophysics Data System (ADS)

    Khan, Irfan; Dixit, Ravindra

    2014-11-01

    Customers often demand that the polymers sold in the market have low levels of volatile organic compounds (VOC). Some of the processes for making polymers involve the removal of volatiles to levels of parts per million (devolatilization). During this step the volatiles are phase-separated out of the polymer through a combination of heating and applying lower pressure, creating a foam with the pure polymer in the liquid phase and the volatiles in the gas phase. The efficiency of the devolatilization process depends on accurately predicting the onset of solvent phase change in the polymer-volatiles mixture based on the processing conditions. However, due to the complex relationship between the polymer properties and the processing conditions, this is not trivial. In this work, a bubble-scale model is coupled with a bulk-scale transport model to simulate the processing conditions of polymer devolatilization. The bubble-scale model simulates nucleation and bubble growth based on classical nucleation theory and the popular "influence volume approach." As such, it provides the bubble size distribution and number density inside the polymer at any given time and position. This information is used to predict the bulk properties of the polymer and its behavior under the applied processing conditions. Initial results of this modeling approach will be presented.
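
    For reference, the classical nucleation theory mentioned above is commonly written in the standard textbook form below; the actual model may use different prefactors or an influence-volume correction.

```latex
\begin{equation}
  J = J_0 \exp\!\left(-\frac{\Delta G^{*}}{k_B T}\right),
  \qquad
  \Delta G^{*} = \frac{16\pi\,\sigma^{3}}{3\,\Delta P^{2}}
\end{equation}
```

    Here J is the nucleation rate per unit volume, J_0 a kinetic prefactor, sigma the surface tension of the polymer-volatile mixture, Delta P the supersaturation pressure difference driving bubble formation, k_B Boltzmann's constant and T the temperature.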

  15. Triazolam and zolpidem: effects on human memory and attentional processes.

    PubMed

    Mintzer, M Z; Griffiths, R R

    1999-05-01

    The imidazopyridine hypnotic zolpidem may produce less memory and cognitive impairment than classic benzodiazepines, due to its relatively low binding affinity for the benzodiazepine receptor subtypes found in areas of the brain which are involved in learning and memory. The study was designed to compare the acute effects of single oral doses of zolpidem (5, 10, 20 mg/70 kg) and the benzodiazepine hypnotic triazolam (0.125, 0.25, and 0.5 mg/70 kg) on specific memory and attentional processes. Drug effects on memory for target (i.e., focal) information and contextual information (i.e., peripheral details surrounding a target stimulus presentation) were evaluated using a source monitoring paradigm, and drug effects on selective attention mechanisms were evaluated using a negative priming paradigm, in 18 healthy volunteers in a double-blind, placebo-controlled, crossover design. Triazolam and zolpidem produced strikingly similar dose-related effects on memory for target information. Both triazolam and zolpidem impaired subjects' ability to remember whether a word stimulus had been presented to them on the computer screen or whether they had been asked to generate the stimulus based on an antonym cue (memory for the origin of a stimulus, which is one type of contextual information). The results suggested that triazolam, but not zolpidem, impaired memory for the screen location of picture stimuli (spatial contextual information). Although both triazolam and zolpidem increased overall reaction time in the negative priming task, only triazolam increased the magnitude of negative priming relative to placebo. The observed differences between triazolam and zolpidem have implications for the cognitive and pharmacological mechanisms underlying drug-induced deficits in specific memory and attentional processes, as well as for the cognitive and brain mechanisms underlying these processes.

  16. The Generalization of Mutual Information as the Information between a Set of Variables: The Information Correlation Function Hierarchy and the Information Structure of Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Wolf, David R.

    2004-01-01

    The topic of this paper is a hierarchy of information-like functions, here named the information correlation functions, where each function of the hierarchy may be thought of as the information between the variables it depends upon. The information correlation functions are particularly suited to the description of the emergence of complex behaviors due to many-body or many-agent processes. They are particularly well suited to the quantification of the decomposition of the information carried among a set of variables or agents, and its subsets. In more graphical language, they provide the information theoretic basis for understanding the synergistic and non-synergistic components of a system, and as such should serve as a forceful toolkit for the analysis of the complexity structure of complex many agent systems. The information correlation functions are the natural generalization to an arbitrary number of sets of variables of the sequence starting with the entropy function (one set of variables) and the mutual information function (two sets). We start by describing the traditional measures of information (entropy) and mutual information.
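
    One standard way to write the first members of such a hierarchy, using the inclusion-exclusion pattern over joint entropies, is shown below; sign conventions for the three-variable term differ across the literature, and the paper's exact definitions may differ.

```latex
\begin{align}
  H(X)     &= -\sum_{x} p(x)\,\log p(x), \\
  I(X;Y)   &= H(X) + H(Y) - H(X,Y), \\
  I(X;Y;Z) &= H(X) + H(Y) + H(Z) \nonumber \\
           &\quad - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z).
\end{align}
```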

  17. Interactive Whiteboard Integration in Classrooms: Active Teachers Understanding about Their Training Process

    NASA Astrophysics Data System (ADS)

    Pujol, Meritxell Cortada; Quintana, Maria Graciela Badilla; Romaní, Jordi Riera

    With the incorporation of Information and Communication Technologies (ICT) into education, especially the Interactive Whiteboard (IWB), the need emerges for a proper teacher training process to ensure adequate integration and didactic use of this tool in the classroom. This article discusses the teachers' perception of the training process for ICT integration. Its main aim is to contribute to the unification of minimum criteria for effective ICT implementation in any training process for active teachers. This case study begins from the development of a training model called Eduticom, which was put into practice in 4 schools in Catalonia, Spain. Findings indicated several teacher needs, such as appropriate infrastructure, proper management and a flexible training model that essentially addresses methodological and didactic aspects of IWB use in the classroom.

  18. Directional dual-tree rational-dilation complex wavelet transform.

    PubMed

    Serbes, Gorkem; Gulcur, Halil Ozcan; Aydin, Nizamettin

    2014-01-01

    The dyadic discrete wavelet transform (DWT) has been used successfully in processing signals with non-oscillatory transient behaviour. However, due to the low Q-factor of its wavelet atoms, the dyadic DWT is less effective in processing oscillatory signals such as embolic signals (ESs). ESs are extracted from quadrature Doppler signals, which are the output of Doppler ultrasound systems. To process ESs, a pre-processing operation known as phase filtering must first be employed to obtain directional signals from the quadrature Doppler signals. Only then can wavelet-based methods be applied to these directional signals for further analysis. In this study, a directional dual-tree rational-dilation complex wavelet transform, which can be applied directly to quadrature signals and can extract directional information during analysis, is introduced.
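
    As a simple illustration of obtaining directional signals from quadrature Doppler data, the sketch below performs a frequency-domain split of the complex signal I + jQ. This stands in for the phase-filtering pre-processing mentioned above and does not implement the dual-tree rational-dilation complex wavelet transform itself; the signal parameters are synthetic.

```python
import numpy as np

def directional_signals(i_sig, q_sig):
    """Separate forward and reverse flow components of a quadrature Doppler signal.

    A simple frequency-domain split: positive frequencies of the complex signal
    I + jQ correspond to one flow direction, negative frequencies to the other.
    """
    z = np.asarray(i_sig) + 1j * np.asarray(q_sig)
    Z = np.fft.fft(z)
    freqs = np.fft.fftfreq(z.size)
    forward = np.fft.ifft(np.where(freqs > 0, Z, 0))   # positive-frequency content
    reverse = np.fft.ifft(np.where(freqs < 0, Z, 0))   # negative-frequency content
    return forward, reverse

# Usage sketch with a synthetic signal containing one component in each direction.
t = np.linspace(0, 1, 2000, endpoint=False)
i_sig = np.cos(2 * np.pi * 200 * t) + 0.5 * np.cos(2 * np.pi * 150 * t)
q_sig = np.sin(2 * np.pi * 200 * t) - 0.5 * np.sin(2 * np.pi * 150 * t)
fwd, rev = directional_signals(i_sig, q_sig)
```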

  19. An improved algorithm of image processing technique for film thickness measurement in a horizontal stratified gas-liquid two-phase flow

    NASA Astrophysics Data System (ADS)

    Kuntoro, Hadiyan Yusuf; Hudaya, Akhmad Zidni; Dinaryanto, Okto; Majid, Akmal Irfan; Deendarlianto

    2016-06-01

    Due to the importance of two-phase flow research for industrial safety analysis, many researchers have developed various methods and techniques to study two-phase flow phenomena in industrial cases, such as in the chemical, petroleum and nuclear industries. One of these developing techniques is image processing. This technique is widely used in two-phase flow research due to its non-intrusive capability to process large amounts of visualization data that contain many complexities. Moreover, this technique allows direct visual information about the flow to be captured, which is difficult to obtain with other methods and techniques. The main objective of this paper is to present an image processing algorithm for stratified flow cases that improves on the preceding algorithm. The present algorithm can measure the film thickness (h_L) of stratified flow as well as the geometrical properties of the interfacial waves, with lower processing time and random-access memory (RAM) usage than the preceding algorithm. The measurement results are also aimed at developing a high-quality database of stratified flow, which is currently scanty. In the present work, the measurement results showed satisfactory agreement with previous works.
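
    A minimal sketch of how a film thickness profile might be extracted from a binarized stratified-flow image is given below; the threshold, calibration factor and geometry are invented for illustration, and this is not the authors' improved algorithm.

```python
import numpy as np

def film_thickness_profile(frame, pipe_bottom_row, mm_per_pixel, threshold=0.5):
    """Estimate the liquid film thickness per image column (illustrative sketch).

    frame: 2D grayscale array normalized to [0, 1], brighter pixels = liquid phase.
    pipe_bottom_row: row index of the inner pipe wall at the bottom.
    Returns the film thickness h_L (mm) for every column.
    """
    liquid = frame > threshold                       # simple binarization
    h_px = np.zeros(frame.shape[1])
    for col in range(frame.shape[1]):
        rows = np.flatnonzero(liquid[:pipe_bottom_row + 1, col])
        if rows.size:                                # interface = topmost liquid pixel
            h_px[col] = pipe_bottom_row - rows[0]
    return h_px * mm_per_pixel

# Usage sketch with a synthetic frame (liquid layer occupying the lower rows).
frame = np.zeros((100, 200)); frame[70:95, :] = 1.0
h_L = film_thickness_profile(frame, pipe_bottom_row=94, mm_per_pixel=0.26)
```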

  20. An improved algorithm of image processing technique for film thickness measurement in a horizontal stratified gas-liquid two-phase flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuntoro, Hadiyan Yusuf, E-mail: hadiyan.y.kuntoro@mail.ugm.ac.id; Majid, Akmal Irfan; Deendarlianto, E-mail: deendarlianto@ugm.ac.id

    Due to the importance of two-phase flow research for industrial safety analysis, many researchers have developed various methods and techniques to study two-phase flow phenomena in industrial cases, such as in the chemical, petroleum and nuclear industries. One of these developing techniques is image processing. This technique is widely used in two-phase flow research due to its non-intrusive capability to process large amounts of visualization data that contain many complexities. Moreover, this technique allows direct visual information about the flow to be captured, which is difficult to obtain with other methods and techniques. The main objective of this paper is to present an image processing algorithm for stratified flow cases that improves on the preceding algorithm. The present algorithm can measure the film thickness (h_L) of stratified flow as well as the geometrical properties of the interfacial waves, with lower processing time and random-access memory (RAM) usage than the preceding algorithm. The measurement results are also aimed at developing a high-quality database of stratified flow, which is currently scanty. In the present work, the measurement results showed satisfactory agreement with previous works.

  1. Presentation format effects in a levels-of-processing task.

    PubMed

    Foos, Paul W; Goolkasian, Paula

    2008-01-01

    Three experiments were conducted to examine the better long-term memory performance observed when stimulus items are pictures or spoken words compared to printed words. Hypotheses regarding the allocation of attention to printed words, the semantic link between pictures and processing, and a rich long-term representation for pictures were tested. Using levels-of-processing tasks eliminated format effects when no memory test was expected and processing was deep (E1), and when study and test formats did not match (E3). Pictures produced superior performance when a memory test was expected (E1 & 2) and when study and test formats were the same (E3). Results of all experiments support the attenuation of attention model and that picture superiority is due to more direct access to semantic processing and a richer visual code. General principles to guide the processing of stimulus information are discussed.

  2. Impact of different stages of juice processing on the anthocyanin, flavonol, and procyanidin contents of cranberries.

    PubMed

    White, Brittany L; Howard, Luke R; Prior, Ronald L

    2011-05-11

    Juice is the most common form in which cranberries are consumed; however there is limited information on the changes of polyphenolic content of the berries during juice processing. This study investigated the effects of three different pretreatments (grinding plus blanching; only grinding; only blanching) for cranberry juice processing on the concentrations of anthocyanins, flavonols, and procyanidins throughout processing. Flavonols and procyanidins were retained in the juice to a greater extent than anthocyanins, and pressing resulted in the most significant losses in polyphenolics due to removal of the seeds and skins. Flavonol aglycones were formed during processing as a result of heat treatment. Drying of cranberry pomace resulted in increased extraction of flavonols and procyanidin oligomers but lower extraction of polymeric procyanidins. The results indicate that cranberry polyphenolics are relatively stable during processing compared to other berries; however, more work is needed to determine their fate during storage of juices.

  3. Connected Text Reading and Differences in Text Reading Fluency in Adult Readers

    PubMed Central

    Wallot, Sebastian; Hollis, Geoff; van Rooij, Marieke

    2013-01-01

    The process of connected text reading has received very little attention in contemporary cognitive psychology. This lack of attention is in part due to a research tradition that emphasizes the role of basic lexical constituents, which can be studied in isolated words or sentences. However, it is also in part due to the lack of statistical analysis techniques that accommodate interdependent time series. In this study, we investigate text reading performance with traditional and nonlinear analysis techniques and show how outcomes from multiple analyses can be used to create a more detailed picture of the process of text reading. Specifically, we investigate the reading performance of groups of literate adult readers who differ in reading fluency during a self-paced text reading task. Our results indicate that classical metrics of reading (such as word frequency) do not capture text reading very well, and that classical measures of reading fluency (such as average reading time) distinguish relatively poorly between participant groups. Nonlinear analyses of distribution tails and reading time fluctuations provide more fine-grained information about the reading process and reading fluency. PMID:23977177

  4. Deficits in context-dependent adaptive coding of reward in schizophrenia

    PubMed Central

    Kirschner, Matthias; Hager, Oliver M; Bischof, Martin; Hartmann-Riemer, Matthias N; Kluge, Agne; Seifritz, Erich; Tobler, Philippe N; Kaiser, Stefan

    2016-01-01

    Theoretical principles of information processing and empirical findings suggest that to efficiently represent all possible rewards in the natural environment, reward-sensitive neurons have to adapt their coding range dynamically to the current reward context. Adaptation ensures that the reward system is most sensitive for the most likely rewards, enabling the system to efficiently represent a potentially infinite range of reward information. A deficit in neural adaptation would prevent precise representation of rewards and could have detrimental effects for an organism’s ability to optimally engage with its environment. In schizophrenia, reward processing is known to be impaired and has been linked to different symptom dimensions. However, despite the fundamental significance of coding reward adaptively, no study has elucidated whether adaptive reward processing is impaired in schizophrenia. We therefore studied patients with schizophrenia (n=27) and healthy controls (n=25), using functional magnetic resonance imaging in combination with a variant of the monetary incentive delay task. Compared with healthy controls, patients with schizophrenia showed less efficient neural adaptation to the current reward context, which leads to imprecise neural representation of reward. Importantly, the deficit correlated with total symptom severity. Our results suggest that some of the deficits in reward processing in schizophrenia might be due to inefficient neural adaptation to the current reward context. Furthermore, because adaptive coding is a ubiquitous feature of the brain, we believe that our findings provide an avenue in defining a general impairment in neural information processing underlying this debilitating disorder. PMID:27430009

  5. Receptive amusia: evidence for cross-hemispheric neural networks underlying music processing strategies.

    PubMed

    Schuppert, M; Münte, T F; Wieringa, B M; Altenmüller, E

    2000-03-01

    Perceptual musical functions were investigated in patients suffering from unilateral cerebrovascular cortical lesions. Using MIDI (Musical Instrument Digital Interface) technique, a standardized short test battery was established that covers local (analytical) as well as global perceptual mechanisms. These represent the principal cognitive strategies in melodic and temporal musical information processing (local, interval and rhythm; global, contour and metre). Of the participating brain-damaged patients, a total of 69% presented with post-lesional impairments in music perception. Left-hemisphere-damaged patients showed significant deficits in the discrimination of local as well as global structures in both melodic and temporal information processing. Right-hemisphere-damaged patients also revealed an overall impairment of music perception, reaching significance in the temporal conditions. Detailed analysis outlined a hierarchical organization, with an initial right-hemisphere recognition of contour and metre followed by identification of interval and rhythm via left-hemisphere subsystems. Patterns of dissociated and associated melodic and temporal deficits indicate autonomous, yet partially integrated neural subsystems underlying the processing of melodic and temporal stimuli. In conclusion, these data contradict a strong hemispheric specificity for music perception, but indicate cross-hemisphere, fragmented neural substrates underlying local and global musical information processing in the melodic and temporal dimensions. Due to the diverse profiles of neuropsychological deficits revealed in earlier investigations as well as in this study, individual aspects of musicality and musical behaviour very likely contribute to the definite formation of these widely distributed neural networks.

  6. Visual-Cerebellar Pathways and Their Roles in the Control of Avian Flight.

    PubMed

    Wylie, Douglas R; Gutiérrez-Ibáñez, Cristián; Gaede, Andrea H; Altshuler, Douglas L; Iwaniuk, Andrew N

    2018-01-01

    In this paper, we review the connections and physiology of visual pathways to the cerebellum in birds and consider their role in flight. We emphasize that there are two visual pathways to the cerebellum. One is to the vestibulocerebellum (folia IXcd and X) that originates from two retinal-recipient nuclei that process optic flow: the nucleus of the basal optic root (nBOR) and the pretectal nucleus lentiformis mesencephali (LM). The second is to the oculomotor cerebellum (folia VI-VIII), which receives optic flow information, mainly from LM, but also local visual motion information from the optic tectum, and other visual information from the ventral lateral geniculate nucleus (Glv). The tectum, LM and Glv are all intimately connected with the pontine nuclei, which also project to the oculomotor cerebellum. We believe this rich integration of visual information in the cerebellum is important for analyzing motion parallax that occurs during flight. Finally, we extend upon a suggestion by Ibbotson (2017) that the hypertrophy that is observed in LM in hummingbirds might be due to an increase in the processing demands associated with the pathway to the oculomotor cerebellum as they fly through a cluttered environment while feeding.

  7. What the Human Brain Likes About Facial Motion

    PubMed Central

    Schultz, Johannes; Brockhaus, Matthias; Bülthoff, Heinrich H.; Pilz, Karin S.

    2013-01-01

    Facial motion carries essential information about other people's emotions and intentions. Most previous studies have suggested that facial motion is mainly processed in the superior temporal sulcus (STS), but several recent studies have also shown involvement of ventral temporal face-sensitive regions. Up to now, it is not known whether the increased response to facial motion is due to an increased amount of static information in the stimulus, to the deformation of the face over time, or to increased attentional demands. We presented nonrigidly moving faces and control stimuli to participants performing a demanding task unrelated to the face stimuli. We manipulated the amount of static information by using movies with different frame rates. The fluidity of the motion was manipulated by presenting movies with frames either in the order in which they were recorded or in scrambled order. Results confirm higher activation for moving compared with static faces in STS and under certain conditions in ventral temporal face-sensitive regions. Activation was maximal at a frame rate of 12.5 Hz and smaller for scrambled movies. These results indicate that both the amount of static information and the fluid facial motion per se are important factors for the processing of dynamic faces. PMID:22535907

  8. A systematic review of discontinued trials suggested that most reasons for recruitment failure were preventable.

    PubMed

    Briel, Matthias; Olu, Kelechi Kalu; von Elm, Erik; Kasenda, Benjamin; Alturki, Reem; Agarwal, Arnav; Bhatnagar, Neera; Schandelmaier, Stefan

    2016-12-01

    To collect and classify reported reasons for recruitment failure in discontinued randomized controlled trials (RCTs) and to assess reporting quality. We systematically searched MEDLINE and EMBASE (2010-2014) and a previous cohort of RCTs for published RCTs reporting trial discontinuation due to poor recruitment. Teams of two investigators selected eligible RCTs working independently and extracted information using standardized forms. We used an iterative approach to classify reasons for poor recruitment. We included 172 RCTs discontinued due to poor recruitment (including 26 conference abstracts and 63 industry-funded RCTs). Of those, 131 (76%) reported one or more reasons for discontinuation due to poor recruitment. We identified 28 different reasons for recruitment failure; most frequently mentioned were overestimation of prevalence of eligible participants and prejudiced views of recruiters and participants on trial interventions. Few RCTs reported relevant details about the recruitment process such as how eligible participants were identified, the number of patients assessed for eligibility, and who actually recruited participants. Our classification could serve as a checklist to assist investigators in the planning of RCTs. Most reasons for recruitment failure seem preventable with a pilot study that applies the planned informed consent procedure. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Social cognition in a case of amnesia with neurodevelopmental mechanisms.

    PubMed

    Staniloiu, Angelica; Borsutzky, Sabine; Woermann, Friedrich G; Markowitsch, Hans J

    2013-01-01

    Episodic-autobiographical memory (EAM) is considered to emerge gradually in concert with the development of other cognitive abilities (such as executive functions, personal semantic knowledge, emotional knowledge, theory of mind (ToM) functions, language, and working memory). On the brain level its emergence is accompanied by structural and functional reorganization of different components of the so-called EAM network. This network includes the hippocampal formation, which is viewed as being vital for the acquisition of memories of personal events for long-term storage. Developmental studies have emphasized socio-cultural-linguistic mechanisms that may be unique to the development of EAM. Furthermore it was hypothesized that one of the main functions of EAM is the social one. In the research field, the link between EAM and social cognition remains however debated. Herein we aim to bring new insights into the relation between EAM and social information processing (including social cognition) by describing a young adult patient with amnesia with neurodevelopmental mechanisms due to perinatal complications accompanied by hypoxia. The patient was investigated medically, psychiatrically, and with neuropsychological and neuroimaging methods. Structural high resolution magnetic resonance imaging revealed significant bilateral hippocampal atrophy as well as indices for degeneration in the amygdalae, basal ganglia, and thalamus, when a less conservative threshold was applied. In addition to extensive memory investigations and testing other (non-social) cognitive functions, we employed a broad range of tests that assessed social information processing (social perception, social cognition, social regulation). Our results point to both preserved (empathy, core ToM functions, visual affect selection, and discrimination, affective prosody discrimination) and impaired domains of social information processing (incongruent affective prosody processing, complex social judgments). They support proposals for a role of the hippocampal formation in processing more complex social information that likely requires multimodal relational handling.

  10. Social cognition in a case of amnesia with neurodevelopmental mechanisms

    PubMed Central

    Staniloiu, Angelica; Borsutzky, Sabine; Woermann, Friedrich G.; Markowitsch, Hans J.

    2013-01-01

    Episodic–autobiographical memory (EAM) is considered to emerge gradually in concert with the development of other cognitive abilities (such as executive functions, personal semantic knowledge, emotional knowledge, theory of mind (ToM) functions, language, and working memory). On the brain level its emergence is accompanied by structural and functional reorganization of different components of the so-called EAM network. This network includes the hippocampal formation, which is viewed as being vital for the acquisition of memories of personal events for long-term storage. Developmental studies have emphasized socio-cultural-linguistic mechanisms that may be unique to the development of EAM. Furthermore it was hypothesized that one of the main functions of EAM is the social one. In the research field, the link between EAM and social cognition remains however debated. Herein we aim to bring new insights into the relation between EAM and social information processing (including social cognition) by describing a young adult patient with amnesia with neurodevelopmental mechanisms due to perinatal complications accompanied by hypoxia. The patient was investigated medically, psychiatrically, and with neuropsychological and neuroimaging methods. Structural high resolution magnetic resonance imaging revealed significant bilateral hippocampal atrophy as well as indices for degeneration in the amygdalae, basal ganglia, and thalamus, when a less conservative threshold was applied. In addition to extensive memory investigations and testing other (non-social) cognitive functions, we employed a broad range of tests that assessed social information processing (social perception, social cognition, social regulation). Our results point to both preserved (empathy, core ToM functions, visual affect selection, and discrimination, affective prosody discrimination) and impaired domains of social information processing (incongruent affective prosody processing, complex social judgments). They support proposals for a role of the hippocampal formation in processing more complex social information that likely requires multimodal relational handling. PMID:23805111

  11. Abnormal externally guided movement preparation in recent-onset schizophrenia is associated with impaired selective attention to external input.

    PubMed

    Smid, Henderikus G O M; Westenbroek, Joanna M; Bruggeman, Richard; Knegtering, Henderikus; Van den Bosch, Robert J

    2009-11-30

    Several theories propose that the primary cognitive impairment in schizophrenia concerns a deficit in the processing of external input information. There is also evidence, however, for impaired motor preparation in schizophrenia. This provokes the question whether the impaired motor preparation in schizophrenia is a secondary consequence of disturbed (selective) processing of the input needed for that preparation, or an independent primary deficit. The aim of the present study was to discriminate between these hypotheses, by investigating externally guided movement preparation in relation to selective stimulus processing. The sample comprised 16 recent-onset schizophrenia patients and 16 controls who performed a movement-precuing task. In this task, a precue delivered information about one, two or no parameters of a movement summoned by a subsequent stimulus. Performance measures and measures derived from the electroencephalogram showed that patients yielded smaller benefits from the precues and showed less cue-based preparatory activity in advance of the imperative stimulus than the controls, suggesting a response preparation deficit. However, patients also showed less activity reflecting selective attention to the precue. We therefore conclude that the existing evidence for an impairment of externally guided motor preparation in schizophrenia is most likely due to a deficit in selective attention to the external input, which lends support to theories proposing that the primary cognitive deficit in schizophrenia concerns the processing of input information.

  12. Where You Look Matters for Body Perception: Preferred Gaze Location Contributes to the Body Inversion Effect

    PubMed Central

    McKean, Danielle L.; Tsao, Jack W.; Chan, Annie W.-Y.

    2017-01-01

    The Body Inversion Effect (BIE; reduced visual discrimination performance for inverted compared to upright bodies) suggests that bodies are visually processed configurally; however, the specific importance of head posture information in the BIE has been indicated in reports of BIE reduction for whole bodies with fixed head position and for headless bodies. Through measurement of gaze patterns and investigation of the causal relation of fixation location to visual body discrimination performance, the present study reveals joint contributions of feature and configuration processing to visual body discrimination. Participants predominantly gazed at the (body-centric) upper body for upright bodies and the lower body for inverted bodies in the context of an experimental paradigm directly comparable to that of prior studies of the BIE. Subsequent manipulation of fixation location indicates that these preferential gaze locations causally contributed to the BIE for whole bodies largely due to the informative nature of gazing at or near the head. Also, a BIE was detected for both whole and headless bodies even when fixation location on the body was held constant, indicating a role of configural processing in body discrimination, though inclusion of the head posture information was still highly discriminative in the context of such processing. Interestingly, the impact of configuration (upright and inverted) to the BIE appears greater than that of differential preferred gaze locations. PMID:28085894

  13. Cost model for biobanks.

    PubMed

    Gonzalez-Sanchez, M Beatriz; Lopez-Valeiras, Ernesto; Morente, Manuel M; Fernández Lago, Orlando

    2013-10-01

    Current economic conditions and budget constraints in publicly funded biomedical research have brought about a renewed interest in analyzing the cost and economic viability of research infrastructures. However, there are no proposals for specific cost accounting models for these types of organizations in the international scientific literature. The aim of this paper is to present the basis of a cost analysis model useful for any biobank regardless of the human biological samples that it stores for biomedical research. The development of a unique cost model for biobanks can be a complicated task due to the diversity of the biological samples they store. Different types of samples (DNA, tumor tissues, blood, serum, etc.) require different production processes. Nonetheless, the common basic steps of the production process can be identified. Thus, the costs incurred in each step can be analyzed in detail to provide cost information. Six stages and four cost objects were obtained by taking the production processes of biobanks belonging to the Spanish National Biobank Network as a starting point. Templates and examples are provided to help managers to identify and classify the costs involved in their own biobanks to implement the model. The application of this methodology will provide accurate information on cost objects, along with useful information to give an economic value to the stored samples, to analyze the efficiency of the production process and to evaluate the viability of some sample collections.

  14. A Framework of Hyperspectral Image Compression using Neural Networks

    DOE PAGES

    Masalmah, Yahya M.; Martínez Nieves, Christian; Rivera Soto, Rafael; ...

    2015-01-01

    Hyperspectral image analysis has gained great attention due to its wide range of applications. Hyperspectral images provide a vast amount of information about the underlying objects in an image by using a large range of the electromagnetic spectrum for each pixel. However, since the same scene is imaged multiple times using distinct electromagnetic bands, the size of such images tends to be significant, which leads to greater processing requirements. The aim of this paper is to present a proposed framework for image compression and to study the possible effects of spatial compression on the quality of unmixing results. Image compression allows us to reduce the dimensionality of an image while still preserving most of the original information, which could lead to faster image processing. Lastly, this paper presents preliminary results of different training techniques used in an Artificial Neural Network (ANN) based compression algorithm.
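
    As a purely illustrative companion to the abstract (the paper's own framework and training techniques are not reproduced here), the following minimal NumPy sketch compresses per-pixel spectra with a single linear autoencoder layer; the array sizes, synthetic data, and learning rate are assumptions, not values from the paper.

    ```python
    import numpy as np

    # Minimal sketch (not the paper's implementation): a linear autoencoder that
    # compresses each pixel's spectrum from n_bands to n_code dimensions and
    # reconstructs it, trained by plain gradient descent on synthetic data.
    rng = np.random.default_rng(0)
    n_pixels, n_bands, n_code = 1000, 64, 8

    X = rng.normal(size=(n_pixels, n_bands))        # stand-in hyperspectral pixels
    W_enc = rng.normal(scale=0.1, size=(n_bands, n_code))
    W_dec = rng.normal(scale=0.1, size=(n_code, n_bands))
    lr = 1e-3

    for epoch in range(200):
        code = X @ W_enc                            # compressed representation
        X_hat = code @ W_dec                        # reconstruction
        err = X_hat - X
        loss = np.mean(err ** 2)
        grad_dec = code.T @ err / n_pixels          # gradients of the squared error
        grad_enc = X.T @ (err @ W_dec.T) / n_pixels
        W_dec -= lr * grad_dec
        W_enc -= lr * grad_enc

    print(f"final reconstruction MSE: {loss:.4f}")
    ```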

  15. Influence of measurement error on Maxwell's demon

    NASA Astrophysics Data System (ADS)

    Sørdal, Vegard; Bergli, Joakim; Galperin, Y. M.

    2017-06-01

    In any general cycle of measurement, feedback, and erasure, the measurement will reduce the entropy of the system when information about the state is obtained, while erasure, according to Landauer's principle, is accompanied by a corresponding increase in entropy due to the compression of logical and physical phase space. The total process can in principle be fully reversible. A measurement error reduces the information obtained and the entropy decrease in the system. The erasure still gives the same increase in entropy, and the total process is irreversible. Another consequence of measurement error is that a bad feedback is applied, which further increases the entropy production if the proper protocol adapted to the expected error rate is not applied. We consider the effect of measurement error on a realistic single-electron box Szilard engine, and we find the optimal protocol for the cycle as a function of the desired power P and error ε.
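
    For orientation only (this is the textbook bookkeeping for a symmetric binary, Szilard-type measurement, not the paper's single-electron-box analysis), a measurement with error rate \varepsilon yields the information

    \[
      I(\varepsilon) \;=\; \ln 2 + \varepsilon\ln\varepsilon + (1-\varepsilon)\ln(1-\varepsilon),
      \qquad
      \langle W_{\mathrm{ext}}\rangle \;\le\; k_B T\, I(\varepsilon),
      \qquad
      \Delta S_{\mathrm{erase}} \;\ge\; k_B \ln 2,
    \]

    so the dissipation per cycle is at least k_B T [\ln 2 - I(\varepsilon)], vanishing only for error-free measurement (\varepsilon = 0).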

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kyle, Jennifer E.; Zhang, Xing; Weitz, Karl K.

    Understanding how biological molecules are generated, metabolized and eliminated in living systems is important for interpreting processes such as immune response and disease pathology. While genomic and proteomic studies have provided vast amounts of information over the last several decades, interest in lipidomics has also grown due to improved analytical technologies revealing altered lipid metabolism in type 2 diabetes, cancer, and lipid storage disease. Liquid chromatography and mass spectrometry (LC-MS) measurements are currently the dominant approach for characterizing the lipidome by providing detailed information on the spatial and temporal composition of lipids. However, interpreting lipids’ biological roles is challenging due to the existence of numerous structural and stereoisomers (i.e. distinct acyl chain and double-bond positions), which are unresolvable using present LC-MS approaches. Here we show that combining structurally-based ion mobility spectrometry (IMS) with LC-MS measurements distinguishes lipid isomers and allows insight into biological and disease processes.

  17. Differential impairments of selective attention due to frequency and duration of cannabis use.

    PubMed

    Solowij, N; Michie, P T; Fox, A M

    1995-05-15

    The evidence for long-term cognitive impairments associated with chronic use of cannabis has been inconclusive. We report the results of a brain event-related potential (ERP) study of selective attention in long-term cannabis users in the unintoxicated state. Two ERP measures known to reflect distinct components of attention were found to be affected differentially by duration and frequency of cannabis use. The ability to focus attention and filter out irrelevant information, measured by frontal processing negativity to irrelevant stimuli, was impaired progressively with the number of years of use but was unrelated to frequency of use. The speed of information processing, measured by the latency of parietal P300, was delayed significantly with increasing frequency of use but was unaffected by duration of use. The results suggest that a chronic buildup of cannabinoids produces both short- and long-term cognitive impairments.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salmilehto, J.; Deppe, F.; Di Ventra, M.

    Memristors are resistive elements retaining information of their past dynamics. They have garnered substantial interest due to their potential for representing a paradigm change in electronics, information processing and unconventional computing. Given the advent of quantum technologies, a design for a quantum memristor with superconducting circuits may be envisaged. Along these lines, we introduce such a quantum device whose memristive behavior arises from quasiparticle-induced tunneling when supercurrents are cancelled. For realistic parameters, we find that the relevant hysteretic behavior may be observed using current state-of-the-art measurements of the phase-driven tunneling current. Finally, we develop suitable methods to quantify memory retention in the system.

  19. Quantum Memristors with Superconducting Circuits

    PubMed Central

    Salmilehto, J.; Deppe, F.; Di Ventra, M.; Sanz, M.; Solano, E.

    2017-01-01

    Memristors are resistive elements retaining information of their past dynamics. They have garnered substantial interest due to their potential for representing a paradigm change in electronics, information processing and unconventional computing. Given the advent of quantum technologies, a design for a quantum memristor with superconducting circuits may be envisaged. Along these lines, we introduce such a quantum device whose memristive behavior arises from quasiparticle-induced tunneling when supercurrents are cancelled. For realistic parameters, we find that the relevant hysteretic behavior may be observed using current state-of-the-art measurements of the phase-driven tunneling current. Finally, we develop suitable methods to quantify memory retention in the system. PMID:28195193

  20. Parallel workflow manager for non-parallel bioinformatic applications to solve large-scale biological problems on a supercomputer.

    PubMed

    Suplatov, Dmitry; Popova, Nina; Zhumatiy, Sergey; Voevodin, Vladimir; Švedas, Vytas

    2016-04-01

    Rapid expansion of online resources providing access to genomic, structural, and functional information associated with biological macromolecules opens an opportunity to gain a deeper understanding of the mechanisms of biological processes due to systematic analysis of large datasets. This, however, requires novel strategies to optimally utilize computer processing power. Some methods in bioinformatics and molecular modeling require extensive computational resources. Other algorithms have fast implementations which take at most several hours to analyze a common input on a modern desktop station, however, due to multiple invocations for a large number of subtasks the full task requires a significant computing power. Therefore, an efficient computational solution to large-scale biological problems requires both a wise parallel implementation of resource-hungry methods as well as a smart workflow to manage multiple invocations of relatively fast algorithms. In this work, a new computer software mpiWrapper has been developed to accommodate non-parallel implementations of scientific algorithms within the parallel supercomputing environment. The Message Passing Interface has been implemented to exchange information between nodes. Two specialized threads - one for task management and communication, and another for subtask execution - are invoked on each processing unit to avoid deadlock while using blocking calls to MPI. The mpiWrapper can be used to launch all conventional Linux applications without the need to modify their original source codes and supports resubmission of subtasks on node failure. We show that this approach can be used to process huge amounts of biological data efficiently by running non-parallel programs in parallel mode on a supercomputer. The C++ source code and documentation are available from http://biokinet.belozersky.msu.ru/mpiWrapper .
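
    The mpiWrapper tool itself is written in C++; the sketch below only illustrates the same master-worker idea in Python with mpi4py (one rank hands out shell commands, the others run unmodified binaries with subprocess). The task list, tags and command strings are invented for the example and are not part of the original software.

    ```python
    # Run with e.g.: mpiexec -n 8 python run_tasks.py
    import subprocess
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
    TAG_TASK, TAG_DONE, TAG_STOP = 1, 2, 3

    if rank == 0:
        # Master: hand out one non-parallel command to each idle worker.
        tasks = [f"some_tool --in q{i}.dat --out r{i}.dat" for i in range(100)]
        status = MPI.Status()
        active = size - 1
        while active > 0:
            comm.recv(source=MPI.ANY_SOURCE, tag=MPI.ANY_TAG, status=status)
            worker = status.Get_source()
            if tasks:
                comm.send(tasks.pop(), dest=worker, tag=TAG_TASK)
            else:
                comm.send(None, dest=worker, tag=TAG_STOP)
                active -= 1
    else:
        # Worker: signal readiness, receive a command, run it, repeat.
        while True:
            comm.send(None, dest=0, tag=TAG_DONE)
            status = MPI.Status()
            task = comm.recv(source=0, tag=MPI.ANY_TAG, status=status)
            if status.Get_tag() == TAG_STOP:
                break
            subprocess.run(task, shell=True, check=False)  # launch unmodified binary
    ```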

  1. Color Improves Speed of Processing But Not Perception in a Motion Illusion

    PubMed Central

    Perry, Carolyn J.; Fallah, Mazyar

    2012-01-01

    When two superimposed surfaces of dots move in different directions, the perceived directions are shifted away from each other. This perceptual illusion has been termed direction repulsion and is thought to be due to mutual inhibition between the representations of the two directions. It has further been shown that a speed difference between the two surfaces attenuates direction repulsion. As speed and direction are both necessary components of representing motion, the reduction in direction repulsion can be attributed to the additional motion information strengthening the representations of the two directions and thus reducing the mutual inhibition. We tested whether bottom-up attention and top-down task demands, in the form of color differences between the two surfaces, would also enhance motion processing, reducing direction repulsion. We found that the addition of color differences did not improve direction discrimination and reduce direction repulsion. However, we did find that adding a color difference improved performance on the task. We hypothesized that the performance differences were due to the limited presentation time of the stimuli. We tested this in a follow-up experiment where we varied the time of presentation to determine the duration needed to successfully perform the task with and without the color difference. As we expected, color segmentation reduced the amount of time needed to process and encode both directions of motion. Thus we find a dissociation between the effects of attention on the speed of processing and conscious perception of direction. We propose four potential mechanisms wherein color speeds figure-ground segmentation of an object, attentional switching between objects, direction discrimination and/or the accumulation of motion information for decision-making, without affecting conscious perception of the direction. Potential neural bases are also explored. PMID:22479255

  2. Color improves speed of processing but not perception in a motion illusion.

    PubMed

    Perry, Carolyn J; Fallah, Mazyar

    2012-01-01

    When two superimposed surfaces of dots move in different directions, the perceived directions are shifted away from each other. This perceptual illusion has been termed direction repulsion and is thought to be due to mutual inhibition between the representations of the two directions. It has further been shown that a speed difference between the two surfaces attenuates direction repulsion. As speed and direction are both necessary components of representing motion, the reduction in direction repulsion can be attributed to the additional motion information strengthening the representations of the two directions and thus reducing the mutual inhibition. We tested whether bottom-up attention and top-down task demands, in the form of color differences between the two surfaces, would also enhance motion processing, reducing direction repulsion. We found that the addition of color differences did not improve direction discrimination and reduce direction repulsion. However, we did find that adding a color difference improved performance on the task. We hypothesized that the performance differences were due to the limited presentation time of the stimuli. We tested this in a follow-up experiment where we varied the time of presentation to determine the duration needed to successfully perform the task with and without the color difference. As we expected, color segmentation reduced the amount of time needed to process and encode both directions of motion. Thus we find a dissociation between the effects of attention on the speed of processing and conscious perception of direction. We propose four potential mechanisms wherein color speeds figure-ground segmentation of an object, attentional switching between objects, direction discrimination and/or the accumulation of motion information for decision-making, without affecting conscious perception of the direction. Potential neural bases are also explored.

  3. Optimal Signal Processing in Small Stochastic Biochemical Networks

    PubMed Central

    Ziv, Etay; Nemenman, Ilya; Wiggins, Chris H.

    2007-01-01

    We quantify the influence of the topology of a transcriptional regulatory network on its ability to process environmental signals. By posing the problem in terms of information theory, we do this without specifying the function performed by the network. Specifically, we study the maximum mutual information between the input (chemical) signal and the output (genetic) response attainable by the network in the context of an analytic model of particle number fluctuations. We perform this analysis for all biochemical circuits, including various feedback loops, that can be built out of 3 chemical species, each under the control of one regulator. We find that a generic network, constrained to low molecule numbers and reasonable response times, can transduce more information than a simple binary switch and, in fact, manages to achieve close to the optimal information transmission fidelity. These high-information solutions are robust to tenfold changes in most of the networks' biochemical parameters; moreover they are easier to achieve in networks containing cycles with an odd number of negative regulators (overall negative feedback) due to their decreased molecular noise (a result which we derive analytically). Finally, we demonstrate that a single circuit can support multiple high-information solutions. These findings suggest a potential resolution of the “cross-talk” phenomenon as well as the previously unexplained observation that transcription factors that undergo proteolysis are more likely to be auto-repressive. PMID:17957259
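
    The quantity being maximized is the mutual information between the chemical input c and the genetic output g; in its standard form (not specific to the paper's fluctuation model),

    \[
      I(c;g) \;=\; \sum_{c,g} P(c,g)\,\log_2\!\frac{P(c,g)}{P(c)\,P(g)} \;=\; H(g) - H(g\mid c),
    \]

    so a simple binary switch can transmit at most one bit, which is the baseline that the networks studied here exceed.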

  4. Neural correlates of decision making with explicit information about probabilities and incentives in elderly healthy subjects.

    PubMed

    Labudda, Kirsten; Woermann, Friedrich G; Mertens, Markus; Pohlmann-Eden, Bernd; Markowitsch, Hans J; Brand, Matthias

    2008-06-01

    Recent functional neuroimaging and lesion studies demonstrate the involvement of the orbitofrontal/ventromedial prefrontal cortex as a key structure in decision making processes. This region seems to be particularly crucial when contingencies between options and consequences are unknown but have to be learned by the use of feedback following previous decisions (decision making under ambiguity). However, little is known about the neural correlates of decision making under risk conditions in which information about probabilities and potential outcomes is given. In the present study, we used functional magnetic resonance imaging to measure blood-oxygenation-level-dependent (BOLD) responses in 12 subjects during a decision making task. This task provided explicit information about probabilities and associated potential incentives. The responses were compared to BOLD signals in a control condition without information about incentives. In contrast to previous decision making studies, we completely removed the outcome phase following a decision to exclude the potential influence of feedback previously received on current decisions. The results indicate that the integration of information about probabilities and incentives leads to activations within the dorsolateral prefrontal cortex, the posterior parietal lobe, the anterior cingulate and the right lingual gyrus. We assume that this pattern of activation is due to the involvement of executive functions, conflict detection mechanisms and arithmetic operations during the deliberation phase of decisional processes that are based on explicit information.

  5. Probabilistic Estimates of Global Mean Sea Level and its Underlying Processes

    NASA Astrophysics Data System (ADS)

    Hay, C.; Morrow, E.; Kopp, R. E.; Mitrovica, J. X.

    2015-12-01

    Local sea level can vary significantly from the global mean value due to a suite of processes that includes ongoing sea-level changes due to the last ice age, land water storage, ocean circulation changes, and non-uniform sea-level changes that arise when modern-day land ice rapidly melts. Understanding these sources of spatial and temporal variability is critical to estimating past and present sea-level change and projecting future sea-level rise. Using two probabilistic techniques, a multi-model Kalman smoother and Gaussian process regression, we have reanalyzed 20th century tide gauge observations to produce a new estimate of global mean sea level (GMSL). Our methods allow us to extract global information from the sparse tide gauge field by taking advantage of the physics-based and model-derived geometry of the contributing processes. Both methods provide constraints on the sea-level contribution of glacial isostatic adjustment (GIA). The Kalman smoother tests multiple discrete GIA models, probabilistically computing the most likely GIA model given the observations, while the Gaussian process regression characterizes the prior covariance structure of a suite of GIA models and then uses this structure to estimate the posterior distribution of local rates of GIA-induced sea-level change. We present the two methodologies, the model-derived geometries of the underlying processes, and our new probabilistic estimates of GMSL and GIA.
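
    As a rough illustration of the Gaussian process regression idea (not the authors' code or data), the sketch below builds a prior covariance from a few model-derived spatial patterns of the contributing processes plus a smooth residual term, and computes the posterior mean of the underlying field at the observation sites; all sizes, patterns and noise levels are synthetic assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_sites, n_patterns = 50, 4

    G = rng.normal(size=(n_sites, n_patterns))      # stand-in process "geometries"
    coords = rng.uniform(0, 10, size=(n_sites, 2))  # stand-in site coordinates
    y = G @ np.array([1.5, -0.7, 0.3, 0.9]) + 0.2 * rng.normal(size=n_sites)

    def sqexp(a, b, ell=2.0, var=0.5):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return var * np.exp(-0.5 * d2 / ell ** 2)

    K = G @ G.T + sqexp(coords, coords)             # prior covariance of the field
    noise = 0.2 ** 2 * np.eye(n_sites)

    alpha = np.linalg.solve(K + noise, y)
    posterior_mean = K @ alpha                      # posterior mean at the sites
    posterior_var = np.diag(K - K @ np.linalg.solve(K + noise, K))

    print(posterior_mean[:5], np.sqrt(posterior_var[:5]))
    ```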

  6. Machine learning for fab automated diagnostics

    NASA Astrophysics Data System (ADS)

    Giollo, Manuel; Lam, Auguste; Gkorou, Dimitra; Liu, Xing Lan; van Haren, Richard

    2017-06-01

    Process optimization depends largely on field engineers' knowledge and expertise. However, this practice turns out to be less sustainable due to fab complexity, which is continuously increasing in order to support the extreme miniaturization of Integrated Circuits. On the one hand, process optimization and root cause analysis of tools are necessary for smooth fab operation. On the other hand, the growth in the number of wafer processing steps is adding a considerable new source of noise, which may have a significant impact at the nanometer scale. This paper explores the ability of historical process data and Machine Learning to support field engineers in production analysis and monitoring. We implement an automated workflow in order to analyze a large volume of information and build a predictive model of overlay variation. The proposed workflow addresses significant problems that are typical in fab production, such as missing measurements, small numbers of samples, confounding effects due to heterogeneity of data, and subpopulation effects. We evaluate the proposed workflow on a real use case and show that it is able to predict overlay excursions observed in Integrated Circuit manufacturing. The chosen design focuses on linear and interpretable models of the wafer history, which highlight the process steps that are causing defective products. This is a fundamental feature for diagnostics, as it supports process engineers in the continuous improvement of the production line.
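
    The "linear and interpretable model of the wafer history" idea can be illustrated with a sparse linear regression on binary context features; the sketch below uses scikit-learn's Lasso on synthetic data, and every feature name, size and coefficient is an invented stand-in rather than fab data.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(2)
    n_wafers, n_context = 500, 40

    # Binary wafer-history features (e.g. which chamber handled each step).
    X = rng.integers(0, 2, size=(n_wafers, n_context)).astype(float)
    true_coef = np.zeros(n_context)
    true_coef[[3, 17]] = [1.2, -0.8]                # two "root cause" steps
    y = X @ true_coef + 0.1 * rng.normal(size=n_wafers)

    model = Lasso(alpha=0.05).fit(X, y)

    # Sparse coefficients point directly at the suspect process steps.
    suspects = [(f"step_{i}", round(c, 2)) for i, c in enumerate(model.coef_)
                if abs(c) > 1e-3]
    print(suspects)
    ```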

  7. Global scale stratospheric processes as measured by the infrasound IMS network

    NASA Astrophysics Data System (ADS)

    Le Pichon, A.; Ceranna, L.; Kechut, P.

    2012-04-01

    IMS infrasound array data are routinely processed at the International Data Center (IDC). The wave parameters of the detected signals are estimated with the Progressive Multi-Channel Correlation method (PMCC). This new implementation of the PMCC algorithm allows the full frequency range of interest (0.01-5 Hz) to be processed efficiently in a single computational run. We have processed continuous recordings from 41 certified IMS stations from 2005 to 2010. We show that microbaroms are the dominant source of signals and are near-continuously globally detected. The observed azimuthal seasonal trend correlates well with the variation of the effective sound speed ratio which is a proxy for the combined effects of refraction due to sound speed gradients and advection due to along-path wind on infrasound propagation. A general trend in signal backazimuth is observed between winter and summer, driven by the seasonal reversal of the stratospheric winds. Combined with propagation modeling, we show that such an analysis enables a characterization of the wind and temperature structure above the stratosphere and may provide detailed information on upper atmospheric processes (e.g., large-scale planetary waves, stratospheric warming effects). We correlate perturbations and deviations from the seasonal trend to short time-scale variability of the atmosphere. We discuss the potential benefit of long-term infrasound monitoring to infer stratospheric processes for the first time on a global scale.

  8. An Integrated Information System for Supporting Quality Management Tasks

    NASA Astrophysics Data System (ADS)

    Beyer, N.; Helmreich, W.

    2004-08-01

    In a competitive environment, well defined processes become the strategic advantage of a company. Hence, targeted Quality Management ensures efficiency, transparency and, ultimately, customer satisfaction. In the particular context of a Space Test Centre, a number of specific Quality Management standards have to be applied. According to the revision of ISO 9001 during 2000, and due to the adaptation of ECSS-Q20-07, process orientation and data analysis are key tasks for ensuring and evaluating the efficiency of a company's processes. In line with these requirements, an integrated management system for accessing the necessary information to support Quality Management and other processes has been established. Some of its test-related features are presented here. Easy access to the integrated management system from any work place at IABG's Space Test Centre is ensured by means of an intranet portal. It comprises a full set of quality-related process descriptions, information on test facilities, emergency procedures, and other relevant information. The portal's web interface provides direct access to a couple of external applications. Moreover, easy updating of all information and low cost maintenance are features of this integrated information system. The timely and transparent management of non-conformances is covered by a dedicated NCR database which incorporates full documentation capability, electronic signature and e-mail notification of concerned staff. A search interface allows for queries across all documented non-conformances. Furthermore, print versions can be generated at any stage in the process, e.g. for distribution to customers. Feedback on customer satisfaction is sought through a web-based questionnaire. The process is initiated by the responsible test manager through submission of an e-mail that contains a hyperlink to a secure website, asking the customer to complete the brief online form, which is directly fed to a database for subsequent evaluation by the Quality Manager. All such information can be processed and presented in an appropriate manner for internal or external audits, as well as for regular management reviews.

  9. Establishing and sustaining a biorepository network in Israel: challenges and progress.

    PubMed

    Cohen, Yehudit; Almog, Ronit; Onn, Amir; Itzhaki-Alfia, Ayelet; Meir, Karen

    2013-12-01

    Over the past 5 years, using European and North American biobanks as models, the grass-roots establishment of independently operating biobanks has occurred virtually simultaneously in large Israeli teaching hospitals. The process of establishing a national biorepository network in Israel has progressed slowly, sustained mainly by a few proponents working together on a personal level. Slow progress has been due to limited funding and the lack of a legal framework specific to biobanking activities. Recently, due to increasing pressure from the scientific community, the government has earmarked funds for a national biorepository network, and the structure is now being established. In forming a network, Israel's biobanks face certain difficulties, particularly lack of support. Additional challenges include harmonization of standard operating procedures, database centralization, and use of a common informed consent form. In this article, we highlight some of the issues faced by Israel's biobank managers in establishing and sustaining a functional biobank network, information that could provide guidance for other small countries with limited resources.

  10. Thermomechanical milling of accessory lithics in volcanic conduits

    NASA Astrophysics Data System (ADS)

    Campbell, Michelle E.; Russell, James K.; Porritt, Lucy A.

    2013-09-01

    Accessory lithic clasts recovered from pyroclastic deposits commonly result from the failure of conduit wall rocks, and represent an underutilized resource for constraining conduit processes during explosive volcanic eruptions. The morphological features of lithic clasts provide distinctive 'textural fingerprints' of processes that have reshaped them during transport in the conduit. Here, we present the first study focused on accessory lithic clast morphology and show how the shapes and surfaces of these accessory pyroclasts can inform on conduit processes. We use two main types of accessory lithic clasts from pyroclastic fallout deposits of the 2360 B.P. subplinian eruption of Mount Meager, British Columbia, as a case study: (i) rough and subangular dacite clasts, and (ii) variably rounded and smoothed monzogranite clasts. The quantitative morphological data collected on these lithics include: mass, volume, density, 2-D image analysis of convexity (C), and 3-D laser scans for sphericity (Ψ) and smoothness (S). Shaping and comminution (i.e. milling) of clasts within the conduit are ascribed to three processes: (1) disruptive fragmentation due to high-energy impacts between clasts or between clasts and conduit walls, (2) ash-blasting of clasts suspended within the volcanic flux, and (3) thermal effects. We use a simplified conduit eruption model to predict ash-blasting velocities and lithic residence times as a function of clast size and source depth, thereby constraining the lithic milling processes. The extent of shape and surface modification (i.e. rounding and honing) is directly proportional to clast residence times within the conduit prior to evacuation. We postulate that the shallow-seated dacite clasts remain subangular and rough due to short (<2 min) residence times, whereas monzogranite clasts are much more rounded and smoothed due to deeper source depths and consequently longer residence times (up to ˜1 h). Larger monzogranite clasts are smoother than smaller clasts due to longer residence times and to greater differential velocities within the ash-laden jet. Lastly, our model residence times and mass loss estimates for rounded clasts are used to estimate minimum attrition rates due to volcanic ash-blasting within the conduit (e.g., 12 cm3 s-1 for 25 cm clasts, sourced at 2500 m depth).

  11. Electroencephalographic compression based on modulated filter banks and wavelet transform.

    PubMed

    Bazán-Prieto, Carlos; Cárdenas-Barrera, Julián; Blanco-Velasco, Manuel; Cruz-Roldán, Fernando

    2011-01-01

    Due to the large volume of information generated in an electroencephalographic (EEG) study, compression is needed for storage, processing or transmission for analysis. In this paper we evaluate and compare two lossy compression techniques applied to EEG signals, comparing the performance of compression schemes based on filter bank decomposition or wavelet packet transforms and seeking the best compression ratio, the best quality and the most efficient real-time implementation. Due to specific properties of EEG signals, we propose a quantization stage adapted to the dynamic range of each band, aiming for higher quality. The results show that the filter-bank compressor performs better than the transform methods, and that quantization adapted to the dynamic range significantly enhances quality.
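
    The band-adaptive quantization idea can be sketched as follows with PyWavelets; this is an illustration of the principle, not the authors' codec, and the wavelet, decomposition level and bit budget are assumptions.

    ```python
    import numpy as np
    import pywt

    rng = np.random.default_rng(3)
    eeg = rng.normal(size=2048)                     # stand-in EEG channel

    coeffs = pywt.wavedec(eeg, 'db4', level=5)      # multilevel decomposition
    bits = 6                                        # assumed bits per coefficient

    quantized, bands = [], []
    for band in coeffs:
        lo, hi = band.min(), band.max()
        step = (hi - lo) / (2 ** bits - 1) or 1.0   # step adapted to the band's range
        quantized.append(np.round((band - lo) / step).astype(np.int32))
        bands.append((lo, step))

    # Decoder side: dequantize each band and reconstruct the signal.
    rec_coeffs = [lo + q * step for q, (lo, step) in zip(quantized, bands)]
    eeg_rec = pywt.waverec(rec_coeffs, 'db4')

    prd = 100 * np.linalg.norm(eeg - eeg_rec[:len(eeg)]) / np.linalg.norm(eeg)
    print(f"percent RMS difference: {prd:.2f}%")
    ```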

  12. A class of covariate-dependent spatiotemporal covariance functions

    PubMed Central

    Reich, Brian J; Eidsvik, Jo; Guindani, Michele; Nail, Amy J; Schmidt, Alexandra M.

    2014-01-01

    In geostatistics, it is common to model spatially distributed phenomena through an underlying stationary and isotropic spatial process. However, these assumptions are often untenable in practice because of the influence of local effects in the correlation structure. Therefore, it has been of prolonged interest in the literature to provide flexible and effective ways to model non-stationarity in the spatial effects. Arguably, due to the local nature of the problem, we might envision that the correlation structure would be highly dependent on local characteristics of the domain of study, namely the latitude, longitude and altitude of the observation sites, as well as other locally defined covariate information. In this work, we provide a flexible and computationally feasible way for allowing the correlation structure of the underlying processes to depend on local covariate information. We discuss the properties of the induced covariance functions and discuss methods to assess its dependence on local covariate information by means of a simulation study and the analysis of data observed at ozone-monitoring stations in the Southeast United States. PMID:24772199
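
    One simple member of this class (illustrative, and deliberately restricted to a form that is guaranteed to be a valid covariance; the paper develops richer constructions) lets the local variance depend on site covariates x(s) while keeping a stationary correlation:

    \[
      C(s,s') \;=\; \sigma\!\big(x(s)\big)\,\sigma\!\big(x(s')\big)\,
      \exp\!\big(-\lVert s - s' \rVert / \phi\big),
      \qquad
      \log\sigma(x) \;=\; x^{\top}\beta,
    \]

    which is positive definite because it is a stationary correlation rescaled by a site-specific standard deviation.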

  13. Kanban system implementation in cardboard supply process (Case study: PT. Akebono Brake Astra Indonesia - Jakarta)

    NASA Astrophysics Data System (ADS)

    Laksono, Pringgo Widyo; Kusumawardani, Christina Ayu

    2017-11-01

    Continuous improvement is needed by every manufacturing company to optimize its production. One way to reach that goal is to eliminate the waste that occurs in the company. At PT. Akebono Brake Astra Indonesia - Jakarta (AAIJ), there are seven "muda" (wastes) that the company always strives to remove, such as the muda of transportation in the cardboard supply system caused by the non-value-adding movement of the packing-area PIC to fetch cardboard from the warehouse. This research uses Kaizen theory to eliminate this transportation muda by changing the cardboard supply system, which was previously handled manually by the packing-area PIC, so that it is taken over by a towing operator, and applies a Kanban system to improve the information flow of the cardboard supply by setting up a Kanban system that produces a Material and Information Flow Chart (MIFC), a Standardized Work Chart (SWC), a calculation of the Kanban population, and a Work Instruction (WI). This research led to an improved cardboard supply process, a clearer and more cyclic information flow in the cardboard supply system, and a reduction of cost due to manpower savings.
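
    The Kanban population referred to in the abstract is typically sized with the standard formula (the paper's exact parameter values are not given here):

    \[
      N \;=\; \left\lceil \frac{D\,L\,(1+\alpha)}{C} \right\rceil,
    \]

    where D is the cardboard demand rate, L the replenishment lead time of the towing route, \alpha a safety factor, and C the container (pallet) capacity. For example, with assumed values D = 120 cartons/hour, L = 0.5 hour, \alpha = 0.1 and C = 20 cartons per container, N = ceil(66/20) = 4 kanban cards circulate in the loop.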

  14. Working Memory Capacity and Fluid Intelligence: Maintenance and Disengagement.

    PubMed

    Shipstead, Zach; Harrison, Tyler L; Engle, Randall W

    2016-11-01

    Working memory capacity and fluid intelligence have been demonstrated to be strongly correlated traits. Typically, high working memory capacity is believed to facilitate reasoning through accurate maintenance of relevant information. In this article, we present a proposal reframing this issue, such that tests of working memory capacity and fluid intelligence are seen as measuring complementary processes that facilitate complex cognition. Respectively, these are the ability to maintain access to critical information and the ability to disengage from or block outdated information. In the realm of problem solving, high working memory capacity allows a person to represent and maintain a problem accurately and stably, so that hypothesis testing can be conducted. However, as hypotheses are disproven or become untenable, disengaging from outdated problem solving attempts becomes important so that new hypotheses can be generated and tested. From this perspective, the strong correlation between working memory capacity and fluid intelligence is due not to one ability having a causal influence on the other but to separate attention-demanding mental functions that can be contrary to one another but are organized around top-down processing goals. © The Author(s) 2016.

  15. Multiple-reason decision making based on automatic processing.

    PubMed

    Glöckner, Andreas; Betsch, Tilmann

    2008-09-01

    It has been repeatedly shown that in decisions under time constraints, individuals predominantly use noncompensatory strategies rather than complex compensatory ones. The authors argue that these findings might be due not to limitations of cognitive capacity but instead to limitations of information search imposed by the commonly used experimental tool Mouselab (J. W. Payne, J. R. Bettman, & E. J. Johnson, 1988). The authors tested this assumption in 3 experiments. In the 1st experiment, information was openly presented, whereas in the 2nd experiment, the standard Mouselab program was used under different time limits. The results indicate that individuals are able to compute weighted additive decision strategies extremely quickly if information search is not restricted by the experimental procedure. In a 3rd experiment, these results were replicated using more complex decision tasks, and the major alternative explanations that individuals use more complex heuristics or that they merely encode the constellation of cues were ruled out. In sum, the findings challenge the fundaments of bounded rationality and highlight the importance of automatic processes in decision making. (c) 2008 APA, all rights reserved.

  16. Modes of Visual Recognition and Perceptually Relevant Sketch-based Coding for Images

    NASA Technical Reports Server (NTRS)

    Jobson, Daniel J.

    1991-01-01

    A review of visual recognition studies is used to define two levels of information requirements. These two levels are related to two primary subdivisions of the spatial frequency domain of images and reflect two distinct different physical properties of arbitrary scenes. In particular, pathologies in recognition due to cerebral dysfunction point to a more complete split into two major types of processing: high spatial frequency edge based recognition vs. low spatial frequency lightness (and color) based recognition. The former is more central and general while the latter is more specific and is necessary for certain special tasks. The two modes of recognition can also be distinguished on the basis of physical scene properties: the highly localized edges associated with reflectance and sharp topographic transitions vs. smooth topographic undulation. The extreme case of heavily abstracted images is pursued to gain an understanding of the minimal information required to support both modes of recognition. Here the intention is to define the semantic core of transmission. This central core of processing can then be fleshed out with additional image information and coding and rendering techniques.

  17. An Overview of Biomolecular Event Extraction from Scientific Documents

    PubMed Central

    Vanegas, Jorge A.; Matos, Sérgio; González, Fabio; Oliveira, José L.

    2015-01-01

    This paper presents a review of state-of-the-art approaches to automatic extraction of biomolecular events from scientific texts. Events involving biomolecules such as genes, transcription factors, or enzymes, for example, have a central role in biological processes and functions and provide valuable information for describing physiological and pathogenesis mechanisms. Event extraction from biomedical literature has a broad range of applications, including support for information retrieval, knowledge summarization, and information extraction and discovery. However, automatic event extraction is a challenging task due to the ambiguity and diversity of natural language and higher-level linguistic phenomena, such as speculations and negations, which occur in biological texts and can lead to misunderstanding or incorrect interpretation. Many strategies have been proposed in the last decade, originating from different research areas such as natural language processing, machine learning, and statistics. This review summarizes the most representative approaches in biomolecular event extraction and presents an analysis of the current state of the art and of commonly used methods, features, and tools. Finally, current research trends and future perspectives are also discussed. PMID:26587051

  18. Antemortem records of forensic significance among edentulous individuals.

    PubMed

    Richmond, Raymond; Pretty, Iain A

    2007-03-01

    The identification of edentulous individuals is problematic due to poor provision of labelled dental prostheses. Dental records may still provide useful information for odontologists in the comparative identification process. The purpose of this study was to determine the level of forensically significant information contained within the dental records of a population of denture wearers attending the University of Manchester School of Dentistry. Two hundred and two dental records were examined and a proforma completed. The mean age of the patients was 72 years. Medical histories were absent in 4% of all records and only 67.8% of the written records were rated as good. Thirty-two percent of the records contained one or more panoramic radiographs, but 30% of these were over 3 years old, rendering their usefulness in identification procedures questionable. In total, only 18% of the examined records contained antemortem information that would enable identification. These data suggest that the process of denture marking is essential in order to ensure that the identification of this population can be undertaken expediently by dental means.

  19. Analysis of a municipal wastewater treatment plant using a neural network-based pattern analysis

    USGS Publications Warehouse

    Hong, Y.-S.T.; Rosen, Michael R.; Bhamidimarri, R.

    2003-01-01

    This paper addresses the problem of how to capture the complex relationships that exist between process variables and to diagnose the dynamic behaviour of a municipal wastewater treatment plant (WTP). Due to the complex biological reaction mechanisms and the highly time-varying, multivariable nature of a real WTP, diagnosis of the WTP is still difficult in practice. The application of intelligent techniques, which can analyse the multi-dimensional process data using a sophisticated visualisation technique, can be useful for analysing and diagnosing the activated-sludge WTP. In this paper, the Kohonen Self-Organising Feature Maps (KSOFM) neural network is applied to analyse the multi-dimensional process data, and to diagnose the inter-relationship of the process variables in a real activated-sludge WTP. By using component planes, detailed local relationships between the process variables, e.g., responses of the process variables under different operating conditions, as well as global information, are discovered. The operating condition and the inter-relationship among the process variables in the WTP have been diagnosed and extracted from the information obtained by the clustering analysis of the maps. It is concluded that the KSOFM technique provides an effective analysing and diagnosing tool to understand the system behaviour and to extract knowledge contained in multi-dimensional data of a large-scale WTP. © 2003 Elsevier Science Ltd. All rights reserved.
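
    To make the KSOFM approach concrete, the following minimal NumPy sketch trains a small self-organizing map on synthetic multivariate process data and exposes the component planes mentioned in the abstract; the map size, learning schedule and data are illustrative assumptions, not the study's configuration.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_samples, n_vars = 400, 6
    X = rng.normal(size=(n_samples, n_vars))        # standardized process variables

    rows, cols = 8, 8
    weights = rng.normal(size=(rows * cols, n_vars))
    grid = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)

    n_iter = 3000
    for t in range(n_iter):
        x = X[rng.integers(n_samples)]
        bmu = np.argmin(((weights - x) ** 2).sum(axis=1))   # best-matching unit
        lr = 0.5 * (1 - t / n_iter)                         # decaying learning rate
        radius = max(1.0, (rows / 2) * (1 - t / n_iter))    # shrinking neighbourhood
        d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
        h = np.exp(-d2 / (2 * radius ** 2))                 # Gaussian neighbourhood
        weights += lr * h[:, None] * (x - weights)

    # Each slice component_planes[:, :, k] shows how variable k is spread over the map.
    component_planes = weights.reshape(rows, cols, n_vars)
    print(component_planes.shape)
    ```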

  20. The development of a geopressured energy management information system in support of research planning, phase 1

    NASA Astrophysics Data System (ADS)

    Bachman, A. L.; Wrighton, F. M.

    1981-10-01

    The development of an information system on the problems and potential of geopressured gas containing aquifers as well as what is known about unconventional gas production in the Gulf Coast, and the use of this information to formulate a research program to prove economic and technical feasibility is discussed. This work led to the conclusion that of six major conventional gas resource options in the Gulf Coast, the one involving gas recovery from reservoirs watered out due to prior production offers the greatest potential in the short term. In these water drive reservoirs, gas is trapped in the pore space as water invades the reservoir (due to gas production). This gas can be recovered by reducing the pressure in the reservoir and thereby causing the trapped gas to expand and become mobile. The reduction in reservoir pressure is achieved by high rate water production. The conclusions drawn from analyses of the potential for gas recovery from unconventional sources in the Gulf Coast as well as research and testing already completed are the basis for the proposed research program. The process by which the research program was formulated, intermediate results and the program itself are summarized.

  1. Submarine harbor navigation using image data

    NASA Astrophysics Data System (ADS)

    Stubberud, Stephen C.; Kramer, Kathleen A.

    2017-01-01

    The ingress and egress of a United States Navy submarine is a human-intensive process that requires numerous individuals to monitor locations and watch for hazards. Sailors pass vocal information to the bridge, where it is processed manually. There is interest in using video imaging of the periscope view to provide navigation within harbors and other points of ingress and egress more automatically. In this paper, video-based navigation is examined as a target-tracking problem. While some image-processing methods claim to provide range information, the moving-platform problem and weather concerns, such as fog, reduce the effectiveness of these range estimates. The video-navigation problem then becomes an angle-only tracking problem. Angle-only tracking is known to be fraught with difficulties because the unobservable space is not the null space. When a Kalman filter estimator is used to perform the tracking, significant errors can arise which could endanger the submarine. This work analyzes the performance of the Kalman filter when angle-only measurements are used to provide the target tracks, and examines estimation unobservability and the minimal set of requirements needed to address it in this complex, real-world problem. Three major issues are addressed: knowledge of the navigation beacons'/landmarks' locations, the minimal number of these beacons needed to maintain the course, and the update rates of the landmark angles as the periscope rotates and landmarks become obscured due to blockage and weather. The goal is to navigate to and from the docks while traversing the harbor channel according to maritime rules, relying solely on the image-based data. For this effort, image correlation from frame to frame is assumed to be achieved perfectly. Variation in the update rates and the dropping of data due to rotation and obscuration are considered. The analysis is based on a simple straight-line channel harbor entry to the dock, similar to a submarine entering the submarine port in San Diego.
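
    The estimation problem can be illustrated with a bearing-only extended Kalman filter tracking own-ship position and velocity from angles to known landmarks; the sketch below is not the paper's implementation, and the landmark positions, noise levels and trajectory are invented for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    dt = 1.0
    landmarks = np.array([[0.0, 200.0], [500.0, -150.0], [900.0, 250.0]])

    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)     # constant-velocity model
    Q = 0.01 * np.eye(4)
    R = np.deg2rad(0.5) ** 2                       # bearing noise variance

    x_true = np.array([0.0, 0.0, 5.0, 0.0])        # straight-line channel transit
    x_est = np.array([10.0, -10.0, 4.0, 0.5])
    P = np.diag([100.0, 100.0, 4.0, 4.0])

    for k in range(100):
        x_true = F @ x_true
        x_est = F @ x_est                          # predict
        P = F @ P @ F.T + Q
        for lm in landmarks:                       # one bearing update per landmark
            dx, dy = lm[0] - x_true[0], lm[1] - x_true[1]
            z = np.arctan2(dy, dx) + rng.normal(scale=np.sqrt(R))
            dxe, dye = lm[0] - x_est[0], lm[1] - x_est[1]
            r2 = dxe ** 2 + dye ** 2
            H = np.array([[dye / r2, -dxe / r2, 0.0, 0.0]])  # Jacobian of the bearing
            pred = np.arctan2(dye, dxe)
            innov = np.arctan2(np.sin(z - pred), np.cos(z - pred))  # wrap to [-pi, pi]
            S = H @ P @ H.T + R
            K = P @ H.T / S
            x_est = x_est + (K * innov).ravel()
            P = (np.eye(4) - K @ H) @ P

    print("position error:", np.linalg.norm(x_est[:2] - x_true[:2]))
    ```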

  2. Space Tethers: Design Criteria

    NASA Technical Reports Server (NTRS)

    Tomlin, D. D.; Faile, G. C.; Hayashida, K. B.; Frost, C. L.; Wagner, C. Y.; Mitchell, M. L.; Vaughn, J. A.; Galuska, M. J.

    1997-01-01

    This document is prepared to provide a systematic process for the selection of tethers for space applications. Criteria are provided for determining the strength requirements for tether missions and for assessing mission success against tether severing due to micrometeoroid and orbital debris particle impacts. Background information on materials for use in space tethers, including electricity-conducting tethers, is provided. Dynamic considerations for tether selection are also provided. Safety, quality, and reliability considerations are provided for a tether project.

  3. Improvements in Operational Readiness by Distributing Manufacturing Capability in the Supply Chain through Additive Manufacturing

    DTIC Science & Technology

    2017-12-01

    inefficiencies of a more complex system. Additional time may also be due to the longer distances traveled. The fulfillment time for a requisition to... ...advanced manufacturing methods with additive manufacturing. This work decomposes the additive manufacturing processes into 11 primary functions. The time...

  4. Comparison of inhibition in two timed reaction tasks: the color and emotion Stroop tasks.

    PubMed

    Cothran, D Lisa; Larsen, Randy

    2008-07-01

    The authors examined the cross-task consistency of the ability to inhibit the processing of irrelevant information. They compared interference scores on 2 widely used inhibition tasks and found that color word Stroop interference scores correlated with emotion word Stroop interference scores. An examination of physiological reactivity showed that, in general, the color Stroop was more arousing than was the emotion Stroop, most likely due to increased response conflict.

  5. Probabilistic 21st and 22nd Century Sea-Level Projections at a Global Network of Tide-Gauge Sites

    NASA Technical Reports Server (NTRS)

    Kopp, Robert E.; Horton, Radley M.; Little, Christopher M.; Mitrovica, Jerry X.; Oppenheimer, Michael; Rasmussen, D. J.; Strauss, Benjamin H.; Tebaldi, Claudia

    2014-01-01

    Sea-level rise due to both climate change and non-climatic factors threatens coastal settlements, infrastructure, and ecosystems. Projections of mean global sea-level (GSL) rise provide insufficient information to plan adaptive responses; local decisions require local projections that accommodate different risk tolerances and time frames and that can be linked to storm surge projections. Here we present a global set of local sea-level (LSL) projections to inform decisions on timescales ranging from the coming decades through the 22nd century. We provide complete probability distributions, informed by a combination of expert community assessment, expert elicitation, and process modeling. Between the years 2000 and 2100, we project a very likely (90% probability) GSL rise of 0.5–1.2 m under representative concentration pathway (RCP) 8.5, 0.4–0.9 m under RCP 4.5, and 0.3–0.8 m under RCP 2.6. Site-to-site differences in LSL projections are due to varying non-climatic background uplift or subsidence, oceanographic effects, and spatially variable responses of the geoid and the lithosphere to shrinking land ice. The Antarctic ice sheet (AIS) constitutes a growing share of variance in GSL and LSL projections. In the global average and at many locations, it is the dominant source of variance in late 21st century projections, though at some sites oceanographic processes contribute the largest share throughout the century. LSL rise dramatically reshapes flood risk, greatly increasing the expected number of “1-in-10” and “1-in-100” year events.

  6. Probabilistic 21st and 22nd century sea-level projections at a global network of tide-gauge sites

    NASA Astrophysics Data System (ADS)

    Kopp, Robert E.; Horton, Radley M.; Little, Christopher M.; Mitrovica, Jerry X.; Oppenheimer, Michael; Rasmussen, D. J.; Strauss, Benjamin H.; Tebaldi, Claudia

    2014-08-01

    Sea-level rise due to both climate change and non-climatic factors threatens coastal settlements, infrastructure, and ecosystems. Projections of mean global sea-level (GSL) rise provide insufficient information to plan adaptive responses; local decisions require local projections that accommodate different risk tolerances and time frames and that can be linked to storm surge projections. Here we present a global set of local sea-level (LSL) projections to inform decisions on timescales ranging from the coming decades through the 22nd century. We provide complete probability distributions, informed by a combination of expert community assessment, expert elicitation, and process modeling. Between the years 2000 and 2100, we project a very likely (90% probability) GSL rise of 0.5-1.2 m under representative concentration pathway (RCP) 8.5, 0.4-0.9 m under RCP 4.5, and 0.3-0.8 m under RCP 2.6. Site-to-site differences in LSL projections are due to varying non-climatic background uplift or subsidence, oceanographic effects, and spatially variable responses of the geoid and the lithosphere to shrinking land ice. The Antarctic ice sheet (AIS) constitutes a growing share of variance in GSL and LSL projections. In the global average and at many locations, it is the dominant source of variance in late 21st century projections, though at some sites oceanographic processes contribute the largest share throughout the century. LSL rise dramatically reshapes flood risk, greatly increasing the expected number of "1-in-10" and "1-in-100" year events.

  7. Crystallographic Analysis of a Japanese Sword by using Bragg Edge Transmission Spectroscopy

    NASA Astrophysics Data System (ADS)

    Shiota, Yoshinori; Hasemi, Hiroyuki; Kiyanagi, Yoshiaki

    Neutron imaging using a pulsed neutron source can give crystallographic information over a wide area of a sample by analysing position-dependent transmission spectra. Using a Bragg edge imaging method, we non-destructively obtained crystallographic information on a Japanese sword, signed by Bishu Osafune Norimitsu, in order to determine its position-dependent crystallographic characteristics and to check the usefulness of the method for investigating Japanese swords. Strong texture appeared on the back side, whereas the middle area showed an almost isotropic feature and the edge side showed an intermediate character. The rather isotropic region in the centre gradually shrank from the grip side to the tip side. The crystallite size was smaller near the edge and became larger towards the back side. The smaller crystallite size is likely due to quenching around the edge, and this trend disappeared in the grip (nakago) area; the larger crystallite size is likely due to strong hammering. Coarse grains were also observed directly in transmission images taken with a high-spatial-resolution detector. The spatial distribution of the grains was not uniform, but the reason has not been understood. Furthermore, a white area near the tip was shown to be a void from the Bragg edge transmission spectra; this void may have formed during the forging of the two kinds of steel. Considering position-dependent differences in texture and crystallite size can provide information to clarify the manufacturing process, and Bragg edge analysis should be a valuable tool for research on Japanese swords.

  8. Adapting to an Uncertain World: Cognitive Capacity and Causal Reasoning with Ambiguous Observations

    PubMed Central

    Shou, Yiyun; Smithson, Michael

    2015-01-01

    Ambiguous causal evidence in which the covariance of the cause and effect is partially known is pervasive in real life situations. Little is known about how people reason about causal associations with ambiguous information and the underlying cognitive mechanisms. This paper presents three experiments exploring the cognitive mechanisms of causal reasoning with ambiguous observations. Results revealed that the influence of ambiguous observations manifested by missing information on causal reasoning depended on the availability of cognitive resources, suggesting that processing ambiguous information may involve deliberative cognitive processes. Experiment 1 demonstrated that subjects did not ignore the ambiguous observations in causal reasoning. They also had a general tendency to treat the ambiguous observations as negative evidence against the causal association. Experiment 2 and Experiment 3 included a causal learning task requiring a high cognitive demand in which paired stimuli were presented to subjects sequentially. Both experiments revealed that processing ambiguous or missing observations can depend on the availability of cognitive resources. Experiment 2 suggested that the contribution of working memory capacity to the comprehensiveness of evidence retention was reduced when there were ambiguous or missing observations. Experiment 3 demonstrated that an increase in cognitive demand due to a change in the task format reduced subjects’ tendency to treat ambiguous-missing observations as negative cues. PMID:26468653

  9. INCOG recommendations for management of cognition following traumatic brain injury, part II: attention and information processing speed.

    PubMed

    Ponsford, Jennie; Bayley, Mark; Wiseman-Hakes, Catherine; Togher, Leanne; Velikonja, Diana; McIntyre, Amanda; Janzen, Shannon; Tate, Robyn

    2014-01-01

    Traumatic brain injury, due to its diffuse nature and high frequency of injury to frontotemporal and midbrain reticular activating systems, may cause disruption in many aspects of attention: arousal, selective attention, speed of information processing, and strategic control of attention, including sustained attention, shifting and dividing of attention, and working memory. An international team of researchers and clinicians (known as INCOG) convened to develop recommendations for the management of attentional problems. The experts selected recommendations from published guidelines and then reviewed literature to ensure that recommendations were current. Decision algorithms incorporating the recommendations based on inclusion and exclusion criteria of published trials were developed. The team then prioritized recommendations for implementation and developed audit criteria to evaluate adherence to these best practices. The recommendations and discussion highlight that metacognitive strategy training focused on functional everyday activities is appropriate. Appropriate use of dual task training, environmental modifications, and cognitive behavioral therapy is also discussed. There is insufficient evidence to support mindfulness meditation and practice on de-contextualized computer-based tasks for attention. Administration of the medication methylphenidate should be considered to improve information-processing speed. The INCOG recommendations for rehabilitation of attention provide up-to-date guidance for clinicians treating people with traumatic brain injury.

  10. Exploring laterality and memory effects in the haptic discrimination of verbal and non-verbal shapes.

    PubMed

    Stoycheva, Polina; Tiippana, Kaisa

    2018-03-14

    The brain's left hemisphere often displays advantages in processing verbal information, while the right hemisphere favours processing non-verbal information. In the haptic domain, due to contralateral innervation, this functional lateralization is reflected in a hand advantage for certain functions. Findings regarding the hand-hemisphere advantage for haptic information remain contradictory, however. This study addressed these laterality effects and their interaction with memory retention times in the haptic modality. Participants performed haptic discrimination of letters, geometric shapes and nonsense shapes at memory retention times of 5, 15 and 30 s with the left and right hand separately, and we measured the discriminability index d'. The d' values were significantly higher for letters and geometric shapes than for nonsense shapes. This might result from dual coding (naming + spatial) and/or from low stimulus complexity. There was no stimulus-specific laterality effect. However, we found a time-dependent laterality effect, which revealed that the performance of the left hand-right hemisphere was sustained up to 15 s, while the performance of the right hand-left hemisphere decreased progressively throughout all retention times. This suggests that haptic memory traces are more robust to decay when they are processed by the left hand-right hemisphere.

  11. Application-ready expedited MODIS data for operational land surface monitoring of vegetation condition

    USGS Publications Warehouse

    Brown, Jesslyn; Howard, Daniel M.; Wylie, Bruce K.; Friesz, Aaron M.; Ji, Lei; Gacke, Carolyn

    2015-01-01

    Monitoring systems benefit from high temporal frequency image data collected from the Moderate Resolution Imaging Spectroradiometer (MODIS) system. Because of near-daily global coverage, MODIS data are beneficial to applications that require timely information about vegetation condition related to drought, flooding, or fire danger. Rapid satellite data streams in operational applications have clear benefits for monitoring vegetation, especially when information can be delivered as fast as surface conditions change. An “expedited” processing system called “eMODIS”, operated by the U.S. Geological Survey, provides rapid MODIS surface reflectance data to operational applications in less than 24 h, offering tailored, consistently processed information products that complement standard MODIS products. We assessed eMODIS quality and consistency by comparing it to standard MODIS data. Only land data with known high quality were analyzed in a central U.S. study area. When compared to standard MODIS (MOD/MYD09Q1), the eMODIS Normalized Difference Vegetation Index (NDVI) maintained a strong, significant relationship to standard MODIS NDVI, whether from morning (Terra) or afternoon (Aqua) orbits. The Aqua eMODIS data were more prone to noise than the Terra data, likely due to differences in the internal cloud mask used in MOD/MYD09Q1 or in compositing rules. Post-processing temporal smoothing decreased noise in the eMODIS data.
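    To make the comparison concrete, the sketch below computes NDVI from red and near-infrared surface reflectance for two co-registered products and regresses one against the other. The band values, noise level, and use of ordinary least squares are illustrative assumptions, not the study's exact processing chain.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

rng = np.random.default_rng(0)

# Toy surface-reflectance samples standing in for two co-registered NDVI products
red_a = rng.uniform(0.02, 0.15, 1000)
nir_a = rng.uniform(0.20, 0.50, 1000)
red_b = red_a + rng.normal(0.0, 0.005, 1000)   # second product with slight noise
nir_b = nir_a + rng.normal(0.0, 0.005, 1000)

ndvi_a, ndvi_b = ndvi(nir_a, red_a), ndvi(nir_b, red_b)

# Strength and bias of the relationship between the two NDVI series
slope, intercept = np.polyfit(ndvi_a, ndvi_b, 1)
r = np.corrcoef(ndvi_a, ndvi_b)[0, 1]
print(f"slope={slope:.3f} intercept={intercept:.3f} r={r:.3f}")
```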

  12. Modelling Evolutionary Algorithms with Stochastic Differential Equations.

    PubMed

    Heredia, Jorge Pérez

    2017-11-20

    There has been renewed interest in modelling the behaviour of evolutionary algorithms (EAs) by more traditional mathematical objects, such as ordinary differential equations or Markov chains. The advantage is that the analysis becomes greatly facilitated due to the existence of well-established methods. However, this typically comes at the cost of disregarding information about the process. Here, we introduce the use of stochastic differential equations (SDEs) for the study of EAs. SDEs can produce simple analytical results for the dynamics of stochastic processes, unlike Markov chains, which can produce rigorous but unwieldy expressions about the dynamics. On the other hand, unlike ordinary differential equations (ODEs), they do not discard information about the stochasticity of the process. We show that these are especially suitable for the analysis of fixed budget scenarios and present analogues of the additive and multiplicative drift theorems from runtime analysis. In addition, we derive a new, more general multiplicative drift theorem that also covers non-elitist EAs. This theorem simultaneously allows for positive and negative results, providing information on the algorithm's progress even when the problem cannot be optimised efficiently. Finally, we provide results for some well-known heuristics, namely Random Walk (RW), Random Local Search (RLS), the (1+1) EA, the Metropolis Algorithm (MA), and the Strong Selection Weak Mutation (SSWM) algorithm.
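    To give the flavour of this modelling style, the sketch below simulates the (1+1) EA on OneMax and integrates, with Euler-Maruyama, a crude SDE whose drift follows the usual multiplicative-drift intuition (expected progress proportional to the current distance from the optimum). The drift and diffusion coefficients are illustrative choices, not the paper's derivations.

```python
import numpy as np

rng = np.random.default_rng(1)
n, T = 100, 2000          # bit-string length, number of iterations

# --- (1+1) EA on OneMax: track Z_t, the number of zero bits ---
x = rng.integers(0, 2, n)
trace_ea = []
for _ in range(T):
    flips = rng.random(n) < 1.0 / n          # standard bit mutation
    y = np.where(flips, 1 - x, x)
    if y.sum() >= x.sum():                   # elitist acceptance
        x = y
    trace_ea.append(n - x.sum())

# --- Euler-Maruyama on dZ = -(Z / (e*n)) dt + (sqrt(Z)/n) dW (illustrative) ---
z = float(trace_ea[0])
trace_sde = [z]
for _ in range(T - 1):
    drift = -z / (np.e * n)
    diffusion = np.sqrt(max(z, 0.0)) / n
    z = max(z + drift + diffusion * rng.normal(), 0.0)
    trace_sde.append(z)

print("EA  final distance to optimum:", trace_ea[-1])
print("SDE final distance to optimum:", round(trace_sde[-1], 2))
```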

  13. A survey of study participants' understanding of informed consent to participate in a randomised controlled trial of acupuncture.

    PubMed

    Smith, Caroline A; Fogarty, Sarah

    2016-01-12

    It is important that potential study participants are appropriately informed and understand what is involved with their research participation. A few studies have examined study participants' understanding of the informed consent process and the adequacy of the information they received when agreeing to participate in a randomised controlled trial. Deficiencies in the consent process have been found. This topic remains an under-researched area of acupuncture research. The aim of this study was to examine participants' understanding of their informed consent and the adequacy of the information presented when agreeing to participate in a randomised controlled trial of acupuncture. All women who participated in a randomised controlled trial over an 11-month period were invited to participate in a survey. An anonymous self-completion questionnaire was designed and covered participants' understanding of informed consent in the clinical trial, their views of the information provided, the opportunity to ask questions, the use of sham acupuncture, their recall of study visits and processes for withdrawal, and their reason for participating in the trial. A response rate of 59% was obtained. Over 90% of subjects indicated there was plenty of opportunity to discuss the study prior to giving consent, and 89% indicated that questions asked were answered to their satisfaction. The majority of women indicated the amount of information describing acupuncture was about right; however, 24% would have liked more. Information describing sham acupuncture was not considered adequate by 48% of women; 35% would have liked more information, and 30% could not recall, or were uncertain, why a sham group was used. Participants indicated less understanding of the information relating to payment if they became ill due to study participation, the risks and discomforts of the study interventions, which of the procedures were experimental, and how long they would be involved in the study. Trial participants' understanding of informed consent was overall satisfactory but highlighted some areas of deficiency. Future studies could consider the use of supplementary material such as Q&A fact sheets.

  14. Non-rigid ultrasound image registration using generalized relaxation labeling process

    NASA Astrophysics Data System (ADS)

    Lee, Jong-Ha; Seong, Yeong Kyeong; Park, MoonHo; Woo, Kyoung-Gu; Ku, Jeonghun; Park, Hee-Jun

    2013-03-01

    This research proposes a novel non-rigid registration method for ultrasound images. The predominant anatomical features in medical images are tissue boundaries, which appear as edges. In ultrasound images, however, other features can be identified as well due to the specular reflections that appear as bright lines superimposed on the ideal edge location. In this work, an image's local phase information (via the frequency domain) is used to find the ideal edge location. The generalized relaxation labeling process is then formulated to align the feature points extracted from the ideal edge location. Here, the original relaxation labeling method was generalized by taking n compatibility coefficient values to improve non-rigid registration performance. This contextual information, combined with the relaxation labeling process, is used to search for correspondences; the transformation is then calculated by the thin plate spline (TPS) model. These two processes are iterated until the optimal correspondence and transformation are found. We have tested our proposed method and the state-of-the-art algorithms with synthetic data and bladder ultrasound images of in vivo human subjects. Experiments show that the proposed method improves registration performance significantly, as compared to other state-of-the-art non-rigid registration algorithms.
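    A compact sketch of the transformation step described above: once corresponding feature points are available (given directly here rather than found via relaxation labeling), a thin plate spline maps the source points onto the target points and can then warp any other point. `scipy.interpolate.Rbf` with the thin-plate kernel is used as a stand-in for a full TPS implementation; the coordinates are made up for illustration.

```python
import numpy as np
from scipy.interpolate import Rbf

# Matched feature points (source -> target); made-up coordinates for illustration
src = np.array([[10, 10], [80, 12], [15, 75], [85, 80], [50, 45]], dtype=float)
dst = np.array([[12, 14], [78, 10], [18, 79], [83, 84], [53, 47]], dtype=float)

# One thin-plate-spline interpolant per output coordinate
tps_x = Rbf(src[:, 0], src[:, 1], dst[:, 0], function="thin_plate")
tps_y = Rbf(src[:, 0], src[:, 1], dst[:, 1], function="thin_plate")

def warp(points: np.ndarray) -> np.ndarray:
    """Apply the fitted TPS mapping to an (N, 2) array of points."""
    return np.column_stack([tps_x(points[:, 0], points[:, 1]),
                            tps_y(points[:, 0], points[:, 1])])

# Control points are reproduced exactly (interpolating spline); other points warp smoothly
print(np.round(warp(src), 2))
print(np.round(warp(np.array([[40.0, 40.0], [60.0, 20.0]])), 2))
```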

  15. Crossmodal interactions during non-linguistic auditory processing in cochlear-implanted deaf patients.

    PubMed

    Barone, Pascal; Chambaudie, Laure; Strelnikov, Kuzma; Fraysse, Bernard; Marx, Mathieu; Belin, Pascal; Deguine, Olivier

    2016-10-01

    Due to signal distortion, speech comprehension in cochlear-implanted (CI) patients relies strongly on visual information, a compensatory strategy supported by important cortical crossmodal reorganisations. Though crossmodal interactions are evident for speech processing, it is unclear whether a visual influence is observed in CI patients during non-linguistic visual-auditory processing, such as face-voice interactions, which are important in social communication. We analyse and compare visual-auditory interactions in CI patients and normal-hearing subjects (NHS) at equivalent auditory performance levels. Proficient CI patients and NHS performed a voice-gender categorisation in the visual-auditory modality from a morphing-generated voice continuum between male and female speakers, while ignoring the presentation of a male or female visual face. Our data show that during the face-voice interaction, CI deaf patients are strongly influenced by visual information when performing an auditory gender categorisation task, in spite of maximum recovery of auditory speech. No such effect is observed in NHS, even in situations of CI simulation. Our hypothesis is that the functional crossmodal reorganisation that occurs in deafness could influence nonverbal processing, such as face-voice interaction, which is important for the patient's internal supramodal representation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. An Approach to Automated Fusion System Design and Adaptation

    PubMed Central

    Fritze, Alexander; Mönks, Uwe; Holst, Christoph-Alexander; Lohweg, Volker

    2017-01-01

    Industrial applications are in transition towards modular and flexible architectures that are capable of self-configuration and -optimisation. This is due to the demand of mass customisation and the increasing complexity of industrial systems. The conversion to modular systems is related to challenges in all disciplines. Consequently, diverse tasks such as information processing, extensive networking, or system monitoring using sensor and information fusion systems need to be reconsidered. The focus of this contribution is on distributed sensor and information fusion systems for system monitoring, which must reflect the increasing flexibility of fusion systems. This contribution thus proposes an approach, which relies on a network of self-descriptive intelligent sensor nodes, for the automatic design and update of sensor and information fusion systems. This article encompasses the fusion system configuration and adaptation as well as communication aspects. Manual interaction with the flexibly changing system is reduced to a minimum. PMID:28300762

  17. An Approach to Automated Fusion System Design and Adaptation.

    PubMed

    Fritze, Alexander; Mönks, Uwe; Holst, Christoph-Alexander; Lohweg, Volker

    2017-03-16

    Industrial applications are in transition towards modular and flexible architectures that are capable of self-configuration and -optimisation. This is due to the demand of mass customisation and the increasing complexity of industrial systems. The conversion to modular systems is related to challenges in all disciplines. Consequently, diverse tasks such as information processing, extensive networking, or system monitoring using sensor and information fusion systems need to be reconsidered. The focus of this contribution is on distributed sensor and information fusion systems for system monitoring, which must reflect the increasing flexibility of fusion systems. This contribution thus proposes an approach, which relies on a network of self-descriptive intelligent sensor nodes, for the automatic design and update of sensor and information fusion systems. This article encompasses the fusion system configuration and adaptation as well as communication aspects. Manual interaction with the flexibly changing system is reduced to a minimum.

  18. Due Diligence Processes for Public Acquisition of Mining-Impacted Landscapes

    NASA Astrophysics Data System (ADS)

    Martin, E.; Monohan, C.; Keeble-Toll, A. K.

    2016-12-01

    The acquisition of public land is critical for achieving conservation and habitat goals in rural regions projected to experience continuously high rates of population growth. To ensure that public funds are utilized responsibly in the purchase of conservation easements, appropriate due diligence processes must be established that limit landowner liability post-acquisition. Traditional methods of characterizing contamination in regions where legacy mining activities were prevalent may not utilize current scientific knowledge and understanding of contaminant fate, transport and bioavailability, and are therefore prone to Type II error. Agency-prescribed assessment methods used under CERCLA often fail to detect contamination that presents liability issues because they do not require the water quality sampling that would reveal the offsite transport potential of contaminants posing human health risks, including mercury. Historical analysis can be used to inform judgmental sampling to identify hotspots and contaminants of concern. Land acquisition projects at two historic mine sites in Nevada County, California, the Champion Mine Complex and the Black Swan Preserve, have established the necessity of re-thinking due diligence processes for mining-impacted landscapes. These pilot projects demonstrate that pre-acquisition assessment in the Gold Country must include judgmental sampling and evaluation of contaminant transport. Best practices based on current scientific knowledge must be codified by agencies, consultants, and NGOs in order to ensure responsible use of public funds and to safeguard public health.

  19. Testing of information condensation in a model reverberating spiking neural network.

    PubMed

    Vidybida, Alexander

    2011-06-01

    Information about the external world is delivered to the brain in the form of spike trains structured in time. During further processing in higher areas, this information is subjected to a certain condensation process, which results in the formation of abstract conceptual images of the external world, apparently represented as uniform spiking activity partially independent of the details of the input spike trains. A possible physical mechanism of condensation at the level of the individual neuron was discussed recently. In a reverberating spiking neural network, this mechanism should make the dynamics settle down to the same uniform/periodic activity in response to a set of various inputs. Since the same periodic activity may correspond to different input spike trains, we interpret this as a possible candidate for an information condensation mechanism in a network. Our purpose is to test this possibility in a network model consisting of five fully connected neurons, focusing in particular on the influence of the network's geometric size on its ability to condense information. The dynamics of 20 spiking neural networks of different geometric sizes are modelled by means of computer simulation. Each network was propelled into reverberating dynamics by applying various initial input spike trains, and the dynamics were run until they became periodic. Shannon's formula is used to calculate the amount of information in any input spike train and in any periodic state found. As a result, we obtain an explicit estimate of the degree of information condensation in the networks and conclude that it depends strongly on the net's geometric size.
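    The condensation measure can be illustrated with Shannon's formula: if many distinct input spike trains settle into a much smaller set of periodic states, the entropy of the resulting state distribution is lower than the entropy over the inputs, and the difference quantifies the condensation. The many-to-one mapping below is invented purely for illustration; the paper computes the information content of concrete spike trains and network states.

```python
import math
from collections import Counter

def shannon_bits(counts):
    """H = -sum p_i * log2(p_i) over an iterable of occurrence counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c)

# 64 equally likely input spike trains (hypothetical labels)
n_inputs = 64
input_entropy = math.log2(n_inputs)

# Invented many-to-one mapping: each input ends up in one of a few periodic states
final_state = {i: i % 5 for i in range(n_inputs)}
state_counts = Counter(final_state.values()).values()
output_entropy = shannon_bits(state_counts)

print(f"input entropy  : {input_entropy:.2f} bits")
print(f"output entropy : {output_entropy:.2f} bits")
print(f"condensation   : {input_entropy - output_entropy:.2f} bits")
```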

  20. Error characterization and quantum control benchmarking in liquid state NMR using quantum information processing techniques

    NASA Astrophysics Data System (ADS)

    Laforest, Martin

    Quantum information processing has been the subject of countless discoveries since the early 1990s. It is believed to be the way of the future for computation: using quantum systems permits one to perform computation exponentially faster than on a regular classical computer. Unfortunately, quantum systems that are not isolated do not behave well. They tend to lose their quantum nature due to the presence of the environment. If key information is known about the noise present in the system, methods such as quantum error correction have been developed in order to reduce the errors introduced by the environment during a given quantum computation. In order to harness the quantum world and implement the theoretical ideas of quantum information processing and quantum error correction, it is imperative to understand and quantify the noise present in the quantum processor and benchmark the quality of the control over the qubits. Usual techniques to estimate the noise or the control are based on quantum process tomography (QPT), which, unfortunately, demands an exponential amount of resources. This thesis presents work towards the characterization of noisy processes in an efficient manner. The protocols are developed from a purely abstract setting with no system-dependent variables. To circumvent the exponential nature of quantum process tomography, three different efficient protocols are proposed and experimentally verified. The first protocol uses the idea of quantum error correction to extract relevant parameters about a given noise model, namely the correlation between the dephasing of two qubits. Following that is a protocol using randomization and symmetrization to extract the probability that a given number of qubits are simultaneously corrupted in a quantum memory, regardless of the specifics of the error and which qubits are affected. Finally, a last protocol, still using randomization ideas, is developed to estimate the average fidelity per computational gate for single- and multi-qubit systems. Even though liquid state NMR is argued to be unsuitable for scalable quantum information processing, it remains the best test-bed system to experimentally implement, verify and develop protocols aimed at increasing the control over general quantum information processors. For this reason, all the protocols described in this thesis have been implemented in liquid state NMR, which then led to further development of control and analysis techniques.

  1. The role of risk communication planning in the release of the oral rabies vaccine in New Jersey: An evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pflugh, K.K.

    1995-12-01

    Communicating health risk information is a complicated task. Citizen reaction to such information is difficult to predict, which makes it hard to plan an appropriate response. Research indicates that the way citizens respond to risk information often depends on whether the risk is familiar or unfamiliar, whether it is seen as imposed on them, whether it is man-made or natural, or whether they have control over the risk. Potentially controversial cases that deal with delivering risk information have a special need for a well-planned communication effort. Natural resource issues with an impact on public health are no exception. In New Jersey, a proposal to release an experimental bioengineered oral rabies vaccine for raccoons to test the effectiveness of the vaccine in halting the spread of rabies into an as yet unaffected area met with widespread public support and approval due in large part to the use of a unique risk communication planning process. This paper will describe the risk communication planning process used to gain public support and approval for the release of the oral raccoon rabies vaccine while focusing on the evaluation component of the process. The seven-step process includes setting goals, profiling the issue or information gathering, audience identification and assessment, message development, method selection, implementation of the strategy, and evaluation and follow-up. The goal of the evaluation component was to determine the effectiveness of the public information campaign on citizens' knowledge of the field trial nearly three years after the initial announcement. In addition, it sought to learn citizen interest in maintaining the rabies-free barrier that was created by the field trial using funds from local taxes. This evaluation includes the results of a mailed survey to 280 citizens, local officials and professional organizations. Finally, this paper will discuss the implications for future outreach efforts dealing with complicated technical issues.

  2. Does Guiding Toward Task-Relevant Information Help Improve Graph Processing and Graph Comprehension of Individuals with Low or High Numeracy? An Eye-Tracker Experiment.

    PubMed

    Keller, Carmen; Junghans, Alex

    2017-11-01

    Individuals with low numeracy have difficulties with understanding complex graphs. Combining the information-processing approach to numeracy with graph comprehension and information-reduction theories, we examined whether high numerates' better comprehension might be explained by their closer attention to task-relevant graphical elements, from which they would expect numerical information to understand the graph. Furthermore, we investigated whether participants could be trained in improving their attention to task-relevant information and graph comprehension. In an eye-tracker experiment (N = 110) involving a sample from the general population, we presented participants with 2 hypothetical scenarios (stomach cancer, leukemia) showing survival curves for 2 treatments. In the training condition, participants received written instructions on how to read the graph. In the control condition, participants received another text. We tracked participants' eye movements while they answered 9 knowledge questions. The sum constituted graph comprehension. We analyzed visual attention to task-relevant graphical elements by using relative fixation durations and relative fixation counts. The mediation analysis revealed a significant (P < 0.05) indirect effect of numeracy on graph comprehension through visual attention to task-relevant information, which did not differ between the 2 conditions. Training had a significant main effect on visual attention (P < 0.05) but not on graph comprehension (P < 0.07). Individuals with high numeracy have better graph comprehension due to their greater attention to task-relevant graphical elements than individuals with low numeracy. With appropriate instructions, both groups can be trained to improve their graph-processing efficiency. Future research should examine (e.g., motivational) mediators between visual attention and graph comprehension to develop appropriate instructions that also result in higher graph comprehension.
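    A minimal sketch of how an indirect (mediation) effect like the one reported above can be estimated with a product-of-coefficients approach and a percentile bootstrap. The synthetic data, effect sizes, and estimator are illustrative assumptions; the study's actual analysis may have differed.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 110

# Synthetic stand-ins: numeracy (X), attention to task-relevant info (M), comprehension (Y)
numeracy = rng.normal(size=n)
attention = 0.5 * numeracy + rng.normal(scale=0.8, size=n)
comprehension = 0.4 * attention + 0.1 * numeracy + rng.normal(scale=0.8, size=n)

def indirect_effect(x, m, y):
    """a*b product: a from regressing M on X, b from regressing Y on X and M."""
    a = np.polyfit(x, m, 1)[0]
    design = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(design, y, rcond=None)[0][2]
    return a * b

# Percentile bootstrap confidence interval for the indirect effect
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(numeracy[idx], attention[idx], comprehension[idx]))
point = indirect_effect(numeracy, attention, comprehension)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {point:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```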

  3. Quantum state conversion in opto-electro-mechanical systems via shortcut to adiabaticity

    NASA Astrophysics Data System (ADS)

    Zhou, Xiao; Liu, Bao-Jie; Shao, L.-B.; Zhang, Xin-Ding; Xue, Zheng-Yuan

    2017-09-01

    Adiabatic processes have found many important applications in modern physics, the distinct merit of which is that accurate control over process timing is not required. However, such processes are slow, which limits their application in quantum computation, due to the limited coherence times of typical quantum systems. Here, we propose a scheme to implement quantum state conversion in opto-electro-mechanical systems via a shortcut to adiabaticity, where the process can be greatly sped up while precise timing control is still not necessary. In our scheme, by modifying only the coupling strength, we can achieve fast quantum state conversion with high fidelity, where the adiabatic condition does not need to be met. In addition, the population of the unwanted intermediate state can be further suppressed. Therefore, our protocol presents an important step towards practical state conversion between optical and microwave photons, and thus may find many important applications in hybrid quantum information processing.

  4. Expected Power-Utility Maximization Under Incomplete Information and with Cox-Process Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fujimoto, Kazufumi, E-mail: m_fuji@kvj.biglobe.ne.jp; Nagai, Hideo, E-mail: nagai@sigmath.es.osaka-u.ac.jp; Runggaldier, Wolfgang J., E-mail: runggal@math.unipd.it

    2013-02-15

    We consider the problem of maximization of expected terminal power utility (risk sensitive criterion). The underlying market model is a regime-switching diffusion model where the regime is determined by an unobservable factor process forming a finite state Markov process. The main novelty is due to the fact that prices are observed and the portfolio is rebalanced only at random times corresponding to a Cox process where the intensity is driven by the unobserved Markovian factor process as well. This leads to a more realistic modeling for many practical situations, like in markets with liquidity restrictions; on the other hand it considerably complicates the problem to the point that traditional methodologies cannot be directly applied. The approach presented here is specific to the power-utility. For log-utilities a different approach is presented in Fujimoto et al. (Preprint, 2012).

  5. Isothermal dehydration of thin films of water and sugar solutions

    NASA Astrophysics Data System (ADS)

    Heyd, R.; Rampino, A.; Bellich, B.; Elisei, E.; Cesàro, A.; Saboungi, M.-L.

    2014-03-01

    The process of quasi-isothermal dehydration of thin films of pure water and aqueous sugar solutions is investigated with a dual experimental and theoretical approach. A nanoporous paper disk with a homogeneous internal structure was used as a substrate. This experimental set-up makes it possible to gather thermodynamic data under well-defined conditions, develop a numerical model, and extract needed information about the dehydration process, in particular the water activity. It is found that the temperature evolution of the pure water film is not strictly isothermal during the drying process, possibly due to the influence of water diffusion through the cellulose web of the substrate. The role of sugar is clearly detectable and its influence on the dehydration process can be identified. At the end of the drying process, trehalose molecules slow down the diffusion of water molecules through the substrate in a more pronounced way than do the glucose molecules.

  6. Isothermal dehydration of thin films of water and sugar solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heyd, R.; Rampino, A. (Laboratory of Physical and Macromolecular Chemistry, University of Trieste, Via Giorgieri 1, 34127 Trieste)

    The process of quasi-isothermal dehydration of thin films of pure water and aqueous sugar solutions is investigated with a dual experimental and theoretical approach. A nanoporous paper disk with a homogeneous internal structure was used as a substrate. This experimental set-up makes it possible to gather thermodynamic data under well-defined conditions, develop a numerical model, and extract needed information about the dehydration process, in particular the water activity. It is found that the temperature evolution of the pure water film is not strictly isothermal during the drying process, possibly due to the influence of water diffusion through the cellulose web of the substrate. The role of sugar is clearly detectable and its influence on the dehydration process can be identified. At the end of the drying process, trehalose molecules slow down the diffusion of water molecules through the substrate in a more pronounced way than do the glucose molecules.

  7. Psychophysiology of dissociated consciousness.

    PubMed

    Bob, Petr

    2014-01-01

    Recent study of consciousness provides evidence that there is a limit of consciousness, which presents a barrier between conscious and unconscious processes. This barrier is likely manifested as a disturbance of the neural mechanisms of consciousness that, through distributed brain processing, attentional mechanisms and memory processes, enable integrative conscious experience to be constituted. According to recent findings, the level of conscious integration may change under certain conditions related to experimental cognitive manipulations, hypnosis, or stressful experiences that can lead to dissociation of consciousness. In psychopathological research, the term dissociation was proposed by Pierre Janet to explain processes related to the splitting of consciousness due to traumatic events or during hypnosis. According to several recent findings, dissociation of consciousness is likely related to deficits in the global distribution of information and may lead to heightened levels of "neural complexity", which reflects brain integration or differentiation based on the number of independent neural processes in the brain and may be specifically related to various mental disorders.

  8. California nearshore processes - ERTS 1. [coastal currents and sediments

    NASA Technical Reports Server (NTRS)

    Steller, D. D.; Pirie, D. M.

    1974-01-01

    Many nearshore processes are detectable from ERTS because of the suspended sediment present in the coastal waters. Viewing and analyzing the California coastal imagery collected during the last year and a half has made the overall current patterns and their changes evident. It is now possible to map monthly and seasonal changes that occur throughout the year. The original objectives of detecting currents, sediment transport, estuaries and river discharge have now been expanded to include the use of ERTS information in operational problems of the U.S. Army Corps of Engineers. This incorporates the detected nearshore features into the planning and organizing of shore protection facilities.

  9. Cophenetic correlation analysis as a strategy to select phylogenetically informative proteins: an example from the fungal kingdom

    PubMed Central

    Kuramae, Eiko E; Robert, Vincent; Echavarri-Erasun, Carlos; Boekhout, Teun

    2007-01-01

    Background The construction of robust and well-resolved phylogenetic trees is important for our understanding of many, if not all, biological processes, including speciation and origin of higher taxa, genome evolution, metabolic diversification, multicellularity, origin of lifestyles, pathogenicity and so on. Many older phylogenies were not well supported due to insufficient phylogenetic signal present in the single or few genes used in phylogenetic reconstructions. Importantly, single gene phylogenies were not always found to be congruent. The phylogenetic signal may, therefore, be increased by enlarging the number of genes included in phylogenetic studies. Unfortunately, concatenation of many genes does not take into consideration the evolutionary history of each individual gene. Here, we describe an approach to select informative phylogenetic proteins to be used in the Tree of Life (TOL) and barcoding projects by comparing the cophenetic correlation coefficients (CCC) among individual protein distance matrices, using the fungi as an example. The method demonstrated that the quality and number of concatenated proteins are important for a reliable estimation of TOL. Approximately 40–45 concatenated proteins seem to be needed to resolve the fungal TOL. Results In total 4852 orthologous proteins (KOGs) were assigned among 33 fungal genomes from the Asco- and Basidiomycota and 70 of these represented single-copy proteins. The individual protein distance matrices based on the 531 concatenated proteins that had been used for phylogeny reconstruction before [14] were compared with one another in order to select those with the highest CCC, which was then used as a reference. This reference distance matrix was compared with those of the 70 single-copy proteins selected and their CCC values were calculated. Sixty-four KOGs showed a CCC above 0.50 and these were further considered for their phylogenetic potential. Proteins belonging to the cellular processes and signaling KOG category seem more informative than those belonging to the other three categories: information storage and processing; metabolism; and the poorly characterized category. After concatenation of 40 proteins the topology of the phylogenetic tree remained stable, but after concatenation of 60 or more proteins the bootstrap support values of some branches decreased, most likely due to the inclusion of proteins with lower CCC values. The selection of protein sequences to be used in various TOL projects remains a critical and important process. The method described in this paper will contribute to a more objective selection of phylogenetically informative protein sequences. Conclusion This study provides candidate protein sequences to be considered as phylogenetic markers in different branches of fungal TOL. The selection procedure described here will be useful to select informative protein sequences to resolve branches of TOL that contain few or no species with completely sequenced genomes. The robust phylogenetic trees resulting from this method may contribute to our understanding of organismal diversification processes. The method proposed can be extended easily to other branches of TOL. PMID:17688684
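    A hedged sketch of the core measurement, under simplifying assumptions: for a protein's pairwise distance matrix, hierarchical clustering yields cophenetic distances, and the cophenetic correlation coefficient (CCC) quantifies how faithfully the tree preserves the original distances; distance matrices can also be correlated against a reference matrix to rank proteins. The random matrices, the average linkage, and the Pearson comparison are stand-ins, not the study's exact pipeline.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, cophenet
from scipy.spatial.distance import squareform
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n_taxa = 33  # e.g. 33 fungal genomes

def random_distance_matrix(n):
    """Stand-in for a protein's pairwise evolutionary distance matrix."""
    m = rng.random((n, n))
    m = (m + m.T) / 2.0
    np.fill_diagonal(m, 0.0)
    return m

# Condensed distances for a few hypothetical proteins plus a reference matrix
reference = squareform(random_distance_matrix(n_taxa))
proteins = {f"KOG{i:04d}": squareform(random_distance_matrix(n_taxa)) for i in range(1, 4)}

for name, dists in proteins.items():
    tree = linkage(dists, method="average")
    ccc, _ = cophenet(tree, dists)          # tree vs. its own input distances
    r, _ = pearsonr(dists, reference)       # agreement with the reference matrix
    print(f"{name}: CCC={ccc:.2f}  correlation with reference={r:.2f}")
```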

  10. Integrating SAR with Optical and Thermal Remote Sensing for Operational Near Real-Time Volcano Monitoring

    NASA Astrophysics Data System (ADS)

    Meyer, F. J.; Webley, P.; Dehn, J.; Arko, S. A.; McAlpin, D. B.

    2013-12-01

    Volcanic eruptions are among the most significant hazards to human society, capable of triggering natural disasters on regional to global scales. In the last decade, remote sensing techniques have become established in operational forecasting, monitoring, and managing of volcanic hazards. Monitoring organizations, like the Alaska Volcano Observatory (AVO), nowadays rely heavily on remote sensing data from a variety of optical and thermal sensors to provide time-critical hazard information. Despite the high utilization of these remote sensing data to detect and monitor volcanic eruptions, the presence of clouds and a dependence on solar illumination often limit their impact on decision-making processes. Synthetic Aperture Radar (SAR) systems are widely believed to be superior to optical sensors in operational monitoring situations, due to the weather and illumination independence of their observations and the sensitivity of SAR to surface changes and deformation. Despite these benefits, the contributions of SAR to operational volcano monitoring have been limited in the past due to (1) high SAR data costs, (2) traditionally long data processing times, and (3) the low temporal sampling frequencies inherent to most SAR systems. In this study, we present improved data access, data processing, and data integration techniques that mitigate some of the above-mentioned limitations and allow, for the first time, a meaningful integration of SAR into operational volcano monitoring systems. We will introduce a new database interface that was developed in cooperation with the Alaska Satellite Facility (ASF) and allows for rapid and seamless data access to all of ASF's SAR data holdings. We will also present processing techniques that improve the temporal frequency with which hazard-related products can be produced. These techniques take advantage of modern signal processing technology as well as new radiometric normalization schemes, both enabling the combination of multiple observation geometries in change detection procedures. Additionally, it will be shown how SAR-based hazard information can be integrated with data from optical satellites, thermal sensors, webcams and models to create near-real-time volcano hazard information. We will introduce a prototype monitoring system that integrates SAR-based hazard information into the near-real-time volcano hazard monitoring system of the Alaska Volcano Observatory. This prototype system was applied to historic eruptions of the volcanoes Okmok and Augustine, both located in the North Pacific. We will show that for these historic eruptions, the addition of SAR data led to a significant improvement in activity detection and eruption monitoring, and improved the accuracy and timeliness of eruption alerts.

  11. Economic valuation of environmental benefits from wastewater treatment processes: an empirical approach for Spain.

    PubMed

    Hernández-Sancho, Francesc; Molinos-Senante, María; Sala-Garrido, Ramón

    2010-01-15

    Economic research into the design and implementation of policies for the efficient management of water resources has been emphasized by the European Water Framework Directive (Directive 2000/60/EC). The efficient implementation of policies to prevent the degradation and depletion of water resources requires determining their value in social and economic terms and incorporating this information into the decision-making process. A process of wastewater treatment has many associated environmental benefits. However, these benefits are often not calculated because they are not set by the market, due to inadequate property rights, the presence of externalities, and the lack of perfect information. Nevertheless, the valuation of these benefits is necessary to justify a suitable investment policy and a limited number of studies exist on the subject of the economic valuation of environmental benefits. In this paper, we propose a methodology based on the estimation of shadow prices for the pollutants removed in a treatment process. This value represents the environmental benefit (avoided cost) associated with undischarged pollution. This is a pioneering approach to the economic valuation of wastewater treatment. The comparison of these benefits with the internal costs of the treatment process will provide a useful indicator for the feasibility of wastewater treatment projects. Copyright 2009 Elsevier B.V. All rights reserved.

  12. Examples of Sentinel-2A Mission Exploitation Results

    NASA Astrophysics Data System (ADS)

    Koetz, Benjamin; Hoersch, Bianca; Gascon, Ferran; Desnos, Yves-Louis; Seifert, Frank Martin; Paganini, Marc; Ramoino, Fabrizio; Arino, Olivier

    2017-04-01

    The Sentinel-2 Copernicus mission will bring a significant breakthrough in the exploitation of spaceborne optical data. Sentinel-2 time series will transform land cover, agriculture, forestry, in-land water and coastal EO applications from mapping to monitoring, from snapshot to time series data analysis, from image-based to pixel-based processing. The 5-day revisit time of the Sentinel-2 satellites, when both units are operated together, will usher in a new era for time series analysis at high spatial resolutions (HR) of 10-20 meters. Seasonal variations and processes in phenology and hydrology are examples of the many R&D areas to be studied. The mission's large swath and systematic acquisitions will further support unprecedented coverage at the national scale, addressing the information requirements of national to regional policies. Within ESA programs, such as the Data User Element (DUE), Scientific Exploitation of Operational Missions (SEOM) and Climate Change Initiative (CCI), several R&D activities are preparing the exploitation of the Sentinel-2 mission towards reliable measurements and monitoring of e.g. Essential Climate Variables and indicators for the Sustainable Development Goals. Early Sentinel-2 results will be presented related to a range of applications and scientific domains such as agricultural monitoring at national scale (DUE Sen2Agri), wetland extent and condition over African Ramsar sites (DUE GlobWetland-Africa), land cover mapping for climate change (CCI Land Cover), national land monitoring (Cadaster-Env), forest degradation (DUE ForMoSa), urban mapping (DUE EO4Urban), in-land water quality (DUE SPONGE), map of Mediterranean aquaculture (DUE SMART) and coral reef habitat mapping (SEOM S2-4Sci Coral). The above-mentioned activities are only a few examples from the very active international land imaging community building on the long-term Landsat and Spot heritage and knowledge.

  13. On the creation of a clinical gold standard corpus in Spanish: Mining adverse drug reactions.

    PubMed

    Oronoz, Maite; Gojenola, Koldo; Pérez, Alicia; de Ilarraza, Arantza Díaz; Casillas, Arantza

    2015-08-01

    The advances achieved in Natural Language Processing make it possible to automatically mine information from electronically created documents. Many Natural Language Processing methods that extract information from texts make use of annotated corpora, but these are scarce in the clinical domain due to legal and ethical issues. In this paper we present the creation of the IxaMed-GS gold standard composed of real electronic health records written in Spanish and manually annotated by experts in pharmacology and pharmacovigilance. The experts mainly annotated entities related to diseases and drugs, but also relationships between entities indicating adverse drug reaction events. To help the experts in the annotation task, we adapted a general corpus linguistic analyzer to the medical domain. The quality of the annotation process in the IxaMed-GS corpus has been assessed by measuring the inter-annotator agreement, which was 90.53% for entities and 82.86% for events. In addition, the corpus has been used for the automatic extraction of adverse drug reaction events using machine learning. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Emerging Patterns for Engineered Nanomaterials in the Environment: A Review of Fate and Toxicity Studies

    NASA Astrophysics Data System (ADS)

    Garner, K.; Keller, A. A.

    2014-12-01

    The technical complexity of measuring ENM fate and transport processes in all environments necessitates identifying trends in these same processes. As part of our research, we collected emerging information on the environmental fate and toxicity of many ENMs and investigated transport and transformation processes in air, water, and soil. Generally, studies suggest that (i) ENMs will have limited transport in the atmosphere, because they settle rapidly; (ii) ENMs are more stable in freshwater and stormwater than in seawater or groundwater, primarily due to variations in ionic strength and the presence of natural organic matter; and (iii) in soil, the fate of ENMs strongly depends on the size of the ENM aggregates and groundwater chemistry, as well as pore and soil particle size. Emerging patterns regarding ENM fate, transport, and exposure, combined with emerging information on toxicity, indicate that the risk is low for most ENMs, although comparing current exposure estimates with toxicity data suggests that, at current production and release levels, exposure to Ag, nZVI, and ZnO may cause a toxic response in freshwater and marine species.

  15. Support for Debugging Automatically Parallelized Programs

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Hood, Robert; Biegel, Bryan (Technical Monitor)

    2001-01-01

    We describe a system that simplifies the process of debugging programs produced by computer-aided parallelization tools. The system uses relative debugging techniques to compare serial and parallel executions in order to show where the computations begin to differ. If the original serial code is correct, errors due to parallelization will be isolated by the comparison. One of the primary goals of the system is to minimize the effort required of the user. To that end, the debugging system uses information produced by the parallelization tool to drive the comparison process. In particular the debugging system relies on the parallelization tool to provide information about where variables may have been modified and how arrays are distributed across multiple processes. User effort is also reduced through the use of dynamic instrumentation. This allows us to modify the program execution without changing the way the user builds the executable. The use of dynamic instrumentation also permits us to compare the executions in a fine-grained fashion and only involve the debugger when a difference has been detected. This reduces the overhead of executing instrumentation.
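    A toy illustration of the comparison idea described above (not the actual debugging system): the same reduction is computed serially and in a chunked, "parallel-style" ordering, intermediate values are recorded at matching points, and the first location where they diverge beyond a tolerance is reported. The prefix-sum example and tolerances are invented for illustration.

```python
import numpy as np

def serial_prefix_sums(data):
    """Reference computation: running sums in the original order."""
    return np.cumsum(data)

def chunked_prefix_sums(data, n_chunks=4):
    """'Parallelized' variant: per-chunk running sums combined afterwards.

    Floating-point reassociation makes the result differ slightly from the
    serial run, which is exactly the kind of difference a relative debugger
    must either tolerate or flag.
    """
    out, offset = [], 0.0
    for chunk in np.array_split(data, n_chunks):
        local = np.cumsum(chunk) + offset
        out.append(local)
        offset = local[-1]
    return np.concatenate(out)

def first_divergence(a, b, rtol=1e-9):
    """Index of the first element where the two executions disagree, or None."""
    bad = np.where(~np.isclose(a, b, rtol=rtol))[0]
    return int(bad[0]) if bad.size else None

data = np.random.default_rng(3).normal(size=10_000).astype(np.float32)
idx = first_divergence(serial_prefix_sums(data), chunked_prefix_sums(data))
print("first divergence at index:", idx)
```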

  16. [Use of MALDI-TOF in the rapid diagnosis of sepsis].

    PubMed

    Carlos Rodríguez, Juan; Ángel Bratos, Miguel; Merino, Esperanza; Ezpeleta, Carmen

    2016-06-01

    The introduction of mass spectrometry through MALDI-TOF (matrix-assisted laser desorption ionization time-of-flight) in the diagnosis of bacteraemia and fungaemia has represented a revolution due to the rapidity and reliability of the results that it can offer to microbiology services and laboratories through analysis of the mass spectrum of the bacterial protein directly from positive blood culture bottles. These data are more useful if they are used in conjunction with other techniques able to identify the antibiotic resistance pattern of the microorganism. There is a need for a process of standardising sample processing protocols and for perfecting the identification of the agents causing bacteraemia, especially in some species of Gram-positive cocci and in polymicrobial processes. The introduction of this methodology provides rapid information that is highly important for the clinical management of bacteraemia. The availability of a multidisciplinary working group that applies all this information quickly and correctly in hospitals will improve the quality of care, reduce antibiotic expenditure and hospital stay and help to control the serious problem of antibiotic resistance. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.

  17. Relative Debugging of Automatically Parallelized Programs

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Hood, Robert; Biegel, Bryan (Technical Monitor)

    2002-01-01

    We describe a system that simplifies the process of debugging programs produced by computer-aided parallelization tools. The system uses relative debugging techniques to compare serial and parallel executions in order to show where the computations begin to differ. If the original serial code is correct, errors due to parallelization will be isolated by the comparison. One of the primary goals of the system is to minimize the effort required of the user. To that end, the debugging system uses information produced by the parallelization tool to drive the comparison process. In particular, the debugging system relies on the parallelization tool to provide information about where variables may have been modified and how arrays are distributed across multiple processes. User effort is also reduced through the use of dynamic instrumentation. This allows us to modify the program execution without changing the way the user builds the executable. The use of dynamic instrumentation also permits us to compare the executions in a fine-grained fashion and only involve the debugger when a difference has been detected. This reduces the overhead of executing instrumentation.

  18. Who do you love, your mother or your horse? An event-related brain potential analysis of tone processing in Mandarin Chinese.

    PubMed

    Brown-Schmidt, Sarah; Canseco-Gonzalez, Enriqueta

    2004-03-01

    In Mandarin Chinese, word meaning is partially determined by lexical tone (Wang, 1973). Previous studies suggest that lexical tone is processed as linguistic information and not as pure tonal information (Gandour, 1998; Van Lanker & Fromkin, 1973). The current study explored the online processing of lexical tones. Event-related potentials were obtained from 25 Mandarin speakers while they listened to normal and anomalous sentences containing one of three types of semantic anomalies created by manipulating the tone, the syllable, or both tone and syllable (double-anomaly) of sentence-final words. We hypothesized that all three types of anomalies would elicit N400 effects, with the largest elicited by the double-anomaly. As expected, all three elicited N400 effects starting approximately 150 ms poststimulus and continuing until 1000 ms in some areas. Surprisingly, the onset of the double-anomaly effect was approximately 50 ms later than the rest. Delayed detection of errors in this condition may be responsible for the apparent delay. Slight differences between syllable and tone conditions may be due to the relative timing of these acoustic cues.

  19. Quantitative Aspects of Single Molecule Microscopy

    PubMed Central

    Ober, Raimund J.; Tahmasbi, Amir; Ram, Sripad; Lin, Zhiping; Ward, E. Sally

    2015-01-01

    Single molecule microscopy is a relatively new optical microscopy technique that allows the detection of individual molecules such as proteins in a cellular context. This technique has generated significant interest among biologists, biophysicists and biochemists, as it holds the promise to provide novel insights into subcellular processes and structures that otherwise cannot be gained through traditional experimental approaches. Single molecule experiments place stringent demands on experimental and algorithmic tools due to the low signal levels and the presence of significant extraneous noise sources. Consequently, this has necessitated the use of advanced statistical signal and image processing techniques for the design and analysis of single molecule experiments. In this tutorial paper, we provide an overview of single molecule microscopy from early works to current applications and challenges. Specific emphasis will be on the quantitative aspects of this imaging modality, in particular single molecule localization and resolvability, which will be discussed from an information theoretic perspective. We review the stochastic framework for image formation, different types of estimation techniques and expressions for the Fisher information matrix. We also discuss several open problems in the field that demand highly non-trivial signal processing algorithms. PMID:26167102
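    A small numerical sketch of the information-theoretic view mentioned above, under simplifying assumptions: for a one-dimensional Gaussian PSF sampled on pixels with Poisson noise, the Fisher information for the source position gives the Cramér-Rao lower bound on localization precision, which shrinks roughly as the PSF width divided by the square root of the detected photon count. Pixel size, PSF width, and photon numbers are illustrative values, and background noise is ignored.

```python
import numpy as np
from scipy.stats import norm

def fisher_information_1d(x0, sigma_psf, n_photons, pixel_edges):
    """Fisher information for the position x0 of a 1D Gaussian PSF.

    Pixel counts are Poisson with mean mu_k = N * p_k(x0), where p_k is the PSF
    mass falling into pixel k, so I(x0) = sum_k (d mu_k / d x0)^2 / mu_k.
    """
    cdf = norm.cdf(pixel_edges, loc=x0, scale=sigma_psf)
    mu = n_photons * np.diff(cdf)                  # expected photons per pixel
    pdf = norm.pdf(pixel_edges, loc=x0, scale=sigma_psf)
    dmu = n_photons * (pdf[:-1] - pdf[1:])         # derivative of mu_k w.r.t. x0
    mask = mu > 0
    return np.sum(dmu[mask] ** 2 / mu[mask])

edges = np.arange(-500.0, 501.0, 100.0)            # 100 nm pixels over a 1 um window
for photons in (100, 1000, 10000):
    info = fisher_information_1d(0.0, sigma_psf=130.0, n_photons=photons, pixel_edges=edges)
    print(f"N={photons:6d}  CRLB on x0 ~ {1.0 / np.sqrt(info):5.1f} nm")
```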

  20. Quantum information processing by weaving quantum Talbot carpets

    NASA Astrophysics Data System (ADS)

    Farías, Osvaldo Jiménez; de Melo, Fernando; Milman, Pérola; Walborn, Stephen P.

    2015-06-01

    Single-photon interference due to passage through a periodic grating is considered in a novel proposal for processing D-dimensional quantum systems (quDits) encoded in the spatial degrees of freedom of light. We show that free-space propagation naturally implements basic single-quDit gates by means of the Talbot effect: an intricate time-space carpet of light in the near-field diffraction regime. By adding a diagonal phase gate, we show that a complete set of single-quDit gates can be implemented. We then introduce a spatially dependent beam splitter that allows for projective measurements in the computational basis and can be used for the implementation of controlled operations between two quDits. Universal quantum information processing can then be implemented with linear optics and ancilla photons via postselection and feed-forward, following the original proposal of Knill, Laflamme, and Milburn. Although we consider photons, our scheme should be directly applicable to a number of other physical systems. Interpretation of the Talbot effect as a quantum logic operation provides a beautiful and interesting way to visualize quantum computation through wave propagation and interference.
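    A brief numerical sketch of the underlying optics, with illustrative parameters: a periodic amplitude grating is propagated with the paraxial angular-spectrum method, and the intensity pattern revives at the Talbot distance z_T = 2d^2 / lambda, producing the near-field "carpet" referred to above. The grating period, wavelength, and sampling below are assumptions made for the example only.

```python
import numpy as np

lam = 500e-9                 # wavelength (m)
d = 50e-6                    # grating period (m)
z_talbot = 2 * d**2 / lam    # Talbot self-imaging distance

n = 2048
width = 20 * d
x = np.linspace(-width / 2, width / 2, n, endpoint=False)
dx = x[1] - x[0]
fx = np.fft.fftfreq(n, d=dx)

# Binary amplitude grating with 50% duty cycle
field0 = (np.mod(x, d) < d / 2).astype(complex)

def propagate(field, z):
    """Paraxial (Fresnel) angular-spectrum propagation over distance z."""
    transfer = np.exp(-1j * np.pi * lam * z * fx**2)
    return np.fft.ifft(np.fft.fft(field) * transfer)

# Intensity slices at fractions of z_T stacked row by row form the Talbot carpet
carpet = np.array([np.abs(propagate(field0, f * z_talbot))**2
                   for f in np.linspace(0.0, 1.0, 200)])

# Self-imaging check: the intensity at z_T closely resembles the grating itself
revival = np.abs(propagate(field0, z_talbot))**2
print("carpet shape:", carpet.shape)
print("correlation with input intensity at z_T:",
      round(float(np.corrcoef(revival, np.abs(field0)**2)[0, 1]), 3))
```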
