Tombaugh, Tom N; Rees, Laura; Stormer, Peter; Harrison, Allyson G; Smith, Andra
2007-01-01
Although reaction time (RT) measures are sensitive to the effects of traumatic brain injury (TBI), few RT procedures have been developed for use in standard clinical evaluations. The computerized test of information processing (CTIP) [Tombaugh, T. N., & Rees, L. (2000). Manual for the computerized tests of information processing (CTIP). Ottawa, Ont.: Carleton University] was designed to measure the degree to which TBI decreases the speed at which information is processed. The CTIP consists of three computerized programs that progressively increase the amount of information that is processed. Results of the current study demonstrated that RT increased as the difficulty of the CTIP tests increased (known as the complexity effect), and as severity of injury increased (from mild to severe TBI). The current study also demonstrated the importance of selecting a non-biased measure of variability. Overall, findings suggest that the CTIP is an easy-to-administer and sensitive measure of information processing speed.
Integrated Information Increases with Fitness in the Evolution of Animats
Edlund, Jeffrey A.; Chaumont, Nicolas; Hintze, Arend; Koch, Christof; Tononi, Giulio; Adami, Christoph
2011-01-01
One of the hallmarks of biological organisms is their ability to integrate disparate information sources to optimize their behavior in complex environments. How this capability can be quantified and related to the functional complexity of an organism remains a challenging problem, in particular since organismal functional complexity is not well-defined. We present here several candidate measures that quantify information and integration, and study their dependence on fitness as an artificial agent (“animat”) evolves over thousands of generations to solve a navigation task in a simple, simulated environment. We compare the ability of these measures to predict high fitness with more conventional information-theoretic processing measures. As the animat adapts by increasing its “fit” to the world, information integration and processing increase commensurately along the evolutionary line of descent. We suggest that the correlation of fitness with information integration and with processing measures implies that high fitness requires both information processing and integration, but that information integration may be a better measure when the task requires memory. A correlation of measures of information integration (but also information processing) and fitness strongly suggests that these measures reflect the functional complexity of the animat, and that such measures can be used to quantify functional complexity even in the absence of fitness data. PMID:22028639
Some Information-Processing Correlates of Measures of Intelligence
ERIC Educational Resources Information Center
Lunneborg, Clifford E.
1978-01-01
Group and individually administered measures of intelligence were related to laboratory-based measures of human information processing in a group of college freshmen. Among other results, high IQ was related to right-hemisphere efficiency in processing non-linguistic stimuli. (Author/JKS)
Electrocortical measures of information processing biases in social anxiety disorder: A review.
Harrewijn, Anita; Schmidt, Louis A; Westenberg, P Michiel; Tang, Alva; van der Molen, Melle J W
2017-10-01
Social anxiety disorder (SAD) is characterized by information processing biases; however, their underlying neural mechanisms remain poorly understood. The goal of this review was to give a comprehensive overview of the most frequently studied EEG spectral and event-related potential (ERP) measures in social anxiety during rest, anticipation, stimulus processing, and recovery. A Web of Science search yielded 35 studies reporting on electrocortical measures in individuals with social anxiety or related constructs. Social anxiety was related to increased delta-beta cross-frequency correlation during anticipation and recovery, and information processing biases during early processing of faces (P1) and errors (error-related negativity). These electrocortical measures are discussed in relation to the persistent cycle of information processing biases maintaining SAD. Future research should further investigate the mechanisms of this persistent cycle and study the utility of electrocortical measures in early detection, prevention, treatment and endophenotype research. Copyright © 2017 Elsevier B.V. All rights reserved.
Vagos, Paula; Rijo, Daniel; Santos, Isabel M
2016-04-01
Relatively little is known about measures used to investigate the validity and applications of social information processing theory. The Scenes for Social Information Processing in Adolescence includes items built using a participatory approach to evaluate the attribution of intent, emotion intensity, response evaluation, and response decision steps of social information processing. We evaluated a sample of 802 Portuguese adolescents (61.5% female; mean age = 16.44 years old) using this instrument. Item analysis and exploratory and confirmatory factor analytic procedures were used for psychometric examination. Two measures for attribution of intent were produced, including hostile and neutral; along with 3 emotion measures, focused on negative emotional states; 8 response evaluation measures; and 4 response decision measures, including prosocial and impaired social behavior. All of these measures achieved good internal consistency values and fit indicators. Boys seemed to favor and choose overt and relational aggression behaviors more often; girls conveyed higher levels of neutral attribution, sadness, and assertiveness and passiveness. The Scenes for Social Information Processing in Adolescence achieved adequate psychometric results and seems a valuable alternative for evaluating social information processing, even if it is essential to continue investigation into its internal and external validity. (c) 2016 APA, all rights reserved.
Information Theory Broadens the Spectrum of Molecular Ecology and Evolution.
Sherwin, W B; Chao, A; Jost, L; Smouse, P E
2017-12-01
Information or entropy analysis of diversity is used extensively in community ecology, and has recently been exploited for prediction and analysis in molecular ecology and evolution. Information measures belong to a spectrum (or q profile) of measures whose contrasting properties provide a rich summary of diversity, including allelic richness (q=0), Shannon information (q=1), and heterozygosity (q=2). We present the merits of information measures for describing and forecasting molecular variation within and among groups, comparing forecasts with data, and evaluating underlying processes such as dispersal. Importantly, information measures directly link causal processes and divergence outcomes, have straightforward relationship to allele frequency differences (including monotonicity that q=2 lacks), and show additivity across hierarchical layers such as ecology, behaviour, cellular processes, and nongenetic inheritance. Copyright © 2017 Elsevier Ltd. All rights reserved.
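The q profile described in this abstract has a standard closed form, the Hill numbers: D_q = (sum_i p_i^q)^(1/(1-q)), with the q = 1 case taken as the limit exp(H), where H is Shannon entropy. A minimal sketch, assuming hypothetical allele frequencies (not data from the paper):

```python
import math

def hill_number(p, q):
    """Diversity of order q (Hill number) for frequencies p summing to 1."""
    if q == 1:
        # Limit case: exponential of Shannon entropy.
        return math.exp(-sum(pi * math.log(pi) for pi in p if pi > 0))
    return sum(pi ** q for pi in p if pi > 0) ** (1.0 / (1.0 - q))

freqs = [0.5, 0.25, 0.25]            # hypothetical allele frequencies
richness    = hill_number(freqs, 0)  # allelic richness: number of alleles present
shannon_div = hill_number(freqs, 1)  # exponential of Shannon information
simpson_div = hill_number(freqs, 2)  # inverse Simpson; related to heterozygosity
```

For these frequencies the three points of the profile are 3.0, about 2.83, and about 2.67, illustrating how the profile declines as q weights common alleles more heavily.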
Polcari, J.
2013-08-16
The signal processing concept of signal-to-noise ratio (SNR), in its role as a performance measure, is recast within the more general context of information theory, leading to a series of useful insights. Establishing generalized SNR (GSNR) as a rigorous information theoretic measure inherent in any set of observations significantly strengthens its quantitative performance pedigree while simultaneously providing a specific definition under general conditions. This directly leads to consideration of the log likelihood ratio (LLR): first, as the simplest possible information-preserving transformation (i.e., signal processing algorithm) and subsequently, as an absolute, comparable measure of information for any specific observation exemplar. Furthermore, the information accounting methodology that results permits practical use of both GSNR and LLR as diagnostic scalar performance measurements, directly comparable across alternative system/algorithm designs, applicable at any tap point within any processing string, in a form that is also comparable with the inherent performance bounds due to information conservation.
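For the textbook Gaussian mean-shift detection problem, the LLR discussed above has a simple closed form, and its expected value under each hypothesis is a Kullback-Leibler divergence, which is one way to see it as an information measure. A hedged sketch; the signal model and parameters are illustrative, not taken from the report:

```python
def llr_gaussian(x, mu, sigma):
    """Log likelihood ratio for H1: N(mu, sigma^2) vs H0: N(0, sigma^2)."""
    return mu * x / sigma**2 - mu**2 / (2 * sigma**2)

mu, sigma = 2.0, 1.0
# Expected LLR when H1 is true equals KL(H1 || H0) = mu^2 / (2 sigma^2);
# when H0 is true it is -KL(H0 || H1). The spread between the two means
# reflects the classical SNR mu^2 / sigma^2.
llr_at_signal = llr_gaussian(mu, mu, sigma)   # 2.0
llr_at_noise  = llr_gaussian(0.0, mu, sigma)  # -2.0
```

The point of the sketch is that the LLR is itself a scalar observable whose statistics carry the detection-theoretic information, which is the role the abstract assigns to it.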
Nott, Melissa T; Chapparo, Christine
2008-09-01
Agitation following traumatic brain injury is characterised by a heightened state of activity with disorganised information processing that interferes with learning and achieving functional goals. This study aimed to identify information processing problems during task performance of a severely agitated adult using the Perceive, Recall, Plan and Perform (PRPP) System of Task Analysis. Second, this study aimed to examine the sensitivity of the PRPP System to changes in task performance over a short period of rehabilitation, and third, to evaluate the guidance provided by the PRPP in directing intervention. A case study research design was employed. The PRPP System of Task Analysis was used to assess changes in task embedded information processing capacity during occupational therapy intervention with a severely agitated adult in a rehabilitation context. Performance was assessed on three selected tasks over a one-month period. Information processing difficulties during task performance can be clearly identified when observing a severely agitated adult following a traumatic brain injury. Processing skills involving attention, sensory processing and planning were most affected at this stage of rehabilitation. These processing difficulties are linked to established descriptions of agitated behaviour. Fluctuations in performance across three tasks of differing processing complexity were evident, leading to hypothesised relationships between task complexity, environment and novelty with information processing errors. Changes in specific information processing capacity over time were evident based on repeated measures using the PRPP System of Task Analysis. This lends preliminary support for its utility as an outcome measure, and raises hypotheses about the type of therapy required to enhance information processing in people with severe agitation.
The PRPP System is sensitive to information processing changes in severely agitated adults when used to reassess performance over short intervals and can provide direct guidance to occupational therapy intervention to improve task embedded information processing by categorising errors under four stages of an information processing model: Perceive, Recall, Plan and Perform.
Balthazor, M J; Wagner, R K; Pelham, W E
1991-02-01
There appear to be beneficial effects of stimulant medication on daily classroom measures of cognitive functioning for Attention Deficit Disorder (ADD) children, but the specificity and origin of such effects are unclear. Consistent with previous results, 0.3 mg/kg methylphenidate improved ADD children's performance on a classroom reading comprehension measure. Using the Posner letter-matching task and four additional measures of phonological processing, we attempted to isolate the effects of methylphenidate to parameter estimates of (a) selective attention, (b) the basic cognitive process of retrieving name codes from permanent memory, and (c) a constant term that represented nonspecific aspects of information processing. Responses to the letter-matching stimuli were faster and more accurate with medication compared to placebo. The improvement in performance was isolated to the parameter estimate that reflected nonspecific aspects of information processing. A lack of medication effect on the other measures of phonological processing supported the Posner task findings in indicating that methylphenidate appears to exert beneficial effects on academic processing through general rather than specific aspects of information processing.
Brain white matter structure and information processing speed in healthy older age.
Kuznetsova, Ksenia A; Maniega, Susana Muñoz; Ritchie, Stuart J; Cox, Simon R; Storkey, Amos J; Starr, John M; Wardlaw, Joanna M; Deary, Ian J; Bastin, Mark E
2016-07-01
Cognitive decline, especially the slowing of information processing speed, is associated with normal ageing. This decline may be due to brain cortico-cortical disconnection caused by age-related white matter deterioration. We present results from a large, narrow age range cohort of generally healthy, community-dwelling subjects in their seventies who also had their cognitive ability tested in youth (age 11 years). We investigate associations between older age brain white matter structure, several measures of information processing speed and childhood cognitive ability in 581 subjects. Analysis of diffusion tensor MRI data using Tract-based Spatial Statistics (TBSS) showed that all measures of information processing speed, as well as a general speed factor composed from these tests (g speed), were significantly associated with fractional anisotropy (FA) across the white matter skeleton rather than in specific tracts. Cognitive ability measured at age 11 years was not associated with older age white matter FA, except for the g speed-independent components of several individual processing speed tests. These results indicate that quicker and more efficient information processing requires global connectivity in older age, and that associations between white matter FA and information processing speed (both individual test scores and g speed), unlike some other aspects of later life brain structure, are generally not accounted for by cognitive ability measured in youth.
Quantum Information Theory of Measurement
NASA Astrophysics Data System (ADS)
Glick, Jennifer Ranae
Quantum measurement lies at the heart of quantum information processing and is one of the criteria for quantum computation. Despite its central role, there remains a need for a robust quantum information-theoretical description of measurement. In this work, I will quantify how information is processed in a quantum measurement by framing it in quantum information-theoretic terms. I will consider a diverse set of measurement scenarios, including weak and strong measurements, and parallel and consecutive measurements. In each case, I will perform a comprehensive analysis of the role of entanglement and entropy in the measurement process and track the flow of information through all subsystems. In particular, I will discuss how weak and strong measurements are fundamentally of the same nature and show that weak values can be computed exactly for certain measurements with an arbitrary interaction strength. In the context of the Bell-state quantum eraser, I will derive a trade-off between the coherence and "which-path" information of an entangled pair of photons and show that a quantum information-theoretic approach yields additional insights into the origins of complementarity. I will consider two types of quantum measurements: those that are made within a closed system where every part of the measurement device, the ancilla, remains under control (what I will call unamplified measurements), and those performed within an open system where some degrees of freedom are traced over (amplified measurements). For sequences of measurements of the same quantum system, I will show that information about the quantum state is encoded in the measurement chain and that some of this information is "lost" when the measurements are amplified: the ancillae become equivalent to a quantum Markov chain.
Finally, using the coherent structure of unamplified measurements, I will outline a protocol for generating remote entanglement, an essential resource for quantum teleportation and quantum cryptographic tasks.
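The coherence/"which-path" trade-off mentioned in this abstract has a well-known single-particle analogue, the Englert-Greenberger-Yasin duality relation V^2 + P^2 = 1 for pure states (the dissertation's Bell-state version is more general). A minimal sketch of that single-particle relation, not the author's derivation:

```python
import math

def visibility_and_predictability(theta):
    """Path qubit cos(theta)|a> + sin(theta)|b>: returns (V, P), the
    interference fringe visibility (coherence) and the which-path
    predictability. For a pure state, V**2 + P**2 == 1."""
    amp_a, amp_b = math.cos(theta), math.sin(theta)
    V = 2 * abs(amp_a * amp_b)      # coherence (fringe visibility)
    P = abs(amp_a**2 - amp_b**2)    # which-path information
    return V, P

# Balanced paths: full coherence, no which-path information.
V, P = visibility_and_predictability(math.pi / 4)   # V ~ 1, P ~ 0
```

Tilting the superposition toward one path trades visibility for path knowledge while keeping V^2 + P^2 pinned at 1, which is the complementarity the abstract generalizes to entangled photon pairs.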
Direct Thermodynamic Measurements of the Energetics of Information Processing
2017-08-08
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... what types of quality measures should a combination of natural language processing and structured data... collection, analysis, processing, and its ability to facilitate information exchange among and across care...
ERIC Educational Resources Information Center
Woodell, Eric A.
2013-01-01
Information Technology (IT) professionals use the Information Technology Infrastructure Library (ITIL) process to better manage their business operations, measure performance, improve reliability and lower costs. This study examined the operational results of those data centers using ITIL against those that do not, and whether the results change…
Cepeda, Nicholas J.; Blackwell, Katharine A.; Munakata, Yuko
2012-01-01
The rate at which people process information appears to influence many aspects of cognition across the lifespan. However, many commonly accepted measures of “processing speed” may require goal maintenance, manipulation of information in working memory, and decision-making, blurring the distinction between processing speed and executive control and resulting in overestimation of processing-speed contributions to cognition. This concern may apply particularly to studies of developmental change, as even seemingly simple processing speed measures may require executive processes to keep children and older adults on task. We report two new studies and a re-analysis of a published study, testing predictions about how different processing speed measures influence conclusions about executive control across the life span. We find that the choice of processing speed measure affects the relationship observed between processing speed and executive control, in a manner that changes with age, and that choice of processing speed measure affects conclusions about development and the relationship among executive control measures. Implications for understanding processing speed, executive control, and their development are discussed. PMID:23432836
Black, Stephanie Winkeljohn; Pössel, Patrick
2013-08-01
Adolescents who develop depression have worse interpersonal and affective experiences and are more likely to develop substance problems and/or suicidal ideation compared to adolescents who do not develop depression. This study examined the combined effects of negative self-referent information processing and rumination (i.e., brooding and reflection) on adolescent depressive symptoms. It was hypothesized that the interaction of negative self-referent information processing and brooding would significantly predict depressive symptoms, while the interaction of negative self-referent information processing and reflection would not predict depressive symptoms. Adolescents (n = 92; 13-15 years; 34.7% female) participated in a 6-month longitudinal study. Self-report instruments measured depressive symptoms and rumination; a cognitive task measured information processing. Data were analyzed with path models in Amos 19.0. The interaction of negative information processing and brooding significantly predicted an increase in depressive symptoms 6 months later. The interaction of negative information processing and reflection did not significantly predict depression; however, the model did not meet a priori standards to accept the null hypothesis. Results suggest clinicians working with adolescents at risk for depression should consider focusing on the reduction of brooding and negative information processing to reduce long-term depressive symptoms.
Multiscale analysis of information dynamics for linear multivariate processes.
Faes, Luca; Montalto, Alessandro; Stramaglia, Sebastiano; Nollo, Giandomenico; Marinazzo, Daniele
2016-08-01
In the study of complex physical and physiological systems represented by multivariate time series, an issue of great interest is the description of the system dynamics over a range of different temporal scales. While information-theoretic approaches to the multiscale analysis of complex dynamics are being increasingly used, the theoretical properties of the applied measures are poorly understood. This study introduces for the first time a framework for the analytical computation of information dynamics for linear multivariate stochastic processes explored at different time scales. After showing that the multiscale processing of a vector autoregressive (VAR) process introduces a moving average (MA) component, we describe how to represent the resulting VARMA process using state-space (SS) models and how to exploit the SS model parameters to compute analytical measures of information storage and information transfer for the original and rescaled processes. The framework is then used to quantify multiscale information dynamics for simulated unidirectionally and bidirectionally coupled VAR processes, showing that rescaling may lead to insightful patterns of information storage and transfer but also to potentially misleading behaviors.
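In the simplest linear case, a univariate Gaussian AR(1) process, the information storage that this framework generalizes has the closed form S = -0.5 ln(1 - a^2). A minimal sketch, assuming an illustrative coefficient and sample size (not the paper's simulations), recovering S from the ratio of process variance to innovation variance:

```python
import math
import random

random.seed(42)

a, n = 0.8, 100_000
analytic_storage = -0.5 * math.log(1 - a * a)   # about 0.511 nats

# Simulate x_t = a * x_{t-1} + e_t with unit-variance Gaussian innovations.
x, xs = 0.0, []
for _ in range(n):
    x = a * x + random.gauss(0.0, 1.0)
    xs.append(x)

# Storage for a Gaussian process is 0.5 * ln(var(x) / var(e)):
# how much knowing the past reduces uncertainty about the present.
mean_x = sum(xs) / n
var_x = sum((v - mean_x) ** 2 for v in xs) / n

# Innovation variance estimated from lag-1 regression residuals.
a_hat = sum(xs[i] * xs[i - 1] for i in range(1, n)) / sum(v * v for v in xs[:-1])
var_e = sum((xs[i] - a_hat * xs[i - 1]) ** 2 for i in range(1, n)) / (n - 1)

estimated_storage = 0.5 * math.log(var_x / var_e)
```

The paper's contribution is the analytical multivariate, multiscale generalization of this quantity via state-space models; the sketch only illustrates the underlying variance-ratio logic.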
Adhikari, Mohit H; Hacker, Carl D; Siegel, Josh S; Griffa, Alessandra; Hagmann, Patric; Deco, Gustavo; Corbetta, Maurizio
2017-04-01
While several studies have shown that focal lesions affect the communication between structurally normal regions of the brain, and that these changes may correlate with behavioural deficits, their impact on brain's information processing capacity is currently unknown. Here we test the hypothesis that focal lesions decrease the brain's information processing capacity, of which changes in functional connectivity may be a measurable correlate. To measure processing capacity, we turned to whole brain computational modelling to estimate the integration and segregation of information in brain networks. First, we measured functional connectivity between different brain areas with resting state functional magnetic resonance imaging in healthy subjects (n = 26), and subjects who had suffered a cortical stroke (n = 36). We then used a whole-brain network model that coupled average excitatory activities of local regions via anatomical connectivity. Model parameters were optimized in each healthy or stroke participant to maximize correlation between model and empirical functional connectivity, so that the model's effective connectivity was a veridical representation of healthy or lesioned brain networks. Subsequently, we calculated two model-based measures: 'integration', a graph theoretical measure obtained from functional connectivity, which measures the connectedness of brain networks, and 'information capacity', an information theoretical measure that cannot be obtained empirically, representative of the segregative ability of brain networks to encode distinct stimuli. We found that both measures were decreased in stroke patients, as compared to healthy controls, particularly at the level of resting-state networks. Furthermore, we found that these measures, especially information capacity, correlate with measures of behavioural impairment and the segregation of resting-state networks empirically measured. 
This study shows that focal lesions affect the brain's ability to represent stimuli and task states, and that information capacity measured through whole brain models is a theory-driven measure of processing capacity that could be used as a biomarker of injury for outcome prediction or target for rehabilitation intervention. © The Author (2017). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Measurement of operator workload in an information processing task
NASA Technical Reports Server (NTRS)
Jenney, L. L.; Older, H. J.; Cameron, B. J.
1972-01-01
This was an experimental study to develop an improved methodology for measuring workload in an information processing task and to assess the effects of shift length and communication density (rate of information flow) on the ability to process and classify verbal messages. Each of twelve subjects was exposed to combinations of three shift lengths and two communication densities in a counterbalanced, repeated measurements experimental design. Results indicated no systematic variation in task performance measures or in other dependent measures as a function of shift length or communication density. This is attributed to the absence of a secondary loading task, an insufficiently taxing work schedule, and the lack of psychological stress. Subjective magnitude estimates of workload showed fatigue (and to a lesser degree, tension) to be a power function of shift length. Estimates of task difficulty and fatigue were initially lower but increased more sharply over time under low density than under high density conditions. An interpretation of findings and recommendations for future research are included. This research has major implications for human workload problems in information processing of air traffic control verbal data.
Operator Performance Measures for Assessing Voice Communication Effectiveness
1989-07-01
Performance and workload assessment techniques have been based on models such as Broadbent's (1958) limited-capacity filter model of human information processing. Topics covered include auditory information processing (auditory attention, auditory memory) and models of information processing (capacity theories), with applications to learning, attention, language specialization, decision making, and problem solving.
Comparative Effects of Antihistamines on Aircrew Mission Effectiveness under Sustained Operations
1992-06-01
measures consist mainly of process measures. Process measures are measures of activities used to accomplish the mission and produce the final results... They include task completion times and response variability, and information processing rates as they relate to unique task assignment. Performance... contains process measures that assess the individual contributions of hardware/software and human components to overall system performance. Measures
NASA Astrophysics Data System (ADS)
Vandenbroucke, D.; Vancauwenberghe, G.
2016-12-01
The European Union Location Framework (EULF), as part of the Interoperable Solutions for European Public Administrations (ISA) Programme of the EU (EC DG DIGIT), aims to enhance the interactions between governments, businesses and citizens by embedding location information into e-Government processes. The challenge remains to find scientifically sound and, at the same time, practicable approaches to estimate or measure the impact of location enablement of e-Government processes on the performance of the processes. A method has been defined to estimate process performance in terms of variables describing the efficiency, effectiveness, as well as the quality of the output of the work processes. A series of use cases have been identified, corresponding to existing e-Government work processes in which location information could bring added value. In a first step, the processes are described by means of BPMN (Business Process Model and Notation) to better understand the process steps, the actors involved, the spatial data flows, as well as the required input and the generated output. In a second step the processes are assessed in terms of the (sub-optimal) use of location information and the potential enhancement of the process by better integrating location information and services. The process performance is measured ex ante (before using location enabled e-Government services) and ex post (after the integration of such services) in order to estimate and measure the impact of location information. The paper describes the method for performance measurement and highlights how the method is applied to one use case, i.e. the process of traffic safety monitoring. The use case is analysed and assessed in terms of location enablement and its potential impact on process performance.
The results of applying the methodology on the use case revealed that performance is highly impacted by factors such as the way location information is collected, managed and shared throughout the process, and the degree to which spatial data are harmonized. The work also led to the formulation of some recommendations to enrich the BPMN standard with additional methods for annotating processes, and to a proposal to develop tools for automated process performance measurement. In that context some planned future work is highlighted as well.
Social information processing in children: specific relations to anxiety, depression, and affect.
Luebbe, Aaron M; Bell, Debora J; Allwood, Maureen A; Swenson, Lance P; Early, Martha C
2010-01-01
Two studies examined shared and unique relations of social information processing (SIP) to youth's anxious and depressive symptoms. Whether SIP added unique variance over and above trait affect in predicting internalizing symptoms was also examined. In Study 1, 215 youth (ages 8-13) completed symptom measures of anxiety and depression and a vignette-based interview measure of SIP. Anxiety and depression were each related to a more negative information-processing style. Only depression was uniquely related to a less positive information processing style. In Study 2, 127 youth (ages 10-13) completed measures of anxiety, depression, SIP, and trait affect. SIP's relations to internalizing symptoms were replicated. Over and above negative affect, negative SIP predicted both anxiety and depression. Low positive SIP added variance over and above positive affect in predicting only depression. Finally, SIP functioning partially mediated the relations of affect to internalizing symptoms.
Information processing efficiency in patients with multiple sclerosis.
Archibald, C J; Fisk, J D
2000-10-01
Reduced information processing efficiency, consequent to impaired neural transmission, has been proposed as underlying various cognitive problems in patients with Multiple Sclerosis (MS). This study employed two measures developed from experimental psychology that control for the potential confound of perceptual-motor abnormalities (Salthouse, Babcock, & Shaw, 1991; Sternberg, 1966, 1969) to assess the speed of information processing and working memory capacity in patients with mild to moderate MS. Although patients had significantly more cognitive complaints than neurologically intact matched controls, their performance on standard tests of immediate memory span did not differ from control participants and their word list learning was within normal limits. On the experimental measures, both relapsing-remitting and secondary-progressive patients exhibited significantly slowed information processing speed relative to controls. However, only the secondary-progressive patients had an additional decrement in working memory capacity. Depression, fatigue, or neurologic disability did not account for performance differences on these measures. While speed of information processing may be slowed early in the disease process, deficits in working memory capacity may appear only as there is progression of MS. It is these latter deficits, however, that may underlie the impairment of new learning that patients with MS demonstrate.
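The Sternberg (1966) procedure cited above infers processing speed from the slope of reaction time against memory set size. A hedged sketch of that fit using ordinary least squares; the RT values are made up for illustration, not the study's data:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = intercept + slope * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, slope

set_sizes = [1, 2, 4, 6]          # items held in memory
mean_rts  = [438, 476, 552, 628]  # hypothetical mean RTs (ms)

intercept, slope = fit_line(set_sizes, mean_rts)
# slope: ms per item scanned, the speed-of-processing index;
# intercept: encoding and response stages, absorbing perceptual-motor time.
```

Separating slope (scanning speed) from intercept (perceptual-motor stages) is what lets paradigms like this control for the perceptual-motor confound the abstract mentions.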
Lopopolo, Alessandro; Frank, Stefan L; van den Bosch, Antal; Willems, Roel M
2017-01-01
Language comprehension involves the simultaneous processing of information at the phonological, syntactic, and lexical level. We track these three distinct streams of information in the brain by using stochastic measures derived from computational language models to detect neural correlates of phoneme, part-of-speech, and word processing in an fMRI experiment. Probabilistic language models have proven to be useful tools for studying how language is processed as a sequence of symbols unfolding in time. Conditional probabilities between sequences of words underlie probabilistic measures such as surprisal and perplexity, which have been successfully used as predictors of several behavioural and neural correlates of sentence processing. Here we computed perplexity from sequences of words, their parts of speech, and their phonemic transcriptions. Brain activity time-locked to each word is regressed on the three model-derived measures. We observe that the brain keeps track of the statistical structure of lexical, syntactic and phonological information in distinct areas.
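As an illustration of the model-derived measures mentioned above: a word's surprisal is the negative log of its conditional probability given the preceding context, and perplexity is the exponential of the mean surprisal. A minimal sketch with an add-one-smoothed bigram model (the toy corpus and function name are invented for illustration; the study used far richer language models):

```python
import math
from collections import Counter

def bigram_surprisal_perplexity(corpus, sentence):
    """Per-word surprisal (bits) and perplexity of `sentence` under an
    add-one-smoothed bigram model estimated from `corpus`."""
    tokens = corpus.split()
    vocab = set(tokens) | set(sentence.split())
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    words = sentence.split()
    surprisals = []
    for prev, w in zip(words, words[1:]):
        # smoothed conditional probability P(w | prev)
        p = (bigrams[(prev, w)] + 1) / (unigrams[prev] + len(vocab))
        surprisals.append(-math.log2(p))
    # perplexity is 2 raised to the mean surprisal
    perplexity = 2 ** (sum(surprisals) / len(surprisals))
    return surprisals, perplexity
```

Higher surprisal marks less predictable words; regressors like these are what get time-locked to brain activity in such designs.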
Forced guidance and distribution of practice in sequential information processing.
NASA Technical Reports Server (NTRS)
Decker, L. R.; Rogers, C. A., Jr.
1973-01-01
Distribution of practice and forced guidance were used in a sequential information-processing task in an attempt to increase the capacity of human information-processing mechanisms. A reaction time index of the psychological refractory period was used as the response measure. Massing of practice lengthened response times while forced guidance shortened them. Interpretation was in terms of load reduction upon the response-selection stage of the information-processing system.
ERIC Educational Resources Information Center
Kupersmidt, Janis B.; Stelter, Rebecca; Dodge, Kenneth A.
2011-01-01
The purpose of this study was to evaluate the psychometric properties of an audio computer-assisted self-interviewing Web-based software application called the Social Information Processing Application (SIP-AP) that was designed to assess social information processing skills in boys in 3rd through 5th grades. This study included a racially and…
Application of Ensemble Detection and Analysis to Modeling Uncertainty in Non Stationary Process
NASA Technical Reports Server (NTRS)
Racette, Paul
2010-01-01
Characterization of non stationary and nonlinear processes is a challenge in many engineering and scientific disciplines. Climate change modeling and projection, retrieving information from Doppler measurements of hydrometeors, and modeling calibration architectures and algorithms in microwave radiometers are example applications that can benefit from improvements in the modeling and analysis of non stationary processes. Analyses of measured signals have traditionally been limited to a single measurement series. Ensemble Detection is a technique whereby mixing calibrated noise produces an ensemble measurement set. The collection of ensemble data sets enables new methods for analyzing random signals and offers powerful new approaches to studying and analyzing non stationary processes. Derived information contained in the dynamic stochastic moments of a process will enable many novel applications.
Connolly, Samantha L; Abramson, Lyn Y; Alloy, Lauren B
2016-01-01
Negative information processing biases have been hypothesised to serve as precursors for the development of depression. The current study examined negative self-referent information processing and depressive symptoms in a community sample of adolescents (N = 291, Mage at baseline = 12.34 ± 0.61, 53% female, 47.4% African-American, 49.5% Caucasian and 3.1% Biracial). Participants completed a computerised self-referent encoding task (SRET) and a measure of depressive symptoms at baseline and completed an additional measure of depressive symptoms nine months later. Several negative information processing biases on the SRET were associated with concurrent depressive symptoms and predicted increases in depressive symptoms at follow-up. Findings partially support the hypothesis that negative information processing biases are associated with depressive symptoms in a nonclinical sample of adolescents, and provide preliminary evidence that these biases prospectively predict increases in depressive symptoms.
Information content versus word length in random typing
NASA Astrophysics Data System (ADS)
Ferrer-i-Cancho, Ramon; Moscoso del Prado Martín, Fermín
2011-12-01
Recently, it has been claimed that a linear relationship between a measure of information content and word length is expected from word length optimization and it has been shown that this linearity is supported by a strong correlation between information content and word length in many languages (Piantadosi et al 2011 Proc. Nat. Acad. Sci. 108 3825). Here, we study in detail some connections between this measure and standard information theory. The relationship between the measure and word length is studied for the popular random typing process where a text is constructed by pressing keys at random from a keyboard containing letters and a space behaving as a word delimiter. Although this random process does not optimize word lengths according to information content, it exhibits a linear relationship between information content and word length. The exact slope and intercept are presented for three major variants of the random typing process. A strong correlation between information content and word length can simply arise from the units making a word (e.g., letters) and not necessarily from the interplay between a word and its context as proposed by Piantadosi and co-workers. In itself, the linear relation does not entail the results of any optimization process.
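The linearity described above can be checked analytically for the simplest variant of random typing: if each of A letters and the space delimiter are pressed with equal probability, a specific word of length L followed by a space has probability (1/(A+1))^(L+1), so its information content is exactly linear in L with slope log2(A+1). A sketch under that uniform-keys assumption (the paper derives slopes and intercepts for three variants; this covers only the uniform case, with an invented function name):

```python
import math

def random_typing_info(alphabet_size, length):
    """Information content (bits) of one specific word of `length` letters
    under uniform random typing: each of `alphabet_size` letters and the
    space delimiter is pressed with probability 1/(alphabet_size + 1), so
    P(word) = (1/(A+1)) ** (length + 1) and -log2 P(word) is linear in length."""
    p_key = 1.0 / (alphabet_size + 1)
    return -(length + 1) * math.log2(p_key)
```

The constant slope, with no optimization anywhere in the process, is the point of the argument: a strong length-information correlation arises from word composition alone.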
ERIC Educational Resources Information Center
Wright, John C.; And Others
A conceptual model of how children process televised information was developed with the goal of identifying those parameters of the process that are both measurable and manipulable in research settings. The model accommodates the nature of information processing both by the child and by the medium's presentation. Presentation is…
The Prediction, from Infancy, of Adult IQ and Achievement
ERIC Educational Resources Information Center
Fagan, Joseph F.; Holland, Cynthia R.; Wheeler, Karyn
2007-01-01
Young adults, originally tested as infants for their ability to process information as measured by selective attention to novelty (an operational definition of visual recognition memory), were revisited. A current estimate of IQ was obtained as well as a measure of academic achievement. Information processing ability at 6-12 months was predictive…
Holistic processing, contact, and the other-race effect in face recognition.
Zhao, Mintao; Hayward, William G; Bülthoff, Isabelle
2014-12-01
Face recognition, holistic processing, and processing of configural and featural facial information are known to be influenced by face race, with better performance for own- than other-race faces. However, whether these various other-race effects (OREs) arise from the same underlying mechanisms or from different processes remains unclear. The present study addressed this question by measuring the OREs in a set of face recognition tasks, and testing whether these OREs are correlated with each other. Participants performed different tasks probing (1) face recognition, (2) holistic processing, (3) processing of configural information, and (4) processing of featural information for both own- and other-race faces. Their contact with other-race people was also assessed with a questionnaire. The results show significant OREs in tasks testing face memory and processing of configural information, but not in tasks testing either holistic processing or processing of featural information. Importantly, there was no cross-task correlation between any of the measured OREs. Moreover, the level of other-race contact predicted only the OREs obtained in tasks testing face memory and processing of configural information. These results indicate that these various cross-race differences originate from different aspects of face processing, contrary to the view that the ORE in face recognition is due to cross-race differences in holistic processing. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Kessels, Loes T E; Ruiter, Robert A C; Jansma, Bernadette M
2010-07-01
Previous studies indicate that people respond defensively to threatening health information, especially when the information challenges self-relevant goals. The authors investigated whether reduced acceptance of self-relevant health risk information is already visible in early attention processes, that is, attention disengagement processes. In a randomized, controlled trial with 29 smoking and nonsmoking students, a variant of Posner's cueing task was used in combination with the high-temporal-resolution method of event-related brain potentials (ERPs). Outcome measures were reaction times and the P300 ERP component. Smokers showed lower P300 amplitudes in response to high- as opposed to low-threat invalid trials when moving their attention to a target in the opposite visual field, indicating more efficient attention disengagement processes. Furthermore, both smokers and nonsmokers showed increased P300 amplitudes in response to the presentation of high- as opposed to low-threat valid trials, indicating threat-induced attention-capturing processes. Reaction time measures did not support the ERP data, indicating that ERP measures can be especially informative for assessing low-level attention biases in health communication. The findings provide the first neuroscientific support for the hypothesis that threatening health information causes more efficient disengagement among those for whom the health threat is self-relevant. PsycINFO Database Record (c) 2010 APA, all rights reserved.
Problems of systems dataware using optoelectronic measuring means of linear displacement
NASA Astrophysics Data System (ADS)
Bazykin, S. N.; Bazykina, N. A.; Samohina, K. S.
2017-10-01
This article considers problems in the dataware (information support) of systems that use optoelectronic means of measuring linear displacement. A classification of the known physical effects realized in information-measuring systems is given. Information flows in technical systems are analyzed from the standpoint of determining inaccuracies of measurement and control. Despite progress in automating machine-building and instrument-building equipment, unresolved dataware problems remain that concern the qualitative aspect of the production process. It is shown that this problem can be addressed with optoelectronic laser information-measuring systems, which can not only perform measuring functions but also solve control and management problems during processing, thereby guaranteeing the quality of the final product.
2005-07-01
approach for measuring the return on Information Technology (IT) investments. A review of existing methods suggests the difficulty in adequately...measuring the returns of IT at various levels of analysis (e.g., firm or process level). To address this issue, this study aims to develop a method for...view (KBV), this paper proposes an analytic method for measuring the historical revenue and cost of IT investments by estimating the amount of
Performance measurement for information systems: Industry perspectives
NASA Technical Reports Server (NTRS)
Bishop, Peter C.; Yoes, Cissy; Hamilton, Kay
1992-01-01
Performance measurement has become a focal topic for information systems (IS) organizations. Historically, IS performance measures have dealt with the efficiency of the data processing function. Today, the function of most IS organizations goes beyond simple data processing. To understand how IS organizations have developed meaningful performance measures that reflect their objectives and activities, industry perspectives on IS performance measurement were studied. The objectives of the study were to understand the state of the practice in IS performance measurement; to gather actual approaches and measures used in industry; and to report patterns, trends, and lessons learned about performance measurement to NASA/JSC. Examples of how some of the most forward-looking companies are shaping their IS processes through measurement are provided. Thoughts on the presence of a life cycle in the development of performance measures and a suggested taxonomy for performance measurements are included in the appendices.
Cohen, Alex S; Dinzeo, Thomas J; Donovan, Neila J; Brown, Caitlin E; Morrison, Sean C
2015-03-30
Vocal expression reflects an integral component of communication that varies considerably within individuals across contexts and is disrupted in a range of neurological and psychiatric disorders. There is reason to suspect that variability in vocal expression reflects, in part, the availability of "on-line" resources (e.g., working memory, attention). Thus, understanding vocal expression is a potentially important biometric index of information processing, not only across but within individuals over time. A first step in this line of research involves establishing a link between vocal expression and information processing systems in healthy adults. The present study employed a dual attention experimental task where participants provided natural speech while simultaneously engaged in a baseline, medium or high nonverbal processing-load task. Objective, automated, and computerized analysis was employed to measure vocal expression in 226 adults. Increased processing load resulted in longer pauses, fewer utterances, greater silence overall and less variability in frequency and intensity levels. These results provide compelling evidence of a link between information processing resources and vocal expression, and provide important information for the development of an automated, inexpensive and noninvasive biometric measure of information processing. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Spek, Annelies A; Scholte, Evert M; Van Berckelaer-Onnes, Ina A
2011-07-01
Local information processing in 42 adults with high functioning autism, 41 adults with Asperger syndrome and 41 neurotypical adults was examined. Contrary to our expectations, the disorder groups did not outperform the neurotypical group in the neuropsychological measures of local information processing. In line with our hypotheses, the self-reports did show higher levels of local information processing and a stronger tendency to use systemizing strategies in the two disorder groups. Absent and weak correlations were found between the self-reports and the two neuropsychological tasks in the three groups. The neuropsychological tests and the self-reports seem to measure different underlying constructs. The self-reports were most predictive of the presence of an autism spectrum diagnosis.
NASA Astrophysics Data System (ADS)
Bouty, A. A.; Koniyo, M. H.; Novian, D.
2018-02-01
This study aims to determine the maturity level of information technology governance in the Gorontalo city government by applying the COBIT 4.1 framework. The research method is the case study method, involving surveys and data collection at 25 institutions in Gorontalo City. The result of this study is an analysis of information technology needs based on measurement of the maturity level. The measurements show that many business processes still run at a low level: of the 9 existing business processes, 4 are at level 2 (Repeatable but Intuitive) and 3 are at level 1 (Initial/Ad Hoc). Given these results, the government of Gorontalo city is expected to promptly improve its information technology governance so that it can run more effectively and efficiently.
Process-Oriented Measurement Using Electronic Tangibles
ERIC Educational Resources Information Center
Veerbeek, Jochanan; Verhaegh, Janneke; Elliott, Julian G.; Resing, Wilma C. M.
2017-01-01
This study evaluated a new measure for analyzing the process of children's problem solving in a series completion task. This measure focused on a process that we entitled the "Grouping of Answer Pieces" (GAP) that was employed to provide information on problem representation and restructuring. The task was conducted using an electronic…
Lei, Xusheng; Li, Jingjing
2012-01-01
This paper presents an adaptive information fusion method to improve the accuracy and reliability of altitude measurement information for a small unmanned aerial rotorcraft during the landing process. Because of the low measurement performance of sensors mounted on small unmanned aerial rotorcraft, a wavelet filter is applied as a pre-filter to attenuate high-frequency noise in the sensor output. Furthermore, to improve the altitude information, an adaptive extended Kalman filter based on a maximum a posteriori criterion is proposed to estimate the measurement noise covariance matrix in real time. Finally, the effectiveness of the proposed method is demonstrated by static tests, hovering flights and autonomous landing flight tests. PMID:23201993
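The adaptation idea described above can be caricatured in one dimension: a scalar Kalman filter whose measurement-noise variance R is re-estimated online from a sliding window of innovations. This is only a simplified stand-in for the paper's MAP-based adaptive extended Kalman filter; the function name, parameters, and demo signal are invented for illustration:

```python
import random

def adaptive_kalman_1d(measurements, q=1e-4, r_init=1.0, window=10):
    """Scalar Kalman filter for a nearly constant altitude whose
    measurement-noise variance R is re-estimated from a sliding window
    of innovations (a crude stand-in for MAP noise estimation)."""
    x, p, r = measurements[0], 1.0, r_init
    recent, estimates = [], []
    for z in measurements:
        p += q                      # predict: random-walk altitude model
        nu = z - x                  # innovation (measurement residual)
        recent.append(nu)
        if len(recent) > window:
            recent.pop(0)
            # sample innovation variance approximates S = p + r
            s_hat = sum(v * v for v in recent) / window
            r = max(s_hat - p, 1e-6)
        k = p / (p + r)             # Kalman gain
        x += k * nu                 # update state estimate
        p *= 1.0 - k                # update error covariance
        estimates.append(x)
    return estimates

# demo: noisy measurements of a 10 m hover
random.seed(0)
zs = [10.0 + random.gauss(0.0, 0.5) for _ in range(200)]
est = adaptive_kalman_1d(zs)
```

The real system fuses several sensors and estimates a full noise covariance matrix, but the core loop, predict, compare the innovation statistics to their expected covariance, and rescale R, has this shape.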
Interferometer with Continuously Varying Path Length Measured in Wavelengths to the Reference Mirror
NASA Technical Reports Server (NTRS)
Ohara, Tetsuo (Inventor)
2016-01-01
An interferometer in which the path length of the reference beam, measured in wavelengths, is continuously changing in sinusoidal fashion and the interference signal created by combining the measurement beam and the reference beam is processed in real time to obtain the physical distance along the measurement beam between the measured surface and a spatial reference frame such as the beam splitter. The processing involves analyzing the Fourier series of the intensity signal at one or more optical detectors in real time and using the time-domain multi-frequency harmonic signals to extract the phase information independently at each pixel position of one or more optical detectors and converting the phase information to distance information.
Performance measurement integrated information framework in e-Manufacturing
NASA Astrophysics Data System (ADS)
Teran, Hilaida; Hernandez, Juan Carlos; Vizán, Antonio; Ríos, José
2014-11-01
The implementation of Internet technologies has led to e-Manufacturing technologies becoming more widely used and to the development of tools for compiling, transforming and synchronising manufacturing data through the Web. In this context, a potential area for development is the extension of virtual manufacturing to performance measurement (PM) processes, a critical area for decision making and implementing improvement actions in manufacturing. This paper proposes a PM information framework to integrate decision support systems in e-Manufacturing. Specifically, the proposed framework offers a homogeneous PM information exchange model that can be applied through decision support in an e-Manufacturing environment. Its application improves the interoperability needed in decision-making data processing tasks. It comprises three sub-systems: a data model, a PM information platform and a PM-Web services architecture. A practical example of data exchange for measurement processes in the area of equipment maintenance is shown to demonstrate the utility of the model.
On the Capacity of Attention: Its Estimation and Its Role in Working Memory and Cognitive Aptitudes
Cowan, Nelson; Elliott, Emily M.; Saults, J. Scott; Morey, Candice C.; Mattox, Sam; Hismjatullina, Anna; Conway, Andrew R.A.
2008-01-01
Working memory (WM) is the set of mental processes holding limited information in a temporarily accessible state in service of cognition. We provide a theoretical framework to understand the relation between WM and aptitude measures. The WM measures that have yielded high correlations with aptitudes include separate storage and processing task components, on the assumption that WM involves both storage and processing. We argue that the critical aspect of successful WM measures is that rehearsal and grouping processes are prevented, allowing a clearer estimate of how many separate chunks of information the focus of attention circumscribes at once. Storage-and-processing tasks correlate with aptitudes, according to this view, largely because the processing task prevents rehearsal and grouping of items to be recalled. In a developmental study, we document that several scope-of-attention measures that do not include a separate processing component, but nevertheless prevent efficient rehearsal or grouping, also correlate well with aptitudes and with storage-and-processing measures. So does digit span in children too young to rehearse. PMID:16039935
ERIC Educational Resources Information Center
Spek, Annelies A.; Scholte, Evert M.; Van Berckelaer-Onnes, Ina A.
2011-01-01
Local information processing in 42 adults with high functioning autism, 41 adults with Asperger syndrome and 41 neurotypical adults was examined. Contrary to our expectations, the disorder groups did not outperform the neurotypical group in the neuropsychological measures of local information processing. In line with our hypotheses, the…
Measures and Metrics of Information Processing in Complex Systems: A Rope of Sand
ERIC Educational Resources Information Center
James, Ryan Gregory
2013-01-01
How much information do natural systems store and process? In this work we attempt to answer this question in multiple ways. We first establish a mathematical framework where natural systems are represented by a canonical form of edge-labeled hidden Markov models called ε-machines. Then, utilizing this framework, a variety of measures are defined and…
Deterministic realization of collective measurements via photonic quantum walks.
Hou, Zhibo; Tang, Jun-Feng; Shang, Jiangwei; Zhu, Huangjun; Li, Jian; Yuan, Yuan; Wu, Kang-Da; Xiang, Guo-Yong; Li, Chuan-Feng; Guo, Guang-Can
2018-04-12
Collective measurements on identically prepared quantum systems can extract more information than local measurements, thereby enhancing information-processing efficiency. Although this nonclassical phenomenon has been known for two decades, it has remained a challenging task to demonstrate the advantage of collective measurements in experiments. Here, we introduce a general recipe for performing deterministic collective measurements on two identically prepared qubits based on quantum walks. Using photonic quantum walks, we realize experimentally an optimized collective measurement with fidelity 0.9946 without post selection. As an application, we achieve the highest tomographic efficiency in qubit state tomography to date. Our work offers an effective recipe for beating the precision limit of local measurements in quantum state tomography and metrology. In addition, our study opens an avenue for harvesting the power of collective measurements in quantum information-processing and for exploring the intriguing physics behind this power.
Torrens-Burton, Anna; Basoudan, Nasreen; Bayer, Antony J.; Tales, Andrea
2017-01-01
This study examines the relationships between two measures of information processing speed associated with executive function (Trail Making Test and a computer-based visual search test), the perceived difficulty of the tasks, and perceived memory function (measured by the Memory Functioning Questionnaire) in older adults (aged 50+ y) with normal general health, cognition (Montreal Cognitive Assessment score of 26+), and mood. The participants were recruited from the community rather than through clinical services, and none had ever sought or received help from a health professional for a memory complaint or mental health problem. For both the trail making and the visual search tests, mean information processing speed was not correlated significantly with perceived memory function. Some individuals did, however, reveal substantially slower information processing speeds (outliers) that may have clinical significance and indicate those who may benefit most from further assessment and follow up. For the trail making, but not the visual search task, higher levels of subjective memory dysfunction were associated with a greater perception of task difficulty. The relationship between actual information processing speed and perceived task difficulty also varied with respect to the task used. These findings highlight the importance of taking into account the type of task and metacognition factors when examining the integrity of information processing speed in older adults, particularly as this measure is now specifically cited as a key cognitive subdomain within the diagnostic framework for neurocognitive disorders. PMID:28984584
ERIC Educational Resources Information Center
Klaczynski, Paul A.; Fauth, James M.; Swanger, Amy
1998-01-01
The extent to which adolescents rely on rational versus experiential information processing was studied with 49 adolescents administered multiple measures of formal operations, two critical thinking questionnaires, a measure of rational processing, and a measure of ego identity status. Implications for studies of development are discussed in terms…
Fujii, Tsutomu; Uebuchi, Hisashi; Yamada, Kotono; Saito, Masahiro; Ito, Eriko; Tonegawa, Akiko; Uebuchi, Marie
2015-06-01
The purposes of the present study were (a) to use both a relational-anxiety Go/No-Go Association Task (GNAT) and an avoidance-of-intimacy GNAT in order to assess an implicit Internal Working Model (IWM) of attachment; and (b) to verify the effects of both measured implicit relational anxiety and implicit avoidance of intimacy on information processing. The implicit IWM measured by the GNAT differed from the explicit IWM measured by questionnaires in terms of its effects on information processing. In particular, in subliminal priming tasks involving others, implicit avoidance of intimacy predicted accelerated response times to negative stimulus words about attachment. Moreover, after subliminal priming with stimulus words about the self, implicit relational anxiety predicted delayed response times to negative stimulus words about attachment.
Decentralized modal identification using sparse blind source separation
NASA Astrophysics Data System (ADS)
Sadhu, A.; Hazra, B.; Narasimhan, S.; Pandey, M. D.
2011-12-01
Popular ambient vibration-based system identification methods process information collected from a dense array of sensors centrally to yield the modal properties. In such methods, the need for a centralized processing unit capable of satisfying large memory and processing demands is unavoidable. With the advent of wireless smart sensor networks, it is now possible to process information locally at the sensor level, instead. The information at the individual sensor level can then be concatenated to obtain the global structure characteristics. A novel decentralized algorithm based on wavelet transforms to infer global structure mode information using measurements obtained using a small group of sensors at a time is proposed in this paper. The focus of the paper is on algorithmic development, while the actual hardware and software implementation is not pursued here. The problem of identification is cast within the framework of under-determined blind source separation invoking transformations of measurements to the time-frequency domain resulting in a sparse representation. The partial mode shape coefficients so identified are then combined to yield complete modal information. The transformations are undertaken using stationary wavelet packet transform (SWPT), yielding a sparse representation in the wavelet domain. Principal component analysis (PCA) is then performed on the resulting wavelet coefficients, yielding the partial mixing matrix coefficients from a few measurement channels at a time. This process is repeated using measurements obtained from multiple sensor groups, and the results so obtained from each group are concatenated to obtain the global modal characteristics of the structure.
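The PCA step that recovers partial mixing-matrix (mode-shape) coefficients from a small sensor group can be sketched for the two-channel case, where the leading eigenvector of the 2x2 channel covariance has a closed form. This omits the stationary wavelet packet transform entirely and assumes a single dominant, correlated mode; the function name and demo signal are invented for illustration:

```python
import math

def partial_mode_shape(ch_a, ch_b):
    """Mode-shape ratio between two sensor channels via PCA: the leading
    eigenvector of the 2x2 channel covariance (closed form) gives the
    relative modal amplitude phi_b / phi_a. Assumes the channels are
    dominated by a single mode and are correlated (cov != 0)."""
    n = len(ch_a)
    ma = sum(ch_a) / n
    mb = sum(ch_b) / n
    caa = sum((a - ma) ** 2 for a in ch_a) / n
    cbb = sum((b - mb) ** 2 for b in ch_b) / n
    cab = sum((a - ma) * (b - mb) for a, b in zip(ch_a, ch_b)) / n
    # largest eigenvalue of [[caa, cab], [cab, cbb]]
    lam = 0.5 * (caa + cbb + math.sqrt((caa - cbb) ** 2 + 4.0 * cab * cab))
    # eigenvector satisfies (caa - lam) * x + cab * y = 0
    return (lam - caa) / cab
```

In the decentralized scheme, ratios like this one, computed per sensor group on sparse wavelet coefficients rather than raw signals, are concatenated across groups to assemble the global mode shape.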
The Paradox of Abstraction: Precision Versus Concreteness.
Iliev, Rumen; Axelrod, Robert
2017-06-01
We introduce a novel measure of abstractness based on the amount of information of a concept computed from its position in a semantic taxonomy. We refer to this measure as precision. We propose two alternative ways to measure precision, one based on the path length from a concept to the root of the taxonomic tree, and another based on the number of direct and indirect descendants. Since more information implies greater processing load, we hypothesize that nouns higher in precision will have a processing disadvantage in a lexical decision task. We contrast precision with concreteness, a common measure of abstractness based on the proportion of sensory-based information associated with a concept. Since concreteness facilitates cognitive processing, we predict that while both concreteness and precision are measures of abstractness, they will have opposite effects on performance. In two studies we found empirical support for our hypothesis. Precision and concreteness had opposite effects on latency and accuracy in a lexical decision task, and these opposite effects were observable while controlling for word length, word frequency, affective content and semantic diversity. Our results support the view that concept organization includes amodal semantic structures that are independent of sensory information. They also suggest that we should distinguish between sensory-based and amount-of-information-based abstractness.
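Both proposed ways of measuring precision are straightforward to compute on a parent-pointer taxonomy: depth (path length from concept to root) and the negated count of direct and indirect descendants, so that in both variants a larger value means a more specific concept. A sketch with invented function names and a toy taxonomy (real applications would use a resource like WordNet):

```python
def precision_by_depth(taxonomy, concept):
    """Path length from `concept` to the root of a parent-pointer
    taxonomy (root has parent None); deeper concepts carry more
    information and thus higher precision."""
    depth = 0
    while taxonomy[concept] is not None:
        concept = taxonomy[concept]
        depth += 1
    return depth

def precision_by_descendants(taxonomy, concept):
    """Negated count of direct and indirect descendants: fewer
    descendants means a more specific (higher-precision) concept."""
    children = {}
    for node, parent in taxonomy.items():
        children.setdefault(parent, []).append(node)
    count, stack = 0, list(children.get(concept, []))
    while stack:
        node = stack.pop()
        count += 1
        stack.extend(children.get(node, []))
    return -count
```

On a toy tree entity > animal > dog > poodle, "poodle" scores higher than "animal" on both variants, matching the intuition that it is the more informative, less abstract concept.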
Measuring Information Technology Performance: Operational Efficiency and Operational Effectiveness
ERIC Educational Resources Information Center
Moore, Annette G.
2012-01-01
This dissertation provides a practical approach for measuring operational efficiency and operational effectiveness for IT organizations introducing the ITIL process framework. The intent of the study was to assist Chief Information Officers (CIOs) in explaining the impact of introducing the Information Technology Infrastructure Library (ITIL)…
Information Processing in Memory Tasks.
ERIC Educational Resources Information Center
Johnston, William A.
The intensity of information processing engendered in different phases of standard memory tasks was examined in six experiments. Processing intensity was conceptualized as system capacity consumed, and was measured via a divided-attention procedure in which subjects performed a memory task and a simple reaction-time (RT) task concurrently. The…
Using Teacher Effectiveness Data for Information-Rich Hiring
ERIC Educational Resources Information Center
Cannata, Marisa; Rubin, Mollie; Goldring, Ellen; Grissom, Jason A.; Neumerski, Christine M.; Drake, Timothy A.; Schuermann, Patrick
2017-01-01
Purpose: New teacher effectiveness measures have the potential to influence how principals hire teachers as they provide new and richer information about candidates to a traditionally information-poor process. This article examines how the hiring process is changing as a result of teacher evaluation reforms. Research Methods: Data come from…
The informational architecture of the cell.
Walker, Sara Imari; Kim, Hyunju; Davies, Paul C W
2016-03-13
We compare the informational architecture of biological and random networks to identify informational features that may distinguish biological networks from random. The study presented here focuses on the Boolean network model for regulation of the cell cycle of the fission yeast Schizosaccharomyces pombe. We compare calculated values of local and global information measures for the fission yeast cell cycle to the same measures as applied to two different classes of random networks: Erdös-Rényi and scale-free. We report patterns in local information processing and storage that do indeed distinguish biological from random, associated with control nodes that regulate the function of the fission yeast cell-cycle network. Conversely, we find that integrated information, which serves as a global measure of 'emergent' information processing, does not differ from random for the case presented. We discuss implications for our understanding of the informational architecture of the fission yeast cell-cycle network in particular, and more generally for illuminating any distinctive physics that may be operative in life. © 2016 The Author(s).
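The synchronous Boolean-network dynamics underlying such cell-cycle models can be sketched in a few lines. The two-node network and its update rules below are a hypothetical toy, not the actual 10-node fission yeast model analysed in the paper:

```python
# Minimal synchronous Boolean network update (toy 2-node example).

def step(state):
    """One synchronous update: every node reads the *previous* state."""
    a, b = state
    return (b, a and not b)  # assumed toy rules: A copies B; B = A AND NOT B

def trajectory(state, n):
    """Iterate the network n steps from an initial state."""
    states = [state]
    for _ in range(n):
        states.append(step(states[-1]))
    return states

# From (True, False) the toy network falls into a period-2 attractor.
print(trajectory((True, False), 4))
```

Information measures such as those in the study are then computed over the state time series produced by repeated application of `step`.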
Model-based pH monitor for sensor assessment.
van Schagen, Kim; Rietveld, Luuk; Veersma, Alex; Babuska, Robert
2009-01-01
Owing to the nature of the treatment processes, monitoring them on the basis of individual online measurements is difficult or even impossible. However, the measurements (online and laboratory) can be combined with a priori process knowledge, using mathematical models, to objectively monitor the treatment processes and measurement devices. The pH measurement is commonly used at different stages in the drinking water treatment plant, although it is an unreliable instrument that requires significant maintenance. It is shown that, using a grey-box model, it is possible to assess the measurement devices effectively, even if detailed information about the specific processes is unknown.
Rebar, Amanda L.; Ram, Nilam; Conroy, David E.
2014-01-01
Objective: The Single-Category Implicit Association Test (SC-IAT) has been used as a method for assessing automatic evaluations of physical activity, but measurement artifact or consciously-held attitudes could be confounding the outcome scores of these measures. The objective of these two studies was to address these measurement concerns by testing the validity of a novel SC-IAT scoring technique. Design: Study 1 was a cross-sectional study, and study 2 was a prospective study. Method: In study 1, undergraduate students (N = 104) completed SC-IATs for physical activity, flowers, and sedentary behavior. In study 2, undergraduate students (N = 91) completed a SC-IAT for physical activity, self-reported affective and instrumental attitudes toward physical activity and physical activity intentions, and wore an accelerometer for two weeks. The EZ-diffusion model was used to decompose the SC-IAT into three process component scores, including the information processing efficiency score. Results: In study 1, a series of structural equation model comparisons revealed that the information processing score did not share variability across distinct SC-IATs, suggesting it does not represent systematic measurement artifact. In study 2, the information processing efficiency score was shown to be unrelated to self-reported affective and instrumental attitudes toward physical activity, and positively related to physical activity behavior, above and beyond the traditional D-score of the SC-IAT. Conclusions: The information processing efficiency score is a valid measure of automatic evaluations of physical activity. PMID:25484621
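The EZ-diffusion decomposition used in this study maps three observables (proportion correct, and the mean and variance of correct response times) onto drift rate, boundary separation, and non-decision time via closed-form equations (Wagenmakers and colleagues, 2007). A minimal sketch, assuming the standard s = 0.1 scaling and accuracy above chance; the inputs below are illustrative:

```python
import math

def ez_diffusion(pc, vrt, mrt, s=0.1):
    """Closed-form EZ-diffusion estimates from proportion correct (pc) and
    the variance (vrt) and mean (mrt) of correct RTs, in seconds."""
    assert 0.5 < pc < 1.0, "apply an edge correction for pc in {0, 0.5, 1}"
    L = math.log(pc / (1 - pc))                      # logit of accuracy
    x = L * (L * pc**2 - L * pc + pc - 0.5) / vrt
    v = s * x ** 0.25                                # drift rate
    a = s**2 * L / v                                 # boundary separation
    y = -v * a / s**2
    mdt = (a / (2 * v)) * (1 - math.exp(y)) / (1 + math.exp(y))  # mean decision time
    ter = mrt - mdt                                  # non-decision time
    return v, a, ter

# Illustrative inputs: 80.2% correct, RT variance .112 s^2, mean RT .723 s
print(ez_diffusion(0.802, 0.112, 0.723))
```

The efficiency score studied here is derived from the drift rate `v`, which indexes the quality of information accumulation independently of caution (`a`) and encoding/motor time (`ter`).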
Conjoint-measurement framework for the study of probabilistic information processing.
NASA Technical Reports Server (NTRS)
Wallsten, T. S.
1972-01-01
The theory of conjoint measurement described by Krantz et al. (1971) is shown to indicate how a descriptive model of human processing of probabilistic information built around Bayes' rule is to be tested and how it is to be used to obtain subjective scale values. Specific relationships concerning these scale values are shown to emerge, and the theoretical prospects resulting from this development are discussed.
Causality, Measurement, and Elementary Interactions
NASA Astrophysics Data System (ADS)
Gillis, Edward J.
2011-12-01
Signal causality, the prohibition of superluminal information transmission, is the fundamental property shared by quantum measurement theory and relativity, and it is the key to understanding the connection between nonlocal measurement effects and elementary interactions. To prevent those effects from transmitting information between the generating and observing processes, they must be induced by the kinds of entangling interactions that constitute measurements, as implied in the Projection Postulate. They must also be nondeterministic, as reflected in the Born Probability Rule. The nondeterminism of entanglement-generating processes explains why the relevant types of information cannot be instantiated in elementary systems, and why the sequencing of nonlocal effects is, in principle, unobservable. This perspective suggests a simple hypothesis about nonlocal transfers of amplitude during entangling interactions, which yields straightforward experimental consequences.
Mapping individual logical processes in information searching
NASA Technical Reports Server (NTRS)
Smetana, F. O.
1974-01-01
An interactive dialog with a computerized information collection was recorded and plotted in the form of a flow chart. The process permits one to identify the logical processes employed in considerable detail and is therefore suggested as a tool for measuring individual thought processes in a variety of situations. A sample of an actual test case is given.
Liebe, J D; Hübner, U; Straede, M C; Thye, J
2015-01-01
Availability and usage of individual IT applications have been studied intensively in the past years. Recently, IT support of clinical processes has been attracting increasing attention. The underlying construct that describes the IT support of clinical workflows is clinical information logistics. This construct needs to be better understood, operationalised and measured. It is therefore the aim of this study to propose and develop a workflow composite score (WCS) for measuring clinical information logistics and to examine its quality based on reliability and validity analyses. We largely followed the procedural model of MacKenzie and colleagues (2011) for defining and conceptualising the construct domain, for developing the measurement instrument, assessing the content validity, pretesting the instrument, specifying the model, capturing the data and computing the WCS and testing the reliability and validity. Clinical information logistics was decomposed into the descriptors data and information, function, integration and distribution, which embraced the framework validated by an analysis of the international literature. This framework was refined by selecting representative clinical processes. We chose ward rounds, pre- and post-surgery processes and discharge as sample processes that served as concrete instances for the measurements. They are sufficiently complex, represent core clinical processes and involve different professions, departments and settings. The score was computed on the basis of data from 183 hospitals of different size, ownership, location and teaching status. Testing the reliability and validity yielded encouraging results: the reliability was high with r(split-half) = 0.89; the WCS discriminated between groups; the WCS correlated significantly and moderately with two EHR models; and the WCS received good evaluation results from a sample of chief information officers (n = 67). These findings suggest the further utilisation of the WCS.
As the WCS does not assume ideal workflows as a gold standard but measures IT support of clinical workflows according to validated descriptors, high portability of the WCS to other hospitals in other countries is very likely. The WCS will contribute to a better understanding of the construct of clinical information logistics.
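The split-half reliability reported for the WCS (r = 0.89) is conventionally computed by correlating two halves of the items and stepping the correlation up to full test length with the Spearman-Brown correction. A generic sketch with made-up respondent data, not the study's actual items:

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def split_half_reliability(item_scores):
    """item_scores: one list of item scores per respondent.
    Odd-even split, then Spearman-Brown correction to full length."""
    odd = [sum(r[::2]) for r in item_scores]
    even = [sum(r[1::2]) for r in item_scores]
    r = pearson(odd, even)
    return 2 * r / (1 + r)  # Spearman-Brown step-up

# Hypothetical scores: 6 respondents x 4 items (not the study's data)
scores = [[3, 4, 3, 4], [1, 2, 1, 1], [4, 4, 5, 4],
          [2, 2, 2, 3], [5, 4, 4, 5], [1, 1, 2, 1]]
print(round(split_half_reliability(scores), 2))
```

The correction compensates for the fact that each half contains only half the items, so the raw half-test correlation underestimates full-test reliability.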
Lateralization of spatial information processing in response monitoring
Stock, Ann-Kathrin; Beste, Christian
2014-01-01
The current study aims at identifying how lateralized multisensory spatial information processing affects response monitoring and action control. In a previous study, we investigated multimodal sensory integration in response monitoring processes using a Simon task. Behavioral and neurophysiologic results suggested that different aspects of response monitoring are asymmetrically and independently allocated to the hemispheres: while efference-copy-based information on the motor execution of the task is further processed in the hemisphere that originally generated the motor command, proprioception-based spatial information is processed in the hemisphere contralateral to the effector. Hence, crossing hands (entering a “foreign” spatial hemifield) yielded an augmented bilateral activation during response monitoring since these two kinds of information were processed in opposing hemispheres. Because the traditional Simon task does not provide the possibility to investigate which aspect of the spatial configuration leads to the observed hemispheric allocation, we introduced a new “double crossed” condition that allows for the dissociation of internal/physiological and external/physical influences on response monitoring processes. Comparing behavioral and neurophysiologic measures of this new condition to those of the traditional Simon task setup, we could demonstrate that the egocentric representation of the physiological effector's spatial location accounts for the observed lateralization of spatial information in action control. The finding that the location of the physical effector had a very small influence on response monitoring measures suggests that this aspect is either less important and/or processed in different brain areas than egocentric physiological information. PMID:24550855
Grek, Boris; Bartolick, Joseph; Kennedy, Alan D.
2000-01-01
A method and apparatus for measuring microstructures, anisotropy and birefringence in polymers using laser-scattered light includes a laser whose beam can be conditioned and is directed at a fiber or film, causing the beam to scatter. Backscattered light is received and processed with detectors and beam splitters to obtain data. The data are directed to a computer, where they are processed to obtain information about the fiber or film, such as its birefringence and diameter. This information provides a basis for modifications to the production process to enhance the process.
Côté, Sophie; Bouchard, Stéphane
2005-09-01
Many outcome studies have been conducted to assess the efficacy of virtual reality in the treatment of specific phobias. However, most studies used self-report data. The addition of objective measures of arousal and information processing mechanisms would be a valuable contribution in order to validate the usefulness of virtual reality in the treatment of anxiety disorders. The goal of this study was to document the impact of virtual reality exposure (VRE) on cardiac response and automatic processing of threatening stimuli. Twenty-eight adults suffering from arachnophobia were assessed and received an exposure-based treatment using virtual reality. General outcome and specific processes measures included a battery of standardized questionnaires, a pictorial emotional Stroop task, a behavioral avoidance test and a measure of participants' inter-beat intervals (IBI) while they were looking at a live tarantula. Assessment was conducted before and after treatment. Repeated measures ANOVAs revealed that therapy had a positive impact on questionnaire data, as well as on the behavioral avoidance test. Analyses made on the pictorial Stroop task showed that information processing of spider-related stimuli changed after treatment, which also indicates therapeutic success. Psychophysiological data also showed a positive change after treatment, suggesting a decrease in anxiety. In sum, VRE led to significant therapeutic improvements on objective measures as well as on self-report instruments.
Local active information storage as a tool to understand distributed neural information processing
Wibral, Michael; Lizier, Joseph T.; Vögler, Sebastian; Priesemann, Viola; Galuske, Ralf
2013-01-01
Every act of information processing can in principle be decomposed into the component operations of information storage, transfer, and modification. Yet, while this is easily done for today's digital computers, the application of these concepts to neural information processing was hampered by the lack of proper mathematical definitions of these operations on information. Recently, definitions were given for the dynamics of these information processing operations on a local scale in space and time in a distributed system, and the specific concept of local active information storage (LAIS) was successfully applied to the analysis and optimization of artificial neural systems. However, no attempt to measure the space-time dynamics of local active information storage in neural data has been made to date. Here we measure local active information storage on a local scale in time and space in voltage-sensitive dye imaging data from area 18 of the cat. We show that storage reflects neural properties such as stimulus preferences and surprise upon unexpected stimulus change, and in area 18 reflects the abstract concept of an ongoing stimulus despite the locally random nature of this stimulus. We suggest that LAIS will be a useful quantity to test theories of cortical function, such as predictive coding. PMID:24501593
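Local active information storage at time t compares how probable the next value is given the process's own past with its unconditioned probability, a(t) = log2[p(x_t | past) / p(x_t)]. A plug-in sketch for a discrete series, with probabilities estimated as frequencies from the series itself (a simplification of the dedicated estimators, e.g. in JIDT, used in this literature):

```python
import math
from collections import Counter

def local_ais(series, k=1):
    """Local active information storage for each t >= k (plug-in estimate)."""
    pasts = [tuple(series[t - k:t]) for t in range(k, len(series))]
    nexts = series[k:]
    n = len(nexts)
    p_next = Counter(nexts)                 # marginal counts of x_t
    p_past = Counter(pasts)                 # counts of k-length histories
    p_joint = Counter(zip(pasts, nexts))    # counts of (history, x_t) pairs
    out = []
    for past, nxt in zip(pasts, nexts):
        cond = p_joint[(past, nxt)] / p_past[past]       # p(x_t | past)
        out.append(math.log2(cond / (p_next[nxt] / n)))  # local storage in bits
    return out

# A perfectly predictable alternating series stores ~1 bit per step.
series = [0, 1] * 50
vals = local_ais(series, k=1)
print(sum(vals) / len(vals))
```

Averaging the local values recovers the (global) active information storage; the local values themselves give the space-time maps exploited in the imaging analysis.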
What should we measure? Conceptualizing usage in health information exchange
Jasperson, Jon
2010-01-01
Under the provisions of the Health Information Technology for Economic & Clinical Health act providers need to demonstrate their ‘meaningful use’ of electronic health record systems' health information exchange (HIE) capability. HIE usage is not a simple construct, but the choice of its measurement must attend to the users, context, and objectives of the system being examined. This review examined how usage is reported in the existing literature and also what conceptualizations of usage might best reflect the nature and objectives of HIE. While existing literature on HIE usage included a diverse set of measures, most were theoretically weak, did not attend to the interplay of measure, level of analysis and architectural strategy, and did not reflect how HIE usage affected the actual process of care. Attention to these issues will provide greater insight into the effects of previously inaccessible information on medical decision-making and the process of care. PMID:20442148
Paulus, Markus; Schuwerk, Tobias; Sodian, Beate; Ganglmayer, Kerstin
2017-03-01
According to recent theories, social cognition is based on two different types of information processing: an implicit or action-based one and an explicit or verbal one. The present study examined whether implicit and explicit social-cognitive information processing interact with each other by investigating young children's and adults' use of verbal (i.e., explicit) information to predict others' actions. Employing eye tracking to record anticipatory eye movements as an index of implicit processing, Experiment 1 presented 1.5-, 2.5-, and 3.5-year-old children as well as adults with agents who announced that they would move to either of two possible targets. The results show that only the 3.5-year-old children and adults, but not the 1.5- and 2.5-year-old children, were able to use verbal information to correctly anticipate others' actions. Yet, Experiments 2 and 3 showed that 2.5-year-old children were able to use explicit information to give a correct explicit answer (Experiment 2) and that they were able to use statistical information to anticipate the other's actions (Experiment 3). Overall, the study is in line with theoretical claims that two types of information processing underlie human social cognition. It shows that these two inform each other by 3 years of age. Copyright © 2016 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Wang, Lin; Bastiaansen, Marcel; Yang, Yufang; Hagoort, Peter
2011-01-01
To highlight relevant information in dialogues, both wh-question context and pitch accent in answers can be used, such that focused information gains more attention and is processed more elaborately. To evaluate the relative influence of context and pitch accent on the depth of semantic processing, we measured event-related potentials (ERPs) to…
Cassano, Michael; MacEvoy, Julie Paquette; Costigan, Tracy
2010-01-01
Over the past fifteen years, many schools have implemented aggression prevention programs. Despite these apparent advances, many programs are not examined systematically to determine the areas in which they are most effective. One reason for this is that many programs, especially those in urban under-resourced areas, do not use outcome measures that are sensitive to the needs of ethnic minority students. The current study illustrates how a new knowledge-based measure of social information processing and anger management techniques was designed through a partnership-based process to ensure that it would be sensitive to the needs of urban, predominantly African American youngsters, while also having broad potential applicability for use as an outcome assessment tool for aggression prevention programs focusing on social information processing. The new measure was found to have strong psychometric properties within a sample of urban, predominantly African American youth: item analyses suggested that almost all items discriminate well between more and less knowledgeable individuals, the test-retest reliability of the measure is strong, and the measure appears to be sensitive to treatment changes over time. In addition, the overall score of this new measure is moderately associated with attributions of hostility on two measures (negative correlations) and demonstrates a low to moderate negative association with peer and teacher report measures of overt and relational aggression. More research is needed to determine the measure's utility outside of the urban school context. PMID:20449645
Transfer Entropy and Transient Limits of Computation
Prokopenko, Mikhail; Lizier, Joseph T.
2014-01-01
Transfer entropy is a recently introduced information-theoretic measure quantifying directed statistical coherence between spatiotemporal processes, and is widely used in diverse fields ranging from finance to neuroscience. However, its relationships to fundamental limits of computation, such as Landauer's limit, remain unknown. Here we show that in order to increase transfer entropy (predictability) by one bit, heat flow must match or exceed Landauer's limit. Importantly, we generalise Landauer's limit to bi-directional information dynamics for non-equilibrium processes, revealing that the limit applies to prediction, in addition to retrodiction (information erasure). Furthermore, the results are related to negentropy, and to Bremermann's limit and the Bekenstein bound, producing, perhaps surprisingly, lower bounds on the computational deceleration and information loss incurred during an increase in predictability about the process. The identified relationships set new computational limits in terms of fundamental physical quantities, and establish transfer entropy as a central measure connecting information theory, thermodynamics and theory of computation. PMID:24953547
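Transfer entropy from a source Y to a target X measures how much the past of Y improves prediction of X's next value beyond X's own past: TE = Σ p(x⁺, x, y) log2[p(x⁺|x, y) / p(x⁺|x)]. A plug-in sketch for discrete series with history length 1 (real analyses typically use dedicated estimators, e.g. JIDT):

```python
import math, random
from collections import Counter

def transfer_entropy(source, target, base=2):
    """Plug-in transfer entropy source -> target, history length 1."""
    triples = list(zip(target[1:], target[:-1], source[:-1]))  # (x+, x, y)
    n = len(triples)
    c_xyz = Counter(triples)
    c_xy = Counter((x, y) for _, x, y in triples)   # (x, y) marginals
    c_xx = Counter((xp, x) for xp, x, _ in triples) # (x+, x) marginals
    c_x = Counter(x for _, x, _ in triples)         # x marginals
    te = 0.0
    for (xp, x, y), c in c_xyz.items():
        p = c / n
        # log of p(x+|x,y) / p(x+|x), both estimated from counts
        te += p * math.log((c / c_xy[(x, y)]) / (c_xx[(xp, x)] / c_x[x]), base)
    return te

# Target copies the source with a one-step lag => TE of ~1 bit per step.
random.seed(0)
src = [random.randint(0, 1) for _ in range(5000)]
tgt = [0] + src[:-1]
print(transfer_entropy(src, tgt))
```

The "one bit of predictability" in the abstract corresponds to an increase of 1 in this quantity, which the paper ties to Landauer's heat-per-bit limit.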
NASA Astrophysics Data System (ADS)
Gaikwad, Akshay; Rehal, Diksha; Singh, Amandeep; Arvind; Dorai, Kavita
2018-02-01
We present the NMR implementation of a scheme for selective and efficient quantum process tomography without ancilla. We generalize this scheme such that it can be implemented efficiently using only a set of measurements involving product operators. The method allows us to estimate any element of the quantum process matrix to a desired precision, provided a set of quantum states can be prepared efficiently. Our modified technique requires fewer experimental resources as compared to the standard implementation of selective and efficient quantum process tomography, as it exploits the special nature of NMR measurements to allow us to compute specific elements of the process matrix by a restrictive set of subsystem measurements. To demonstrate the efficacy of our scheme, we experimentally tomograph the processes corresponding to "no operation," a controlled-NOT (CNOT), and a controlled-Hadamard gate on a two-qubit NMR quantum information processor, with high fidelities.
Age and Visual Information Processing.
ERIC Educational Resources Information Center
Gummerman, Kent; And Others
This paper reports on three studies concerned with aspects of human visual information processing. Study I was an effort to measure the duration of iconic storage using a partial report method in children ranging in age from 6 to 13 years. Study II was designed to detect age related changes in the rate of processing (perceptually encoding) letters…
Teaching and Learning Information Technology Process: From a 25 Year Perspective--Math Regents
ERIC Educational Resources Information Center
Lewis Sanchez, Louise
2007-01-01
This paper describes the Teaching and Learning Information Technology Process (TLITP). Before present-day strategies, teaching and learning relied on transformations based on quantification to measure performance. The process is a non-linear construct of three elements: teacher, student and community. Emphasizing old practices now is the…
Neural Correlates of Individual Differences in Strategic Retrieval Processing
ERIC Educational Resources Information Center
Bridger, Emma K.; Herron, Jane E.; Elward, Rachael L.; Wilding, Edward L.
2009-01-01
Processes engaged when information is encoded into memory are an important determinant of whether that information will be recovered subsequently. Also influential, however, are processes engaged at the time of retrieval, and these were investigated here by using event-related potentials (ERPs) to measure a specific class of retrieval operations.…
Measuring health care process quality with software quality measures.
Yildiz, Ozkan; Demirörs, Onur
2012-01-01
Existing quality models focus on specific diseases, clinics or clinical areas. Although they contain structure, process, or output type measures, there is no model that measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally or externally. To address these problems, a new model was developed from software quality measures. We adopted the ISO/IEC 9126 software quality standard for health care processes. Then, JCIAS (Joint Commission International Accreditation Standards for Hospitals) measurable elements were added to the model's scope to unify functional requirements. Measurement results for the assessment (diagnosing) process are provided in this paper. After the application, it was concluded that the model determines weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that hospitals can use to compare their processes with those of multiple organizations.
Thul, Alexander; Lechinger, Julia; Donis, Johann; Michitsch, Gabriele; Pichler, Gerald; Kochs, Eberhard F; Jordan, Denis; Ilg, Rüdiger; Schabus, Manuel
2016-02-01
Clinical assessments that rely on behavioral responses to differentiate Disorders of Consciousness are at times inapt because of some patients' motor disabilities. To objectify patients' conditions of reduced consciousness the present study evaluated the use of electroencephalography to measure residual brain activity. We analyzed entropy values of 18 scalp EEG channels of 15 severely brain-damaged patients with clinically diagnosed Minimally-Conscious-State (MCS) or Unresponsive-Wakefulness-Syndrome (UWS) and compared the results to a sample of 24 control subjects. Permutation entropy (PeEn) and symbolic transfer entropy (STEn), reflecting information processes in the EEG, were calculated for all subjects. Participants were tested on a modified active own-name paradigm to identify correlates of active instruction following. PeEn showed reduced local information content in the EEG in patients, that was most pronounced in UWS. STEn analysis revealed altered directed information flow in the EEG of patients, indicating impaired feed-backward connectivity. Responses to auditory stimulation yielded differences in entropy measures, indicating reduced information processing in MCS and UWS. Local EEG information content and information flow are affected in Disorders of Consciousness. This suggests local cortical information capacity and feedback information transfer as neural correlates of consciousness. The utilized EEG entropy analyses were able to relate to patient groups with different Disorders of Consciousness. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
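Permutation entropy, the first of the two EEG measures used in this study, is the Shannon entropy of the ordinal patterns in a signal. A minimal sketch for order-3 patterns with unit delay (EEG pipelines additionally handle multiple channels, embedding delays, and artifact rejection):

```python
import math
from collections import Counter

def permutation_entropy(signal, order=3, normalize=True):
    """Shannon entropy of ordinal patterns of length `order` (delay 1)."""
    patterns = []
    for i in range(len(signal) - order + 1):
        window = signal[i:i + order]
        # ordinal pattern: the permutation that sorts the window
        patterns.append(tuple(sorted(range(order), key=window.__getitem__)))
    counts = Counter(patterns)
    n = len(patterns)
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    if normalize:
        h /= math.log2(math.factorial(order))  # max over order! patterns
    return h

print(permutation_entropy([1, 2, 3, 4, 5, 6]))       # monotonic signal: 0.0
print(permutation_entropy([4, 7, 9, 10, 6, 11, 3]))  # mixed: between 0 and 1
```

Low values indicate highly regular, predictable dynamics; the study interprets reduced EEG entropy in patients as reduced local information content.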
Hayes, Rebecca A; Dickey, Michael Walsh; Warren, Tessa
2016-12-01
This study examined the influence of verb-argument information and event-related plausibility on prediction of upcoming event locations in people with aphasia, as well as older and younger, neurotypical adults. It investigated how these types of information interact during anticipatory processing and how the ability to take advantage of the different types of information is affected by aphasia. This study used a modified visual-world task to examine eye movements and offline photo selection. Twelve adults with aphasia (aged 54-82 years) as well as 44 young adults (aged 18-31 years) and 18 older adults (aged 50-71 years) participated. Neurotypical adults used verb argument status and plausibility information to guide both eye gaze (a measure of anticipatory processing) and image selection (a measure of ultimate interpretation). Argument status did not affect the behavior of people with aphasia in either measure. There was only limited evidence of interaction between these 2 factors in eye gaze data. Both event-related plausibility and verb-based argument status contributed to anticipatory processing of upcoming event locations among younger and older neurotypical adults. However, event-related likelihood had a much larger role in the performance of people with aphasia than did verb-based knowledge regarding argument structure.
Topaz, Maxim; Lai, Kenneth; Dowding, Dawn; Lei, Victor J; Zisberg, Anna; Bowles, Kathryn H; Zhou, Li
2016-12-01
Electronic health records are being increasingly used by nurses, with up to 80% of the health data recorded as free text. However, only a few studies have developed nursing-relevant tools that help busy clinicians to identify the information they need at the point of care. This study developed and validated one of the first automated natural language processing applications to extract wound information (wound type, pressure ulcer stage, wound size, anatomic location, and wound treatment) from free-text clinical notes. First, two human annotators manually reviewed a purposeful training sample (n = 360) and a random test sample (n = 1100) of clinical notes (including 50% discharge summaries and 50% outpatient notes), identified wound cases, and created a gold standard dataset. We then trained and tested our natural language processing system (known as MTERMS) to process the wound information. Finally, we assessed our automated approach by comparing system-generated findings against the gold standard. We also compared the prevalence of wound cases identified from free-text data with coded diagnoses in the structured data. The testing dataset included 101 notes (9.2%) with wound information. The overall system performance was good (F-measure = 92.7%; the F-measure combines a system's precision and recall), with best results for wound treatment (F-measure = 95.7%) and poorest results for wound size (F-measure = 81.9%). Only 46.5% of wound notes had a structured code for a wound diagnosis. The natural language processing system achieved good performance on a subset of randomly selected discharge summaries and outpatient notes. In more than half of the wound notes there were no coded wound diagnoses, which highlights the significance of using natural language processing to enrich clinical decision making. Our future steps will include expansion of the application's information coverage to other relevant wound factors and validation of the model with external data. Copyright © 2016 Elsevier Ltd. All rights reserved.
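The F-measure reported in this study is the harmonic mean of precision and recall computed over the system's extracted wound mentions. A generic sketch; the counts are illustrative, not the study's:

```python
def f_measure(tp, fp, fn):
    """Harmonic mean of precision and recall from raw counts."""
    precision = tp / (tp + fp)  # share of extractions that are correct
    recall = tp / (tp + fn)     # share of gold-standard cases found
    return 2 * precision * recall / (precision + recall)

# e.g. 9 correct extractions, 1 spurious, 3 missed
print(round(f_measure(tp=9, fp=1, fn=3), 3))  # 0.818
```

Because it is a harmonic mean, the F-measure penalises systems that trade one component for the other, which is why it is the standard summary for extraction tasks like this one.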
ERIC Educational Resources Information Center
Chuang, Hsueh-Hua; Liu, Han-Chin
2012-01-01
This study implemented eye-tracking technology to understand the impact of different multimedia instructional materials, i.e., five successive pages versus a single page with the same amount of information, on information-processing activities in 21 non-science-major college students. The findings showed that students demonstrated the same number…
The IT in Secondary Science Book. A Compendium of Ideas for Using Computers and Teaching Science.
ERIC Educational Resources Information Center
Frost, Roger
Scientists need to measure and communicate, to handle information, and model ideas. In essence, they need to process information. Young scientists have the same needs. Computers have become a tremendously important addition to the processing of information through database use, graphing and modeling and also in the collection of information…
Total quality management: It works for aerospace information services
NASA Technical Reports Server (NTRS)
Erwin, James; Eberline, Carl; Colquitt, Wanda
1993-01-01
Today we are in the midst of information and 'total quality' revolutions. At the NASA STI Program's Center for AeroSpace Information (CASI), we are focused on using continuous improvements techniques to enrich today's services and products and to ensure that tomorrow's technology supports the TQM-based improvement of future STI program products and services. The Continuous Improvements Program at CASI is the foundation for Total Quality Management in products and services. The focus is customer-driven; its goal, to identify processes and procedures that can be improved and new technologies that can be integrated with the processes to gain efficiencies, provide effectiveness, and promote customer satisfaction. This Program seeks to establish quality through an iterative defect prevention approach that is based on the incorporation of standards and measurements into the processing cycle. Four projects are described that utilize cross-functional, problem-solving teams for identifying requirements and defining tasks and task standards, management participation, attention to critical processes, and measurable long-term goals. The implementation of these projects provides the customer with measurably improved access to information that is provided through several channels: the NASA STI Database, document requests for microfiche and hardcopy, and the Centralized Help Desk.
Brain Responses to Emotional Images Related to Cognitive Ability in Older Adults
Foster, Shannon M.; Davis, Hasker P.; Kisley, Michael A.
2013-01-01
Older adults have been shown to exhibit a positivity effect in processing of emotional stimuli, seemingly focusing more on positive than negative information. Whether this reflects purposeful changes or an unintended side-effect of declining cognitive abilities is unclear. For the present study older adults displaying a wide range of cognitive abilities completed measures of attention, visual and verbal memory, executive functioning, and processing speed, as well as a socioemotional measure of time perspective. Regression analyses examined the ability of these variables to predict neural responsivity to select emotional stimuli as measured with the late positive potential (LPP), an event-related brain potential (ERP). Stronger cognitive functioning was associated with higher LPP amplitude in response to negative images (i.e., greater processing). This does not support a voluntary avoidance of negative information processing in older adults for this particular measure of attentional allocation. A model is proposed to reconcile this finding with the extant literature that has demonstrated positivity effects in measures of later, controlled attentional allocation. PMID:23276213
An fMRI Study on Conceptual, Grammatical, and Morpho-Phonological Processing
ERIC Educational Resources Information Center
Longoni, F.; Grande, M.; Hendrich, V.; Kastrau, F.; Huber, W.
2005-01-01
The aim of the present study was to determine whether processing of syntactic word information (lemma) is subserved by the same neural substrate as processing of conceptual or word form information (lexeme). We measured BOLD responses in 14 native speakers of German in three different decision tasks, each focussing specifically on one level of…
Kupersmidt, Janis B; Stelter, Rebecca; Dodge, Kenneth A
2011-12-01
The purpose of this study was to evaluate the psychometric properties of an audio computer-assisted self-interviewing Web-based software application called the Social Information Processing Application (SIP-AP) that was designed to assess social information processing skills in boys in 3rd through 5th grades. This study included a racially and ethnically diverse sample of 244 boys ages 8 through 12 (M = 9.4) from public elementary schools in 3 states. The SIP-AP includes 8 videotaped vignettes, filmed from the first-person perspective, that depict common misunderstandings among boys. Each vignette shows a negative outcome for the victim and ambiguous intent on the part of the perpetrator. Boys responded to 16 Web-based questions representing the 5 social information processing mechanisms, after viewing each vignette. Parents and teachers completed measures assessing boys' antisocial behavior. Confirmatory factor analyses revealed that a model positing the original 5 cognitive mechanisms fit the data well when the items representing prosocial cognitions were included on their own factor, creating a 6th factor. The internal consistencies for each of the 16 individual cognitions as well as for the 6 cognitive mechanism scales were excellent. Boys with elevated scores on 5 of the 6 cognitive mechanisms exhibited more antisocial behavior than boys whose scores were not elevated. These findings highlight the need for further research on the measurement of prosocial cognitions or cognitive strengths in boys in addition to assessing cognitive deficits. Findings suggest that the SIP-AP is a reliable and valid tool for use in future research of social information processing skills in boys.
NASA Astrophysics Data System (ADS)
Liu, Shuang; Liu, Fei; Hu, Shaohua; Yin, Zhenbiao
The major power information of the main transmission system in machine tools (MTSMT) during the machining process includes the effective output power (i.e., cutting power), the input power, the power loss of the mechanical transmission system, and the main motor power loss. This information is easy to obtain in the lab but difficult to evaluate during a manufacturing process. To solve this problem, a separation method is proposed here to extract MTSMT power information during the machining process. In this method, the energy flow and the mathematical models of the major MTSMT power information during machining are set up first. Based on these mathematical models and basic data tables obtained from experiments, the above power information can be separated by measuring only the real-time total input power of the spindle motor. The operating procedure of this method is also given.
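The separation idea can be sketched as follows: with loss models calibrated in the lab, cutting power is recovered by subtracting the modeled losses from the single measured quantity, the real-time input power of the spindle motor. The function names and coefficient values below are illustrative assumptions, not the paper's models.

```python
# Hypothetical sketch of the power-separation idea: cutting power is
# recovered by subtracting experimentally tabulated losses from the
# measured total input power of the spindle motor. All coefficients
# are invented for illustration.

def motor_loss(p_input_kw, a0=0.08, a1=0.05):
    """Main-motor power loss (kW): fixed loss plus a load-proportional
    term, as a stand-in for a lab-calibrated model."""
    return a0 + a1 * p_input_kw

def transmission_loss(spindle_speed_rpm, b0=0.02, b1=0.0001):
    """Mechanical transmission loss (kW) as a function of spindle
    speed, as if read from an experimental lookup table."""
    return b0 + b1 * spindle_speed_rpm

def separate_cutting_power(p_input_kw, spindle_speed_rpm):
    """Return (cutting power, motor loss, transmission loss) in kW,
    given only the measured spindle-motor input power and speed."""
    p_motor = motor_loss(p_input_kw)
    p_trans = transmission_loss(spindle_speed_rpm)
    p_cut = p_input_kw - p_motor - p_trans
    return p_cut, p_motor, p_trans

p_cut, p_m, p_t = separate_cutting_power(5.0, 3000)
print(round(p_cut, 3), round(p_m, 3), round(p_t, 3))
```

The only run-time measurement needed is the spindle-motor input power; everything else comes from the pre-built loss models.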
Two paths to blame: Intentionality directs moral information processing along two distinct tracks.
Monroe, Andrew E; Malle, Bertram F
2017-01-01
There is broad consensus that features such as causality, mental states, and preventability are key inputs to moral judgments of blame. What is not clear is exactly how people process these inputs to arrive at such judgments. Three studies provide evidence that early judgments of whether or not a norm violation is intentional direct information processing along 1 of 2 tracks: if the violation is deemed intentional, blame processing relies on information about the agent's reasons for committing the violation; if the violation is deemed unintentional, blame processing relies on information about how preventable the violation was. Owing to these processing commitments, when new information requires perceivers to switch tracks, they must reconfigure their judgments, which results in measurable processing costs indicated by reaction time (RT) delays. These findings offer support for a new theory of moral judgment (the Path Model of Blame) and advance the study of moral cognition as hierarchical information processing. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
EDITORIAL: Industrial Process Tomography
NASA Astrophysics Data System (ADS)
Anton Johansen, Geir; Wang, Mi
2008-09-01
There has been tremendous development within measurement science and technology over the past couple of decades. New sensor technologies and compact versatile signal recovery electronics are continuously expanding the limits of what can be measured and the accuracy with which this can be done. Miniaturization of sensors and the use of nanotechnology push these limits further. Also, thanks to powerful and cost-effective computer systems, sophisticated measurement and reconstruction algorithms previously only accessible in advanced laboratories are now available for in situ online measurement systems. The process industries increasingly require more process-related information, motivated by key issues such as improved process control, process utilization and process yields, ultimately driven by cost-effectiveness, quality assurance, environmental and safety demands. Industrial process tomography methods have taken advantage of the general progress in measurement science, and aim at providing more information, both quantitatively and qualitatively, on multiphase systems and their dynamics. The typical approach for such systems has been to carry out one local or bulk measurement and assume that this is representative of the whole system. In some cases, this is sufficient. However, there are many complex systems where the component distribution varies continuously and often unpredictably in space and time. The foundation of industrial tomography is to conduct several measurements around the periphery of a multiphase process, and use these measurements to unravel the cross-sectional distribution of the process components in time and space. This information is used in the design and optimization of industrial processes and process equipment, and also to improve the accuracy of multiphase system measurements in general. 
In this issue we are proud to present a selection of the 145 papers presented at the 5th World Congress on Industrial Process Tomography in Bergen, September 2007. Interestingly, x-ray technologies, one of the first imaging modalities available, continue to push the limits of both spatial and temporal measurement resolution; experimental results of less than 100 nm and several thousand frames/s, respectively, are reported. Important progress is demonstrated in research and development on sensor technologies and algorithms for data processing and image reconstruction, including unconventional sensor design and adaptation of the sensors to the application in question. The number of applications to which tomographic methods are applied is steadily increasing, and results obtained in a representative selection of applications are included. As guest editors we would like to express our appreciation and thanks to all authors who have contributed and to the IOP staff for excellent collaboration in the process of finalizing this special feature.
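The reconstruction idea described in the editorial above, combining many measurements taken around the periphery to unravel a cross-sectional distribution, can be illustrated with a toy algebraic reconstruction (Kaczmarz) iteration. The geometry, ray sums, and values below are invented for illustration and are not drawn from any paper in the issue.

```python
import numpy as np

# Toy algebraic reconstruction (Kaczmarz iteration) on a 2x2-pixel
# cross-section: each "measurement" is a ray sum over a row, a column,
# or a diagonal of the unknown distribution.
A = np.array([
    [1.0, 1.0, 0.0, 0.0],   # ray through the top row
    [0.0, 0.0, 1.0, 1.0],   # ray through the bottom row
    [1.0, 0.0, 1.0, 0.0],   # ray through the left column
    [1.0, 0.0, 0.0, 1.0],   # diagonal ray
])
true_x = np.array([1.0, 2.0, 3.0, 4.0])   # unknown component distribution
b = A @ true_x                            # simulated peripheral measurements

x = np.zeros(4)
for _ in range(2000):                     # Kaczmarz sweeps over all rays
    for a_i, b_i in zip(A, b):
        # Project the current estimate onto the hyperplane of ray i.
        x += (b_i - a_i @ x) / (a_i @ a_i) * a_i

print(np.round(x, 2))                     # approaches true_x
```

Real tomographic systems solve far larger, noisier, and often underdetermined versions of this problem, but the projection step is the same.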
Humphries, Joyce E; Flowe, Heather D; Hall, Louise C; Williams, Louise C; Ryder, Hannah L
2016-01-01
This study examined whether beliefs about face recognition ability differentially influence memory retrieval in older compared to young adults. Participants evaluated their ability to recognise faces and were also given information about their ability to perceive and recognise faces. The information was ostensibly based on an objective measure of their ability, but in actuality, participants had been randomly assigned the information they received (high ability, low ability or no information control). Following this information, face recognition accuracy for a set of previously studied faces was measured using a remember-know memory paradigm. Older adults rated their ability to recognise faces as poorer compared to young adults. Additionally, negative information about face recognition ability improved only older adults' ability to recognise a previously seen face. Older adults were also found to rely more on familiarity-based than item-specific processing compared with young adults, but information about their face recognition ability did not affect face processing style. The role that older adults' memory beliefs have in the meta-cognitive strategies they employ is discussed.
Image processing of metal surface with structured light
NASA Astrophysics Data System (ADS)
Luo, Cong; Feng, Chang; Wang, Congzheng
2014-09-01
In a structured-light vision measurement system, the ideal image of a structured-light stripe contains, apart from the black background, only the gray-level information at the position of the stripe. The actual image, however, contains noise, complex background, and other content that does not belong to the stripe and interferes with the useful information. To extract the stripe center on a metal surface accurately, a new processing method is presented. Adaptive median filtering first removes most of the noise, and the noise introduced by the CCD camera and the measurement environment is further removed with a difference-image method. To highlight fine details and enhance the blurred regions between the stripe and the background, a sharpening algorithm is used that combines the best features of the Laplacian and Sobel operators. Morphological opening and closing operations then compensate for the loss of information. Experimental results show that this method is effective in the image processing: it not only suppresses the unwanted information but also heightens contrast, which benefits the subsequent processing.
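The core of the processing chain described above (median filtering, then sharpening) can be sketched in a few lines. This is a simplified stand-in: it uses a plain 4-neighbour Laplacian rather than the paper's combined Laplacian/Sobel operator, omits the difference-image and morphological steps, and runs on a tiny synthetic image; a real system would typically use a library such as OpenCV.

```python
import numpy as np

def median3x3(img):
    """3x3 median filter (border pixels left unfiltered for brevity):
    removes impulse noise while preserving the stripe."""
    out = img.astype(float).copy()
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            out[i, j] = np.median(img[i-1:i+2, j-1:j+2])
    return out

def laplacian_sharpen(img, weight=1.0):
    """Sharpen by subtracting the 4-neighbour Laplacian."""
    lap = np.zeros_like(img)
    lap[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                       img[1:-1, :-2] + img[1:-1, 2:] -
                       4.0 * img[1:-1, 1:-1])
    return img - weight * lap

# Synthetic stripe image: a bright horizontal band on a dark
# background, plus one salt-noise pixel above the stripe.
img = np.zeros((7, 7))
img[2:5, :] = 200.0        # the light stripe
img[1, 3] = 255.0          # impulse noise

denoised = median3x3(img)
sharpened = laplacian_sharpen(denoised)
print(denoised[1, 3], sharpened[2, 3])   # noise removed; stripe edge boosted
```

The median filter deletes the isolated noise pixel, and the Laplacian pass overshoots at the stripe boundary, which is what makes the stripe edges easier to localize.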
Developing a Web-Based Nursing Practice and Research Information Management System: A Pilot Study.
Choi, Jeeyae; Lapp, Cathi; Hagle, Mary E
2015-09-01
Many hospital information systems have been developed and implemented to collect clinical data from the bedside and have used the information to improve patient care. Because of a growing awareness that the use of clinical information improves quality of care and patient outcomes, measuring tools (electronic and paper based) have been developed, but most of them require multiple steps of data collection and analysis. This necessitated the development of a Web-based Nursing Practice and Research Information Management System that processes clinical nursing data to measure nurses' delivery of care and its impact on patient outcomes and provides useful information to clinicians, administrators, researchers, and policy makers at the point of care. This pilot study developed a computer algorithm based on a falls prevention protocol and programmed the prototype Web-based Nursing Practice and Research Information Management System. It successfully measured the performance of nursing care delivered and its impact on patient outcomes using clinical nursing data from the study site. Although the Nursing Practice and Research Information Management System was tested with small data sets, results of the study revealed that it has the potential to measure nurses' delivery of care and its impact on patient outcomes, while pinpointing components of the nursing process in need of improvement.
ERIC Educational Resources Information Center
Chase, Justin P.; Yan, Zheng
2017-01-01
The ability to effective learn, process, and retain new information is critical to the success of any student. Since mathematics are becoming increasingly more important in our educational systems, it is imperative that we devise an efficient system to measure these types of information recall. "Assessing and Measuring Statistics Cognition in…
Scantlebury, Nadia; Bouffet, Eric; Laughlin, Suzanne; Strother, Douglas; McConnell, Dina; Hukin, Juliette; Fryer, Christopher; Laperriere, Normand; Montour-Proulx, Isabelle; Keene, Daniel; Fleming, Adam; Jabado, Nada; Liu, Fang; Riggs, Lily; Law, Nicole; Mabbott, Donald J
2016-05-01
We compared the structure of specific white matter tracts and information processing speed between children treated for posterior fossa tumors with cranial-spinal radiation (n = 30), or with surgery +/- focal radiation (n = 29), and healthy children (n = 37). Probabilistic diffusion tensor imaging (DTI) tractography was used to delineate the inferior longitudinal fasciculi, optic radiation, inferior frontal occipital fasciculi, and uncinate fasciculi bilaterally. Information processing speed was measured using the coding and symbol search subtests of the Wechsler Intelligence Scales, and visual matching, pair cancellation, and rapid picture naming subtests of the Woodcock-Johnson Test of Cognitive Ability, 3rd revision. We examined group differences using repeated measures MANOVAs and path analyses were used to test the relations between treatment, white matter structure of the tracts, and information processing speed. DTI indices of the optic radiations, the inferior longitudinal fasciculi, and the inferior fronto-occipital fasciculi differed between children treated with cranial-spinal radiation and children treated with surgery +/- focal radiation, and healthy controls (p = .045). Children treated with cranial-spinal radiation also exhibited lower processing speed scores relative to healthy control subjects (p = .002). Notably, we observed that group differences in information processing speed were related to the structure of the right optic radiation (p = .002). We show that cranial-spinal radiation may have a negative impact on information processing speed via insult to the right optic radiations. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Ozone Lidar Observations for Air Quality Studies
NASA Technical Reports Server (NTRS)
Wang, Lihua; Newchurch, Mike; Kuang, Shi; Burris, John F.; Huang, Guanyu; Pour-Biazar, Arastoo; Koshak, William; Follette-Cook, Melanie B.; Pickering, Kenneth E.; McGee, Thomas J.;
2015-01-01
Tropospheric ozone lidars are well suited to measuring the high spatio-temporal variability of this important trace gas. Furthermore, lidar measurements in conjunction with balloon soundings, aircraft, and satellite observations provide substantial information about a variety of atmospheric chemical and physical processes. Examples of processes elucidated by ozone-lidar measurements are presented, and modeling studies using WRF-Chem, RAQMS, and DALES/LES models illustrate our current understanding and shortcomings of these processes.
Analysis of a document/reporting system
NASA Technical Reports Server (NTRS)
Narrow, B.
1971-01-01
An in-depth analysis of the information system within the Data Processing Branch is presented. Quantitative measures are used to evaluate the efficiency and effectiveness of the information system. It is believed that this is the first documented study which utilizes quantitative measures for full-scale system analysis. The quantitative measures and techniques for collecting and qualifying the basic data, as described, are applicable to any information system. Therefore this report is considered to be of interest to anyone concerned with the management, design, analysis, or evaluation of information systems.
AOD furnace splash soft-sensor in the smelting process based on improved BP neural network
NASA Astrophysics Data System (ADS)
Ma, Haitao; Wang, Shanshan; Wu, Libin; Yu, Ying
2017-11-01
For the production of low-carbon ferrochrome by argon-oxygen refining, splash during the smelting process is taken as the research object. Based on an analysis of the splash mechanism, a method using multi-sensor information fusion and BP neural network modeling is proposed in this paper. The vibration signal, the audio signal, and the flame-image signal from the furnace are used as the characteristic signals of splash; these signals are fused and modeled, the splash signal is reconstructed, and soft measurement of splash in the smelting process is thereby realized. Simulation results show that the method can accurately forecast the type of splash in the smelting process, providing a new measurement method for splash forecasting and more accurate information for splash control.
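The fusion-and-classification idea can be sketched with a minimal BP (backpropagation) network. This is an illustrative sketch only, not the paper's model: the three "fused" features standing in for vibration level, audio level, and flame-image brightness, the labels, the architecture, and the training settings are all invented for demonstration.

```python
import numpy as np

# Tiny BP network: three fused sensor features -> one splash/no-splash
# output. Data and hyperparameters are invented.
rng = np.random.default_rng(0)
X = rng.random((40, 3))                   # fused sensor features in [0, 1]
y = (X.sum(axis=1) > 1.5).astype(float)   # toy "splash occurred" label

W1 = rng.normal(0.0, 0.5, (3, 5)); b1 = np.zeros(5)
W2 = rng.normal(0.0, 0.5, (5, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 2.0

for _ in range(5000):                     # full-batch gradient descent
    h = sigmoid(X @ W1 + b1)              # hidden layer
    p = sigmoid(h @ W2 + b2).ravel()      # predicted splash probability
    g_out = (p - y)[:, None] / len(y)     # cross-entropy gradient at output
    g_hid = g_out @ W2.T * h * (1.0 - h)  # backpropagated to hidden layer
    W2 -= lr * (h.T @ g_out); b2 -= lr * g_out.sum(axis=0)
    W1 -= lr * (X.T @ g_hid); b1 -= lr * g_hid.sum(axis=0)

p = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).ravel()
accuracy = ((p > 0.5) == (y > 0.5)).mean()
print(accuracy)
```

The real system would feed genuinely heterogeneous signals (vibration spectra, audio features, image statistics) into a larger network, but the training loop has the same shape.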
Dickey, Michael Walsh; Warren, Tessa
2016-01-01
Purpose: This study examined the influence of verb–argument information and event-related plausibility on prediction of upcoming event locations in people with aphasia, as well as older and younger neurotypical adults. It investigated how these types of information interact during anticipatory processing and how the ability to take advantage of the different types of information is affected by aphasia. Method: This study used a modified visual-world task to examine eye movements and offline photo selection. Twelve adults with aphasia (aged 54–82 years) as well as 44 young adults (aged 18–31 years) and 18 older adults (aged 50–71 years) participated. Results: Neurotypical adults used verb argument status and plausibility information to guide both eye gaze (a measure of anticipatory processing) and image selection (a measure of ultimate interpretation). Argument status did not affect the behavior of people with aphasia in either measure. There was only limited evidence of interaction between these 2 factors in eye gaze data. Conclusions: Both event-related plausibility and verb-based argument status contributed to anticipatory processing of upcoming event locations among younger and older neurotypical adults. However, event-related likelihood had a much larger role in the performance of people with aphasia than did verb-based knowledge regarding argument structure. PMID:27997951
Audit of the informed consent process as a part of a clinical research quality assurance program.
Lad, Pramod M; Dahl, Rebecca
2014-06-01
Audits of the informed consent process are a key element of a clinical research quality assurance program. A systematic approach to such audits has not been described in the literature. In this paper we describe two components of the audit. The first is the audit of the informed consent document to verify adherence with federal regulations. The second component is comprised of the audit of the informed consent conference, with emphasis on a real time review of the appropriate communication of the key elements of the informed consent. Quality measures may include preparation of an informed consent history log, notes to accompany the informed consent, the use of an informed consent feedback tool, and the use of institutional surveys to assess comprehension of the informed consent process.
New Developments in Developmental Research on Social Information Processing and Antisocial Behavior
ERIC Educational Resources Information Center
Fontaine, Reid Griffith
2010-01-01
The Special Section on developmental research on social information processing (SIP) and antisocial behavior is here introduced. Following a brief history of SIP theory, comments on several themes--measurement and assessment, attributional and interpretational style, response evaluation and decision, and the relation between emotion and SIP--that…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-29
... decision making. It also aims to advance the science of measuring treatment preferences of patients...: Incorporating Patient Preference Information into the Medical Device Regulatory Processes.'' The purpose of the... predictability, consistency, and transparency of the premarket review process. In 2012, CDRH published the...
Young Children's Social Information Processing: Family Antecedents and Behavioral Correlates
ERIC Educational Resources Information Center
Runions, Kevin C.; Keating, Daniel P.
2007-01-01
Little research has examined whether social information processing (SIP) measures from early childhood predict externalizing problems beyond the shared association with familial risk markers. In the present study, family antecedents and first-grade externalizing behaviors were studied in relation to preschool and 1st-grade SIP using data from…
ERIC Educational Resources Information Center
Denham, Susanne A.; Way, Erin; Kalb, Sara C.; Warren-Khot, Heather K.; Bassett, Hideko H.
2013-01-01
As part of a larger longitudinal project on the assessment of preschoolers' social-emotional development, children's social information processing (SIP) responses to unambiguous hypothetical situations of peer provocation were assessed for 298 four-year-olds from Head Start and private childcare settings. Measurement focused on emotions children…
Markiewicz, Łukasz; Kubińska, Elżbieta
2015-01-01
This paper aims to provide insight into information processing differences between hot and cold risk taking decision tasks within a single domain. Decision theory defines risky situations using at least three parameters: outcome one (often a gain) with its probability and outcome two (often a loss) with a complementary probability. Although a rational agent should consider all of the parameters, s/he could potentially narrow their focus to only some of them, particularly when explicit Type 2 processes do not have the resources to override implicit Type 1 processes. Here we investigate differences in risky situation parameters' influence on hot and cold decisions. Although previous studies show lower information use in hot than in cold processes, they do not provide decision weight changes and therefore do not explain whether this difference results from worse concentration on each parameter of a risky situation (probability, gain amount, and loss amount) or from ignoring some parameters. Two studies were conducted, with participants performing the Columbia Card Task (CCT) in either its Cold or Hot version. In the first study, participants also performed the Cognitive Reflection Test (CRT) to monitor their ability to override Type 1 processing cues (implicit processes) with Type 2 explicit processes. Because hypothesis testing required comparison of the relative importance of risky situation decision weights (gain, loss, probability), we developed a novel way of measuring information use in the CCT by employing a conjoint analysis methodology. Across the two studies, results indicated that in the CCT Cold condition decision makers concentrate on each information type (gain, loss, probability), but in the CCT Hot condition they concentrate mostly on a single parameter: probability of gain/loss. We also show that an individual's CRT score correlates with information use propensity in cold but not hot tasks. 
Thus, the affective dimension of hot tasks inhibits correct information processing, probably because it is difficult to engage Type 2 processes in such circumstances. Individuals' Type 2 processing abilities (measured by the CRT) assist greater use of information in cold tasks but do not help in hot tasks.
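The decision-weight idea can be sketched as a regression: fit simulated responses on the three risky-situation parameters (gain, loss, probability) and normalize the absolute standardized coefficients into relative-importance weights, loosely in the spirit of a conjoint-style analysis. The simulated "hot" decision maker below attends mostly to probability; all numbers are invented and this is not the authors' estimation procedure.

```python
import numpy as np

# Invented responses from a decision maker who weights probability far
# more heavily than gain or loss amounts.
rng = np.random.default_rng(1)
n = 200
gain = rng.random(n); loss = rng.random(n); prob = rng.random(n)
response = 0.1 * gain - 0.1 * loss + 0.9 * prob + rng.normal(0, 0.05, n)

X = np.column_stack([gain, loss, prob])
X_std = (X - X.mean(axis=0)) / X.std(axis=0)    # z-score each parameter
y_std = (response - response.mean()) / response.std()
beta, *_ = np.linalg.lstsq(X_std, y_std, rcond=None)
importance = np.abs(beta) / np.abs(beta).sum()  # relative decision weights
print(np.round(importance, 2))                  # probability dominates
```

A cold-task decision maker, in the paper's account, would show three comparably sized weights instead of one dominant one.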
An investigation into social information processing in young people with Asperger syndrome.
Flood, Andrea Mary; Julian Hare, Dougal; Wallis, Paul
2011-09-01
Deficits in social functioning are a core feature of autistic spectrum disorders (ASD), being linked to various cognitive and developmental factors, but there has been little attempt to draw on normative models of social cognition to understand social behaviour in ASD. The current study explored the utility of Crick and Dodge's (1994) information processing model to studying social cognition in ASD, and examined associations between social information processing patterns, theory of mind skills and social functioning. A matched-group design compared young people with Asperger syndrome with typically developing peers, using a social information processing interview previously designed for this purpose. The Asperger syndrome group showed significantly different patterns of information processing at the intent attribution, response generation and response evaluation stages of the information processing model. Theory of mind skills were found to be significantly associated with parental ratings of peer problems in the Asperger syndrome group but not with parental ratings of pro-social behaviour, with only limited evidence of an association between social information processing and measures of theory of mind and social functioning. Overall, the study supports the use of normative social information processing approaches to understanding social functioning in ASD.
Evaluation methodologies for an advanced information processing system
NASA Technical Reports Server (NTRS)
Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.
1984-01-01
The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.
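A minimal instance of the Markov-model approach described above: a redundant two-unit processor degrades toward an absorbing failure state, reliability is the probability of not having failed, and performability weights each state's probability by a performance level. The failure rates and performance levels below are invented for illustration, not AIPS parameters.

```python
import numpy as np

lam = 0.01                        # per-step failure rate of one unit
P = np.array([                    # discrete-time transition matrix
    [1 - 2*lam, 2*lam,    0.0],   # state 0: both units up
    [0.0,       1 - lam,  lam],   # state 1: one unit up
    [0.0,       0.0,      1.0],   # state 2: system failed (absorbing)
])
perf = np.array([1.0, 0.5, 0.0])  # performance level in each state

state = np.array([1.0, 0.0, 0.0])   # start with both units working
for _ in range(100):
    state = state @ P               # propagate the state distribution

reliability = state[0] + state[1]   # P(system not failed) at t = 100
performability = state @ perf       # expected performance level at t = 100
print(round(reliability, 3), round(performability, 3))
```

The same state-probability law is what yields the mean and variance used as performability measures of merit.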
Burke, Daniel; Linder, Susan; Hirsch, Joshua; Dey, Tanujit; Kana, Daniel; Ringenbach, Shannon; Schindler, David; Alberts, Jay
2017-10-01
Information processing is typically evaluated using simple reaction time (SRT) and choice reaction time (CRT) paradigms in which a specific response is initiated following a given stimulus. The measurement of reaction time (RT) has evolved from monitoring the timing of mechanical switches to computerized paradigms. The proliferation of mobile devices with touch screens makes them a natural next technological approach to assess information processing. The aims of this study were to determine the validity and reliability of using a mobile device (Apple iPad or iTouch) to accurately measure RT. Sixty healthy young adults completed SRT and CRT tasks using a traditional test platform and mobile platforms on two occasions. The SRT was similar across test modality: 300, 287, and 280 milliseconds (ms) for the traditional, iPad, and iTouch, respectively. The CRT was similar within mobile devices, though slightly faster on the traditional platform: 359, 408, and 384 ms for traditional, iPad, and iTouch, respectively. Intraclass correlation coefficients ranged from 0.79 to 0.85 for SRT and from 0.75 to 0.83 for CRT. The similarity and reliability of SRT across platforms and consistency of SRT and CRT across test conditions indicate that mobile devices provide the next generation of assessment platforms for information processing.
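The intraclass correlation coefficients reported above can be computed from a standard two-way ANOVA decomposition. The sketch below implements ICC(2,1) (two-way random effects, absolute agreement) and applies it to invented test-retest reaction-time data; the subject count, means, and noise levels are assumptions, not the study's data.

```python
import numpy as np

def icc_2_1(data):
    """ICC(2,1) from an (n subjects x k sessions) matrix of scores."""
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)      # per-subject means
    col_means = data.mean(axis=0)      # per-session means
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ((data - grand) ** 2).sum() - ss_rows - ss_cols
    ms_r = ss_rows / (n - 1)           # between-subjects mean square
    ms_c = ss_cols / (k - 1)           # between-sessions mean square
    ms_e = ss_err / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Simulated test-retest reaction times (ms): 20 subjects, 2 sessions,
# stable subject means plus small session noise (all values invented).
rng = np.random.default_rng(2)
subject_mean = rng.normal(300, 30, 20)
sessions = np.column_stack([subject_mean + rng.normal(0, 5, 20),
                            subject_mean + rng.normal(0, 5, 20)])
print(round(icc_2_1(sessions), 2))
```

With between-subject variability much larger than session noise, the ICC lands near 1, the pattern the study reports across platforms.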
Developing measures for information ergonomics in knowledge work.
Franssila, Heljä; Okkonen, Jussi; Savolainen, Reijo
2016-03-01
Information ergonomics is an evolving application domain of ergonomics focusing on the management of workload in the real-world contexts of information-intensive tasks. This study introduces a method for the evaluation of information ergonomics in knowledge work. To this end, five key dimensions of information ergonomics were identified: contextual factors of knowledge work, multitasking, interruptions at work, practices for managing information load, and perceived job control and productivity. In total, 24 measures focusing on the above dimensions were constructed. The measures include, for example, the number of fragmented work tasks per work day. The measures were preliminarily tested in two Finnish organisations, making use of empirical data gathered by interviews, electronic questionnaires and log data applications tracking work processes on personal computers. The measures are applicable to the evaluation of information ergonomics, even though individual measures vary with regard to the amount of work and time needed for data analysis. Practitioner Summary: The study introduces a method for the evaluation of information ergonomics in knowledge work. To this end, 24 measures were constructed and tested empirically. The measures focus on contextual factors of knowledge work, multitasking, interruptions at work, practices for managing information load, and perceived job control and productivity.
Working memory capacity and redundant information processing efficiency.
Endres, Michael J; Houpt, Joseph W; Donkin, Chris; Finn, Peter R
2015-01-01
Working memory capacity (WMC) is typically measured by the amount of task-relevant information an individual can keep in mind while resisting distraction or interference from task-irrelevant information. The current research investigated the extent to which differences in WMC were associated with performance on a novel redundant memory probes (RMP) task that systematically varied the amount of to-be-remembered (targets) and to-be-ignored (distractor) information. The RMP task was designed to both facilitate and inhibit working memory search processes, as evidenced by differences in accuracy, response time, and Linear Ballistic Accumulator (LBA) model estimates of information processing efficiency. Participants (N = 170) completed standard intelligence tests and dual-span WMC tasks, along with the RMP task. As expected, accuracy, response-time, and LBA model results indicated memory search and retrieval processes were facilitated under redundant-target conditions, but also inhibited under mixed target/distractor and redundant-distractor conditions. Repeated measures analyses also indicated that, while individuals classified as high (n = 85) and low (n = 85) WMC did not differ in the magnitude of redundancy effects, groups did differ in the efficiency of memory search and retrieval processes overall. Results suggest that redundant information reliably facilitates and inhibits the efficiency or speed of working memory search, and these effects are independent of more general limits and individual differences in the capacity or space of working memory.
Controlling user access to electronic resources without password
Smith, Fred Hewitt
2015-06-16
Described herein are devices and techniques for remotely controlling user access to a restricted computer resource. The process includes pre-determining an association of the restricted computer resource and computer-resource-proximal environmental information. Indicia of user-proximal environmental information are received from a user requesting access to the restricted computer resource. Received indicia of user-proximal environmental information are compared to the associated computer-resource-proximal environmental information. User access to the restricted computer resource is selectively granted responsive to a favorable comparison in which the user-proximal environmental information is sufficiently similar to the computer-resource-proximal environmental information. In at least some embodiments, the process further includes receiving a user-supplied biometric measure and comparing it with at least one predetermined biometric measure of an authorized user. Access to the restricted computer resource is granted in response to a favorable comparison.
Cao, Youfang; Wang, Lianjie; Xu, Kexue; Kou, Chunhai; Zhang, Yulei; Wei, Guifang; He, Junjian; Wang, Yunfang; Zhao, Liping
2005-07-26
A new algorithm for assessing similarity between primer and template has been developed based on the hypothesis that annealing of primer to template is an information transfer process. Primer sequence is converted to a vector of the full potential hydrogen numbers (3 for G or C, 2 for A or T), while template sequence is converted to a vector of the actual hydrogen bond numbers formed after primer annealing. The former is considered as source information and the latter destination information. An information coefficient is calculated as a measure for fidelity of this information transfer process and thus a measure of similarity between primer and potential annealing site on template. Successful prediction of PCR products from whole genomic sequences with a computer program based on the algorithm demonstrated the potential of this new algorithm in areas like in silico PCR and gene finding.
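The vector construction described above can be sketched in a few lines. The abstract does not give the exact form of the information coefficient, so the fidelity measure below (the fraction of potential hydrogen bonds actually realized, with mismatches contributing zero bonds) is an assumption for illustration only:

```python
def potential_bonds(primer):
    # Source vector: full potential hydrogen-bond counts
    # (3 for G or C, 2 for A or T).
    return [3 if b in "GC" else 2 for b in primer.upper()]

def actual_bonds(primer, site):
    # Destination vector: hydrogen bonds formed after annealing.
    # Assumption: a Watson-Crick match forms the full complement of
    # bonds, a mismatch forms none.
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return [
        (3 if p in "GC" else 2) if comp.get(p) == t else 0
        for p, t in zip(primer.upper(), site.upper())
    ]

def information_coefficient(primer, site):
    # Hypothetical fidelity measure for the information transfer from
    # primer to template: realized bonds over potential bonds. The
    # paper's exact formula may differ.
    return sum(actual_bonds(primer, site)) / sum(potential_bonds(primer))
```

For a perfectly complementary site the coefficient is 1.0; each mismatch lowers it in proportion to the bonds lost, so scanning a genome for high-coefficient windows mimics the in silico PCR use case.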
Smerecnik, Chris M R; Mesters, Ilse; Candel, Math J J M; De Vries, Hein; De Vries, Nanne K
2012-01-01
The role of information processing in understanding people's responses to risk information has recently received substantial attention. One limitation of this research concerns the unavailability of a validated questionnaire of information processing. This article presents two studies in which we describe the development and validation of the Information-Processing Questionnaire to meet that need. Study 1 describes the development and initial validation of the questionnaire. Participants were randomized to either a systematic processing or a heuristic processing condition, after which they completed a manipulation check and the initial 15-item questionnaire, and did so again two weeks later. The questionnaire was subjected to factor, reliability, and validity analyses at both measurement times for purposes of cross-validation of the results. A two-factor solution was observed, representing a systematic processing and a heuristic processing subscale. The resulting scale showed good reliability and validity, with the systematic condition scoring significantly higher on the systematic subscale and the heuristic processing condition significantly higher on the heuristic subscale. Study 2 sought to further validate the questionnaire in a field study. Results of the second study corresponded with those of Study 1 and provided further evidence of the validity of the Information-Processing Questionnaire. The availability of this information-processing scale will be a valuable asset for future research and may provide researchers with new research opportunities. © 2011 Society for Risk Analysis.
Mazerolle, Erin L; Wojtowicz, Magdalena A; Omisade, Antonina; Fisk, John D
2013-01-01
Slowed information processing speed is commonly reported in persons with multiple sclerosis (MS), and is typically investigated using clinical neuropsychological tests, which provide sensitive indices of mean-level information processing speed. However, recent studies have demonstrated that within-person variability or intra-individual variability (IIV) in information processing speed may be a more sensitive indicator of neurologic status than mean-level performance on clinical tests. We evaluated the neural basis of increased IIV in mildly affected relapsing-remitting MS patients by characterizing the relation between IIV (controlling for mean-level performance) and white matter integrity using diffusion tensor imaging (DTI). Twenty women with relapsing-remitting MS and 20 matched control participants completed the Computerized Test of Information Processing (CTIP), from which both mean response time and IIV were calculated. Other clinical measures of information processing speed were also collected. Relations between IIV on the CTIP and DTI metrics of white matter microstructure were evaluated using tract-based spatial statistics. We observed slower and more variable responses on the CTIP in MS patients relative to controls. Significant relations between white matter microstructure and IIV were observed for MS patients. Increased IIV was associated with reduced integrity in more white matter tracts than was slowed information processing speed as measured by either mean CTIP response time or other neuropsychological test scores. Thus, despite the common use of mean-level performance as an index of cognitive dysfunction in MS, IIV may be more sensitive to the overall burden of white matter disease at the microstructural level. Furthermore, our study highlights the potential value of considering within-person fluctuations, in addition to mean-level performance, for uncovering brain-behavior relationships in neurologic disorders with widespread white matter pathology.
NASA Astrophysics Data System (ADS)
Tilch, Nils; Römer, Alexander; Jochum, Birgit; Schattauer, Ingrid
2014-05-01
In recent years, Austria has several times been struck by large-scale disasters characterized not only by flooding but also by numerous shallow landslides and debris flows. For the purpose of risk prevention, national and regional authorities therefore require more objective and realistic maps with information about the spatially variable susceptibility of the geosphere to hazard-relevant gravitational mass movements. Many proven methods and models (e.g. neural networks, logistic regression, heuristic methods) are available to create such process-related susceptibility maps (e.g. for shallow gravitational mass movements in soil). However, numerous national and international studies show that the suitability of a method depends on the quality of the process data and parameter maps (e.g. Tilch & Schwarz 2011, Schwarz & Tilch 2011). It is therefore important that maps with detailed, process-oriented information on the process-relevant geosphere are also considered. One major disadvantage is that area-wide process-relevant information exists only occasionally; similarly, soil maps in Austria are often available only for treeless areas. In almost all previous studies, whatever geological and geotechnical maps happened to exist were used, often specially adapted to the issues and objectives at hand. This is one reason why conceptual soil maps must very often be derived from geological maps containing only hard-rock information, which are often of rather low quality. Based on such maps, adjacent areas of different geological composition and process-relevant physical properties are delineated with razor-sharp boundaries, which rarely occur in nature. In order to obtain more realistic information about the spatial variability of the process-relevant geosphere (soil cover) and its physical properties, aerogeophysical measurements (electromagnetic, radiometric) carried out by helicopter in different regions of Austria were interpreted.
Previous studies show that radiometric measurements in particular can determine the two-dimensional spatial variability of the process-relevant soil close to the surface. In addition, electromagnetic measurements are important for obtaining three-dimensional information on the deeper geological conditions and for improving area-specific geological knowledge and understanding. These measurements are validated with terrestrial geoelectrical measurements. Both aspects, radiometric and electromagnetic measurements, are thus important, and the interpreted geophysical results can subsequently be used as parameter maps in the modelling of more realistic susceptibility maps for various processes. This presentation covers results of the geophysical measurements, the derived parameter maps, and first process-oriented susceptibility maps for gravitational soil mass movements. As an example, results obtained with a heuristic method in an area of Vorarlberg (Western Austria) will be shown. References: Schwarz, L. & Tilch, N. (2011): Why are good process data so important for the modelling of landslide susceptibility maps? EGU Poster Session "Landslide hazard and risk assessment, and landslide management" (NH 3.6), Vienna. [http://www.geologie.ac.at/fileadmin/user_upload/dokumente/pdf/poster/poster_2011_egu_schwarz_tilch_1.pdf] Tilch, N. & Schwarz, L. (2011): Spatial and scale-dependent variability in data quality and their influence on susceptibility maps for gravitational mass movements in soil, modelled by heuristic method. EGU Poster Session "Landslide hazard and risk assessment, and landslide management" (NH 3.6), Vienna. [http://www.geologie.ac.at/fileadmin/user_upload/dokumente/pdf/poster/poster_2011_egu_tilch_schwarz.pdf]
Accurate expectancies diminish perceptual distraction during visual search
Sy, Jocelyn L.; Guerin, Scott A.; Stegman, Anna; Giesbrecht, Barry
2014-01-01
The load theory of visual attention proposes that efficient selective perceptual processing of task-relevant information during search is determined automatically by the perceptual demands of the display. If the perceptual demands required to process task-relevant information are not enough to consume all available capacity, then the remaining capacity automatically and exhaustively “spills-over” to task-irrelevant information. The spill-over of perceptual processing capacity increases the likelihood that task-irrelevant information will impair performance. In two visual search experiments, we tested the automaticity of the allocation of perceptual processing resources by measuring the extent to which the processing of task-irrelevant distracting stimuli was modulated by both perceptual load and top-down expectations using behavior, functional magnetic resonance imaging, and electrophysiology. Expectations were generated using a trial-by-trial cue that provided information about the likely load of the upcoming visual search task. When the cues were valid, behavioral interference was eliminated and the influence of load on frontoparietal and visual cortical responses was attenuated relative to when the cues were invalid. In conditions in which task-irrelevant information interfered with performance and modulated visual activity, individual differences in mean blood oxygenation level dependent responses measured from the left intraparietal sulcus were negatively correlated with individual differences in the severity of distraction. These results are consistent with the interpretation that a top-down biasing mechanism interacts with perceptual load to support filtering of task-irrelevant information. PMID:24904374
Particle sizing in rocket motor studies utilizing hologram image processing
NASA Technical Reports Server (NTRS)
Netzer, David; Powers, John
1987-01-01
A technique of obtaining particle size information from holograms of combustion products is described. The holograms are obtained with a pulsed ruby laser through windows in a combustion chamber. The reconstruction is done with a krypton laser with the real image being viewed through a microscope. The particle size information is measured with a Quantimet 720 image processing system which can discriminate various features and perform measurements of the portions of interest in the image. Various problems that arise in the technique are discussed, especially those that are a consequence of the speckle due to the diffuse illumination used in the recording process.
NASA Astrophysics Data System (ADS)
Mouloudakis, K.; Kominis, I. K.
2017-02-01
Radical-ion-pair reactions, central for understanding the avian magnetic compass and spin transport in photosynthetic reaction centers, were recently shown to be a fruitful paradigm of the new synthesis of quantum information science with biological processes. We show here that the master equation so far constituting the theoretical foundation of spin chemistry violates fundamental bounds for the entropy of quantum systems, in particular the Ozawa bound. In contrast, a recently developed theory based on quantum measurements, quantum coherence measures, and quantum retrodiction, thus exemplifying the paradigm of quantum biology, satisfies the Ozawa bound as well as the Lanford-Robinson bound on information extraction. By considering Groenewold's information, the quantum information extracted during the reaction, we reproduce the known and unravel other magnetic-field effects not conveyed by reaction yields.
Gillies, Katie; Elwyn, Glyn; Cook, Jonathan
2014-07-30
Informed consent of trial participants is both an ethical and a legal requirement. When facing a decision about trial participation, potential participants are provided with information about the trial and have the opportunity to have any questions answered before their degree of 'informed-ness' is assessed, usually subjectively, and before they are asked to sign a consent form. Currently, standardised methods for assessing informed consent have tended to focus on aspects of understanding and associated outcomes, rather than on the process of consent and the steps associated with decision-making. Potential trial participants who were approached regarding participation in one of three randomised controlled trials were asked to complete a short questionnaire to measure their deliberation about trial participation. A total of 136 participants completed the 10-item questionnaire (DelibeRATE) before they made an explicit decision about trial participation (defined as signing the clinical trial consent form). Overall DelibeRATE scores were compared and investigated for differences between trial consenters and refusers. No differences in overall DelibeRATE scores were identified. In addition, there was no significant difference between overall score and the decision to participate, or not, in the parent trial. To our knowledge, this is the first study to prospectively measure the deliberation stage of the informed consent decision-making process of potential trial participants across different conditions and clinical areas. Although no differences were detected in the overall scores of trial consenters and refusers, we did identify some interesting findings. These findings should be taken into consideration by those designing trials and by others interested in developing and implementing measures of potential trial participants' decision-making during the informed consent process for research.
International Standard Randomised Controlled Trial Number (ISRCTN) Register ISRCTN60695184 (date of registration: 13 May 2009), ISRCTN80061723 (date of registration: 8 March 2010), ISRCTN69423238 (date of registration: 18 November 2010).
Interactive information processing for NASA's mesoscale analysis and space sensor program
NASA Technical Reports Server (NTRS)
Parker, K. G.; Maclean, L.; Reavis, N.; Wilson, G.; Hickey, J. S.; Dickerson, M.; Karitani, S.; Keller, D.
1985-01-01
The Atmospheric Sciences Division (ASD) of the Systems Dynamics Laboratory at NASA's Marshall Space Flight Center (MSFC) is currently involved in interactive information processing for the Mesoscale Analysis and Space Sensor (MASS) program. Specifically, the ASD is engaged in the development and implementation of new space-borne remote sensing technology to observe and measure mesoscale atmospheric processes. These space measurements and conventional observational data are being processed together to gain an improved understanding of the mesoscale structure and the dynamical evolution of the atmosphere relative to cloud development and precipitation processes. To satisfy its vast data processing requirements, the ASD has developed a Researcher Computer System consisting of three primary computer systems, which provides over 20 scientists with a wide range of capabilities for processing and displaying large volumes of remote sensing data. Each of the computers performs a specific function according to its unique capabilities.
A multidimensional evaluation of a nursing information-literacy program.
Fox, L M; Richter, J M; White, N E
1996-01-01
The goal of an information-literacy program is to develop student skills in locating, evaluating, and applying information for use in critical thinking and problem solving. This paper describes a multidimensional evaluation process for determining nursing students' growth in cognitive and affective domains. Results indicate improvement in student skills as a result of a nursing information-literacy program. Multidimensional evaluation produces a well-rounded picture of student progress based on formal measurement as well as informal feedback. Developing new educational programs can be a time-consuming challenge. It is important, when expending so much effort, to ensure that the goals of the new program are achieved and benefits to students demonstrated. A multidimensional approach to evaluation can help to accomplish those ends. In 1988, The University of Northern Colorado School of Nursing began working with a librarian to integrate an information-literacy component, entitled Pathways to Information Literacy, into the curriculum. This article describes the program and discusses how a multidimensional evaluation process was used to assess program effectiveness. The evaluation process not only helped to measure the effectiveness of the program but also allowed the instructors to use several different approaches to evaluation. PMID:8826621
NASA Technical Reports Server (NTRS)
Kaber, David B.; McClernon, Christopher K.; Perry, Carlene M.; Segall, Noa
2004-01-01
The goal of this research was to define a measure of situation awareness (SA) in an air traffic control (ATC) task and to assess the influence of adaptive automation (AA) of various information processing functions on controller perception, comprehension and projection. The measure was also to serve as a basis for defining and developing an approach to triggering dynamic control allocations, as part of AA, based on controller SA. To achieve these objectives, an enhanced version of an ATC simulation (Multitask©) was developed for use in two human factors experiments. The simulation captured the basic functions of Terminal Radar Approach Control (TRACON) and was capable of presenting to operators four different modes of control, including information acquisition, information analysis, decision making and action implementation automation, as well as a completely manual control mode. The SA measure that was developed as part of the research was based on the Situation Awareness Global Assessment Technique (SAGAT), previous goal-directed task analyses of enroute control and TRACON, and a separate cognitive task analysis on the ATC simulation. The results of the analysis on Multitask were used as a basis for formulating SA queries as part of the SAGAT-based approach to measuring controller SA, which was used in the experiments. A total of 16 subjects were recruited for both experiments. Half the subjects were used in Experiment #1, which focused on assessing the sensitivity and reliability of the SA measurement approach in the ATC simulation. Comparisons were made of manual versus automated control. The remaining subjects were used in the second experiment, which was intended to more completely describe the SA implications of AA applied to specific controller information processing functions, and to describe how the measure could ultimately serve as a trigger of dynamic function allocations in the application of AA to ATC.
Comparisons were made of the sensitivity of the SA measure to automation manipulations impacting both higher-order information processing functions, such as information analysis and decision making, versus lower-order functions, including information acquisition and action implementation. All subjects were exposed to all forms of AA of the ATC task and the manual control condition. The approach to AA used in both experiments was to match operator workload, assessed using a secondary task, to dynamic control allocations in the primary task. In total, the subjects in each experiment participated in 10 trials with each lasting between 45 minutes and 1 hour. In both experiments, ATC performance was measured in terms of aircraft cleared, conflicting, and collided. Secondary task (gauge monitoring) performance was assessed in terms of a hit-to-signal ratio. As part of the SA measure, three simulation freezes were conducted during each trial to administer queries on Level 1, 2, and 3 SA.
Student Team Projects in Information Systems Development: Measuring Collective Creative Efficacy
ERIC Educational Resources Information Center
Cheng, Hsiu-Hua; Yang, Heng-Li
2011-01-01
For information systems development project student teams, learning how to improve software development processes is an important training. Software process improvement is an outcome of a number of creative behaviours. Social cognitive theory states that the efficacy of judgment influences behaviours. This study explores the impact of three types…
ERIC Educational Resources Information Center
Calvete, Esther; Orue, Izaskun
2012-01-01
This longitudinal investigation assessed whether cognitive schemas of justification of violence, mistrust, and narcissism predicted social information processing (SIP), and SIP in turn predicted aggressive behavior in adolescents. A total of 650 adolescents completed measures of cognitive schemas at Time 1, SIP in ambiguous social scenarios at…
NIST: Information Management in the AMRF
NASA Technical Reports Server (NTRS)
Callaghan, George (Editor)
1991-01-01
The information management strategies developed for the NIST Automated Manufacturing Research Facility (AMRF) - a prototype small batch manufacturing facility used for integration and measurement related standards research are outlined in this video. The five major manufacturing functions - design, process planning, off-line programming, shop floor control, and materials processing are explained and their applications demonstrated.
Linguistic Complexity and Information Structure in Korean: Evidence from Eye-Tracking during Reading
ERIC Educational Resources Information Center
Lee, Yoonhyoung; Lee, Hanjung; Gordon, Peter C.
2007-01-01
The nature of the memory processes that support language comprehension and the manner in which information packaging influences online sentence processing were investigated in three experiments that used eye-tracking during reading to measure the ease of understanding complex sentences in Korean. All three experiments examined reading of embedded…
Giorgi, R; Gouvernet, J; Dufour, J; Degoulet, P; Laugier, R; Quilichini, F; Fieschi, M
2001-01-01
To present the method used to elaborate and formalize current scientific knowledge in order to provide physicians with tools, available on the Internet, that enable them to evaluate individual patient risk and give personalized preventive recommendations or early screening measures. The approach suggested in this article is in line with medical procedures based on levels of evidence (Evidence-based Medicine). A cyclical process for developing recommendations allows us to quickly incorporate current scientific information. At each phase, the analysis is reevaluated by experts in the field collaborating on the project. The information is formalized through the use of levels of evidence and grades of recommendations. The GLIF model is used to implement recommendations for clinical practice guidelines. Incorporating the most current scientific evidence in a cyclical process involves several steps: critical analysis according to the Evidence-based Medicine method; identification of predictive factors; setting up risk levels; identification of prevention measures; and elaboration of personalized recommendations. The information technology implementation of the clinical practice guideline enables physicians to quickly obtain personalized information for their patients. Cases of colorectal cancer prevention illustrate our approach. Integration of current scientific knowledge is an important process. The delay between the moment new information arrives and the moment the practitioner applies it is thus reduced.
Sleboda, Patrycja; Sokolowska, Joanna
2017-01-01
The first goal of this study was to validate the Rational-Experiential Inventory (REI) and the Cognitive Reflection Test (CRT) through checking their relation to the transitivity axiom. The second goal was to test the relation between decision strategies and cognitive style as well as the relation between decision strategies and the transitivity of preferences. The following characteristics of strategies were investigated: requirements for trade-offs, maximization vs. satisficing and option-wise vs. attribute-wise information processing. Respondents were given choices between two multi-attribute options. The options were designed so that the choice indicated which strategy was applied. Both the REI-R and the CRT were found to be good predictors of the transitivity of preferences. Respondents who applied compensatory strategies and the maximization criterion scored highly on the REI-R and in the CRT, whereas those who applied the satisficing rule scored highly on the REI-R but not in the CRT. Attribute-wise information processing was related to low scores in both measurements. Option-wise information processing led to a high transitivity of preferences. PMID:29093695
Experimental verification of an indefinite causal order
Rubino, Giulia; Rozema, Lee A.; Feix, Adrien; Araújo, Mateus; Zeuner, Jonas M.; Procopio, Lorenzo M.; Brukner, Časlav; Walther, Philip
2017-01-01
Investigating the role of causal order in quantum mechanics has recently revealed that the causal relations of events may not be a priori well defined in quantum theory. Although this has triggered a growing interest on the theoretical side, creating processes without a causal order is a demanding experimental task. We report the first decisive demonstration of a process with an indefinite causal order. To do this, we quantify how incompatible our setup is with a definite causal order by measuring a “causal witness.” This mathematical object incorporates a series of measurements that are designed to yield a certain outcome only if the process under examination is not consistent with any well-defined causal order. In our experiment, we perform a measurement in a superposition of causal orders—without destroying the coherence—to acquire information both inside and outside of a “causally nonordered process.” Using this information, we experimentally determine a causal witness, demonstrating by almost 7 SDs that the experimentally implemented process does not have a definite causal order. PMID:28378018
ERIC Educational Resources Information Center
Cranford, Kristen N.; Tiettmeyer, Jessica M.; Chuprinko, Bryan C.; Jordan, Sophia; Grove, Nathaniel P.
2014-01-01
Information processing provides a powerful model for understanding how learning occurs and highlights the important role that cognitive load plays in this process. In instances in which the cognitive load of a problem exceeds the available working memory, learning can be seriously hindered. Previously reported methods for measuring cognitive load…
DOD Acquisition Information Management
1994-09-30
instead of on a real-time management information flow. The process of identifying risks and implementing corrective actions is lengthened by using the current system; performance measurement and reporting are impeded.
76 FR 34745 - Delegation of Authority to the Chief Operating Officer
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-14
... Information Act processing, budgeting, accounting, hiring and training employees, modernizing information... strategic planning, and performance management and measurement. Section B. Authority Excepted The authority...
Jackson, Carrie N.; Dussias, Paola E.; Hristova, Adelina
2012-01-01
This study uses eye-tracking to examine the processing of case-marking information in ambiguous subject- and object-first wh-questions in German. The position of the lexical verb was also manipulated via verb tense to investigate whether verb location influences how intermediate L2 learners process L2 sentences. Results show that intermediate L2 German learners were sensitive to case-marking information, exhibiting longer processing times on subject-first than object-first sentences, regardless of verb location. German native speakers exhibited the opposite word order preference, with longer processing times on object-first than subject-first sentences, replicating previous findings. These results are discussed in light of current L2 processing research, highlighting how methodological constraints influence researchers’ abilities to measure the on-line processing of morphosyntactic information among intermediate L2 learners. PMID:23493761
Fuzzy geometry, entropy, and image information
NASA Technical Reports Server (NTRS)
Pal, Sankar K.
1991-01-01
Presented here are various uncertainty measures arising from grayness ambiguity and spatial ambiguity in an image, and their possible applications as image information measures. Definitions are given of an image in the light of fuzzy set theory, and of information measures and tools relevant for processing/analysis, e.g., fuzzy geometrical properties, correlation, bound functions, and entropy measures. Also given is a formulation of algorithms, along with management of uncertainties, for segmentation, object extraction, and edge detection. The output obtained here is both fuzzy and nonfuzzy. Ambiguity in the evaluation and assessment of membership functions is also described.
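The grayness-ambiguity idea can be sketched with the classic De Luca-Termini fuzzy entropy: map gray levels to a membership value in [0, 1] and score how far each pixel sits from a crisp 0 or 1. The membership function below (simple division by the maximum gray level) is one common choice, assumed here for illustration.

```python
import numpy as np

def fuzzy_entropy(img):
    """De Luca-Termini fuzzy entropy of a grayscale image.

    Gray levels are mapped to a membership mu in [0, 1] (here g / 255);
    entropy is 0 for a crisp image (all mu in {0, 1}) and maximal when
    every pixel sits at mu = 0.5.
    """
    mu = img.astype(float) / 255.0
    mu = np.clip(mu, 1e-12, 1 - 1e-12)  # avoid log(0)
    s = -(mu * np.log(mu) + (1 - mu) * np.log(1 - mu))
    return float(s.sum() / (img.size * np.log(2)))

crisp = np.array([[0, 255], [255, 0]])  # fully black/white: no ambiguity
gray = np.full((2, 2), 128)             # mid-gray: maximal ambiguity
print(fuzzy_entropy(crisp), fuzzy_entropy(gray))  # ~0 and ~1
```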
Measuring, managing and maximizing refinery performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bascur, O.A.; Kennedy, J.P.
1996-01-01
Implementing continuous quality improvement is a confluence of total quality management, people empowerment, performance indicators and information engineering. Supporting information technologies allow a refiner to narrow the gap between management objectives and the process control level. Dynamic performance monitoring benefits come from production cost savings, improved communications and enhanced decision making. A refinery workgroup information flow model helps automate continuous improvement of processes, performance and the organization. The paper discusses the rethinking of refinery operations, dynamic performance monitoring, continuous process improvement, the knowledge coordinator and repository manager, an integrated plant operations workflow, and successful implementation.
Portnoy, David B
2010-07-01
Waiting for medical test results that signal physical harm can be a stressful and potentially psychologically harmful experience. Despite this, interventionists and physicians often use this wait time to deliver behavior change messages and other important information about the test, possible results, and its implications. This study examined how "bracing" for a medical test result impacts cognitive processing, as well as recall of information delivered during this period. Healthy U.S. university students (N = 150) were tested for a deficiency of a fictitious saliva biomarker that was said to be predictive of long-term health problems, using a 2 (Test Result) x 2 (Expected immediacy of result: 10 min, 1 month) factorial design. Participants expecting to get the test result shortly should have been bracing for the result. While waiting for the test results, participants completed measures of cognitive processing. After participants received the test result, recall of information about the biomarker was tested in addition to cognitive measures. One week later, participants who were originally told they did not have the deficiency had their recall assessed again. Results showed that anticipating an imminent test result increased cognitive distraction in the processing of information and lowered recall of information about the test and the biomarker. These results suggest that delivering critical information to patients after administering a test and immediately before giving the results may not be optimal.
An Investigation of the Reliability and Self-Regulatory Correlates of Conflict Adaptation.
Feldman, Julia L; Freitas, Antonio L
2016-07-01
The study of the conflict-adaptation effect, in which encountering information-processing conflict attenuates the disruptive influence of information-processing conflicts encountered subsequently, is a burgeoning area of research. The present study investigated associations among performance measures on a Stroop-trajectory task (measuring Stroop interference and conflict adaptation), on a Wisconsin Card Sorting Task (WCST; measuring cognitive flexibility), and on self-reported measures of self-regulation (including impulsivity and tenacity). We found significant reliability of the conflict-adaptation effects across a two-week period, for response-time and accuracy. Variability in conflict adaptation was not associated significantly with any indicators of performance on the WCST or with most of the self-reported self-regulation measures. There was substantial covariance between Stroop interference for accuracy and conflict adaptation for accuracy. The lack of evidence of covariance across distinct aspects of cognitive control (conflict adaptation, WCST performance, self-reported self-control) may reflect the operation of relatively independent component processes.
Implementing an ROI Measurement Process at Dell Computer.
ERIC Educational Resources Information Center
Tesoro, Ferdinand
1998-01-01
This return-on-investment (ROI) evaluation study determined the business impact of the sales negotiation training course to Dell Computer Corporation. A five-step ROI measurement process was used: Plan-Develop-Analyze-Communicate-Leverage. The corporate sales information database was used to compare pre- and post-training metrics for both training…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lamont, Stephen Philip; Brisson, Marcia; Curry, Michael
2011-02-17
Nuclear forensics assessments to determine material process history require careful comparison of sample data to both measured and modeled nuclear material characteristics. Developing centralized databases, or nuclear forensics libraries, to house this information is an important step to ensure all relevant data will be available for comparison during a nuclear forensics analysis and help expedite the assessment of material history. The approach most widely accepted by the international community at this time is the implementation of National Nuclear Forensics libraries, which would be developed and maintained by individual nations. This is an attractive alternative to an international database since it provides an understanding that each country has data on materials produced and stored within their borders, but eliminates the need to reveal any proprietary or sensitive information to other nations. To support the concept of National Nuclear Forensics libraries, the United States Department of Energy has developed a model library, based on a data dictionary, or set of parameters designed to capture all nuclear forensic relevant information about a nuclear material. Specifically, information includes material identification, collection background and current location, analytical laboratories where measurements were made, material packaging and container descriptions, physical characteristics including mass and dimensions, chemical and isotopic characteristics, particle morphology or metallurgical properties, process history including facilities, and measurement quality assurance information. While not necessarily required, it may also be valuable to store modeled data sets including reactor burn-up or enrichment cascade data for comparison. It is fully expected that only a subset of this information is available or relevant to many materials, and much of the data populating a National Nuclear Forensics library would be process analytical or material accountability measurement data as opposed to a complete forensic analysis of each material in the library.
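A data dictionary of this kind maps naturally onto a record schema in which most fields are optional, since only a subset of parameters is expected for any given material. The sketch below is a hypothetical, heavily abbreviated subset of the categories named in the text, not the DOE model library's actual schema.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ForensicsRecord:
    """Illustrative (hypothetical) subset of a nuclear forensics
    library entry, following the data-dictionary categories in the text."""
    material_id: str
    collection_background: Optional[str] = None
    analytical_lab: Optional[str] = None
    packaging: Optional[str] = None
    mass_g: Optional[float] = None
    isotopics: dict = field(default_factory=dict)  # e.g. {"U-235": 0.93}
    morphology: Optional[str] = None
    process_history: Optional[str] = None
    qa_notes: Optional[str] = None

rec = ForensicsRecord(material_id="HEU-0001", mass_g=12.4,
                      isotopics={"U-235": 0.93, "U-238": 0.07})
print(rec.material_id, sum(rec.isotopics.values()))
```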
A measure for assessing the effects of audiovisual speech integration.
Altieri, Nicholas; Townsend, James T; Wenger, Michael J
2014-06-01
We propose a measure of audiovisual speech integration that takes into account accuracy and response times. This measure should prove beneficial for researchers investigating multisensory speech recognition, since it relates to normal-hearing and aging populations. As an example, age-related sensory decline influences both the rate at which one processes information and the ability to utilize cues from different sensory modalities. Our function assesses integration when both auditory and visual information are available, by comparing performance on these audiovisual trials with theoretical predictions for performance under the assumptions of parallel, independent self-terminating processing of single-modality inputs. We provide example data from an audiovisual identification experiment and discuss applications for measuring audiovisual integration skills across the life span.
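The parallel, independent, self-terminating benchmark mentioned above has a simple closed form for the predicted audiovisual CDF: F_av(t) = F_a(t) + F_v(t) - F_a(t)F_v(t), the independent race bound. The sketch below estimates it from unimodal response times; the RT values are hypothetical, not the authors' data.

```python
import numpy as np

def race_model_prediction(rt_a, rt_v, t):
    """CDF predicted for audiovisual trials under parallel, independent,
    self-terminating (race) processing of each modality:
    P(RT_av <= t) = F_a(t) + F_v(t) - F_a(t) * F_v(t)."""
    fa = np.mean(np.asarray(rt_a) <= t)
    fv = np.mean(np.asarray(rt_v) <= t)
    return fa + fv - fa * fv

# Hypothetical RTs (ms) from unimodal trials; integration is suggested when
# the observed audiovisual CDF exceeds this independent-race bound.
rt_a = [520, 480, 610, 550]
rt_v = [500, 530, 590, 620]
print(race_model_prediction(rt_a, rt_v, 540))  # 0.75
```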
Social attribution processes and comorbid psychiatric symptoms in children with Asperger syndrome
Meyer, Jessica A.; Mundy, Peter C.; Van Hecke, Amy Vaughan; Durocher, Jennifer Stella
2009-01-01
The factors that place children with Asperger syndrome at risk for comorbid psychiatric symptoms, such as anxiety and depression, remain poorly understood. We investigated the possibility that the children’s emotional and behavioral difficulties are associated with social information and attribution processing. Participants were children with either Asperger syndrome (n = 31) or typical development (n = 33). To assess social information and attribution processing, children responded to hypothetical social vignettes. They also completed self-report measures of social difficulties and psychological functioning. Their parents provided information on social competence and clinical presentation. Children with Asperger syndrome showed poor psychosocial adjustment, which was related to their social information and attribution processing patterns. Cognitive and social-cognitive abilities were associated with aspects of social information processing tendencies, but not with emotional and behavioral difficulties. Results suggest that the comorbid symptoms of children with Asperger syndrome may be associated with their social perception, understanding, and experience. PMID:16908481
Silicon surface barrier detectors used for liquid hydrogen density measurement
NASA Technical Reports Server (NTRS)
James, D. T.; Milam, J. K.; Winslett, H. B.
1968-01-01
Multichannel system employing a radioisotope radiation source, strontium-90, radiation detector, and a silicon surface barrier detector, measures the local density of liquid hydrogen at various levels in a storage tank. The instrument contains electronic equipment for collecting the density information, and a data handling system for processing this information.
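Radiation densitometry of this kind typically rests on the Beer-Lambert attenuation law, I = I0·exp(-mu_m·rho·x), solved for the density rho. The coefficients below are illustrative round numbers, not the calibration of the actual Sr-90 instrument.

```python
import math

def density_from_attenuation(i0, i, mu_m, path_cm):
    """Infer local fluid density from beta/gamma attenuation, assuming the
    Beer-Lambert law I = I0 * exp(-mu_m * rho * x) with mass attenuation
    coefficient mu_m (cm^2/g) and path length x (cm)."""
    return math.log(i0 / i) / (mu_m * path_cm)

# Hypothetical counts: 20% attenuation over a 10 cm path, mu_m = 0.3 cm^2/g,
# giving a density in the neighborhood of liquid hydrogen's ~0.07 g/cm^3.
rho = density_from_attenuation(1000.0, 800.0, 0.3, 10.0)
print(round(rho, 4))  # density in g/cm^3
```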
Georeferenced LiDAR 3D vine plantation map generation.
Llorens, Jordi; Gil, Emilio; Llop, Jordi; Queraltó, Meritxell
2011-01-01
The use of electronic devices for canopy characterization has recently been widely discussed. Among such devices, LiDAR sensors appear to be the most accurate and precise. Information obtained with a LiDAR sensor while driving a tractor along a crop row can be managed and transformed into canopy density maps by evaluating the frequency of LiDAR returns. This paper describes a proposed methodology to obtain a georeferenced canopy map by combining the information obtained with LiDAR with that generated using a GPS receiver installed on top of a tractor. Data regarding the velocity of LiDAR measurements and UTM coordinates of each measured point on the canopy were obtained by applying the proposed transformation process. The process allows overlap of the canopy density map generated with the image of the intended measured area using Google Earth®, providing accurate information about the canopy distribution and/or location of damage along the rows. This methodology was applied and tested on different vine varieties and crop stages in two important vine production areas in Spain. The results indicate that the georeferenced information obtained with LiDAR sensors appears to be an interesting tool with the potential to improve crop management processes.
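The core of such a transformation is placing each LiDAR return in UTM coordinates from the tractor's GPS position and heading. The sketch below is a simplified, hypothetical version of that step (sensor scanning in a plane perpendicular to travel, no lever-arm or tilt corrections), not the paper's actual pipeline.

```python
import math

def georeference_scan(easting, northing, heading_deg, ranges, angles_deg):
    """Place LiDAR returns in UTM, assuming a sensor on the tractor scanning
    in a plane perpendicular to the direction of travel. Heading 0 deg is
    taken as due east; angle 0 deg points horizontally to the left of travel."""
    h = math.radians(heading_deg)
    points = []
    for r, a in zip(ranges, angles_deg):
        # Horizontal offset of the return to the side of the track,
        # rotated 90 deg left of the heading.
        side = r * math.cos(math.radians(a))
        points.append((easting + side * math.cos(h + math.pi / 2),
                       northing + side * math.sin(h + math.pi / 2)))
    return points

# Hypothetical scan: tractor at UTM (430100 E, 4590200 N) heading due east.
pts = georeference_scan(430100.0, 4590200.0, 0.0, [2.0, 2.5], [0.0, 10.0])
print([(round(e, 2), round(n, 2)) for e, n in pts])
```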
Value of information and natural resources decision-making
Williams, Byron K.; Johnson, Fred A.
2015-01-01
Though the potential for information to measurably improve management has been highlighted for several decades, in recent years the “value of information” has surfaced with increasing frequency in natural resources. However, the use of this phrase belies the fact that many in natural resources have only a limited understanding of what it actually means, how to measure it, and what to do with it. We introduce and describe several forms of the value of information in the context of the management of renewable natural resources. The value of information is discussed in terms of a potential gain in value with the addition of new information, as well as a loss in value associated with the absence of information. Value metrics are developed for uncertainty about resource status as well as resource processes and responses to management. We provide a common notation for the metrics of value, and discuss linkages of the value of information to strategic approaches such as adaptive resources management and partially observable decision processes.
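The "gain in value with the addition of new information" has a standard computable form: the expected value of perfect information, EVPI = E_s[max_a R(a,s)] - max_a E_s[R(a,s)]. The toy harvest decision below uses made-up payoffs to show the calculation.

```python
import numpy as np

# Toy harvest decision: two actions, two equally likely resource states,
# with hypothetical returns (rows = actions, columns = states).
returns = np.array([[10.0, 2.0],    # action 0: good in state 0, poor in state 1
                    [ 6.0, 6.0]])   # action 1: safe either way
p = np.array([0.5, 0.5])            # belief over states

best_under_uncertainty = max(returns @ p)                 # act on expectations
best_with_perfect_info = (returns.max(axis=0) * p).sum()  # resolve state first
evpi = best_with_perfect_info - best_under_uncertainty
print(evpi)  # 2.0
```

Here resolving the uncertainty before acting is worth 2.0 payoff units, the most a manager should "pay" (in monitoring effort, say) for perfect knowledge of the state.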
System of Programmed Modules for Measuring Photographs with a Gamma-Telescope
NASA Technical Reports Server (NTRS)
Averin, S. A.; Veselova, G. V.; Navasardyan, G. V.
1978-01-01
Physical experiments using tracking cameras produced hundreds of thousands of stereo photographs of events. To process such a large volume of information, automatic and semiautomatic measuring systems are required. At the Institute of Space Research of the Academy of Sciences of the USSR, a system for processing film information from the spark gamma-telescope was developed. The system is based on a BPS-75 projector in line with the minicomputer Elektronika 1001. The report describes this system. The various computer programs available to the operators are discussed.
Automated information and control complex of hydro-gas endogenous mine processes
NASA Astrophysics Data System (ADS)
Davkaev, K. S.; Lyakhovets, M. V.; Gulevich, T. M.; Zolin, K. A.
2017-09-01
An automated information and control complex is considered that is designed to prevent accidents related to the aerological situation in underground workings, to account for individual devices issued and returned, to transmit and display measurement data, and to form preemptive solutions. Examples are given for the automated workstation of an air-gas control operator using individual devices. The statistical characteristics of field data characterizing the aerological situation in the mine are obtained. The conducted studies of statistical characteristics confirm the feasibility of creating a subsystem of controlled gas distribution with an adaptive arrangement of points for gas control. An adaptive (multivariant) algorithm for processing measurement information on continuous multidimensional quantities and influencing factors has been developed.
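One simple way to form "preemptive solutions" from a stream of gas readings is to smooth sensor noise with an exponentially weighted moving average and raise an alarm before the smoothed level crosses a limit. The sketch below is illustrative only; the smoothing coefficient and limit are made-up values, not mine-regulation thresholds or the paper's algorithm.

```python
def ewma_alarm(readings, alpha=0.3, limit=1.0):
    """Smooth a stream of methane readings (% vol) with an exponentially
    weighted moving average and flag samples where the smoothed level
    reaches a preemptive limit."""
    level, alarms = readings[0], []
    for x in readings:
        level = alpha * x + (1 - alpha) * level  # EWMA update
        alarms.append(level >= limit)
    return alarms

print(ewma_alarm([0.4, 0.5, 0.6, 1.4, 1.6, 1.7]))
# [False, False, False, False, True, True]
```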
Kempe, Vera; Bublitz, Dennis; Brooks, Patricia J
2015-05-01
Is the observed link between musical ability and non-native speech-sound processing due to enhanced sensitivity to acoustic features underlying both musical and linguistic processing? To address this question, native English speakers (N = 118) discriminated Norwegian tonal contrasts and Norwegian vowels. Short tones differing in temporal, pitch, and spectral characteristics were used to measure sensitivity to the various acoustic features implicated in musical and speech processing. Musical ability was measured using Gordon's Advanced Measures of Musical Audiation. Results showed that sensitivity to specific acoustic features played a role in non-native speech-sound processing: Controlling for non-verbal intelligence, prior foreign language-learning experience, and sex, sensitivity to pitch and spectral information partially mediated the link between musical ability and discrimination of non-native vowels and lexical tones. The findings suggest that while sensitivity to certain acoustic features partially mediates the relationship between musical ability and non-native speech-sound processing, complex tests of musical ability also tap into other shared mechanisms. © 2014 The British Psychological Society.
On the susceptibility of adaptive memory to false memory illusions.
Howe, Mark L; Derbish, Mary H
2010-05-01
Previous research has shown that survival-related processing of word lists enhances retention for that material. However, the claim that survival-related memories are more accurate has only been examined when true recall and recognition of neutral material has been measured. In the current experiments, we examined the adaptive memory superiority effect for different types of processing and material, measuring accuracy more directly by comparing true and false recollection rates. Survival-related information and processing was examined using word lists containing backward associates of neutral, negative, and survival-related critical lures and type of processing (pleasantness, moving, survival) was varied using an incidental memory paradigm. Across four experiments, results showed that survival-related words were more susceptible than negative and neutral words to the false memory illusion and that processing information in terms of its relevance to survival independently increased this susceptibility to the false memory illusion. Overall, although survival-related processing and survival-related information resulted in poorer, not more accurate, memory, such inaccuracies may have adaptive significance. These findings are discussed in the context of false memory research and recent theories concerning the importance of survival processing and the nature of adaptive memory. Copyright 2009 Elsevier B.V. All rights reserved.
Imperfection and Thickness Measurement of Panels Using a Coordinate Measurement Machine
NASA Technical Reports Server (NTRS)
Thornburgh, Robert P.
2006-01-01
This paper summarizes the methodology used to measure imperfection and thickness variation for flat and curved panels using a Coordinate Measurement Machine (CMM) and the software program MeasPanel. The objective is to provide a reference document so that someone with a basic understanding of CMM operation can measure a panel with minimal training. Detailed information about both the measurement system setup and computer software is provided. Information is also provided about the format of the raw data, as well as how it is post-processed for use in finite-element analysis.
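The post-processing of CMM point clouds for imperfection analysis commonly starts by fitting a best-fit reference plane and taking the residuals as the imperfection field. The sketch below is a minimal stand-in for that step (z-residuals from a least-squares plane, hypothetical grid data), not the MeasPanel software itself.

```python
import numpy as np

def panel_imperfection(points):
    """Fit a best-fit plane z = a*x + b*y + c to measured surface points and
    return each point's z-residual, a simple proxy for panel imperfection."""
    pts = np.asarray(points, dtype=float)
    A = np.c_[pts[:, 0], pts[:, 1], np.ones(len(pts))]
    coef, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return pts[:, 2] - A @ coef

# Hypothetical CMM grid: a flat panel with one 0.2 mm bump at a corner.
pts = [(0, 0, 0.0), (1, 0, 0.0), (0, 1, 0.0), (1, 1, 0.2)]
dev = panel_imperfection(pts)
print(np.round(dev, 3))
```

The residual field (here +/-0.05 mm about the tilted best-fit plane) is what would be carried into a finite-element imperfection model.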
Evidence for online processing during causal learning.
Liu, Pei-Pei; Luhmann, Christian C
2015-03-01
Many models of learning describe both the end product of learning (e.g., causal judgments) and the cognitive mechanisms that unfold on a trial-by-trial basis. However, the methods employed in the literature typically provide only indirect evidence about the unfolding cognitive processes. Here, we utilized a simultaneous secondary task to measure cognitive processing during a straightforward causal-learning task. The results from three experiments demonstrated that covariation information is not subject to uniform cognitive processing. Instead, we observed systematic variation in the processing dedicated to individual pieces of covariation information. In particular, observations that are inconsistent with previously presented covariation information appear to elicit greater cognitive processing than do observations that are consistent with previously presented covariation information. In addition, the degree of cognitive processing appears to be driven by learning per se, rather than by nonlearning processes such as memory and attention. Overall, these findings suggest that monitoring learning processes at a finer level may provide useful psychological insights into the nature of learning.
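The trial-by-trial claim above, that covariation-inconsistent observations elicit extra processing, can be illustrated with a simple delta-rule learner in which the absolute prediction error stands in for processing demand. This is a generic sketch, not the authors' model or task.

```python
def surprise_per_trial(trials, lr=0.2):
    """Track P(effect | cause) with a delta-rule update and return the
    absolute prediction error on each trial as a proxy for the extra
    processing an inconsistent observation elicits."""
    p, errors = 0.5, []
    for outcome in trials:          # 1 = effect present, 0 = effect absent
        err = outcome - p
        errors.append(abs(err))
        p += lr * err               # delta-rule update toward the outcome
    return errors

# Mostly consistent trials, then one inconsistent observation (the final 0):
errs = surprise_per_trial([1, 1, 1, 1, 0])
print([round(e, 3) for e in errs])
```

The error shrinks across the consistent trials and spikes on the inconsistent one, mirroring the graded processing pattern the secondary-task data revealed.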
Exploring the joint measurability using an information-theoretic approach
NASA Astrophysics Data System (ADS)
Hsu, Li-Yi
2016-12-01
We explore the admissible purity parameters for joint measurements. Instead of directly unsharpening the measurements, we perform quantum cloning before the sharp measurements. The necessary fuzziness in the unsharp measurements is equivalently introduced in the imperfect cloning process. Based on information causality and the consequent noisy nonlocal computation, one can derive information-theoretic quadratic inequalities that must be satisfied by any physical theory. On the other hand, to guarantee classicality, the linear Bell-type inequalities deduced from these quadratic ones must be obeyed. As for joint measurability, the purity parameters must be chosen to obey both types of inequalities. Finally, the quadratic inequalities for purity parameters in the joint measurability region are derived.
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon
2010-01-01
This slide presentation reviews the evolution of risk management (RM) at NASA. The aim is to promote an RM approach that is heuristic, proactive, and coherent across all of NASA. Risk-Informed Decision Making (RIDM) is a decision-making process that uses a diverse set of performance measures, along with other considerations, within a deliberative process to inform decision making. RIDM is invoked for key decisions such as architecture and design decisions, make-buy decisions, and budget reallocation. The RIDM process and how it relates to the Continuous Risk Management (CRM) process are reviewed.
NASA Astrophysics Data System (ADS)
Haapasalo, Erkka; Pellonpää, Juha-Pekka
2017-12-01
Various forms of optimality for quantum observables described as normalized positive-operator-valued measures (POVMs) are studied in this paper. We give characterizations for observables that determine the values of the measured quantity with probabilistic certainty or a state of the system before or after the measurement. We investigate observables that are free from noise caused by classical post-processing, mixing, or pre-processing of quantum nature. Especially, a complete characterization of pre-processing and post-processing clean observables is given, and necessary and sufficient conditions are imposed on informationally complete POVMs within the set of pure states. We also discuss joint and sequential measurements of optimal quantum observables.
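Two of the properties discussed, normalization of a POVM and informational completeness, are easy to check numerically. The sketch below uses the standard tetrahedral qubit SIC-POVM as an example observable (a textbook construction, not one from this paper): the four effects must sum to the identity, and they must span the space of 2x2 Hermitian operators.

```python
import numpy as np

# Qubit SIC-POVM from tetrahedral Bloch vectors: E_k = (I + v_k . sigma) / 4.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.diag([1.0, -1.0]).astype(complex)
I2 = np.eye(2, dtype=complex)

vs = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3)
effects = [(I2 + v[0] * sx + v[1] * sy + v[2] * sz) / 4 for v in vs]

# Normalization: the effects must sum to the identity.
print(np.allclose(sum(effects), I2))  # True

# Informational completeness: the four effects span all 2x2 Hermitian
# matrices, so the outcome statistics determine any state.
rank = np.linalg.matrix_rank(np.array([E.flatten() for E in effects]))
print(rank)  # 4
```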
Bapat, Shweta S; Patel, Harshali K; Sansgiry, Sujit S
2017-10-16
In this study, we evaluate the role of information anxiety and information load on the intention to read information from prescription drug information leaflets (PILs). These PILs were developed based on the principles of information load and consumer information processing. This was an experimental prospective repeated measures study conducted in the United States where 360 (62% response rate) university students (>18 years old) participated. Participants were presented with a scenario followed by exposure to the three drug product information sources used to operationalize information load. The three sources were: (i) current practice; (ii) pre-existing one-page text only; and (iii) interventional one-page prototype PILs designed for the study. Information anxiety was measured as anxiety experienced by the individual when encountering information. The outcome variable of intention to read PILs was defined as the likelihood that the patient will read the information provided in the leaflets. A survey questionnaire was used to capture the data and the objectives were analyzed by performing a repeated measures MANOVA using SAS version 9.3. When compared to current practice and one-page text only leaflets, one-page PILs had significantly lower scores on information anxiety (p < 0.001) and information load (p < 0.001). The intention to read was highest and significantly different (p < 0.001) for PILs as compared to current practice or text only leaflets. Information anxiety and information load significantly impacted intention to read (p < 0.001). Newly developed PILs increased patient's intention to read and can help in improving the counseling services provided by pharmacists.
Multimedia in the informed consent process for endoscopic sinus surgery: A randomized control trial.
Siu, Jennifer M; Rotenberg, Brian W; Franklin, Jason H; Sowerby, Leigh J
2016-06-01
To determine patient recall of specific risks associated with endoscopic sinus surgery and whether an adjunct multimedia education module is an effective patient tool in enhancing the standard informed consent process. Prospective, randomized, controlled trial. Fifty consecutive adult patients scheduled for endoscopic sinus surgery at a rhinology clinic of a tertiary care hospital were recruited for this study. Informed consent was studied by comparing the number of risks recalled when patients had a verbal discussion in conjunction with a 6-minute interactive module or the verbal discussion alone. Early recall was measured immediately following the informed consent process, and delayed recall was measured 3 to 4 weeks later; patient preference details were also collected. Early risk recall in the multimedia group was significantly higher than the control group (P = .0036); however, there was no difference between the groups in delayed risk recall. Seventy-six percent of participants expressed interest in viewing the multimedia module if available online between the preoperative and procedural day. Sixty-eight percent of patients preferred having the multimedia module as an adjunct to the informed consent process as opposed to the multimedia consent process alone. There is an early improvement in overall risk recall in patients who complete an interactive multimedia module, with a clear patient preference for this method. Here we emphasize the well-known challenges of patient education and demonstrate the effectiveness of integrating technology into clinical practice in order to enhance the informed consent process. Level of evidence: 1b. Laryngoscope, 126:1273-1278, 2016. © 2015 The American Laryngological, Rhinological and Otological Society, Inc.
Social Information Processing in Children: Specific Relations to Anxiety, Depression, and Affect
ERIC Educational Resources Information Center
Luebbe, Aaron M.; Bell, Debora J.; Allwood, Maureen A.; Swenson, Lance P.; Early, Martha C.
2010-01-01
Two studies examined shared and unique relations of social information processing (SIP) to youth's anxious and depressive symptoms. Whether SIP added unique variance over and above trait affect in predicting internalizing symptoms was also examined. In Study 1, 215 youth (ages 8-13) completed symptom measures of anxiety and depression and a…
ERIC Educational Resources Information Center
Oron, Anna; Szymaszek, Aneta; Szelag, Elzbieta
2015-01-01
Background: Temporal information processing (TIP) underlies many aspects of cognitive functions like language, motor control, learning, memory, attention, etc. Millisecond timing may be assessed by sequencing abilities, e.g. the perception of event order. It may be measured with auditory temporal-order-threshold (TOT), i.e. a minimum time gap…
ERIC Educational Resources Information Center
Lamb, Richard; Cavagnetto, Andy; Akmal, Tariq
2016-01-01
A critical problem with the examination of learning in education is that there is an underlying assumption that the dynamic systems associated with student information processing can be measured using static linear assessments. This static linear approach does not provide sufficient ability to characterize learning. Much of the modern research…
Pathways From Toddler Information Processing to Adolescent Lexical Proficiency.
Rose, Susan A; Feldman, Judith F; Jankowski, Jeffery J
2015-01-01
This study examined the relation of 3-year core information-processing abilities to lexical growth and development. The core abilities covered four domains: memory, representational competence (cross-modal transfer), processing speed, and attention. Lexical proficiency was assessed at 3 and 13 years with the Peabody Picture Vocabulary Test (PPVT) and verbal fluency. The sample (N = 128) consisted of 43 preterms (<1750 g) and 85 full-terms. Structural equation modeling indicated concurrent relations of toddler information processing and language proficiency and, independent of stability in language, direct predictive links between (a) 3-year cross-modal ability and 13-year PPVT and (b) 3-year processing speed and both 13-year measures, PPVT and verbal fluency. Thus, toddler information processing was related to growth in lexical proficiency from 3 to 13 years. © 2015 The Authors. Child Development © 2015 Society for Research in Child Development, Inc.
NASA Astrophysics Data System (ADS)
Maciel, Thiago O.; Vianna, Reinaldo O.; Sarthour, Roberto S.; Oliveira, Ivan S.
2015-11-01
We reconstruct the time-dependent quantum map corresponding to the relaxation process of a two-spin system in liquid-state NMR at room temperature. By means of quantum tomography techniques that handle informationally incomplete data, we show how to properly post-process and normalize the measurement data for the simulation of quantum information processing, overcoming the unknown number of molecules prepared in a non-equilibrium magnetization state (Nj) by an initial sequence of radiofrequency pulses. From the reconstructed quantum map, we infer both the longitudinal (T1) and transversal (T2) relaxation times, and introduce the J-coupling relaxation times (T1J, T2J), which are relevant for quantum information processing simulations. We show that the map associated with the relaxation process cannot be assumed to be approximately unital and trace-preserving for times greater than T2J.
A Tutorial on Probabilistic Risk Assessment and its Role in Risk-Informed Decision Making
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon
2010-01-01
This slide presentation reviews risk assessment and its role in risk-informed decision making. It includes information on probabilistic risk assessment, the typical risk management process, the origins of the risk matrix, performance measures, performance objectives, and Bayes' theorem.
NASA Astrophysics Data System (ADS)
Bowling, Shannon Raye
The aircraft maintenance industry is a complex system consisting of human and machine components; because of this, much emphasis has been placed on improving aircraft-inspection performance. One proven technique for improving inspection performance is the use of training. There are several strategies that have been implemented for training, one of which is feedforward information. The use of prior information (feedforward) is known to positively affect inspection performance. This information can consist of knowledge about defect characteristics (types, severity/criticality, and location) and the probability of occurrence. Although several studies have been conducted that demonstrate the usefulness of feedforward as a training strategy, there are certain research issues that need to be addressed. This study evaluates the effect of feedforward information in a simulated 3-dimensional environment by the use of virtual reality. A controlled study was conducted to evaluate the effectiveness of feedforward information in a simulated aircraft inspection environment. The study was conducted in two phases. The first phase evaluated the difference between general and detailed inspection at different pacing levels. The second phase evaluated the effect of feedforward information pertaining to severity, probability, and location. Analyses of the results showed that subjects performed significantly better during detailed inspection than during general inspection. Pacing also had the effect of reducing performance for both general and detailed inspection. The study also found that as the level of feedforward information increases, performance also increases. In addition to evaluating performance measures, the study also evaluated process and subjective measures. It was found that process measures such as number of fixation points, fixation groups, mean fixation duration, and percent area covered were all affected by the treatment levels.
Analyses of the subjective measures also found a correlation between the perceived usefulness of feedforward information and the actual effect on performance. The study also examined the potential of virtual reality as a training tool and analyzed the effect different calculational algorithms have on determining various process measures.
Differentiating between appraisal process and product in cognitive theories of posttraumatic stress
Nanney, John T.; Constans, Joseph I.; Kimbrell, Timothy A.; Kramer, Teresa L.; Pyne, Jeffrey M.
2014-01-01
Biased appraisal is central to cognitive theories of posttraumatic stress, but little research has examined the potentially distinct meanings of the term. The on-going process of appraising social information and the beliefs that emerge as products of that process can be distinguished conceptually. The present study sought to examine if these two meanings are empirically distinct as well, and if so, to begin exploring potential relations between these appraisal constructs and posttraumatic stress symptoms. Soldiers (N = 424) preparing for deployment to Iraq or Afghanistan were administered measures of each construct. Results of confirmatory factor analysis suggest that the appraisal process and the products of that process (i.e., beliefs) are indeed distinct. Structural equation models are consistent with cognitive bias and social information processing literatures which posit that a biased appraisal process may contribute to the development of dysfunctional beliefs and posttraumatic stress symptoms following trauma. The potential utility of distinctly conceptualizing and measuring the appraisal process in both clinical and research settings is discussed. PMID:26147520
Minimized state complexity of quantum-encoded cryptic processes
NASA Astrophysics Data System (ADS)
Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.
2016-05-01
The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.
Differentiating between appraisal process and product in cognitive theories of posttraumatic stress.
Nanney, John T; Constans, Joseph I; Kimbrell, Timothy A; Kramer, Teresa L; Pyne, Jeffrey M
2015-07-01
Biased appraisal is central to cognitive theories of posttraumatic stress, but little research has examined the potentially distinct meanings of the term. The ongoing process of appraising social information and the beliefs that emerge as products of that process can be distinguished conceptually. This study sought to examine whether these 2 meanings are empirically distinct as well, and if so, to begin exploring potential relations between these appraisal constructs and posttraumatic stress symptoms. Soldiers (N = 424) preparing for deployment to Iraq or Afghanistan were administered measures of each construct. Results of confirmatory factor analysis suggest that the appraisal process and the products of that process (i.e., beliefs) are indeed distinct. Structural equation models are consistent with cognitive bias and social information processing literatures, which posit that a biased appraisal process may contribute to the development of dysfunctional beliefs and posttraumatic stress symptoms following trauma. The potential utility of distinctly conceptualizing and measuring the appraisal process in both clinical and research settings is discussed. (c) 2015 APA, all rights reserved.
Processing Depth, Elaboration of Encoding, Memory Stores, and Expended Processing Capacity.
ERIC Educational Resources Information Center
Eysenck, Michael W.; Eysenck, M. Christine
1979-01-01
The effects of several factors on expended processing capacity were measured. Expended processing capacity was greater when information was retrieved from secondary memory than from primary memory, when processing was of a deep, semantic nature than when it was shallow and physical, and when processing was more elaborate. (Author/GDC)
Incorporating Learning Analytics in the Classroom
ERIC Educational Resources Information Center
Thille, Candace; Zimmaro, Dawn
2017-01-01
This chapter describes an open learning analytics system focused on learning process measures and designed to engage instructors and students in an evidence-informed decision-making process to improve learning.
Riley, William; Begun, James W; Meredith, Les; Miller, Kristi K; Connolly, Kathy; Price, Rebecca; Muri, Janet H; McCullough, Mac; Davis, Stanley
2016-12-01
To improve safety practices and reduce adverse events in perinatal units of acute care hospitals. Primary data collected from perinatal units of 14 hospitals participating in the intervention between 2008 and 2012. Baseline secondary data collected from the same hospitals between 2006 and 2007. A prospective study involving 342,754 deliveries was conducted using a quality improvement collaborative that supported three primary interventions. Primary measures include adoption of three standardized care processes and four measures of outcomes. Chart audits were conducted to measure the implementation of standardized care processes. Outcome measures were collected and validated by the National Perinatal Information Center. The hospital perinatal units increased use of all three care processes, raising consolidated overall use from 38 to 81 percent between 2008 and 2012. The harms measured by the Adverse Outcome Index decreased 14 percent, and a run chart analysis revealed two special causes associated with the interventions. This study demonstrates the ability of hospital perinatal staff to implement efforts to reduce perinatal harm using a quality improvement collaborative. Findings help inform the relationship between the use of standardized care processes, teamwork training, and improved perinatal outcomes, and suggest that a multiplicity of integrated strategies, rather than a single intervention, may be essential to achieve high reliability. © Health Research and Educational Trust.
Modern Techniques in Acoustical Signal and Image Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Candy, J V
2002-04-04
Acoustical signal processing problems can lead to some complex and intricate techniques to extract the desired information from noisy, sometimes inadequate, measurements. The challenge is to formulate a meaningful strategy that is aimed at performing the processing required even in the face of uncertainties. This strategy can be as simple as a transformation of the measured data to another domain for analysis or as complex as embedding a full-scale propagation model into the processor. The aims of both approaches are the same--to extract the desired information and reject the extraneous, that is, develop a signal processing scheme to achieve this goal. In this paper, we briefly discuss this underlying philosophy from a ''bottom-up'' approach, enabling the problem to dictate the solution rather than vice versa.
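The "transformation of the measured data to another domain" strategy mentioned in this abstract can be sketched in a few lines. The following is a minimal illustration (not taken from the paper): a noisy signal is moved into the frequency domain, out-of-band content is zeroed, and the result is transformed back.

```python
import numpy as np

def denoise_by_fft(signal, fs, cutoff_hz):
    """Zero out spectral content above cutoff_hz, then transform back.

    A minimal example of processing measured data in another domain:
    low-pass filtering via the discrete Fourier transform.
    """
    spec = np.fft.rfft(signal)                        # to frequency domain
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)  # bin frequencies [Hz]
    spec[freqs > cutoff_hz] = 0.0                     # reject the extraneous
    return np.fft.irfft(spec, n=signal.size)          # back to time domain
```

For example, a 5 Hz tone buried under 80 Hz interference, sampled at 200 Hz, is recovered almost exactly with `cutoff_hz=20`.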
Influence of measurement error on Maxwell's demon
NASA Astrophysics Data System (ADS)
Sørdal, Vegard; Bergli, Joakim; Galperin, Y. M.
2017-06-01
In any general cycle of measurement, feedback, and erasure, the measurement will reduce the entropy of the system when information about the state is obtained, while erasure, according to Landauer's principle, is accompanied by a corresponding increase in entropy due to the compression of logical and physical phase space. The total process can in principle be fully reversible. A measurement error reduces the information obtained and the entropy decrease in the system. The erasure still gives the same increase in entropy, and the total process is irreversible. Another consequence of measurement error is that bad feedback is applied, which further increases the entropy production if the proper protocol adapted to the expected error rate is not applied. We consider the effect of measurement error on a realistic single-electron box Szilard engine, and we find the optimal protocol for the cycle as a function of the desired power P and error rate ε.
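The entropy bookkeeping described in this abstract can be made concrete with the standard expressions (notation ours, not necessarily the paper's). A symmetric binary measurement with error rate ε yields only the mutual information rather than a full bit, while Landauer erasure of the memory still costs a full bit:

```latex
% Information gained by a binary measurement with error rate \epsilon (in nats):
I(\epsilon) = \ln 2 + \epsilon \ln \epsilon + (1-\epsilon)\ln(1-\epsilon)
% so the work extractable from feedback per cycle is bounded by
W_{\mathrm{ext}} \le k T\, I(\epsilon)
% while erasing the one-bit memory still costs at least
W_{\mathrm{erase}} \ge k T \ln 2
% leaving a net dissipation of at least kT\,[\ln 2 - I(\epsilon)] > 0 whenever \epsilon > 0.
```

This is why, as the abstract notes, any nonzero error rate makes the total cycle irreversible even under an optimal protocol.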
ERIC Educational Resources Information Center
Alexander, Rodney T.
2017-01-01
Organizational computing devices are increasingly becoming targets of cyber-attacks, and organizations have become dependent on the safety and security of their computer networks and their organizational computing devices. Business and government often use defense in-depth information assurance measures such as firewalls, intrusion detection…
Individual differences in working memory capacity and workload capacity.
Yu, Ju-Chi; Chang, Ting-Yun; Yang, Cheng-Ta
2014-01-01
We investigated the relationship between working memory capacity (WMC) and workload capacity (WLC). Each participant performed an operation span (OSPAN) task to measure his/her WMC and three redundant-target detection tasks to measure his/her WLC. WLC was computed non-parametrically (Experiments 1 and 2) and parametrically (Experiment 2). Both levels of analyses showed that participants high in WMC had larger WLC than those low in WMC only when redundant information came from visual and auditory modalities, suggesting that high-WMC participants had superior processing capacity in dealing with redundant visual and auditory information. This difference was eliminated when multiple processes required processing for only a single working memory subsystem in a color-shape detection task and a double-dot detection task. These results highlighted the role of executive control in integrating and binding information from the two working memory subsystems for perceptual decision making.
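The non-parametric workload capacity analysis mentioned here is commonly done with Townsend and Nozawa's capacity coefficient. The abstract does not give the formula, so the sketch below assumes that standard definition: C(t) = H_AB(t) / [H_A(t) + H_B(t)], where H(t) = -log S(t) is the cumulative hazard estimated from the reaction-time survivor function.

```python
import numpy as np

def cumulative_hazard(rts, t_grid):
    """Estimate H(t) = -log S(t) from a sample of reaction times."""
    rts = np.asarray(rts, dtype=float)
    # Empirical survivor function S(t) = P(RT > t)
    s = np.array([(rts > t).mean() for t in t_grid])
    s = np.clip(s, 1e-9, 1.0)  # avoid log(0)
    return -np.log(s)

def capacity_coefficient(rt_redundant, rt_a, rt_b, t_grid):
    """Townsend & Nozawa's C(t): >1 super-, =1 unlimited-, <1 limited-capacity."""
    h_ab = cumulative_hazard(rt_redundant, t_grid)
    h_a = cumulative_hazard(rt_a, t_grid)
    h_b = cumulative_hazard(rt_b, t_grid)
    return h_ab / (h_a + h_b)
```

As a sanity check, an unlimited-capacity independent race (redundant RT = minimum of two independent channels) yields C(t) close to 1.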
Abnormal visual scan paths: a psychophysiological marker of delusions in schizophrenia.
Phillips, M L; David, A S
1998-02-09
The role of the visual scan path as a psychophysiological marker of visual attention has been highlighted previously (Phillips and David, 1994). We investigated information processing in schizophrenic patients with severe delusions and again when the delusions were subsiding using visual scan path measurements. We aimed to demonstrate a specific deficit in processing human faces in deluded subjects by relating this to abnormal viewing strategies. Scan paths were measured in six deluded and five non-deluded schizophrenics (matched for medication and negative symptoms), and nine age-matched normal controls. Deluded subjects had abnormal scan paths in a recognition task, fixating non-feature areas significantly more than controls, but were equally accurate. Re-testing after improvement in delusional conviction revealed fewer group differences. The results suggest state-dependent abnormal information processing in schizophrenics when deluded, with reliance on less-salient visual information for decision-making.
Non-Markovian quantum feedback networks II: Controlled flows
NASA Astrophysics Data System (ADS)
Gough, John E.
2017-06-01
The concept of a controlled flow of a dynamical system, especially when the controlling process feeds information back about the system, is of central importance in control engineering. In this paper, we build on the ideas presented by Bouten and van Handel [Quantum Stochastics and Information: Statistics, Filtering and Control (World Scientific, 2008)] and develop a general theory of quantum feedback. We elucidate the relationship between the controlling processes, Z, and the measured processes, Y, and to this end we make a distinction between what we call the input picture and the output picture. We should note that the input-output relations for the noise fields have additional terms not present in the standard theory but that the relationship between the control processes and measured processes themselves is internally consistent—we do this for the two main cases of quadrature measurement and photon-counting measurement. The theory is general enough to include a modulating filter which post-processes the measurement readout Y before returning to the system. This opens up the prospect of applying very general engineering feedback control techniques to open quantum systems in a systematic manner, and we consider a number of specific modulating filter problems. Finally, we give a brief argument as to why most of the rules for making instantaneous feedback connections [J. Gough and M. R. James, Commun. Math. Phys. 287, 1109 (2009)] ought to apply for controlled dynamical networks as well.
Camouflage target detection via hyperspectral imaging plus information divergence measurement
NASA Astrophysics Data System (ADS)
Chen, Yuheng; Chen, Xinhua; Zhou, Jiankang; Ji, Yiqun; Shen, Weimin
2016-01-01
Target detection is one of the most important applications in remote sensing. Nowadays, accurate camouflage target discrimination often relies on spectral imaging, owing to its high-resolution spectral/spatial information acquisition ability as well as the wealth of available data processing methods. In this paper, hyperspectral imaging together with the spectral information divergence measure is used to solve the camouflage target detection problem. A self-developed visible-band hyperspectral imaging device is used to collect data cubes of an experimental scene, after which spectral information divergences are computed to discriminate camouflaged targets and anomalies. Full-band information divergences are measured to evaluate target detection performance visually and quantitatively. Information divergence measurement proves to be a low-cost and effective tool for the target detection task and can be further extended to target detection applications beyond spectral imaging.
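The abstract does not spell out the divergence formula; the sketch below assumes the standard spectral information divergence (a symmetric KL-style measure over band-normalized spectra), which is the usual formulation in hyperspectral target detection.

```python
import numpy as np

def spectral_information_divergence(x, y, eps=1e-12):
    """Symmetric KL-style divergence between two spectra.

    Each spectrum is normalized to a probability vector over bands, so
    identical spectral *shapes* (even at different brightness) give 0,
    while dissimilar shapes give larger values.
    """
    p = np.asarray(x, dtype=float) + eps
    q = np.asarray(y, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))
```

In a detection setting, pixels whose divergence from a reference target spectrum falls below a threshold are flagged as candidate targets, regardless of illumination level.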
Processing distinct linguistic information types in working memory in aphasia.
Wright, Heather Harris; Downey, Ryan A; Gravier, Michelle; Love, Tracy; Shapiro, Lewis P
2007-06-01
BACKGROUND: Recent investigations have suggested that adults with aphasia present with a working memory deficit that may contribute to their language-processing difficulties. Working memory capacity has been conceptualised as a single "resource" pool for attentional, linguistic, and other executive processing; alternatively, it has been suggested that there may be separate working memory abilities for different types of linguistic information. A challenge in this line of research is developing an appropriate measure of working memory ability in adults with aphasia. One candidate measure of working memory ability that may be appropriate for this population is the n-back task. By manipulating stimulus type, the n-back task may be appropriate for tapping linguistic-specific working memory abilities. AIMS: The purposes of this study were (a) to measure working memory ability in adults with aphasia for processing specific types of linguistic information, and (b) to examine whether a relationship exists between participants' performance on working memory and auditory comprehension measures. METHOD & PROCEDURES: Nine adults with aphasia participated in the study. Participants completed three n-back tasks, each tapping different types of linguistic information. They included the PhonoBack (phonological level), SemBack (semantic level), and SynBack (syntactic level). For all tasks, two n-back levels were administered: a 1-back and a 2-back. Each level contained 20 target items; accuracy was recorded by stimulus presentation software. The Subject-relative, Object-relative, Active, Passive Test of Syntactic Complexity (SOAP) was the syntactic sentence comprehension task administered to all participants. OUTCOMES & RESULTS: Participants' performance declined as n-back task difficulty increased. Overall, participants performed better on the SemBack than PhonoBack and SynBack tasks, but the differences were not statistically significant.
Finally, participants who performed poorly on the SynBack also had more difficulty comprehending syntactically complex sentence structures (i.e., passive and object-relative sentences). CONCLUSIONS: Results indicate that working memory ability for different types of linguistic information can be measured in adults with aphasia. Further, our results add to the growing literature that favours separate working memory abilities for different types of linguistic information.
RepExplore: addressing technical replicate variance in proteomics and metabolomics data analysis.
Glaab, Enrico; Schneider, Reinhard
2015-07-01
High-throughput omics datasets often contain technical replicates included to account for technical sources of noise in the measurement process. Although summarizing these replicate measurements by using robust averages may help to reduce the influence of noise on downstream data analysis, the information on the variance across the replicate measurements is lost in the averaging process and is therefore typically disregarded in subsequent statistical analyses. We introduce RepExplore, a web service dedicated to exploiting the information captured in the technical replicate variance to provide more reliable and informative differential expression and abundance statistics for omics datasets. The software builds on previously published statistical methods, which have been applied successfully to biomedical omics data but are difficult to use without prior experience in programming or scripting. RepExplore facilitates the analysis by providing fully automated data processing and interactive ranking tables, whisker plot, heat map and principal component analysis visualizations to interpret omics data and derived statistics. Availability: freely available at http://www.repexplore.tk. Contact: enrico.glaab@uni.lu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
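RepExplore's statistics come from the published methods it builds on; as a generic illustration of the underlying idea only (not the tool's actual algorithm), one can weight each sample by the precision of its technical replicates instead of discarding that variance through plain averaging:

```python
import numpy as np

def precision_weighted_diff(group1, group2):
    """Compare two conditions, each measured with technical replicates.

    group1/group2: arrays of shape (n_samples, n_replicates) for one feature.
    Each sample's mean is weighted by the inverse of its technical-replicate
    variance, so noisy measurements contribute less to the group estimate.
    Returns (difference of weighted means, z-like statistic).
    """
    def weighted_mean(g):
        g = np.asarray(g, dtype=float)
        means = g.mean(axis=1)
        w = 1.0 / (g.var(axis=1, ddof=1) + 1e-12)  # precision weights
        mu = np.sum(w * means) / np.sum(w)
        var = 1.0 / np.sum(w)                      # variance of weighted mean
        return mu, var

    m1, v1 = weighted_mean(group1)
    m2, v2 = weighted_mean(group2)
    diff = m2 - m1
    return diff, diff / np.sqrt(v1 + v2)
```

This minimal sketch ignores biological between-sample variance; its point is only that replicate spread carries usable information that a robust average throws away.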
Wills, Peter R
2016-03-13
This article reviews contributions to this theme issue covering the topic 'DNA as information' in relation to the structure of DNA, the measure of its information content, the role and meaning of information in biology and the origin of genetic coding as a transition from uninformed to meaningful computational processes in physical systems. © 2016 The Author(s).
The Information Sector: Definition and Measurement.
ERIC Educational Resources Information Center
Porat, Marc U.
In the last 20 years the U.S. economy has changed as a result of the increase in production, processing, and distribution of information goods and services. Three information sectors--the primary sector producing information goods and services, the private bureaucracy, and the public bureaucracy--are part of a six-sector economy. Today,…
ERIC Educational Resources Information Center
Gerjets, Peter; Kammerer, Yvonne; Werner, Benita
2011-01-01
Web searching for complex information requires appropriately evaluating diverse sources of information. Information science studies have identified different criteria applied by searchers to evaluate Web information. However, the explicit evaluation instructions used in these studies might have resulted in a distortion of spontaneous evaluation…
Assessing the Process of Retirement: a Cross-Cultural Review of Available Measures.
Rafalski, Julia C; Noone, Jack H; O'Loughlin, Kate; de Andrade, Alexsandro L
2017-06-01
Retirement research is now expanding beyond the post-World War II baby boomers' retirement attitudes and plans to include the nature of their workforce exit and how successfully they adjust to their new life. These elements are collectively known as the process of retirement. However, there is insufficient research in developing countries to inform the management of their ageing populations regarding this process. This review aims to facilitate national and cross-cultural research in developing and non-English-speaking countries by reviewing the existing measures of the retirement process published in English and Portuguese. The review identified 28 existing measures assessing retirement attitudes, planning, decision making, adjustment and satisfaction with retirement. Information on each scale's item structure, internal reliability, grammatical structure and evidence of translations to other languages is presented. Of the 28 measures, 20 assessed retirement attitudes, plans and decision-making, 5 assessed adjustment to retirement and only two assessed retirement satisfaction. Only eight of the 28 scales had been translated into languages other than English. There is scope to translate measures of retirement attitudes and planning into other languages. However, there is a paucity of translated measures of retirement decision-making and adjustment, and of measures of retirement satisfaction in general. Within the limitations of this review, researchers are provided with the background to decide between translating existing measures or developing more culturally appropriate assessment tools for addressing their research questions.
Constraining ecosystem processes from tower fluxes and atmospheric profiles.
Hill, T C; Williams, M; Woodward, F I; Moncrieff, J B
2011-07-01
The planetary boundary layer (PBL) provides an important link between the scales and processes resolved by global atmospheric sampling/modeling and site-based flux measurements. The PBL is in direct contact with the land surface, both driving and responding to ecosystem processes. Measurements within the PBL (e.g., by radiosondes, aircraft profiles, and flask measurements) have a footprint, and thus an integrating scale, on the order of 1-100 km. We use the coupled atmosphere-biosphere model (CAB) and a Bayesian data assimilation framework to investigate the amount of biosphere process information that can be inferred from PBL measurements. We investigate the information content of PBL measurements in a two-stage study. First, we demonstrate consistency between the coupled model (CAB) and measurements, by comparing the model to eddy covariance flux tower measurements (i.e., water and carbon fluxes) and also PBL scalar profile measurements (i.e., water, carbon dioxide, and temperature) from Canadian boreal forest. Second, we use the CAB model in a set of Bayesian inversions experiments using synthetic data for a single day. In the synthetic experiment, leaf area and respiration were relatively well constrained, whereas surface albedo and plant hydraulic conductance were only moderately constrained. Finally, the abilities of the PBL profiles and the eddy covariance data to constrain the parameters were largely similar and only slightly lower than the combination of both observations.
Remotely Sensed Information and Field Data are both Essential to Assess Biodiversity CONDITION!
NASA Astrophysics Data System (ADS)
Sparrow, B.; Schaefer, M.; Scarth, P.; Phinn, S. R.; Christensen, R.; Lowe, A. J.; O'Neill, S.; Thurgate, N.; Wundke, D.
2015-12-01
Over the past year the TERN Ausplots facility has hosted a process to define biodiversity condition in an Australian continental context, and has conducted a broad collaborative process to determine which environmental attributes must be measured to accurately inform on biodiversity condition. A major output from this work was the acknowledgement that good-quality remotely sensed data and good-quality field-collected data are both essential to provide the best possible information on biodiversity condition. This poster gives some background to the project, the assessment of which attributes to measure, and whether they are sourced primarily from field-based or remotely sensed measures. It then provides three examples of ways in which combining the two data types yields a superior output, one example for each of the three cornerstone areas of condition: structure, function and composition.
Position measurement of the direct drive motor of Large Aperture Telescope
NASA Astrophysics Data System (ADS)
Li, Ying; Wang, Daxing
2010-07-01
As space and astronomy science develop, the production of large and very large aperture telescopes will become the trend. Direct drive technology, with a unified electromagnetic and mechanical design, is one method of achieving the precise drive such telescopes require. The direct-drive precision rotary table with a diameter of 2.5 meters that we researched and produced is a typical mechatronic design. This paper mainly introduces the position measurement control system of the direct drive motor. In the design of this motor, the position measurement control system requires high resolution: it must precisely locate and measure the position of the rotor shaft while converting the position information into the pole-position information required by the motor's pole number. The design combines a high-precision metal-band encoder with an absolute encoder, processes the encoder outputs on a 32-bit RISC CPU in software, and thereby obtains a high-resolution composite encoder. Laboratory test results given at the end indicate that the position measurement is applicable to large aperture telescope control systems. This project is subsidized by the Chinese National Natural Science Funds (10833004).
Behavioral System Feedback Measurement Failure: Sweeping Quality under the Rug
ERIC Educational Resources Information Center
Mihalic, Maria T.; Ludwig, Timothy D.
2009-01-01
Behavioral Systems rely on valid measurement systems to manage processes and feedback and to deliver contingencies. An examination of measurement system components designed to track customer service quality of furniture delivery drivers revealed the measurement system failed to capture information it was designed to measure. A reason for this…
The minimal work cost of information processing
NASA Astrophysics Data System (ADS)
Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato
2015-07-01
Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditioned on the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.
Concrete Crack Identification Using a UAV Incorporating Hybrid Image Processing.
Kim, Hyunjun; Lee, Junhwa; Ahn, Eunjong; Cho, Soojin; Shin, Myoungsu; Sim, Sung-Han
2017-09-07
Crack assessment is an essential process in the maintenance of concrete structures. In general, concrete cracks are inspected by manual visual observation of the surface, which is intrinsically subjective as it depends on the experience of inspectors. Further, it is time-consuming, expensive, and often unsafe when inaccessible structural members are to be assessed. Unmanned aerial vehicle (UAV) technologies combined with digital image processing have recently been applied to crack assessment to overcome the drawbacks of manual visual inspection. However, identification of crack information in terms of width and length has not been fully explored in the UAV-based applications, because of the absence of distance measurement and tailored image processing. This paper presents a crack identification strategy that combines hybrid image processing with UAV technology. Equipped with a camera, an ultrasonic displacement sensor, and a WiFi module, the system provides the image of cracks and the associated working distance from a target structure on demand. The obtained information is subsequently processed by hybrid image binarization to estimate the crack width accurately while minimizing the loss of the crack length information. The proposed system has been shown to successfully measure cracks thicker than 0.1 mm with a maximum length estimation error of 7.3%.
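The "hybrid image binarization" is this paper's own contribution; as a baseline building block, a standard Otsu global threshold (an assumption here, not the authors' exact method) can be sketched as follows:

```python
import numpy as np

def otsu_threshold(gray):
    """Global Otsu threshold for an 8-bit grayscale image.

    Picks the gray level that maximizes between-class variance, a
    common first step before counting crack pixels in a binary mask.
    """
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                    # probability of class 0 (<= k)
    mu = np.cumsum(p * np.arange(256))      # cumulative mean
    mu_t = mu[-1]                           # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b[~np.isfinite(sigma_b)] = 0.0    # empty classes contribute nothing
    return int(np.argmax(sigma_b))

def binarize(gray):
    """Boolean mask of 'dark' (crack-like) pixels, using the Otsu level."""
    return gray <= otsu_threshold(gray)
```

In practice, crack width in pixels is then read from the mask and converted to millimeters using the UAV's measured working distance.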
Methodology in the measurement of complex human performance: two-dimensional compensatory tracking.
DOT National Transportation Integrated Search
1972-05-01
Nineteen subjects were tested on two successive days on a complex performance device designed to measure functions of relevance to aircrew performance; included were measures of monitoring, information processing, pattern discrimination, and group pr...
A Robust and Resilient Network Design Paradigm for Region-Based Faults Inflicted by WMD Attack
2016-04-01
We investigated big data processing of PMU measurements for grid monitoring and control against possible WMD attacks. Big data processing and analytics of synchrophasor measurements, collected from multiple locations of power grids, ...
Wave data processing toolbox manual
Sullivan, Charlene M.; Warner, John C.; Martini, Marinna A.; Lightsom, Frances S.; Voulgaris, George; Work, Paul
2006-01-01
Researchers routinely deploy oceanographic equipment in estuaries, coastal nearshore environments, and shelf settings. These deployments usually include tripod-mounted instruments to measure a suite of physical parameters such as currents, waves, and pressure. Instruments such as the RD Instruments Acoustic Doppler Current Profiler (ADCP(tm)), the Sontek Argonaut, and the Nortek Aquadopp(tm) Profiler (AP) can measure these parameters. The data from these instruments must be processed using proprietary software unique to each instrument to convert measurements to real physical values. These processed files are then available for dissemination and scientific evaluation. For example, the proprietary processing program used to process data from the RD Instruments ADCP for wave information is called WavesMon. Depending on the length of the deployment, WavesMon will typically produce thousands of processed data files. These files are difficult to archive, and further analysis of the data becomes cumbersome. More important, these files alone do not include sufficient information pertinent to that deployment (metadata), which could hinder future scientific interpretation. This open-file report describes a toolbox developed to compile, archive, and disseminate the processed wave measurement data from an RD Instruments ADCP, a Sontek Argonaut, or a Nortek AP. This toolbox will be referred to as the Wave Data Processing Toolbox. The Wave Data Processing Toolbox consolidates the processed files output from the proprietary software into two NetCDF files: one file contains the statistics of the burst data and the other file contains the raw burst data (additional details described below). One important advantage of this toolbox is that it converts the data into NetCDF format. Data in NetCDF format is easy to disseminate, is portable to any computer platform, and is viewable with freely available public-domain software.
Another important advantage is that a metadata structure is embedded with the data to document pertinent information regarding the deployment and the parameters used to process the data. Using this format ensures that the relevant information about how the data was collected and converted to physical units is maintained with the actual data. EPIC-standard variable names have been utilized where appropriate. These standards, developed by the NOAA Pacific Marine Environmental Laboratory (PMEL) (http://www.pmel.noaa.gov/epic/), provide a universal vernacular allowing researchers to share data without translation.
Application of online measures to monitor and evaluate multiplatform fusion performance
NASA Astrophysics Data System (ADS)
Stubberud, Stephen C.; Kowalski, Charlene; Klamer, Dale M.
1999-07-01
A primary concern of multiplatform data fusion is assessing the quality and utility of data shared among platforms. Constraints such as platform and sensor capability and task load necessitate an on-line system that computes a metric to determine which other platform can provide the best data for processing. To determine data quality, we are implementing an approach based on entropy coupled with intelligent agents. Entropy measures the quality of processed information such as localization, classification, and ambiguity in measurement-to-track association. Lower entropy scores imply less uncertainty about a particular target. When new information is provided, we compute the level of improvement a particular track obtains from one measurement to another. This measure permits us to evaluate the utility of the new information. We couple entropy with intelligent agents that provide two main data-gathering functions: estimation of another platform's performance and evaluation of the new measurement data's quality. Both functions result from the entropy metric. The intelligent agent on a platform makes an estimate of another platform's measurement and provides it to its own fusion system, which can then incorporate it for a particular target. A resulting entropy measure is then calculated and returned to its own agent. From this metric, the agent determines a perceived value of the offboard platform's measurement. If the value is satisfactory, the agent requests the measurement from the other platform, usually by interacting with the other platform's agent. Once the actual measurement is received, entropy is computed again, and the agent assesses and refines its estimation process accordingly.
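The entropy-based utility measure can be sketched as follows. This is a minimal illustration of the idea (lower entropy means less classification uncertainty, so entropy reduction scores a measurement's value); the probability vectors are hypothetical, not taken from the system described.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a classification probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical belief over three target classes for one track,
# before and after fusing an offboard platform's measurement.
before = [0.4, 0.3, 0.3]   # fairly uncertain
after = [0.8, 0.1, 0.1]    # the new measurement sharpens the belief

# Positive improvement => the offboard data reduced uncertainty,
# so requesting the actual measurement is worthwhile.
improvement = entropy(before) - entropy(after)
```

In the multi-agent setting described above, the agent would compute `entropy(after)` on a *predicted* fused track first, and only request the real measurement when the predicted improvement is satisfactory.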
Thermal sensors to control polymer forming. Challenge and solutions
NASA Astrophysics Data System (ADS)
Lemeunier, F.; Boyard, N.; Sarda, A.; Plot, C.; Lefèvre, N.; Petit, I.; Colomines, G.; Allanic, N.; Bailleul, J. L.
2017-10-01
Thermal sensors have been used for many years to better understand and control material forming processes, especially polymer processing. Due to technical constraints (high pressure, sealing, sensor dimensions...), the thermal measurement is often performed in the tool or close to its surface. Thus, it only gives partial and disturbed information. Reliable information about the heat flux exchanged between the tool and the material during the process would be very helpful for improving process control and supporting the development of new materials. In this work, we present several laboratory-developed sensors for studying the molding steps in forming processes. The analysis of the obtained thermal measurements (temperature, heat flux) shows the sensitivity threshold required for thermal sensors to detect the rate of thermal reaction on-line. Based on these data, we present new sensor designs which have been patented.
Remote Sensing Technologies for Estuary Research and Management (Invited)
NASA Astrophysics Data System (ADS)
Hestir, E. L.; Ustin, S.; Khanna, S.; Botha, E.; Santos, M. J.; Anstee, J.; Greenberg, J. A.
2013-12-01
Estuarine ecosystems and their biogeochemical processes are extremely vulnerable to climate and environmental changes, and are threatened by sea level rise and upstream activities such as land use/land cover and hydrological changes. Despite the recognized threat to estuaries, most aspects of how change will affect estuaries are not well understood due to the poorly resolved understanding of the complex physical, chemical and biological processes and their interactions in estuarine systems. New and innovative remote sensing technologies such as high spectral resolution optical and thermal imagers and lidar, microwave radiometers and radar imagers enable measurements of key environmental parameters needed to establish baseline conditions and improve modeling efforts. Radar's sensitivity to water provides information about water height and velocity, channel geometry and wetland inundation. Water surface temperature and salinity can be measured by microwave radiometry, and when combined with radar-derived information can characterize estuarine hydrodynamics. Optical and thermal hyperspectral imagers provide information about sediment, plant and water chemistry including chlorophyll, dissolved organic matter and mineralogical composition. Lidar can measure bathymetry, microtopography and emergent plant structure. Plant functional types, wetland community distributions, turbidity, suspended and deposited sediments, dissolved organic matter, water column chlorophyll and phytoplankton functional types may be estimated from these measurements. Innovative deployment of advanced remote sensing technologies on airborne and submersible un-piloted platforms provides temporally and spatially continuous measurement in temporally dynamic and spatially complex tidal systems.
Through biophysically based retrievals, these technologies provide direct measurement of physical, biological and biogeochemical conditions that can be used in models to understand estuarine processes and forecast responses to change. We demonstrate that innovative remote sensing technologies, coupled with long-term datasets from satellite earth-observing missions and in situ sensor networks, provide the spatially contiguous measurements needed to make 'supra-regional' (e.g. river to coast) assessments of ecological communities, habitat distribution, ecosystem function, and sediment, nutrient and carbon sources and transport. We show that this information can be used to improve environmental modeling with increased confidence and support informed environmental management.
Processing reafferent and exafferent visual information for action and perception.
Reichenbach, Alexandra; Diedrichsen, Jörn
2015-01-01
A recent study suggests that reafferent hand-related visual information utilizes a privileged, attention-independent processing channel for motor control. This process was termed visuomotor binding to reflect its proposed function: linking visual reafferences to the corresponding motor control centers. Here, we ask whether the advantage of processing reafferent over exafferent visual information is a specific feature of the motor processing stream or whether the improved processing also benefits the perceptual processing stream. Human participants performed a bimanual reaching task in a cluttered visual display, and one of the visual hand cursors could be displaced laterally during the movement. We measured the rapid feedback responses of the motor system as well as matched perceptual judgments of which cursor was displaced. Perceptual judgments were made either by watching the visual scene without moving or simultaneously with the reaching task, such that in the latter case the perceptual processing stream could also profit from the specialized processing of reafferent information. Our results demonstrate that perceptual judgments in the heavily cluttered visual environment were improved when performed based on reafferent information. Even in this case, however, the filtering capability of the perceptual processing stream suffered more from the increasing complexity of the visual scene than that of the motor processing stream. These findings suggest partly shared and partly segregated processing of reafferent information for vision for motor control versus vision for perception.
Chiaravalloti, Nancy D; Stojanovic-Radic, Jelena; DeLuca, John
2013-01-01
The most common cognitive impairments in multiple sclerosis (MS) have been documented in specific domains, including new learning and memory, working memory, and information processing speed. However, little attempt has been made to increase our understanding of their relationship to one another. While recent studies have shown that processing speed impacts new learning and memory abilities in MS, the role of working memory in this relationship has received less attention. The present study examines the relative contribution of impaired working memory versus processing speed in new learning and memory functions in MS. Participants consisted of 51 individuals with clinically definite MS. Participants completed two measures of processing speed, two measures of working memory, and two measures of episodic memory. Data were analyzed via correlational and multiple regression analysis. Results indicate that the variance in new learning abilities in this sample was primarily associated with processing speed, with working memory exerting much less of an influence. Results are discussed in terms of the role of cognitive rehabilitation of new learning and memory abilities in persons with MS.
Al-Hawamdih, Sajidah; Ahmad, Muayyad M
2018-03-01
The purpose of this study was to examine nursing informatics competency and the quality of information processing among nurses in Jordan. The study was conducted in a large hospital with 380 registered nurses. The hospital introduced the electronic health record in 2010. The measures used in this study were personal and job characteristics, self-efficacy, Self-Assessment Nursing Informatics Competencies, and Health Information System Monitoring Questionnaire. The convenience sample consisted of 99 nurses who used the electronic health record for at least 3 months. The analysis showed that nine predictors explained 22% of the variance in the quality of information processing, whereas the statistically significant predictors were nursing informatics competency, clinical specialty, and years of nursing experience. There is a need for policies that advocate for every nurse to be educated in nursing informatics and the quality of information processing.
An examination of anticipated g-jitter on Space Station and its effects on materials processes
NASA Technical Reports Server (NTRS)
Nelson, Emily
1992-01-01
Information on anticipated g-jitter on Space Station Freedom and the effect of the jitter on materials processes is given in viewgraph form. It was concluded that g-jitter will dominate the acceleration environment; that it is a 3D multifrequency phenomenon; and that it varies dramatically in orientation. Information is given on calculated or measured sources of residual acceleration, aerodynamic drag, Shuttle acceleration measurements, the Space Station environment, tolerable g-levels as a function of frequency, directional solidification, vapor crystal growth, protein crystal growth, float zones, and liquid bridges.
Neural correlates of distraction and conflict resolution for nonverbal auditory events.
Stewart, Hannah J; Amitay, Sygal; Alain, Claude
2017-05-09
In everyday situations, auditory selective attention requires listeners to suppress task-irrelevant stimuli and to resolve conflicting information in order to make appropriate goal-directed decisions. Traditionally, these two processes (i.e. distractor suppression and conflict resolution) have been studied separately. In the present study we measured neuroelectric activity while participants performed a new paradigm in which both processes are quantified. In separate blocks of trials, participants indicated whether two sequential tones shared the same pitch or location, depending on the block's instruction. For the distraction measure, a positive component peaking at ~250 ms was found: a 'distraction positivity'. Brain electrical source analysis of this component suggests different generators when listeners attended to frequency and location, with the distraction by location more posterior than the distraction by frequency, providing support for the dual-pathway theory. For the conflict resolution measure, a negative frontocentral component (270-450 ms) was found, which showed similarities with that of prior studies on auditory and visual conflict resolution tasks. The timing and distribution are consistent with two distinct neural processes, with suppression of task-irrelevant information occurring before conflict resolution. This new paradigm may prove useful in clinical populations to assess impairments in filtering out task-irrelevant information and/or resolving conflicting information.
On the Capacity of Attention: Its Estimation and Its Role in Working Memory and Cognitive Aptitudes
ERIC Educational Resources Information Center
Cowan, N.; Elliott, E.M.; Scott Saults, J.; Morey, C.C.; Mattox, S.; Hismjatullina, A.; Conway, A.R.A.
2005-01-01
Working memory (WM) is the set of mental processes holding limited information in a temporarily accessible state in service of cognition. We provide a theoretical framework to understand the relation between WM and aptitude measures. The WM measures that have yielded high correlations with aptitudes include separate storage-and-processing task…
The Pictorial Fire Stroop: A Measure of Processing Bias for Fire-Related Stimuli
ERIC Educational Resources Information Center
Gallagher-Duffy, Joanne; MacKay, Sherri; Duffy, Jim; Sullivan-Thomas, Meara; Peterson-Badali, Michele
2009-01-01
Fire interest is a risk factor for firesetting. This study tested whether a fire-specific emotional Stroop task can effectively measure an information-processing bias for fire-related stimuli. Clinic-referred and nonreferred adolescents (aged 13-16 years) completed a pictorial "Fire Stroop," as well as a self-report fire interest questionnaire and…
Global Precipitation Measurement: GPM Microwave Imager (GMI) Algorithm Development Approach
NASA Technical Reports Server (NTRS)
Stocker, Erich Franz
2009-01-01
This slide presentation reviews the approach to the development of the Global Precipitation Measurement algorithm. This presentation includes information about the responsibilities for the development of the algorithm, and the calibration. Also included is information about the orbit, and the sun angle. The test of the algorithm code will be done with synthetic data generated from the Precipitation Processing System (PPS).
Coherence and specificity of information-processing biases in depression and social phobia.
Gotlib, Ian H; Kasch, Karen L; Traill, Saskia; Joormann, Jutta; Arnow, Bruce A; Johnson, Sheri L
2004-08-01
Research has not resolved whether depression is associated with a distinct information-processing bias, whether the content of the information-processing bias in depression is specific to themes of loss and sadness, or whether biases are consistent across the tasks most commonly used to assess attention and memory processing. In the present study, participants diagnosed with major depression, social phobia, or no Axis I disorder completed several information-processing tasks assessing attention and memory for sad, socially threatening, physically threatening, and positive stimuli. As predicted, depressed participants exhibited specific biases for stimuli connoting sadness; social phobic participants did not evidence such specificity for threat stimuli. It is important to note that the different measures of bias in memory and attention were not systematically intercorrelated. Implications for the study of cognitive bias in depression, and for cognitive theory more broadly, are discussed.
Information Sharing within the COEA Process
1993-12-01
AFIT/GIR/LAR/93D-9. Information Sharing within the COEA Process. Thesis presented in partial fulfillment of the requirements for the degree of Master of Science in Information Resource Management. Constance S. Maginnis, B.S.; Michael J. Monroe, M.A., GS-13, USAF. References include: Foster, Cost Accounting. Englewood Cliffs, NJ: Prentice Hall, 1991; Irwin, F.C., An Expert System for Measuring ... and Managing System Performance Factors.
Adaptive Sensing and Fusion of Multi-Sensor Data and Historical Information
2009-11-06
In this report we integrate MTL and semi-supervised learning into a single framework, thereby exploiting two forms of contextual information. A key new ... process [8], denoted as X ∼ BeP(B), where B is a measure on Ω. If B is continuous, X is a Poisson process with intensity B and can be constructed as X = N ...
Measuring the Process and Quality of Informed Consent for Clinical Research: Development and Testing
Cohn, Elizabeth Gross; Jia, Haomiao; Smith, Winifred Chapman; Erwin, Katherine; Larson, Elaine L.
2013-01-01
Purpose/Objectives To develop and assess the reliability and validity of an observational instrument, the Process and Quality of Informed Consent (P-QIC). Design A pilot study of the psychometrics of a tool designed to measure the quality and process of the informed consent encounter in clinical research. The study used professionally filmed, simulated consent encounters designed to vary in process and quality. Setting A major urban teaching hospital in the northeastern region of the United States. Sample 63 students enrolled in health-related programs participated in psychometric testing, 16 students participated in test-retest reliability, and 5 investigator-participant dyads were observed for the actual consent encounters. Methods For reliability and validity testing, students watched and rated videotaped simulations of four consent encounters intentionally varied in process and content and rated them with the proposed instrument. Test-retest reliability was established by raters watching the videotaped simulations twice. Inter-rater reliability was demonstrated by two simultaneous but independent raters observing an actual consent encounter. Main Research Variables The essential elements of information and communication for informed consent. Findings The initial testing of the P-QIC demonstrated reliable and valid psychometric properties in both the simulated standardized consent encounters and actual consent encounters in the hospital setting. Conclusions The P-QIC is an easy-to-use observational tool that provides a quick assessment of the areas of strength and areas that need improvement in a consent encounter. It can be used in the initial trainings of new investigators or consent administrators and in ongoing programs of improvement for informed consent. Implications for Nursing The development of a validated observational instrument will allow investigators to assess the consent process more accurately and evaluate strategies designed to improve it. 
PMID:21708532
Detecting measurement outliers: remeasure efficiently
NASA Astrophysics Data System (ADS)
Ullrich, Albrecht
2010-09-01
Shrinking structures, advanced optical proximity correction (OPC), and complex measurement strategies continually challenge critical dimension (CD) metrology tools and recipe creation processes. One important quality-ensuring task is the control of measurement outlier behavior. Outliers can trigger false-positive alarms for specification violations, impacting cycle time or potentially yield. A constantly high level of outliers not only degrades cycle time but also puts unnecessary stress on tool operators, eventually leading to human errors. At tool level, the sources of outliers are natural variations (e.g. beam current), drifts, contrast conditions, focus determination, or pattern recognition issues. Some of these can result from suboptimal or even wrong recipe settings, such as focus position or measurement box size. Such outliers, created by an automatic recipe creation process faced with more complicated structures, manifest themselves as systematic variation of measurements rather than as 'pure' tool variation. I analyzed several statistical methods to detect outliers. These range from classical outlier tests for extrema and robust metrics like the interquartile range (IQR) to methods evaluating the distribution of different populations of measurement sites, like the Cochran test. The latter especially suits the detection of systematic effects. The next level of outlier detection combines additional information about the mask and the manufacturing process with the measurement results. The methods were reviewed for measured variations assumed to be normally distributed with zero mean, but also for the presence of a statistically significant spatial process signature. I arrive at the conclusion that intelligent outlier detection can greatly improve the efficiency and cycle time of CD metrology.
In combination with process information like target, typical platform variation and signature, one can tailor the detection to the needs of the photomask at hand. By monitoring the outlier behavior carefully, weaknesses of the automatic recipe creation process can be spotted.
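The robust IQR screen mentioned above can be sketched as follows. This is a generic illustration of the technique, not the author's implementation; the measurement values are invented, and the paper's distribution-based tests (e.g. Cochran's) are not reproduced here.

```python
def iqr_outliers(values, k=1.5):
    """Flag values outside [Q1 - k*IQR, Q3 + k*IQR] (Tukey's fences)."""
    s = sorted(values)
    n = len(s)

    def quantile(q):
        # Linear interpolation between closest ranks.
        pos = q * (n - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        return s[lo] + (pos - lo) * (s[hi] - s[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < low or v > high]

# Hypothetical CD measurements (nm) for one feature, with one gross
# outlier such as might result from a failed pattern recognition.
cds = [45.1, 45.3, 44.9, 45.0, 45.2, 45.1, 52.8]
flagged = iqr_outliers(cds)
```

Because quartiles are insensitive to a few extreme values, the fences stay tight around the bulk of the measurements and the single bad site is flagged for remeasurement without shifting the control limits themselves.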
Digital image processing of vascular angiograms
NASA Technical Reports Server (NTRS)
Selzer, R. H.; Beckenbach, E. S.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.
1975-01-01
The paper discusses the estimation of the degree of atherosclerosis in the human femoral artery through the use of a digital image processing system for vascular angiograms. The film digitizer uses an electronic image dissector camera to scan the angiogram and convert the recorded optical density information into a numerical format. Another processing step involves locating the vessel edges from the digital image. The computer has been programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements are combined into an atherosclerosis index, which is found in a post-mortem study to correlate well with both visual and chemical estimates of atherosclerotic disease.
Razmjoo, Seyyed Ayatollah; Neissi, Sina
2010-12-01
The relationship between identity processing styles and language proficiency in English as a foreign language (EFL) was investigated among Persian EFL learners. The participants were 266 Persian candidates taking a Ph.D. examination at Shiraz University. The Language Proficiency Test was used to measure language proficiency in English, and the Identity Styles Inventory was used to measure normative, informational, and diffuse-avoidant identity processing styles. Relationships between normative and informational styles and language proficiency and its subscales (grammar, vocabulary, and reading) were positive and significant. Negative relationships between diffuse-avoidant style and language proficiency and its subscales were observed. There were significant sex differences for diffuse-avoidant style and for vocabulary.
Deterministic quantum teleportation and information splitting via a peculiar W-class state
NASA Astrophysics Data System (ADS)
Mei, Feng; Yu, Ya-Fei; Zhang, Zhi-Ming
2010-02-01
In the paper (Phys. Rev. A 74, 062320 (2006)), Agrawal et al. introduced a kind of W-class state which can be used for the quantum teleportation of a single-particle state via a three-particle von Neumann measurement, and they argued that the state could not be used to teleport an unknown state by making two-particle and one-particle measurements. Here we reconsider the features of the W-class state and the quantum teleportation process via the W-class state. We show that, by introducing a unitary operation, quantum teleportation can be achieved deterministically by making two-particle and one-particle measurements. In addition, our protocol is extended to the processes of teleporting a two-particle state and splitting information.
Automatic affective processing impairments in patients with deficit syndrome schizophrenia.
Strauss, Gregory P; Allen, Daniel N; Duke, Lisa A; Ross, Sylvia A; Schwartz, Jason
2008-07-01
Affective impairments were examined in patients with and without deficit syndrome schizophrenia. Two Emotional Stroop tasks designed to measure automatic processing of emotional information were administered to deficit (n=15) and non-deficit syndrome (n=26) schizophrenia patients classified according to the Schedule for the Deficit Syndrome, and matched non-patient control subjects (n=22). In comparison to non-deficit patients and controls, deficit syndrome patients demonstrated a lack of attention bias for positive information and an elevated attentional lingering effect for negative information. These findings suggest that positive information fails to automatically capture the attention of deficit syndrome patients, and that when negative information captures attention, it produces difficulty in disengagement. Attentional abnormalities were significantly correlated with negative symptoms, such that more severe symptoms were associated with less attention bias for positive emotion and a greater lingering effect for negative information. Results are generally consistent with a mood-congruent processing abnormality and suggest that impaired automatic processing may be core to the diminished emotional experience symptoms exhibited in deficit syndrome patients.
Measuring Operational Resilience Using the CERT(Registered) Resilience Management Model
2010-09-01
(such as ISO 27002 [ISO 2005]) and then measure the implementation and performance of practices contained in the standard. This checklist-based ap... Security techniques – Code of practice for information security management. ISO/IEC 27002:2005, June 2005. Also known as ISO/IEC 17799:2005. [ISO 2007] ... Table 23: ISO 15939 Process Activities and Tasks (p. 54); Table 24: CERT-RMM Measurement and Analysis Process Area Goals and Practices (p. 55). CMU/SEI
Heisz, Jennifer J; Vakorin, Vasily; Ross, Bernhard; Levine, Brian; McIntosh, Anthony R
2014-01-01
Episodic memory and semantic memory produce very different subjective experiences yet rely on overlapping networks of brain regions for processing. Traditional approaches for characterizing functional brain networks emphasize static states of function and thus are blind to the dynamic information processing within and across brain regions. This study used information theoretic measures of entropy to quantify changes in the complexity of the brain's response as measured by magnetoencephalography while participants listened to audio recordings describing past personal episodic and general semantic events. Personal episodic recordings evoked richer subjective mnemonic experiences and more complex brain responses than general semantic recordings. Critically, we observed a trade-off between the relative contribution of local versus distributed entropy, such that personal episodic recordings produced relatively more local entropy whereas general semantic recordings produced relatively more distributed entropy. Changes in the relative contributions of local and distributed entropy to the total complexity of the system provides a potential mechanism that allows the same network of brain regions to represent cognitive information as either specific episodes or more general semantic knowledge.
New developments in developmental research on social information processing and antisocial behavior.
Fontaine, Reid Griffith
2010-07-01
The Special Section on developmental research on social information processing (SIP) and antisocial behavior is introduced here. Following a brief history of SIP theory, comments are provided on several themes that tie together four new empirical investigations: measurement and assessment, attributional and interpretational style, response evaluation and decision, and the relation between emotion and SIP. Notable contributions of these studies are highlighted.
Truck driver informational overload, fiscal year 1992. Final report, 1 July 1991-30 September 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacAdam, C.C.
1992-09-01
The document represents the final project report for a study entitled 'Truck Driver Informational Overload,' sponsored by the Motor Vehicle Manufacturers Association through its Motor Truck Research Committee and associated Operations/Performance Panels. As stated in the initial project statement, the objective of the work was to provide guidance for developing methods for measuring driving characteristics during information processing tasks. The report contains results from two basic project activities: (1) a literature review on multiple-task performance and driver information overload, and (2) a description of driving simulator side-task experiments and a discussion of findings from tests conducted with eight subjects. Two of the key findings from a set of disturbance-input tests conducted with the simulator and the eight test subjects were that: (1) standard deviations of vehicle lateral position and heading (yaw) angle measurements showed the greatest sensitivity to the presence of side-task activities during basic information processing tasks, and (2) corresponding standard deviations of driver steering activity, vehicle yaw rate, and lateral acceleration measurements were largely insensitive indicators of side-task activity.
Information-driven self-organization: the dynamical system approach to autonomous robot behavior.
Ay, Nihat; Bernigau, Holger; Der, Ralf; Prokopenko, Mikhail
2012-09-01
In recent years, information theory has come into the focus of researchers interested in the sensorimotor dynamics of both robots and living beings. One root of these approaches is the idea that living beings are information processing systems and that the optimization of these processes should be an evolutionary advantage. Apart from these more fundamental questions, there has recently been much interest in the question of how a robot can be equipped with an internal drive for innovation or curiosity that may serve as a drive for an open-ended, self-determined development of the robot. The success of these approaches depends essentially on the choice of a convenient measure of the information. This article studies in some detail the use of the predictive information (PI), also called excess entropy or effective measure complexity, of the sensorimotor process. The PI of a process quantifies the total information of past experience that can be used for predicting future events. However, the application of information-theoretic measures in robotics is mostly restricted to the case of a finite, discrete state-action space. This article aims at applying the PI in the dynamical systems approach to robot control. We study linear systems as a first step and derive exact results for the PI together with explicit learning rules for the parameters of the controller. Interestingly, these learning rules are of Hebbian nature and local in the sense that the synaptic update is given by the product of activities available directly at the pertinent synaptic ports. The general findings are exemplified by a number of case studies. In particular, in a two-dimensional system designed to mimic embodied systems with latent oscillatory locomotion patterns, it is shown that maximizing the PI means recognizing and amplifying the latent modes of the robotic system.
This and many other examples show that the learning rules derived from the maximum PI principle are a versatile tool for the self-organization of behavior in complex robotic systems.
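As a toy illustration of the predictive-information measure discussed in the abstract above (a sketch only, not the authors' derivation for linear continuous systems): for a stationary two-state Markov chain, the one-step PI is simply the mutual information between successive states, and a strongly persistent chain carries more PI than a memoryless one.

```python
import math

def one_step_predictive_information(T):
    """One-step predictive information I(X_t; X_{t+1}) in bits for a
    stationary two-state Markov chain with transition matrix T."""
    # Stationary distribution of a 2x2 chain: pi0 = P(1->0) / (P(0->1) + P(1->0)).
    pi0 = T[1][0] / (T[0][1] + T[1][0])
    pi = [pi0, 1.0 - pi0]
    mi = 0.0
    for x in range(2):
        for y in range(2):
            pxy = pi[x] * T[x][y]                      # joint P(X_t=x, X_{t+1}=y)
            py = pi[0] * T[0][y] + pi[1] * T[1][y]     # marginal of X_{t+1}
            if pxy > 0:
                mi += pxy * math.log2(pxy / (pi[x] * py))
    return mi

# A strongly persistent chain: the past is highly informative about the future.
print(one_step_predictive_information([[0.9, 0.1], [0.1, 0.9]]))
# A memoryless (fair-coin) chain carries zero predictive information.
print(one_step_predictive_information([[0.5, 0.5], [0.5, 0.5]]))
```

The contrast between the two calls mirrors the abstract's point that PI rewards dynamics whose past constrains the future.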
MEASUREMENT OF INDOOR AIR EMISSIONS FROM DRY-PROCESS PHOTOCOPY MACHINES
The article provides background information on indoor air emissions from office equipment, with emphasis on dry-process photocopy machines. The test method is described in detail along with results of a study to evaluate the test method using four dry-process photocopy machines. ...
NASA Astrophysics Data System (ADS)
McNeil, Ronald D.; Miele, Renato; Shaul, Dennis
2000-10-01
Information technology is driving improvements in manufacturing systems, resulting in higher productivity and quality. However, corporate strategy is driven by a number of factors and includes data and pressure from multiple stakeholders, including employees, managers, executives, stockholders, boards, suppliers, and customers. It is also driven by information about competitors and emerging technology. Much information is based on processing of data and the resulting biases of the processors. Thus, stakeholders can base inputs on faulty perceptions that are not reality based. Prior to processing, the data used may be inaccurate. Sources of data and information may include demographic reports, statistical analyses, intelligence reports (e.g., marketing data), technology, and primary data collection. The reliability and validity of data, as well as the management of sources and information, are critical elements in strategy formulation. The paper explores data collection, processing, and analysis from secondary and primary sources, information generation, and report presentation for strategy formulation, and contrasts this with the data and information used to drive internal processes such as manufacturing. The hypothesis is that internal processes, such as manufacturing, are subordinate to corporate strategies. The impact of possible divergence in quality of decisions at the corporate level on IT-driven, quality-manufacturing processes based on measurable outcomes is significant. Recommendations for IT improvements at the corporate strategy level are given.
Adapting large batteries of research measures for immigrants.
Aroian, Karen J
2013-06-01
A four-step, streamlined process to adapt a large battery of measures for a study of mother-child adjustment in Arab Muslim immigrants and the lessons learned are described. The streamlined process includes adapting content, translation, pilot testing, and extensive psychometric evaluation but omits in-depth qualitative inquiry to identify the full content domain of the constructs of interest and cognitive interviews to assess how respondents interpret items. Lessons learned suggest that the streamlined process is not sufficient for certain measures, particularly when there is little published information about how the measure performs with different groups, the measure requires substantial item revision to achieve content equivalence, and the measure is both challenging to translate and has little to no redundancy. When these conditions are present, condition-specific procedures need to be added to the streamlined process.
Direct measurement of exciton dissociation energy in polymers
NASA Astrophysics Data System (ADS)
Toušek, J.; Toušková, J.; Chomutová, R.; Paruzel, B.; Pfleger, J.
2017-01-01
The exciton dissociation energy was obtained by comparing the thickness of the space charge region estimated from capacitance measurements of a prepared Schottky diode with that estimated from photovoltage spectra. While the capacitance measurements provide information about the total width of the space charge region (SCR), the surface photovoltaic effect brings information only about the part of the SCR where the electric field is sufficiently high to cause dissociation. For determination of the dissociation energy, it is sufficient to find the electric potential in the SCR where the process starts.
NASA Technical Reports Server (NTRS)
Barkstrom, B. R.
1983-01-01
The measurement of the earth's radiation budget has been chosen to illustrate the technique of objective system design. The measurement process is an approximately linear transformation of the original field of radiant exitances, so that linear statistical techniques may be employed. The combination of variability, measurement strategy, and error propagation is presently made with the help of information theory, as suggested by Kondratyev et al. (1975) and Peckham (1974). Covariance matrices furnish the quantitative statement of field variability.
Single-snapshot 2D color measurement by plenoptic imaging system
NASA Astrophysics Data System (ADS)
Masuda, Kensuke; Yamanaka, Yuji; Maruyama, Go; Nagai, Sho; Hirai, Hideaki; Meng, Lingfei; Tosic, Ivana
2014-03-01
Plenoptic cameras enable capture of directional light ray information, thus allowing applications such as digital refocusing, depth estimation, or multiband imaging. One of the most common plenoptic camera architectures contains a microlens array at the conventional image plane and a sensor at the back focal plane of the microlens array. We leverage the multiband imaging (MBI) function of this camera and develop a single-snapshot, single-sensor, high-color-fidelity camera. Our camera is based on a plenoptic system with XYZ filters inserted in the pupil plane of the main lens. To achieve high color measurement precision of this system, we perform an end-to-end optimization of the system model that includes light source information, object information, optical system information, plenoptic image processing and color estimation processing. Optimized system characteristics are exploited to build an XYZ plenoptic colorimetric camera prototype that achieves high color measurement precision. We describe an application of our colorimetric camera to color shading evaluation of displays and show that it achieves color accuracy of ΔE<0.01.
Measuring patterns in team interaction sequences using a discrete recurrence approach.
Gorman, Jamie C; Cooke, Nancy J; Amazeen, Polemnia G; Fouse, Shannon
2012-08-01
Recurrence-based measures of communication determinism and pattern information are described and validated using previously collected team interaction data. Team coordination dynamics has revealed that "mixing" team membership can lead to flexible interaction processes, but keeping a team "intact" can lead to rigid interaction processes. We hypothesized that communication of intact teams would have greater determinism and higher pattern information compared to that of mixed teams. Determinism and pattern information were measured from three-person Uninhabited Air Vehicle team communication sequences over a series of 40-minute missions. Because team members communicated using push-to-talk buttons, communication sequences were automatically generated during each mission. The Composition x Mission determinism effect was significant. Intact teams' determinism increased over missions, whereas mixed teams' determinism did not change. Intact teams had significantly higher maximum pattern information than mixed teams. Results from these new communication analysis methods converge with content-based methods and support our hypotheses. Because they are not content based, and because they are automatic and fast, these new methods may be amenable to real-time communication pattern analysis.
Complete information acquisition in scanning probe microscopy
Belianinov, Alex; Kalinin, Sergei V.; Jesse, Stephen
2015-03-13
In the last three decades, scanning probe microscopy (SPM) has emerged as a primary tool for exploring and controlling the nanoworld. A critical part of the SPM measurements is the information transfer from the tip-surface junction to a macroscopic measurement system. This process reduces the many degrees of freedom of a vibrating cantilever to relatively few parameters recorded as images. Similarly, the details of dynamic cantilever response at sub-microsecond time scales of transients, higher-order eigenmodes and harmonics are averaged out by transitioning to millisecond time scale of pixel acquisition. Hence, the amount of information available to the external observer is severely limited, and its selection is biased by the chosen data processing method. Here, we report a fundamentally new approach for SPM imaging based on information theory-type analysis of the data stream from the detector. This approach allows full exploration of complex tip-surface interactions, spatial mapping of the multidimensional variability of material properties and their mutual interactions, and SPM imaging at the information channel capacity limit.
Lin, Yen-Ko; Chen, Chao-Wen; Lee, Wei-Che; Lin, Tsung-Ying; Kuo, Liang-Chi; Lin, Chia-Ju; Shi, Leiyu; Tien, Yin-Chun; Cheng, Yuan-Chia
2017-11-29
Ensuring adequate informed consent for surgery in a trauma setting is challenging. We developed and pilot tested an educational video containing information regarding the informed consent process for surgery in trauma patients and a knowledge measure instrument and evaluated whether the audiovisual presentation improved the patients' knowledge regarding their procedure and aftercare and their satisfaction with the informed consent process. A modified Delphi technique in which a panel of experts participated in successive rounds of shared scoring of items to forecast outcomes was applied to reach a consensus among the experts. The resulting consensus was used to develop the video content and questions for measuring the understanding of the informed consent for debridement surgery in limb trauma patients. The expert panel included experienced patients. The participants in this pilot study were enrolled as a convenience sample of adult trauma patients scheduled to receive surgery. The modified Delphi technique comprised three rounds over a 4-month period. The items given higher scores by the experts in several categories were chosen for the subsequent rounds until consensus was reached. The experts reached a consensus on each item after the three-round process. The final knowledge measure comprising 10 questions was developed and validated. Thirty eligible trauma patients presenting to the Emergency Department (ED) were approached and completed the questionnaires in this pilot study. The participants exhibited significantly higher mean knowledge and satisfaction scores after watching the educational video than before watching the video. Our process is promising for developing procedure-specific informed consent and audiovisual aids in medical and surgical specialties. The educational video was developed using a scientific method that integrated the opinions of different stakeholders, particularly patients. 
This video is a useful tool for improving the knowledge and satisfaction of trauma patients in the ED. The modified Delphi technique is an effective method for collecting experts' opinions and reaching a consensus on the content of educational materials for informed consent. Institutions should prioritize patient-centered health care and develop a structured informed consent process to improve the quality of care. The ClinicalTrials.gov Identifier is NCT01338480. The date of registration was April 18, 2011 (retrospectively registered).
I spy with my little eye: cognitive processing of framed physical activity messages.
Bassett-Gunter, Rebecca L; Latimer-Cheung, Amy E; Martin Ginis, Kathleen A; Castelhano, Monica
2014-01-01
The primary purpose was to examine the relative cognitive processing of gain-framed versus loss-framed physical activity (PA) messages following exposure to health risk information. Guided by the Extended Parallel Process Model, the secondary purpose was to examine the relation between dwell time, message recall, and message-relevant thoughts, as well as perceived risk, personal relevance, and fear arousal. Baseline measures of perceived risk for inactivity-related disease and health problems were administered to 77 undergraduate students. Participants read population-specific health risk information while wearing a head-mounted eye tracker, which measured dwell time on message content. Perceived risk was then reassessed. Next, participants read PA messages while the eye tracker measured dwell time on message content. Immediately following message exposure, recall, thought-listing, fear arousal, and personal relevance were measured. Dwell time on gain-framed messages was significantly greater than loss-framed messages. However, message recall and thought-listing did not differ by message frame. Dwell time was not significantly related to recall or thought-listing. Consistent with the Extended Parallel Process Model, fear arousal was significantly related to recall, thought-listing, and personal relevance. In conclusion, gain-framed messages may evoke greater dwell time than loss-framed messages. However, dwell time alone may be insufficient for evoking further cognitive processing.
Mahnke, Andrea N; Plasek, Joseph M; Hoffman, David G; Partridge, Nathan S; Foth, Wendy S; Waudby, Carol J; Rasmussen, Luke V; McManus, Valerie D; McCarty, Catherine A
2014-01-01
Many informed consent studies demonstrate that research subjects poorly retain and understand information in written consent documents. Previous research in multimedia consent is mixed in terms of success for improving participants’ understanding, satisfaction, and retention. This failure may be due to a lack of a community-centered design approach to building the interventions. The goal of this study was to gather information from the community to determine the best way to undertake the consent process. Community perceptions regarding different computer-based consenting approaches were evaluated, and a computer-based consent was developed and tested. A second goal was to evaluate whether participants make truly informed decisions to participate in research. Simulations of an informed consent process were videotaped to document the process. Focus groups were conducted to determine community attitudes towards a computer-based informed consent process. Hybrid focus groups were conducted to determine the most acceptable hardware device. Usability testing was conducted on a computer-based consent prototype using a touch-screen kiosk. Based on feedback, a computer-based consent was developed. Representative study participants were able to easily complete the consent, and all were able to correctly answer the comprehension check questions. Community involvement in developing a computer-based consent proved valuable for a population-based genetic study. These findings may translate to other types of informed consents, such as genetic clinical trials consents. A computer-based consent may serve to better communicate consistent, clear, accurate, and complete information regarding the risks and benefits of study participation. Additional analysis is necessary to measure the level of comprehension of the check-question answers by larger numbers of participants. 
The next step will involve contacting participants to measure whether understanding of what they consented to is retained over time. PMID:24273095
The application of information theory for the research of aging and aging-related diseases.
Blokh, David; Stambler, Ilia
2017-10-01
This article reviews the application of information-theoretical analysis, employing measures of entropy and mutual information, for the study of aging and aging-related diseases. The research of aging and aging-related diseases is particularly suitable for the application of information theory methods, as aging processes and related diseases are multi-parametric, with continuous parameters coexisting alongside discrete parameters, and with the relations between the parameters being as a rule non-linear. Information theory provides unique analytical capabilities for the solution of such problems, with unique advantages over common linear biostatistics. Among the age-related diseases, information theory has been used in the study of neurodegenerative diseases (particularly using EEG time series for diagnosis and prediction), cancer (particularly for establishing individual and combined cancer biomarkers), diabetes (mainly utilizing mutual information to characterize the diseased and aging states), and heart disease (mainly for the analysis of heart rate variability). Few works have employed information theory for the analysis of general aging processes and frailty, as underlying determinants and possible early preclinical diagnostic measures for aging-related diseases. Generally, the use of information-theoretical analysis permits not only establishing the (non-linear) correlations between diagnostic or therapeutic parameters of interest, but may also provide a theoretical insight into the nature of aging and related diseases by establishing the measures of variability, adaptation, regulation or homeostasis, within a system of interest. It may be hoped that the increased use of such measures in research may considerably increase diagnostic and therapeutic capabilities and the fundamental theoretical mathematical understanding of aging and disease. Copyright © 2016 Elsevier Ltd. All rights reserved.
Task-technology fit of video telehealth for nurses in an outpatient clinic setting.
Cady, Rhonda G; Finkelstein, Stanley M
2014-07-01
Incorporating telehealth into outpatient care delivery supports management of consumer health between clinic visits. Task-technology fit is a framework for understanding how technology helps and/or hinders a person during work processes. Evaluating the task-technology fit of video telehealth for personnel working in a pediatric outpatient clinic and providing care between clinic visits ensures the information provided matches the information needed to support work processes. The workflow of advanced practice registered nurse (APRN) care coordination provided via telephone and video telehealth was described and measured using a mixed-methods workflow analysis protocol that incorporated cognitive ethnography and time-motion study. Qualitative and quantitative results were merged and analyzed within the task-technology fit framework to determine the workflow fit of video telehealth for APRN care coordination. Incorporating video telehealth into APRN care coordination workflow provided visual information unavailable during telephone interactions. Despite additional tasks and interactions needed to obtain the visual information, APRN workflow efficiency, as measured by time, was not significantly changed. Analyzed within the task-technology fit framework, the increased visual information afforded by video telehealth supported the assessment and diagnostic information needs of the APRN. Telehealth must provide the right information to the right clinician at the right time. Evaluating task-technology fit using a mixed-methods protocol ensured rigorous analysis of fit within work processes and identified workflows that benefit most from the technology.
Use of cancer performance measures in population health: a macro-level perspective.
Clauser, Steven B
2004-01-01
The use of performance measurement to inform macro-level studies of cancer control and quality of care is receiving increasing interest at the state, national, and international level. This article describes the use of these measures to inform health policy and monitor cancer disparities and disease burden. Applications are discussed in clinical and provider-reported outcomes such as cancer incidence, mortality and survival, and outcome-linked processes of care, and patient-reported outcomes such as health-related quality of life and patient satisfaction/experience with care. The use of economic measures to monitor and evaluate the burden of illness is also discussed. The growing demand for surveillance capability, coupled with the need to expand both the quality and breadth of available measure sets, suggests that there is a need to supplement traditional clinical and provider-reported process and outcomes measures with patient-reported outcomes measures such as health-related quality of life and patient satisfaction and experience with care. In addition, there is also a need to broaden and standardize outcome-linked process-of-care measures to improve the ability to measure and monitor incremental progress in improving cancer care. Finally, better measures of indirect costs of cancer care, such as lost productivity and caregiver burden among the aged, would improve national estimates of the cost of illness associated with cancer.
Gillies, Katie; Entwistle, Vikki; Treweek, Shaun P; Fraser, Cynthia; Williamson, Paula R; Campbell, Marion K
2015-10-27
The process of obtaining informed consent for participation in randomised controlled trials (RCTs) was established as a mechanism to protect participants against undue harm from research and allow people to recognise any potential risks or benefits associated with the research. A number of interventions have been put forward to improve this process. Outcomes reported in trials of interventions to improve the informed consent process for decisions about trial participation tend to focus on the 'understanding' of trial information. However, the operationalization of understanding as a concept, the tools used to measure it and the timing of the measurements are heterogeneous. A lack of clarity exists regarding which outcomes matter (to whom) and why. This inconsistency results in difficulties when making comparisons across studies, as evidenced in two recent systematic reviews of informed consent interventions. As such, no optimal method for measuring the impact of these interventions aimed at improving informed consent for RCTs has been identified. The project will adopt and adapt methodology previously developed and used in projects developing core outcome sets for assessment of clinical treatments. Specifically, the work will consist of three stages: 1) A systematic methodology review of existing outcome measures of trial informed consent interventions; 2) Interviews with key stakeholders to explore additional outcomes relevant for trial participation decisions; and 3) A Delphi study to refine the core outcome set for evaluation of trial informed consent interventions. All stages will include the stakeholders involved in the various aspects of RCT consent: users (that is, patients), developers (that is, trialists), deliverers (focusing on research nurses) and authorisers (that is, ethics committees). A final consensus meeting including all stakeholders will be held to review outcomes.
The ELICIT study aims to develop a core outcome set for the evaluation of interventions intended to improve informed consent for RCTs for use in future RCTs and reviews, thereby improving the reliability and consistency of research in this area.
[A heart function measuring and analyzing instrument based on single-chip microcomputer].
Rong, Z; Liang, H; Wang, S
1999-05-01
This paper introduces a heart function measuring and analyzing instrument based on a single-chip microcomputer, which provides sample acquisition, processing, control, adjustment, keyboard input, and printing. All information is provided and displayed in Chinese.
Age Differences in Free Recall Rehearsal Strategies.
ERIC Educational Resources Information Center
Sanders, Raymond E.; And Others
1980-01-01
Young adults' rehearsal was serially and categorically organized. Older adults' rehearsal was nonstrategic. Results show that direct strategy measures provide more information about processes underlying age differences in memory than do outcome measures alone. (Author)
Adaptive automation of human-machine system information-processing functions.
Kaber, David B; Wright, Melanie C; Prinzel, Lawrence J; Clamann, Michael P
2005-01-01
The goal of this research was to describe the ability of human operators to interact with adaptive automation (AA) applied to various stages of complex systems information processing, defined in a model of human-automation interaction. Forty participants operated a simulation of an air traffic control task. Automated assistance was adaptively applied to information acquisition, information analysis, decision making, and action implementation aspects of the task based on operator workload states, which were measured using a secondary task. The differential effects of the forms of automation were determined and compared with a manual control condition. Results of two 20-min trials of AA or manual control revealed a significant effect of the type of automation on performance, particularly during manual control periods as part of the adaptive conditions. Humans appear to better adapt to AA applied to sensory and psychomotor information-processing functions (action implementation) than to AA applied to cognitive functions (information analysis and decision making), and AA is superior to completely manual control. Potential applications of this research include the design of automation to support air traffic controller information processing.
Information extraction during simultaneous motion processing.
Rideaux, Reuben; Edwards, Mark
2014-02-01
When confronted with multiple moving objects the visual system can process them in two stages: an initial stage in which a limited number of signals are processed in parallel (i.e. simultaneously) followed by a sequential stage. We previously demonstrated that during the simultaneous stage, observers could discriminate between presentations containing up to 5 vs. 6 spatially localized motion signals (Edwards & Rideaux, 2013). Here we investigate what information is actually extracted during the simultaneous stage and whether the simultaneous limit varies with the detail of information extracted. This was achieved by measuring the ability of observers to extract varied information from low detail, i.e. the number of signals presented, to high detail, i.e. the actual directions present and the direction of a specific element, during the simultaneous stage. The results indicate that the resolution of simultaneous processing varies as a function of the information which is extracted, i.e. as the information extraction becomes more detailed, from the number of moving elements to the direction of a specific element, the capacity to process multiple signals is reduced. Thus, when assigning a capacity to simultaneous motion processing, this must be qualified by designating the degree of information extraction. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
Improving informed consent: Stakeholder views
Anderson, Emily E.; Newman, Susan B.; Matthews, Alicia K.
2017-01-01
Purpose: Innovation will be required to improve the informed consent process in research. We aimed to obtain input from key stakeholders—research participants and those responsible for obtaining informed consent—to inform potential development of a multimedia informed consent “app.” Methods: This descriptive study used a mixed-methods approach. Five 90-minute focus groups were conducted with volunteer samples of former research participants and researchers/research staff responsible for obtaining informed consent. Participants also completed a brief survey that measured background information and knowledge and attitudes regarding research and the use of technology. Established qualitative methods were used to conduct the focus groups and data analysis. Results: We conducted five focus groups with 41 total participants: three groups with former research participants (total n = 22), and two groups with researchers and research coordinators (total n = 19). Overall, individuals who had previously participated in research had positive views regarding their experiences. However, further discussion elicited that the informed consent process often did not meet its intended objectives. Findings from both groups are presented according to three primary themes: content of consent forms, experience of the informed consent process, and the potential of technology to improve the informed consent process. A fourth theme, need for lay input on informed consent, emerged from the researcher groups. Conclusions: Our findings add to previous research that suggests that the use of interactive technology has the potential to improve the process of informed consent. However, our focus-group findings provide additional insight that technology cannot replace the human connection that is central to the informed consent process. 
More research that incorporates the views of key stakeholders is needed to ensure that multimedia consent processes do not repeat the mistakes of paper-based consent forms. PMID:28949896
Improving informed consent: Stakeholder views.
Anderson, Emily E; Newman, Susan B; Matthews, Alicia K
2017-01-01
Innovation will be required to improve the informed consent process in research. We aimed to obtain input from key stakeholders-research participants and those responsible for obtaining informed consent-to inform potential development of a multimedia informed consent "app." This descriptive study used a mixed-methods approach. Five 90-minute focus groups were conducted with volunteer samples of former research participants and researchers/research staff responsible for obtaining informed consent. Participants also completed a brief survey that measured background information and knowledge and attitudes regarding research and the use of technology. Established qualitative methods were used to conduct the focus groups and data analysis. We conducted five focus groups with 41 total participants: three groups with former research participants (total n = 22), and two groups with researchers and research coordinators (total n = 19). Overall, individuals who had previously participated in research had positive views regarding their experiences. However, further discussion elicited that the informed consent process often did not meet its intended objectives. Findings from both groups are presented according to three primary themes: content of consent forms, experience of the informed consent process, and the potential of technology to improve the informed consent process. A fourth theme, need for lay input on informed consent, emerged from the researcher groups. Our findings add to previous research that suggests that the use of interactive technology has the potential to improve the process of informed consent. However, our focus-group findings provide additional insight that technology cannot replace the human connection that is central to the informed consent process. More research that incorporates the views of key stakeholders is needed to ensure that multimedia consent processes do not repeat the mistakes of paper-based consent forms.
Wirshing, Donna A; Sergi, Mark J; Mintz, Jim
2005-01-01
This study evaluated a brief educational video designed to enhance the informed consent process for people with serious mental and medical illnesses who are considering participating in treatment research. Individuals with schizophrenia who were being recruited for ongoing clinical trials, medical patients without self-reported psychiatric comorbidity, and university undergraduates were randomly assigned to view either a highly structured instructional videotape about the consent process in treatment research or a control videotape that presented only general information about bioethical issues in human research. Knowledge about informed consent was measured before and after viewing. Viewing the experimental videotape resulted in larger gains in knowledge about informed consent. Standardized effect sizes were large in all groups. The videotape was thus an effective teaching tool across diverse populations, ranging from individuals with severe chronic mental illness to university undergraduates.
Sansgiry, S S; Cady, P S
1997-01-01
Currently, marketed over-the-counter (OTC) medication labels were simulated and tested in a controlled environment to understand consumer evaluation of OTC label information. Two factors, consumers' age (younger and older adults) and label designs (picture-only, verbal-only, congruent picture-verbal, and noncongruent picture-verbal) were controlled and tested to evaluate consumer information processing. The effects exerted by the independent variables, namely, comprehension of label information (understanding) and product evaluations (satisfaction, certainty, and perceived confusion) were evaluated on the dependent variable purchase intention. Intention measured as purchase recommendation was significantly related to product evaluations and affected by the factor label design. Participants' level of perceived confusion was more important than actual understanding of information on OTC medication labels. A Label Evaluation Process Model was developed which could be used for future testing of OTC medication labels.
Tung, Li-Chen; Yu, Wan-Hui; Lin, Gong-Hong; Yu, Tzu-Ying; Wu, Chien-Te; Tsai, Chia-Yin; Chou, Willy; Chen, Mei-Hsiang; Hsieh, Ching-Lin
2016-09-01
To develop a Tablet-based Symbol Digit Modalities Test (T-SDMT) and to examine the test-retest reliability and concurrent validity of the T-SDMT in patients with stroke. The study had two phases. In the first phase, six experts, nine college students and five outpatients participated in the development and testing of the T-SDMT. In the second phase, 52 outpatients were evaluated twice (2 weeks apart) with the T-SDMT and SDMT to examine the test-retest reliability and concurrent validity of the T-SDMT. The T-SDMT was developed via expert input and college student/patient feedback. Regarding test-retest reliability, the practise effects of the T-SDMT and SDMT were both trivial (d=0.12) but significant (p≦0.015). The improvement in the T-SDMT (4.7%) was smaller than that in the SDMT (5.6%). The minimal detectable changes (MDC%) of the T-SDMT and SDMT were 6.7 (22.8%) and 10.3 (32.8%), respectively. The T-SDMT and SDMT were highly correlated with each other at the two time points (Pearson's r=0.90-0.91). The T-SDMT demonstrated good concurrent validity with the SDMT. Because the T-SDMT had a smaller practise effect and less random measurement error (superior test-retest reliability), it is recommended over the SDMT for assessing information processing speed in patients with stroke. Implications for Rehabilitation The Symbol Digit Modalities Test (SDMT), a common measure of information processing speed, showed a substantial practise effect and considerable random measurement error in patients with stroke. The Tablet-based SDMT (T-SDMT) has been developed to reduce the practise effect and random measurement error of the SDMT in patients with stroke. The T-SDMT had smaller practise effect and random measurement error than the SDMT, which can provide more reliable assessments of information processing speed.
Concrete Crack Identification Using a UAV Incorporating Hybrid Image Processing
Lee, Junhwa; Ahn, Eunjong; Cho, Soojin; Shin, Myoungsu
2017-01-01
Crack assessment is an essential process in the maintenance of concrete structures. In general, concrete cracks are inspected by manual visual observation of the surface, which is intrinsically subjective as it depends on the experience of inspectors. Further, it is time-consuming, expensive, and often unsafe when inaccessible structural members are to be assessed. Unmanned aerial vehicle (UAV) technologies combined with digital image processing have recently been applied to crack assessment to overcome the drawbacks of manual visual inspection. However, identification of crack information in terms of width and length has not been fully explored in the UAV-based applications, because of the absence of distance measurement and tailored image processing. This paper presents a crack identification strategy that combines hybrid image processing with UAV technology. Equipped with a camera, an ultrasonic displacement sensor, and a WiFi module, the system provides the image of cracks and the associated working distance from a target structure on demand. The obtained information is subsequently processed by hybrid image binarization to estimate the crack width accurately while minimizing the loss of the crack length information. The proposed system has shown to successfully measure cracks thicker than 0.1 mm with the maximum length estimation error of 7.3%. PMID:28880254
Analysis of Preferred Directions in Phase Space for Tidal Measurements at Europa
NASA Astrophysics Data System (ADS)
Boone, D.; Scheeres, D. J.
2012-12-01
The NASA Jupiter Europa Orbiter mission requires a circular, near-polar orbit to measure Europa's Love numbers, geophysical coefficients which give insight into whether a liquid ocean exists. This type of orbit about planetary satellites is known to be unstable. The effects of Jupiter's tidal gravity are seen in changes in Europa's gravity field and surface deformation, which are sensed through doppler tracking over time and altimetry measurements respectively. These two measurement types separately determine the h and k Love numbers, a combination of which bounds how thick the ice shell of Europa is and whether liquid water is present. This work shows how the properties of an unstable periodic orbit about Europa generate preferred measurement directions in position and velocity phase space for the orbit determination process. We generate an error covariance over seven days for the orbiter state and science parameters using a periodic orbit and then disperse the orbit initial conditions in a Monte Carlo simulation according to this covariance. The dispersed orbits are shown to have a bias toward longer lifetimes and we discuss this as an effect of the stable and unstable manifolds of the periodic orbit. Using an epoch formulation of a square-root information filter, measurements aligned with the unstable manifold mapped back in time add more information to the orbit determination process than measurements aligned with the stable manifold. This corresponds to a contraction in the uncertainty of the estimate of the desired parameters, including the Love numbers. We demonstrate this mapping mathematically using a representation of the State Transition Matrix involving its eigenvectors and eigenvalues. Then using the properties of left and right eigenvectors, we show how measurements in the orbit determination process are mapped in time leading to a concentration of information at epoch. 
We present examples of measurements taken on different time schedules to show the effect of preferred phase space directions in the estimation process. Manifold coordinate decomposition is applied to the orbit initial conditions as well as measurement partials in the filter to show the alignment of each with the stable and unstable manifolds of the periodic orbit. The connection between orbit lifetime and regions of increased information density in phase space is made using the properties of these manifolds. Low altitude, near-polar periodic orbits with these characteristics are discussed along with the estimation results for the Love numbers, orbiter state, and orbit lifetime. Different measurement schedules and the resulting estimation performance are presented along with an analysis of information content for single measurements with respect to manifold alignment. These results allow more sensitive estimation of the tidal Love numbers and may allow measurements to be taken less frequently or compensate for corrupted data arcs. Other measurement types will be mapped in the same way using the State Transition matrix and have increased information density at epoch if aligned with the unstable manifold. In the same way, these results are applicable to planetary satellite orbiters about Enceladus or Dione since they share the governing equations of motion and properties of unstable periodic orbits.
A Cartesian reflex assessment of face processing.
Polewan, Robert J; Vigorito, Christopher M; Nason, Christopher D; Block, Richard A; Moore, John W
2006-03-01
Commands to blink were embedded within pictures of faces and simple geometric shapes or forms. The faces and shapes were conditioned stimuli (CSs), and the required responses were conditioned responses, or more properly, Cartesian reflexes (CRs). As in classical conditioning protocols, response times (RTs) were measured from CS onset. RTs provided a measure of the processing cost (PC) of attending to a CS. A PC is the extra time required to respond relative to RTs to unconditioned stimulus (US) commands presented alone. They reflect the interplay between attentional processing of the informational content of a CS and its signaling function with respect to the US command. This resulted in longer RTs to embedded commands. Differences between PCs of faces and geometric shapes represent a starting place for a new mental chronometry based on the traditional idea that differences in RT reflect differences in information processing.
Benefits of CMM-Based Software Process Improvement: Initial Results
1994-08-01
Institute Carnegie Mellon University Pittsburgh, Pennsylvania 15213 This report was prepar the SEI Joint Program Office HQ ESC/ENS 5 Eglin Street Hanscom AFB...Miller, Lt Col, USAF SEI Joint Program Office This work is sponsored by the U.S. Department of Defense. Copyright 0 1994 by Carnegie Mellon University...categories: descriptive information about the organizations, information about their process improvement and measurement programs , and data about the
ERIC Educational Resources Information Center
National Bureau of Standards (DOC), Washington, DC.
These guidelines provide a handbook for use by federal organizations in structuring physical security and risk management programs for their automatic data processing facilities. This publication discusses security analysis, natural disasters, supporting utilities, system reliability, procedural measures and controls, off-site facilities,…
1997-09-30
Screen, abandoning changes. APPAREL ORDER PROCESSING MODULE FIELD USER MANUAL Ordering Official Screens The Ordering Official Screens are provided for...currendy selected Ordering Official will appear on the Ordering Official Information Screen. APPAREL ORDER PROCESSING MODULE FIELD USER MANUAL Ordering Official
Temporal characteristics of audiovisual information processing.
Fuhrmann Alpert, Galit; Hein, Grit; Tsai, Nancy; Naumer, Marcus J; Knight, Robert T
2008-05-14
In complex natural environments, auditory and visual information often have to be processed simultaneously. Previous functional magnetic resonance imaging (fMRI) studies focused on the spatial localization of brain areas involved in audiovisual (AV) information processing, but the temporal characteristics of AV information flow in these regions remained unclear. In this study, we used fMRI and a novel information-theoretic approach to study the flow of AV sensory information. Subjects passively perceived sounds and images of objects presented either alone or simultaneously. Applying the measure of mutual information, we computed for each voxel the latency in which the blood oxygenation level-dependent signal had the highest information content about the preceding stimulus. The results indicate that, after AV stimulation, the earliest informative activity occurs in right Heschl's gyrus, left primary visual cortex, and the posterior portion of the superior temporal gyrus, which is known as a region involved in object-related AV integration. Informative activity in the anterior portion of superior temporal gyrus, middle temporal gyrus, right occipital cortex, and inferior frontal cortex was found at a later latency. Moreover, AV presentation resulted in shorter latencies in multiple cortical areas compared with isolated auditory or visual presentation. The results provide evidence for bottom-up processing from primary sensory areas into higher association areas during AV integration in humans and suggest that AV presentation shortens processing time in early sensory cortices.
Approximate reversibility in the context of entropy gain, information gain, and complete positivity
NASA Astrophysics Data System (ADS)
Buscemi, Francesco; Das, Siddhartha; Wilde, Mark M.
2016-06-01
There are several inequalities in physics which limit how well we can process physical systems to achieve some intended goal, including the second law of thermodynamics, entropy bounds in quantum information theory, and the uncertainty principle of quantum mechanics. Recent results provide physically meaningful enhancements of these limiting statements, determining how well one can attempt to reverse an irreversible process. In this paper, we apply and extend these results to give strong enhancements to several entropy inequalities, having to do with entropy gain, information gain, entropic disturbance, and complete positivity of open quantum systems dynamics. Our first result is a remainder term for the entropy gain of a quantum channel. This result implies that a small increase in entropy under the action of a subunital channel is a witness to the fact that the channel's adjoint can be used as a recovery map to undo the action of the original channel. We apply this result to pure-loss, quantum-limited amplifier, and phase-insensitive quantum Gaussian channels, showing how a quantum-limited amplifier can serve as a recovery from a pure-loss channel and vice versa. Our second result regards the information gain of a quantum measurement, both without and with quantum side information. We find here that a small information gain implies that it is possible to undo the action of the original measurement if it is efficient. The result also has operational ramifications for the information-theoretic tasks known as measurement compression without and with quantum side information. Our third result shows that the loss of Holevo information caused by the action of a noisy channel on an input ensemble of quantum states is small if and only if the noise can be approximately corrected on average. 
We finally establish that the reduced dynamics of a system-environment interaction are approximately completely positive and trace preserving if and only if the data processing inequality holds approximately.
Crisp, R J; Hewstone, M; Cairns, E
2001-12-01
A study was conducted to explore whether participants in Northern Ireland attend to, and process information about, different group members as a function of a single dimension of category membership (religion) or as a function of additional and/or alternative bases for group membership. Utilizing a bogus 'newspaper story' paradigm, we explored whether participants would differentially recall target attributes as a function of two dimensions of category membership. Findings from this recall measure suggested that information concerning ingroup and outgroup members was processed as an interactive function of both religion and gender intergroup dimensions. Religion was only used to guide processing of more specific information if the story character was also an outgroup member on the gender dimension. These findings suggest a complex pattern of intergroup representation in the processing of group-relevant information in the Northern Irish context.
Neuroergonomics: Quantitative Modeling of Individual, Shared, and Team Neurodynamic Information.
Stevens, Ronald H; Galloway, Trysha L; Willemsen-Dunlap, Ann
2018-06-01
The aim of this study was to use the same quantitative measure and scale to directly compare the neurodynamic information/organizations of individual team members with those of the team. Team processes are difficult to separate from those of individual team members due to the lack of quantitative measures that can be applied to both process sets. Second-by-second symbolic representations were created of each team member's electroencephalographic power, and quantitative estimates of their neurodynamic organizations were calculated from the Shannon entropy of the symbolic data streams. The information in the neurodynamic data streams of health care ( n = 24), submarine navigation ( n = 12), and high school problem-solving ( n = 13) dyads was separated into the information of each team member, the information shared by team members, and the overall team information. Most of the team information was the sum of each individual's neurodynamic information. The remaining team information was shared among the team members. This shared information averaged ~15% of the individual information, with momentary levels of 1% to 80%. Continuous quantitative estimates can be made from the shared, individual, and team neurodynamic information about the contributions of different team members to the overall neurodynamic organization of a team and the neurodynamic interdependencies among the team members. Information models provide a generalizable quantitative method for separating a team's neurodynamic organization into that of individual team members and that shared among team members.
Seibert, Julie; Fields, Suzanne; Fullerton, Catherine Anne; Mark, Tami L; Malkani, Sabrina; Walsh, Christine; Ehrlich, Emily; Imshaug, Melina; Tabrizi, Maryam
2015-06-01
The structure-process-outcome quality framework espoused by Donabedian provides a conceptual way to examine and prioritize behavioral health quality measures used by states. This report presents an environmental scan of the quality measures and satisfaction surveys that state Medicaid managed care and behavioral health agencies used prior to Medicaid expansion in 2014. Data were collected by reviewing online documents related to Medicaid managed care contracts for behavioral health, quality strategies, quality improvement plans, quality and performance indicators data, annual outcomes reports, performance measure specification manuals, legislative reports, and Medicaid waiver requests for proposals. Information was publicly available for 29 states. Most states relied on process measures, along with some structure and outcome measures. Although all states reported on at least one process measure of behavioral health quality, 52% of states did not use any outcomes measures and 48% of states had no structure measures. A majority of the states (69%) used behavioral health measures from the National Committee for Quality Assurance's Healthcare Effectiveness Data and Information Set, and all but one state in the sample (97%) used consumer experience-of-care surveys. Many states supplemented these data with locally developed behavioral health indicators that rely on administrative and nonadministrative data. State Medicaid agencies are using nationally recognized as well as local measures to assess quality of behavioral health care. Findings indicate a need for additional nationally endorsed measures in the area of substance use disorders and treatment outcomes.
An integrative approach for measuring semantic similarities using gene ontology.
Peng, Jiajie; Li, Hongxiang; Jiang, Qinghua; Wang, Yadong; Chen, Jin
2014-01-01
Gene Ontology (GO) provides rich information and a convenient way to study gene functional similarity, which has been successfully used in various applications. However, the existing GO based similarity measurements have limited functions for only a subset of GO information is considered in each measure. An appropriate integration of the existing measures to take into account more information in GO is demanding. We propose a novel integrative measure called InteGO2 to automatically select appropriate seed measures and then to integrate them using a metaheuristic search method. The experiment results show that InteGO2 significantly improves the performance of gene similarity in human, Arabidopsis and yeast on both molecular function and biological process GO categories. InteGO2 computes gene-to-gene similarities more accurately than tested existing measures and has high robustness. The supplementary document and software are available at http://mlg.hit.edu.cn:8082/.
Monitoring landscape level processes using remote sensing of large plots
Raymond L. Czaplewski
1991-01-01
Global and regional assessaents require timely information on landscape level status (e.g., areal extent of different ecosystems) and processes (e.g., changes in land use and land cover). To measure and understand these processes at the regional level, and model their impacts, remote sensing is often necessary. However, processing massive volumes of remotely sensing...
NASA Technical Reports Server (NTRS)
Hasler, A. F.; Desjardins, M.; Shenk, W. E.
1979-01-01
Simultaneous Geosynchronous Operational Environmental Satellite (GOES) 1 km resolution visible image pairs can provide quantitative three dimensional measurements of clouds. These data have great potential for severe storms research and as a basic parameter measurement source for other areas of meteorology (e.g. climate). These stereo cloud height measurements are not subject to the errors and ambiguities caused by unknown cloud emissivity and temperature profiles that are associated with infrared techniques. This effort describes the display and measurement of stereo data using digital processing techniques.
Markovic, Gabriela; Schult, Marie-Louise; Bartfai, Aniko; Elg, Mattias
2017-01-31
Progress in early cognitive recovery after acquired brain injury is uneven and unpredictable, and thus the evaluation of rehabilitation is complex. The use of time-series measurements is susceptible to statistical change due to process variation. To evaluate the feasibility of using a time-series method, statistical process control, in early cognitive rehabilitation. Participants were 27 patients with acquired brain injury undergoing interdisciplinary rehabilitation of attention within 4 months post-injury. The outcome measure, the Paced Auditory Serial Addition Test, was analysed using statistical process control. Statistical process control identifies if and when change occurs in the process according to 3 patterns: rapid, steady or stationary performers. The statistical process control method was adjusted, in terms of constructing the baseline and the total number of measurement points, in order to measure a process in change. Statistical process control methodology is feasible for use in early cognitive rehabilitation, since it provides information about change in a process, thus enabling adjustment of the individual treatment response. Together with the results indicating discernible subgroups that respond differently to rehabilitation, statistical process control could be a valid tool in clinical decision-making. This study is a starting-point in understanding the rehabilitation process using a real-time-measurements approach.
Laser Doppler dust devil measurements
NASA Technical Reports Server (NTRS)
Bilbro, J. W.; Jeffreys, H. B.; Kaufman, J. W.; Weaver, E. A.
1977-01-01
A scanning laser doppler velocimeter (SLDV) system was used to detect, track, and measure the velocity flow field of naturally occurring tornado-like flows (dust devils) in the atmosphere. A general description of the dust devil phenomenon is given along with a description of the test program, measurement system, and data processing techniques used to collect information on the dust devil flow field. The general meteorological conditions occurring during the test program are also described, and the information collected on two selected dust devils are discussed in detail to show the type of information which can be obtained with a SLDV system. The results from these measurements agree well with those of other investigators and illustrate the potential for the SLDV in future endeavors.
Spectral simplicity of apparent complexity. II. Exact complexities and complexity spectra
NASA Astrophysics Data System (ADS)
Riechers, Paul M.; Crutchfield, James P.
2018-03-01
The meromorphic functional calculus developed in Part I overcomes the nondiagonalizability of linear operators that arises often in the temporal evolution of complex systems and is generic to the metadynamics of predicting their behavior. Using the resulting spectral decomposition, we derive closed-form expressions for correlation functions, finite-length Shannon entropy-rate approximates, asymptotic entropy rate, excess entropy, transient information, transient and asymptotic state uncertainties, and synchronization information of stochastic processes generated by finite-state hidden Markov models. This introduces analytical tractability to investigating information processing in discrete-event stochastic processes, symbolic dynamics, and chaotic dynamical systems. Comparisons reveal mathematical similarities between complexity measures originally thought to capture distinct informational and computational properties. We also introduce a new kind of spectral analysis via coronal spectrograms and the frequency-dependent spectra of past-future mutual information. We analyze a number of examples to illustrate the methods, emphasizing processes with multivariate dependencies beyond pairwise correlation. This includes spectral decomposition calculations for one representative example in full detail.
Information-based models for finance and insurance
NASA Astrophysics Data System (ADS)
Hoyle, Edward
2010-10-01
In financial markets, the information that traders have about an asset is reflected in its price. The arrival of new information then leads to price changes. The `information-based framework' of Brody, Hughston and Macrina (BHM) isolates the emergence of information, and examines its role as a driver of price dynamics. This approach has led to the development of new models that capture a broad range of price behaviour. This thesis extends the work of BHM by introducing a wider class of processes for the generation of the market filtration. In the BHM framework, each asset is associated with a collection of random cash flows. The asset price is the sum of the discounted expectations of the cash flows. Expectations are taken with respect (i) an appropriate measure, and (ii) the filtration generated by a set of so-called information processes that carry noisy or imperfect market information about the cash flows. To model the flow of information, we introduce a class of processes termed Lévy random bridges (LRBs), generalising the Brownian and gamma information processes of BHM. Conditioned on its terminal value, an LRB is identical in law to a Lévy bridge. We consider in detail the case where the asset generates a single cash flow X_T at a fixed date T. The flow of information about X_T is modelled by an LRB with random terminal value X_T. An explicit expression for the price process is found by working out the discounted conditional expectation of X_T with respect to the natural filtration of the LRB. New models are constructed using information processes related to the Poisson process, the Cauchy process, the stable-1/2 subordinator, the variance-gamma process, and the normal inverse-Gaussian process. These are applied to the valuation of credit-risky bonds, vanilla and exotic options, and non-life insurance liabilities.
Acquisition of Programming Skills
1990-04-01
skills (e.g., arithmetic reasoning, work knowledge, information processing speed); and c) passive versus active learning style. Ability measures...concurrent storage and processing an individual was capable of doing), and an active learning style. Implications of the findings for the development of
α-induced reaction cross section measurements on 197Au
NASA Astrophysics Data System (ADS)
Szücs, Tamás; Gyürky, György; Halász, Zoltán; Kiss, Gábor Gy.; Fülöp, Zsolt
2018-01-01
The γ-process is responsible for creating the majority of the isotopes of heavier elements on the proton rich side of the valley of stability. The γ-process simulations fail to reproduce the measured solar system abundance of these isotopes. The problem can lie in the not well known astrophysical scenarios where the process takes place, or in the not sufficiently known nuclear physics input. To improve the latter part, α-induced reaction cross section measurements on 197Au were carried out at Atomki. With this dataset new experimental information will become available, which can be later used as validation of the theoretical cross section calculations used in the γ-process simulations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horodecki, Michal; Sen, Aditi; Sen, Ujjwal
The impossibility of cloning and deleting unknown states constitutes an important restriction on the processing of information in the quantum world. On the other hand, a known quantum state can always be cloned or deleted. However, if we restrict the class of allowed operations, there will arise restrictions on the ability of cloning and deleting machines. We have shown that cloning and deleting of known states is in general not possible by local operations. This impossibility hints at quantum correlation in the state. We propose dual measures of quantum correlation based on the dual restrictions of no local cloning and no local deleting. The measures are relative entropy distances of the desired states in a (generally impossible) perfect local cloning or local deleting process from the best approximate state that is actually obtained by imperfect local cloning or deleting machines. Just like the dual measures of entanglement cost and distillable entanglement, the proposed measures are based on important processes in quantum information. We discuss their properties. For the case of pure states, estimations of these two measures are also provided. Interestingly, the entanglement of cloning for a maximally entangled state of two two-level systems is not unity.
NASA Astrophysics Data System (ADS)
Grenn, Michael W.
This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of information. The meaning of requirements information in the systems engineering context is estimated or measured in terms of the cumulative requirements quality Q, which corresponds to the distribution of the requirements among the available quality levels. The requirements entropy framework (REF) implements the theory to address the requirements engineering problem. The REF defines the relationship between requirements changes, requirements volatility, requirements quality, requirements entropy and uncertainty, and engineering effort. The REF is evaluated via simulation experiments to assess its practical utility as a new method for measuring, monitoring and predicting requirements trends and engineering effort at any given time in the process. The REF treats the requirements engineering process as an open system in which the requirements are discrete information entities that transition from initial states of high entropy, disorder and uncertainty toward the desired state of minimum entropy as engineering effort is input and requirements increase in quality. The distribution of the total number of requirements R among the N discrete quality levels is determined by the number of defined quality attributes accumulated by R at any given time. Quantum statistics are used to estimate the number of possibilities P for arranging R among the available quality levels. The requirements entropy H_R is estimated using R, N and P by extending principles of information theory and statistical mechanics to the requirements engineering process.
The information I increases as H_R and uncertainty decrease, and the change in information ΔI needed to reach the desired state of quality is estimated from the perspective of the receiver. The H_R may increase, decrease or remain steady depending on the degree to which additions, deletions and revisions impact the distribution of R among the quality levels. Current requirements trend metrics generally treat additions, deletions and revisions the same and simply measure the quantity of these changes over time. The REF evaluates the quantity of requirements changes over time, distinguishes between their positive and negative effects by calculating their impact on H_R, Q, and ΔI, and forecasts when the desired state will be reached, enabling more accurate assessment of the status and progress of the requirements engineering effort. Results from random variable simulations suggest the REF is an improved leading indicator of requirements trends that can be readily combined with current methods. The increase in I, or decrease in H_R and uncertainty, is proportional to the engineering effort E input into the requirements engineering process. The REF estimates the ΔE needed to transition R from their current state of quality to the desired end state or some other interim state of interest. Simulation results are compared with measured engineering effort data for Department of Defense programs published in the SE literature, and the results suggest the REF is a promising new method for estimation of ΔE.
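The abstract does not spell out the REF's exact counting (the quantum-statistics estimate of P), but the qualitative behaviour it describes, entropy high when R is scattered across the N quality levels and falling toward zero as requirements converge on the desired state, can be sketched with a plain Shannon-entropy analogue. The function name and sample data below are hypothetical illustrations, not the REF's formulas:

```python
import math
from collections import Counter

def requirements_entropy(quality_levels):
    """Shannon-style entropy (bits) of the distribution of requirements
    across discrete quality levels: maximal when requirements are spread
    evenly over many levels, zero when all sit at a single level."""
    counts = Counter(quality_levels)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# entropy falls as engineering effort moves requirements up the levels
early = [0, 1, 1, 2, 3, 0, 2, 1]  # scattered maturity -> high entropy
late = [4, 4, 4, 4, 4, 4, 4, 3]   # nearly converged   -> low entropy
```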
Objectively-Measured Physical Activity and Cognitive Functioning in Breast Cancer Survivors
Marinac, Catherine R.; Godbole, Suneeta; Kerr, Jacqueline; Natarajan, Loki; Patterson, Ruth E.; Hartman, Sheri J.
2015-01-01
Purpose To explore the relationship between objectively measured physical activity and cognitive functioning in breast cancer survivors. Methods Participants were 136 postmenopausal breast cancer survivors. Cognitive functioning was assessed using a comprehensive computerized neuropsychological test. Seven-day physical activity was assessed using hip-worn accelerometers. Linear regression models examined associations of minutes per day of physical activity at various intensities with individual cognitive functioning domains. The partially adjusted model controlled for primary confounders (model 1), and subsequent adjustments were made for chemotherapy history (model 2) and BMI (model 3). Interaction and stratified models examined BMI as an effect modifier. Results Moderate-to-vigorous physical activity (MVPA) was associated with Information Processing Speed. Specifically, ten minutes of MVPA was associated with a 1.35-point higher score (out of 100) on the Information Processing Speed domain in the partially adjusted model, and a 1.29-point higher score when chemotherapy was added to the model (both p<.05). There was a significant BMI x MVPA interaction (p=.051). In models stratified by BMI (<25 vs. ≥25 kg/m2), the favorable association between MVPA and Information Processing Speed was stronger in the subsample of overweight and obese women (p<.05), but not statistically significant in the leaner subsample. Light-intensity physical activity was not significantly associated with any of the measured domains of cognitive function. Conclusions MVPA may have favorable effects on Information Processing Speed in breast cancer survivors, particularly among overweight or obese women. Implications for Cancer Survivors Interventions targeting increased physical activity may enhance aspects of cognitive function among breast cancer survivors. PMID:25304986
Fetterhoff, Dustin; Opris, Ioan; Simpson, Sean L.; Deadwyler, Sam A.; Hampson, Robert E.; Kraft, Robert A.
2014-01-01
Background Multifractal analysis quantifies the time-scale-invariant properties in data by describing the structure of variability over time. By applying this analysis to hippocampal interspike interval sequences recorded during performance of a working memory task, a measure of long-range temporal correlations and multifractal dynamics can reveal single neuron correlates of information processing. New method Wavelet leaders-based multifractal analysis (WLMA) was applied to hippocampal interspike intervals recorded during a working memory task. WLMA can be used to identify neurons likely to exhibit information processing relevant to operation of brain–computer interfaces and nonlinear neuronal models. Results Neurons involved in memory processing (“Functional Cell Types” or FCTs) showed a greater degree of multifractal firing properties than neurons without task-relevant firing characteristics. In addition, previously unidentified FCTs were revealed because multifractal analysis suggested further functional classification. The cannabinoid-type 1 receptor partial agonist, tetrahydrocannabinol (THC), selectively reduced multifractal dynamics in FCT neurons compared to non-FCT neurons. Comparison with existing methods WLMA is an objective tool for quantifying the memory-correlated complexity represented by FCTs that reveals additional information compared to classification of FCTs using traditional z-scores to identify neuronal correlates of behavioral events. Conclusion z-Score-based FCT classification provides limited information about the dynamical range of neuronal activity characterized by WLMA. Increased complexity, as measured with multifractal analysis, may be a marker of functional involvement in memory processing. The level of multifractal attributes can be used to differentially emphasize neural signals to improve computational models and algorithms underlying brain–computer interfaces. PMID:25086297
NASA Astrophysics Data System (ADS)
Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf
2017-09-01
There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate, H(p) = -∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.
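The degenerate functional and the first of the three counterexamples are easy to make concrete. The sketch below implements H(p) and a self-reinforcing Pólya urn; it is a minimal illustration, not the paper's derivation of S_EXT, S_IT, or S_MEP:

```python
import math
import random

def shannon_entropy(p):
    """H(p) = -sum_i p_i log p_i, the functional shared by the Clausius/
    Boltzmann/Gibbs, Shannon, and Jaynes notions in the multinomial case."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def polya_urn(steps, colors=2, reinforce=1, seed=0):
    """Polya urn: each draw returns `reinforce` extra balls of the drawn
    color to the urn, so draw probabilities depend on the whole history
    (non-multinomial, nonergodic). Returns the final ball counts."""
    rng = random.Random(seed)
    urn = [1] * colors
    for _ in range(steps):
        color = rng.choices(range(colors), weights=urn)[0]
        urn[color] += reinforce
    return urn
```

Repeated runs of the urn with different seeds illustrate the history dependence: early draws get locked in by reinforcement, so the final color fractions differ from run to run rather than converging to a single multinomial limit.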
Meegan, Daniel V; Honsberger, Michael J M
2005-05-01
Many neuroimaging studies have been designed to differentiate domain-specific processes in the brain. A common design constraint is to use identical stimuli for different domain-specific tasks. For example, an experiment investigating spatial versus identity processing would present compound spatial-identity stimuli in both spatial and identity tasks, and participants would be instructed to attend to, encode, maintain, or retrieve spatial information in the spatial task, and identity information in the identity task. An assumption in such studies is that spatial information will not be processed in the identity task, as it is irrelevant for that task. We report three experiments demonstrating violations of this assumption. Our results suggest that comparisons of spatial and identity tasks in existing neuroimaging studies have underestimated the amount of brain activation that is spatial-specific. For future neuroimaging studies, we recommend unique stimulus displays for each domain-specific task, and event-related measurement of post-stimulus processing.
Goychuk, I
2001-08-01
Stochastic resonance in a simple model of information transfer is studied for sensory neurons and ensembles of ion channels. An exact expression for the information gain is obtained for the Poisson process with the signal-modulated spiking rate. This result allows one to generalize the conventional stochastic resonance (SR) problem (with periodic input signal) to the arbitrary signals of finite duration (nonstationary SR). Moreover, in the case of a periodic signal, the rate of information gain is compared with the conventional signal-to-noise ratio. The paper establishes the general nonequivalence between both measures notwithstanding their apparent similarity in the limit of weak signals.
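For the simplest case of a binary signal that switches the spiking rate between r0 and r1, the information gain can be evaluated numerically from the two Poisson count distributions. This sketch is an illustrative mutual-information computation under that binary restriction, not the paper's exact expression:

```python
import math

def poisson_pmf(n, lam):
    """P(N = n) for a Poisson random variable with mean lam."""
    return math.exp(-lam) * lam**n / math.factorial(n)

def info_gain_bits(r0, r1, window, p_signal=0.5, n_max=100):
    """Mutual information (bits) between a binary signal and the count of
    a Poisson process whose rate is signal-modulated (r0 without the
    signal, r1 with it) over an observation window."""
    priors = {0: 1.0 - p_signal, 1: p_signal}
    lams = {0: r0 * window, 1: r1 * window}
    total = 0.0
    for n in range(n_max + 1):
        p_n = sum(priors[s] * poisson_pmf(n, lams[s]) for s in (0, 1))
        for s in (0, 1):
            p_ns = poisson_pmf(n, lams[s])
            if p_ns > 0.0 and p_n > 0.0:
                total += priors[s] * p_ns * math.log2(p_ns / p_n)
    return total
```

An uninformative channel (r0 = r1) yields zero bits, while well-separated rates approach the one-bit ceiling set by the binary signal's entropy.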
Psychiatric symptomatology and the recall of positive and negative personality information.
Furnham, A; Cheng, H
1996-09-01
Various studies from the cognitive information processing tradition have shown that neuroticism is particularly associated with the preferential processing of negative information about the self. Just over 60 'normal' subjects completed the 22-item Langner (1962, Journal of Health and Human Behaviour, 3, 269-276) measure of minor psychiatric symptoms. Later, they were presented with a list of positive, neutral and negative trait words for self-rating. After 1 hr, subjects were asked to recall all the trait words. As predicted, the Langner (1962) score was associated with an increased probability of recalling negative self-referent information (r = 0.36). Implications for therapy are considered.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moran, James; Alexander, Thomas; Aalseth, Craig
2017-08-01
Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of T behavior in the environment.
The Units Ontology: a tool for integrating units of measurement in science
Gkoutos, Georgios V.; Schofield, Paul N.; Hoehndorf, Robert
2012-01-01
Units are basic scientific tools that render meaning to numerical data. Their standardization and formalization support the reporting, exchange, processing, reproducibility and integration of quantitative measurements. Ontologies are a means of facilitating the integration of data and knowledge, allowing interoperability and semantic information processing between diverse biomedical resources and domains. Here, we present the Units Ontology (UO), an ontology currently being used in many scientific resources for the standardized description of units of measurement. PMID:23060432
NASA Technical Reports Server (NTRS)
Bogomolov, E. A.; Yevstafev, Y. Y.; Karakadko, V. K.; Lubyanaya, N. D.; Romanov, V. A.; Totubalina, M. G.; Yamshchikov, M. A.
1975-01-01
A system for the recording and processing of telescope data is considered for measurements of EW asymmetry. The information is recorded by 45 channels on a continuously moving 35-mm film. The dead time of the recorder is about 0.1 sec. A sorting electronic circuit is used to reduce the errors when the statistical time distribution of the pulses is recorded. The recorded information is read out by means of photoresistors. The phototransmitter signals are fed either to the mechanical recorder unit for preliminary processing, or to a logical circuit which controls the operation of the punching device. The punched tape is processed by an electronic computer.
NASA Astrophysics Data System (ADS)
Klus, Jakub; Pořízka, Pavel; Prochazka, David; Mikysek, Petr; Novotný, Jan; Novotný, Karel; Slobodník, Marek; Kaiser, Jozef
2017-05-01
This paper presents a novel approach for processing the spectral information obtained from high-resolution elemental mapping performed by means of Laser-Induced Breakdown Spectroscopy. The proposed methodology is aimed at the description of possible elemental associations within a heterogeneous sample. High-resolution elemental mapping provides a large number of measurements. Moreover, a typical laser-induced plasma spectrum consists of several thousand spectral variables. Analysis of heterogeneous samples, where valuable information is hidden in a limited fraction of the sample mass, requires special treatment. The sample under study is a sandstone-hosted uranium ore that shows irregular distribution of ore elements such as zirconium, titanium, uranium and niobium. The presented methodology reduces the dimensionality of the data while retaining the spectral information by utilizing self-organizing maps (SOM). The spectral information from the SOM is processed further to detect either simultaneous or isolated presence of elements. Conclusions suggested by the SOM are in good agreement with geological studies of mineralization phases performed at the deposit. Deeper investigation of the SOM results enables discrimination of interesting measurements and reveals new possibilities in the visualization of chemical mapping information. The suggested approach improves the description of elemental associations in mineral phases, which is crucial for the mining industry.
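A SOM reduces each high-dimensional spectrum to its best-matching prototype on a small 2-D grid while preserving neighborhood structure. The sketch below is a generic textbook SOM in plain numpy under assumed hyperparameters, not the authors' implementation:

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=10, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal self-organizing map: fits a grid of prototype vectors to
    the data, pulling the best-matching unit (BMU) and its grid neighbors
    toward each sample with a decaying learning rate and radius."""
    rng = np.random.default_rng(seed)
    n, d = data.shape
    weights = rng.random((grid[0], grid[1], d))
    coords = np.stack(np.meshgrid(np.arange(grid[0]), np.arange(grid[1]),
                                  indexing="ij"), axis=-1)
    total, step = epochs * n, 0
    for _ in range(epochs):
        for x in data[rng.permutation(n)]:
            frac = step / total
            lr = lr0 * (1.0 - frac)
            sigma = sigma0 * (1.0 - frac) + 0.5
            # best-matching unit: grid cell with the closest prototype
            bmu = np.unravel_index(
                np.argmin(((weights - x) ** 2).sum(-1)), grid)
            dist2 = ((coords - np.array(bmu)) ** 2).sum(-1)
            h = np.exp(-dist2 / (2.0 * sigma**2))[..., None]
            weights += lr * h * (x - weights)  # neighborhood update
            step += 1
    return weights
```

After training, mapping each measurement to its BMU index gives the reduced representation on which simultaneous versus isolated elemental presence can be inspected.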
Cognition and Health Literacy in Older Adults’ Recall of Self-Care Information
Madison, Anna; Gao, Xuefei; Graumlich, James F.; Conner-Garcia, Thembi; Murray, Michael D.; Stine-Morrow, Elizabeth A. L.; Morrow, Daniel G.
2017-01-01
Abstract Purpose of the Study: Health literacy is associated with health outcomes presumably because it influences the understanding of information needed for self-care. However, little is known about the language comprehension mechanisms that underpin health literacy. Design and Methods: We explored the relationship between a commonly used measure of health literacy (Short Test of Functional Health Literacy in Adults [STOFHLA]) and comprehension of health information among 145 older adults. Results: Results showed that performance on the STOFHLA was associated with recall of health information. Consistent with the Process-Knowledge Model of Health Literacy, mediation analysis showed that both processing capacity and knowledge mediated the association between health literacy and recall of health information. In addition, knowledge moderated the effects of processing capacity limits, such that processing capacity was less likely to be associated with recall for older adults with higher levels of knowledge. Implications: These findings suggest that knowledge contributes to health literacy and can compensate for deficits in processing capacity to support comprehension of health information among older adults. The implications of these findings for improving patient education materials for older adults with inadequate health literacy are discussed. PMID:26209450
Comparing clinical automated, medical record, and hybrid data sources for diabetes quality measures.
Kerr, Eve A; Smith, Dylan M; Hogan, Mary M; Krein, Sarah L; Pogach, Leonard; Hofer, Timothy P; Hayward, Rodney A
2002-10-01
Little is known about the relative reliability of medical record and clinical automated data, sources commonly used to assess diabetes quality of care. The agreement between diabetes quality measures constructed from clinical automated versus medical record data sources was compared, and the performance of hybrid measures derived from a combination of the two data sources was examined. Medical records were abstracted for 1,032 patients with diabetes who received care from 21 facilities in 4 Veterans Integrated Service Networks. Automated data were obtained from a central Veterans Health Administration diabetes registry containing information on laboratory tests and medication use. Success rates were higher for process measures derived from medical record data than from automated data, but no substantial differences among data sources were found for the intermediate outcome measures. Agreement for measures derived from the medical record compared with automated data was moderate for process measures but high for intermediate outcome measures. Hybrid measures yielded success rates similar to those of medical record-based measures but would have required about 50% fewer chart reviews. Agreement between medical record and automated data was generally high. Yet even in an integrated health care system with sophisticated information technology, automated data tended to underestimate the success rate in technical process measures for diabetes care and yielded different quartile performance rankings for facilities. Applying hybrid methodology yielded results consistent with the medical record but required less data to come from medical record reviews.
On Meaningful Measurement: Concepts, Technology and Examples.
ERIC Educational Resources Information Center
Cheung, K. C.
This paper discusses how concepts and procedural skills in problem-solving tasks, as well as affects and emotions, can be subjected to meaningful measurement (MM), based on a multisource model of learning and a constructivist information-processing theory of knowing. MM refers to the quantitative measurement of conceptual and procedural knowledge…
Measuring Transactive Memory Systems Using Network Analysis
ERIC Educational Resources Information Center
King, Kylie Goodell
2017-01-01
Transactive memory systems (TMSs) describe the structures and processes that teams use to share information, work together, and accomplish shared goals. First introduced over three decades ago, TMSs have been measured in a variety of ways. This dissertation proposes the use of network analysis in measuring TMS. This is accomplished by describing…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-27
... through primary processing; (2) to analyze the economic performance effects of current management measures; and (3) to analyze the economic performance effects of alternative management measures. The measures... used to track economic performance and to evaluate the economic effects of alternative management...
Channelling information flows from observation to decision; or how to increase certainty
NASA Astrophysics Data System (ADS)
Weijs, S. V.
2015-12-01
To make adequate decisions in an uncertain world, information needs to reach the decision problem, to enable overseeing the full consequences of each possible decision. On its way from the physical world to a decision problem, information is transferred through the physical processes that influence the sensor, then through processes that happen in the sensor, and through wires or electromagnetic waves. For the last decade, most information has become digitized at some point. From the moment of digitization, information can in principle be transferred losslessly. Information about the physical world is often also stored, sometimes in compressed form, such as physical laws, concepts, or models of specific hydrological systems. It is important to note, however, that all information about a physical system eventually has to originate from observation (although inevitably coloured by some prior assumptions). This colouring makes the compression lossy, but is effectively the only way to make use of similarities in time and space that enable predictions while measuring only a few macro-states of a complex hydrological system. Adding physical process knowledge to a hydrological model can thus be seen as a convenient way to transfer information from observations from a different time or place, to make predictions about another situation, assuming the same dynamics are at work. The key challenge to achieve more certainty in hydrological prediction can therefore be formulated as a challenge to tap and channel information flows from the environment. For tapping more information flows, new measurement techniques, large scale campaigns, historical data sets, and large sample hydrology and regionalization efforts can bring progress. For channelling the information flows with minimum loss, model calibration and model formulation techniques should be critically investigated. Some experience from research in a Swiss high alpine catchment is used as an illustration.
Quantifying the Information Habits of High School Students Engaged in Engineering Design
ERIC Educational Resources Information Center
Mentzer, Nathan; Fosmire, Michael J.
2015-01-01
This study measured the information gathering behaviors of high school students who had taken engineering design courses as they solved a design problem. The authors investigated what types of information students accessed, its quality, when it was accessed during the students' process, and if it impacted their thinking during the activity.…
ERIC Educational Resources Information Center
Dunlop, David Livingston
The purpose of this study was to use an information theoretic memory model to quantitatively investigate classification sorting and recall behaviors of various groups of students. The model provided theorems for the determination of information theoretic measures from which inferences concerning mental processing were made. The basic procedure…
Teaching and Learning Information Synthesis: An Intervention and Rubric Based Assessment
ERIC Educational Resources Information Center
Lundstrom, Kacy; Diekema, Anne R.; Leary, Heather; Haderlie, Sheri; Holliday, Wendy
2015-01-01
The purpose of this research was to determine how information synthesis skills can be taught effectively, and to discover how the level of synthesis in student writing can be effectively measured. The intervention was an information synthesis lesson that broke down the synthesis process into sequenced tasks. Researchers created a rubric which they…
Rule-Based Expert Systems in the Command Estimate: An Operational Perspective
1990-06-01
control measures. 5. Prepare COA statement(s) and sketch(es). The key inputs for developing courses of action are the DFD process of IPB, data stores...mission, or a change of information provides new direction to this process for that particular operation." Formal scientific analysis of the command...30 5. Delivery of outside news. This feature contributes to the commander's insatiable need for current information. Artificial intelligence and rule
Lognormal Infection Times of Online Information Spread
Doerr, Christian; Blenn, Norbert; Van Mieghem, Piet
2013-01-01
The infection times of individuals in online information spread such as the inter-arrival time of Twitter messages or the propagation time of news stories on a social media site can be explained through a convolution of lognormally distributed observation and reaction times of the individual participants. Experimental measurements support the lognormal shape of the individual contributing processes, and have resemblance to previously reported lognormal distributions of human behavior and contagious processes. PMID:23700473
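The convolution model is straightforward to reproduce: draw independent lognormal observation and reaction delays and add them. The parameters below are arbitrary illustrative values, not the distributions fitted to the Twitter data:

```python
import numpy as np

rng = np.random.default_rng(42)

# infection time = lognormal observation delay + lognormal reaction delay
observation = rng.lognormal(mean=1.0, sigma=0.5, size=100_000)
reaction = rng.lognormal(mean=0.5, sigma=0.8, size=100_000)
infection = observation + reaction
```

The sum of two lognormals is not exactly lognormal, but it remains positive and right-skewed (mean above median), matching the heavy-tailed shape reported for the measured inter-arrival times.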
Research on distributed optical fiber sensing data processing method based on LabVIEW
NASA Astrophysics Data System (ADS)
Li, Zhonghu; Yang, Meifang; Wang, Luling; Wang, Jinming; Yan, Junhong; Zuo, Jing
2018-01-01
The pipeline leak detection and leak location problem has received extensive attention in industry. In this paper, a distributed optical fiber sensing system is designed for a heat supply pipeline. The data processing method for distributed optical fiber sensing based on LabVIEW is studied emphatically. The hardware system includes a laser, sensing optical fiber, wavelength division multiplexer, photoelectric detector, data acquisition card and computer. The software system is developed using LabVIEW and adopts a wavelet denoising method to process the temperature information, which improves the SNR. By extracting characteristic values of the fiber temperature information, the system realizes temperature measurement, leak location, and measurement signal storage and query. Compared with the traditional negative pressure wave or acoustic signal methods, the distributed optical fiber temperature measuring system can measure several temperatures in one measurement and locate the leak point accurately. It has broad application prospects.
[Measurement and performance analysis of functional neural network].
Li, Shan; Liu, Xinyu; Chen, Yan; Wan, Hong
2018-04-01
The measurement of networks is one of the important research problems in resolving the information processing mechanisms of neuronal populations using complex network theory. For the quantitative measurement of functional neural networks, the relation between the measurement indexes, i.e. the clustering coefficient, the global efficiency, the characteristic path length and the transitivity, and the network topology was analyzed. Then, a spike-based functional neural network was established, and the simulation results showed that the measured network could represent the original neural connections among neurons. On the basis of this work, the coding of the functional neural network in the nidopallium caudolaterale (NCL) related to pigeons' motion behaviors was studied. We found that the NCL functional neural network effectively encoded the motion behaviors of the pigeon, and there were significant differences in the four indexes among left-turning, forward and right-turning behaviors. Overall, the establishment method of the spike-based functional neural network is feasible, and it is an effective tool for analyzing the brain's information processing mechanisms.
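Two of the four indexes can be computed directly from an adjacency structure. The sketch below is a generic implementation for small undirected graphs (not the authors' code), with the graph given as {node: set(neighbors)}:

```python
from itertools import combinations
from collections import deque

def clustering_coefficient(adj):
    """Average local clustering coefficient: for each node, the fraction
    of its neighbor pairs that are themselves connected."""
    coeffs = []
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        links = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
        coeffs.append(2 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)

def characteristic_path_length(adj):
    """Mean shortest-path length over all connected node pairs (BFS from
    every source; assumes a connected graph)."""
    lengths = []
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)
        lengths += [d for v, d in dist.items() if v != src]
    return sum(lengths) / len(lengths)

# a triangle plus one pendant node
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
```

Global efficiency follows the same BFS pattern with the mean of 1/d instead of d, which is why the four indexes are often computed together from one pass of shortest-path searches.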
Addissie, Adamu; Abay, Serebe; Feleke, Yeweyenhareg; Newport, Melanie; Farsides, Bobbie; Davey, Gail
2016-07-12
Maximizing comprehension is a major challenge for informed consent processes in low-literacy and resource-limited settings. Application of rapid qualitative assessments to improve the informed consent process is increasingly considered useful. This study assessed the effects of Rapid Ethical Assessment (REA) on comprehension, retention and quality of the informed consent process. A cluster randomized trial was conducted among participants of an HPV sero-prevalence study in two districts of Northern Ethiopia, in 2013. A total of 300 study participants, 150 in the intervention and 150 in the control group, were included in the study. For the intervention group, the informed consent process was designed with further revisions based on REA findings. Informed consent comprehension levels and quality of the consent process were measured using the Modular Informed Consent Comprehension Assessment (MICCA) and Quality of Informed Consent (QuIC) process assessment tools, respectively. Study recruitment rates were 88.7 % and 80.7 % (p = 0.05), while study retention rates were 85.7 % and 70.3 % (p < 0.005) for the intervention and control groups, respectively. Overall, the mean informed consent comprehension scores for the intervention and control groups were 73.1 % and 45.2 %, respectively, with a mean difference in comprehension score of 27.9 % (95 % CI 24.0 %-33.4 %; p < 0.001). Mean scores for quality of informed consent for the intervention and control groups were 89.1 % and 78.5 %, respectively, with a mean difference of 10.5 % (95 % CI 6.8 %-14.2 %; p < 0.001). Levels of informed consent comprehension, quality of the consent process, and study recruitment and retention rates were significantly improved in the intervention group. We recommend REA as a potential modality to improve informed consent comprehension and the quality of the informed consent process in low resource settings.
Patel, Harshali K; Bapat, Shweta S; Bhansali, Archita H; Sansgiry, Sujit S
2018-01-01
The objective of this study was to develop one-page prescription drug information leaflets (PILs) and assess their impact on information processing variables across 2 levels of patient involvement. One-page PILs were developed using cognitive principles to lower mental effort and improve comprehension. An experimental, 3 × 2 repeated measures study was conducted to determine the impact of cognitive effort, manipulated using leaflet type, on comprehension across 2 levels (high/low) of patient involvement. Adults (≥18 years) in a university setting in Houston were recruited for the study. Each participant was exposed to 3 different types of prescription drug information leaflets (the current practice, a preexisting one-page text-only leaflet, and the one-page PILs) for 3 drugs (Celebrex, Ventolin HFA, Prezista) for a given involvement scenario. A prevalidated survey instrument was used to measure product knowledge, attitude toward the leaflet, and intention to read. Multivariate analysis of variance indicated significant positive effects of cognitive effort, involvement, and their interaction across all measured variables. Mean scores for product knowledge, attitude toward the leaflet, and intention to read were highest for PILs (P < .001), indicating that PILs exerted the lowest cognitive effort. Univariate and post hoc analyses indicate that product knowledge significantly increases with high involvement. Patients reading PILs have higher comprehension compared with the current practice and text-only prototype leaflets evaluated. Higher levels of involvement further improve participant knowledge about the drug, increase their intention to read the leaflet, and change their attitude toward the leaflet. Implementation of PILs would improve information processing for consumers by reducing their cognitive effort.
The entropic cost of quantum generalized measurements
NASA Astrophysics Data System (ADS)
Mancino, Luca; Sbroscia, Marco; Roccia, Emanuele; Gianani, Ilaria; Somma, Fabrizia; Mataloni, Paolo; Paternostro, Mauro; Barbieri, Marco
2018-03-01
Landauer's principle introduces a symmetry between computational and physical processes: erasure of information, a logically irreversible operation, must be underlain by an irreversible transformation dissipating energy. Monitoring of micro- and nano-systems must enter into the energetic balance of their control; hence, finding the ultimate limits is instrumental to the development of future thermal machines operating at the quantum level. We report on the experimental investigation of a lower bound to the irreversible entropy associated with generalized quantum measurements on a quantum bit. We adopted a quantum photonics gate to implement a device interpolating from the weakly disturbing to the fully invasive and maximally informative regime. Our experiment prompted us to introduce a bound taking into account both the classical result of the measurement and the outgoing quantum state; unlike previous investigations, our entropic bound is based uniquely on measurable quantities. Our results highlight what insights the information-theoretic approach provides on building blocks of quantum information processors.
Task–Technology Fit of Video Telehealth for Nurses in an Outpatient Clinic Setting
Finkelstein, Stanley M.
2014-01-01
Background: Incorporating telehealth into outpatient care delivery supports management of consumer health between clinic visits. Task–technology fit is a framework for understanding how technology helps and/or hinders a person during work processes. Evaluating the task–technology fit of video telehealth for personnel working in a pediatric outpatient clinic and providing care between clinic visits ensures the information provided matches the information needed to support work processes. Materials and Methods: The workflow of advanced practice registered nurse (APRN) care coordination provided via telephone and video telehealth was described and measured using a mixed-methods workflow analysis protocol that incorporated cognitive ethnography and time–motion study. Qualitative and quantitative results were merged and analyzed within the task–technology fit framework to determine the workflow fit of video telehealth for APRN care coordination. Results: Incorporating video telehealth into APRN care coordination workflow provided visual information unavailable during telephone interactions. Despite additional tasks and interactions needed to obtain the visual information, APRN workflow efficiency, as measured by time, was not significantly changed. Analyzed within the task–technology fit framework, the increased visual information afforded by video telehealth supported the assessment and diagnostic information needs of the APRN. Conclusions: Telehealth must provide the right information to the right clinician at the right time. Evaluating task–technology fit using a mixed-methods protocol ensured rigorous analysis of fit within work processes and identified workflows that benefit most from the technology. PMID:24841219
Using waveform information in nonlinear data assimilation
NASA Astrophysics Data System (ADS)
Rey, Daniel; Eldridge, Michael; Morone, Uriel; Abarbanel, Henry D. I.; Parlitz, Ulrich; Schumann-Bischoff, Jan
2014-12-01
Information in measurements of a nonlinear dynamical system can be transferred to a quantitative model of the observed system to establish its fixed parameters and unobserved state variables. After this learning period is complete, one may predict the model response to new forces and, when successful, these predictions will match additional observations. This adjustment process encounters problems when the model is nonlinear and chaotic because dynamical instability impedes the transfer of information from the data to the model when the number of measurements at each observation time is insufficient. We discuss the use of information in the waveform of the data, realized through a time delayed collection of measurements, to provide additional stability and accuracy to this search procedure. Several examples are explored, including a few familiar nonlinear dynamical systems and small networks of Colpitts oscillators.
Sensing roller for in-process thickness measurement
Novak, James L.
1996-01-01
An apparatus and method for processing materials by sensing roller, in which the sensing roller has a plurality of conductive rings (electrodes) separated by rings of dielectric material. Sensing capacitances or impedances between the electrodes provides information on thicknesses of the materials being processed, location of wires therein, and other like characteristics of the materials.
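The thickness inference rests on the parallel-plate capacitor relation C = ε0·εr·A/d, so a measured inter-electrode capacitance can be inverted to a material thickness. A minimal sketch under hypothetical geometry and permittivity values (the patent abstract does not specify these numbers):

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def thickness_from_capacitance(c_farads, area_m2, eps_r):
    # Invert the parallel-plate model C = eps0 * eps_r * A / d for d.
    return EPS0 * eps_r * area_m2 / c_farads

# Hypothetical numbers: 1 cm^2 electrode area, eps_r ~ 3 dielectric,
# and the capacitance a 100-micrometre sheet would present.
c = EPS0 * 3.0 * 1e-4 / 100e-6
d = thickness_from_capacitance(c, area_m2=1e-4, eps_r=3.0)
print(d)  # recovers the 100-micrometre thickness
```

A real sensing roller would calibrate against fringing fields and wire inclusions rather than use the ideal plate model directly.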
Cultural Differences in the Development of Processing Speed
ERIC Educational Resources Information Center
Kail, Robert V.; McBride-Chang, Catherine; Ferrer, Emilio; Cho, Jeung-Ryeul; Shu, Hua
2013-01-01
The aim of the present work was to examine cultural differences in the development of speed of information processing. Four samples of US children ("N" = 509) and four samples of East Asian children ("N" = 661) completed psychometric measures of processing speed on two occasions. Analyses of the longitudinal data indicated…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masanet, Eric; Worrell, Ernst
2008-01-01
The U.S. fruit and vegetable processing industry--defined in this Energy Guide as facilities engaged in the canning, freezing, and drying or dehydrating of fruits and vegetables--consumes over $800 million worth of purchased fuels and electricity per year. Energy efficiency improvement is an important way to reduce these costs and to increase predictable earnings, especially in times of high energy price volatility. There are a variety of opportunities available at individual plants in the U.S. fruit and vegetable processing industry to reduce energy consumption in a cost-effective manner. This Energy Guide discusses energy efficiency practices and energy-efficient technologies that can be implemented at the component, process, facility, and organizational levels. A discussion of the trends, structure, and energy consumption characteristics of the U.S. fruit and vegetable processing industry is provided along with a description of the major process technologies used within the industry. Next, a wide variety of energy efficiency measures applicable to fruit and vegetable processing plants are described. Many measure descriptions include expected savings in energy and energy-related costs, based on case study data from real-world applications in fruit and vegetable processing facilities and related industries worldwide. Typical measure payback periods and references to further information in the technical literature are also provided, when available. Given the importance of water in fruit and vegetable processing, a summary of basic, proven measures for improving plant-level water efficiency is also provided. The information in this Energy Guide is intended to help energy and plant managers in the U.S. fruit and vegetable processing industry reduce energy and water consumption in a cost-effective manner while maintaining the quality of products manufactured.
Further research on the economics of all measures--as well as on their applicability to different production practices--is needed to assess their cost effectiveness at individual plants.
Self-referenced processing, neurodevelopment and joint attention in autism.
Mundy, Peter; Gwaltney, Mary; Henderson, Heather
2010-09-01
This article describes a parallel and distributed processing model (PDPM) of joint attention, self-referenced processing and autism. According to this model, autism involves early impairments in the capacity for rapid, integrated processing of self-referenced (proprioceptive and interoceptive) and other-referenced (exteroceptive) information. Measures of joint attention have proven useful in research on autism because they are sensitive to the early development of the 'parallel' and integrated processing of self- and other-referenced stimuli. Moreover, joint attention behaviors are a consequence, but also an organizer of the functional development of a distal distributed cortical system involving anterior networks including the prefrontal and insula cortices, as well as posterior neural networks including the temporal and parietal cortices. Measures of joint attention provide early behavioral indicators of atypical development in this parallel and distributed processing system in autism. In addition it is proposed that an early, chronic disturbance in the capacity for integrating self- and other-referenced information may have cascading effects on the development of self awareness in autism. The assumptions, empirical support and future research implications of this model are discussed.
Frederickson, Norah; Jones, Alice P.; Warren, Laura; Deakes, Tara; Allen, Geoff
2013-01-01
An initial evaluation of the utility of designing an intervention to address neuroscience-based subtyping of children who have conduct problems was undertaken in this pilot study. Drawing on the literature on callous–unemotional traits, a novel intervention programme, ‘Let's Get Smart’, was implemented in a school for children with social emotional and behavioural difficulties. A mixed-methods design was used to investigate the perspectives of staff participant-observers in the change process, alongside standardised scores on measures of pupil performance and behaviour. Both qualitative and quantitative results showed reductions in externalising behaviour and improvements in measures of hypothesised underlying cognitive and affective processes. While externalising behaviour improved across subtypes, associated changes in underlying processes differed by subtype, supporting the potential value of neuroscience-informed contributions to intervention planning. PMID:26635493
O'Hare, Aminda J; Atchley, Ruth Ann; Young, Keith M
2017-11-16
Two dominant theories on lateralized processing of emotional information exist in the literature. One theory posits that unpleasant emotions are processed by right frontal regions, while pleasant emotions are processed by left frontal regions. The other theory posits that the right hemisphere is more specialized for the processing of emotional information overall, particularly in posterior regions. Assessing the different roles of the cerebral hemispheres in processing emotional information can be difficult without the use of neuroimaging methodologies, which are not accessible or affordable to all scientists. Divided visual field presentation of stimuli can allow for the investigation of lateralized processing of information without the use of neuroimaging technology. This study compared central versus divided visual field presentations of emotional images to assess differences in motivated attention between the two hemispheres. The late positive potential (LPP) was recorded using electroencephalography (EEG) and event-related potentials (ERPs) methodologies to assess motivated attention. Future work will pair this paradigm with a more active behavioral task to explore the behavioral impacts on the attentional differences found.
Measurement of entanglement entropy in the two-dimensional Potts model using wavelet analysis.
Tomita, Yusuke
2018-05-01
A method is introduced to measure the entanglement entropy using wavelet analysis. Using this method, the two-dimensional Haar wavelet transform of a configuration of Fortuin-Kasteleyn (FK) clusters is performed. The configuration represents a direct snapshot of spin-spin correlations, since spin degrees of freedom are traced out in the FK representation. A snapshot of FK clusters loses image information at each coarse-graining step of the wavelet transform. It is shown that this loss of image information measures the entanglement entropy in the Potts model.
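One coarse-graining step of this kind can be sketched as a single level of the orthonormal 2D Haar transform: the approximation sub-band is the coarse-grained snapshot, and the energy in the discarded detail sub-bands quantifies the image information lost at that step. A schematic illustration only (not the author's code; the toy snapshot is random rather than a sampled FK configuration):

```python
import numpy as np

def haar2d_level(img):
    # One level of the orthonormal 2D Haar transform: each 2x2 block is
    # mapped to one approximation (LL) and three detail coefficients.
    a = img[0::2, 0::2]
    b = img[0::2, 1::2]
    c = img[1::2, 0::2]
    d = img[1::2, 1::2]
    LL = (a + b + c + d) / 2.0   # coarse-grained snapshot
    LH = (a - b + c - d) / 2.0   # horizontal detail
    HL = (a + b - c - d) / 2.0   # vertical detail
    HH = (a - b - c + d) / 2.0   # diagonal detail
    return LL, (LH, HL, HH)

# A toy binary "snapshot" standing in for an FK cluster configuration.
rng = np.random.default_rng(0)
snapshot = rng.integers(0, 2, size=(8, 8)).astype(float)

LL, details = haar2d_level(snapshot)
# Energy in the discarded detail sub-bands = information lost at this step.
lost = sum(float(np.sum(D**2)) for D in details)
```

Because the transform is orthonormal, the snapshot's total energy splits exactly between the coarse sub-band and the discarded details, which is what makes the loss a well-defined measure.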
Blackstock, Sheila; Harlos, Karen; Macleod, Martha L P; Hardy, Cindy L
2015-11-01
To examine the impact of organisational factors on bullying among peers (i.e. horizontal) and its effect on turnover intentions among Canadian registered nurses (RNs). Bullying among nurses is an international problem. Few studies have examined factors specific to nursing work environments that may increase exposure to bullying. An Australian model of nurse bullying was tested among Canadian registered nurse coworkers using a web-based survey (n = 103). Three factors - misuse of organisational processes/procedures, organisational tolerance and reward of bullying, and informal organisational alliances - were examined as predictors of horizontal bullying, which in turn was examined as a predictor of turnover intentions. The construct validity of model measures was explored. Informal organisational alliances and misuse of organisational processes/procedures predicted increased horizontal bullying that, in turn, predicted increased turnover intentions. Construct validity of model measures was supported. Negative informal alliances and misuse of organisational processes are antecedents to bullying, which adversely affects employment relationship stability. The results suggest that reforming flawed organisational processes that contribute to registered nurses' bullying experiences may help to reduce chronically high turnover. Nurse leaders and managers need to create workplace processes that foster positive networks, fairness and respect through more transparent and accountable practices. © 2014 John Wiley & Sons Ltd.
Signal processing methods for MFE plasma diagnostics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Candy, J.V.; Casper, T.; Kane, R.
1985-02-01
The application of various signal processing methods to extract energy storage information from plasma diamagnetism sensors during physics experiments on the Tandem Mirror Experiment-Upgrade (TMX-U) is discussed. We show how these processing techniques can be used to decrease the uncertainty in the corresponding sensor measurements. The algorithms suggested are implemented using SIG, an interactive signal processing package developed at LLNL.
Utility-based early modulation of processing distracting stimulus information.
Wendt, Mike; Luna-Rodriguez, Aquiles; Jacobsen, Thomas
2014-12-10
Humans are selective information processors who efficiently prevent goal-inappropriate stimulus information to gain control over their actions. Nonetheless, stimuli, which are both unnecessary for solving a current task and liable to cue an incorrect response (i.e., "distractors"), frequently modulate task performance, even when consistently paired with a physical feature that makes them easily discernible from target stimuli. Current models of cognitive control assume adjustment of the processing of distractor information based on the overall distractor utility (e.g., predictive value regarding the appropriate response, likelihood to elicit conflict with target processing). Although studies on distractor interference have supported the notion of utility-based processing adjustment, previous evidence is inconclusive regarding the specificity of this adjustment for distractor information and the stage(s) of processing affected. To assess the processing of distractors during sensory-perceptual phases we applied EEG recording in a stimulus identification task, involving successive distractor-target presentation, and manipulated the overall distractor utility. Behavioral measures replicated previously found utility modulations of distractor interference. Crucially, distractor-evoked visual potentials (i.e., posterior N1) were more pronounced in high-utility than low-utility conditions. This effect generalized to distractors unrelated to the utility manipulation, providing evidence for item-unspecific adjustment of early distractor processing to the experienced utility of distractor information. Copyright © 2014 the authors.
NASA Astrophysics Data System (ADS)
Okawa, Tsutomu; Kaminishi, Tsukasa; Kojima, Yoshiyuki; Hirabayashi, Syuichi; Koizumi, Hisao
Business process modeling (BPM) is gaining attention as a means of analyzing and improving business processes. BPM analyzes the current business process as an AS-IS model and solves problems to improve the current business; moreover, it aims to create a business process that produces value as a TO-BE model. However, techniques that seamlessly connect the business process improvements obtained by BPM to the implementation of the information system are rarely reported. If the business model obtained by BPM is converted into UML, and the implementation is carried out with UML techniques, we can expect improved efficiency in information system implementation. In this paper, we describe a system development method that converts the process model obtained by BPM into UML, and we evaluate the method by modeling a prototype of a parts procurement system. In the evaluation, we compare it with the case where the system is implemented by conventional UML techniques without going via BPM.
DOT National Transportation Integrated Search
1995-10-01
The primary objective of this study is to provide information relative to the development of a set of performance measures for intermodal freight transportation. To accomplish this objective, data was collected, processed, and analyzed on the basis o...
Decision Making Processes and Outcomes
Hicks Patrick, Julie; Steele, Jenessa C.; Spencer, S. Melinda
2013-01-01
The primary aim of this study was to examine the contributions of individual characteristics and strategic processing to the prediction of decision quality. Data were provided by 176 adults, ages 18 to 93 years, who completed computerized decision-making vignettes and a battery of demographic and cognitive measures. We examined the contributions of age, domain-specific experience, working memory, and three measures of strategic information search to the prediction of solution quality using a 4-step hierarchical linear regression analysis. Working memory and two measures of strategic processing uniquely contributed to the variance explained. Results are discussed in terms of potential advances to both theory and intervention efforts. PMID:24282638
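The 4-step hierarchical regression logic (each step's unique contribution is the increase in R² when its predictor block enters) can be sketched with ordinary least squares. The predictor names follow the abstract, but the data here are synthetic and the effect sizes are illustrative assumptions:

```python
import numpy as np

def r_squared(X, y):
    # OLS with intercept; returns the coefficient of determination R^2.
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(1)
n = 176  # sample size borrowed from the abstract; the data are synthetic
age = rng.normal(50, 20, n)
experience = rng.normal(0, 1, n)
working_memory = rng.normal(0, 1, n)
search_strategy = rng.normal(0, 1, n)
quality = 0.3 * working_memory + 0.4 * search_strategy + rng.normal(0, 1, n)

# Enter predictor blocks one step at a time; each step's unique
# contribution is the increase in R^2 over the previous step.
blocks = [age, experience, working_memory, search_strategy]
r2_steps = []
X = np.empty((n, 0))
for block in blocks:
    X = np.column_stack([X, block])
    r2_steps.append(r_squared(X, quality))
```

Because the models are nested, R² can only grow at each step; the interesting quantity is how much it grows when working memory and strategy enter.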
Investigating the impact of spatial priors on the performance of model-based IVUS elastography
Richards, M S; Doyley, M M
2012-01-01
This paper describes methods that provide prerequisite information for computing circumferential stress in modulus elastograms recovered from vascular tissue—information that could help cardiologists detect life-threatening plaques and predict their propensity to rupture. The modulus recovery process is an ill-posed problem; therefore, additional information is needed to provide useful elastograms. In this work, prior geometrical information was used to impose hard or soft constraints on the reconstruction process. We conducted simulation and phantom studies to evaluate and compare modulus elastograms computed with soft and hard constraints versus those computed without any prior information. The results revealed that (1) the contrast-to-noise ratio of modulus elastograms achieved using the soft prior and hard prior reconstruction methods exceeded those computed without any prior information; (2) the soft prior and hard prior reconstruction methods could tolerate up to 8 % measurement noise; and (3) the performance of soft and hard prior modulus elastograms degraded when incomplete spatial priors were employed. This work demonstrates that including spatial priors in the reconstruction process should improve the performance of model-based elastography, and the soft prior approach should enhance the robustness of the reconstruction process to errors in the geometrical information. PMID:22037648
Kao, C Y; Aranda, S; Krishnasamy, M; Hamilton, B
2017-03-01
Patient misunderstanding of cancer clinical trial participation is identified as a critical issue and researchers have developed and tested a variety of interventions to improve patient understanding. This systematic review identified nine papers published between 2000 and 2013, to evaluate the effects of interventions to improve patient understanding of cancer clinical trial participation. Types of interventions included audio-visual information, revised written information and a communication training workshop. Interventions were conducted alone or in combination with other forms of information provision. The nine papers, all with methodological limitations, reported mixed effects on a small range of outcomes regarding improved patient understanding of cancer clinical trial participation. The methodological limitations included: (1) the intervention development process was poorly described; (2) only a small element of the communication process was addressed; (3) studies lacked evidence regarding what information is essential and critical to enable informed consent; (4) studies lacked reliable and valid outcome measures to show that patients are sufficiently informed to provide consent; and (5) the intervention development process lacked a theoretical framework. Future research needs to consider these factors when developing interventions to improve communication and patient understanding during the informed consent process. © 2016 John Wiley & Sons Ltd.
Context-based virtual metrology
NASA Astrophysics Data System (ADS)
Ebersbach, Peter; Urbanowicz, Adam M.; Likhachev, Dmitriy; Hartig, Carsten; Shifrin, Michael
2018-03-01
Hybrid and data feed forward methodologies are well established for advanced optical process control solutions in high-volume semiconductor manufacturing. Appropriate information from previous measurements, transferred into advanced optical model(s) at following step(s), provides enhanced accuracy and exactness of the measured topographic (thicknesses, critical dimensions, etc.) and material parameters. In some cases, hybrid or feed-forward data are missing or invalid for dies or for a whole wafer. We focus on approaches of virtual metrology to re-create hybrid or feed-forward data inputs in high-volume manufacturing. We discuss missing data input reconstruction based on various interpolation and extrapolation schemes that uses information about the wafer's process history. Moreover, we demonstrate a data reconstruction approach based on machine learning techniques utilizing the optical model and measured spectra. And finally, we investigate metrics that allow one to assess the error margin of virtual data inputs.
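The simplest form of the reconstruction described, re-creating a missing feed-forward input from the wafer's process history, is interpolation over neighboring wafers in process order. A minimal sketch with hypothetical thickness values (the real schemes are more elaborate):

```python
import numpy as np

# Film thickness (nm) by wafer slot in process order; NaN marks wafers
# whose feed-forward measurement is missing or invalid (values hypothetical).
slots = np.arange(8, dtype=float)
thickness = np.array([101.2, 101.0, np.nan, 100.6, 100.5, np.nan, 100.1, 100.0])

missing = np.isnan(thickness)
# Re-create the missing inputs by linear interpolation over process history.
thickness[missing] = np.interp(slots[missing], slots[~missing], thickness[~missing])
print(thickness)  # slots 2 and 5 filled from their neighbors
```

Extrapolation at the lot edges and the machine-learning reconstruction from measured spectra would replace `np.interp` with trend models, but the virtual-input idea is the same.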
NASA Astrophysics Data System (ADS)
Bu, Xianye; Dong, Hongli; Han, Fei; Li, Gongfa
2018-07-01
This paper is concerned with the distributed filtering problem for a class of time-varying systems subject to deception attacks and event-triggering protocols. Owing to bandwidth limitations, an event-triggered communication strategy is adopted to alleviate the data transmission pressure in the algorithm implementation process. The partial-nodes-based filtering problem is considered, where only a subset of nodes can measure the information of the plant. Meanwhile, the measurement information may suffer deception attacks in the transmission process. Sufficient conditions are established such that the error dynamics satisfies the prescribed average H∞ performance constraints. The parameters of the designed filters can be calculated by solving a series of recursive linear matrix inequalities. A simulation example is presented to demonstrate the effectiveness of the proposed filtering method.
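An event-triggered communication strategy of the kind adopted can be sketched in its simplest scalar form: a node transmits only when its current measurement deviates from the last transmitted value by more than a threshold. The quadratic trigger and threshold below are illustrative assumptions, not the paper's exact protocol:

```python
import numpy as np

def event_triggered_stream(measurements, delta=0.5):
    # Transmit a sample only when it deviates from the last transmitted
    # value by more than the threshold: (y_k - y_sent)^2 > delta.
    sent, last = [], None
    for k, y in enumerate(measurements):
        if last is None or (y - last) ** 2 > delta:
            sent.append((k, float(y)))
            last = y
    return sent

rng = np.random.default_rng(2)
signal = np.cumsum(rng.normal(0.0, 0.3, 50))  # slowly drifting measurement
sent = event_triggered_stream(signal)
print(f"transmitted {len(sent)} of {len(signal)} samples")
```

The filter at the receiving end then works with the last transmitted value between events, which is what creates the bandwidth saving and the extra error term the analysis must bound.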
Dynamic information processing states revealed through neurocognitive models of object semantics
Clarke, Alex
2015-01-01
Recognising objects relies on highly dynamic, interactive brain networks to process multiple aspects of object information. To fully understand how different forms of information about objects are represented and processed in the brain requires a neurocognitive account of visual object recognition that combines a detailed cognitive model of semantic knowledge with a neurobiological model of visual object processing. Here we ask how specific cognitive factors are instantiated in our mental processes and how they dynamically evolve over time. We suggest that coarse semantic information, based on generic shared semantic knowledge, is rapidly extracted from visual inputs and is sufficient to drive rapid category decisions. Subsequent recurrent neural activity between the anterior temporal lobe and posterior fusiform supports the formation of object-specific semantic representations – a conjunctive process primarily driven by the perirhinal cortex. These object-specific representations require the integration of shared and distinguishing object properties and support the unique recognition of objects. We conclude that a valuable way of understanding the cognitive activity of the brain is through testing the relationship between specific cognitive measures and dynamic neural activity. This kind of approach allows us to move towards uncovering the information processing states of the brain and how they evolve over time. PMID:25745632
Kuniecki, Michał; Wołoszyn, Kinga; Domagalik, Aleksandra; Pilarczyk, Joanna
2018-05-01
Processing of emotional visual information engages cognitive functions and induces arousal. We aimed to examine the modulatory role of emotional valence on brain activations linked to the processing of visual information and those linked to arousal. Participants were scanned and their pupil size was measured while viewing negative and neutral images. The visual noise was added to the images in various proportions to parametrically manipulate the amount of visual information. Pupil size was used as an index of physiological arousal. We show that arousal induced by the negative images, as compared to the neutral ones, is primarily related to greater amygdala activity while increasing visibility of negative content to enhanced activity in the lateral occipital complex (LOC). We argue that more intense visual processing of negative scenes can occur irrespective of the level of arousal. It may suggest that higher areas of the visual stream are fine-tuned to process emotionally relevant objects. Both arousal and processing of emotional visual information modulated activity within the ventromedial prefrontal cortex (vmPFC). Overlapping activations within the vmPFC may reflect the integration of these aspects of emotional processing. Additionally, we show that emotionally-evoked pupil dilations are related to activations in the amygdala, vmPFC, and LOC.
Thomas, Michael L; Green, Michael F; Hellemann, Gerhard; Sugar, Catherine A; Tarasenko, Melissa; Calkins, Monica E; Greenwood, Tiffany A; Gur, Raquel E; Gur, Ruben C; Lazzeroni, Laura C; Nuechterlein, Keith H; Radant, Allen D; Seidman, Larry J; Shiluk, Alexandra L; Siever, Larry J; Silverman, Jeremy M; Sprock, Joyce; Stone, William S; Swerdlow, Neal R; Tsuang, Debby W; Tsuang, Ming T; Turetsky, Bruce I; Braff, David L; Light, Gregory A
2017-01-01
Neurophysiologic measures of early auditory information processing (EAP) are used as endophenotypes in genomic studies and biomarkers in clinical intervention studies. Research in schizophrenia has established correlations among measures of EAP, cognition, clinical symptoms, and functional outcome. Clarifying these associations by determining the pathways through which deficits in EAP affect functioning would suggest when and where to therapeutically intervene. To characterize the pathways from EAP to outcome and to estimate the extent to which enhancement of basic information processing might improve cognition and psychosocial functioning in schizophrenia. Cross-sectional data were analyzed using structural equation modeling to examine the associations among EAP, cognition, negative symptoms, and functional outcome. Participants were recruited from the community at 5 geographically distributed laboratories as part of the Consortium on the Genetics of Schizophrenia 2 from July 1, 2010, through January 31, 2014. This well-characterized cohort of 1415 patients with schizophrenia underwent EAP, cognitive, and thorough clinical and functional assessment. Mismatch negativity, P3a, and reorienting negativity were used to measure EAP. Cognition was measured by the Letter Number Span test and scales from the California Verbal Learning Test-Second Edition, the Wechsler Memory Scale-Third Edition, and the Penn Computerized Neurocognitive Battery. Negative symptoms were measured by the Scale for the Assessment of Negative Symptoms. Functional outcome was measured by the Role Functioning Scale. Participants included 1415 unrelated outpatients diagnosed with schizophrenia or schizoaffective disorder (mean [SD] age, 46 [11] years; 979 males [69.2%] and 619 white [43.7%]). 
Early auditory information processing had a direct effect on cognition (β = 0.37, P < .001), cognition had a direct effect on negative symptoms (β = -0.16, P < .001), and both cognition (β = 0.26, P < .001) and experiential negative symptoms (β = -0.75, P < .001) had direct effects on functional outcome. The indirect effect of EAP on functional outcome was significant as well (β = 0.14, P < .001). Overall, EAP had a fully mediated effect on functional outcome, engaging general rather than modality-specific cognition, with separate pathways that involved or bypassed negative symptoms. The data support a model in which EAP deficits lead to poor functional outcome via impaired cognition and increased negative symptoms. Results can be used to help guide mechanistically informed, personalized treatments and support the strategy of using EAP measures as surrogate end points in early-stage procognitive intervention studies.
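The reported standardized coefficients let one check the mediation arithmetic: the indirect effect of EAP on functional outcome is the sum of the path through cognition alone and the path through cognition and negative symptoms. A minimal sketch using only the coefficients quoted above:

```python
# Standardized path coefficients quoted in the abstract.
eap_to_cog = 0.37    # EAP -> cognition
cog_to_neg = -0.16   # cognition -> negative symptoms
cog_to_out = 0.26    # cognition -> functional outcome
neg_to_out = -0.75   # experiential negative symptoms -> functional outcome

# The indirect effect of EAP on outcome sums over both mediating paths.
via_cognition = eap_to_cog * cog_to_out
via_negative_symptoms = eap_to_cog * cog_to_neg * neg_to_out
indirect_effect = via_cognition + via_negative_symptoms

print(round(indirect_effect, 2))  # 0.14, matching the reported indirect effect
```

Both mediating paths are positive (the two negative signs on the symptom path cancel), which is why fully mediated EAP deficits still propagate to worse outcomes.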
Cognitive models of pilot categorization and prioritization of flight-deck information
NASA Technical Reports Server (NTRS)
Jonsson, Jon E.; Ricks, Wendell R.
1995-01-01
In the past decade, automated systems on modern commercial flight decks have increased dramatically. Pilots now regularly interact and share tasks with these systems. This interaction has led human factors research to direct more attention to the pilot's cognitive processing and mental model of the information flow occurring on the flight deck. The experiment reported herein investigated how pilots mentally represent and process information typically available during flight. Fifty-two commercial pilots participated in tasks that required them to provide similarity ratings for pairs of flight-deck information and to prioritize this information under two contextual conditions. Pilots processed the information along three cognitive dimensions. These dimensions included the flight function and the flight action that the information supported and how frequently pilots refer to the information. Pilots classified the information as aviation, navigation, communications, or systems administration information. Prioritization results indicated a high degree of consensus among pilots, while scaling results revealed two dimensions along which information is prioritized. Pilot cognitive workload for flight-deck tasks and the potential for using these findings to operationalize cognitive metrics are evaluated. Such measures may be useful additions for flight-deck human performance evaluation.
Development of Field Information Monitoring System Based on the Internet of Things
NASA Astrophysics Data System (ADS)
Cai, Ken; Liang, Xiaoying; Wang, Keqiang
With the rapid development and wide application of electronics, communication and embedded system technologies, global agriculture is changing from traditional agriculture, which raises production by increasing labor and agricultural inputs, to a new stage of modern agriculture characterized by low input, high efficiency, and real-time, accurate operation. At the same time, research and development of the Internet of Things, an information network that connects objects, perceives them fully, and reliably transmits and intelligently processes their information, allows us to obtain real-time information about almost anything. Applying the Internet of Things to online field information monitoring is an effective alternative to present wired sensor monitoring systems, which suffer from disadvantages such as high cost and the difficulty of laying lines. In this paper, a novel field information monitoring system based on the Internet of Things is proposed. It satisfies the requirements of multi-point measurement, mobility and convenience in the field information monitoring process. The overall structure of the system is given, and the key hardware and software designs are described. This work expands current field information measurement methods and strengthens the application of the Internet of Things.
Taylor, H E; Bramley, D E P
2012-11-01
The provision of written information is a component of the informed consent process for research participants. We conducted a readability analysis to test the hypothesis that the language used in patient information and consent forms in anaesthesia research in Australia and New Zealand does not meet the readability standards or expectations of the Good Clinical Practice Guidelines, the National Health and Medical Research Council in Australia and the Health Research Council of New Zealand. We calculated readability scores for 40 patient information and consent forms using the Simple Measure of Gobbledygook (SMOG) and Flesch-Kincaid formulas. The mean grade level of the patient information and consent forms was 12.9 (standard deviation 0.8, 95% confidence interval 12.6 to 13.1) by the SMOG formula and 11.9 (standard deviation 1.1, 95% confidence interval 11.6 to 12.3) by the Flesch-Kincaid formula. This exceeds the average literacy and comprehension of the general population in Australia and New Zealand. Complex language decreases readability and negatively affects the informed consent process. Care should be exercised when providing written information to research participants to ensure that language and readability are appropriate for the audience.
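Both grade-level formulas used in the study are simple closed-form functions of sentence, word, and syllable counts. A minimal Python sketch follows; the vowel-group syllable counter is a naive stand-in (published readability tools use dictionary-backed syllable counts, so exact scores will differ):

```python
import math
import re

def count_syllables(word):
    # Naive heuristic: one syllable per run of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syllables / len(words) - 15.59

def smog_grade(text):
    # SMOG is based on polysyllabic words (3+ syllables) per 30 sentences.
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    polysyllables = sum(1 for w in re.findall(r"[A-Za-z']+", text)
                        if count_syllables(w) >= 3)
    return 1.043 * math.sqrt(polysyllables * 30 / sentences) + 3.1291
```

A grade level around 12 to 13, as reported for the consent forms, corresponds to senior high school reading material.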
Gaussian random bridges and a geometric model for information equilibrium
NASA Astrophysics Data System (ADS)
Mengütürk, Levent Ali
2018-03-01
The paper introduces a class of conditioned stochastic processes that we call Gaussian random bridges (GRBs) and proves some of their properties. Due to the anticipative representation of any GRB as the sum of a random variable and a Gaussian (T, 0)-bridge, GRBs can model noisy information processes in partially observed systems. In this spirit, we propose an asset pricing model with respect to what we call information equilibrium in a market with multiple sources of information. The idea is to work on a topological manifold endowed with a metric that enables us to systematically determine an equilibrium point of a stochastic system that can be represented by multiple points on that manifold at each fixed time. In doing so, we formulate GRB-based information diversity over a Riemannian manifold and show that it is pinned to zero over the boundary determined by Dirac measures. We then define an influence factor that controls the dominance of an information source in determining the best estimate of a signal in the L2-sense. When there are two sources, this allows us to construct information equilibrium as a functional of a geodesic-valued stochastic process, which is driven by an equilibrium convergence rate representing the signal-to-noise ratio. This leads us to derive price dynamics under what can be considered as an equilibrium probability measure. We also provide a semimartingale representation of Markovian GRBs associated with Gaussian martingales and a non-anticipative representation of fractional Brownian random bridges that can incorporate degrees of information coupling in a given system via the Hurst exponent.
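The anticipative representation mentioned above, a random variable plus a Gaussian (T, 0)-bridge, can be illustrated with the simplest member of the family: a Brownian bridge carrying a signal. The sketch below is illustrative only; the signal X, horizon T, and linear revelation rate t/T are hypothetical choices, and the paper's GRB class is considerably more general:

```python
import numpy as np

rng = np.random.default_rng(0)
T, n = 1.0, 1000
t = np.linspace(0.0, T, n + 1)

# Brownian motion from cumulative Gaussian increments.
dW = rng.normal(0.0, np.sqrt(T / n), size=n)
W = np.concatenate(([0.0], np.cumsum(dW)))

# Brownian (T, 0)-bridge: starts at 0 and is pinned back to 0 at time T.
bridge = W - (t / T) * W[-1]

# Illustrative noisy information process: the signal X is fully revealed
# at time T because the bridge component vanishes there.
X = rng.normal()
xi = (t / T) * X + bridge
```

At t = T the bridge term is zero by construction, so xi(T) = X exactly; before T an observer sees the signal only through the bridge noise.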
BIOMONITORING RESEARCH WITHIN THE U.S. EPA'S OFFICE OF RESEARCH AND DEVELOPMENT
Current ORD exposure research is directed toward developing the processes, tools, and information to put biomonitoring data into perspective for the risk assessment process, to define the appropriate uses of specific biomarkers, and to integrate biomarker measurements with exposu...
Peters, Judith C; Vlamings, Petra; Kemner, Chantal
2013-05-01
Face perception in adults depends on skilled processing of interattribute distances ('configural' processing), which is disrupted for faces presented in inverted orientation (face inversion effect or FIE). Children are not proficient in configural processing, and this might relate to an underlying immaturity to use facial information in low spatial frequency (SF) ranges, which capture the coarse information needed for configural processing. We hypothesized that during adolescence a shift from use of high to low SF information takes place. Therefore, we studied the influence of SF content on neural face processing in groups of children (9-10 years), adolescents (14-15 years) and young adults (21-29 years) by measuring event-related potentials (ERPs) to upright and inverted faces which varied in SF content. Results revealed that children show a neural FIE in early processing stages (i.e. P1; generated in early visual areas), suggesting a superficial, global facial analysis. In contrast, ERPs of adults revealed an FIE at later processing stages (i.e. N170; generated in face-selective, higher visual areas). Interestingly, adolescents showed FIEs in both processing stages, suggesting a hybrid developmental stage. Furthermore, adolescents and adults showed FIEs for stimuli containing low SF information, whereas such effects were driven by both low and high SF information in children. These results indicate that face processing has a protracted maturational course into adolescence, and is dependent on changes in SF processing. During adolescence, sensitivity to configural cues is developed, which aids the fast and holistic processing that is so special for faces. © 2013 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
Information-theoretic decomposition of embodied and situated systems.
Da Rold, Federico
2018-07-01
The embodied and situated view of cognition stresses the importance of real-time and nonlinear bodily interaction with the environment for developing concepts and structuring knowledge. In this article, populations of robots controlled by an artificial neural network learn a wall-following task through artificial evolution. At the end of the evolutionary process, time series are recorded from perceptual and motor neurons of selected robots. Information-theoretic measures are estimated on pairings of variables to unveil nonlinear interactions that structure the agent-environment system. Specifically, the mutual information is utilized to quantify the degree of dependence and the transfer entropy to detect the direction of the information flow. Furthermore, the system is analyzed with the local form of such measures, thus capturing the underlying dynamics of information. Results show that different measures are interdependent and complementary in uncovering aspects of the robots' interaction with the environment, as well as characteristics of the functional neural structure. Therefore, the set of information-theoretic measures provides a decomposition of the system, capturing the intricacy of nonlinear relationships that characterize robots' behavior and neural dynamics. Copyright © 2018 Elsevier Ltd. All rights reserved.
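For discrete (binned) time series, both measures reduce to sums of plug-in entropies. A minimal sketch, assuming history length 1 for the transfer entropy and symbol sequences short enough for plug-in estimation to be reasonable:

```python
import numpy as np
from collections import Counter

def entropy(seq):
    # Plug-in Shannon entropy (bits) of a discrete sequence.
    n = len(seq)
    return -sum((c / n) * np.log2(c / n) for c in Counter(seq).values())

def mutual_information(x, y):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), measuring degree of dependence.
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

def transfer_entropy(src, dst):
    # TE(src -> dst) with history length 1:
    # I(dst_{t+1}; src_t | dst_t) = H(dst_{t+1}, dst_t) + H(dst_t, src_t)
    #                               - H(dst_t) - H(dst_{t+1}, dst_t, src_t)
    d_next, d, s = dst[1:], dst[:-1], src[:-1]
    return (entropy(list(zip(d_next, d))) + entropy(list(zip(d, s)))
            - entropy(d) - entropy(list(zip(d_next, d, s))))
```

Here transfer_entropy(src, dst) is the conditional mutual information between the destination's next symbol and the source's current one given the destination's own past, so it is positive only when the source adds predictive information, capturing the direction of information flow.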
Spike count, spike timing and temporal information in the cortex of awake, freely moving rats
Scaglione, Alessandro; Foffani, Guglielmo; Moxon, Karen A.
2014-01-01
Objective: Sensory processing of peripheral information is not stationary but is, in general, a dynamic process related to the behavioral state of the animal. Yet the link between the state of the behavior and the encoding properties of neurons is unclear. This report investigates the impact of the behavioral state on the encoding mechanisms used by cortical neurons for both detection and discrimination of somatosensory stimuli in awake, freely moving rats. Approach: Neuronal activity was recorded from the primary somatosensory cortex of five rats under two different behavioral states (quiet vs. whisking) while electrical stimulation of increasing stimulus strength was delivered to the mystacial pad. Information-theoretical measures were then used to measure the contribution of different encoding mechanisms to the information carried by neurons in response to the whisker stimulation. Main Results: We found that the behavioral state of the animal modulated the total amount of information conveyed by neurons and that the timing of individual spikes increased the information compared to the total count of spikes alone. However, the temporal information, i.e. information exclusively related to when the spikes occur, was not modulated by behavioral state. Significance: We conclude that information about somatosensory stimuli is modulated by the behavior of the animal and this modulation is mainly expressed in the spike count, while the temporal information is more robust to changes in behavioral state. PMID:25024291
The Social Context of Urban Classrooms: Measuring Student Psychological Climate
ERIC Educational Resources Information Center
Frazier, Stacy L.; Mehta, Tara G.; Atkins, Marc S.; Glisson, Charles; Green, Philip D.; Gibbons, Robert D.; Kim, Jong Bae; Chapman, Jason E.; Schoenwald, Sonja K.; Cua, Grace; Ogle, Robert R.
2015-01-01
Classrooms are unique and complex work settings in which teachers and students both participate in and contribute to classroom processes. This article describes the measurement phase of a study that examined the social ecology of urban classrooms. Informed by the dimensions and items of an established measure of organizational climate, we designed…
Approaches to the Measurement of the Impact of Knowledge Management Programmes.
ERIC Educational Resources Information Center
Martin, William J.
2000-01-01
Examines the problem of knowledge measurement and, in reviewing some of the current alternatives, argues for the importance of metrics to the overall process of knowledge management. Discusses the nature of intellectual capital and emphasizes the significance of knowledge measurement to the information science community. (Contains 10 references.)…
Design and validation of instruments to measure knowledge.
Elliott, T E; Regal, R R; Elliott, B A; Renier, C M
2001-01-01
Measuring health care providers' learning after they have participated in educational interventions that use experimental designs requires valid, reliable, and practical instruments. A literature review was conducted. In addition, experience gained from designing and validating instruments for measuring the effect of an educational intervention informed this process. The eight main steps for designing, validating, and testing the reliability of instruments for measuring learning outcomes are presented. The key considerations and rationale for this process are discussed. Methods for critiquing and adapting existent instruments and creating new ones are offered. This study may help other investigators in developing valid, reliable, and practical instruments for measuring the outcomes of educational activities.
Calvete, Esther; Orue, Izaskun; Gamez-Guadix, Manuel; López de Arroyabe, Elena
2016-04-01
The aim of this study was to assess the reciprocal associations between social information processing (SIP) in dating conflicts and the perpetration of dating aggression. A first step involved the development of a measure (the Social Information Processing Questionnaire in Dating Conflicts, SIPQ-DC) to assess social information processing in scenarios of conflict with dating partners. A sample of 1,272 adolescents (653 girls, 619 boys; mean age = 14.74 years, SD = 1.21) completed measures of SIP and dating aggression perpetration at two time points spaced 1 year apart. Confirmatory factor analyses provided support for a model with five correlated factors for the SIPQ-DC, namely, hostile attribution, anger, aggressive response access, anticipation of positive consequences for oneself, and anticipation of negative consequences for partners. Although the perpetration of dating aggression at T1 was cross-sectionally associated with all the SIP components, anger was the only component that predicted the residual increase in dating aggression behavior over time. The perpetration of dating aggression predicted a worsening of cognitive-emotional processes involved in dating conflicts. Some longitudinal paths were significant only in male adolescents. In conclusion, the relationships between SIP and aggression are reciprocal. Gender differences in longitudinal paths may help explain men's higher perpetration of violence in adulthood. © The Author(s) 2014.
What constitutes evidence-based patient information? Overview of discussed criteria.
Bunge, Martina; Mühlhauser, Ingrid; Steckelberg, Anke
2010-03-01
To survey quality criteria for evidence-based patient information (EBPI) and to compile the evidence for the identified criteria. The databases PubMed, Cochrane Library, PsycINFO, PSYNDEX and Education Research Information Center (ERIC) were searched to update the pool of criteria for EBPI. A subsequent search aimed to identify evidence for each criterion. Only studies on health issues with cognitive outcome measures were included. Evidence for each criterion is presented using descriptive methods. Three systematic reviews, 24 randomized controlled studies and one non-systematic review were included. Presentation of numerical data, verbal presentation of risks and diagrams, graphics and charts are based on good evidence. Content of information and meta-information, loss- and gain-framing and patient-oriented outcome measures are based on ethical guidelines. There is a lack of studies on quality of evidence, pictures and drawings, patient narratives, cultural aspects, layout, language and development process. The results of this review allow specification of EBPI and may help to advance the discourse among related disciplines. Research gaps are highlighted. Findings outline the type and extent of content of EBPI, guide the presentation of information and describe the development process. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
Impaired Filtering of Behaviourally Irrelevant Visual Information in Dyslexia
ERIC Educational Resources Information Center
Roach, Neil W.; Hogben, John H.
2007-01-01
A recent proposal suggests that dyslexic individuals suffer from attentional deficiencies, which impair the ability to selectively process incoming visual information. To investigate this possibility, we employed a spatial cueing procedure in conjunction with a single fixation visual search task measuring thresholds for discriminating the…
Towards a Performance Data and Development System: Getting Rid of Performance Appraisal.
ERIC Educational Resources Information Center
Janz, Tom
If organizations are to measure and use worker performance information effectively, they must distinguish between two components of performance appraisal: performance data (recorded information for comparing workers) and performance development (the process of improving human assets by discouraging ineffective and reinforcing effective job…
Understanding Science: Studies of Communication and Information.
ERIC Educational Resources Information Center
Griffith, Belver C.
1989-01-01
Sets bibliometrics in the context of the sociology of science by tracing the influences of Robert Merton, Thomas Kuhn, and D. J. Price. Explores the discovery of strong empirical relationships among measured communication and information that capture important features of social process and cognitive change in science. (SR)
Measurement and modeling of unsaturated hydraulic conductivity
Perkins, Kim S.; Elango, Lakshmanan
2011-01-01
The unsaturated zone plays an extremely important hydrologic role that influences water quality and quantity, ecosystem function and health, the connection between atmospheric and terrestrial processes, nutrient cycling, soil development, and natural hazards such as flooding and landslides. Unsaturated hydraulic conductivity is one of the main properties considered to govern flow; however, it is very difficult to measure accurately. Knowledge of the highly nonlinear relationship between unsaturated hydraulic conductivity (K) and volumetric water content is required for widely used models of water flow and solute transport processes in the unsaturated zone. Measurement of the unsaturated hydraulic conductivity of sediments is costly and time consuming, so models that estimate this property from more easily measured bulk-physical properties are commonly used instead. In hydrologic studies, calculations based on property-transfer models informed by hydraulic property databases are often used in lieu of measured data from the site of interest. Reliance on database-informed predicted values with the use of neural networks has become increasingly common. Hydraulic properties predicted using databases may be adequate in some applications, but not others. This chapter will discuss, by way of examples, various techniques used to measure and model hydraulic conductivity as a function of water content, K(θ). The parameters that describe the K(θ) curve obtained by different methods are used directly in Richards’ equation-based numerical models, which have some degree of sensitivity to those parameters. This chapter will explore the complications of using laboratory-measured or estimated properties for field-scale investigations to shed light on how adequately the processes are represented. Additionally, some more recent concepts for representing unsaturated-zone flow processes will be discussed.
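One of the most widely used property-transfer parameterizations of the K curve is the van Genuchten-Mualem model, which predicts unsaturated hydraulic conductivity from effective saturation. A hedged sketch follows; the parameter values in the comment are typical textbook values for a sand-like soil, not measurements from any particular site:

```python
def vg_mualem_K(theta, theta_r, theta_s, Ks, n, L=0.5):
    """van Genuchten-Mualem unsaturated hydraulic conductivity K(theta).

    theta_r, theta_s : residual and saturated volumetric water contents
    Ks               : saturated hydraulic conductivity
    n                : van Genuchten shape parameter (m = 1 - 1/n)
    L                : pore-connectivity parameter (0.5 after Mualem)
    """
    m = 1.0 - 1.0 / n
    Se = (theta - theta_r) / (theta_s - theta_r)  # effective saturation
    Se = min(max(Se, 0.0), 1.0)
    return Ks * Se**L * (1.0 - (1.0 - Se**(1.0 / m))**m) ** 2

# Example with hypothetical sand-like parameters:
# theta_r = 0.045, theta_s = 0.43, Ks = 25.0 cm/day, n = 2.68
```

The strong nonlinearity noted in the chapter is visible here: K drops by orders of magnitude as theta decreases from theta_s toward theta_r.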
Continuous Odour Measurement with Chemosensor Systems
NASA Astrophysics Data System (ADS)
Boeker, Peter; Haas, T.; Diekmann, B.; Lammer, P. Schulze
2009-05-01
Continuous odour measurement is a challenging task for chemosensor systems. Firstly, a long-term, stable measurement mode must be guaranteed in order to preserve the validity of the time-consuming and expensive olfactometric calibration data. Secondly, a method is needed to deal with the incoming sensor data. The continuous online detection of signal patterns, the correlated gas emission and the assigned odour data is essential for continuous odour measurement. Thirdly, a severe danger of over-fitting is present in the process of odour calibration, because of the high measurement uncertainty of olfactometry. In this contribution we present a technical solution for continuous measurements comprising a hybrid QMB sensor array and electrochemical cells. A set of software tools enables efficient data processing and calibration and computes the calibration parameters. The internal software of the measurement system's microcontroller processes the calibration parameters online for the output of the desired odour information.
IMM estimator with out-of-sequence measurements
NASA Astrophysics Data System (ADS)
Bar-Shalom, Yaakov; Chen, Huimin
2004-08-01
In multisensor tracking systems that operate in a centralized information processing architecture, measurements from the same target obtained by different sensors can arrive at the processing center out of sequence. In order to avoid either a delay in the output or the need for reordering and reprocessing an entire sequence of measurements, such measurements have to be processed as out-of-sequence measurements (OOSM). Recent work developed procedures for incorporating OOSMs into a Kalman filter (KF). Since the state-of-the-art tracker for real (maneuvering) targets is the Interacting Multiple Model (IMM) estimator, this paper presents an algorithm for incorporating OOSMs into an IMM estimator. Both data association and estimation are considered. Simulation results are presented for two realistic problems using measurements from two airborne GMTI sensors. It is shown that the proposed algorithm for incorporating OOSMs into an IMM estimator yields practically the same performance as the reordering and in-sequence reprocessing of the measurements.
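The OOSM retrodiction itself is beyond a short sketch, but the IMM cycle into which it is incorporated (mixing, per-model filtering, and model-probability update) can be illustrated for scalar states with two motion models. All numerical values and the two-model structure here are hypothetical; the paper's algorithm additionally handles the backward prediction needed for late-arriving measurements:

```python
import numpy as np

def kf_update(x, P, z, F, Q, H, R):
    # One scalar Kalman predict + update; returns state, covariance, likelihood.
    xp, Pp = F * x, F * P * F + Q
    S = H * Pp * H + R
    K = Pp * H / S
    innov = z - H * xp
    lik = np.exp(-0.5 * innov**2 / S) / np.sqrt(2 * np.pi * S)
    return xp + K * innov, (1 - K * H) * Pp, lik

def imm_step(xs, Ps, mu, z, Fs, Qs, H, R, Pi):
    # One IMM cycle: mix estimates, filter per model, update model weights.
    c = Pi.T @ mu                         # predicted model probabilities
    w = (Pi * mu[:, None]) / c[None, :]   # mixing weights w[i, j]
    x0 = w.T @ np.asarray(xs)
    P0 = np.array([sum(w[i, j] * (Ps[i] + (xs[i] - x0[j])**2)
                       for i in range(len(mu))) for j in range(len(mu))])
    out = [kf_update(x0[j], P0[j], z, Fs[j], Qs[j], H, R)
           for j in range(len(mu))]
    mu_new = np.array([o[2] for o in out]) * c
    mu_new /= mu_new.sum()
    xs_new = [o[0] for o in out]
    Ps_new = [o[1] for o in out]
    xhat = float(np.dot(mu_new, xs_new))  # combined output estimate
    return xs_new, Ps_new, mu_new, xhat
```

Each call processes one in-sequence measurement z; the model probabilities mu shift toward whichever motion model best explains the innovation.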
Natural language processing and the representation of clinical data.
Sager, N; Lyman, M; Bucknall, C; Nhan, N; Tick, L J
1994-01-01
OBJECTIVE: Develop a representation of clinical observations and actions and a method of processing free-text patient documents to facilitate applications such as quality assurance. DESIGN: The Linguistic String Project (LSP) system of New York University utilizes syntactic analysis, augmented by a sublanguage grammar and an information structure that are specific to the clinical narrative, to map free-text documents into a database for querying. MEASUREMENTS: Information precision (I-P) and information recall (I-R) were measured for queries for the presence of 13 asthma-health-care quality assurance criteria in a database generated from 59 discharge letters. RESULTS: I-P, using counts of major errors only, was 95.7% for the 28-letter training set and 98.6% for the 31-letter test set. I-R, using counts of major omissions only, was 93.9% for the training set and 92.5% for the test set. PMID:7719796
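The two evaluation measures have the same form as the precision and recall used elsewhere in information extraction, computed from counts of correct items, major errors, and major omissions. A minimal sketch (the count arguments are generic, not the study's actual tallies):

```python
def information_precision(correct, errors):
    # I-P: share of database items that are correct (major errors count against).
    return correct / (correct + errors)

def information_recall(correct, omissions):
    # I-R: share of items present in the source text that reached the database.
    return correct / (correct + omissions)
```

For example, 90 correct items alongside 4 major errors gives an I-P of about 95.7%, the same scale as the figures reported above.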
Pupil measures of alertness and mental load
NASA Technical Reports Server (NTRS)
Backs, Richard W.; Walrath, Larry C.
1988-01-01
A study of eight adults given active and passive search tasks showed that evoked pupillary response was sensitive to information processing demands. In particular, large pupillary diameter was observed in the active search condition where subjects were actively processing information relevant to task performance, as opposed to the passive search (control) condition where subjects passively viewed the displays. However, subjects may have simply been more aroused in the active search task. Of greater importance was that larger pupillary diameter, corresponding to longer search time, was observed for noncoded than for color-coded displays in active search. In the control condition, pupil diameter was larger with the color displays. The data indicate potential usefulness of pupillary responses in evaluating the information processing requirements of visual displays.
Freeway performance measurement system : an operational analysis tool
DOT National Transportation Integrated Search
2001-07-30
PeMS is a freeway performance measurement system for all of California. It processes 2 GB/day of 30-second loop detector data in real time to produce useful information. Managers at any time can have a uniform and comprehensive assessment of fre...
NASA Astrophysics Data System (ADS)
Hartkorn, O. A.; Ritter, B.; Meskers, A. J. H.; Miles, O.; Russwurm, M.; Scully, S.; Roldan, A.; Juestel, P.; Reville, V.; Lupu, S.; Ruffenach, A.
2014-12-01
The Earth's magnetosphere is formed as a consequence of the interaction between the planet's magnetic field and the solar wind, a continuous plasma stream from the Sun. A number of different solar wind phenomena have been studied over the past forty years with the intention of understanding and forecasting solar behavior and space weather. In particular, Earth-bound interplanetary coronal mass ejections (CMEs) can significantly disturb the Earth's magnetosphere for a short time and cause geomagnetic storms. We present a mission concept consisting of six spacecraft that are equally spaced in a heliocentric orbit at 0.72 AU. These spacecraft will monitor the plasma properties, the magnetic field's orientation and magnitude, and the 3D-propagation trajectory of CMEs heading for Earth. The primary objective of this mission is to increase space weather forecasting time by means of a near real-time information service that is based upon in-situ and remote measurements of the CME properties. The mission's secondary objective is the improvement of scientific space weather models. In-situ measurements are performed using a Solar Wind Analyzer instrumentation package and fluxgate magnetometers. For remote measurements, coronagraphs are employed. The proposed instruments originate from other space missions with the intention to reduce mission costs and to streamline the mission design process. Communication with the six identical spacecraft is realized via a deep space network consisting of six ground stations. This network provides an information service that is in uninterrupted contact with the spacecraft, allowing for continuous space weather monitoring. A dedicated data processing center will handle all the data and forward the processed data to the SSA Space Weather Coordination Center. This organization will inform the general public through a space weather forecast. The data processing center will additionally archive the data for the scientific community.
This concept mission allows for major advances in space weather forecasting and the scientific modeling of space weather.
Measuring and improving quality of care in an academic medical center.
Blayney, Douglas W
2013-05-01
The Donabedian definition of quality—structure, process, and outcome—provides a useful framework. A relentless focus on measuring process adherence and outcome is critical. Systemic improvements usually require teams to plan and to implement them. The lean or Toyota production system for process improvement is one useful method of organizing work, although different approaches are often necessary at the physician, practice unit, and statewide level. Challenges include scalability of the change (ie, rolling them out across the institution or system), tailoring the information technology tools, and building systems for sustainability.
NASA Technical Reports Server (NTRS)
Horsham, Gray A. P.
1998-01-01
Market research sources were used to initially gather primary technological problems and needs data from non-aerospace companies in targeted industry sectors. The company-supplied information served as input data to activate or start-up an internal, phased match-making process. This process was based on technical-level relationship exploration followed by business-level agreement negotiations, and culminated with project management and execution. Space Act Agreements represented near-term outputs. Company product or process commercialization derived from Lewis support and measurable economic effects represented far-term outputs.
Pisoni, David B; Kronenberger, William G; Roman, Adrienne S; Geers, Ann E
2011-02-01
Conventional assessments of outcomes in deaf children with cochlear implants (CIs) have focused primarily on endpoint or product measures of speech and language. Little attention has been devoted to understanding the basic underlying core neurocognitive factors involved in the development and processing of speech and language. In this study, we examined the development of factors related to the quality of phonological information in immediate verbal memory, including immediate memory capacity and verbal rehearsal speed, in a sample of deaf children after >10 yrs of CI use and assessed the correlations between these two process measures and a set of speech and language outcomes. Of an initial sample of 180 prelingually deaf children with CIs assessed at ages 8 to 9 yrs after 3 to 7 yrs of CI use, 112 returned for testing again in adolescence after 10 more years of CI experience. In addition to completing a battery of conventional speech and language outcome measures, subjects were administered the Wechsler Intelligence Scale for Children-III Digit Span subtest to measure immediate verbal memory capacity. Sentence durations obtained from the McGarr speech intelligibility test were used as a measure of verbal rehearsal speed. Relative to norms for normal-hearing children, Digit Span scores were well below average for children with CIs at both elementary and high school ages. Improvement was observed over the 8-yr period in the mean longest digit span forward score but not in the mean longest digit span backward score. Longest digit span forward scores at ages 8 to 9 yrs were significantly correlated with all speech and language outcomes in adolescence, but backward digit spans correlated significantly only with measures of higher-order language functioning over that time period. 
While verbal rehearsal speed increased for almost all subjects between elementary grades and high school, it was still slower than the rehearsal speed obtained from a control group of normal-hearing adolescents. Verbal rehearsal speed at ages 8 to 9 yrs was also found to be strongly correlated with speech and language outcomes and Digit Span scores in adolescence. Despite improvement after 8 additional years of CI use, measures of immediate verbal memory capacity and verbal rehearsal speed, which reflect core fundamental information processing skills associated with representational efficiency and information processing capacity, continue to be delayed in children with CIs relative to NH peers. Furthermore, immediate verbal memory capacity and verbal rehearsal speed at 8 to 9 yrs of age were both found to predict speech and language outcomes in adolescence, demonstrating the important contribution of these processing measures for speech-language development in children with CIs. Understanding the relations between these core underlying processes and speech-language outcomes in children with CIs may help researchers to develop new approaches to intervention and treatment of deaf children who perform poorly with their CIs. Moreover, this knowledge could be used for early identification of deaf children who may be at high risk for poor speech and language outcomes after cochlear implantation as well as for the development of novel targeted interventions that focus selectively on these core elementary information processing variables.
Roth, Alexandra K; Denney, Douglas R; Lynch, Sharon G
2015-01-01
The Attention Network Test (ANT) assesses attention in terms of discrepancies between response times to items that differ in the burden they place on some facet of attention. However, simple arithmetic difference scores commonly used to capture these discrepancies fail to provide adequate control for information processing speed, leading to distorted findings when patient and control groups differ markedly in the speed with which they process and respond to stimulus information. This study examined attention networks in patients with multiple sclerosis (MS) using simple difference scores, proportional scores, and residualized scores that control for processing speed through statistical regression. Patients with relapsing-remitting (N = 20) or secondary progressive (N = 20) MS and healthy controls (N = 40) of similar age, education, and gender completed the ANT. Substantial differences between patients and controls were found on all measures of processing speed. Patients exhibited difficulties in the executive control network, but only when difference scores were considered. When deficits in information processing speed were adequately controlled using proportional or residualized scores, deficits in the alerting network emerged. The effect sizes for these deficits were notably smaller than those for overall information processing speed and were also limited to patients with secondary progressive MS. Deficits in processing speed are more prominent in MS than those involving attention, and when the former are properly accounted for, differences in the latter are confined to the alerting network.
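The three scoring approaches compared in the study can be made concrete. A sketch of difference, proportional, and regression-residualized scores (variable names and data are illustrative; the study's actual regressions were fit to the ANT conditions):

```python
import numpy as np

def difference_scores(hard_rt, easy_rt):
    # Simple subtraction: confounded with overall processing speed.
    return hard_rt - easy_rt

def proportional_scores(hard_rt, easy_rt):
    # Cost of the harder condition expressed relative to baseline speed.
    return (hard_rt - easy_rt) / easy_rt

def residualized_scores(attention_scores, speed_scores):
    # Regress the attention measure on processing speed and keep the
    # residuals: the part of the score not predictable from speed alone.
    X = np.column_stack([np.ones(len(speed_scores)), speed_scores])
    beta, *_ = np.linalg.lstsq(X, attention_scores, rcond=None)
    return attention_scores - X @ beta
```

If the attention measure is perfectly predictable from speed, the residualized scores are all zero, which is exactly the sense in which they "control for" processing speed.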
NASA Technical Reports Server (NTRS)
Mah, Robert W. (Inventor)
2005-01-01
System and method for performing one or more relevant measurements at a target site in an animal body, using a probe. One or more of a group of selected internal measurements is performed at the target site, is optionally combined with one or more selected external measurements, and is optionally combined with one or more selected heuristic information items, in order to reduce to a relatively small number the probable medical conditions associated with the target site. One or more of the internal measurements is optionally used to navigate the probe to the target site. Neural net information processing is performed to provide a reduced set of probable medical conditions associated with the target site.
Sensing roller for in-process thickness measurement
Novak, J.L.
1996-07-16
An apparatus and method are disclosed for processing materials with a sensing roller, in which the sensing roller has a plurality of conductive rings (electrodes) separated by rings of dielectric material. Sensing capacitances or impedances between the electrodes provides information on the thickness of the materials being processed, the location of wires therein, and other such characteristics of the materials. 6 figs.
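Under an idealized parallel-plate model, a measured capacitance between a pair of electrode rings maps directly to material thickness. A hedged sketch (the electrode area and permittivity are hypothetical, and a real sensing roller would need a calibrated fringing-field model rather than this idealization):

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def thickness_from_capacitance(C, area, eps_r):
    # Parallel-plate idealization: C = eps0 * eps_r * A / d,
    # so the material thickness is d = eps0 * eps_r * A / C.
    return EPS0 * eps_r * area / C
```

Thinner material means smaller electrode gap and hence larger capacitance, which is the inverse relationship the sensing roller exploits.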
ERIC Educational Resources Information Center
Pittman, Joe F.; Kerpelman, Jennifer L.; Lamke, Leanne K.; Sollie, Donna L.
2009-01-01
Identity styles represent strategies individuals use to explore identity-related issues. Berzonsky (Berzonsky, M. D. (1992). Identity style and coping strategies. "Journal of Personality, 60", 771-788) identified three styles: informational, normative, and diffuse. In three studies, this paper presents (a) the identity processing style Q-sort…
Speed of perceptual grouping in acquired brain injury.
Kurylo, Daniel D; Larkin, Gabriella Brick; Waxman, Richard; Bukhari, Farhan
2014-09-01
Evidence exists that damage to white matter connections may contribute to reduced speed of information processing in traumatic brain injury and stroke. Damage to such axonal projections suggests a particular vulnerability of functions requiring integration across cortical sites. To test this prediction, measurements were made of perceptual grouping, which requires integration of stimulus components. A group of traumatic brain injury and cerebral vascular accident patients and a group of age-matched healthy control subjects viewed arrays of dots and indicated the pattern into which stimuli were perceptually grouped. Psychophysical measurements were made of perceptual grouping as well as processing speed. The patient group showed elevated grouping thresholds as well as extended processing time. In addition, most patients showed progressive slowing of processing speed across levels of difficulty, suggesting reduced resources to accommodate increased demands on grouping. These results support the prediction that brain injury results in a particular vulnerability of functions requiring integration of information across the cortex, which may result from dysfunction of long-range axonal connections.
Prioritizing Measures of Digital Patient Engagement: A Delphi Expert Panel Study
2017-01-01
Background Establishing a validated scale of patient engagement through use of information technology (ie, digital patient engagement) is the first step to understanding its role in health and health care quality, outcomes, and efficient implementation by health care providers and systems. Objective The aim of this study was to develop and prioritize measures of digital patient engagement based on patients’ use of the US Department of Veterans Affairs (VA)’s MyHealtheVet (MHV) portal, focusing on the MHV/Blue Button and Secure Messaging functions. Methods We aligned two models from the information systems and organizational behavior literatures to create a theory-based model of digital patient engagement. On the basis of this model, we conducted ten key informant interviews to identify potential measures from existing VA studies and consolidated the measures. We then conducted three rounds of modified Delphi rating by 12 national eHealth experts via Web-based surveys to prioritize the measures. Results All 12 experts completed the study’s three rounds of modified Delphi ratings, resulting in two sets of final candidate measures representing digital patient engagement for Secure Messaging (58 measures) and MHV/Blue Button (71 measures). These measure sets map to Donabedian’s three types of quality measures: (1) antecedents (eg, patient demographics); (2) processes (eg, a novel measure of Web-based care quality); and (3) outcomes (eg, patient engagement). Conclusions This national expert panel study using a modified Delphi technique prioritized candidate measures to assess digital patient engagement through patients’ use of VA’s My HealtheVet portal. The process yielded two robust measure sets prepared for future piloting and validation in surveys among Veterans. PMID:28550008
Prioritizing Measures of Digital Patient Engagement: A Delphi Expert Panel Study.
Garvin, Lynn A; Simon, Steven R
2017-05-26
Establishing a validated scale of patient engagement through use of information technology (ie, digital patient engagement) is the first step to understanding its role in health and health care quality, outcomes, and efficient implementation by health care providers and systems. The aim of this study was to develop and prioritize measures of digital patient engagement based on patients' use of the US Department of Veterans Affairs (VA)'s MyHealtheVet (MHV) portal, focusing on the MHV/Blue Button and Secure Messaging functions. We aligned two models from the information systems and organizational behavior literatures to create a theory-based model of digital patient engagement. On the basis of this model, we conducted ten key informant interviews to identify potential measures from existing VA studies and consolidated the measures. We then conducted three rounds of modified Delphi rating by 12 national eHealth experts via Web-based surveys to prioritize the measures. All 12 experts completed the study's three rounds of modified Delphi ratings, resulting in two sets of final candidate measures representing digital patient engagement for Secure Messaging (58 measures) and MHV/Blue Button (71 measures). These measure sets map to Donabedian's three types of quality measures: (1) antecedents (eg, patient demographics); (2) processes (eg, a novel measure of Web-based care quality); and (3) outcomes (eg, patient engagement). This national expert panel study using a modified Delphi technique prioritized candidate measures to assess digital patient engagement through patients' use of VA's My HealtheVet portal. The process yielded two robust measure sets prepared for future piloting and validation in surveys among Veterans. ©Lynn A Garvin, Steven R Simon. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 26.05.2017.
Pan, Jinger; Laubrock, Jochen; Yan, Ming
2016-08-01
We examined how reading mode (i.e., silent vs. oral reading) influences parafoveal semantic and phonological processing during the reading of Chinese sentences, using the gaze-contingent boundary paradigm. In silent reading, we found in 2 experiments that reading times on target words were shortened with semantic previews in early and late processing, whereas phonological preview effects mainly occurred in gaze duration or second-pass reading. In contrast, results showed that phonological preview information is obtained early on in oral reading. Strikingly, in oral reading, we observed a semantic preview cost on the target word in Experiment 1 and a decrease in the effect size of preview benefit from first- to second-pass measures in Experiment 2, which we hypothesize to result from increased preview duration. Taken together, our results indicate that parafoveal semantic information can be obtained irrespective of reading mode, whereas readers more efficiently process parafoveal phonological information in oral reading. We discuss implications for notions of information processing priority and saccade generation during silent and oral reading. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Russo-Ponsaran, Nicole M; McKown, Clark; Johnson, Jason K; Allen, Adelaide W; Evans-Smith, Bernadette; Fogg, Louis
2015-10-01
Difficulty processing social information is a defining feature of autism spectrum disorder (ASD). Yet the failure of children with ASD to process social information effectively is poorly understood. Using Crick and Dodge's model of social information processing (SIP), this study examined the relationship between social-emotional (SE) skills of pragmatic language, theory of mind, and emotion recognition on the one hand, and early stage SIP skills of problem identification and goal generation on the other. The study included a sample of school-aged children with and without ASD. SIP was assessed using hypothetical social situations in the context of a semistructured scenario-based interview. Pragmatic language, theory of mind, and emotion recognition were measured using direct assessments. Social thinking differences between children with and without ASD are largely differences of quantity (overall lower performance in ASD), not discrepancies in cognitive processing patterns. These data support theoretical models of the relationship between SE skills and SIP. Findings have implications for understanding the mechanisms giving rise to SIP deficits in ASD and may ultimately inform treatment development for children with ASD. © 2015 International Society for Autism Research, Wiley Periodicals, Inc.
Real-time spectral characterization of a photon pair source using a chirped supercontinuum seed.
Erskine, Jennifer; England, Duncan; Kupchak, Connor; Sussman, Benjamin
2018-02-15
Photon pair sources have wide ranging applications in a variety of quantum photonic experiments and protocols. Many of these protocols require well controlled spectral correlations between the two output photons. However, due to low cross-sections, measuring the joint spectral properties of photon pair sources has historically been a challenging and time-consuming task. Here, we present an approach for the real-time measurement of the joint spectral properties of a fiber-based four wave mixing source. We seed the four wave mixing process using a broadband chirped pulse, studying the stimulated process to extract information regarding the spontaneous process. In addition, we compare stimulated emission measurements with the spontaneous process to confirm the technique's validity. Joint spectral measurements have taken many hours historically and several minutes with recent techniques. Here, measurements have been demonstrated in 5-30 s depending on resolution, offering substantial improvement. Additional benefits of this approach include flexible resolution, large measurement bandwidth, and reduced experimental overhead.
Ito, Sosuke
2016-01-01
The transfer entropy is a well-established measure of information flow, which quantifies directed influence between two stochastic time series and has been shown to be useful in a variety of fields of science. Here we introduce the transfer entropy of the backward time series, called the backward transfer entropy, and show that the backward transfer entropy quantifies how far the dynamics are from a hidden Markov model. Furthermore, we discuss physical interpretations of the backward transfer entropy in two quite different settings: the thermodynamics of information processing and gambling with side information. In both settings, the backward transfer entropy characterizes a possible loss of some benefit, where the conventional transfer entropy characterizes a possible benefit. Our result implies a deep connection between thermodynamics and gambling in the presence of information flow, and suggests that the backward transfer entropy would be useful as a novel measure of information flow in nonequilibrium thermodynamics, biochemical sciences, economics, and statistics. PMID:27833120
NASA Astrophysics Data System (ADS)
Ito, Sosuke
2016-11-01
The transfer entropy is a well-established measure of information flow, which quantifies directed influence between two stochastic time series and has been shown to be useful in a variety of fields of science. Here we introduce the transfer entropy of the backward time series, called the backward transfer entropy, and show that the backward transfer entropy quantifies how far the dynamics are from a hidden Markov model. Furthermore, we discuss physical interpretations of the backward transfer entropy in two quite different settings: the thermodynamics of information processing and gambling with side information. In both settings, the backward transfer entropy characterizes a possible loss of some benefit, where the conventional transfer entropy characterizes a possible benefit. Our result implies a deep connection between thermodynamics and gambling in the presence of information flow, and suggests that the backward transfer entropy would be useful as a novel measure of information flow in nonequilibrium thermodynamics, biochemical sciences, economics, and statistics.
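The forward and backward transfer entropies can be sketched with a minimal plug-in estimator for discrete time series; this is an illustrative order-1 implementation under assumed conventions, not the author's:

```python
import random
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in estimate (bits) of transfer entropy from source y to
    target x, order 1: how much y_t improves prediction of x_{t+1}
    beyond what x_t already provides."""
    n = len(x) - 1
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    singles = Counter(x[:-1])
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_cond_src = c / pairs_xy[(x0, y0)]          # p(x1 | x0, y0)
        p_cond = pairs_xx[(x1, x0)] / singles[x0]    # p(x1 | x0)
        te += (c / n) * log2(p_cond_src / p_cond)
    return te

def backward_transfer_entropy(x, y):
    """Transfer entropy evaluated on the time-reversed series."""
    return transfer_entropy(x[::-1], y[::-1])

# Toy example: y copies x with a one-step delay, so x drives y.
random.seed(1)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]
print(transfer_entropy(y, x))             # x -> y: about 1 bit
print(transfer_entropy(x, y))             # y -> x: near 0
print(backward_transfer_entropy(x, y))    # time reversal exposes the coupling
```

Plug-in estimates like this carry a small positive bias that shrinks with the series length, which is why the "near 0" direction is not exactly zero on finite data.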
Functional Plasticity in Childhood Brain Disorders: When, What, How, and Whom to Assess
Dennis, Maureen; Spiegler, Brenda J.; Simic, Nevena; Sinopoli, Katia J.; Wilkinson, Amy; Yeates, Keith Owen; Taylor, H. Gerry; Bigler, Erin D.; Fletcher, Jack M.
2014-01-01
At every point in the lifespan, the brain balances malleable processes representing neural plasticity that promote change with homeostatic processes that promote stability. Whether a child develops typically or with brain injury, his or her neural and behavioral outcome is constructed through transactions between plastic and homeostatic processes and the environment. In clinical research with children in whom the developing brain has been malformed or injured, behavioral outcomes provide an index of the result of plasticity, homeostasis, and environmental transactions. When should we assess outcome in relation to age at brain insult, time since brain insult, and age of the child at testing? What should we measure? Functions involving reacting to the past and predicting the future, as well as social-affective skills, are important. How should we assess outcome? Information from performance variability, direct measures and informants, overt and covert measures, and laboratory and ecological measures should be considered. In whom are we assessing outcome? Assessment should be cognizant of individual differences in gene, socio-economic status (SES), parenting, nutrition, and interpersonal supports, which are moderators that interact with other factors influencing functional outcome. PMID:24821533
NASA Astrophysics Data System (ADS)
Päs, Heinrich
2017-08-01
A minimal approach to the measurement problem and the quantum-to-classical transition assumes a universally valid quantum formalism, i.e. unitary time evolution governed by a Schrödinger-type equation. As had been pointed out long ago, in this view the measurement process can be described by decoherence, which results in a "Many-Worlds" or "Many-Minds" scenario according to Everett and Zeh. A tacit assumption for decoherence to proceed, however, is that there exists incomplete information about the environment our object system gets entangled with in the measurement process. This paper addresses the question of where this information is traced out and, by adopting recent approaches to modeling consciousness in neuroscience, argues that a rigorous interpretation results in a perspectival notion of the quantum-to-classical transition. The information that is or is not available in the consciousness of the observer is crucial for the definition of the environment (i.e. the unknown degrees of freedom in the remainder of the Universe). As such, the Many-Worlds interpretation, while being difficult or impossible to probe in physics, may become testable in psychology.
Menon, Mohan K; Goodnight, Janelle M; Wayne, Robin J
2006-01-01
The following is a report of a study designed to measure advertising content based on the cognitive and affective elements of informational (i.e., information processing) and transformational (i.e., experiential) content using the measure of advertising informational and transformational content developed by Puto and Wells (1984). A university hospital advertising campaign designed to be high in transformational content did not appear to affect perceived quality of local university hospitals relative to private hospitals or increase the likelihood of choosing a university hospital in the future. Further, experiences with university hospitals that seemed to be in direct contrast to the content of the advertisements based on subject perceptions affected how university hospital advertisements were perceived in terms of content. Conclusions and implications for hospital advertising campaigns are discussed.
Browning, Michael; Grol, Maud; Ly, Verena; Goodwin, Guy M; Holmes, Emily A; Harmer, Catherine J
2011-01-01
Selective serotonergic reuptake inhibitors (SSRIs) and cognitive therapies are effective in the treatment of anxiety and depression. Previous research suggests that both forms of treatments may work by altering cognitive biases in the processing of affective information. The current study assessed the effects of combining an SSRI with a cognitive intervention on measures of affective processing bias and resilience to external challenge. A total of 62 healthy participants were randomly assigned to receive either 7 days of citalopram (20 mg) or placebo capsules while also completing either an active or a control version of a computerized cognitive bias training task. After treatment, standard measures of affective processing bias were collected. Participants' resilience to external stress was also tested by measuring the increase in negative symptoms induced by a negative mood induction. Participants who received both citalopram and the active cognitive bias training task showed a smaller alteration in emotional memory and categorization bias than did those who received either active intervention singly. The degree to which memory for negative information was altered by citalopram predicted participants' resistance to the negative mood induction. These results suggest that co-administration of an SSRI and a cognitive training intervention can reduce the effectiveness of either treatment alone in terms of anxiety- and depression-relevant emotional processing. More generally, the findings suggest that pinpointing the cognitive actions of treatments may inform future development of combination strategies in mental health. PMID:21832988
Cognition and Health Literacy in Older Adults' Recall of Self-Care Information.
Chin, Jessie; Madison, Anna; Gao, Xuefei; Graumlich, James F; Conner-Garcia, Thembi; Murray, Michael D; Stine-Morrow, Elizabeth A L; Morrow, Daniel G
2017-04-01
Health literacy is associated with health outcomes presumably because it influences the understanding of information needed for self-care. However, little is known about the language comprehension mechanisms that underpin health literacy. We explored the relationship between a commonly used measure of health literacy (Short Test of Functional Health Literacy in Adults [STOFHLA]) and comprehension of health information among 145 older adults. Results showed that performance on the STOFHLA was associated with recall of health information. Consistent with the Process-Knowledge Model of Health Literacy, mediation analysis showed that both processing capacity and knowledge mediated the association between health literacy and recall of health information. In addition, knowledge moderated the effects of processing capacity limits, such that processing capacity was less likely to be associated with recall for older adults with higher levels of knowledge. These findings suggest that knowledge contributes to health literacy and can compensate for deficits in processing capacity to support comprehension of health information among older adults. The implications of these findings for improving patient education materials for older adults with inadequate health literacy are discussed. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
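The mediation analysis described can be sketched with the standard product-of-coefficients decomposition; a minimal illustration in Python, with simulated (hypothetical) variables echoing the design, not the study's dataset:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 145
# Hypothetical variables: health literacy (X), processing capacity
# (M, the mediator), and recall of health information (Y).
literacy = rng.normal(0, 1, n)
capacity = 0.6 * literacy + rng.normal(0, 1, n)
recall = 0.5 * capacity + 0.2 * literacy + rng.normal(0, 1, n)

def ols(predictors, y):
    """Least-squares coefficients with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

c_total = ols([literacy], recall)[1]             # total effect X -> Y
a = ols([literacy], capacity)[1]                 # path X -> M
coefs = ols([literacy, capacity], recall)
c_direct, b = coefs[1], coefs[2]                 # direct effect; path M -> Y

indirect = a * b                                 # mediated portion
# For linear OLS the decomposition is exact: c_total = c_direct + a*b.
print(round(indirect, 3), round(c_total - c_direct, 3))
```

The mediated (indirect) effect a*b is the quantity a mediation analysis reports; moderated mediation, as with knowledge moderating capacity effects, adds an interaction term to the same machinery.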
Always look on the broad side of life: happiness increases the breadth of sensory memory.
Kuhbandner, Christof; Lichtenfeld, Stephanie; Pekrun, Reinhard
2011-08-01
Research has shown that positive affect increases the breadth of information processing at several higher stages of information processing, such as attentional selection or knowledge activation. In the present study, we examined whether these affective influences are already present at the level of transiently storing incoming information in sensory memory, before attentional selection takes place. After inducing neutral, happy, or sad affect, participants performed an iconic memory task which measures visual sensory memory. In all conditions, iconic memory performance rapidly decreased with increasing delay between stimulus presentation and test, indicating that affect did not influence the decay of iconic memory. However, positive affect increased the amount of incoming information stored in iconic memory. In particular, our results showed that this occurs due to an elimination of the spatial bias typically observed in iconic memory. Whereas performance did not differ at positions where observers in the neutral and negative conditions showed the highest performance, positive affect enhanced performance at all positions where observers in the neutral and negative conditions were relatively "blind." These findings demonstrate that affect influences the breadth of information processing already at earliest processing stages, suggesting that affect may produce an even more fundamental shift in information processing than previously believed. 2011 APA, all rights reserved
Invariance algorithms for processing NDE signals
NASA Astrophysics Data System (ADS)
Mandayam, Shreekanth; Udpa, Lalita; Udpa, Satish S.; Lord, William
1996-11-01
Signals that are obtained in a variety of nondestructive evaluation (NDE) processes capture information not only about the characteristics of the flaw, but also reflect variations in the specimen's material properties. Such signal changes may be viewed as anomalies that could obscure defect related information. An example of this situation occurs during in-line inspection of gas transmission pipelines. The magnetic flux leakage (MFL) method is used to conduct noninvasive measurements of the integrity of the pipe-wall. The MFL signals contain information both about the permeability of the pipe-wall and the dimensions of the flaw. Similar operational effects can be found in other NDE processes. This paper presents algorithms to render NDE signals invariant to selected test parameters, while retaining defect related information. Wavelet transform based neural network techniques are employed to develop the invariance algorithms. The invariance transformation is shown to be a necessary pre-processing step for subsequent defect characterization and visualization schemes. Results demonstrating the successful application of the method are presented.
An information theory framework for dynamic functional domain connectivity.
Vergara, Victor M; Miller, Robyn; Calhoun, Vince
2017-06-01
Dynamic functional network connectivity (dFNC) analyzes the time evolution of coherent activity in the brain. In this technique, dynamic changes are considered for the whole brain. This paper proposes an information theory framework to measure information flowing among subsets of functional networks called functional domains. Our method aims at estimating the bits of information contained in and shared among domains. The succession of dynamic functional states is estimated at the domain level. Information quantity is based on the probabilities of observing each dynamic state. A mutual information measurement is then obtained from probabilities across domains; we named this value the cross-domain mutual information (CDMI). Strong CDMIs were observed in relation to the subcortical domain. Domains related to sensory input, motor control, and the cerebellum form another CDMI cluster. Information flow among other domains was seldom found. Other methods of dynamic connectivity focus on whole-brain dFNC matrices. In the current framework, information theory is applied to states estimated from pairs of multi-network functional domains. In this context, we apply information theory to measure information flow across functional domains. Identified CDMI clusters point to known information pathways in the basal ganglia and also among areas of sensory input, patterns found in static functional connectivity. In contrast, CDMI across brain areas of higher-level cognitive processing follows a different pattern that indicates scarce information sharing. These findings show that employing information theory to formally measure information flow through brain domains reveals additional features of functional connectivity. Copyright © 2017 Elsevier B.V. All rights reserved.
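The core computation, mutual information between two domains' discrete state sequences, can be sketched with a plug-in estimator; the domain names and state labels below are illustrative assumptions, not the paper's data:

```python
from collections import Counter
from math import log2

def mutual_information(states_a, states_b):
    """Plug-in mutual information (bits) between two aligned sequences
    of discrete dynamic-state labels, one sequence per functional domain."""
    n = len(states_a)
    joint = Counter(zip(states_a, states_b))
    marg_a = Counter(states_a)
    marg_b = Counter(states_b)
    mi = 0.0
    for (a, b), c in joint.items():
        # p(a,b) * log2( p(a,b) / (p(a) p(b)) ), with the n's cancelled
        mi += (c / n) * log2((c * n) / (marg_a[a] * marg_b[b]))
    return mi

# Hypothetical state sequences for a "subcortical" and a "motor" domain:
# identical sequences share maximal information; a constant one shares none.
subcortical = [0, 1, 2, 0, 1, 2, 2, 1] * 50
motor = subcortical[:]
flat = [0] * len(subcortical)
print(mutual_information(subcortical, motor))   # equals the sequence entropy
print(mutual_information(subcortical, flat))    # 0.0
```

Because mutual information is symmetric, a CDMI-style value quantifies shared information between a pair of domains rather than a direction of flow.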
Sequential Ideal-Observer Analysis of Visual Discriminations.
ERIC Educational Resources Information Center
Geisler, Wilson S.
1989-01-01
A new analysis, based on the concept of the ideal observer in signal detection theory, is described. It allows: tracing of the flow of discrimination information through the initial physiological stages of visual processing for arbitrary spatio-chromatic stimuli, and measurement of the information content of said visual stimuli. (TJH)
ERIC Educational Resources Information Center
Ben-Naim, Arieh
2011-01-01
Changes in entropy can "sometimes" be interpreted in terms of changes in disorder. On the other hand, changes in entropy can "always" be interpreted in terms of changes in Shannon's measure of information. Mixing and demixing processes are used to highlight the pitfalls in the association of entropy with disorder. (Contains 3 figures.)
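The Shannon measure underlying this interpretation is easy to compute; a minimal sketch, with mixing modeled as each particle's accessible volume doubling (two equally likely half-box cells instead of one):

```python
from math import log2

def shannon(p):
    """Shannon's measure of information, H = -sum p_i log2(p_i), in bits."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

# Before the partition is removed, each particle is certainly in its own
# half of the box; afterwards it is equally likely to be in either half.
before = shannon([1.0])          # 0 bits of locational uncertainty
after = shannon([0.5, 0.5])      # 1 bit of locational uncertainty

# The increase of 1 bit per particle parallels the thermodynamic entropy
# of mixing, k*ln(2) per particle, for two distinguishable ideal gases.
print(before, after)
```

For demixing, the same arithmetic runs in reverse: reducing each particle's accessible cells lowers the Shannon measure, independent of any intuition about "disorder".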
Information Technology (IT) Identity: A Conceptualization, Proposed Measures, and Research Agenda
ERIC Educational Resources Information Center
Carter, Michelle Suzanne
2012-01-01
With increasing embeddedness of information technologies (IT) in organizational processes, and services, individuals' long-term IT use has become instrumental to business success. At the same time, IS research has illustrated that under-utilization by end-users often prevents organizations from realizing expected benefits from their…
Human Performance on the Temporal Bisection Task
ERIC Educational Resources Information Center
Kopec, Charles D.; Brody, Carlos D.
2010-01-01
The perception and processing of temporal information are tasks the brain must continuously perform. These include measuring the duration of stimuli, storing duration information in memory, recalling such memories, and comparing two durations. How the brain accomplishes these tasks, however, is still open for debate. The temporal bisection task,…
What's ahead in automated lumber grading
D. Earl Kline; Richard Conners; Philip A. Araman
1998-01-01
This paper discusses how present scanning technologies are being applied to automatic lumber grading. The presentation focuses on 1) what sensing and scanning devices are needed to measure information for accurate grading feature detection, 2) the hardware and software needed to efficiently process this information, and 3) specific issues related to softwood lumber...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-25
..., identifying significant environmental issues in the DEIS/EIR, providing useful information such as published and unpublished data, and knowledge of relevant issues and recommending mitigation measures to offset... the scoping process. c. Individuals and agencies may offer information or data relevant to the...
Information literacy of U.S. and Indian engineering undergraduates.
Taraban, Roman; Suar, Damodar; Oliver, Kristin
2013-12-01
To be competitive, contemporary engineers must be capable of both processing and communicating information effectively. Available research suggests that Indian students would be disadvantaged in information literacy in their language of instruction (English) compared to U.S. students because English is not Indian students' native language. Compared to U.S. students, Indian students were predicted (a) to apply practical text processing strategies to a greater extent than analytic strategies and (b) to endorse the direct transmission of information over critical, interpretive analysis of information. Two validated scales measuring self-reported use of reading strategies and beliefs about interpreting and critiquing written information were administered to engineering students at an Indian Institute of Technology in their freshman to senior years. Neither prediction was supported: Indian students reported applying analytic strategies over pragmatic strategies and were more disposed to critically analyze information rather than accept it passively. Further, Indian students reported being more analytic and more reflective in their reading behaviors than U.S. engineering students. Additional data indicated that U.S. and Indian students' text-processing strategies and beliefs are associated with the texts that they read and their academic behaviors.
Temporal information processing in short- and long-term memory of patients with schizophrenia.
Landgraf, Steffen; Steingen, Joerg; Eppert, Yvonne; Niedermeyer, Ulrich; van der Meer, Elke; Krueger, Frank
2011-01-01
Cognitive deficits of patients with schizophrenia have been largely recognized as core symptoms of the disorder. One neglected factor that contributes to these deficits is the comprehension of time. In the present study, we assessed temporal information processing and manipulation from short- and long-term memory in 34 patients with chronic schizophrenia and 34 matched healthy controls. On the short-term memory temporal-order reconstruction task, an incidental or intentional learning strategy was deployed. Patients showed worse overall performance than healthy controls. The intentional learning strategy led to dissociable performance improvement in both groups. Whereas healthy controls improved on a performance measure (serial organization), patients improved on an error measure (inappropriate semantic clustering) when using the intentional instead of the incidental learning strategy. On the long-term memory script-generation task, routine and non-routine events of everyday activities (e.g., buying groceries) had to be generated in either chronological or inverted temporal order. Patients were slower than controls at generating events in the chronological routine condition only. They also committed more sequencing and boundary errors in the inverted conditions. The number of irrelevant events was higher in patients in the chronological, non-routine condition. These results suggest that patients with schizophrenia imprecisely access temporal information from short- and long-term memory. In short-term memory, processing of temporal information led to a reduction in errors rather than, as was the case in healthy controls, to an improvement in temporal-order recall. When accessing temporal information from long-term memory, patients were slower and committed more sequencing, boundary, and intrusion errors. Together, these results suggest that time information can be accessed and processed only imprecisely by patients with schizophrenia, providing evidence for impaired time comprehension. This could contribute to symptomatic cognitive deficits and strategic inefficiency in schizophrenia.
Development of a safety decision-making scenario to measure worker safety in agriculture.
Mosher, G A; Keren, N; Freeman, S A; Hurburgh, C R
2014-04-01
Human factors play an important role in the management of occupational safety, especially in high-hazard workplaces such as commercial grain-handling facilities. Employee decision-making patterns represent an essential component of the safety system within a work environment. This research describes the process used to create a safety decision-making scenario to measure the process that grain-handling employees used to make choices in a safety-related work task. A sample of 160 employees completed safety decision-making simulations based on a hypothetical but realistic scenario in a grain-handling environment. Their choices and the information they used to make their choices were recorded. Although the employees emphasized safety information in their decision-making process, not all of their choices were safe choices. Factors influencing their choices are discussed, and implications for industry, management, and workers are shared.
Asymptotic inference in system identification for the atom maser.
Catana, Catalin; van Horssen, Merlijn; Guta, Madalin
2012-11-28
System identification is closely related to control theory and plays an increasing role in quantum engineering. In the quantum set-up, system identification is usually equated to process tomography, i.e. estimating a channel by probing it repeatedly with different input states. However, for quantum dynamical systems such as quantum Markov processes, it is more natural to consider the estimation based on continuous measurements of the output, with a given input that may be stationary. We address this problem using asymptotic statistics tools, for the specific example of estimating the Rabi frequency of an atom maser. We compute the Fisher information of different measurement processes as well as the quantum Fisher information of the atom maser, and establish the local asymptotic normality of these statistical models. The statistical notions can be expressed in terms of spectral properties of certain deformed Markov generators, and the connection to large deviations is briefly discussed.
Information About Cost of Goods Produced and its Usefulness for Production Engineers - A Case of SME
NASA Astrophysics Data System (ADS)
Maruszewska, Ewa Wanda; Strojek-Filus, Marzena; Drábková, Zita
2017-12-01
The article examines the consequences of simplifications applied when measuring the cost of goods produced, which are of crucial importance to production engineers in SMEs. The authors show the variety of valuation methods available to financial staff, together with the valuation distortions each can produce. Using a case study, the authors emphasize the importance of close cooperation between production engineers and finance professionals, as the outputs of finance departments constitute an important input to the decision-making process of production managers. Furthermore, the demonstrated deficiencies of the measurement methods applicable in financial reporting indicate the need to incorporate more financial and non-financial data when judging the final cost of goods produced, as the simplifications applied in SMEs distort the financial information provided to production engineers.
Dodge, Kenneth A.; Lansford, Jennifer E.; Burks, Virginia Salzer; Bates, John E.; Pettit, Gregory S.; Fontaine, Reid; Price, Joseph M.
2009-01-01
The relation between social rejection and growth in antisocial behavior was investigated. In Study 1, 259 boys and girls (34% African American) were followed from Grades 1 to 3 (ages 6–8 years) to Grades 5 to 7 (ages 10–12 years). Early peer rejection predicted growth in aggression. In Study 2, 585 boys and girls (16% African American) were followed from kindergarten to Grade 3 (ages 5–8 years), and findings were replicated. Furthermore, early aggression moderated the effect of rejection, such that rejection exacerbated antisocial development only among children initially disposed toward aggression. In Study 3, social information-processing patterns measured in Study 1 were found to mediate partially the effect of early rejection on later aggression. In Study 4, processing patterns measured in Study 2 replicated the mediation effect. Findings are integrated into a recursive model of antisocial development. PMID:12705561
Information processing speed and 8-year mortality among community-dwelling elderly Japanese.
Iwasa, Hajime; Kai, Ichiro; Yoshida, Yuko; Suzuki, Takao; Kim, Hunkyung; Yoshida, Hideyo
2014-01-01
Cognitive function is an important contributor to health among elderly adults. One reliable measure of cognitive functioning is information processing speed, which can predict incident dementia and is longitudinally related to the incidence of functional dependence. Few studies have examined the association between information processing speed and mortality. This 8-year prospective cohort study with mortality surveillance examined the longitudinal relationship between information processing speed and all-cause mortality among community-dwelling elderly Japanese. A total of 440 men and 371 women aged 70 years or older participated in this study. The Digit Symbol Substitution Test (DSST) was used to assess information processing speed. DSST score was used as an independent variable, and age, sex, education level, depressive symptoms, chronic disease, sensory deficit, instrumental activities of daily living, walking speed, and cognitive impairment were used as covariates. During the follow-up period, 182 participants (133 men and 49 women) died. A multivariate Cox proportional hazards model showed that lower DSST score was associated with increased risk of mortality (hazard ratio [HR] = 1.62, 95% CI = 0.97-2.72; HR = 1.73, 95% CI = 1.05-2.87; and HR = 2.55, 95% CI = 1.51-4.29, for the third, second, and first quartiles of DSST score, respectively). Slower information processing speed was associated with shorter survival among elderly Japanese.
Phase space gradient of dissipated work and information: A role of relative Fisher information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamano, Takuya, E-mail: yamano@amy.hi-ho.ne.jp
2013-11-15
We show that an information theoretic distance measured by the relative Fisher information between canonical equilibrium phase densities corresponding to forward and backward processes is intimately related to the gradient of the dissipated work in phase space. We present a universal constraint on it via the logarithmic Sobolev inequality. Furthermore, we point out that a possible expression of the lower bound indicates a deep connection in terms of the relative entropy and the Fisher information of the canonical distributions.
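The relative Fisher information discussed above can be illustrated numerically. The sketch below evaluates I(p||q) = ∫ p(x) [d/dx ln(p(x)/q(x))]² dx by quadrature for two equal-variance 1D Gaussians and checks it against the closed form (μ₁−μ₂)²/σ⁴; this is a minimal one-dimensional illustration, not the canonical phase-space setting of the paper.

```python
import numpy as np

# Relative Fisher information I(p||q) = ∫ p(x) [d/dx ln(p(x)/q(x))]^2 dx,
# evaluated numerically and checked against the closed form for two
# equal-variance Gaussians, where the score difference is constant and
# I(p||q) = (mu1 - mu2)^2 / sigma^4.

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def relative_fisher(p, q, dx):
    score = np.gradient(np.log(p / q), dx)   # d/dx ln(p/q)
    return float(np.sum(p * score ** 2) * dx)  # plug-in quadrature

x = np.linspace(-10, 10, 20001)
dx = x[1] - x[0]
mu1, mu2, sigma = 0.0, 1.0, 1.0
I_num = relative_fisher(gaussian(x, mu1, sigma), gaussian(x, mu2, sigma), dx)
I_exact = (mu1 - mu2) ** 2 / sigma ** 4
print(I_num, I_exact)  # both ≈ 1.0
```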
Valenza, Gaetano; Faes, Luca; Citi, Luca; Orini, Michele; Barbieri, Riccardo
2018-05-01
Measures of transfer entropy (TE) quantify the direction and strength of coupling between two complex systems. Standard approaches assume stationarity of the observations, and therefore are unable to track time-varying changes in nonlinear information transfer with high temporal resolution. In this study, we aim to define and validate novel instantaneous measures of TE to provide an improved assessment of complex nonstationary cardiorespiratory interactions. We propose a novel instantaneous point-process TE (ipTE) and validate its assessment as applied to cardiovascular and cardiorespiratory dynamics. In particular, heartbeat and respiratory dynamics are characterized through discrete time series, and modeled with probability density functions predicting the time of the next physiological event as a function of the past history. Likewise, nonstationary interactions between heartbeat and blood pressure dynamics are characterized as well. Furthermore, we propose a new measure of information transfer, the instantaneous point-process information transfer (ipInfTr), which is directly derived from point-process-based definitions of the Kolmogorov-Smirnov distance. Analysis of synthetic data, as well as of experimental data gathered from healthy subjects undergoing postural changes, confirms that both the ipTE and ipInfTr measures are able to dynamically track changes in physiological system coupling. This novel approach opens new avenues in the study of hidden, transient, nonstationary physiological states involving multivariate autonomic dynamics in cardiovascular health and disease. The proposed method can also be tailored to the study of complex multisystem physiology (e.g., brain-heart or, more generally, brain-body interactions).
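The directionality that transfer entropy captures can be sketched with the standard stationary plug-in estimator for discrete series, TE(X→Y) = Σ p(y_{t+1}, y_t, x_t) log₂[p(y_{t+1}|y_t, x_t) / p(y_{t+1}|y_t)]. This is the classical definition with history length 1, not the instantaneous point-process ipTE proposed in the abstract:

```python
import numpy as np
from collections import Counter

# Plug-in transfer entropy TE(X -> Y) with one-step histories, for
# discrete-valued series. When Y copies X with a one-step delay,
# information flows only in the X -> Y direction.

def transfer_entropy(x, y):
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]        # p(y1 | y0, x0)
        p_cond_self = pairs_yy[(y1, y0)] / singles[y0]  # p(y1 | y0)
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10000)
y = np.roll(x, 1)            # y copies x with a one-step delay
y[0] = 0
print(transfer_entropy(x, y))  # ≈ 1 bit: x fully determines the next y
print(transfer_entropy(y, x))  # ≈ 0 bits: no information flows y -> x
```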
NASA Technical Reports Server (NTRS)
Hinton, David A.
1993-01-01
An element of the NASA/FAA windshear program is the integration of ground-based microburst information on the flight deck, to support airborne windshear alerting and microburst avoidance. NASA conducted a windshear flight test program in the summer of 1991 during which airborne processing of Terminal Doppler Weather Radar (TDWR) data was used to derive microburst alerts. Microburst information was extracted from TDWR, transmitted to a NASA Boeing 737 in flight via data link, and processed to estimate the windshear hazard level (F-factor) that would be experienced by the aircraft in each microburst. The microburst location and F-factor were used to derive a situation display and alerts. The situation display was successfully used to maneuver the aircraft for microburst penetrations, during which atmospheric 'truth' measurements were made. A total of 19 penetrations were made of TDWR-reported microburst locations, resulting in 18 airborne microburst alerts from the TDWR data and two microburst alerts from the airborne reactive windshear detection system. The primary factors affecting alerting performance were spatial offset of the flight path from the region of strongest shear, differences in TDWR measurement altitude and airplane penetration altitude, and variations in microburst outflow profiles. Predicted and measured F-factors agreed well in penetrations near microburst cores. Although improvements in airborne and ground processing of the TDWR measurements would be required to support an airborne executive-level alerting protocol, the practicality of airborne utilization of TDWR data link data has been demonstrated.
RELIABILITY TESTING OF AN ON-HARVESTER COTTON WEIGHT MEASUREMENT SYSTEM
USDA-ARS?s Scientific Manuscript database
A system for weighing seed cotton onboard stripper harvesters was developed and installed on several producer owned and operated machines. The weight measurement system provides critical information to producers when in the process of calibrating yield monitors or conducting on-farm research. The ...
Hemispheric Laterality in Music and Math
ERIC Educational Resources Information Center
Szirony, Gary Michael; Burgin, John S.; Pearson, L. Carolyn
2008-01-01
Hemispheric laterality may be a useful concept in teaching, learning, training, and in understanding more about human development. To address this issue, a measure of hemispheric laterality was compared to musical and mathematical ability. The Human Information Processing Survey (HIPS) instrument, designed to measure hemispheric laterality, was…
Predicting Aggressive Tendencies by Visual Attention Bias Associated with Hostile Emotions
Lin, Ping-I; Hsieh, Cheng-Da; Juan, Chi-Hung; Hossain, Md Monir; Erickson, Craig A.; Lee, Yang-Han; Su, Mu-Chun
2016-01-01
The goal of the current study is to clarify the relationship between social information processing (e.g., visual attention to cues of hostility, hostility attribution bias, and facial expression emotion labeling) and aggressive tendencies. Thirty adults were recruited in the eye-tracking study that measured various components in social information processing. Baseline aggressive tendencies were measured using the Buss-Perry Aggression Questionnaire (AQ). Visual attention towards hostile objects was measured as the proportion of eye gaze fixation duration on cues of hostility. Hostility attribution bias was measured with the rating results for emotions of characters in the images. The results show that eye gaze duration on hostile characters was significantly inversely correlated with the AQ score, as was eye contact with an angry face. Eye gaze duration on hostile objects was not significantly associated with hostility attribution bias, although hostility attribution bias was significantly positively associated with the AQ score. Our findings suggest that eye gaze fixation time towards non-hostile cues may predict aggressive tendencies. PMID:26901770
Hoyer, Dirk; Leder, Uwe; Hoyer, Heike; Pompe, Bernd; Sommer, Michael; Zwiener, Ulrich
2002-01-01
The heart rate variability (HRV) is related to several mechanisms of the complex autonomic functioning, such as respiratory heart rate modulation and phase dependencies between heart beat cycles and breathing cycles. The underlying processes are basically nonlinear. In order to understand and quantitatively assess those physiological interactions, an adequate coupling analysis is necessary. We hypothesized that nonlinear measures of HRV and cardiorespiratory interdependencies are superior to the standard HRV measures in classifying patients after acute myocardial infarction. We introduced mutual information measures which provide access to nonlinear interdependencies as a counterpart to the classically linear correlation analysis. The nonlinear statistical autodependencies of HRV were quantified by auto mutual information, and the respiratory heart rate modulation by cardiorespiratory cross mutual information. The phase interdependencies between heart beat cycles and breathing cycles were assessed based on the histograms of the frequency ratios of the instantaneous heart beat and respiratory cycles. Furthermore, the relative duration of phase-synchronized intervals was acquired. We investigated 39 patients after acute myocardial infarction versus 24 controls. The discrimination of these groups was improved by cardiorespiratory cross mutual information measures and phase interdependencies measures in comparison to the linear standard HRV measures. This result was statistically confirmed by means of logistic regression models of particular variable subsets and their receiver operating characteristics.
On the theory of quantum measurement
NASA Technical Reports Server (NTRS)
Haus, Hermann A.; Kaertner, Franz X.
1994-01-01
Many so-called paradoxes of quantum mechanics are clarified when the measurement equipment is treated as a quantized system. Every measurement involves nonlinear processes. Self-consistent formulations of nonlinear quantum optics are relatively simple. Hence optical measurements, such as the quantum nondemolition (QND) measurement of photon number, are particularly well suited for such a treatment. It shows that the so-called 'collapse of the wave function' is not needed for the interpretation of the measurement process. Coherence of the density matrix of the signal is progressively reduced with increasing accuracy of the photon number determination. If the QND measurement is incorporated into the double slit experiment, the contrast ratio of the fringes is found to decrease with increasing information on the photon number in one of the two paths.
NASA Astrophysics Data System (ADS)
Jung, Y.; Kim, J.; Kim, W.; Boesch, H.; Yoshida, Y.; Cho, C.; Lee, H.; Goo, T. Y.
2016-12-01
The Greenhouse Gases Observing SATellite (GOSAT) is the first satellite dedicated to measuring atmospheric CO2 concentrations from space, with the potential to improve our knowledge of the carbon cycle. Several studies have developed CO2 retrieval algorithms using GOSAT measurements, but limited spatial coverage and uncertainties due to aerosols and thin cirrus clouds remain obstacles to monitoring CO2 concentrations globally. In this study, we develop the Yonsei CArbon Retrieval (YCAR) algorithm, based on the optimal estimation method, to retrieve the column-averaged dry-air mole fraction of carbon dioxide (XCO2) with optimized a priori CO2 profiles and aerosol models over East Asia. In previous studies, the aerosol optical properties (AOPs) and the aerosol top height caused significant errors in retrieved XCO2 of up to 2.5 ppm. Since this bias stems from a rough assumption about aerosol information in the forward model used in the CO2 retrieval process, the YCAR algorithm improves the process by taking into account AOPs as well as the aerosol vertical distribution; total AOD and the fine mode fraction (FMF) are obtained from closely located ground-based measurements, and other parameters are taken from a priori information. Compared with ground-based XCO2 measurements, the YCAR XCO2 product has biases of 0.59±0.48 ppm and 2.16±0.87 ppm at the Saga and Tsukuba sites, respectively, showing lower biases and higher correlations than the GOSAT standard products. These results reveal that better aerosol information can improve the accuracy of a CO2 retrieval algorithm and provide more useful XCO2 information with reduced uncertainties.
Better Higgs-CP tests through information geometry
NASA Astrophysics Data System (ADS)
Brehmer, Johann; Kling, Felix; Plehn, Tilman; Tait, Tim M. P.
2018-05-01
Measuring the CP symmetry in the Higgs sector is one of the key tasks of the LHC and a crucial ingredient for precision studies, for example in the language of effective Lagrangians. We systematically analyze which LHC signatures offer dedicated CP measurements in the Higgs-gauge sector and discuss the nature of the information they provide. Based on the Fisher information measure, we compare the maximal reach for CP-violating effects in weak boson fusion, associated ZH production, and Higgs decays into four leptons. We find a subtle balance between more theory-independent approaches and more powerful analysis channels, indicating that rigorous evidence for CP violation in the Higgs-gauge sector will likely require a multistep process.
An Extension of SIC Predictions to the Wiener Coactive Model
Houpt, Joseph W.; Townsend, James T.
2011-01-01
The survivor interaction contrast (SIC) is a powerful measure for distinguishing among candidate models of human information processing. One class of models to which SIC analysis can apply are the coactive, or channel summation, models of human information processing. In general, parametric forms of coactive models assume that responses are made based on the first passage time across a fixed threshold of a sum of stochastic processes. Previous work has shown that the SIC for a coactive model based on the sum of Poisson processes has a distinctive down-up-down form, with an early negative region that is smaller than the later positive region. In this note, we demonstrate that a coactive process based on the sum of two Wiener processes has the same SIC form. PMID:21822333
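The SIC for a Wiener coactive model can be sketched by simulation. The first-passage time of a Wiener process (drift v, threshold a, diffusion s²) is inverse Gaussian with mean a/v and shape a²/s², so samples can be drawn directly; the SIC is then computed from empirical survivor functions for the four salience conditions, SIC(t) = [S_LL − S_LH] − [S_HL − S_HH]. Drift and threshold values below are illustrative choices, not taken from the paper:

```python
import numpy as np

# Coactive (channel-summation) model: two Wiener channels sum into one
# accumulator, so the summed drift and diffusion define an inverse-Gaussian
# first-passage time, sampled here via rng.wald(mean, shape).

rng = np.random.default_rng(2)
a, s2 = 1.0, 2.0               # threshold; diffusion of the summed process
low, high = 1.0, 2.0           # per-channel drift under low/high salience

def coactive_rts(v1, v2, n=20000):
    return rng.wald(a / (v1 + v2), a ** 2 / s2, size=n)

def survivor(rts, t):
    return (rts[:, None] > t[None, :]).mean(axis=0)

t = np.linspace(0, 2, 200)
S = {c: survivor(coactive_rts(va, vb), t)
     for c, (va, vb) in {"LL": (low, low), "LH": (low, high),
                         "HL": (high, low), "HH": (high, high)}.items()}
sic = (S["LL"] - S["LH"]) - (S["HL"] - S["HH"])
area = sic.sum() * (t[1] - t[0])   # approximates the mean interaction contrast
# The paper's prediction: a small early negative dip, then a larger
# positive region (down-up-down), with positive total area (MIC > 0).
print("min SIC:", sic.min(), " max SIC:", sic.max(), " area:", area)
```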
Using a Multimedia Presentation to Enhance Informed Consent in a Pediatric Emergency Department.
Spencer, Sandra P; Stoner, Michael J; Kelleher, Kelly; Cohen, Daniel M
2015-08-01
Informed consent is an ethical process for ensuring patient autonomy. Multimedia presentations (MMPs) often aid the informed consent process for research studies. Thus, it follows that MMPs would improve informed consent in clinical settings. The aim of this study was to determine if an MMP for the informed consent process for ketamine sedation improves parental satisfaction and comprehension as compared with standard practice. This 2-phase study compared 2 methods of informed consent for ketamine sedation of pediatric patients. Phase 1 was a randomized, prospective study that compared the standard verbal consent to an MMP. Phase 2 implemented the MMP into daily work flow to validate the previous year's results. Parents completed a survey evaluating their satisfaction of the informed consent process and assessing their knowledge of ketamine sedation. Primary outcome measures were parental overall satisfaction with the informed consent process and knowledge of ketamine sedation. One hundred eighty-four families from a free-standing, urban, tertiary pediatric emergency department with over 85,000 annual visits were enrolled. Different demographics were not associated with a preference for the MMP or improved scores on the content quiz. Intervention families were more likely "to feel involved in the decision to use ketamine" and to understand that "they had the right to refuse the ketamine" as compared with control families. The intervention group scored significantly higher overall on the content section than the control group. Implementation and intervention families responded similarly to all survey sections. Multimedia presentation improves parental understanding of ketamine sedation, whereas parental satisfaction with the informed consent process remains unchanged. Use of MMP in the emergency department for informed consent shows potential for both patients and providers.
Pauwels, Evelyn; Van Hoof, Elke; Charlier, Caroline; Lechner, Lilian; De Bourdeaudhuij, Ilse
2012-10-03
On-line provision of information during the transition phase after treatment carries great promise in meeting shortcomings in post-treatment care for breast cancer survivors and their partners. The objectives of this study are to describe the development and process evaluation of a tailored informative website and to assess which characteristics of survivors and partners, participating in the feasibility study, are related to visiting the website. The development process included quantitative and qualitative assessments of survivors' and partners' care needs and preferences. Participants' use and evaluation of the website were explored by conducting baseline and post-measurements. During the intervening 10-12 weeks 57 survivors and 28 partners were granted access to the website. Fifty-seven percent (n=21) of survivors who took part in the post-measurement indicated that they had visited the website. Compared to non-visitors (n=16), they were more likely to have a partner and a higher income, reported higher levels of self-esteem and had completed treatment for a longer period of time. Partners who consulted the on-line information (42%, n=8) were younger and reported lower levels of social support compared to partners who did not visit the website (n=11). Visitors generally evaluated the content and lay-out positively, yet some believed the information was incomplete and impersonal. The website reached only about half of survivors and partners, yet was mostly well-received. Besides other ways of providing information and support, a website containing clear-cut and tailored information could be a useful tool in post-treatment care provision.
Biased predecisional processing of leading and nonleading alternatives.
Blanchard, Simon J; Carlson, Kurt A; Meloy, Margaret G
2014-03-01
When people obtain information about choice alternatives in a set one attribute at a time, they rapidly identify a leading alternative. Although previous research has established that people then distort incoming information, it is unclear whether distortion occurs through favoring of the leading alternative, disfavoring of the trailing alternative, or both. Prior examinations have not explored the predecisional treatment of the nonleading alternative (or alternatives) because they conceptualized distortion as a singular construct in binary choice and measured it using a relative item comparing the evaluation of both alternatives simultaneously. In this article, we introduce a measure of distortion at the level of the alternative, which allows for measuring whether predecisional distortion favors or disfavors every alternative being considered in choice sets of various sizes. We report that both proleader and antitrailer distortion occur and that the use of antitrailer processing differs between binary choices and multiple-options choices.
NASA Astrophysics Data System (ADS)
Hwang, Darryl H.; Ma, Kevin; Yepes, Fernando; Nadamuni, Mridula; Nayyar, Megha; Liu, Brent; Duddalwar, Vinay; Lepore, Natasha
2015-12-01
A conventional radiology report primarily consists of a large amount of unstructured text, and lacks clear, concise, consistent and content-rich information. Hence, an area of unmet clinical need consists of developing better ways to communicate radiology findings and information specific to each patient. Here, we design a new workflow and reporting system that combines and integrates advances in engineering technology with those from the medical sciences, the Multidimensional Interactive Radiology Report and Analysis (MIRRA). Until recently, clinical standards have primarily relied on 2D images for the purpose of measurement, but with the advent of 3D processing, many of the manually measured metrics can be automated, leading to better reproducibility and less subjective measurement placement. Hence, we make use of this newly available 3D processing in our workflow. Our pipeline is used here to standardize the labeling, tracking, and quantifying of metrics for renal masses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cuevas, F.A.; Curilef, S., E-mail: scurilef@ucn.cl; Plastino, A.R., E-mail: arplastino@ugr.es
The spread of a wave-packet (or its deformation) is a very important topic in quantum mechanics. Understanding this phenomenon is relevant in connection with the study of diverse physical systems. In this paper we apply various 'spreading measures' to characterize the evolution of an initially localized wave-packet in a tight-binding lattice, with special emphasis on information-theoretical measures. We investigate the behavior of both the probability distribution associated with the wave packet and the concomitant probability current. Complexity measures based upon Renyi entropies appear to be particularly good descriptors of the details of the delocalization process. - Highlights: > Spread of a highly localized wave-packet in the tight-binding lattice. > Entropic and information-theoretical characterization is used to understand the delocalization. > The behavior of both the probability distribution and the concomitant probability current is investigated. > Renyi entropies appear to be good descriptors of the details of the delocalization process.
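The Renyi entropies used as spreading measures above can be sketched for a discrete site distribution |ψ_n|²: R_α = log(Σ_n p_n^α)/(1−α), with the Shannon entropy as the α→1 limit. As a packet delocalizes the distribution flattens and R_α grows toward log(N). The example contrasts a fully localized with a fully delocalized distribution, rather than simulating the paper's tight-binding dynamics:

```python
import numpy as np

# Renyi entropy R_alpha of a discrete probability distribution over N
# lattice sites. A packet on one site has R_alpha = 0 for every alpha;
# a uniform (fully delocalized) packet has R_alpha = log(N).

def renyi_entropy(p, alpha):
    p = p[p > 0]
    if alpha == 1.0:                      # Shannon limit
        return float(-np.sum(p * np.log(p)))
    return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))

N = 64
localized = np.zeros(N)
localized[N // 2] = 1.0                   # packet concentrated on one site
spread = np.full(N, 1.0 / N)              # fully delocalized packet

for alpha in (0.5, 1.0, 2.0):
    print(alpha, renyi_entropy(localized, alpha), renyi_entropy(spread, alpha))
# Localized: 0 for every alpha; spread: log(64) ≈ 4.159 for every alpha.
```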
Feature-based characterisation of signature topography in laser powder bed fusion of metals
NASA Astrophysics Data System (ADS)
Senin, Nicola; Thompson, Adam; Leach, Richard
2018-04-01
The use of state-of-the-art areal topography measurement instrumentation allows for a high level of detail in the acquisition of topographic information at micrometric scales. The 3D geometric models of surface topography obtained from measured data create new opportunities for the investigation of manufacturing processes through characterisation of the surfaces of manufactured parts. Conventional methods for quantitative assessment of topography usually only involve the computation of texture parameters, summary indicators of topography-related characteristics that are computed over the investigated area. However, further useful information may be obtained through characterisation of signature topographic formations, as more direct indicators of manufacturing process behaviour and performance. In this work, laser powder bed fusion of metals is considered. An original algorithmic method is proposed to isolate relevant topographic formations and to quantify their dimensional and geometric properties, using areal topography data acquired by state-of-the-art areal topography measurement instrumentation.
Differential impairments of selective attention due to frequency and duration of cannabis use.
Solowij, N; Michie, P T; Fox, A M
1995-05-15
The evidence for long-term cognitive impairments associated with chronic use of cannabis has been inconclusive. We report the results of a brain event-related potential (ERP) study of selective attention in long-term cannabis users in the unintoxicated state. Two ERP measures known to reflect distinct components of attention were found to be affected differentially by duration and frequency of cannabis use. The ability to focus attention and filter out irrelevant information, measured by frontal processing negativity to irrelevant stimuli, was impaired progressively with the number of years of use but was unrelated to frequency of use. The speed of information processing, measured by the latency of parietal P300, was delayed significantly with increasing frequency of use but was unaffected by duration of use. The results suggest that a chronic buildup of cannabinoids produces both short- and long-term cognitive impairments.
Estimation of road profile variability from measured vehicle responses
NASA Astrophysics Data System (ADS)
Fauriat, W.; Mattrand, C.; Gayton, N.; Beakou, A.; Cembrzynski, T.
2016-05-01
When assessing the statistical variability of fatigue loads acting throughout the life of a vehicle, the question of the variability of road roughness naturally arises, as both quantities are strongly related. For car manufacturers, gathering information on the environment in which vehicles evolve is a long and costly but necessary process to adapt their products to durability requirements. In the present paper, a data processing algorithm is proposed in order to estimate the road profiles covered by a given vehicle, from the dynamic responses measured on this vehicle. The algorithm based on Kalman filtering theory aims at solving a so-called inverse problem, in a stochastic framework. It is validated using experimental data obtained from simulations and real measurements. The proposed method is subsequently applied to extract valuable statistical information on road roughness from an existing load characterisation campaign carried out by Renault within one of its markets.
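The Kalman recursion underlying the approach above can be sketched in scalar form: estimate a slowly varying hidden signal (a stand-in for road height) from noisy measurements. The paper solves a richer input-estimation inverse problem with a full vehicle model; this minimal sketch only illustrates the predict/update cycle it builds on, with illustrative noise levels:

```python
import numpy as np

# Scalar Kalman filter with a random-walk state model: predict the state,
# then correct it with the measurement residual weighted by the Kalman gain.

def kalman_1d(z, q, r, x0=0.0, p0=1.0):
    x, p, out = x0, p0, []
    for zk in z:
        p = p + q                      # predict: state variance grows by q
        k = p / (p + r)                # Kalman gain
        x = x + k * (zk - x)           # update with measurement residual
        p = (1 - k) * p
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(3)
n = 2000
truth = np.cumsum(0.03 * rng.standard_normal(n))   # hidden profile (random walk)
z = truth + 0.5 * rng.standard_normal(n)           # noisy measured response
est = kalman_1d(z, q=0.03 ** 2, r=0.5 ** 2)
print(np.mean((z - truth) ** 2), np.mean((est - truth) ** 2))
# The filtered estimate has much lower error than the raw measurement.
```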
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burian, Cosmin; Llobet, Eduard; Vilanova, Xavier
We have designed a challenging experimental sample set in the form of 20 solutions with a high degree of similarity in order to study whether the addition of chromatographic separation information improves the performance of regular MS-based electronic noses. In order to make an initial study of the approach, two different chromatographic methods were used. By processing the data of these experiments with 2- and 3-way algorithms, we have shown that the addition of chromatographic separation information improves the results compared to the 2-way analysis of mass spectra or total ion chromatograms treated separately. Our findings show that when the chromatographic peaks are resolved (longer measurement times), 2-way methods work better than 3-way methods, whereas in the case of a more challenging measurement (more coeluted chromatograms, much faster GC-MS measurements) 3-way methods work better.
Extracting quantum coherence via steering
Hu, Xueyuan; Fan, Heng
2016-01-01
As a precious resource for quantum information processing, quantum coherence can be created remotely if the two sites involved are quantum correlated. The amount of coherence created can be expected to depend on the quantity of the shared quantum correlation, which is itself a resource. Here, we establish an operational connection between coherence induced by steering and quantum correlation. We find that the steering-induced coherence, quantified by measures such as the relative entropy of coherence and the trace norm of coherence, is bounded from above by a known quantum correlation measure, the one-sided measurement-induced disturbance. The condition under which the induced coherence saturates this upper bound varies for different measures of coherence. The tripartite scenario is also studied, and a similar conclusion is obtained. Our results provide operational connections between local and non-local resources in quantum information processing. PMID:27682450
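The relative entropy of coherence mentioned in the abstract has a simple closed form, C_rel(ρ) = S(Δ(ρ)) − S(ρ), where Δ dephases the state in the reference basis and S is the von Neumann entropy. A small numerical sketch (not tied to the paper's steering scenario):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

def relative_entropy_of_coherence(rho):
    """C_rel(rho) = S(diag(rho)) - S(rho) in the computational basis."""
    dephased = np.diag(np.diag(rho))      # remove off-diagonal terms
    return von_neumann_entropy(dephased) - von_neumann_entropy(rho)
```

For the pure state |+⟩ this gives the maximal single-qubit value of 1 bit, while any incoherent (diagonal) state gives 0.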
Effect of Antidepressant Medication Use on Emotional Information Processing in Major Depression
Wells, Tony T.; Clerkin, Elise M.; Ellis, Alissa J.; Beevers, Christopher G.
2013-01-01
Objective: Acute administration of antidepressant medication increases emotional information processing for positive information in both depressed and healthy participants. This effect is likely relevant to the therapeutic actions of these medications, but has not been studied in patients with Major Depressive Disorder (MDD) taking antidepressants as typically prescribed in the community. Method: The authors examined the effects of antidepressant medication on selective attention for emotional stimuli using eye tracking in a sample of 47 participants (21 medicated; 26 non-medicated) with MDD and 47 matched, non-depressed controls. Participants completed a passive viewing eye tracking task assessing selective attention for positive, dysphoric, threatening, and neutral stimuli in addition to providing medication information and self-report measures of depression and anxiety severity. Results: Depressed participants currently taking antidepressant medication and non-depressed healthy control participants demonstrated greater total gaze duration and more fixations for positive stimuli, compared to non-medicated depressed participants. Depressed participants on medication (vs. depressed participants not on medication) also had fewer fixations for dysphoric stimuli. Conclusions: Antidepressants, as prescribed in the community to depressed patients, appear to modify emotional information processing in the absence of differences in depression severity. These results are consistent with prior work and indicate a robust effect for antidepressants on positive information processing. They also provide further evidence for modification of information processing as a potential mechanism of action for antidepressant medication. PMID:24030200
Effect of antidepressant medication use on emotional information processing in major depression.
Wells, Tony T; Clerkin, Elise M; Ellis, Alissa J; Beevers, Christopher G
2014-02-01
Acute administration of antidepressant medication increases emotional information processing for positive information in both depressed and healthy persons. This effect is likely relevant to the therapeutic actions of these medications, but it has not been studied in patients with major depressive disorder taking antidepressants as typically prescribed in the community. The authors used eye tracking to examine the effects of antidepressant medication on selective attention for emotional stimuli in a sample of 47 patients with major depressive disorder (21 medicated and 26 unmedicated) and 47 matched comparison subjects without depression. Participants completed a passive-viewing eye-tracking task assessing selective attention for positive, dysphoric, threatening, and neutral stimuli in addition to providing medication information and self-report measures of depression and anxiety severity. Depressed participants currently taking antidepressants and nondepressed comparison subjects demonstrated greater total gaze duration and more fixations for positive stimuli compared with unmedicated depressed participants. Depressed participants on medication also had fewer fixations for dysphoric stimuli compared with depressed participants not on medication. Antidepressants, as prescribed in the community to patients with depression, appear to modify emotional information processing in the absence of differences in depression severity. These results are consistent with previous work and indicate a robust effect for antidepressants on positive information processing. They also provide further evidence for modification of information processing as a potential mechanism of action for antidepressant medication.
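Metrics like total gaze duration and fixation count per stimulus category reduce to a simple aggregation over fixation records. The tuple layout below is an assumption for illustration, not the authors' data format:

```python
from collections import defaultdict

def gaze_metrics(fixations):
    """Aggregate total gaze duration (ms) and fixation count per
    stimulus category from (category, duration_ms) fixation records."""
    totals = defaultdict(lambda: {"duration": 0, "count": 0})
    for category, duration in fixations:
        totals[category]["duration"] += duration
        totals[category]["count"] += 1
    return dict(totals)
```

Group comparisons like those in the study (medicated vs. unmedicated) would then operate on these per-participant aggregates.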
Near ground level sensing for spatial analysis of vegetation
NASA Technical Reports Server (NTRS)
Sauer, Tom; Rasure, John; Gage, Charlie
1991-01-01
Measured changes in vegetation indicate the dynamics of ecological processes and can identify the impacts from disturbances. Traditional methods of vegetation analysis tend to be slow because they are labor intensive; as a result, these methods are often confined to small local area measurements. Scientists need new algorithms and instruments that will allow them to efficiently study environmental dynamics across a range of different spatial scales. A new methodology that addresses this problem is presented. This methodology includes the acquisition, processing, and presentation of near ground level image data and its corresponding spatial characteristics. The systematic approach taken encompasses a feature extraction process, a supervised and unsupervised classification process, and a region labeling process yielding spatial information.
Polar Environmental Monitoring
NASA Technical Reports Server (NTRS)
Nagler, R. G.; Schulteis, A. C.
1979-01-01
The present and projected benefits of the polar regions were reviewed and then translated into information needs in order to support the array of polar activities anticipated. These needs included measurement sensitivities for polar environmental data (ice/snow, atmosphere, and ocean data for integrated support) and the processing and delivery requirements which determine the effectiveness of environmental services. An assessment was made of how well electromagnetic signals can be converted into polar environmental information. The array of sensor developments in process or proposed were also evaluated as to the spectral diversity, aperture sizes, and swathing capabilities available to provide these measurements from spacecraft, aircraft, or in situ platforms. Global coverage and local coverage densification options were studied in terms of alternative spacecraft trajectories and aircraft flight paths.
Wang, Xiaoli; Xuan, Yifu; Jarrold, Christopher
2016-01-01
Previous studies have examined whether difficulties in short-term memory for verbal information that might be associated with dyslexia are driven by problems in retaining either information about to-be-remembered items or the order in which these items were presented. However, such studies have not used process-pure measures of short-term memory for item or order information. In this work we adapt a process dissociation procedure to properly distinguish the contributions of item and order processes to verbal short-term memory in a group of 28 adults with a self-reported diagnosis of dyslexia and a comparison sample of 29 adults without a dyslexia diagnosis. In contrast to previous work that has suggested that individuals with dyslexia experience item deficits resulting from inefficient phonological representation and language-independent order memory deficits, the results showed no evidence of specific problems in short-term retention of either item or order information among the individuals with a self-reported diagnosis of dyslexia, despite this group showing expected difficulties on separate measures of word and non-word reading. However, there was some suggestive evidence of a link between order memory for verbal material and individual differences in non-word reading, consistent with other claims for a role of order memory in phonologically mediated reading. The data from the current study therefore provide empirical evidence to question the extent to which item and order short-term memory are necessarily impaired in dyslexia. PMID:26941679
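The process dissociation procedure adapted in this study builds on Jacoby's classic estimation equations, which separate a controlled contribution C from an automatic contribution A using performance under inclusion and exclusion instructions. A generic sketch follows; the authors' item/order adaptation may differ in detail:

```python
def process_dissociation(p_inclusion, p_exclusion):
    """Jacoby-style process-dissociation estimates: the controlled
    contribution C and the automatic contribution A, derived from
    performance under inclusion and exclusion instructions."""
    c = p_inclusion - p_exclusion          # controlled process
    a = p_exclusion / (1.0 - c) if c < 1.0 else float("nan")
    return c, a
```

For example, inclusion performance of .80 with exclusion performance of .30 yields C = .50 and A = .60, allowing the two hypothesized processes to be scored separately per participant.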
ERIC Educational Resources Information Center
Badger, Elizabeth
1992-01-01
Explains a set of processes that teachers might use to structure their evaluation of students' learning and understanding. Illustrates the processes of setting goals, deciding what to assess, gathering information, and using the results through a measurement task requiring students to estimate the number of popcorn kernels in a container. (MDH)
42 CFR 410.142 - CMS process for approving national accreditation organizations.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Diabetes Self-Management Training and Diabetes Outcome Measurements § 410.142 CMS process for approving... diabetes to accredit entities to furnish training. (b) Required information and materials. An organization... outpatient diabetes self-management training program and procedures to monitor the correction of those...
42 CFR 410.142 - CMS process for approving national accreditation organizations.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Diabetes Self-Management Training and Diabetes Outcome Measurements § 410.142 CMS process for approving... diabetes to accredit entities to furnish training. (b) Required information and materials. An organization... outpatient diabetes self-management training program and procedures to monitor the correction of those...
42 CFR 410.142 - CMS process for approving national accreditation organizations.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Diabetes Self-Management Training and Diabetes Outcome Measurements § 410.142 CMS process for approving... diabetes to accredit entities to furnish training. (b) Required information and materials. An organization... outpatient diabetes self-management training program and procedures to monitor the correction of those...
ERIC Educational Resources Information Center
Rose, Susan A.; And Others
1991-01-01
Measures of visual and tactual recognition memory, tactual-visual transfer, and object permanence were obtained for preterm and full-term infants. Measures of tactual-visual transfer were correlated with later intelligence measures up to the age of five years. These correlations were independent of socioeconomic status, medical risk, and early…
2010-01-01
Background: The challenge today is to develop a modeling and simulation paradigm that integrates structural, molecular and genetic data for a quantitative understanding of the physiology and behavior of biological processes at multiple scales. This modeling method requires techniques that maintain reasonable accuracy of the biological process while reducing the computational overhead. This objective motivates the use of new methods that can transform the problem from energy- and affinity-based modeling to information-theory-based modeling. To achieve this, we transform all dynamics within the cell into random event times, each specified through an information-domain measure such as a probability distribution. This allows us to use an "in silico" stochastic event based modeling approach to find the molecular dynamics of the system. Results: In this paper, we present the discrete event simulation concept using the example of the signal transduction cascade triggered by extracellular Mg2+ concentration in the two-component PhoPQ regulatory system of Salmonella Typhimurium. We also present a model to compute the information-domain measure of the molecular transport process by estimating the statistical parameters of the inter-arrival time between molecules/ions arriving at a cell receptor as an external signal. This model transforms the diffusion process into the information-theory measure of stochastic event completion time to obtain the distribution of the Mg2+ departure events. Using these molecular transport models, we next study the in silico effects of this external trigger on the PhoPQ system. Conclusions: Our results illustrate the accuracy of the proposed diffusion models in explaining the molecular/ionic transport processes inside the cell. The proposed simulation framework can also incorporate the stochasticity of cellular environments to a reasonable degree of accuracy.
We expect that this scalable simulation platform will be able to model more complex biological systems with reasonable accuracy to understand their temporal dynamics. PMID:21143785
Ghosh, Preetam; Ghosh, Samik; Basu, Kalyan; Das, Sajal K; Zhang, Chaoyang
2010-12-01
The challenge today is to develop a modeling and simulation paradigm that integrates structural, molecular and genetic data for a quantitative understanding of the physiology and behavior of biological processes at multiple scales. This modeling method requires techniques that maintain reasonable accuracy of the biological process while reducing the computational overhead. This objective motivates the use of new methods that can transform the problem from energy- and affinity-based modeling to information-theory-based modeling. To achieve this, we transform all dynamics within the cell into random event times, each specified through an information-domain measure such as a probability distribution. This allows us to use an "in silico" stochastic event based modeling approach to find the molecular dynamics of the system. In this paper, we present the discrete event simulation concept using the example of the signal transduction cascade triggered by extracellular Mg2+ concentration in the two-component PhoPQ regulatory system of Salmonella Typhimurium. We also present a model to compute the information-domain measure of the molecular transport process by estimating the statistical parameters of the inter-arrival time between molecules/ions arriving at a cell receptor as an external signal. This model transforms the diffusion process into the information-theory measure of stochastic event completion time to obtain the distribution of the Mg2+ departure events. Using these molecular transport models, we next study the in silico effects of this external trigger on the PhoPQ system. Our results illustrate the accuracy of the proposed diffusion models in explaining the molecular/ionic transport processes inside the cell. The proposed simulation framework can also incorporate the stochasticity of cellular environments to a reasonable degree of accuracy.
We expect that this scalable simulation platform will be able to model more complex biological systems with reasonable accuracy to understand their temporal dynamics.
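The inter-arrival-time idea at the core of the discrete event approach can be sketched as a Poisson arrival process: exponentially distributed gaps between molecule arrivals at a receptor. The constant arrival rate here is a simplification; the paper estimates the arrival statistics from a diffusion model rather than assuming them.

```python
import random

def simulate_arrivals(rate, t_end, seed=1):
    """Discrete-event sketch: molecule/ion arrival times at a receptor,
    with exponentially distributed inter-arrival times (Poisson process
    of the given rate, simulated up to time t_end)."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate)   # draw the next inter-arrival gap
        if t > t_end:
            break
        events.append(t)
    return events
```

Each event time would then feed the downstream signal-transduction model as an external trigger, replacing an explicit spatial diffusion simulation with samples from its event-time distribution.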
Combining Space-Based and In-Situ Measurements to Track Flooding in Thailand
NASA Technical Reports Server (NTRS)
Chien, Steve; Doubleday, Joshua; Mclaren, David; Tran, Daniel; Tanpipat, Veerachai; Chitradon, Royal; Boonya-aaroonnet, Surajate; Thanapakpawin, Porranee; Khunboa, Chatchai; Leelapatra, Watis;
2011-01-01
We describe efforts to integrate in-situ sensing, space-borne sensing, hydrological modeling, active control of sensing, and automatic data product generation to enhance monitoring and management of flooding. In our approach, broad coverage sensors and missions such as MODIS, TRMM, and weather satellite information and in-situ weather and river gauging information are all inputs to track flooding via river basin and sub-basin hydrological models. While these inputs can provide significant information as to the major flooding, targetable space measurements can provide better spatial resolution measurements of flooding extent. In order to leverage such assets we automatically task observations in response to automated analysis indications of major flooding. These new measurements are automatically processed and assimilated with the other flooding data. We describe our ongoing efforts to deploy this system to track major flooding events in Thailand.
NASA Astrophysics Data System (ADS)
Artemenko, M. V.; Chernetskaia, I. E.; Kalugina, N. M.; Shchekina, E. N.
2018-04-01
This article addresses the problem of forming a productive set of informative measured features of an object under observation and/or control, using the authors' algorithms, which apply bootstrap and counter-bootstrap techniques to process measurement results for various states of the object on the basis of training samples of different sizes. The paper considers aggregation of specific indicators of informative capacity by linear, majority, logical and "greedy" methods, applied both individually and in combination. The results of a computational experiment are discussed, and it is concluded that the proposed methods increase the efficiency of classifying the states of the object from measurement results.
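One way to read the bootstrap idea is as a resampling check of how reliably a measured feature separates two object states. The scoring rule below is an illustrative assumption, not the authors' algorithm: it counts the fraction of bootstrap replicates in which the resampled group means differ in the same direction as the originals.

```python
import random

def bootstrap_feature_score(class_a, class_b, n_boot=500, seed=0):
    """Bootstrap sketch of feature informativeness: resample both
    groups with replacement and count the fraction of replicates in
    which the resampled means keep the original ordering."""
    rng = random.Random(seed)
    sign = 1 if sum(class_a) / len(class_a) >= sum(class_b) / len(class_b) else -1
    hits = 0
    for _ in range(n_boot):
        ma = sum(rng.choices(class_a, k=len(class_a))) / len(class_a)
        mb = sum(rng.choices(class_b, k=len(class_b))) / len(class_b)
        if sign * (ma - mb) > 0:
            hits += 1
    return hits / n_boot
```

A score near 1 marks a feature that separates the two states stably across resamples; features scoring near 0.5 carry little discriminative information and would be dropped from the feature set.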
Ethical principles of informed consent: exploring nurses' dual role of care provider and researcher.
Judkins-Cohn, Tanya M; Kielwasser-Withrow, Kiersten; Owen, Melissa; Ward, Jessica
2014-01-01
This article describes the ethical principles of autonomy, beneficence, and justice within the nurse researcher-participant relationship as these principles relate to the informed consent process for research. Within this process, the nurse is confronted with a dual role. This article describes how nurses, who are in the dual role of care provider and researcher, can apply these ethical principles to their practice in conjunction with the American Nurses Association's code of ethics for nurses. This article also describes, as an element of ethical practice, the importance of using participant-centered quality measures to aid informed decision making of participants in research. In addition, the article provides strategies for improving the informed consent process in nursing research. Finally, case scenarios are discussed, along with the application of ethical principles within the awareness of the dual role of the nurse as care provider and researcher. Copyright 2014, SLACK Incorporated.
Ruiter, R A; Kok, G; Verplanken, B; Brug, J
2001-06-01
The effect of fear arousal on attitude toward participating in early detection activities [i.e. breast self-examination (BSE)] was studied from an information-processing perspective. It was hypothesized that fear arousal motivates respondents to more argument-based processing of fear-relevant persuasive information. Respondents first read information about breast cancer in which fear was manipulated. After measuring fear arousal, respondents read a persuasive message about performing BSE. Analyses with reported fear, but not manipulated fear, found support for the hypothesis. Respondents who reported mild fear of breast cancer based their attitude toward BSE more on the arguments provided than respondents who reported low fear of breast cancer. This finding suggests that the use of fear arousal may be an efficient tool in health education practice. However, alternative interpretations are provided, in addition to the suggestion to be careful with using fear arousal in health education messages.
Pure sources and efficient detectors for optical quantum information processing
NASA Astrophysics Data System (ADS)
Zielnicki, Kevin
Over the last sixty years, classical information theory has revolutionized the understanding of the nature of information, and how it can be quantified and manipulated. Quantum information processing extends these lessons to quantum systems, where the properties of intrinsic uncertainty and entanglement fundamentally defy classical explanation. This growing field has many potential applications, including computing, cryptography, communication, and metrology. As inherently mobile quantum particles, photons are likely to play an important role in any mature large-scale quantum information processing system. However, the available methods for producing and detecting complex multi-photon states place practical limits on the feasibility of sophisticated optical quantum information processing experiments. In a typical quantum information protocol, a source first produces an interesting or useful quantum state (or set of states), perhaps involving superposition or entanglement. Then, some manipulations are performed on this state, perhaps involving quantum logic gates which further manipulate or entangle the initial state. Finally, the state must be detected, obtaining some desired measurement result, e.g., for secure communication or computationally efficient factoring. The work presented here concerns the first and last stages of this process as they relate to photons: sources and detectors. Our work on sources is based on the need for optimized non-classical states of light delivered at high rates, particularly single photons in a pure quantum state. We seek to better understand the properties of spontaneous parametric downconversion (SPDC) sources of photon pairs, and in doing so, produce such an optimized source. We report an SPDC source which produces pure heralded single photons with little or no spectral filtering, allowing a significant rate enhancement. Our work on detectors is based on the need to reliably measure single-photon states.
We have focused on optimizing the detection efficiency of visible light photon counters (VLPCs), a single-photon detection technology that is also capable of resolving photon number states. We report a record-breaking quantum efficiency of 91 +/- 3% observed with our detection system. Both sources and detectors are independently interesting physical systems worthy of study, but together they promise to enable entire new classes and applications of information based on quantum mechanics.
PATTERNS OF CLINICALLY SIGNIFICANT COGNITIVE IMPAIRMENT IN HOARDING DISORDER.
Mackin, R Scott; Vigil, Ofilio; Insel, Philip; Kivowitz, Alana; Kupferman, Eve; Hough, Christina M; Fekri, Shiva; Crothers, Ross; Bickford, David; Delucchi, Kevin L; Mathews, Carol A
2016-03-01
The cognitive characteristics of individuals with hoarding disorder (HD) are not well understood. Existing studies are relatively few and somewhat inconsistent but suggest that individuals with HD may have specific dysfunction in the cognitive domains of categorization, speed of information processing, and decision making. However, there have been no studies evaluating the degree to which cognitive dysfunction in these domains reflects clinically significant cognitive impairment (CI). Participants included 78 individuals who met DSM-5 criteria for HD and 70 age- and education-matched controls. Cognitive performance on measures of memory, attention, information processing speed, abstract reasoning, visuospatial processing, decision making, and categorization ability was evaluated for each participant. Rates of clinical impairment for each measure were compared, as were age- and education-corrected raw scores for each cognitive test. HD participants showed a greater incidence of CI on measures of visual memory, visual detection, and visual categorization relative to controls. Raw-score comparisons between groups showed similar results, with HD participants showing lower raw-score performance on each of these measures. In addition, in raw-score comparisons HD participants also demonstrated relative strengths compared to control participants on measures of verbal and visual abstract reasoning. These results suggest that HD is associated with a pattern of clinically significant CI in some visually mediated neurocognitive processes including visual memory, visual detection, and visual categorization. Additionally, these results suggest that HD individuals may also exhibit relative strengths, perhaps compensatory, in abstract reasoning in both verbal and visual domains. © 2015 Wiley Periodicals, Inc.
A Measure of Inspection Time in 4-Year-Old Children: The Benny Bee IT Task
ERIC Educational Resources Information Center
Williams, Sarah E.; Turley, Christopher; Nettelbeck, Ted; Burns, Nicholas R.
2009-01-01
Inspection time (IT) measures speed of information processing without the confounding influence of motor speed. While IT has been found to relate to cognitive abilities in adults and older children, no measure of IT has been validated for use with children younger than 6 years. This study examined the validity of a new measure of IT for preschool…
NASA Astrophysics Data System (ADS)
Yin, Bo; Liu, Li; Wang, Jiahan; Li, Xiran; Liu, Zhenbo; Li, Dewei; Wang, Jun; Liu, Lu; Wu, Jun; Xu, Tingting; Cui, He
2017-10-01
Electric energy measurement is fundamental work, and accurate measurement plays a vital role in protecting the economic interests of both parties to a power supply; the standardized management of measurement laboratories at all levels directly affects the fairness of measurement. Currently, metering laboratories generally use one-dimensional bar codes as the recognition object, advance the testing process through manual management, and generate most test reports from manually entered data. This process has many problems and potential risks: data cannot be saved completely, inspection status cannot be traced, the inspection process is not fully controllable, and so on. To meet a provincial metrology center's requirements for whole-process management of performance testing of power-measuring appliances, we used large-capacity RF tags as the process-management information medium and developed a general measurement experiment management system. We formulated a standardized full performance-test process, improved the recording of raw data during experiments, developed an automatic warehouse inventory device, and established a strict system for transferring and storing test samples. These measures ensure that all raw inspection data can be traced, achieve full life-cycle control of samples, and significantly improve both the quality control level and the effectiveness of inspection work.
Speech Recognition as a Transcription Aid: A Randomized Comparison With Standard Transcription
Mohr, David N.; Turner, David W.; Pond, Gregory R.; Kamath, Joseph S.; De Vos, Cathy B.; Carpenter, Paul C.
2003-01-01
Objective. Speech recognition promises to reduce information entry costs for clinical information systems. It is most likely to be accepted across an organization if physicians can dictate without concerning themselves with real-time recognition and editing; assistants can then edit and process the computer-generated document. Our objective was to evaluate the use of speech-recognition technology in a randomized controlled trial using our institutional infrastructure. Design. Clinical note dictations from physicians in two specialty divisions were randomized to either a standard transcription process or a speech-recognition process. Secretaries and transcriptionists also were assigned randomly to each of these processes. Measurements. The duration of each dictation was measured. The amount of time spent processing a dictation to yield a finished document also was measured. Secretarial and transcriptionist productivity, defined as hours of secretary work per minute of dictation processed, was determined for speech recognition and standard transcription. Results. Secretaries in the endocrinology division were 87.3% (confidence interval, 83.3%, 92.3%) as productive with the speech-recognition technology as implemented in this study as they were using standard transcription. Psychiatry transcriptionists and secretaries were similarly less productive. Author, secretary, and type of clinical note were significant (p < 0.05) predictors of productivity. Conclusion. When implemented in an organization with an existing document-processing infrastructure (which included training and interfaces of the speech-recognition editor with the existing document entry application), speech recognition did not improve the productivity of secretaries or transcriptionists. PMID:12509359
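The productivity measure used in the study, hours of secretarial work per minute of dictation processed, and the relative-productivity percentages reported (e.g. 87.3%) can be computed as below; the exact normalization used by the authors may differ from this sketch.

```python
def hours_per_minute(work_hours, dictation_minutes):
    """Productivity measure from the study: secretarial hours spent
    per minute of dictation processed (lower means more productive)."""
    return work_hours / dictation_minutes

def relative_productivity(baseline_hpm, new_hpm):
    """Percent productivity of a new process relative to a baseline,
    where productivity is the reciprocal of hours-per-minute."""
    return 100.0 * baseline_hpm / new_hpm
```

For instance, if standard transcription takes 0.08 h per minute of dictation and the speech-recognition workflow takes 0.10 h, the new workflow is 80% as productive as the baseline.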
NASA Astrophysics Data System (ADS)
Bielik, M.; Vozar, J.; Hegedus, E.; Celebration Working Group
2003-04-01
The contribution reports preliminary results from the first-arrival P-wave seismic tomographic processing of data measured along the profiles CEL01, CEL04, CEL05, CEL06, CEL09 and CEL11. These profiles were measured in the framework of the seismic project CELEBRATION 2000. Data acquisition and geometric parameters of the processed profiles, the principle of the tomographic processing, and the particular processing steps and program parameters are described. Characteristic data of the observation profiles (shot points, geophone points, total profile lengths, sampling, sensors and record lengths) are given. The FAST program package developed by C. Zelt was applied for the tomographic velocity inversion. The process consists of several steps. The first step is the creation of a starting velocity field, for which the calculated arrival times are modelled by the method of finite differences. The next step is minimization of the differences between the measured and modelled arrival times until the deviation is small. The equivalency problem was mitigated by including a priori information in the starting velocity field: the depth to the pre-Tertiary basement, estimates of the overlying sedimentary velocities from well logging and other seismic velocity data, etc. After checking the reciprocal times, the picks were corrected. The final result of the processing is a reliable travel-time curve set consistent with the reciprocal times. Picking of the travel-time curves and enhancement of the signal-to-noise ratio of the seismograms were carried out using the PROMAX program system. The tomographic inversion was carried out by a so-called 3D/2D procedure taking 3D wave propagation into account: a corridor along the profile, containing the outlying shot points and geophone points, was defined, and 3D processing was carried out within this corridor.
The preliminary results indicate seismically anomalous zones within the crust and the uppermost part of the upper mantle in the area comprising the Western Carpathians, the North European Platform, the Pannonian Basin and the Bohemian Massif.
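The core of the inversion loop described above, minimizing the misfit between measured and modelled first-arrival times over a velocity (slowness) field, can be illustrated with a toy linearized example. This is a hypothetical NumPy sketch with straight rays and invented numbers, not the finite-difference package by C. Zelt used in the study.

```python
import numpy as np

# Toy linearized first-arrival tomography: travel times are line integrals
# of slowness (1/velocity) along rays, t = L @ s, where L[i, j] is the
# length of ray i inside cell j. Recover slowness by least squares and
# convert back to velocity. (Illustrative sketch only; not Zelt's code.)
rng = np.random.default_rng(0)

true_velocity = np.array([1500.0, 2000.0, 2500.0, 3000.0])  # m/s per cell
true_slowness = 1.0 / true_velocity

# Ray-path lengths through the 4 cells for 6 synthetic shot-receiver pairs
L = rng.uniform(50.0, 500.0, size=(6, 4))
t_obs = L @ true_slowness  # noise-free synthetic first-arrival times

# Least-squares slowness estimate (minimizes travel-time residuals)
s_est, *_ = np.linalg.lstsq(L, t_obs, rcond=None)
v_est = 1.0 / s_est

rms_residual = np.sqrt(np.mean((L @ s_est - t_obs) ** 2))
print(v_est, rms_residual)
```

With straight rays and noise-free synthetic times, the least-squares step recovers the cell velocities exactly; real tomography must iterate ray tracing and inversion because the ray paths themselves depend on the velocity model.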
Signal processing of anthropometric data
NASA Astrophysics Data System (ADS)
Zimmermann, W. J.
1983-09-01
The Anthropometric Measurements Laboratory has accumulated a large body of data from a number of previous experiments. The data are very noisy and therefore require the application of signal processing schemes. Moreover, the data were regarded not as time-series measurements but as positional information; hence, they are stored as coordinate points defined by the motion of the human body. The accumulated data define two groups or classes. Some of the data were collected from an experiment designed to measure the flexibility of the limbs, referred to as radial movement. The remaining data were collected from experiments designed to determine the surface of the reach envelope. An interactive signal processing package was designed and implemented. Since the data do not include time, this package does not include a time-series element. Presently the analysis is restricted to processing data obtained from the experiments designed to measure flexibility.
Surface-specific additive manufacturing test artefacts
NASA Astrophysics Data System (ADS)
Townsend, Andrew; Racasan, Radu; Blunt, Liam
2018-06-01
Many test artefact designs have been proposed for use with additive manufacturing (AM) systems. These test artefacts have primarily been designed for the evaluation of AM form and dimensional performance. A series of surface-specific measurement test artefacts designed for use in the verification of AM manufacturing processes is proposed here. Surface-specific test artefacts can be made more compact because they do not require the large dimensions needed for accurate dimensional and form measurements. The series of three test artefacts is designed to provide comprehensive information pertaining to the manufactured surface. Measurement possibilities include deviation analysis, surface texture parameter data generation, sub-surface analysis, layer step analysis and build resolution comparison. The test artefacts are designed to provide easy access for measurement using conventional surface measurement techniques, for example focus variation microscopy, stylus profilometry, confocal microscopy and scanning electron microscopy. Additionally, the test artefacts may simply be visually inspected as a comparative tool, giving a fast indication of process variation between builds. The three test artefacts are small enough to be included in every build and include built-in manufacturing traceability information, making them a convenient physical record of the build.
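As a hint of the "surface texture parameter data generation" the artefacts enable, here is a minimal sketch of areal texture parameters in the style of ISO 25178-2 (Sa, Sq, Sz) computed from a height map; the sinusoidal surface below is a made-up stand-in for measured focus-variation or confocal data.

```python
import numpy as np

# Areal surface texture parameters (ISO 25178-2 style) from a height map.
# The synthetic sinusoidal surface is an assumed stand-in for real data.
x = np.linspace(0.0, 1.0, 200)
z = 2.0 * np.sin(2 * np.pi * 5 * np.outer(x, x))  # heights, micrometres

z = z - z.mean()                  # reference heights to the mean plane
Sa = np.mean(np.abs(z))           # arithmetical mean height
Sq = np.sqrt(np.mean(z ** 2))     # root-mean-square height
Sz = z.max() - z.min()            # maximum height (peak to valley)
print(Sa, Sq, Sz)
```

In practice the height map would come from one of the instruments named above (focus variation, stylus, confocal), after levelling and filtering steps that this sketch omits.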
Ultrasonic power measurement system based on acousto-optic interaction.
He, Liping; Zhu, Fulong; Chen, Yanming; Duan, Ke; Lin, Xinxin; Pan, Yongjun; Tao, Jiaquan
2016-05-01
Ultrasonic waves are widely used, with applications including the medical, military, and chemical fields. However, there are currently no effective methods for ultrasonic power measurement. Previously, ultrasonic power measurement relied on mechanical methods such as hydrophones and radiation force balances. This paper deals with ultrasonic power measurement based on an unconventional method: acousto-optic interaction. Compared with mechanical methods, the optical method has a greater ability to resist interference and also has reduced environmental requirements. Therefore, this paper begins with an experimental determination of the acoustic power in water contained in a glass tank using a set of optical devices. Because the light intensity of the diffraction image generated by acousto-optic interaction contains the required ultrasonic power information, specific software was written to extract the light intensity information from the image through a combination of filtering, binarization, contour extraction, and other image processing operations. The power value can then be obtained rapidly by processing the diffraction image with a computer. The results of this work show that the optical method offers advantages that include accuracy, speed, and noncontact measurement.
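The image-processing chain described (filtering, binarization, intensity extraction) might look roughly like the NumPy sketch below. The array sizes, threshold rule, and the synthetic "diffraction spot" are assumptions for illustration, not the paper's software, and the contour-extraction stage is omitted.

```python
import numpy as np

# Sketch of the intensity-extraction stage: smooth the diffraction image,
# binarize it to locate the bright diffraction orders, and sum the light
# intensity inside the bright regions. (Hypothetical stand-in only.)
rng = np.random.default_rng(1)

img = rng.normal(10.0, 2.0, size=(64, 64))   # noisy background
img[20:25, 30:35] += 200.0                   # one bright diffraction spot

# Filtering: 3x3 mean blur built from shifted sums (pure NumPy)
pad = np.pad(img, 1, mode="edge")
blur = sum(pad[i:i + 64, j:j + 64] for i in range(3) for j in range(3)) / 9.0

# Binarization: threshold well above the background level
mask = blur > blur.mean() + 5.0 * blur.std()
spot_intensity = img[mask].sum()             # light-intensity information
print(mask.sum(), spot_intensity)
```

The extracted intensity would then be mapped to acoustic power via the calibration relating diffraction-order intensity to ultrasonic power, which is the physics step this sketch does not attempt.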
Luo, Xiongbiao; Mori, Kensaku
2014-06-01
Endoscope 3-D motion tracking, which seeks to synchronize pre- and intra-operative images in endoscopic interventions, is usually performed as video-volume registration that optimizes the similarity between endoscopic video and pre-operative images. The tracking performance, in turn, depends significantly on whether a similarity measure can successfully characterize the difference between video sequences and volume rendering images driven by pre-operative images. The paper proposes a discriminative structural similarity measure, which uses the degradation of structural information and takes image correlation or structure, luminance, and contrast into consideration, to boost video-volume registration. Applied to endoscope tracking, the proposed similarity measure was demonstrated to be more accurate and robust than several available similarity measures, e.g., local normalized cross correlation, normalized mutual information, modified mean square error, and normalized sum of squared differences. On clinical data, the tracking error was reduced significantly, from at least 14.6 mm to 4.5 mm, and processing was accelerated to more than 30 frames per second using a graphics processing unit.
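For reference, the structural-similarity family that such a measure builds on combines luminance, contrast, and structure (correlation) terms. Below is plain global SSIM (Wang et al., 2004) as a sketch of that family, not the authors' discriminative variant; the constants k1 and k2 are the conventional defaults.

```python
import numpy as np

# Global structural similarity (SSIM) combining luminance, contrast, and
# structure terms. Real registration pipelines use windowed SSIM; this
# single-window version is for illustration only.
def ssim(x, y, data_range=1.0, k1=0.01, k2=0.03):
    c1, c2 = (k1 * data_range) ** 2, (k2 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / (
        (mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

rng = np.random.default_rng(2)
frame = rng.random((32, 32))       # stand-in for a video frame
print(ssim(frame, frame))          # identical images score 1.0
print(ssim(frame, 1.0 - frame))    # inverted structure scores low
```

In video-volume registration the second argument would be a volume-rendered image generated from the pre-operative data, and the pose maximizing the similarity is taken as the endoscope's position.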
NASA Technical Reports Server (NTRS)
Britt, C. L., Jr.
1975-01-01
The development of an RF Multilateration system to provide accurate position and velocity measurements during the approach and landing phase of Vertical Takeoff Aircraft operation is discussed. The system uses an angle-modulated ranging signal to provide both range and range rate measurements between an aircraft transponder and multiple ground stations. Range and range rate measurements are converted to coordinate measurements and the coordinate and coordinate rate information is transmitted by an integral data link to the aircraft. Data processing techniques are analyzed to show advantages and disadvantages. Error analyses are provided to permit a comparison of the various techniques.
Frenkel, M; Chirico, R D; Diky, V; Muzny, C; Dong, Q; Marsh, K N; Dymond, J H; Wakeham, W A; Stein, S E; Königsberger, E; Goodwin, A R H; Magee, J W; Thijssen, M; Haynes, W M; Watanasiri, S; Satyro, M; Schmidt, M; Johns, A I; Hardin, G R
2006-01-01
Thermodynamic data are a key resource in the search for new relationships between properties of chemical systems that constitutes the basis of the scientific discovery process. In addition, thermodynamic information is critical for development and improvement of all chemical process technologies. Historically, peer-reviewed journals are the major source of this information obtained by experimental measurement or prediction. Technological advances in measurement science have propelled enormous growth in the scale of published thermodynamic data (almost doubling every 10 years). This expansion has created new challenges in data validation at all stages of the data delivery process. Despite the peer-review process, problems in data validation have led, in many instances, to publication of data that are grossly erroneous and, at times, inconsistent with the fundamental laws of nature. This article describes a new global data communication process in thermodynamics and its impact in addressing these challenges as well as in streamlining the delivery of the thermodynamic data from "data producers" to "data users". We believe that the prolific growth of scientific data in numerous and diverse fields outside thermodynamics, together with the demonstrated effectiveness and versatility of the process described in this article, will foster development of such processes in other scientific fields.
Frisch, Simon; Dshemuchadse, Maja; Görner, Max; Goschke, Thomas; Scherbaum, Stefan
2015-11-01
Selective attention biases information processing toward stimuli that are relevant for achieving our goals. However, the nature of this bias is under debate: Does it solely rely on the amplification of goal-relevant information or is there a need for additional inhibitory processes that selectively suppress currently distracting information? Here, we explored the processes underlying selective attention with a dynamic, modeling-based approach that focuses on the continuous evolution of behavior over time. We present two dynamic neural field models incorporating the diverging theoretical assumptions. Simulations with both models showed that they make similar predictions with regard to response times but differ markedly with regard to their continuous behavior. Human data observed via mouse tracking as a continuous measure of performance revealed evidence for the model solely based on amplification but no indication of persisting selective distracter inhibition.
De Martino, Federico; Moerel, Michelle; Ugurbil, Kamil; Goebel, Rainer; Yacoub, Essa; Formisano, Elia
2015-12-29
Columnar arrangements of neurons with similar preference have been suggested as the fundamental processing units of the cerebral cortex. Within these columnar arrangements, feed-forward information enters at middle cortical layers whereas feedback information arrives at superficial and deep layers. This interplay of feed-forward and feedback processing is at the core of perception and behavior. Here we provide in vivo evidence consistent with a columnar organization of the processing of sound frequency in the human auditory cortex. We measure submillimeter functional responses to sound frequency sweeps at high magnetic fields (7 tesla) and show that frequency preference is stable through cortical depth in primary auditory cortex. Furthermore, we demonstrate that, in this highly columnar cortex, task demands sharpen the frequency tuning in superficial cortical layers more than in middle or deep layers. These findings are pivotal to understanding mechanisms of neural information processing and flow during the active perception of sounds.
Quality and efficiency successes leveraging IT and new processes.
Chaiken, Barry P; Christian, Charles E; Johnson, Liz
2007-01-01
Today, healthcare annually invests billions of dollars in information technology, including clinical systems, electronic medical records and interoperability platforms. While continued investment and parallel development of standards are critical to secure exponential benefits from clinical information technology, intelligent and creative redesign of processes through path innovation is necessary to deliver meaningful value. Reports from two organizations included in this report review the steps taken to reinvent clinical processes that best leverage information technology to deliver safer and more efficient care. Good Samaritan Hospital, Vincennes, Indiana, implemented electronic charting, point-of-care bar coding of medications prior to administration, and integrated clinical documentation for nursing, laboratory, radiology and pharmacy. Tenet Healthcare, during its implementation and deployment of multiple clinical systems across several hospitals, focused on planning that included team-based process redesign. In addition, Tenet constructed valuable and measurable metrics that link outcomes with its strategic goals.
Digital image processing of vascular angiograms
NASA Technical Reports Server (NTRS)
Selzer, R. H.; Blankenhorn, D. H.; Beckenbach, E. S.; Crawford, D. W.; Brooks, S. H.
1975-01-01
A computer image processing technique was developed to estimate the degree of atherosclerosis in the human femoral artery. With an angiographic film of the vessel as input, the computer was programmed to estimate vessel abnormality through a series of measurements, some derived primarily from the vessel edge information and others from optical density variations within the lumen shadow. These measurements were combined into an atherosclerosis index, which was found to correlate well with both visual and chemical estimates of atherosclerotic disease.
Levesque, Jean-Frederic; Sutherland, Kim
2017-01-01
Objective Across healthcare systems, there is consensus on the need for independent and impartial assessment of performance. There is less agreement about how measurement and reporting of performance improves healthcare. This paper draws on academic theories to develop a conceptual framework, one that classifies in an integrated manner the ways in which change can be leveraged by healthcare performance information. Methods A synthesis of published frameworks. Results The framework identifies eight levers for change enabled by performance information, spanning internal and external drivers, and emergent and planned processes: (1) cognitive levers provide awareness and understanding; (2) mimetic levers inform about the performance of others to encourage emulation; (3) supportive levers provide facilitation, implementation tools or models of care to actively support change; (4) formative levers develop capabilities and skills through teaching, mentoring and feedback; (5) normative levers set performance against guidelines, standards, certification and accreditation processes; (6) coercive levers use policies, regulations, incentives and disincentives to force change; (7) structural levers modify the physical environment or professional cultures and routines; (8) competitive levers attract patients or funders. Conclusion This framework highlights how performance measurement and reporting can contribute to eight different levers for change. It provides guidance on how to align performance measurement and reporting with quality improvement programmes. PMID:28851769
Frequency domain laser velocimeter signal processor
NASA Technical Reports Server (NTRS)
Meyers, James F.; Murphy, R. Jay
1991-01-01
A new scheme for processing signals from laser velocimeter systems is described. The technique utilizes the capabilities of advanced digital electronics to yield a signal processor capable of operating in the frequency domain maximizing the information obtainable from each signal burst. This allows a sophisticated approach to signal detection and processing, with a more accurate measurement of the chirp frequency resulting in an eight-fold increase in measurable signals over the present high-speed burst counter technology. Further, the required signal-to-noise ratio is reduced by a factor of 32, allowing measurements within boundary layers of wind tunnel models. Measurement accuracy is also increased up to a factor of five.
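The frequency-domain idea, estimating a burst's chirp/Doppler frequency from its spectrum rather than with a high-speed burst counter, can be sketched as follows; the sample rate, burst envelope, and noise level are invented for illustration and are not the parameters of the processor described above.

```python
import numpy as np

# Frequency-domain burst processing in miniature: FFT a simulated laser
# velocimeter burst (Gaussian-windowed Doppler tone in noise) and take
# the spectral peak as the frequency estimate. (Illustrative sketch.)
fs = 100e6                         # sample rate, Hz (assumed)
n = 1024
t = np.arange(n) / fs
f_doppler = 12.5e6                 # true Doppler frequency, Hz (assumed)

rng = np.random.default_rng(3)
envelope = np.exp(-((t - t[n // 2]) / 2e-6) ** 2)     # burst envelope
burst = envelope * np.cos(2 * np.pi * f_doppler * t)
burst += rng.normal(0.0, 0.05, n)                      # detector noise

spectrum = np.abs(np.fft.rfft(burst))
freqs = np.fft.rfftfreq(n, 1.0 / fs)
f_est = freqs[spectrum.argmax()]   # Doppler frequency estimate
print(f_est)
```

Because the whole burst contributes to the spectral peak, this approach tolerates far lower signal-to-noise ratios than counting zero crossings, which is the advantage the abstract quantifies.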
ERIC Educational Resources Information Center
Todd, Juanita; Finch, Brayden; Smith, Ellen; Budd, Timothy W.; Schall, Ulrich
2011-01-01
Temporal and spectral sound information is processed asymmetrically in the brain with the left-hemisphere showing an advantage for processing the former and the right-hemisphere for the latter. Using monaural sound presentation we demonstrate a context and ability dependent ear-asymmetry in brain measures of temporal change detection. Our measure…
Wind speed vector restoration algorithm
NASA Astrophysics Data System (ADS)
Baranov, Nikolay; Petrov, Gleb; Shiriaev, Ilia
2018-04-01
Impulse wind lidar (IWL) signal processing software developed by JSC «BANS» recovers the full wind speed vector from its radial projections and provides wind parameter information up to a distance of 2 km. Signal processing techniques for increasing the accuracy and speed of the wind parameter calculation have been studied in this research. Measurement results of the IWL and of a continuous scanning lidar were compared. In addition, IWL data processing modeling results have been analyzed.
ERIC Educational Resources Information Center
Kentucky State Dept. of Libraries, Frankfort.
This document is the beginning of a process. The objects of the process are to improve decisions between alternate choices in the development of statewide library services. Secondary functions are to develop the tools for providing information relevant to decisions, to measure and monitor services, and to aid in the communication process. The…
Pilot evaluation of a method to assess prescribers' information processing of medication alerts.
Russ, Alissa L; Melton, Brittany L; Daggy, Joanne K; Saleem, Jason J
2017-02-01
Prescribers commonly receive alerts during medication ordering. Prescribers work in a complex, time-pressured environment; to enhance the effectiveness of safety alerts, the effort needed to cognitively process these alerts should be minimized. Methods to evaluate the extent to which computerized alerts support prescribers' information processing are lacking. To develop a methodological protocol to assess the extent to which alerts support prescribers' information processing at-a-glance; specifically, the incorporation of information into their working memory. We hypothesized that the method would be feasible and that we would be able to detect a significant difference in prescribers' information processing with a revised alert display that incorporates warning design guidelines compared to the original alert display. A counterbalanced, within-subject study was conducted with 20 prescribers in a human-computer interaction laboratory. We tested a single alert that was displayed in two different ways. Prescribers were informed that an alert would appear for 10s. After the alert was shown, a white screen was displayed, and prescribers were asked to verbally describe what they saw; indicate how many total warnings; and describe anything else they remembered about the alert. We measured information processing via the accuracy of prescribers' free recall and their ability to identify that three warning messages were present. Two analysts independently evaluated participants' responses against a comprehensive catalog of alert elements and then discussed discrepancies until reaching consensus. This feasibility study demonstrated that the method seemed to be effective for evaluating prescribers' information processing of medication alert displays. With this method, we were able to detect significant differences in prescribers' recall of alert information. 
The proportion of total data elements that prescribers were able to accurately recall was significantly greater for the revised versus original alert display (p=0.006). With the revised display, more prescribers accurately reported that three warnings were shown (p=0.002). The methodological protocol was feasible for evaluating the alert display and yielded important findings on prescribers' information processing. Study methods supplement traditional usability evaluation methods and may be useful for evaluating information processing of other healthcare technologies. Published by Elsevier Inc.
Large deviation analysis of a simple information engine
NASA Astrophysics Data System (ADS)
Maitland, Michael; Grosskinsky, Stefan; Harris, Rosemary J.
2015-11-01
Information thermodynamics provides a framework for studying the effect of feedback loops on entropy production. It has enabled the understanding of novel thermodynamic systems such as the information engine, which can be seen as a modern version of "Maxwell's Dæmon," whereby a feedback controller processes information gained by measurements in order to extract work. Here, we analyze a simple model of such an engine that uses feedback control based on measurements to obtain negative entropy production. We focus on the distribution and fluctuations of the information obtained by the feedback controller. Significantly, our model allows an analytic treatment for a two-state system with exact calculation of the large deviation rate function. These results suggest an approximate technique for larger systems, which is corroborated by simulation data.
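A concrete instance of "information obtained by the feedback controller": for an unbiased two-state system measured with symmetric error rate eps, the average mutual information per measurement (in nats) is I = ln 2 + eps ln(eps) + (1 - eps) ln(1 - eps). This is the standard identity for a binary symmetric channel, shown here as a sketch of the information term entering the generalized second law, not the paper's large-deviation calculation.

```python
import numpy as np

# Mutual information (nats) gained by a feedback controller making a
# symmetric binary measurement with error probability eps on an unbiased
# two-state system: I = H(Y) - H(Y|X) = ln 2 - H_b(eps).
def measurement_information(eps):
    if eps in (0.0, 1.0):
        return np.log(2.0)          # binary entropy term vanishes
    return np.log(2.0) + eps * np.log(eps) + (1 - eps) * np.log(1 - eps)

print(measurement_information(0.0))   # perfect measurement: ln 2 nats
print(measurement_information(0.5))   # pure noise: no information gained
```

In an information engine of this kind, this average information bounds the extractable work per cycle (at most k_B T times I), which is why the distribution and fluctuations of the measured information matter.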
Total quality management - It works for aerospace information services
NASA Technical Reports Server (NTRS)
Erwin, James; Eberline, Carl; Colquitt, Wanda
1993-01-01
Today we are in the midst of information and 'total quality' revolutions. At the NASA STI Program's Center for AeroSpace Information (CASI), we are focused on using continuous improvement techniques to enrich today's services and products and to ensure that tomorrow's technology supports the TQM-based improvement of future STI program products and services. The Continuous Improvements Program at CASI is the foundation for Total Quality Management in products and services. The focus is customer-driven; its goal is to identify processes and procedures that can be improved and new technologies that can be integrated with those processes to gain efficiencies, provide effectiveness, and promote customer satisfaction. The Program seeks to establish quality through an iterative defect-prevention approach based on the incorporation of standards and measurements into the processing cycle.
Effects of methylphenidate on working memory components: influence of measurement.
Bedard, Anne-Claude; Jain, Umesh; Johnson, Sheilah Hogg; Tannock, Rosemary
2007-09-01
To investigate the effects of methylphenidate (MPH) on components of working memory (WM) in attention-deficit hyperactivity disorder (ADHD) and determine the responsiveness of WM measures to MPH. Participants were a clinical sample of 50 children and adolescents with ADHD, aged 6 to 16 years old, who participated in an acute randomized, double-blind, placebo-controlled, crossover trial with single challenges of three MPH doses. Four components of WM were investigated, which varied in processing demands (storage versus manipulation of information) and modality (auditory-verbal; visual-spatial), each of which was indexed by a minimum of two separate measures. MPH improved the ability to store visual-spatial information irrespective of instrument used, but had no effects on the storage of auditory-verbal information. By contrast, MPH enhanced the ability to manipulate both auditory-verbal and visual-spatial information, although effects were instrument specific in both cases. MPH effects on WM are selective: they vary as a function of WM component and measurement.
[Supply services at health facilities: measuring performance].
Dacosta Claro, I
2001-01-01
Performance measurement, in its different senses (balanced scorecard or output measurement), has become an essential tool in today's World-Class organizations for improving service quality and reducing costs. This paper presents a performance measurement system for the hospital supply chain. The system is organized into levels and groups of indicators in order to give a hierarchical, coherent and integrated view of the processes. Thus, supply services performance is measured according to (1) financial aspects, (2) customer satisfaction aspects and (3) internal aspects of the processes performed. Since the informational needs of managers vary within the administrative structure, the performance measurement system is defined at three hierarchical levels: first, the whole supply chain, with its interrelated activities; second, the three main processes of the chain (physical management of products, purchasing and negotiation, and the local storage units); and finally, the performance of each activity involved. The system and the indicators were evaluated with the participation of 17 health services of Quebec (Canada); however, given the operational similarities, the system could equally be implemented in Spanish hospitals.
USDA-ARS?s Scientific Manuscript database
Process-based modeling provides detailed spatial and temporal information of the soil environment in the shallow seedling recruitment zone across field topography where measurements of soil temperature and water may not sufficiently describe the zone. Hourly temperature and water profiles within the...
Collecting and Using Networked Statistics: Current Status, Future Goals
ERIC Educational Resources Information Center
Hiott, Judith
2004-01-01
For more than five years the Houston Public Library has collected statistics for measuring networked collections and services based on emerging guidelines. While the guidelines have provided authority and stability to the process, the clarification process continues. The development of information discovery software, such as federated search tools…
Teacher Rated Empathic Behaviors and Children's TAT Stories.
ERIC Educational Resources Information Center
Locraft, Constance; Teglasi, Hedwig
1997-01-01
Focuses on the emotional/cognitive processes associated with teacher ratings of empathic and socially competent behaviors. Results indicate that children (N=120) with higher empathy ratings and at higher grade levels received higher scores on a storytelling measure--the Thematic Apperception Test. Clarifies the social information processing of…
Optimal protocol for maximum work extraction in a feedback process with a time-varying potential
NASA Astrophysics Data System (ADS)
Kwon, Chulan
2017-12-01
The nonequilibrium nature of information thermodynamics is characterized by the inequality or non-negativity of the total entropy change of the system, memory, and reservoir. Mutual information change plays a crucial role in the inequality, in particular if work is extracted and the paradox of Maxwell's demon is raised. We consider the Brownian information engine where the protocol set of the harmonic potential is initially chosen by the measurement and varies in time. We confirm the inequality of the total entropy change by calculating, in detail, the entropic terms including the mutual information change. We rigorously find the optimal values of the time-dependent protocol for maximum extraction of work both for the finite-time and the quasi-static process.
Hynes, Denise M.; Perrin, Ruth A.; Rappaport, Steven; Stevens, Joanne M.; Demakis, John G.
2004-01-01
Information systems are increasingly important for measuring and improving health care quality. A number of integrated health care delivery systems use advanced information systems and integrated decision support to carry out quality assurance activities, but none as large as the Veterans Health Administration (VHA). The VHA's Quality Enhancement Research Initiative (QUERI) is a large-scale, multidisciplinary quality improvement initiative designed to ensure excellence in all areas where VHA provides health care services, including inpatient, outpatient, and long-term care settings. In this paper, we describe the role of information systems in the VHA QUERI process, highlight the major information systems critical to this quality improvement process, and discuss issues associated with the use of these systems. PMID:15187063
IFKIS a basis for organizational measures in avalanche risk management
NASA Astrophysics Data System (ADS)
Bründl, M.; Etter, H.-J.; Klingler, Ch.; Steiniger, M.; Rhyner, J.; Ammann, W.
2003-04-01
The avalanche winter of 1999 in Switzerland showed that the combination of protection measures such as avalanche barriers, hazard-zone mapping and artificial avalanche release with organisational measures (closure of roads, evacuation, etc.) performed well. However, education as well as information and communication between the organizations involved proved to be a weak link in crisis management. In the first part of the IFKIS project we developed a modular education and training course program for those responsible for the safety of settlements and roads. In the second part an information system was developed that improves, on the one hand, the flow of information between the national center for avalanche forecasting, the Swiss Federal Institute for Snow and Avalanche Research SLF, and the local forecasters and, on the other hand, the communication between the avalanche safety services in the communities. During the last two years an information system based on Internet technology has been developed for this purpose. This system allows the transmission of measured data and observations to a central database at SLF and the visualization of the data for different users. It also provides the possibility to exchange information on organizational measures such as road closures and artificial avalanche release on a local and regional scale. This improves the flow of information and the coordination of safety measures because all users, although at different places, are at the same information level. Inconsistent safety measures can be avoided, and information and communication concerning avalanche safety become much more transparent for all persons involved in hazard management. The training program as well as the concept for the information system are important foundations for efficient avalanche risk management, and also for other natural processes and catastrophes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brush, Adrian; Masanet, Eric; Worrell, Ernst
The U.S. dairy processing industry, defined in this Energy Guide as facilities engaged in the conversion of raw milk to consumable dairy products, consumes around $1.5 billion worth of purchased fuels and electricity per year. Energy efficiency improvement is an important way to reduce these costs and to increase predictable earnings, especially in times of high energy price volatility. There are a variety of opportunities available at individual plants in the U.S. dairy processing industry to reduce energy consumption and greenhouse gas emissions in a cost-effective manner. This Energy Guide discusses energy efficiency practices and energy-efficient technologies that can be implemented at the component, process, facility, and organizational levels. A discussion of the trends, structure, and energy consumption characteristics of the U.S. dairy processing industry is provided along with a description of the major process technologies used within the industry. Next, a wide variety of energy efficiency measures applicable to dairy processing plants are described. Many measure descriptions include expected savings in energy and energy-related costs, based on case study data from real-world applications in dairy processing facilities and related industries worldwide. Typical measure payback periods and references to further information in the technical literature are also provided, when available. Given the importance of water in dairy processing, a summary of basic, proven measures for improving water efficiency is also provided. The information in this Energy Guide is intended to help energy and plant managers in the U.S. dairy processing industry reduce energy and water consumption in a cost-effective manner while maintaining the quality of products manufactured. Further research on the economics of all measures, as well as on their applicability to different production practices, is needed to assess their cost effectiveness at individual plants.
Lucassen, Peter
2007-06-01
In the language and logic of the free market, providers of health care will have to demonstrate the quality of their work. However, in this setting quality is only interpreted in quantitative ways and consequently does not necessarily do justice to good physicians. Moreover, both outcome measures and process measures have serious drawbacks. An emphasis on outcome measures will disadvantage physicians working in deprived areas and doctors managing more complicated cases. Although process measures give the most direct information on the physician's performance, their evidence base is not always as straightforward as commonly supposed. Finally, measurement of quality indicators is complicated and time consuming. Physicians should be aware of the drawbacks of quality measurement and of the poor effects of quality improvement strategies on patient outcomes.
An Adaptive Kalman Filter using a Simple Residual Tuning Method
NASA Technical Reports Server (NTRS)
Harman, Richard R.
1999-01-01
One difficulty in using Kalman filters in real world situations is the selection of the correct process noise, measurement noise, and initial state estimate and covariance. These parameters are commonly referred to as tuning parameters. Multiple methods have been developed to estimate these parameters. Most of those methods such as maximum likelihood, subspace, and observer Kalman Identification require extensive offline processing and are not suitable for real time processing. One technique, which is suitable for real time processing, is the residual tuning method. Any mismodeling of the filter tuning parameters will result in a non-white sequence for the filter measurement residuals. The residual tuning technique uses this information to estimate corrections to those tuning parameters. The actual implementation results in a set of sequential equations that run in parallel with the Kalman filter. Equations for the estimation of the measurement noise have also been developed. These algorithms are used to estimate the process noise and measurement noise for the Wide Field Infrared Explorer star tracker and gyro.
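The innovation-based idea behind residual tuning can be illustrated with a toy filter. The sketch below is not the algorithm used for the WIRE star tracker and gyros; it is a minimal, hypothetical 1-D example that exploits the identity relating the innovation variance S to the predicted covariance and the measurement noise, S = P_pred + R, and re-estimates R from the observed residuals over a few passes (all names and noise values are illustrative):

```python
import random

def run_filter(zs, q, r):
    """1-D random-walk Kalman filter; returns the final state estimate,
    the innovations, and the predicted covariances seen along the way."""
    x, p = 0.0, 1.0
    innovations, p_preds = [], []
    for z in zs:
        p_pred = p + q                # predict: x_k = x_{k-1} + w, Var(w) = q
        d = z - x                     # innovation (measurement residual)
        k = p_pred / (p_pred + r)     # Kalman gain under the assumed R
        x += k * d                    # update state estimate
        p = (1 - k) * p_pred          # update covariance
        innovations.append(d)
        p_preds.append(p_pred)
    return x, innovations, p_preds

def estimate_r(zs, q, r_guess, passes=3):
    """Residual tuning: for a well-tuned filter Var(d) = P_pred + R,
    so re-estimate R as mean(d^2) - mean(P_pred) and iterate."""
    r = r_guess
    for _ in range(passes):
        _, ds, ps = run_filter(zs, q, r)
        s_hat = sum(d * d for d in ds) / len(ds)
        r = max(s_hat - sum(ps) / len(ps), 1e-9)
    return r

# Simulate a random walk (q = 0.01) observed with measurement noise R = 1
random.seed(1)
q, true_r, x = 0.01, 1.0, 0.0
zs = []
for _ in range(20000):
    x += random.gauss(0.0, q ** 0.5)
    zs.append(x + random.gauss(0.0, true_r ** 0.5))
print(round(estimate_r(zs, q, r_guess=0.1), 1))  # recovers roughly 1.0
```

The iteration converges quickly here because a mismodeled R biases the filter's reported covariance, which the next pass corrects.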
Multimedia Approach and Its Effect in Teaching Mathematics for the Prospective Teachers
ERIC Educational Resources Information Center
Joan, D. R. Robert; Denisia, S. P.
2012-01-01
Multimedia improves the effectiveness of the teaching-learning process in formal and informal settings by utilizing scientific principles. It allows learners to sort out information, analyse it, and make meaning of it for conceptualization and application, suiting individual learners. The objective of the study was to measure the…
Measuring spray droplet size from agricultural nozzles using laser diffraction
USDA-ARS?s Scientific Manuscript database
When making an application of any crop protection material such as a herbicide or pesticide, the applicator uses a variety of skills and information to make an application so that the material reaches the target site (i.e. plant). Information critical in this process is the droplet size that a parti...
ERIC Educational Resources Information Center
Bennett, Teresa; Boyle, Michael; Georgiades, Katholiki; Georgiades, Stelios; Thompson, Ann; Duku, Eric; Bryson, Susan; Fombonne, Eric; Vaillancourt, Tracy; Zwaigenbaum, Lonnie; Smith, Isabel; Mirenda, Pat; Roberts, Wendy; Volden, Joanne; Waddell, Charlotte; Szatmari, Peter
2012-01-01
Background: Maximizing measurement accuracy is an important aim in child development assessment and research. Parents are essential informants in the diagnostic process, and past research suggests that certain parental characteristics may influence how they report information about their children. This has not been studied in autism spectrum…
ERIC Educational Resources Information Center
Chinello, Alessandro; Cattani, Veronica; Bonfiglioli, Claudia; Dehaene, Stanislas; Piazza, Manuela
2013-01-01
In the primate brain, sensory information is processed along two partially segregated cortical streams: the ventral stream, mainly coding for objects' shape and identity, and the dorsal stream, mainly coding for objects' quantitative information (including size, number, and spatial position). Neurophysiological measures indicate that such…
Executive functions, information sampling, and decision making in narcolepsy with cataplexy.
Delazer, Margarete; Högl, Birgit; Zamarian, Laura; Wenter, Johanna; Gschliesser, Viola; Ehrmann, Laura; Brandauer, Elisabeth; Cevikkol, Zehra; Frauscher, Birgit
2011-07-01
Narcolepsy with cataplexy (NC) affects neurotransmitter systems regulating emotions and cognitive functions. This study aimed to assess executive functions, information sampling, reward processing, and decision making in NC. Twenty-one NC patients and 58 healthy participants performed an extensive neuropsychological test battery. NC patients performed comparably to controls in executive function tasks assessing set shifting, reversal learning, working memory, and planning. Group differences appeared in a task measuring information sampling and reward sensitivity. NC patients gathered less information, tolerated a higher level of uncertainty, and were less influenced by reward contingencies than controls. NC patients also showed reduced learning in decision making and had significantly lower scores than controls in the fifth block of the Iowa Gambling Task. No correlations were found with measures of sleepiness. NC patients may achieve high performance in several neuropsychological domains, including executive functions. Specific differences between NC patients and controls highlight the importance of the hypocretin system in reward processing and decision making and are in line with previous neuroimaging and neurophysiological studies. PsycINFO Database Record (c) 2011 APA, all rights reserved.
Chen, Yun; Yang, Hui
2016-01-01
In the era of big data, there is increasing interest in clustering variables to minimize data redundancy and maximize variable relevancy. Existing clustering methods, however, depend on nontrivial assumptions about the data structure. Note that nonlinear interdependence among variables poses significant challenges to the traditional framework of predictive modeling. In the present work, we reformulate the problem of variable clustering from an information-theoretic perspective that does not require assumptions about the data structure for the identification of nonlinear interdependence among variables. Specifically, we propose the use of mutual information to characterize and measure nonlinear correlation structures among variables. Further, we develop Dirichlet process (DP) models to cluster variables based on the mutual-information measures among variables. Finally, orthonormalized variables in each cluster are integrated with a group elastic-net model to improve the performance of predictive modeling. Both simulation and real-world case studies showed that the proposed methodology not only effectively reveals the nonlinear interdependence structures among variables but also outperforms traditional variable clustering algorithms such as hierarchical clustering. PMID:27966581
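The mutual-information ingredient of this approach is easy to sketch. The plug-in estimator below is a simplified stand-in (not the authors' Dirichlet-process clustering pipeline); it shows why mutual information captures nonlinear interdependence that linear correlation misses: for y = x² on symmetric data, the linear correlation is zero while I(X;Y) is large.

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        # c*n / (count_x * count_y) equals p(x,y) / (p(x) p(y))
        mi += (c / n) * log2(c * n / (px[x] * py[y]))
    return mi

# A nonlinear dependence invisible to correlation: y = x^2 on {-1, 0, 1}
xs = [-1, 0, 1] * 200
ys = [x * x for x in xs]
print(round(mutual_information(xs, ys), 3))  # → 0.918 bits (= H(Y))
```

Because Y is a deterministic function of X here, I(X;Y) equals H(Y) = H(2/3, 1/3) ≈ 0.918 bits, even though the sample covariance of xs and ys is exactly zero.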
Average fidelity between random quantum states
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zyczkowski, Karol; Centrum Fizyki Teoretycznej, Polska Akademia Nauk, Aleja Lotnikow 32/44, 02-668 Warsaw; Perimeter Institute, Waterloo, Ontario, N2L 2Y5
2005-03-01
We analyze mean fidelity between random density matrices of size N, generated with respect to various probability measures in the space of mixed quantum states: the Hilbert-Schmidt measure, the Bures (statistical) measure, the measure induced by the partial trace, and the natural measure on the space of pure states. In certain cases explicit probability distributions for the fidelity are derived. The results obtained may be used to gauge the quality of quantum-information-processing schemes.
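For the pure-state case mentioned in the abstract, fidelity reduces to the squared overlap F = |⟨ψ|φ⟩|², and the average of F between two independent Haar-random pure states in dimension N is known to be 1/N. A small Monte Carlo sketch of that one case (illustrative, not the paper's derivation; mixed-state measures such as Hilbert-Schmidt and Bures are not treated here):

```python
import random

def random_pure_state(n):
    """Haar-random pure state: a normalized vector of complex Gaussians."""
    v = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
    norm = sum(abs(a) ** 2 for a in v) ** 0.5
    return [a / norm for a in v]

def fidelity(psi, phi):
    """For pure states, fidelity is the squared overlap |<psi|phi>|^2."""
    return abs(sum(a.conjugate() * b for a, b in zip(psi, phi))) ** 2

random.seed(0)
N, trials = 4, 20000
mean_f = sum(fidelity(random_pure_state(N), random_pure_state(N))
             for _ in range(trials)) / trials
print(round(mean_f, 2))  # the Haar average is 1/N = 0.25
```

Normalizing a vector of independent complex Gaussians is the standard way to sample the Haar (natural) measure on pure states.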
A Concealed Information Test with multimodal measurement.
Ambach, Wolfgang; Bursch, Stephanie; Stark, Rudolf; Vaitl, Dieter
2010-03-01
A Concealed Information Test (CIT) investigates differential physiological responses to deed-related (probe) vs. irrelevant items. The present study focused on the detection of concealed information using simultaneous recordings of autonomic and brain electrical measures. As a secondary issue, verbal and pictorial presentations were compared with respect to their influence on the recorded measures. Thirty-one participants underwent a mock-crime scenario with a combined verbal and pictorial presentation of nine items. The subsequent CIT, designed with respect to event-related potential (ERP) measurement, used a 3-3.5 s interstimulus interval. The item presentation modality, i.e. pictures or written words, was varied between subjects; no response was required from the participants. In addition to the electroencephalogram (EEG), electrodermal activity (EDA), electrocardiogram (ECG), respiratory activity, and finger plethysmogram were recorded. A significant probe-vs.-irrelevant effect was found for each of the measures. Compared to sole ERP measurement, the combination of ERP and EDA yielded incremental information for detecting concealed information, although EDA per se did not reach the predictive value known from studies primarily designed for peripheral physiological measurement. Presentation modality influenced the detection accuracy of neither the autonomic nor the EEG measures; this underpins the equivalence of verbal and pictorial item presentation in a CIT, regardless of the physiological measures recorded. Future studies should further clarify whether the incremental validity observed in the present study reflects a differential sensitivity of ERP and EDA to different sub-processes in a CIT. Copyright 2009 Elsevier B.V. All rights reserved.
De Paúl, Joaquín; Asla, Nagore; Pérez-Albéniz, Alicia; de Cádiz, Bárbara Torres-Gómez
2006-08-01
The objective is to determine whether mothers at high risk for child physical abuse differ in their evaluations, attributions, negative affect, disciplinary choices for children's behavior, and expectations of compliance. The effects of a stressor and of the introduction of mitigating information are analyzed. Forty-seven high-risk and 48 matched low-risk mothers participated in the study. Mothers' information processing and disciplinary choices were examined using six vignettes depicting a child engaging in different transgressions. A four-factor design with repeated measures on the last two factors was used. High-risk mothers reported more hostile intent, global and internal attributions, more use of power assertion discipline, and less induction. A risk group by child transgression interaction and a risk group by mitigating information interaction were found. Results support the social information-processing model of child physical abuse, which suggests that high-risk mothers process child-related information differently and use more power assertive and less inductive disciplinary techniques.
Multimodal processing of emotional information in 9-month-old infants I: emotional faces and voices.
Otte, R A; Donkers, F C L; Braeken, M A K A; Van den Bergh, B R H
2015-04-01
Making sense of emotions manifesting in human voice is an important social skill which is influenced by emotions in other modalities, such as that of the corresponding face. Although processing emotional information from voices and faces simultaneously has been studied in adults, little is known about the neural mechanisms underlying the development of this ability in infancy. Here we investigated multimodal processing of fearful and happy face/voice pairs using event-related potential (ERP) measures in a group of 84 9-month-olds. Infants were presented with emotional vocalisations (fearful/happy) preceded by the same or a different facial expression (fearful/happy). The ERP data revealed that the processing of emotional information appearing in human voice was modulated by the emotional expression appearing on the corresponding face: Infants responded with larger auditory ERPs after fearful compared to happy facial primes. This finding suggests that infants dedicate more processing capacities to potentially threatening than to non-threatening stimuli. Copyright © 2014 Elsevier Inc. All rights reserved.
A resource for assessing information processing in the developing brain using EEG and eye tracking
Langer, Nicolas; Ho, Erica J.; Alexander, Lindsay M.; Xu, Helen Y.; Jozanovic, Renee K.; Henin, Simon; Petroni, Agustin; Cohen, Samantha; Marcelle, Enitan T.; Parra, Lucas C.; Milham, Michael P.; Kelly, Simon P.
2017-01-01
We present a dataset combining electrophysiology and eye tracking intended as a resource for the investigation of information processing in the developing brain. The dataset includes high-density task-based and task-free EEG, eye tracking, and cognitive and behavioral data collected from 126 individuals (ages: 6–44). The task battery spans both the simple/complex and passive/active dimensions to cover a range of approaches prevalent in modern cognitive neuroscience. The active task paradigms facilitate principled deconstruction of core components of task performance in the developing brain, whereas the passive paradigms permit the examination of intrinsic functional network activity during varying amounts of external stimulation. Alongside these neurophysiological data, we include an abbreviated cognitive test battery and questionnaire-based measures of psychiatric functioning. We hope that this dataset will lead to the development of novel assays of neural processes fundamental to information processing, which can be used to index healthy brain development as well as detect pathologic processes. PMID:28398357
Partial information decomposition as a spatiotemporal filter.
Flecker, Benjamin; Alford, Wesley; Beggs, John M; Williams, Paul L; Beer, Randall D
2011-09-01
Understanding the mechanisms of distributed computation in cellular automata requires techniques for characterizing the emergent structures that underlie information processing in such systems. Recently, techniques from information theory have been brought to bear on this problem. Building on this work, we utilize the new technique of partial information decomposition to show that previous information-theoretic measures can confound distinct sources of information. We then propose a new set of filters and demonstrate that they more cleanly separate out the background domains, particles, and collisions that are typically associated with information storage, transfer, and modification in cellular automata.
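A minimal example of why a decomposition is needed: for an XOR relationship, pairwise mutual information attributes zero information to each input even though the pair fully determines the output; partial information decomposition labels that contribution synergy. The sketch below computes only the classical Shannon quantities (not Williams and Beer's redundancy measure itself), on the uniform XOR distribution:

```python
from collections import Counter
from math import log2

def mi(pairs):
    """I(A;B) in bits from a list of (a, b) samples (plug-in estimate)."""
    n = len(pairs)
    pa = Counter(a for a, _ in pairs)
    pb = Counter(b for _, b in pairs)
    pab = Counter(pairs)
    return sum((c / n) * log2(c * n / (pa[a] * pb[b]))
               for (a, b), c in pab.items())

# All four equally likely input pairs of XOR
samples = [(x1, x2, x1 ^ x2) for x1 in (0, 1) for x2 in (0, 1)]
i_x1 = mi([(x1, y) for x1, x2, y in samples])       # input 1 alone
i_x2 = mi([(x2, y) for x1, x2, y in samples])       # input 2 alone
i_joint = mi([((x1, x2), y) for x1, x2, y in samples])  # both together
print(i_x1, i_x2, i_joint)  # 0.0 0.0 1.0 -> the whole bit is synergistic
```

Measures built only from the single-variable terms would see "no information" here, which is exactly the confound the decomposition is designed to separate.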
Sandberg, D A; Lynn, S J; Matorin, A I
2001-07-01
To assess the impact of dissociation on information processing, 66 college women with high and low levels of trait dissociation were studied with regard to how they unitized videotape segments of an acquaintance rape scenario (actual assault not shown) and a nonthreatening control scenario. Unitization is a paradigm that measures how actively people process stimuli by recording how many times they press a button to indicate that they have seen a significant or meaningful event. Trait dissociation was negatively correlated with participants' unitization of the acquaintance rape videotape, unitization was positively correlated with danger cue identification, and state dissociation was negatively correlated with dangerousness ratings.
FPGA-based real time processing of the Plenoptic Wavefront Sensor
NASA Astrophysics Data System (ADS)
Rodríguez-Ramos, L. F.; Marín, Y.; Díaz, J. J.; Piqueras, J.; García-Jiménez, J.; Rodríguez-Ramos, J. M.
The plenoptic wavefront sensor combines measurements at the pupil and image planes to obtain wavefront information from different points of view simultaneously, and is capable of sampling the volume above the telescope to extract tomographic information about the atmospheric turbulence. The advantages of this sensor are presented elsewhere at this conference (José M. Rodríguez-Ramos et al). This paper concentrates on the processing required for pupil-plane phase recovery and its computation in real time using FPGAs (Field Programmable Gate Arrays). This technology eases the implementation of massively parallel processing and allows tailoring the system to the requirements, maintaining flexibility, speed and cost figures.
Controlling quantum memory-assisted entropic uncertainty in non-Markovian environments
NASA Astrophysics Data System (ADS)
Zhang, Yanliang; Fang, Maofa; Kang, Guodong; Zhou, Qingping
2018-03-01
Quantum memory-assisted entropic uncertainty relation (QMA EUR) addresses the fact that the lower bound of Maassen and Uffink's entropic uncertainty relation (without quantum memory) can be broken. In this paper, we investigated the dynamical features of QMA EUR in Markovian and non-Markovian dissipative environments. It is found that the dynamics of QMA EUR are oscillatory in a non-Markovian environment, and that strong interaction is favorable for suppressing the amount of entropic uncertainty. Furthermore, we presented two schemes, by means of prior weak measurement and posterior weak measurement reversal, to control the amount of entropic uncertainty of Pauli observables in dissipative environments. The numerical results show that the prior weak measurement can effectively reduce the wave peak values of the QMA EUR dynamic process in a non-Markovian environment for long periods of time, but it is ineffectual at the wave minima of the dynamic process. The posterior weak measurement reversal, however, has the opposite effect on the dynamic process. Moreover, the success probability depends entirely on the quantum measurement strength. We hope that our proposal could be verified experimentally and might possibly have future applications in quantum information processing.
Industrial Photogrammetry - Accepted Metrology Tool or Exotic Niche
NASA Astrophysics Data System (ADS)
Bösemann, Werner
2016-06-01
New production technologies like 3D printing and other additive manufacturing technologies have changed the industrial manufacturing process, a change often referred to as the next industrial revolution or, in short, Industry 4.0. Such cyber-physical production systems combine the virtual and real worlds through digitization, model building, process simulation and optimization. It is commonly understood that measurement technologies are the key to combining the real and virtual worlds (e.g. [Schmitt 2014]). This change from measurement as a quality control tool to a fully integrated step in the production process has also changed the requirements for 3D metrology solutions. Key words like MAA (Measurement Assisted Assembly) illustrate this new position of metrology in the industrial production process. At the same time it is obvious that these processes not only require more measurements but also systems that deliver the required information in high density in a short time. Here optical solutions, including photogrammetry for 3D measurements, have big advantages over traditional mechanical CMMs. The paper describes the relevance of different photogrammetric solutions, including the state of the art, industry requirements and application examples.
McDonough, Ian M.; Nashiro, Kaoru
2014-01-01
An emerging field of research focused on fluctuations in brain signals has provided evidence that the complexity of those signals, as measured by entropy, conveys important information about network dynamics (e.g., local and distributed processing). While much research has focused on how neural complexity differs in populations with different age groups or clinical disorders, substantially less research has focused on the basic understanding of neural complexity in populations with young and healthy brain states. The present study used resting-state fMRI data from the Human Connectome Project (Van Essen et al., 2013) to test the extent that neural complexity in the BOLD signal, as measured by multiscale entropy (1) would differ from random noise, (2) would differ between four major resting-state networks previously associated with higher-order cognition, and (3) would be associated with the strength and extent of functional connectivity—a complementary method of estimating information processing. We found that complexity in the BOLD signal exhibited different patterns of complexity from white, pink, and red noise and that neural complexity was differentially expressed between resting-state networks, including the default mode, cingulo-opercular, left and right frontoparietal networks. Lastly, neural complexity across all networks was negatively associated with functional connectivity at fine scales, but was positively associated with functional connectivity at coarse scales. The present study is the first to characterize neural complexity in BOLD signals at a high temporal resolution and across different networks and might help clarify the inconsistencies between neural complexity and functional connectivity, thus informing the mechanisms underlying neural complexity. PMID:24959130
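Multiscale entropy, as used here, computes sample entropy on progressively coarse-grained copies of a signal. The following is a bare-bones sketch with illustrative parameter choices (not the study's fMRI pipeline); with the tolerance fixed from the original signal's standard deviation, white noise shows entropy falling as the scale grows, one signature used to distinguish noise-like from structured signals:

```python
import random
from math import log

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): negative log of the conditional probability that
    runs matching for m points (within tolerance r) also match for m+1."""
    def matches(mm):
        templ = [x[i:i + mm] for i in range(len(x) - mm)]
        return sum(1
                   for i in range(len(templ))
                   for j in range(i + 1, len(templ))
                   if max(abs(a - b) for a, b in zip(templ[i], templ[j])) <= r)
    b, a = matches(m), matches(m + 1)
    return -log(a / b)

def coarse_grain(x, scale):
    """Averages over non-overlapping windows of length `scale`."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

random.seed(0)
noise = [random.gauss(0, 1) for _ in range(400)]
# Tolerance held at 0.2 * (sd of the original signal, here 1) at all scales
mse = [sample_entropy(coarse_grain(noise, s), r=0.2) for s in (1, 2, 4)]
print([round(v, 2) for v in mse])  # entropy tends to fall with scale
```

Structured (e.g., pink-noise-like) signals retain variability across scales and so keep higher entropy at coarse scales, which is what makes the multiscale profile informative.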
Code of Federal Regulations, 2010 CFR
2010-10-01
... structural measures; (8) Requests for LOMRs and PMRs based on as-built information for projects for which...) Requests for CLOMRs based on projects involving levees, berms, or other structural measures. (d) If a... PROCESSING MAP CHANGES § 72.3 Fee schedule. (a) For requests for CLOMRs, LOMRs, and PMRs based on structural...
Stress measurements in Kuzbass mines using photoelastic sensors
NASA Astrophysics Data System (ADS)
Schastlivtsev, E.
1996-06-01
Most known measurements of the stressed state in front of development working faces have been carried out with hydraulic sensors, which provide information about the principal stresses without separating them. Moreover, the required pipelines and cumbersome equipment complicate, and sometimes preclude, stress measurements during active mining operations. In our opinion, borehole photoelastic sensors satisfy the conditions for stress measurement in front of mining faces to a high degree. The principal idea of the method is to use the face advance itself to estimate the stress field in its neighborhood. Photoelastic sensors, fixed in boreholes drilled ahead of the active face, react to changes in the stress or deformation field caused by the advancing face. From this information we can judge the distribution of additional stresses in the rock near the face and the stress concentration in front of the face. Using the cavity created by face advance as the disturbing influence, combined with the ability of a ring photoelastic sensor to give the magnitude and direction of the secondary principal stresses, yields a rather simple and low-labor method for investigating the additional stress field in the neighborhood of the working face.
Petzold, Thomas; Hertzschuch, Diana; Elchlep, Frank; Eberlein-Gonska, Maria
2014-01-01
Process management (PM) is a valuable method for the systematic analysis and structural optimisation of the quality and safety of clinical treatment. PM requires high motivation and willingness to implement changes on the part of both employees and management. Definition of quality indicators is required to systematically measure the quality of the specified processes. One way to obtain comparable quality results is to use the quality indicators of the external quality assurance in accordance with Sect. 137 SGB V, a method which the Federal Joint Committee (GBA) and the institutions commissioned by the GBA have employed and consistently enhanced for more than ten years. Information on the quality of inpatient treatment is available for 30 defined subjects throughout Germany. The combination of specified processes with quality indicators is beneficial for informing employees. A process-based indicator dashboard provides essential information about the treatment process, which can be used for process analysis. Through continuous monitoring of these indicator results, deviations can be identified and errors remedied quickly. If due consideration is given to these indicators, they can be used for benchmarking to identify potential process improvements. Copyright © 2014. Published by Elsevier GmbH.
NASA Astrophysics Data System (ADS)
Faes, Luca; Marinazzo, Daniele; Stramaglia, Sebastiano; Jurysta, Fabrice; Porta, Alberto; Nollo, Giandomenico
2016-05-01
This work introduces a framework to study the network formed by the autonomic component of heart rate variability (cardiac process η) and the amplitude of the different electroencephalographic waves (brain processes δ, θ, α, σ, β) during sleep. The framework exploits multivariate linear models to decompose the predictability of any given target process into measures of self-, causal and interaction predictability reflecting respectively the information retained in the process and related to its physiological complexity, the information transferred from the other source processes, and the information modified during the transfer according to redundant or synergistic interaction between the sources. The framework is here applied to the η, δ, θ, α, σ, β time series measured from the sleep recordings of eight severe sleep apnoea-hypopnoea syndrome (SAHS) patients studied before and after long-term treatment with continuous positive airway pressure (CPAP) therapy, and 14 healthy controls. Results show that the full and self-predictability of η, δ and θ decreased significantly in SAHS compared with controls, and were restored with CPAP for δ and θ but not for η. The causal predictability of η and δ occurred through significantly redundant source interaction during healthy sleep, which was lost in SAHS and recovered after CPAP. These results indicate that predictability analysis is a viable tool to assess the modifications of complexity and causality of the cerebral and cardiac processes induced by sleep disorders, and to monitor the restoration of the neuroautonomic control of these processes during long-term treatment.
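The linear flavour of such a predictability decomposition can be sketched with ordinary least squares. The toy below uses hypothetical coupled series (not the sleep recordings) and splits the predictability of a target y into a self term, from its own past, and a transfer term, the extra error reduction gained by adding the past of a source x, using the Gaussian form ½ ln(variance ratio) for each contribution; `resid_var` and the coupling coefficients are illustrative assumptions.

```python
import random
from math import log

def resid_var(y, cols):
    """Residual variance of a least-squares fit of y on the given
    regressor columns (series are treated as zero-mean; k <= 2)."""
    n, k = len(y), len(cols)
    A = [[sum(cols[i][t] * cols[j][t] for t in range(n)) for j in range(k)]
         for i in range(k)]
    b = [sum(cols[i][t] * y[t] for t in range(n)) for i in range(k)]
    if k == 1:
        beta = [b[0] / A[0][0]]
    else:  # closed-form solve of the 2x2 normal equations
        det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
        beta = [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
                (A[0][0] * b[1] - A[1][0] * b[0]) / det]
    return sum((y[t] - sum(beta[i] * cols[i][t] for i in range(k))) ** 2
               for t in range(n)) / n

# Hypothetical coupled processes: x drives y with a one-step lag
random.seed(2)
x, y = [0.0], [0.0]
for _ in range(5000):
    x_new = 0.8 * x[-1] + random.gauss(0, 1)
    y_new = 0.6 * y[-1] + 0.5 * x[-1] + random.gauss(0, 1)
    x.append(x_new)
    y.append(y_new)

yt, ylag, xlag = y[1:], y[:-1], x[:-1]
var_y = sum(v * v for v in yt) / len(yt)
self_only = resid_var(yt, [ylag])          # error from y's own past
full = resid_var(yt, [ylag, xlag])         # error adding x's past
self_pred = 0.5 * log(var_y / self_only)   # self-predictability (nats)
transfer = 0.5 * log(self_only / full)     # Granger-style transfer term
print(round(self_pred, 2), round(transfer, 2))  # transfer > 0: x -> y flow
```

The interaction (redundancy/synergy) terms of the full framework compare such transfer contributions across multiple sources; this sketch covers only the single-source case.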
Intrinsic Information Processing and Energy Dissipation in Stochastic Input-Output Dynamical Systems
2015-07-09
Publications reported include: Crutchfield, "Information Anatomy of Stochastic Equilibria," Entropy (2014), doi:10.3390/e16094713; Virgil Griffith, Edwin Chong, Ryan James, Christopher Ellison, and James Crutchfield, "Intersection Information Based on Common Randomness," Entropy (2014), doi:10.3390/e16041985.
National blueprint for runway safety
DOT National Transportation Integrated Search
2000-10-01
The Blueprint describes the processes employed to measurably reduce the risks associated with runway incursions and surface incidents. It sets expectations, establishes accountability, communicates information, and defines new and improved ...
Measuring Down: Evaluating Digital Storytelling as a Process for Narrative Health Promotion.
Gubrium, Aline C; Fiddian-Green, Alice; Lowe, Sarah; DiFulvio, Gloria; Del Toro-Mejías, Lizbeth
2016-05-15
Digital storytelling (DST) engages participants in a group-based process to create and share narrative accounts of life events. We present key evaluation findings of a 2-year, mixed-methods study that focused on the effects of participating in the DST process on young Puerto Rican Latinas' self-esteem, social support, empowerment, and sexual attitudes and behaviors. Quantitative results did not show significant changes in the expected outcomes. However, in our qualitative findings we identified several ways in which the DST process had positive, health-bearing effects. We argue for the importance of "measuring down" to reflect the locally grounded, felt experiences of participants who engage in the process, as current quantitative scales do not "measure up" to accurately capture these effects. We end by suggesting the need to develop mixed-methods, culturally relevant, and sensitive evaluation tools that prioritize process effects as they inform intervention and health promotion. © The Author(s) 2016.
Publicly disclosed information about the quality of health care: response of the US public
Schneider, E; Lieberman, T
2001-01-01
Public disclosure of information about the quality of health plans, hospitals, and doctors continues to be controversial. The US experience of the past decade suggests that sophisticated quality measures and reporting systems that disclose information on quality have improved the process and outcomes of care in limited ways in some settings, but these efforts have not led to the "consumer choice" market envisaged. Important reasons for this failure include limited salience of objective measures to consumers, the complexity of the task of interpretation, and insufficient use of quality results by organised purchasers and insurers to inform contracting and pricing decisions. Nevertheless, public disclosure may motivate quality managers and providers to undertake changes that improve the delivery of care. Efforts to measure and report information about quality should remain public, but may be most effective if they are targeted to the needs of institutional and individual providers of care. Key Words: public disclosure; quality of health care; quality improvement PMID:11389318
Kellman, Philip J; Mnookin, Jennifer L; Erlikhman, Gennady; Garrigan, Patrick; Ghose, Tandra; Mettler, Everett; Charlton, David; Dror, Itiel E
2014-01-01
Latent fingerprint examination is a complex task that, despite advances in image processing, still fundamentally depends on the visual judgments of highly trained human examiners. Fingerprints collected from crime scenes typically contain less information than fingerprints collected under controlled conditions. Specifically, they are often noisy and distorted and may contain only a portion of the total fingerprint area. Expertise in fingerprint comparison, like other forms of perceptual expertise, such as face recognition or aircraft identification, depends on perceptual learning processes that lead to the discovery of features and relations that matter in comparing prints. Relatively little is known about the perceptual processes involved in making comparisons, and even less is known about what characteristics of fingerprint pairs make particular comparisons easy or difficult. We measured expert examiner performance and judgments of difficulty and confidence on a new fingerprint database. We developed a number of quantitative measures of image characteristics and used multiple regression techniques to discover objective predictors of error as well as perceived difficulty and confidence. A number of useful predictors emerged, and these included variables related to image quality metrics, such as intensity and contrast information, as well as measures of information quantity, such as the total fingerprint area. Also included were configural features that fingerprint experts have noted, such as the presence and clarity of global features and fingerprint ridges. Within the constraints of the overall low error rates of experts, a regression model incorporating the derived predictors demonstrated reasonable success in predicting objective difficulty for print pairs, as shown both in goodness-of-fit measures to the original data set and in a cross-validation test. The results indicate the plausibility of using objective image metrics to predict expert performance and subjective assessment of difficulty in fingerprint comparisons.
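The multiple-regression approach described above can be sketched in a few lines of NumPy. The image metrics, coefficients, and data below are entirely hypothetical, invented for illustration only; they are not the study's actual predictors or results.

```python
import numpy as np

# Hypothetical image metrics for fingerprint pairs (contrast, intensity,
# fingerprint area), used to predict a continuous difficulty score.
rng = np.random.default_rng(3)
n = 200
X = rng.normal(size=(n, 3))
true_w = np.array([0.8, -0.5, 0.3])          # assumed "true" effect sizes
difficulty = X @ true_w + 0.1 * rng.normal(size=n)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, difficulty, rcond=None)

# Goodness of fit (R^2) on the training data.
predicted = A @ coef
ss_res = np.sum((difficulty - predicted) ** 2)
ss_tot = np.sum((difficulty - difficulty.mean()) ** 2)
print("R^2:", round(1 - ss_res / ss_tot, 3))
```

In practice the fitted model would then be scored on a held-out set, as in the cross-validation test the abstract mentions.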
Niquini, Roberta Pereira; Bittencourt, Sonia Azevedo; Leal, Maria do Carmo
2013-09-01
To assess the conformity of the weight measurement process in the pre-gestational care offered in the city of Rio de Janeiro by primary units and hospitals of the National Health System, and to verify the agreement between the anthropometric data reported by pregnant women and those recorded in prenatal cards. A cross-sectional study was conducted in 2007-2008 with two cluster samples: one to obtain a sample of pregnant women to be interviewed and another for the weight measurement procedures to be observed. The conformity of the weight measurement process was evaluated according to Ministry of Health standards, and the agreement between the two sources of anthropometric data was evaluated using mean differences, the Bland-Altman method, the intraclass correlation coefficient (ICC) and weighted Kappa. Of the twelve criteria for weight measurement evaluation (n = 159 observations), three were not in conformity (< 50% conformity), two of which only need to be assessed when the scale is mechanical. For the interviewed pregnant women (n = 2,148) who had both sources of anthropometric data, there was a tendency to overestimate self-reported height and to underestimate pre-gestational weight, current weight, and Body Mass Index. Agreement between the two sources of anthropometric information, according to the ICC and weighted Kappa, was high (> 0.80). Studies may use weight and height information reported by pregnant women, in the absence of prenatal card records, when this represents an important cost saving for their execution, although both sources of information need to be improved through a better anthropometric measurement process.
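The Bland-Altman analysis used above can be sketched as follows. The weights are invented for illustration and are not data from the study; only the statistics (bias and 95% limits of agreement) follow the standard definition.

```python
import numpy as np

def bland_altman(reported, recorded):
    """Bland-Altman agreement statistics between two measurement sources.

    Returns the mean difference (bias) and the 95% limits of agreement,
    bias +/- 1.96 * SD of the pairwise differences."""
    diff = np.asarray(reported, dtype=float) - np.asarray(recorded, dtype=float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical self-reported vs. prenatal-card weights (kg)
reported = [60.0, 72.5, 55.0, 80.0, 65.5]
recorded = [61.0, 73.0, 56.5, 81.5, 66.0]
bias, (lo, hi) = bland_altman(reported, recorded)
print(f"bias = {bias:.2f} kg, limits of agreement = ({lo:.2f}, {hi:.2f})")
# -> bias = -1.00 kg, limits of agreement = (-1.98, -0.02)
```

A consistently negative bias like this one corresponds to the underestimation tendency the study reports.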
Historical data recording for process computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hale, J.C.; Sellars, H.L.
1981-11-01
Computers have been used to monitor and control chemical and refining processes for more than 15 years. During this time, there has been a steady growth in the variety and sophistication of the functions performed by these process computers. Early systems were limited to maintaining only current operating measurements, available through crude operator's consoles or noisy teletypes. The value of retaining a process history, that is, a collection of measurements over time, became apparent, and early efforts produced shift and daily summary reports. The need for improved process historians which record, retrieve and display process information has grown as process computers assume larger responsibilities in plant operations. This paper describes newly developed process historian functions that have been used on several in-house process monitoring and control systems in Du Pont factories. 3 refs.
Use of fuzzy sets in modeling of GIS objects
NASA Astrophysics Data System (ADS)
Mironova, Yu N.
2018-05-01
The paper discusses modeling and methods of data visualization in geographic information systems (GIS). Information processing in geoinformatics is based on the use of models, so geoinformation modeling is a key link in the chain of geodata processing. Solving problems with geographic information systems often requires entering approximate or insufficiently reliable information about map features into the GIS database. Heterogeneous data of different origin and accuracy carry some degree of uncertainty. In addition, not all information is accurate: already during the initial measurements, poorly defined terms and attributes (e.g., "soil, well-drained") are used. Methods are therefore needed for working with uncertain requirements, classes, and boundaries. The author proposes using fuzzy sets for spatial information. In terms of a characteristic function, a fuzzy set is a natural generalization of an ordinary set: the binary nature of membership is rejected, and the characteristic function may take any value in the interval [0, 1].
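The generalized characteristic function described above can be made concrete with a standard trapezoidal membership function. The fuzzy class "well-drained soil" and its drainage-rate breakpoints below are hypothetical examples, not values from the paper.

```python
def trapezoidal_membership(x, a, b, c, d):
    """Trapezoidal fuzzy membership function: 0 outside [a, d],
    rising linearly on [a, b], 1 on [b, c], falling linearly on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Hypothetical fuzzy class "well-drained soil" scored by drainage rate (mm/h)
for rate in (5, 15, 30, 55):
    print(rate, trapezoidal_membership(rate, a=10, b=20, c=40, d=60))
```

A crisp (ordinary) set would map every rate to exactly 0 or 1; the fuzzy version lets a borderline soil belong to the class to degree 0.5 or 0.25, which is how imprecise attributes like "well-drained" can be stored and queried in a GIS database.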
Space Station Application of Simulator-Developed Aircrew Coordination and Performance Measures
NASA Technical Reports Server (NTRS)
Murphy, Miles
1985-01-01
This paper summarizes a study in progress at NASA/Ames Research Center to develop measures of aircrew coordination and decision-making factors and to relate them to flight task performance, that is, to crew and system performance measures. The existence of some similar interpersonal process and task performance requirements suggests a potential application of these methods in space station crew research -- particularly research conducted in ground-based mock-ups. The secondary objective of this study should also be of interest: to develop information on crew process and performance for application in developing crew training programs.
Development of an Unmanned Aerial System (UAS) for Scaling Terrestrial Ecosystem Traits
NASA Astrophysics Data System (ADS)
Meng, R.; McMahon, A. M.; Serbin, S.; Rogers, A.
2015-12-01
The next generation of Ecosystem and Earth System Models (EESMs) will require detailed information on ecosystem structure and function, including properties of vegetation related to carbon (C), water, and energy cycling, in order to project the future state of ecosystems. High spatial-temporal resolution measurements of terrestrial ecosystems are also important for EESMs, because they can provide critical inputs and benchmark datasets for evaluation of EESM simulations across scales. The recent development of high-quality, low-altitude remote sensing platforms or small UAS (< 25 kg) enables measurements of terrestrial ecosystems at unprecedented temporal and spatial scales. Specifically, these new platforms can provide detailed information on patterns and processes of terrestrial ecosystems at a critical intermediate scale between point measurements and suborbital and satellite platforms. Given their potential for sub-decimeter spatial resolution, improved mission safety, high revisit frequency, and reduced operation cost, these platforms are of particular interest in the development of ecological scaling algorithms to parameterize and benchmark EESMs, particularly over complex and remote terrain. Our group is developing a small UAS platform and integrated sensor package focused on measurement needs for scaling and informing ecosystem modeling activities, as well as scaling and mapping plant functional traits. To do this we are developing an integrated software workflow and hardware package using off-the-shelf instrumentation including a high-resolution digital camera for Structure from Motion, a spectroradiometer, and a thermal infrared camera. Our workflow includes platform design, measurement, image processing, data management, and information extraction. The fusion of 3D structure information, thermal-infrared imagery, and spectroscopic measurements will provide a foundation for the development of ecological scaling and mapping algorithms. Our initial focus is in temperate forests, but near-term research will expand into the high Arctic and eventually tropical systems. The results of this prototype study show that off-the-shelf technology can be used to develop a low-cost alternative for mapping plant traits and three-dimensional structure for ecological research.
Structural health monitoring feature design by genetic programming
NASA Astrophysics Data System (ADS)
Harvey, Dustin Y.; Todd, Michael D.
2014-09-01
Structural health monitoring (SHM) systems provide real-time damage and performance information for civil, aerospace, and other high-capital or life-safety critical structures. Conventional data processing involves pre-processing and extraction of low-dimensional features from in situ time series measurements. The features are then input to a statistical pattern recognition algorithm to perform the relevant classification or regression task necessary to facilitate decisions by the SHM system. Traditional design of signal processing and feature extraction algorithms can be an expensive and time-consuming process requiring extensive system knowledge and domain expertise. Genetic programming, a heuristic program search method from evolutionary computation, was recently adapted by the authors to perform automated, data-driven design of signal processing and feature extraction algorithms for statistical pattern recognition applications. The proposed method, called Autofead, is particularly suitable to handle the challenges inherent in algorithm design for SHM problems where the manifestation of damage in structural response measurements is often unclear or unknown. Autofead mines a training database of response measurements to discover information-rich features specific to the problem at hand. This study provides experimental validation on three SHM applications including ultrasonic damage detection, bearing damage classification for rotating machinery, and vibration-based structural health monitoring. Performance comparisons with common feature choices for each problem area are provided demonstrating the versatility of Autofead to produce significant algorithm improvements on a wide range of problems.
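The data-driven feature-discovery idea above can be illustrated with a much-simplified sketch. Instead of genetic programming, this searches a handful of hand-written feature primitives and scores each by how well it separates synthetic "healthy" and "damaged" vibration records; the primitives, data, and scoring are invented for illustration and are not Autofead's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Candidate feature primitives the search can choose from (a stand-in for
# the evolved programs in genetic programming approaches like Autofead).
primitives = {
    "mean_abs": lambda x: np.mean(np.abs(x)),
    "std": np.std,
    "rms": lambda x: np.sqrt(np.mean(x ** 2)),
    "peak": lambda x: np.max(np.abs(x)),
}

# Synthetic vibration records: "damage" manifests as increased variance.
healthy = [rng.normal(0, 1.0, 256) for _ in range(30)]
damaged = [rng.normal(0, 1.6, 256) for _ in range(30)]

def separation(feature):
    """Fisher-style class-separation score for one candidate feature."""
    h = np.array([feature(x) for x in healthy])
    d = np.array([feature(x) for x in damaged])
    return abs(h.mean() - d.mean()) / (h.std() + d.std() + 1e-12)

# "Search" = pick the primitive with the best training-set separation.
best = max(primitives, key=lambda name: separation(primitives[name]))
print("best feature:", best)
```

Genetic programming generalizes this by composing and mutating such primitives into full signal-processing programs, which matters precisely when, as the abstract notes, the damage signature is unclear in advance.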
Research on motor rotational speed measurement in regenerative braking system of electric vehicle
NASA Astrophysics Data System (ADS)
Pan, Chaofeng; Chen, Liao; Chen, Long; Jiang, Haobin; Li, Zhongxing; Wang, Shaohua
2016-01-01
Rotational speed signal acquisition and processing techniques are widely used in rotating machinery. In order to realize precise, real-time control of the motor drive and the regenerative braking process, rotational speed measurement techniques are needed in electric vehicles. Obtaining an accurate motor rotational speed signal contributes to steady regenerative braking force control and a higher energy recovery rate. This paper aims to develop a method that provides instantaneous speed information in the form of motor rotation. It first addresses the principles of motor rotational speed measurement in the regenerative braking systems of electric vehicles. The paper then presents the characteristics of ideal and actual Hall position sensor signals, revealing the relation between the motor rotational speed and the Hall position sensor signals. Finally, a Hall position sensor signal conditioning and processing circuit and a program for motor rotational speed measurement are developed based on a measurement error analysis.
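The relation between Hall sensor signals and motor speed can be sketched for the common case of a motor with three Hall sensors: six Hall edges occur per electrical revolution, so the mechanical speed follows from the edge spacing and the number of pole pairs. The edge timestamps and pole-pair count below are hypothetical, chosen only to illustrate the arithmetic.

```python
def rpm_from_hall_edges(edge_times, pole_pairs):
    """Estimate motor speed (rpm) from Hall sensor edge timestamps (seconds).

    With three Hall sensors, six edges occur per electrical revolution, so
    the electrical period is 6 * (mean time between consecutive edges), and
    the mechanical speed is the electrical frequency divided by pole_pairs."""
    intervals = [t2 - t1 for t1, t2 in zip(edge_times, edge_times[1:])]
    mean_dt = sum(intervals) / len(intervals)
    electrical_freq = 1.0 / (6.0 * mean_dt)    # electrical revolutions / s
    return 60.0 * electrical_freq / pole_pairs  # mechanical rpm

# Hypothetical edges 1 ms apart on a 4-pole-pair motor:
# f_e = 1 / (6 * 0.001) ~ 166.67 Hz  ->  60 * 166.67 / 4 = 2500 rpm
edges = [0.000, 0.001, 0.002, 0.003, 0.004]
print(rpm_from_hall_edges(edges, pole_pairs=4))
```

Averaging over several intervals, as here, is one simple way to reduce the effect of the Hall-edge timing errors that the paper's error analysis addresses.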
1994-12-01
Excerpts: Order Cycle; Order Processing and the Information System; The Order Cycle at SCCB. ... order transmittal time, order processing time, order assembly time, stock availability, production time, and delivery time. ... methods, inventory stocking policies, order processing procedures, transport modes, and scheduling methods [Ref. 15].
1992-09-01
Excerpts: ... abilities is fit along with the autoregressive process. Initially, the influences on search performance of within-group age and sex were included as control variables. Results: Performance/Ability Structure Measurement Model: the correlations between all the ability measures, age, and sex are ... subsequent analyses for young adults. Age and sex were included as control variables. There was an age range of 15 years; this range is sufficiently large that ...
Recent advances in phase shifted time averaging and stroboscopic interferometry
NASA Astrophysics Data System (ADS)
Styk, Adam; Józwik, Michał
2016-08-01
Classical time-averaging and stroboscopic interferometry are widely used for MEMS/MOEMS dynamic behavior investigations. Unfortunately, both methods require extensive measurement and data processing strategies in order to evaluate the maximum vibration amplitude of an object at a given load. In this paper, modified data processing strategies for both techniques are introduced. These modifications allow fast and reliable calculation of the sought value without additional complication of the measurement systems. Both approaches are discussed and experimentally verified.
Interaction of high-intensity laser radiation with metals.
NASA Technical Reports Server (NTRS)
Linlor, W. I.
1971-01-01
The interaction is characterized by the production of plasma, within which the primary absorption occurs. Absorption of laser radiation by a plasma may occur by several processes. The absorption process called 'inverse bremsstrahlung' is discussed. The interaction of a laser beam with the plasma produced from a thick metal target was studied. The results of the measurements of the ion kinetic energies are presented in a graph. In addition to measurements with thick targets, information was also obtained with a thin foil of gold.
Design and Production of Color Calibration Targets for Digital Input Devices
2000-07-01
Excerpts: ... gamuts. Fourth, the color transform from CIELCH to sRGB will be described. Fifth, the relevant target mockups will be created. Sixth, the quality will be ... implement statistical process controls; print, process and measure; reject; transfer the measured CIEXYZ of the target patches to sRGB; generate ... Kodak Royal VII paper and sRGB. This plot shows all points on the a*-b* plane without information about the L*. The sRGB color gamut is obtained from ...
NCTM of liquids at high temperatures using polarization techniques
NASA Technical Reports Server (NTRS)
Krishnan, Shankar; Weber, J. K. Richard; Nordine, Paul C.; Schiffman, Robert A.
1990-01-01
Temperature measurement and control are extremely important in any materials processing application. However, conventional techniques for non-contact temperature measurement (mainly optical pyrometry) are very uncertain because of unknown or varying surface emittance. Optical properties, like other properties, change during processing. A dynamic, in-situ measurement of optical properties, including the emittance, is required. Intersonics is developing new technologies that use polarized laser light scattering to determine the surface emittance of freely radiating bodies concurrently with conventional optical pyrometry. These measurements are sufficient to determine the true surface temperature of the target. Intersonics is currently developing a system called DAPP, the Division of Amplitude Polarimetric Pyrometer, that uses polarization information to measure the true thermodynamic temperature of freely radiating objects. This instrument has potential use in materials processing applications in ground- and space-based equipment. Results of thermophysical and thermodynamic measurements using laser reflection as a temperature measuring tool are presented. The impact of these techniques on thermophysical property measurements at high temperature is discussed.
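The role of emittance in recovering true temperature can be sketched with the standard single-wavelength pyrometry correction under Wien's approximation; this is textbook radiometry, not necessarily the algorithm used in DAPP, and the brightness temperature, wavelength, and emissivity below are illustrative.

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def true_temperature(t_radiance, emissivity, wavelength):
    """True surface temperature from a pyrometer's radiance (brightness)
    temperature at a single wavelength (m), using Wien's approximation:
        1/T = 1/T_rad + (lambda / c2) * ln(emissivity)
    Since emissivity < 1, ln(emissivity) < 0 and T > T_rad."""
    return 1.0 / (1.0 / t_radiance + (wavelength / C2) * math.log(emissivity))

# Hypothetical: brightness temperature 1800 K at 650 nm, emissivity 0.35
print(round(true_temperature(1800.0, 0.35, 650e-9), 1))
```

The correction shows why an in-situ emittance measurement matters: with emittance unknown, the same brightness temperature is consistent with a wide range of true temperatures.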
Yuvaraj, Rajamanickam; Murugappan, Murugappan; Mohamed Ibrahim, Norlinah; Iqbal, Mohd; Sundaraj, Kenneth; Mohamad, Khairiyah; Palaniappan, Ramaswamy; Mesquita, Edgar; Satiyan, Marimuthu
2014-04-09
While Parkinson's disease (PD) has traditionally been described as a movement disorder, there is growing evidence of disruption in emotion information processing associated with the disease. The aim of this study was to investigate whether there are specific electroencephalographic (EEG) characteristics that discriminate PD patients and normal controls during emotion information processing. EEG recordings from 14 scalp sites were collected from 20 PD patients and 30 age-matched normal controls. Multimodal (audio-visual) stimuli were presented to evoke specific targeted emotional states such as happiness, sadness, fear, anger, surprise and disgust. Absolute and relative power, frequency and asymmetry measures derived from spectrally analyzed EEGs were subjected to repeated-measures ANOVA for group comparisons, as well as to discriminant function analysis to examine their utility as classification indices. In addition, subjective ratings were obtained for the emotional stimuli used. Behaviorally, PD patients showed no impairments in emotion recognition as measured by subjective ratings. Compared with normal controls, PD patients evidenced smaller overall relative delta, theta, alpha and beta power, and at bilateral anterior regions smaller absolute theta, alpha, and beta power and higher mean total spectrum frequency across different emotional states. Inter-hemispheric theta, alpha, and beta power asymmetry index differences were noted, with controls exhibiting greater right- than left-hemisphere activation, whereas patients exhibited reduced intra-hemispheric alpha power asymmetry bilaterally at all regions. Discriminant analysis correctly classified 95.0% of the patients and controls during emotional stimuli. These distributed spectral powers in different frequency bands might provide meaningful information about emotional processing in PD patients.
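The band power and asymmetry measures described above can be sketched with a periodogram. The two synthetic "channels" below (stronger 10 Hz alpha on the right than the left) are invented to demonstrate the computation; they are not the study's data, and the simple (R - L)/(R + L) index is one common definition of asymmetry, not necessarily the exact one used.

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Absolute power of `signal` in the [f_lo, f_hi) Hz band (periodogram)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return psd[mask].sum()

def asymmetry_index(power_right, power_left):
    """Inter-hemispheric asymmetry: positive when the right side dominates."""
    return (power_right - power_left) / (power_right + power_left)

fs = 256
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(1)
# Hypothetical channel pair with stronger 10 Hz alpha on the right.
left = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.normal(size=t.size)
right = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.normal(size=t.size)

alpha_l = band_power(left, fs, 8, 13)   # alpha band: 8-13 Hz
alpha_r = band_power(right, fs, 8, 13)
print("alpha asymmetry:", round(asymmetry_index(alpha_r, alpha_l), 2))
```

Repeating this per band (delta, theta, alpha, beta) and per electrode pair yields the kind of spectral feature set the study fed into ANOVA and discriminant analysis.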
Pilot-Configurable Information on a Display Unit
NASA Technical Reports Server (NTRS)
Bell, Charles Frederick (Inventor); Ametsitsi, Julian (Inventor); Che, Tan Nhat (Inventor); Shafaat, Syed Tahir (Inventor)
2017-01-01
A small, thin display unit that can be installed in the flight deck for displaying only flight crew-selected tactical information needed for the task at hand. The flight crew can select the tactical information to be displayed by means of any conventional user interface. Whenever the flight crew selects tactical information for display, the system processes the request, including periodically retrieving measured current values or computing current values for the requested tactical parameters, and returns those current tactical parameter values to the display unit for display.
WISE: Automated support for software project management and measurement. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ramakrishnan, Sudhakar
1995-01-01
One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of the software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication among users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.
Excerpt: ... 497 people died in alcohol-impaired driving crashes, accounting for 28% of all traffic-related deaths in ... visual and auditory information processing. Blood Alcohol Concentration Measurement: the number of drinks listed represents the approximate ...
An Adaptive Kalman Filter Using a Simple Residual Tuning Method
NASA Technical Reports Server (NTRS)
Harman, Richard R.
1999-01-01
One difficulty in using Kalman filters in real-world situations is the selection of the correct process noise, measurement noise, and initial state estimate and covariance. These parameters are commonly referred to as tuning parameters. Multiple methods have been developed to estimate these parameters. Most of those methods, such as maximum likelihood, subspace, and observer/Kalman filter identification, require extensive offline processing and are not suitable for real-time processing. One technique which is suitable for real-time processing is the residual tuning method. Any mismodeling of the filter tuning parameters will result in a non-white sequence for the filter measurement residuals. The residual tuning technique uses this information to estimate corrections to those tuning parameters. The actual implementation results in a set of sequential equations that run in parallel with the Kalman filter. A. H. Jazwinski developed a specialized version of this technique for estimation of process noise. Equations for the estimation of the measurement noise have also been developed. These algorithms are used to estimate the process noise and measurement noise for the Wide Field Infrared Explorer star tracker and gyro.
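The residual-based idea can be illustrated with a deliberately simplified sketch: a scalar random-walk Kalman filter that adapts its measurement noise R from the sample variance of recent innovations, using the fact that a correctly tuned filter has innovation variance equal to the predicted covariance plus R. This is not Jazwinski's actual algorithm, and the model, noise levels, and window size are invented for illustration.

```python
import numpy as np

def adaptive_kalman(measurements, q, r0, window=20):
    """Scalar random-walk Kalman filter with innovation-based R adaptation.

    If the tuning is correct, E[nu^2] = P_pred + R for the innovations nu,
    so R is re-estimated as (empirical innovation variance) - P_pred."""
    x, p, r = measurements[0], 1.0, r0
    residuals, estimates = [], []
    for z in measurements[1:]:
        p_pred = p + q                       # predict (random-walk model)
        nu = z - x                           # innovation (residual)
        residuals.append(nu)
        if len(residuals) >= window:         # adapt R from recent residuals
            s_emp = np.var(residuals[-window:])
            r = max(s_emp - p_pred, 1e-6)
        k = p_pred / (p_pred + r)            # Kalman gain
        x = x + k * nu                       # measurement update
        p = (1 - k) * p_pred
        estimates.append(x)
    return np.array(estimates), r

rng = np.random.default_rng(2)
true_r = 4.0
z = 5.0 + rng.normal(0, np.sqrt(true_r), 500)   # constant state, noisy sensor
est, r_hat = adaptive_kalman(z, q=1e-4, r0=0.5)
print(f"final state estimate {est[-1]:.2f}, adapted R {r_hat:.2f}")
```

The filter starts with a badly mistuned R (0.5 against a true value of 4.0) and pulls it toward the truth online, which is the real-time property that distinguishes residual tuning from the offline identification methods mentioned above.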
A Study of Cognitive Load for Enhancing Student’s Quantitative Literacy in Inquiry Lab Learning
NASA Astrophysics Data System (ADS)
Nuraeni, E.; Rahman, T.; Alifiani, D. P.; Khoerunnisa, R. S.
2017-09-01
Students often find it difficult to appreciate the relevance of quantitative analysis and concept attainment in the science class. This study measured student cognitive load during an inquiry lab on the respiratory system designed to improve quantitative literacy. Participants in this study were 40 11th graders from a senior high school in Indonesia. After the students learned, their feelings about the degree of mental effort it took to complete the learning tasks were measured by a 28-item self-report on a 4-point Likert scale. A Task Complexity Worksheet was used to assess the processing of quantitative information, and a paper-based test was applied to assess the participants' concept achievement. The results showed that the inquiry instruction induced relatively low mental effort, high information processing and high concept achievement.