Considerations on the Optimal and Efficient Processing of Information-Bearing Signals
ERIC Educational Resources Information Center
Harms, Herbert Andrew
2013-01-01
Noise is a fundamental hurdle that impedes the processing of information-bearing signals, specifically the extraction of salient information. Processing that is both optimal and efficient is desired; optimality ensures the extracted information has the highest fidelity allowed by the noise, while efficiency ensures limited resource usage. Optimal…
Richter, Tobias; Schroeder, Sascha; Wöhrmann, Britta
2009-03-01
In social cognition, knowledge-based validation of information is usually regarded as relying on strategic and resource-demanding processes. Research on language comprehension, in contrast, suggests that validation processes are involved in the construction of a referential representation of the communicated information. This view implies that individuals can use their knowledge to validate incoming information in a routine and efficient manner. Consistent with this idea, Experiments 1 and 2 demonstrated that individuals are able to reject false assertions efficiently when they have validity-relevant beliefs. Validation processes were carried out routinely even when individuals were put under additional cognitive load during comprehension. Experiment 3 demonstrated that the rejection of false information occurs automatically and interferes with affirmative responses in a nonsemantic task (epistemic Stroop effect). Experiment 4 also revealed complementary interference effects of true information with negative responses in a nonsemantic task. These results suggest the existence of fast and efficient validation processes that protect mental representations from being contaminated by false and inaccurate information.
Information processing using a single dynamical node as complex system
Appeltant, L.; Soriano, M.C.; Van der Sande, G.; Danckaert, J.; Massar, S.; Dambre, J.; Schrauwen, B.; Mirasso, C.R.; Fischer, I.
2011-01-01
Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing. PMID:21915110
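To make the delay-feedback idea concrete, here is a minimal discrete-time sketch of reservoir computing with a single nonlinear node: the input is spread over N "virtual nodes" by a random mask, the node state feeds back with one delay period, and only a linear readout is trained. The toy prediction task, the tanh nonlinearity, and all parameter values are illustrative assumptions, not the paper's electronic implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: one-step-ahead prediction of a noisy sine wave.
T = 2000
u = np.sin(0.2 * np.arange(T)) + 0.05 * rng.standard_normal(T)
target = np.roll(u, -1)  # predict the next sample

N = 50                        # number of "virtual nodes" along the delay line
mask = rng.uniform(-1, 1, N)  # random input mask, fixed once
eta, nu = 0.5, 1.0            # feedback and input scaling (illustrative)

# Single nonlinear node driven once per virtual-node slot; each virtual
# node couples to its own state one delay period earlier.
X = np.zeros((T, N))
for t in range(1, T):
    X[t] = np.tanh(eta * X[t - 1] + nu * mask * u[t])

# Linear readout trained by ridge regression on the first half of the data.
lam = 1e-6
A, b = X[:T // 2], target[:T // 2]
w = np.linalg.solve(A.T @ A + lam * np.eye(N), A.T @ b)

pred = X[T // 2:-1] @ w
nmse = np.mean((pred - target[T // 2:-1]) ** 2) / np.var(target)
print(f"test NMSE: {nmse:.4f}")
```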
Relatively fast! Efficiency advantages of comparative thinking.
Mussweiler, Thomas; Epstude, Kai
2009-02-01
Comparisons are a ubiquitous process in information processing. Seven studies examine whether, how, and when comparative thinking increases the efficiency of judgment and choice. Studies 1-4 demonstrate that procedurally priming participants to engage in more vs. less comparison influences how they process information about a target. Specifically, they retrieve less information about the target (Studies 1A, 1B), think more about an information-rich standard (Study 2) about which they activate judgment-relevant information (Study 3), and use this information to compensate for missing target information (Study 4). Studies 2-5 demonstrate the ensuing efficiency advantages. Participants who are primed on comparative thinking are faster in making a target judgment (Studies 2A, 2B, 4, 5) and have more residual processing capacities for a secondary task (Study 5). Studies 6 and 7 establish two boundary conditions by demonstrating that comparative thinking holds efficiency advantages only if target and standard are partly characterized by alignable features (Study 6) that are difficult to evaluate in isolation (Study 7). These findings indicate that comparative thinking may often constitute a useful mechanism to simplify information processing.
De Loof, Esther; Van Opstal, Filip; Verguts, Tom
2016-04-01
Theories on visual awareness claim that predicted stimuli reach awareness faster than unpredicted ones. In the current study, we disentangle whether prior information about the upcoming stimulus affects visual awareness of stimulus location (i.e., individuation) by modulating processing efficiency or threshold setting. Analogous research on stimulus identification revealed that prior information modulates threshold setting. However, as identification and individuation are two functionally and neurally distinct processes, the mechanisms underlying identification cannot simply be extrapolated directly to individuation. The goal of this study was therefore to investigate how individuation is influenced by prior information about the upcoming stimulus. To do so, a drift diffusion model was fitted to estimate the processing efficiency and threshold setting for predicted versus unpredicted stimuli in a cued individuation paradigm. Participants were asked to locate a picture, following a cue that was congruent, incongruent or neutral with respect to the picture's identity. Pictures were individuated faster in the congruent and neutral condition compared to the incongruent condition. In the diffusion model analysis, the processing efficiency was not significantly different across conditions. However, the threshold setting was significantly higher following an incongruent cue compared to both congruent and neutral cues. Our results indicate that predictive information about the upcoming stimulus influences visual awareness by shifting the threshold for individuation rather than by enhancing processing efficiency.
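As a rough illustration of the modelling logic (not the study's fitted model or data), the sketch below simulates a drift diffusion process and shows how lowering the drift rate (processing efficiency) and raising the decision threshold both slow responses but have opposite effects on accuracy; all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def ddm_trial(drift, threshold, dt=0.001, sigma=1.0, max_t=5.0):
    """Simulate one diffusion trial; return (response time, correct?)."""
    x, t = 0.0, 0.0
    while abs(x) < threshold and t < max_t:
        x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return t, x >= threshold

def summarize(drift, threshold, n=2000):
    rts, acc = zip(*(ddm_trial(drift, threshold) for _ in range(n)))
    return np.mean(rts), np.mean(acc)

# Two different ways of producing slower responses:
print("baseline        :", summarize(drift=1.0, threshold=1.0))
print("lower efficiency:", summarize(drift=0.5, threshold=1.0))  # slower, less accurate
print("raised threshold:", summarize(drift=1.0, threshold=1.5))  # slower, MORE accurate
```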
An Information Processing View of Field Dependence-Independence.
ERIC Educational Resources Information Center
Davis, J. Kent; Cochran, Kathryn F.
1989-01-01
Discusses field dependence-independence from an information processing perspective. Topics discussed include field dependence theory, stages of information processing, developmental issues and implications, and future directions. The information reviewed indicates that field-independent individuals are more efficient than field-dependent…
Working memory capacity and redundant information processing efficiency.
Endres, Michael J; Houpt, Joseph W; Donkin, Chris; Finn, Peter R
2015-01-01
Working memory capacity (WMC) is typically measured by the amount of task-relevant information an individual can keep in mind while resisting distraction or interference from task-irrelevant information. The current research investigated the extent to which differences in WMC were associated with performance on a novel redundant memory probes (RMP) task that systematically varied the amount of to-be-remembered (targets) and to-be-ignored (distractor) information. The RMP task was designed to both facilitate and inhibit working memory search processes, as evidenced by differences in accuracy, response time, and Linear Ballistic Accumulator (LBA) model estimates of information processing efficiency. Participants (N = 170) completed standard intelligence tests and dual-span WMC tasks, along with the RMP task. As expected, accuracy, response-time, and LBA model results indicated memory search and retrieval processes were facilitated under redundant-target conditions, but also inhibited under mixed target/distractor and redundant-distractor conditions. Repeated measures analyses also indicated that, while individuals classified as high (n = 85) and low (n = 85) WMC did not differ in the magnitude of redundancy effects, groups did differ in the efficiency of memory search and retrieval processes overall. Results suggest that redundant information reliably facilitates and inhibits the efficiency or speed of working memory search, and these effects are independent of more general limits and individual differences in the capacity or space of working memory.
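For readers unfamiliar with the LBA, here is a minimal simulation sketch of a two-choice Linear Ballistic Accumulator; the parameter values are hypothetical and the handling of negative drift samples is simplified relative to published fitting procedures.

```python
import numpy as np

rng = np.random.default_rng(2)

def lba_trial(drifts, b=1.0, A=0.5, s=0.25, t0=0.2):
    """One Linear Ballistic Accumulator trial.

    Each accumulator starts at U(0, A) and rises linearly at a rate drawn
    from N(v, s) (clipped to stay positive here); the first accumulator to
    reach threshold b determines the choice."""
    starts = rng.uniform(0, A, size=len(drifts))
    rates = np.array([max(rng.normal(v, s), 1e-6) for v in drifts])
    times = (b - starts) / rates
    winner = int(np.argmin(times))
    return winner, times[winner] + t0  # choice and response time

# Higher drift for targets than distractors -> faster, mostly correct responses.
choices, rts = zip(*(lba_trial([1.2, 0.8]) for _ in range(5000)))
print("P(choose target):", np.mean(np.array(choices) == 0))
print("mean RT:", np.mean(rts))
```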
New Method for Knowledge Management Focused on Communication Pattern in Product Development
NASA Astrophysics Data System (ADS)
Noguchi, Takashi; Shiba, Hajime
In the field of manufacturing, the importance of utilizing knowledge and know-how has been growing. Against this background, new methods are needed to efficiently accumulate and extract effective knowledge and know-how. To facilitate the extraction of the knowledge and know-how that engineers need, we first defined business process information, which includes schedule/progress information, document data, information about communication among the parties concerned, and information linking these three types. Based on these definitions, we proposed an IT system (FlexPIM: Flexible and collaborative Process Information Management) that registers and accumulates business process information with minimal effort. To efficiently extract effective information from the huge volumes of accumulated business process information, we propose a new extraction method that focuses on “actions” and communication patterns, and we have verified its validity for several communication patterns.
Elaboration Likelihood and the Counseling Process: The Role of Affect.
ERIC Educational Resources Information Center
Stoltenberg, Cal D.; And Others
The role of affect in counseling has been examined from several orientations. The depth of processing model views the efficiency of information processing as a function of the extent to which the information is processed. The notion of cognitive processing capacity states that processing information at deeper levels engages more of one's limited…
NASA Astrophysics Data System (ADS)
Gaikwad, Akshay; Rehal, Diksha; Singh, Amandeep; Arvind; Dorai, Kavita
2018-02-01
We present the NMR implementation of a scheme for selective and efficient quantum process tomography without ancilla. We generalize this scheme such that it can be implemented efficiently using only a set of measurements involving product operators. The method allows us to estimate any element of the quantum process matrix to a desired precision, provided a set of quantum states can be prepared efficiently. Our modified technique requires fewer experimental resources as compared to the standard implementation of selective and efficient quantum process tomography, as it exploits the special nature of NMR measurements to allow us to compute specific elements of the process matrix by a restrictive set of subsystem measurements. To demonstrate the efficacy of our scheme, we experimentally tomograph the processes corresponding to "no operation," a controlled-NOT (CNOT), and a controlled-Hadamard gate on a two-qubit NMR quantum information processor, with high fidelities.
Honda, Masayuki; Matsumoto, Takehiro
2017-01-01
Several kinds of event log data produced in daily clinical activities have yet to be used for the secure and efficient improvement of hospital activities. Data Warehouse systems in Hospital Information Systems, used for the analysis of structured data such as diseases, lab tests, and medications, have also shown useful results. This article focuses on two kinds of essential functions: process mining using log data and non-structured data analysis via Natural Language Processing.
Effects of channel blocking on information transmission and energy efficiency in squid giant axons.
Liu, Yujiang; Yue, Yuan; Yu, Yuguo; Liu, Liwei; Yu, Lianchun
2018-04-01
Action potentials are the information carriers of neural systems. The generation of action potentials involves the cooperative opening and closing of sodium and potassium channels. This process is metabolically expensive because the ions flowing through open channels need to be restored to maintain concentration gradients of these ions. Toxins like tetraethylammonium can block working ion channels, thus affecting the function and energy cost of neurons. In this paper, by computer simulation of the Hodgkin-Huxley neuron model, we studied the effects of channel blocking with toxins on the information transmission and energy efficiency in squid giant axons. We found that gradually blocking sodium channels will sequentially maximize the information transmission and energy efficiency of the axons, whereas moderate blocking of potassium channels will have little impact on the information transmission and will decrease the energy efficiency. Heavy blocking of potassium channels will cause self-sustained oscillation of membrane potentials. Simultaneously blocking sodium and potassium channels with the same ratio increases both information transmission and energy efficiency. Our results are in line with previous studies suggesting that information processing capacity and energy efficiency can be maximized by regulating the number of active ion channels, and this indicates a viable avenue for future experimentation.
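A minimal sketch of the modelling approach, assuming standard Hodgkin-Huxley squid-axon parameters: channel block by a toxin is represented by scaling the maximal Na+ and K+ conductances, as the abstract describes. The stimulus, simulation length, and spike-detection rule are illustrative, and the paper's information-rate and energy calculations are omitted.

```python
import numpy as np

def hh_spike_count(block_na=0.0, block_k=0.0, I_ext=10.0, T=500.0, dt=0.01):
    """Hodgkin-Huxley neuron; toxin block is modelled by scaling the
    maximal conductances by (1 - blocked fraction)."""
    gNa = 120.0 * (1.0 - block_na)   # mS/cm^2, reduced by Na+ channel block
    gK = 36.0 * (1.0 - block_k)      # mS/cm^2, reduced by K+ channel block
    gL, ENa, EK, EL, C = 0.3, 50.0, -77.0, -54.4, 1.0

    V, m, h, n = -65.0, 0.05, 0.6, 0.32
    spikes, above = 0, False
    for _ in range(int(T / dt)):
        # Standard HH gating-rate functions (V in mV, rates in 1/ms).
        am = 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
        bm = 4.0 * np.exp(-(V + 65) / 18)
        ah = 0.07 * np.exp(-(V + 65) / 20)
        bh = 1.0 / (1 + np.exp(-(V + 35) / 10))
        an = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
        bn = 0.125 * np.exp(-(V + 65) / 80)
        m += dt * (am * (1 - m) - bm * m)
        h += dt * (ah * (1 - h) - bh * h)
        n += dt * (an * (1 - n) - bn * n)
        I_ion = (gNa * m**3 * h * (V - ENa) + gK * n**4 * (V - EK)
                 + gL * (V - EL))
        V += dt * (I_ext - I_ion) / C
        if V > 0 and not above:      # crude spike detection at 0 mV
            spikes, above = spikes + 1, True
        elif V < -40:
            above = False
    return spikes

for frac in (0.0, 0.2, 0.4):
    print(f"Na+ block {frac:.0%}: {hh_spike_count(block_na=frac)} spikes in 500 ms")
```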
Mathematics of Information Processing and the Internet
ERIC Educational Resources Information Center
Hart, Eric W.
2010-01-01
The mathematics of information processing and the Internet can be organized around four fundamental themes: (1) access (finding information easily); (2) security (keeping information confidential); (3) accuracy (ensuring accurate information); and (4) efficiency (data compression). In this article, the author discusses each theme with reference to…
Anxiety, anticipation and contextual information: A test of attentional control theory.
Cocks, Adam J; Jackson, Robin C; Bishop, Daniel T; Williams, A Mark
2016-09-01
We tested the assumptions of Attentional Control Theory (ACT) by examining the impact of anxiety on anticipation using a dynamic, time-constrained task. Moreover, we examined the involvement of high- and low-level cognitive processes in anticipation and how their importance may interact with anxiety. Skilled and less-skilled tennis players anticipated the shots of opponents under low- and high-anxiety conditions. Participants viewed three types of video stimuli, each depicting different levels of contextual information. Performance effectiveness (response accuracy) and processing efficiency (response accuracy divided by corresponding mental effort) were measured. Skilled players recorded higher levels of response accuracy and processing efficiency compared to less-skilled counterparts. Processing efficiency significantly decreased under high- compared to low-anxiety conditions. No difference in response accuracy was observed. When reviewing directional errors, anxiety was most detrimental to performance in the condition conveying only contextual information, suggesting that anxiety may have a greater impact on high-level (top-down) cognitive processes, potentially due to a shift in attentional control. Our findings provide partial support for ACT; anxiety elicited greater decrements in processing efficiency than performance effectiveness, possibly due to predominance of the stimulus-driven attentional system.
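The efficiency measure used here is a simple ratio, which a few lines make explicit; the numbers in the example are hypothetical, not the study's data.

```python
def processing_efficiency(response_accuracy, mental_effort):
    """Processing efficiency as defined in the abstract: response
    accuracy divided by the corresponding mental effort rating."""
    return response_accuracy / mental_effort

# Hypothetical numbers: same accuracy but more reported effort under high
# anxiety -> lower efficiency even though effectiveness is intact.
print(processing_efficiency(0.75, 60.0))   # low anxiety
print(processing_efficiency(0.75, 75.0))   # high anxiety
```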
Measuring Information Technology Performance: Operational Efficiency and Operational Effectiveness
ERIC Educational Resources Information Center
Moore, Annette G.
2012-01-01
This dissertation provides a practical approach for measuring operational efficiency and operational effectiveness for IT organizations introducing the ITIL process framework. The intent of the study was to assist Chief Information Officers (CIOs) in explaining the impact of introducing the Information Technology Infrastructure Library (ITIL)…
A learnable parallel processing architecture towards unity of memory and computing
NASA Astrophysics Data System (ADS)
Li, H.; Gao, B.; Chen, Z.; Zhao, Y.; Huang, P.; Ye, H.; Liu, L.; Liu, X.; Kang, J.
2015-08-01
Developing energy-efficient parallel information processing systems beyond von Neumann architecture is a long-standing goal of modern information technologies. The widely used von Neumann computer architecture separates memory and computing units, which leads to energy-hungry data movement when computers work. In order to meet the need for efficient information processing in data-driven applications such as big data and the Internet of Things, an energy-efficient processing architecture beyond von Neumann is critical for the information society. Here we show a non-von Neumann architecture built of resistive switching (RS) devices named “iMemComp”, where memory and logic are unified with single-type devices. Leveraging the nonvolatile nature and structural parallelism of crossbar RS arrays, we have equipped “iMemComp” with capabilities of computing in parallel and learning user-defined logic functions for large-scale information processing tasks. Such an architecture eliminates the energy-hungry data movement of von Neumann computers. Compared with contemporary silicon technology, adder circuits based on “iMemComp” improve speed by 76.8% and power dissipation by 60.3%, together with a 700-fold reduction in circuit area.
Kessels, Loes T E; Ruiter, Robert A C; Jansma, Bernadette M
2010-07-01
Previous studies indicate that people respond defensively to threatening health information, especially when the information challenges self-relevant goals. The authors investigated whether reduced acceptance of self-relevant health risk information is already visible in early attention processes, that is, attention disengagement processes. In a randomized, controlled trial with 29 smoking and nonsmoking students, a variant of Posner's cueing task was used in combination with the high-temporal resolution method of event-related brain potentials (ERPs). Reaction times and the P300 ERP served as outcome measures. Smokers showed lower P300 amplitudes in response to high- as opposed to low-threat invalid trials when moving their attention to a target in the opposite visual field, indicating more efficient attention disengagement processes. Furthermore, both smokers and nonsmokers showed increased P300 amplitudes in response to the presentation of high- as opposed to low-threat valid trials, indicating threat-induced attention-capturing processes. Reaction time measures did not support the ERP data, indicating that the ERP measure can be highly informative for measuring low-level attention biases in health communication. The findings provide the first neuroscientific support for the hypothesis that threatening health information causes more efficient disengagement among those for whom the health threat is self-relevant.
Information efficiency in visual communication
NASA Astrophysics Data System (ADS)
Alter-Gartenberg, Rachel; Rahman, Zia-ur
1993-08-01
This paper evaluates the quantization process in the context of the end-to-end performance of the visual-communication channel. Results show that the trade-off between data transmission and visual quality revolves around the information in the acquired signal, not around its energy. Improved information efficiency is gained by frequency dependent quantization that maintains the information capacity of the channel and reduces the entropy of the encoded signal. Restorations with energy bit-allocation lose both in sharpness and clarity relative to restorations with information bit-allocation. Thus, quantization with information bit-allocation is preferred for high information efficiency and visual quality in optimized visual communication.
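The paper's exact information-based criterion is not reproduced here, but frequency-dependent bit allocation is commonly formalized as reverse water-filling; the sketch below allocates more bits to bands with larger signal variance under a fixed bit budget. The band variances and budget are illustrative assumptions.

```python
import numpy as np

def allocate_bits(band_variances, bit_budget, tol=1e-9):
    """Frequency-dependent bit allocation: give each band
    b_k = max(0, 0.5*log2(var_k / theta)) bits, with the water level theta
    chosen by bisection so the total equals the budget."""
    v = np.asarray(band_variances, dtype=float)
    lo, hi = tol, v.max()
    while hi - lo > tol:
        theta = 0.5 * (lo + hi)
        bits = np.maximum(0.0, 0.5 * np.log2(v / theta))
        if bits.sum() > bit_budget:
            lo = theta   # water level too low: spending too many bits
        else:
            hi = theta
    return np.maximum(0.0, 0.5 * np.log2(v / theta))

# Bands carrying more signal information receive more bits; weak bands may get none.
print(allocate_bits([4.0, 1.0, 0.25, 0.05], bit_budget=6).round(2))
```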
See, Ya Hui Michelle; Petty, Richard E; Fabrigar, Leandre R
2013-08-01
We proposed that (a) processing interest for affective over cognitive information is captured by meta-bases (i.e., the extent to which people subjectively perceive themselves to rely on affect or cognition in their attitudes) and (b) processing efficiency for affective over cognitive information is captured by structural bases (i.e., the extent to which attitudes are more evaluatively congruent with affect or cognition). Because processing speed can disentangle interest from efficiency by being manifest as longer or shorter reading times, we hypothesized and found that more affective meta-bases predicted longer affective than cognitive reading time when processing efficiency was held constant (Study 1). In contrast, more affective structural bases predicted shorter affective than cognitive reading time when participants were constrained in their ability to allocate resources deliberatively (Study 2). When deliberation was neither encouraged nor constrained, effects for meta-bases and structural bases emerged (Study 3). Implications for affective-cognitive processing and other attitudes-relevant constructs are discussed.
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.; Fales, Carl L.
1990-01-01
Researchers are concerned with the end-to-end performance of image gathering, coding, and processing. The applications range from high-resolution television to vision-based robotics, wherever the resolution, efficiency and robustness of visual information acquisition and processing are critical. For the presentation at this workshop, it is convenient to divide research activities into the following two overlapping areas: The first is the development of focal-plane processing techniques and technology to effectively combine image gathering with coding, with an emphasis on low-level vision processing akin to the retinal processing in human vision. The approach includes the familiar Laplacian pyramid, the new intensity-dependent spatial summation, and parallel sensing/processing networks. Three-dimensional image gathering is attained by combining laser ranging with sensor-array imaging. The second is the rigorous extension of information theory and optimal filtering to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing.
Sengupta, Biswa; Laughlin, Simon Barry; Niven, Jeremy Edward
2014-01-01
Information is encoded in neural circuits using both graded and action potentials, converting between them within single neurons and successive processing layers. This conversion is accompanied by information loss and a drop in energy efficiency. We investigate the biophysical causes of this loss of information and efficiency by comparing spiking neuron models, containing stochastic voltage-gated Na(+) and K(+) channels, with generator potential and graded potential models lacking voltage-gated Na(+) channels. We identify three causes of information loss in the generator potential that are the by-product of action potential generation: (1) the voltage-gated Na(+) channels necessary for action potential generation increase intrinsic noise and (2) introduce non-linearities, and (3) the finite duration of the action potential creates a 'footprint' in the generator potential that obscures incoming signals. These three processes reduce information rates by ∼50% in generator potentials, to ∼3 times that of spike trains. Both generator potentials and graded potentials consume almost an order of magnitude less energy per second than spike trains. Because of the lower information rates of generator potentials they are substantially less energy efficient than graded potentials. However, both are an order of magnitude more efficient than spike trains due to the higher energy costs and low information content of spikes, emphasizing that there is a two-fold cost of converting analogue to digital; information loss and cost inflation.
Some Information-Processing Correlates of Measures of Intelligence
ERIC Educational Resources Information Center
Lunneborg, Clifford E.
1978-01-01
Group and individually administered measures of intelligence were related to laboratory-based measures of human information processing in a group of college freshmen. Among other results, high IQ was related to right-hemisphere efficiency in processing non-linguistic stimuli. (Author/JKS)
NASA Astrophysics Data System (ADS)
Anderson, O. Roger
The rate at which information is processed during science learning, and the efficiency with which the learner mobilizes relevant information in long-term memory to help transfer newly acquired information into stable long-term storage, are fundamental aspects of science content acquisition. These cognitive processes, moreover, may be substantially related, in tempo and quality of organization, to the efficiency of higher thought processes such as divergent thinking and problem-solving ability that characterize scientific thought. As a contribution to our quantitative understanding of these fundamental information processes, a mathematical model of information acquisition is presented and evaluated against evidence from experimental studies of science content acquisition. Computer-based models are used to simulate variations in learning parameters and to generate the theoretical predictions to be empirically tested. Initial tests of the predictive accuracy of the model show close agreement between predicted and actual mean recall scores in short-term learning tasks. Implications of the model for human information acquisition and possible future research are discussed in the context of the unique theoretical framework of the model.
A universal quantum information processor for scalable quantum communication and networks
Yang, Xihua; Xue, Bolin; Zhang, Junxiang; Zhu, Shiyao
2014-01-01
Entanglement provides an essential resource for quantum computation, quantum communication, and quantum networks. How to conveniently and efficiently realize the generation, distribution, storage, retrieval, and control of multipartite entanglement is the basic requirement for realistic quantum information processing. Here, we present a theoretical proposal to efficiently and conveniently achieve a universal quantum information processor (QIP) via atomic coherence in an atomic ensemble. The atomic coherence, produced through electromagnetically induced transparency (EIT) in the Λ-type configuration, acts as the QIP and has full functions of quantum beam splitter, quantum frequency converter, quantum entangler, and quantum repeater. By employing EIT-based nondegenerate four-wave mixing processes, the generation, exchange, distribution, and manipulation of light-light, atom-light, and atom-atom multipartite entanglement can be efficiently and flexibly achieved in a deterministic way with only coherent light fields. This method greatly facilitates the operations in quantum information processing, and holds promising applications in realistic scalable quantum communication and quantum networks. PMID:25316514
Creativity, information, and consciousness: The information dynamics of thinking.
Wiggins, Geraint A
2018-05-07
This paper presents a theory of the basic operation of mind, Information Dynamics of Thinking, which is intended for computational implementation and thence empirical testing. It is based on the information theory of Shannon, and treats the mind/brain as an information processing organ that aims to be information-efficient, in that it predicts its world, so as to use information efficiently, and regularly re-represents it, so as to store information efficiently. The theory is presented in the context of a background review of various research areas that impinge upon its development. Consequences of the theory and testable hypotheses arising from it are discussed.
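Since the theory rests on Shannon's account of prediction and information efficiency, a small worked example may help: a predictor that matches the statistics of its world pays fewer bits per symbol. The bigram model and toy sequences below are illustrative assumptions, not part of the paper.

```python
import math
from collections import Counter

def bigram_surprisal(sequence):
    """Average surprisal (bits/symbol) of a sequence under a bigram model
    learned from the sequence itself."""
    pair_counts = Counter(zip(sequence, sequence[1:]))
    ctx_counts = Counter(sequence[:-1])
    total = 0.0
    for (a, b), c in pair_counts.items():
        p = c / ctx_counts[a]          # p(next symbol | current symbol)
        total += c * -math.log2(p)     # surprisal weighted by occurrences
    return total / (len(sequence) - 1)

print(bigram_surprisal("abababababababab"))      # perfectly predictable: 0 bits
print(bigram_surprisal("abbabaabbbabababaaab"))  # less predictable: more bits
```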
Theoretical aspects of cellular decision-making and information-processing.
Kobayashi, Tetsuya J; Kamimura, Atsushi
2012-01-01
Microscopic biological processes have extraordinary complexity and variety at the sub-cellular, intra-cellular, and multi-cellular levels. In dealing with such complex phenomena, conceptual and theoretical frameworks are crucial, because they enable us to understand seemingly different intra- and inter-cellular phenomena from unified viewpoints. Decision-making is one such concept that has attracted much attention recently. Since much cellular behavior can be regarded as a process of taking specific actions in response to external stimuli, decision-making can cover, and has been used to explain, a broad range of different cellular phenomena [Balázsi et al. (Cell 144(6):910, 2011), Zeng et al. (Cell 141(4):682, 2010)]. Decision-making is also closely related to cellular information-processing because appropriate decisions cannot be made without exploiting the information that the external stimuli contain. The efficiency of information transduction and processing by intra-cellular networks determines the amount of information obtained, which in turn limits the efficiency of subsequent decision-making. Furthermore, information-processing itself can serve as another concept that is crucial for understanding biological processes other than decision-making. In this work, we review recent theoretical developments on cellular decision-making and information-processing by focusing on the relation between these two concepts.
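Because the efficiency of information transduction is naturally quantified by mutual information, here is a minimal sketch computing it from a joint stimulus-response table; the two example channels are hypothetical.

```python
import numpy as np

def mutual_information(joint):
    """Mutual information (bits) between stimulus and response, from a
    joint probability table p(stimulus, response)."""
    p = np.asarray(joint, dtype=float)
    p = p / p.sum()
    px = p.sum(axis=1, keepdims=True)   # marginal over stimuli
    py = p.sum(axis=0, keepdims=True)   # marginal over responses
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (px @ py)[nz])).sum())

# A noisier signalling pathway transmits less information about the stimulus.
crisp = [[0.45, 0.05], [0.05, 0.45]]
noisy = [[0.30, 0.20], [0.20, 0.30]]
print(mutual_information(crisp))  # ~0.53 bits
print(mutual_information(noisy))  # ~0.03 bits
```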
High efficiency coherent optical memory with warm rubidium vapour
Hosseini, M.; Sparkes, B.M.; Campbell, G.; Lam, P.K.; Buchler, B.C.
2011-01-01
By harnessing aspects of quantum mechanics, communication and information processing could be radically transformed. Promising forms of quantum information technology include optical quantum cryptographic systems and computing using photons for quantum logic operations. As with current information processing systems, some form of memory will be required. Quantum repeaters, which are required for long distance quantum key distribution, require quantum optical memory as do deterministic logic gates for optical quantum computing. Here, we present results from a coherent optical memory based on warm rubidium vapour and show 87% efficient recall of light pulses, the highest efficiency measured to date for any coherent optical memory suitable for quantum information applications. We also show storage and recall of up to 20 pulses from our system. These results show that simple warm atomic vapour systems have clear potential as a platform for quantum memory. PMID:21285952
Homo Heuristicus: Less-is-More Effects in Adaptive Cognition
Brighton, Henry; Gigerenzer, Gerd
2012-01-01
Heuristics are efficient cognitive processes that ignore information. In contrast to the widely held view that less processing reduces accuracy, the study of heuristics shows that less information, computation, and time can in fact improve accuracy. We discuss some of the major progress made so far, focusing on the discovery of less-is-more effects and the study of the ecological rationality of heuristics which examines in which environments a given strategy succeeds or fails, and why. Homo heuristicus has a biased mind and ignores part of the available information, yet a biased mind can handle uncertainty more efficiently and robustly than an unbiased mind relying on more resource-intensive and general-purpose processing strategies. PMID:23613644
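As one concrete example of a heuristic that ignores information, the sketch below implements take-the-best, a heuristic prominent in this literature: cues are consulted in order of validity, and the first cue that discriminates decides. The city-size cues and values are hypothetical.

```python
def take_the_best(obj_a, obj_b, cues_by_validity):
    """Compare two objects on binary cues, ordered from most to least
    valid. Decide on the FIRST discriminating cue; all remaining
    information is deliberately ignored."""
    for cue in cues_by_validity:
        va, vb = obj_a[cue], obj_b[cue]
        if va != vb:
            return "A" if va > vb else "B"
    return "guess"

# Hypothetical cues for judging which city is larger, ordered by validity.
cues = ["has_intl_airport", "has_university", "has_major_team"]
city_a = {"has_intl_airport": 1, "has_university": 1, "has_major_team": 0}
city_b = {"has_intl_airport": 0, "has_university": 1, "has_major_team": 1}
print(take_the_best(city_a, city_b, cues))  # "A": decided by the first cue alone
```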
ERIC Educational Resources Information Center
Lamb, Richard L.; Firestone, Jonah B.
2017-01-01
Conflicting explanations and unrelated information in science classrooms increase cognitive load and decrease efficiency in learning. This reduced efficiency ultimately limits one's ability to solve reasoning problems in the science. In reasoning, it is the ability of students to sift through and identify critical pieces of information that is of…
Information Diffusion in Facebook-Like Social Networks Under Information Overload
NASA Astrophysics Data System (ADS)
Li, Pei; Xing, Kai; Wang, Dapeng; Zhang, Xin; Wang, Hui
2013-07-01
Research on social networks has received remarkable attention, since many people use social networks to broadcast information and stay connected with their friends. However, due to information overload in social networks, it becomes increasingly difficult for users to find useful information. This paper considers Facebook-like social networks and models the process of information diffusion under information overload. The term view scope is introduced to model the user's information-processing capability under information overload, and the average number of times a message appears in view scopes after it is generated is proposed to characterize the information diffusion efficiency. Through theoretical analysis, we find that factors such as network structure and view scope number have no impact on the information diffusion efficiency, which is a surprising result. To verify the results, we conduct simulations and provide the simulation results, which agree perfectly with the theoretical analysis.
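To give a feel for the view-scope construct, here is a toy simulation (my own simplification, not the paper's analytical model): each new message enters followers' fixed-capacity FIFO view scopes, older items fall out under overload, and occasional reads empty a scope; the function reports the average number of reads a message receives.

```python
import random
from collections import deque

random.seed(3)

def avg_views_per_message(n_users=200, n_follow=10, scope=20,
                          reads_per_post=5, ticks=20000):
    """Toy model of diffusion under information overload."""
    followers = [random.sample(range(n_users), n_follow) for _ in range(n_users)]
    scopes = [deque(maxlen=scope) for _ in range(n_users)]
    views = {}
    for tick in range(ticks):
        author = random.randrange(n_users)
        msg = tick                     # each tick produces one new message
        views[msg] = 0
        for f in followers[author]:
            scopes[f].append(msg)      # oldest item silently falls out
        for _ in range(reads_per_post):
            reader = random.randrange(n_users)
            for m in scopes[reader]:
                views[m] += 1
            scopes[reader].clear()     # read items leave the view scope
    return sum(views.values()) / len(views)

print(avg_views_per_message())
```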
Image detection and compression for memory efficient system analysis
NASA Astrophysics Data System (ADS)
Bayraktar, Mustafa
2015-02-01
Advances in digital signal processing have been progressing toward efficient use of memory and computation. Both resources can be used efficiently through feasible image-storage techniques that compute the minimum information of an image, which speeds up later processing. The Scale Invariant Feature Transform (SIFT) can be used to represent and retrieve an image. In computer vision, SIFT can be implemented to recognize an image by comparing its key features against saved SIFT keypoint descriptors. The main advantage of SIFT is that it not only removes redundant information from an image but also reduces the keypoints by matching their orientation and adding them together in different windows of the image [1]. Another key property of this approach is that it works more efficiently on high-contrast images, because its design is based on collecting keypoints from the contrast shades of the image.
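A short sketch of SIFT-based recognition with OpenCV, assuming two overlapping images of the same scene are available at the placeholder paths: keypoints are detected, 128-dimensional descriptors computed, and matches filtered with Lowe's ratio test so the image is recognized from a compact set of key features.

```python
import cv2

# Paths are placeholders; any two overlapping photos of the same scene work.
img1 = cv2.imread("scene_a.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("scene_b.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)   # keypoints + 128-d descriptors
kp2, des2 = sift.detectAndCompute(img2, None)

# Keep only distinctive matches (Lowe's ratio test).
matcher = cv2.BFMatcher()
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]
print(f"{len(kp1)} and {len(kp2)} keypoints, {len(good)} good matches")
```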
Yu, Lianchun; Shen, Zhou; Wang, Chen; Yu, Yuguo
2018-01-01
Selective pressure may drive neural systems to process as much information as possible at the lowest energy cost. Recent experimental evidence revealed that the ratio between synaptic excitation and inhibition (E/I) in local cortex is generally maintained at a certain value, which may influence the efficiency of energy consumption and information transmission in neural networks. To understand this issue more deeply, we constructed a typical recurrent Hodgkin-Huxley network model and studied the general principles that govern the relationship among the E/I synaptic current ratio, the energy cost, and the total amount of information transmission. We observed in such a network that there exists an optimal E/I synaptic current ratio at which information transmission achieves its maximum at relatively low energy cost. The coding energy efficiency, defined as the mutual information divided by the energy cost, achieved its maximum with balanced synaptic currents. Although background noise degrades information transmission and imposes an additional energy cost, we find an optimal noise intensity that yields the largest information transmission and energy efficiency at this optimal E/I synaptic transmission ratio. The maximization of energy efficiency also requires that a certain part of the energy cost be associated with spontaneous spiking and synaptic activities. We further proved this finding with an analytical solution based on the response function of bistable neurons, and demonstrated that optimal net synaptic currents are capable of maximizing both the mutual information and the energy efficiency. These results reveal that the development of E/I synaptic current balance could lead a cortical network to operate at a highly efficient information transmission rate with relatively low energy cost. The generality of the neuronal models and the recurrent network configuration used here suggests that the existence of an optimal E/I cell ratio for highly efficient energy costs and information maximization is a potential principle for cortical circuit networks. In summary, we conducted numerical simulations and mathematical analysis to examine the energy efficiency of neural information transmission in a recurrent network as a function of the ratio of excitatory and inhibitory synaptic connections. We obtained a general solution showing that there exists an optimal E/I synaptic ratio in a recurrent network at which the information transmission, as well as the energy efficiency of the network, achieves a global maximum. These results reflect general mechanisms of sensory coding processes, which may give insight into the energy efficiency of neural communication and coding.
The Communicative Function of Ambiguity in Language
ERIC Educational Resources Information Center
Piantadosi, Steven T.; Tily, Harry; Gibson, Edward
2012-01-01
We present a general information-theoretic argument that all efficient communication systems will be ambiguous, assuming that context is informative about meaning. We also argue that ambiguity allows for greater ease of processing by permitting efficient linguistic units to be re-used. We test predictions of this theory in English, German, and…
NASA Astrophysics Data System (ADS)
Okawa, Tsutomu; Kaminishi, Tsukasa; Hirabayashi, Syuichi; Suzuki, Ryo; Mitsui, Hiroyasu; Koizumi, Hisao
Business activities in the enterprise are so closely tied to the information system that they are difficult to carry out without it. System design techniques are therefore needed that model the business process well and enable quick system development. In addition, demands on development cost are more severe than before. To cope with this situation, the modeling technology named BPM (Business Process Management/Modeling) is drawing attention and becoming important as a key technology. BPM is a technology for modeling business activities as business processes and visualizing them to improve business efficiency. However, no general methodology exists for developing an information system from the analysis results of BPM, and only a few development cases have been reported. This paper proposes an information system development method that combines business process modeling with executable modeling. We describe a guideline that supports consistency and efficiency of development, and a framework that enables the information system to be developed from the model. We have prototyped an information system with the proposed method, and our experience shows that the methodology is valuable.
Keller, Carmen
2011-07-01
Previous experimental research provides evidence that a familiar risk comparison within a risk ladder is understood by low- and high-numerate individuals, and that it especially helps low numerates to better evaluate risk. In the present study, an eye tracker was used to capture individuals' visual attention to a familiar risk comparison, such as the risk associated with smoking. Two parameters of information processing, efficiency and level, were derived from visual attention. A random sample of participants from the general population (N = 68) interpreted a given risk level with the help of the risk ladder. Numeracy was negatively correlated with overall visual attention on the risk ladder (r(s) = -0.28, p = 0.01), indicating that the lower the numeracy, the more time spent looking at the whole risk ladder. Numeracy was positively correlated with the efficiency of processing relevant frequency information (r(s) = 0.34, p < 0.001) and relevant textual information (r(s) = 0.34, p < 0.001), but not with the efficiency of processing relevant comparative information and numerical information. There was a significant negative correlation between numeracy and the level of processing of relevant comparative risk information (r(s) = -0.21, p < 0.01), indicating that low numerates processed the comparative risk information more deeply than high numerates. There was no correlation between numeracy and perceived risk. These results add to previous experimental research indicating that the smoking risk comparison was crucial for low numerates to evaluate and understand risk. Furthermore, the eye-tracker method is promising for studying information processing and improving risk communication formats.
Cybernetic Basis and System Practice of Remote Sensing and Spatial Information Science
NASA Astrophysics Data System (ADS)
Tan, X.; Jing, X.; Chen, R.; Ming, Z.; He, L.; Sun, Y.; Sun, X.; Yan, L.
2017-09-01
Cybernetics provides a new set of ideas and methods for the study of modern science and has been applied extensively in many areas. However, few researchers have introduced cybernetics into the field of remote sensing. Starting from the imaging process of a remote sensing system, this paper introduces cybernetics into remote sensing and establishes a space-time closed-loop control theory for its actual operation. This makes the flow of spatial information coherent and improves the comprehensive efficiency of spatial information from acquisition and processing through transformation to application. We describe the application of cybernetics not only to remote sensing platform control, sensor control, and data processing control, but also to control of the whole remote sensing imaging process. Feeding the output information back to the input allows the entire system to operate efficiently. This combination of cybernetic science and remote sensing science will raise remote sensing to a higher level.
Zedelius, Claire M.; Veling, Harm; Aarts, Henk
2012-01-01
Research has shown that high vs. low value rewards improve cognitive task performance independent of whether they are perceived consciously or unconsciously. However, efficient performance in response to high value rewards also depends on whether or not the rewards are attainable. This raises the question of whether unconscious reward processing enables people to take such attainability information into account. Building on a theoretical framework according to which conscious reward processing is required to enable higher level cognitive processing, the present research tested the hypothesis that conscious but not unconscious reward processing enables integration of reward value with attainability information. In two behavioral experiments, participants were exposed to masked high and low value coins serving as rewards on a working memory (WM) task. The likelihood of conscious processing was manipulated by presenting the coins relatively briefly (17 ms) or long enough to be clearly visible (300 ms). Crucially, rewards were expected to be attainable or unattainable. Requirements to integrate reward value with attainability information varied across experiments. Results showed that when integration of value and attainability was required (Experiment 1), long reward presentation led to efficient performance, i.e., selectively improved performance for high value attainable rewards. In contrast, in the short presentation condition, performance was increased for high value rewards even when these were unattainable. This difference between the effects of long and short presentation time disappeared when integration of value and attainability information was not required (Experiment 2). Together these findings suggest that unconsciously processed reward information is not integrated with attainability expectancies, causing inefficient effort investment. These findings are discussed in terms of a unique role of consciousness in the efficient allocation of effort to cognitive control processes. PMID:22848198
Code of Federal Regulations, 2012 CFR
2012-01-01
7 Agriculture, § 1218.10 (Blueberry Promotion, Research, and Information Order, Definitions): Information means information and programs that are designed to increase efficiency in processing and to...
Code of Federal Regulations, 2013 CFR
2013-01-01
7 Agriculture, § 1216.13 (Peanut Promotion, Research, and Information Order, Definitions): Information means information and programs that are designed to increase efficiency in processing and to...
Code of Federal Regulations, 2012 CFR
2012-01-01
7 Agriculture, § 1216.13 (Peanut Promotion, Research, and Information Order, Definitions): Information means information and programs that are designed to increase efficiency in processing and to...
Code of Federal Regulations, 2011 CFR
2011-01-01
7 Agriculture, § 1216.13 (Peanut Promotion, Research, and Information Order, Definitions): Information means information and programs that are designed to increase efficiency in processing and to...
Code of Federal Regulations, 2014 CFR
2014-01-01
7 Agriculture, § 1218.10 (Blueberry Promotion, Research, and Information Order, Definitions): Information means information and programs that are designed to increase efficiency in processing and to...
Code of Federal Regulations, 2013 CFR
2013-01-01
7 Agriculture, § 1218.10 (Blueberry Promotion, Research, and Information Order, Definitions): Information means information and programs that are designed to increase efficiency in processing and to...
Code of Federal Regulations, 2011 CFR
2011-01-01
7 Agriculture, § 1218.10 (Blueberry Promotion, Research, and Information Order, Definitions): Information means information and programs that are designed to increase efficiency in processing and to...
Code of Federal Regulations, 2014 CFR
2014-01-01
7 Agriculture, § 1216.13 (Peanut Promotion, Research, and Information Order, Definitions): Information means information and programs that are designed to increase efficiency in processing and to...
Code of Federal Regulations, 2010 CFR
2010-01-01
7 Agriculture, § 1216.13 (Peanut Promotion, Research, and Information Order, Definitions): Information means information and programs that are designed to increase efficiency in processing and to...
Code of Federal Regulations, 2010 CFR
2010-01-01
7 Agriculture, § 1218.10 (Blueberry Promotion, Research, and Information Order, Definitions): Information means information and programs that are designed to increase efficiency in processing and to...
Analysis of E-marketplace Attributes: Assessing The NATO Logistics Stock Exchange
2008-01-01
[Table excerpt; only fragments are recoverable: rated e-marketplace attributes include satisfaction (mean 4.02, s.d. 0.151), reduction of order processing time (4.27, 0.317), reduction of stock levels (3.87, 0.484), reduction of payment processing time, reduction of excessive stocks, reduction of maverick buying, and efficient information exchange with partners in the supply chain, each classified by efficiency importance (basic/important).]
Cima, Robert R; Brown, Michael J; Hebl, James R; Moore, Robin; Rogers, James C; Kollengode, Anantha; Amstutz, Gwendolyn J; Weisbrod, Cheryl A; Narr, Bradly J; Deschamps, Claude
2011-07-01
Operating rooms (ORs) are resource-intense and costly hospital units. Maximizing OR efficiency is essential to maintaining an economically viable institution. OR efficiency projects often focus on a limited number of ORs or cases. Efforts across an entire OR suite have not been reported. Lean and Six Sigma methodologies were developed in the manufacturing industry to increase efficiency by eliminating non-value-added steps. We applied Lean and Six Sigma methodologies across an entire surgical suite to improve efficiency. A multidisciplinary surgical process improvement team constructed a value stream map of the entire surgical process, from the decision for surgery to discharge. Each process step was analyzed in 3 domains, i.e., personnel, information processed, and time. Multidisciplinary teams addressed 5 work streams to increase value at each step: minimizing volume variation; streamlining the preoperative process; reducing nonoperative time; eliminating redundant information; and promoting employee engagement. Process improvements were implemented sequentially in surgical specialties. Key performance metrics were collected before and after implementation. Across 3 surgical specialties, process redesign resulted in substantial improvements in on-time starts and a reduction in the number of cases running past 5 pm. Substantial gains were achieved in nonoperative time, staff overtime, and ORs saved. These changes resulted in substantial increases in margin/OR/day. Use of Lean and Six Sigma methodologies increased OR efficiency and financial performance across an entire operating suite. Process mapping, leadership support, staff engagement, and sharing performance metrics are keys to enhancing OR efficiency. The performance gains were substantial, sustainable, positive financially, and transferrable to other specialties.
Selective Attention with Separable Stimuli Using a Speeded Task.
ERIC Educational Resources Information Center
Kolbet, Lori L.; Garvey, Jackie
The ability to allocate attentional resources to relevant aspects of a stimulus event is a critical skill needed for efficient information processing. Evidence suggests that this ability to focus on relevant information without interference is dependent on the nature of the stimulus structure of the information to be processed. To test the…
ERIC Educational Resources Information Center
Hathaway, Walter E.
Efficient and convenient comprehensive information systems, long kept from coming into being by a variety of obstacles, are now made possible by the concept of distributive processing and the technology of micro- and mini-computer networks. Such systems can individualize instruction, group students efficiently, cut administrative costs, streamline…
Temporal Expectation and Information Processing: A Model-Based Analysis
ERIC Educational Resources Information Center
Jepma, Marieke; Wagenmakers, Eric-Jan; Nieuwenhuis, Sander
2012-01-01
People are able to use temporal cues to anticipate the timing of an event, enabling them to process that event more efficiently. We conducted two experiments, using the fixed-foreperiod paradigm (Experiment 1) and the temporal-cueing paradigm (Experiment 2), to assess which components of information processing are speeded when subjects use such…
Information processing efficiency in patients with multiple sclerosis.
Archibald, C J; Fisk, J D
2000-10-01
Reduced information processing efficiency, consequent to impaired neural transmission, has been proposed as underlying various cognitive problems in patients with Multiple Sclerosis (MS). This study employed two measures developed from experimental psychology that control for the potential confound of perceptual-motor abnormalities (Salthouse, Babcock, & Shaw, 1991; Sternberg, 1966, 1969) to assess the speed of information processing and working memory capacity in patients with mild to moderate MS. Although patients had significantly more cognitive complaints than neurologically intact matched controls, their performance on standard tests of immediate memory span did not differ from control participants and their word list learning was within normal limits. On the experimental measures, both relapsing-remitting and secondary-progressive patients exhibited significantly slowed information processing speed relative to controls. However, only the secondary-progressive patients had an additional decrement in working memory capacity. Depression, fatigue, or neurologic disability did not account for performance differences on these measures. While speed of information processing may be slowed early in the disease process, deficits in working memory capacity may appear only as there is progression of MS. It is these latter deficits, however, that may underlie the impairment of new learning that patients with MS demonstrate.
Smart information system for gachon university gil hospital.
Park, Dong Kyun; Jung, Eun Young; Jeong, Byung Hui; Moon, Byung Chan; Kang, Hyung Wook; Tchah, Hann; Han, Gi Seong; Cheng, Woo Sung; Lee, Young Ho
2012-03-01
In this research, the hospital information system of Gachon University Gil hospital is introduced and a future strategy for hospital information systems is proposed. This research introduces the development conditions of the hospital information system at Gachon University Gil hospital, information about the development of the enterprise resource planning (ERP) system, a medical service process improvement system, and the personal health record (PHR) system. The medical service process and work efficiency were improved through the medical service process improvement system, the most widely used hospital information system at Gachon University Gil hospital, which includes an emergency medical service system, an online evaluation system and a round support system. Gachon University Gil hospital developed these medical service improvement systems to increase the work efficiency of the medical team and optimized them to ensure the availability of high-quality medical services for patients and their families. The PHR-based personalized health care solution is under development and will provide higher-quality medical service for more patients in the future.
Optimization of Wireless Transceivers under Processing Energy Constraints
NASA Astrophysics Data System (ADS)
Wang, Gaojian; Ascheid, Gerd; Wang, Yanlu; Hanay, Oner; Negra, Renato; Herrmann, Matthias; Wehn, Norbert
2017-09-01
The focus of this article is on achieving maximum data rates under a processing energy constraint. For a given amount of processing energy per information bit, the overall power consumption increases with the data rate. When targeting data rates beyond 100 Gb/s, the system's overall power consumption soon exceeds the power that can be dissipated without forced cooling. To achieve a maximum data rate under this power constraint, the processing energy per information bit must be minimized. Therefore, in this article, suitable processing-efficient transmission schemes together with energy-efficient architectures and their implementations are investigated in a true cross-layer approach. Target use cases are short-range wireless transmitters working at carrier frequencies around 60 GHz and bandwidths between 1 GHz and 10 GHz.
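The constraint described here reduces to a simple relation: sustained power is the processing energy per bit multiplied by the data rate, so a fixed dissipation budget caps the achievable rate. A minimal sketch of that arithmetic, with all numbers assumed for illustration rather than taken from the article:

```python
# Illustrative sketch: relating processing energy per bit to the maximum
# sustainable data rate under a power-dissipation cap. All numbers below
# are assumptions for illustration.

def max_data_rate(power_budget_w: float, energy_per_bit_j: float) -> float:
    """Maximum data rate (bit/s) sustainable under a power budget,
    given the processing energy consumed per information bit."""
    return power_budget_w / energy_per_bit_j

power_budget = 1.0                 # W, assumed limit without forced cooling
for e_bit in (10e-12, 1e-12):      # 10 pJ/bit vs. 1 pJ/bit processing energy
    rate = max_data_rate(power_budget, e_bit)
    print(f"{e_bit * 1e12:5.1f} pJ/bit -> {rate / 1e9:8.1f} Gb/s")
# 10 pJ/bit caps the link at 100 Gb/s under a 1 W budget; going beyond
# 100 Gb/s requires cutting the processing energy per bit below 10 pJ.
```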
A communication channel model of the software process
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1988-01-01
Reported here is beginning research into a noisy communication channel analogy of software development process productivity, in order to establish quantifiable behavior and theoretical bounds. The analogy leads to a fundamental mathematical relationship between human productivity and the amount of information supplied by the developers, the capacity of the human channel for processing and transmitting information, the software product yield (object size), the work effort, requirements efficiency, tool and process efficiency, and programming environment advantage. Also derived is an upper bound to productivity, which shows that software reuse is the only means that can lead to unbounded productivity growth; practical considerations of size and cost of reusable components may reduce this to a finite bound.
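The reuse claim can be made concrete with a toy calculation. The sketch below is not Tausworthe's actual formulation; it merely assumes that effort is bounded below by the new information that must pass through a fixed-capacity human channel, with the capacity and per-line information content invented for the example:

```python
# Illustrative sketch, not Tausworthe's exact model: if developer effort is
# bounded below by (new information to produce) / (human channel capacity),
# then productivity = yield / effort grows without bound only as the reused
# fraction of the product approaches 1.

CAPACITY = 100.0      # assumed human channel capacity, "bits" per day
INFO_PER_LINE = 10.0  # assumed information content of one newly written line

def productivity(total_lines: float, reuse_fraction: float) -> float:
    new_lines = total_lines * (1.0 - reuse_fraction)
    effort_days = max(new_lines * INFO_PER_LINE / CAPACITY, 1e-9)
    return total_lines / effort_days   # lines delivered per day of effort

for r in (0.0, 0.5, 0.9, 0.99):
    print(f"reuse {r:4.2f} -> {productivity(10_000, r):10.1f} lines/day")
# Productivity scales as 1/(1 - reuse_fraction): capacity or tooling gains
# give only constant factors, while reuse has no finite ceiling.
```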
Information theoretical assessment of image gathering and coding for digital restoration
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.; John, Sarah; Reichenbach, Stephen E.
1990-01-01
The process of image-gathering, coding, and restoration is presently treated in its entirety rather than as a catenation of isolated tasks, on the basis of the relationship between the spectral information density of a transmitted signal and the restorability of images from the signal. This 'information-theoretic' assessment accounts for the information density and efficiency of the acquired signal as a function of the image-gathering system's design and radiance-field statistics, as well as for the information efficiency and data compression that are obtainable through the combination of image gathering with coding to reduce signal redundancy. It is found that high information efficiency is achievable only through minimization of image-gathering degradation as well as signal redundancy.
7 CFR 1219.15 - Industry information.
Code of Federal Regulations, 2012 CFR
2012-01-01
... efficiency in processing, enhance the development of new markets and marketing strategies, increase marketing efficiency, and enhance the image of Hass avocados and the Hass avocado industry in the United States. ...
7 CFR 1219.15 - Industry information.
Code of Federal Regulations, 2013 CFR
2013-01-01
... efficiency in processing, enhance the development of new markets and marketing strategies, increase marketing efficiency, and enhance the image of Hass avocados and the Hass avocado industry in the United States. ...
7 CFR 1219.15 - Industry information.
Code of Federal Regulations, 2011 CFR
2011-01-01
... efficiency in processing, enhance the development of new markets and marketing strategies, increase marketing efficiency, and enhance the image of Hass avocados and the Hass avocado industry in the United States. ...
7 CFR 1219.15 - Industry information.
Code of Federal Regulations, 2014 CFR
2014-01-01
... efficiency in processing, enhance the development of new markets and marketing strategies, increase marketing efficiency, and enhance the image of Hass avocados and the Hass avocado industry in the United States. ...
Faghihi, Faramarz; Kolodziejski, Christoph; Fiala, André; Wörgötter, Florentin; Tetzlaff, Christian
2013-12-20
Fruit flies (Drosophila melanogaster) rely on their olfactory system to process environmental information. This information has to be transmitted without system-relevant loss by the olfactory system to deeper brain areas for learning. Here we study the role of several parameters of the fly's olfactory system and the environment and how they influence olfactory information transmission. We have designed an abstract model of the antennal lobe, the mushroom body and the inhibitory circuitry. Mutual information between the olfactory environment, simulated in terms of different odor concentrations, and a sub-population of intrinsic mushroom body neurons (Kenyon cells) was calculated to quantify the efficiency of information transmission. With this method we study, on the one hand, the effect of different connectivity rates between olfactory projection neurons and firing thresholds of Kenyon cells. On the other hand, we analyze the influence of inhibition on mutual information between environment and mushroom body. Our simulations show the expected linear relation between the connectivity rate from the antennal lobe to the mushroom body and the firing threshold of the Kenyon cells required to obtain maximum mutual information, for both low and high odor concentrations. However, contradicting everyday experience, high odor concentrations cause a drastic, and unrealistic, decrease in mutual information for all connectivity rates compared to low concentrations. But when inhibition on the mushroom body is included, mutual information remains at high levels independent of the other system parameters. This finding points to a pivotal role of inhibition in fly information processing, without which system efficiency would be substantially reduced.
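As a concrete illustration of the quantity being maximized in this model, the sketch below computes a plug-in estimate of the mutual information between a discrete odor-concentration variable and a binary Kenyon-cell population response; the joint distribution is hypothetical rather than taken from the study:

```python
import numpy as np

# Minimal sketch (assumed numbers, not the study's model): plug-in estimate
# of the mutual information between a discrete odor-concentration variable
# and the binary response of a Kenyon-cell sub-population.

def mutual_information(joint: np.ndarray) -> float:
    """I(X;Y) in bits from a joint probability table p(x, y)."""
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

# Rows: low/high odor concentration; columns: KC population silent/active.
# Hypothetical numbers chosen so responses track concentration imperfectly.
joint = np.array([[0.40, 0.10],
                  [0.15, 0.35]])
print(f"I(odor; KC) = {mutual_information(joint):.3f} bits")
```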
Mutual information against correlations in binary communication channels.
Pregowska, Agnieszka; Szczepanski, Janusz; Wajnryb, Eligiusz
2015-05-19
Explaining how brain processing can be so fast remains an open problem (van Hemmen JL, Sejnowski T., 2004). Thus, the analysis of neural transmission processes (Shannon CE, Weaver W., 1963) basically focuses on searching for effective encoding and decoding schemes. According to Shannon's fundamental theorem, mutual information plays a crucial role in characterizing the efficiency of communication channels. It is well known that this efficiency is determined by the channel capacity, which is the maximal mutual information between input and output signals. On the other hand, intuitively speaking, when input and output signals are more correlated, the transmission should be more efficient. A natural question arises about the relation between mutual information and correlation. We analyze the relation between these quantities using the binary representation of signals, which is the most common approach taken in studying neuronal processes of the brain. We present binary communication channels for which mutual information and correlation coefficients behave differently both quantitatively and qualitatively. Despite this difference in behavior, we show that the noncorrelation of binary signals implies their independence, in contrast to the case for general types of signals. Our research shows that mutual information cannot be replaced by sheer correlations. Our results indicate that neuronal encoding has a more complicated nature, which cannot be captured by straightforward correlations between input and output signals, since the mutual information takes into account the structure and patterns of the signals.
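The contrast drawn here can be reproduced numerically. The sketch below computes mutual information and the Pearson correlation coefficient for two hypothetical binary channels; the joint distributions are invented for illustration:

```python
import numpy as np

# Sketch illustrating the paper's point for binary channels: mutual
# information and the correlation coefficient are different summaries of
# the same joint distribution. Both example distributions are hypothetical.

def stats(joint):
    joint = np.asarray(joint, float); joint /= joint.sum()
    px, py = joint.sum(1), joint.sum(0)
    nz = joint > 0
    mi = (joint[nz] * np.log2(joint[nz] / np.outer(px, py)[nz])).sum()
    ex, ey = px[1], py[1]                 # means of the two {0,1} variables
    cov = joint[1, 1] - ex * ey
    rho = cov / np.sqrt(ex * (1 - ex) * ey * (1 - ey))
    return mi, rho

for name, j in [("symmetric, 10% error", [[0.45, 0.05], [0.05, 0.45]]),
                ("asymmetric channel",   [[0.48, 0.02], [0.20, 0.30]])]:
    mi, rho = stats(j)
    print(f"{name:22s} I = {mi:.3f} bits, rho = {rho:+.3f}")
# For binary signals, rho = 0 forces p(x,y) = p(x)p(y), i.e., independence
# (zero covariance makes every cell factor), but channels with equal rho
# need not carry equal mutual information.
```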
Pure sources and efficient detectors for optical quantum information processing
NASA Astrophysics Data System (ADS)
Zielnicki, Kevin
Over the last sixty years, classical information theory has revolutionized the understanding of the nature of information, and how it can be quantified and manipulated. Quantum information processing extends these lessons to quantum systems, where the properties of intrinsic uncertainty and entanglement fundamentally defy classical explanation. This growing field has many potential applications, including computing, cryptography, communication, and metrology. As inherently mobile quantum particles, photons are likely to play an important role in any mature large-scale quantum information processing system. However, the available methods for producing and detecting complex multi-photon states place practical limits on the feasibility of sophisticated optical quantum information processing experiments. In a typical quantum information protocol, a source first produces an interesting or useful quantum state (or set of states), perhaps involving superposition or entanglement. Then, some manipulations are performed on this state, perhaps involving quantum logic gates which further manipulate or entangle the initial state. Finally, the state must be detected, obtaining some desired measurement result, e.g., for secure communication or computationally efficient factoring. The work presented here concerns the first and last stages of this process as they relate to photons: sources and detectors. Our work on sources is based on the need for optimized non-classical states of light delivered at high rates, particularly of single photons in a pure quantum state. We seek to better understand the properties of spontaneous parametric downconversion (SPDC) sources of photon pairs, and in doing so, produce such an optimized source. We report an SPDC source which produces pure heralded single photons with little or no spectral filtering, allowing a significant rate enhancement. Our work on detectors is based on the need to reliably measure single-photon states. We have focused on optimizing the detection efficiency of visible light photon counters (VLPCs), a single-photon detection technology that is also capable of resolving photon number states. We report a record-breaking quantum efficiency of 91 +/- 3% observed with our detection system. Both sources and detectors are independently interesting physical systems worthy of study, but together they promise to enable entire new classes and applications of information based on quantum mechanics.
Wilson, Edward C F; Mugford, Miranda; Barton, Garry; Shepstone, Lee
2016-04-01
In designing economic evaluations alongside clinical trials, analysts are frequently faced with alternative methods of collecting the same data, the extremes being top-down ("gross costing") and bottom-up ("micro-costing") approaches. A priori, bottom-up approaches may be considered superior to top-down approaches but are also more expensive to collect and analyze. In this article, we use value-of-information analysis to estimate the efficient mix of observations on each method in a proposed clinical trial. By assigning a prior bivariate distribution to the 2 data collection processes, the predicted posterior (i.e., preposterior) mean and variance of the superior process can be calculated from proposed samples using either process. This is then used to calculate the preposterior mean and variance of incremental net benefit and hence the expected net gain of sampling. We apply this method to a previously collected data set to estimate the value of conducting a further trial and identifying the optimal mix of observations on drug costs at 2 levels: by individual item (process A) and by drug class (process B). We find that substituting a number of observations on process A for process B leads to a modest £35,000 increase in expected net gain of sampling. Drivers of the results are the correlation between the 2 processes and their relative cost. This method has potential use following a pilot study to inform efficient data collection approaches for a subsequent full-scale trial. It provides a formal quantitative approach to inform trialists whether it is efficient to collect resource use data on all patients in a trial or on a subset of patients only or to collect limited data on most and detailed data on a subset. © The Author(s) 2016.
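A minimal sketch of the underlying calculation: the expected net gain of sampling is the expected value of sample information minus data collection cost, estimated here by preposterior Monte Carlo simulation under a simple normal-normal model. The priors, variances, costs, and population size are invented, and the paper's bivariate model linking the two collection processes is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)

mu0, sd0 = 500.0, 400.0   # prior on incremental net benefit per patient (GBP), assumed
sd_obs = 2000.0           # per-patient sampling s.d. of observed INB, assumed
cost_per_obs = 40.0       # assumed marginal cost of one bottom-up observation
population = 20_000       # assumed number of future patients affected by the decision

def engs(n: int, draws: int = 200_000) -> float:
    theta = rng.normal(mu0, sd0, draws)               # possible true INB values
    xbar = rng.normal(theta, sd_obs / np.sqrt(n))     # simulated future sample means
    w = sd0**2 / (sd0**2 + sd_obs**2 / n)             # posterior weight on the data
    post_mean = mu0 + w * (xbar - mu0)
    # EVSI per patient: adopt iff the posterior mean net benefit is positive.
    evsi = np.mean(np.maximum(post_mean, 0.0)) - max(mu0, 0.0)
    return population * evsi - cost_per_obs * n

for n in (50, 200, 800):
    print(f"n = {n:4d}: ENGS = {engs(n):12.0f} GBP")
```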
Money, Chris
2018-01-24
The process for undertaking exposure assessments varies depending on the purpose. But for exposure assessments to be relevant and accurate, they rely on access to reliable information on key exposure determinants. Acquiring such information is seldom straightforward and can take significant time and resources. This article examines how the application of tiered and targeted approaches to information acquisition, within the context of European human health risk assessments, can lead to improvements not only in the efficiency and effectiveness of the process but also in the confidence of stakeholders in its outputs. The article explores how the benefits might be further improved through the coordination of such activities, as well as those areas that represent barriers to wider international harmonisation.
Inhibition of irrelevant information is not necessary to performance of expert chess players.
Postal, Virginie
2012-08-01
Some studies on expertise have demonstrated that the difference between novices and experts can be partly due to a lack of knowledge about which information is relevant in a given situation. This lack of knowledge seems to be associated with the selection of correct information and with inhibitory processes. However, while the efficiency of inhibitory processes can lead to better performance in the normal population, it seems that experts in chess do not base their performance on this process but rather on an automatic and parallel encoding of information. Two experiments investigated the processes involved in a check detection task. The congruence of the information was manipulated in a Stroop situation similar to Reingold, Charness, Schultetus, & Stampe (2001). The results showed that the experts did not benefit from cuing with a congruent cue and that they did not show any interference effect from the incongruent cue, contrary to less skilled chess players, who benefited from cuing (Exp. 1). An attentional priming procedure confirmed the automatic encoding of chess relations in the more skilled chess players by showing no advantage from the prime in this group (Exp. 2). Taken together, the results indicate that processing was serial for the less skilled chess players and automatic and parallel for the more expert chess players. The inhibition of irrelevant information does not seem necessary to process information rapidly and efficiently.
Using task analysis to improve the requirements elicitation in health information system.
Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa
2007-01-01
This paper describes the application of task analysis within the design process of a Web-based information system for managing clinical information in hemophilia care, in order to improve the requirements elicitation and, consequently, to validate the domain model obtained in a previous phase of the design process (system analysis). The use of task analysis in this case proved to be a practical and efficient way to improve the requirements engineering process by involving users in the design process.
Quality and efficiency successes leveraging IT and new processes.
Chaiken, Barry P; Christian, Charles E; Johnson, Liz
2007-01-01
Today, healthcare annually invests billions of dollars in information technology, including clinical systems, electronic medical records and interoperability platforms. While continued investment and parallel development of standards are critical to secure exponential benefits from clinical information technology, intelligent and creative redesign of processes through path innovation is necessary to deliver meaningful value. Reports from two organizations included in this report review the steps taken to reinvent clinical processes that best leverage information technology to deliver safer and more efficient care. Good Samaritan Hospital, Vincennes, Indiana, implemented electronic charting, point-of-care bar coding of medications prior to administration, and integrated clinical documentation for nursing, laboratory, radiology and pharmacy. Tenet Healthcare, during its implementation and deployment of multiple clinical systems across several hospitals, focused on planning that included team-based process redesign. In addition, Tenet constructed valuable and measurable metrics that link outcomes with its strategic goals.
Silicon photonics for neuromorphic information processing
NASA Astrophysics Data System (ADS)
Bienstman, Peter; Dambre, Joni; Katumba, Andrew; Freiberger, Matthias; Laporte, Floris; Lugnan, Alessio
2018-02-01
We present our latest results on silicon photonics neuromorphic information processing, based among other things on techniques like reservoir computing. We will discuss aspects like scalability, novel architectures for enhanced power efficiency, as well as all-optical readout. Additionally, we will touch upon new machine learning techniques to operate these integrated readouts. Finally, we will show how these systems can be used for high-speed low-power information processing for applications like the recognition of biological cells.
Improving the claims process with EDI.
Moynihan, J J
1993-01-01
Electronic data interchange (EDI) is redefining the healthcare claims process. The traditional managerial approach to claims processing emphasizes information flow within the patient accounting department and between patient accounting and other departments. EDI enlarges the scope of the claims process to include information exchange between providers and payers. Using EDI to improve both external and internal information exchange makes the claims process more efficient and less expensive. This article is excerpted from "The Healthcare Financial Manager's Guide to Healthcare EDI," by James J. Moynihan, published by the Healthcare Financial Management Association.
NASA Technical Reports Server (NTRS)
Carnahan, Richard S., Jr.; Corey, Stephen M.; Snow, John B.
1989-01-01
Applications of rapid prototyping and Artificial Intelligence techniques to problems associated with Space Station-era information management systems are described. In particular, the work is centered on issues related to: (1) intelligent man-machine interfaces applied to scientific data user support, and (2) the requirement that intelligent information management systems (IIMS) be able to efficiently process metadata updates concerning the types of data handled. The advanced IIMS represents functional capabilities driven almost entirely by the needs of potential users. The volume of scientific data projected to be generated in the Space Station era is likely to be significantly greater than that currently processed and analyzed. Information about scientific data must be presented clearly, concisely, and with support features to allow users at all levels of expertise efficient and cost-effective data access. Additionally, mechanisms for allowing more efficient IIMS metadata update processes must be addressed. The work reported covers the following IIMS design aspects: IIMS data and metadata modeling, including the automatic updating of IIMS-contained metadata; IIMS user-system interface considerations, including significant problems associated with remote access, user profiles, and on-line tutorial capabilities; and development of an IIMS query and browse facility, including the capability to deal with spatial information. A working prototype has been developed and is being enhanced.
Informed use of patients' records on trusted health care services.
Sahama, Tony; Miller, Evonne
2011-01-01
Health care is an information-intensive business. Sharing information in health care processes is a smart use of data, enabling informed decision-making while ensuring the privacy and security of patient information. To achieve this, we propose an Information Accountability Framework (IAF) with embedded data encryption techniques that establishes transitions of the technological concept, thus enabling an understanding of shared responsibility, accessibility, and efficient, cost-effective, informed decisions between health care professionals and patients. The IAF results reveal possibilities for efficient informed medical decision-making and the minimisation of medical errors. Achieving this will require significant cultural changes and research synergies to ensure the sustainability, acceptability and durability of the IAF.
An information theory account of cognitive control.
Fan, Jin
2014-01-01
Our ability to efficiently process information and generate appropriate responses depends on the processes collectively called cognitive control. Despite a considerable focus in the literature on the cognitive control of information processing, neural mechanisms underlying control are still unclear, and have not been characterized by considering the quantity of information to be processed. A novel and comprehensive account of cognitive control is proposed using concepts from information theory, which is concerned with communication system analysis and the quantification of information. This account treats the brain as an information-processing entity where cognitive control and its underlying brain networks play a pivotal role in dealing with conditions of uncertainty. This hypothesis and theory article justifies the validity and properties of such an account and relates experimental findings to the frontoparietal network under the framework of information theory.
Enhancements to Demilitarization Process Maps Program (ProMap)
2016-10-14
...map tool, ProMap, was improved by implementing new features and sharing data with the MIDAS and AMDIT databases. Specifically, process efficiency was...improved by 1) providing access to APE information contained in the AMDIT database directly from inside ProMap when constructing a process map, 2)...what equipment can be efficiently used to demil a particular munition. Associated with this task was the upgrade of the AMDIT database so that...
The Influence of Temporal Resolution Power and Working Memory Capacity on Psychometric Intelligence
ERIC Educational Resources Information Center
Troche, Stefan J.; Rammsayer, Thomas H.
2009-01-01
According to the temporal resolution power (TRP) hypothesis, higher TRP as reflected by better performance on psychophysical timing tasks accounts for faster speed of information processing and increased efficiency of information processing leading to better performance on tests of psychometric intelligence. An alternative explanation of…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-01
... Efficiency Program for Commercial and Industrial Equipment: Public Meeting and Availability of the Framework Document for Commercial and Industrial Fans and Blowers AGENCY: Office of Energy Efficiency and Renewable... and industrial fans and blowers. To inform interested parties and to facilitate this process, DOE has...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-01
... Efficiency Program for Commercial and Industrial Equipment: Public Meeting and Availability of the Framework Document for Commercial and Industrial Pumps AGENCY: Office of Energy Efficiency and Renewable Energy... industrial pumps. To inform interested parties and to facilitate this process, DOE has prepared a Framework...
Get Started: Energy Efficiency Makes More Sense Than Ever.
ERIC Educational Resources Information Center
Alban, Josh; Drabick, J. R.
2003-01-01
Describes the benefits of making school buildings more energy efficient. Provides examples of physical retrofits and behavioral changes to save energy costs. Describes a four-step process to create an energy efficiency plan. Includes resources and information such as the U.S. Department of Energy's Energy STAR program (www.energystar.gov). (PKP)
Design and implementation for integrated UAV multi-spectral inspection system
NASA Astrophysics Data System (ADS)
Zhu, X.; Li, X.; Yan, F.
2018-04-01
In order to improve the working efficiency of transmission line inspection and reduce the labour intensity of the inspectors, this paper presents an Unmanned Aerial Vehicle (UAV) inspection system architecture for transmission line inspection. A lightweight design for the different inspection equipment and processing terminals is completed. The paper presents a reference design for the information-processing terminal, supporting the connection of inspection and interactive equipment, and obtains all performance indicators of the inspection information processing through tests. Practical application shows that the UAV inspection system supports access and management of different types of mainstream fault detection equipment, and can independently diagnose the detected information to generate inspection reports in line with industry norms, meeting the fast, timely, and efficient requirements of power line inspection work.
Managing Knowledge And Information In The Sustainable Organization
NASA Astrophysics Data System (ADS)
Grecu, Valentin
2015-09-01
Knowledge and information management are essential for the success of organizations and bring significant competitive advantages. There have been significant investments in setting up technological platforms that support business processes and increase the efficiency of the operational structure in many organizations through efficient management of knowledge and information. This research highlights the importance of using knowledge and information management to increase the competitiveness of organizations and to foster the transition towards the sustainable organization, as nowadays an organization that wants to be competitive needs to be sustainable.
2009-09-01
...models are explored using the Knowledge Value Added (KVA) methodology, and the most efficient model is developed and validated by applying it to the current IA C&A process flow at the TSO-KC. ...requires only one available actor from its respective group, rather than all actors in the group, to...
Information processing in the primate visual system - An integrated systems perspective
NASA Technical Reports Server (NTRS)
Van Essen, David C.; Anderson, Charles H.; Felleman, Daniel J.
1992-01-01
The primate visual system contains dozens of distinct areas in the cerebral cortex and several major subcortical structures. These subdivisions are extensively interconnected in a distributed hierarchical network that contains several intertwined processing streams. A number of strategies are used for efficient information processing within this hierarchy. These include linear and nonlinear filtering, passage through information bottlenecks, and coordinated use of multiple types of information. In addition, dynamic regulation of information flow within and between visual areas may provide the computational flexibility needed for the visual system to perform a broad spectrum of tasks accurately and at high resolution.
Rebar, Amanda L.; Ram, Nilam; Conroy, David E.
2014-01-01
Objective The Single-Category Implicit Association Test (SC-IAT) has been used as a method for assessing automatic evaluations of physical activity, but measurement artifact or consciously-held attitudes could be confounding the outcome scores of these measures. The objective of these two studies was to address these measurement concerns by testing the validity of a novel SC-IAT scoring technique. Design Study 1 was a cross-sectional study, and study 2 was a prospective study. Method In study 1, undergraduate students (N = 104) completed SC-IATs for physical activity, flowers, and sedentary behavior. In study 2, undergraduate students (N = 91) completed a SC-IAT for physical activity, self-reported affective and instrumental attitudes toward physical activity, physical activity intentions, and wore an accelerometer for two weeks. The EZ-diffusion model was used to decompose the SC-IAT into three process component scores including the information processing efficiency score. Results In study 1, a series of structural equation model comparisons revealed that the information processing score did not share variability across distinct SC-IATs, suggesting it does not represent systematic measurement artifact. In study 2, the information processing efficiency score was shown to be unrelated to self-reported affective and instrumental attitudes toward physical activity, and positively related to physical activity behavior, above and beyond the traditional D-score of the SC-IAT. Conclusions The information processing efficiency score is a valid measure of automatic evaluations of physical activity. PMID:25484621
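The EZ-diffusion decomposition referred to in this abstract has a standard closed form (Wagenmakers et al., 2007) that maps accuracy and the mean and variance of response times onto a drift rate (the information processing efficiency parameter), boundary separation, and nondecision time. A sketch with hypothetical SC-IAT summary statistics; any additional corrections the studies applied are not shown:

```python
import math

# Sketch of the standard EZ-diffusion equations (Wagenmakers et al., 2007):
# recover drift rate v (information-processing efficiency), boundary
# separation a, and nondecision time Ter from accuracy and the mean and
# variance of response times. Example inputs below are hypothetical.

def ez_diffusion(pc: float, vrt: float, mrt: float, s: float = 0.1):
    """pc: proportion correct (0 < pc < 1, pc != 0.5);
    vrt, mrt: variance and mean of response times in seconds."""
    y = math.log(pc / (1.0 - pc))                       # logit of accuracy
    v = math.copysign(1.0, pc - 0.5) * s * (
        y * (pc**2 * y - pc * y + pc - 0.5) / vrt) ** 0.25
    a = s**2 * y / v                                    # boundary separation
    mdt = (a / (2.0 * v)) * (1.0 - math.exp(-v * a / s**2)) \
          / (1.0 + math.exp(-v * a / s**2))             # mean decision time
    return v, a, mrt - mdt

v, a, ter = ez_diffusion(pc=0.85, vrt=0.08, mrt=0.90)   # hypothetical data
print(f"v = {v:.4f}, a = {a:.4f}, Ter = {ter:.3f} s")
```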
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brush, Adrian; Masanet, Eric; Worrell, Ernst
The U.S. dairy processing industry—defined in this Energy Guide as facilities engaged in the conversion of raw milk to consumable dairy products—consumes around $1.5 billion worth of purchased fuels and electricity per year. Energy efficiency improvement is an important way to reduce these costs and to increase predictable earnings, especially in times of high energy price volatility. There are a variety of opportunities available at individual plants in the U.S. dairy processing industry to reduce energy consumption and greenhouse gas emissions in a cost-effective manner. This Energy Guide discusses energy efficiency practices and energy-efficient technologies that can be implemented at the component, process, facility, and organizational levels. A discussion of the trends, structure, and energy consumption characteristics of the U.S. dairy processing industry is provided along with a description of the major process technologies used within the industry. Next, a wide variety of energy efficiency measures applicable to dairy processing plants are described. Many measure descriptions include expected savings in energy and energy-related costs, based on case study data from real-world applications in dairy processing facilities and related industries worldwide. Typical measure payback periods and references to further information in the technical literature are also provided, when available. Given the importance of water in dairy processing, a summary of basic, proven measures for improving water efficiency is also provided. The information in this Energy Guide is intended to help energy and plant managers in the U.S. dairy processing industry reduce energy and water consumption in a cost-effective manner while maintaining the quality of products manufactured. Further research on the economics of all measures—as well as on their applicability to different production practices—is needed to assess their cost effectiveness at individual plants.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masanet, Eric; Masanet, Eric; Worrell, Ernst
2008-01-01
The U.S. fruit and vegetable processing industry--defined in this Energy Guide as facilities engaged in the canning, freezing, and drying or dehydrating of fruits and vegetables--consumes over $800 million worth of purchased fuels and electricity per year. Energy efficiency improvement is an important way to reduce these costs and to increase predictable earnings, especially in times of high energy price volatility. There are a variety of opportunities available at individual plants in the U.S. fruit and vegetable processing industry to reduce energy consumption in a cost-effective manner. This Energy Guide discusses energy efficiency practices and energy-efficient technologies that can be implemented at the component, process, facility, and organizational levels. A discussion of the trends, structure, and energy consumption characteristics of the U.S. fruit and vegetable processing industry is provided along with a description of the major process technologies used within the industry. Next, a wide variety of energy efficiency measures applicable to fruit and vegetable processing plants are described. Many measure descriptions include expected savings in energy and energy-related costs, based on case study data from real-world applications in fruit and vegetable processing facilities and related industries worldwide. Typical measure payback periods and references to further information in the technical literature are also provided, when available. Given the importance of water in fruit and vegetable processing, a summary of basic, proven measures for improving plant-level water efficiency is also provided. The information in this Energy Guide is intended to help energy and plant managers in the U.S. fruit and vegetable processing industry reduce energy and water consumption in a cost-effective manner while maintaining the quality of products manufactured. Further research on the economics of all measures--as well as on their applicability to different production practices--is needed to assess their cost effectiveness at individual plants.
Information theoretical assessment of digital imaging systems
NASA Technical Reports Server (NTRS)
John, Sarah; Rahman, Zia-Ur; Huck, Friedrich O.; Reichenbach, Stephen E.
1990-01-01
The end-to-end performance of image gathering, coding, and restoration as a whole is considered. This approach is based on the pivotal relationship that exists between the spectral information density of the transmitted signal and the restorability of images from this signal. The information-theoretical assessment accounts for (1) the information density and efficiency of the acquired signal as a function of the image-gathering system design and the radiance-field statistics, and (2) the improvement in information efficiency and data compression that can be gained by combining image gathering with coding to reduce the signal redundancy and irrelevancy. It is concluded that images can be restored with better quality and from fewer data as the information efficiency of the data is increased. The restoration correctly explains the image gathering and coding processes and effectively suppresses the image-display degradations.
Reasoning with case histories of process knowledge for efficient process development
NASA Technical Reports Server (NTRS)
Bharwani, Seraj S.; Walls, Joe T.; Jackson, Michael E.
1988-01-01
The significance of compiling case histories of empirical process knowledge and the role of such histories in improving the efficiency of manufacturing process development is discussed in this paper. Methods of representing important investigations as cases and using the information from such cases to eliminate redundancy of empirical investigations in analogous process development situations are also discussed. A system is proposed that uses such methods to capture the problem-solving framework of the application domain. A conceptual design of the system is presented and discussed.
Schneider, Thomas D
2010-10-01
The relationship between information and energy is key to understanding biological systems. We can display the information in DNA sequences specifically bound by proteins by using sequence logos, and we can measure the corresponding binding energy. These can be compared by noting that one of the forms of the second law of thermodynamics defines the minimum energy dissipation required to gain one bit of information. Under the isothermal conditions in which molecular machines function, this is kB T ln 2 joules per bit (kB is Boltzmann's constant and T is the absolute temperature). Then an efficiency of binding can be computed by dividing the information in a logo by the free energy of binding after it has been converted to bits. The isothermal efficiencies of not only genetic control systems, but also visual pigments are near 70%. From information and coding theory, the theoretical efficiency limit for bistate molecular machines is ln 2 = 0.6931. Evolutionary convergence to maximum efficiency is limited by the constraint that molecular states must be distinct from each other. The result indicates that natural molecular machines operate close to their information processing maximum (the channel capacity), and implies that nanotechnology can attain this goal.
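The efficiency computation the abstract describes can be sketched directly: convert the binding free energy into its equivalent in bits using the isothermal bound of kB T ln 2 joules per bit, then divide the sequence-logo information by that number. The example values below are hypothetical, not Schneider's measurements:

```python
import math

KB = 1.380649e-23    # Boltzmann's constant, J/K

def isothermal_efficiency(info_bits: float, delta_g_joules: float,
                          temperature_k: float = 298.0) -> float:
    """Sequence-logo information divided by the binding free energy
    expressed in bits via the bound kB * T * ln(2) joules per bit."""
    bits_equivalent = delta_g_joules / (KB * temperature_k * math.log(2.0))
    return info_bits / bits_equivalent

r_sequence = 19.0     # bits in the sequence logo (hypothetical)
delta_g = 8.0e-20     # binding free energy in joules (hypothetical)
eff = isothermal_efficiency(r_sequence, delta_g)
print(f"efficiency = {eff:.2f} "
      f"(theoretical bistate limit: ln 2 = {math.log(2):.4f})")
# With these invented numbers the efficiency comes out near 0.68,
# consistent with the ~70% figure quoted for real systems.
```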
Establishment of Textbook Information Management System Based on Active Server Page
ERIC Educational Resources Information Center
Geng, Lihua
2011-01-01
In the process of textbook management at universities, the flow of storing, collecting, and checking textbooks is quite complicated, and the daily management flow and system also seriously constrain the efficiency of the management process. Thus, in order to combine the information management model with the traditional management model, it is necessary…
Changes in Information Processing with Aging: Implications for Teaching Motor Skills.
ERIC Educational Resources Information Center
Anshel, Mark H.
Although there are marked individual differences in the effect of aging on learning and performing motor skills, there is agreement that humans process information less efficiently with advanced age. Significant decrements have been found specifically with motor tasks that are characterized as externally-paced, rapid, complex, and requiring rapid…
NASA Astrophysics Data System (ADS)
Larger, Laurent; Baylón-Fuentes, Antonio; Martinenghi, Romain; Udaltsov, Vladimir S.; Chembo, Yanne K.; Jacquot, Maxime
2017-01-01
Reservoir computing, originally referred to as an echo state network or a liquid state machine, is a brain-inspired paradigm for processing temporal information. It involves learning a "read-out" interpretation for the nonlinear transients developed by a high-dimensional dynamical system when the latter is excited by the information signal to be processed. This novel computational paradigm is derived from recurrent neural network and machine learning techniques. It has recently been implemented in photonic hardware for a dynamical system, which opens the path to ultrafast brain-inspired computing. We report on a novel implementation involving an electro-optic phase-delay dynamics designed with off-the-shelf optoelectronic telecom devices, thus providing the targeted wide bandwidth. Computational efficiency is demonstrated experimentally with speech-recognition tasks. State-of-the-art speed performances reach one million words per second, with a very low word error rate. In addition to record-speed processing, our investigations have revealed computing-efficiency improvements through yet-unexplored temporal-information-processing techniques, such as simultaneous multisample injection and pitched sampling at the read-out compared to the information "write-in".
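The reservoir principle itself fits in a few lines: drive a fixed, random, nonlinear dynamical system with the input signal and train only a linear read-out. The sketch below is a conventional software echo state network illustrating that principle on a toy temporal task; it does not model the electro-optic phase-delay hardware:

```python
import numpy as np

# Minimal echo-state-network sketch of the reservoir computing principle:
# only the linear read-out of a fixed random dynamical system is trained
# (here by ridge regression). Sizes and the toy task are assumptions.
rng = np.random.default_rng(1)

N, T = 200, 2000
u = rng.uniform(-1, 1, T)              # input signal
target = np.sin(3 * np.roll(u, 1))     # toy task: depends on the past input

W_in = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius below 1

x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):                     # nonlinear transient dynamics
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

ridge = 1e-6                           # read-out trained by ridge regression
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(N),
                        states.T @ target)
pred = states @ W_out
print("train NMSE:", np.mean((pred - target) ** 2) / np.var(target))
```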
Including Both Time and Accuracy in Defining Text Search Efficiency.
ERIC Educational Resources Information Center
Symons, Sonya; Specht, Jacqueline A.
1994-01-01
Examines factors related to efficiency in a textbook search task. Finds that time and accuracy involved distinct processes and that accuracy was related to verbal competence. Finds further that measures of planning and extracting information accounted for 59% of the variance in search efficiency. Suggests that both accuracy and rate need to be…
Evaluating the process parameters of the dry coating process using a 2^(5-1) factorial design.
Kablitz, Caroline Désirée; Urbanetz, Nora Anne
2013-02-01
A recent development in coating technology is dry coating, where polymer powder and liquid plasticizer are layered on the cores without using organic solvents or water. Several studies evaluating the process have been reported in the literature; however, little information is given about the critical process parameters (CPPs). The aim of the study was the investigation and optimization of CPPs with respect to one of the critical quality attributes (CQAs), the coating efficiency of the dry coating process in a rotary fluid bed. Theophylline pellets were coated with hydroxypropyl methylcellulose acetate succinate as enteric film former and triethyl citrate and acetylated monoglyceride as plasticizers. A 2^(5-1) design of experiments (DoE) was created investigating five independent process parameters, namely coating temperature, curing temperature, feeding/spraying rate, air flow and rotor speed. The results were evaluated by multilinear regression using the software Modde(®) 7. It is shown that, generally, low feeding/spraying rates and low rotor speeds increase coating efficiency. High coating temperatures enhance coating efficiency, whereas medium curing temperatures have been found to be optimal in terms of coating efficiency. This study provides a scientific basis for the design of efficient dry coating processes with respect to coating efficiency.
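For readers unfamiliar with the notation, a 2^(5-1) design studies five two-level factors in half the 32 runs of the full factorial by aliasing one factor with a high-order interaction. The sketch below builds such a 16-run design in coded units using the standard resolution-V generator E = ABCD; the generator choice is an assumption, since the paper's exact design generator is not stated here:

```python
import itertools
import numpy as np

# Sketch of a 16-run 2^(5-1) fractional factorial in coded units (-1/+1).
# The five factors are those of the study; the generator E = ABCD is the
# conventional resolution-V choice, assumed rather than taken from the paper.

factors = ["coating temperature", "curing temperature",
           "feeding/spraying rate", "air flow", "rotor speed"]

runs = []
for a, b, c, d in itertools.product((-1, 1), repeat=4):
    runs.append((a, b, c, d, a * b * c * d))   # fifth factor aliased with ABCD
X = np.array(runs)

print(f"{len(X)} runs instead of 2**5 = 32 for the full factorial")
for row in X[:4]:
    print(dict(zip(factors, row)))
# Effects on the coating efficiency are then estimated by multilinear
# regression on these coded columns, as the study did with Modde 7.
```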
Exploring the Notion of Context in Medical Data.
Mylonas, Phivos
2017-01-01
Scientific and technological knowledge and skills are becoming crucial for most data analysis activities. Two rather distinct, but at the same time collaborating, domains are those of computer science and medicine; the former offers significant aid towards a more efficient understanding of the latter's research trends. Still, the process of meaningfully analyzing and understanding medical information and data is a tedious one, bound to several challenges. One of them is the efficient utilization of contextual information in the process, leading to optimized, context-aware data analysis results. Nowadays, researchers are provided with tools and opportunities to analytically study medical data, but at the same time significant and rather complex computational challenges are yet to be tackled, among other reasons due to the humanistic nature of the data and the increased rate of new content and information production imposed by related hardware and applications. So, the ultimate goal of this position paper is to provide interested parties with an overview of the major types of contextual information to be identified within the medical data processing framework.
Workflow computing. Improving management and efficiency of pathology diagnostic services.
Buffone, G J; Moreau, D; Beck, J R
1996-04-01
Traditionally, information technology in health care has helped practitioners to collect, store, and present information and also to add a degree of automation to simple tasks (instrument interfaces supporting result entry, for example). Thus commercially available information systems do little to support the need to model, execute, monitor, coordinate, and revise the various complex clinical processes required to support health-care delivery. Workflow computing, which is already implemented and improving the efficiency of operations in several nonmedical industries, can address the need to manage complex clinical processes. Workflow computing not only provides a means to define and manage the events, roles, and information integral to health-care delivery but also supports the explicit implementation of policy or rules appropriate to the process. This article explains how workflow computing may be applied to health-care and the inherent advantages of the technology, and it defines workflow system requirements for use in health-care delivery with special reference to diagnostic pathology.
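A minimal sketch of what such an explicit process definition looks like: steps bound to roles, with routing rules evaluated by a small engine that executes and monitors the flow. All step and role names are hypothetical rather than drawn from any particular workflow product:

```python
from dataclasses import dataclass, field

# Minimal sketch of the workflow-computing idea: the process is an explicit,
# executable definition of steps, responsible roles, and routing rules,
# rather than logic buried in application code. All names are hypothetical.

@dataclass
class Step:
    name: str
    role: str                                        # who may perform it
    next_steps: dict = field(default_factory=dict)   # rule -> next step name

workflow = {
    "order_test": Step("order_test", "clinician", {"always": "collect_specimen"}),
    "collect_specimen": Step("collect_specimen", "phlebotomist", {"always": "analyze"}),
    "analyze": Step("analyze", "technologist",
                    {"abnormal": "pathologist_review", "normal": "report"}),
    "pathologist_review": Step("pathologist_review", "pathologist", {"always": "report"}),
    "report": Step("report", "system", {}),
}

def run(rule_for_case):            # the engine: executes, monitors, routes
    step = workflow["order_test"]
    while step.next_steps:
        print(f"{step.role:>13}: {step.name}")
        rule = rule_for_case(step.name)
        step = workflow[step.next_steps.get(rule, step.next_steps.get("always"))]
    print(f"{step.role:>13}: {step.name}")

run(lambda s: "abnormal" if s == "analyze" else "always")
```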
Project Method in Preparation of Future Preschool Teachers
ERIC Educational Resources Information Center
Anisimova, Ellina; Ibatullin, Rinat
2018-01-01
This article covers the issue of the formation of the information competence of future preschool teachers. The efficiency of using information technologies in the educational process depends on the level of information competence of the teacher. A modern teacher has to use information technologies reasonably, in ways that contribute to enriching the development of cognitive…
Indicators and Metrics for Evaluating the Sustainability of Chemical Processes
A metric-based method, called GREENSCOPE, has been developed for evaluating process sustainability. Using lab-scale information and engineering assumptions, the method evaluates full-scale representations of processes in the environmental, efficiency, energy and economic areas. The m...
Hartley, Douglas E H; Hill, Penny R; Moore, David R
2003-12-01
Claims have been made that language-impaired children have deficits processing rapidly presented or brief sensory information. These claims, known as the 'temporal processing hypothesis', are supported by demonstrations that language-impaired children have excess backward masking (BM). One explanation for these results is that BM is developmentally delayed in these children. However, little was known about how BM normally develops. Recently, we assessed BM in normally developing 6- and 8-year-old children and adults. Results showed that BM thresholds continue to improve over a comparatively protracted period (>10 years old). We also analysed reported deficits in BM in language-impaired and younger children, in terms of a model of temporal resolution. This analysis suggests that poor processing efficiency, rather than deficits in temporal resolution, can account for these results. This 'processing efficiency hypothesis' was recently tested in our laboratory. This experiment measured BM as a function of delays between the tone and the noise in children and adults. Results supported the processing efficiency hypothesis, and suggested that reduced processing efficiency alone could account for differences between adults and children. These findings provide a new perspective on the mechanisms underlying communication disorders, and imply that remediation strategies should be directed towards improving processing efficiency, not temporal resolution.
A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems
NASA Astrophysics Data System (ADS)
Li, Yu; Oberweis, Andreas
Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
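The Petri net mechanics underlying the methodology are compact: places hold tokens, and a transition fires by consuming tokens from its input places and producing tokens on its output places. A minimal sketch with hypothetical business-process names (XML nets additionally carry structured XML documents as tokens, which this simplification omits):

```python
# Minimal place/transition Petri net sketch: the firing rule that XML nets
# build on. Place and transition names are hypothetical examples.

marking = {"order received": 1, "stock checked": 0, "invoice sent": 0}

transitions = {
    "check stock":   ({"order received": 1}, {"stock checked": 1}),
    "bill customer": ({"stock checked": 1}, {"invoice sent": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= n for p, n in pre.items())

def fire(name):
    assert enabled(name), f"transition '{name}' is not enabled"
    pre, post = transitions[name]
    for p, n in pre.items():
        marking[p] -= n        # consume tokens from input places
    for p, n in post.items():
        marking[p] += n        # produce tokens on output places

fire("check stock")
fire("bill customer")
print(marking)   # {'order received': 0, 'stock checked': 0, 'invoice sent': 1}
```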
Medverd, Jonathan R; Cross, Nathan M; Font, Frank; Casertano, Andrew
2013-08-01
Radiologists routinely make decisions with only limited information when assigning protocol instructions for the performance of advanced medical imaging examinations. Opportunity exists to simultaneously improve the safety, quality and efficiency of this workflow through the application of an electronic solution leveraging health system resources to provide concise, tailored information and decision support in real-time. Such a system has been developed using an open source, open standards design for use within the Veterans Health Administration. The Radiology Protocol Tool Recorder (RAPTOR) project identified key process attributes as well as inherent weaknesses of paper processes and electronic emulators of paper processes to guide the development of its optimized electronic solution. The design provides a kernel that can be expanded to create an integrated radiology environment. RAPTOR has implications relevant to the greater health care community, and serves as a case model for modernization of legacy government health information systems.
Minimized state complexity of quantum-encoded cryptic processes
NASA Astrophysics Data System (ADS)
Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.
2016-05-01
The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.
Transfer-Efficient Face Routing Using the Planar Graphs of Neighbors in High Density WSNs
Kim, Sang-Ha
2017-01-01
Face routing has been adopted in wireless sensor networks (WSNs) where topological changes occur frequently or maintaining full network information is difficult. For message forwarding, a planar graph is used to prevent looping; however, because planarization removes long edges, the resulting planar graph is composed of short edges, and messages are forwarded along multiple intermediate nodes even when they could be forwarded directly. To solve this, face routing using information on all nodes within 2-hop range was adopted to forward messages directly to the farthest node within radio range. However, as node density increases, network performance plunges because message-transfer nodes must receive and process more node information. To deal with this problem, we propose a new face routing that uses the planar graphs of neighboring nodes to improve transfer efficiency. It forwards a message directly to the farthest neighbor and reduces loads and processing time by distributing network graph construction and planarization to the neighbors. It also decreases the amount of location information to be transmitted by sending information on the planar graph nodes rather than on all neighboring nodes. Simulation results show that it significantly improves transfer efficiency. PMID:29053623
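The Gabriel graph is one of the localized planarizations commonly used in face routing; the sketch below shows the standard Gabriel test as a centralized illustration (the paper's contribution of distributing planarization to neighboring nodes is not reproduced here, and the coordinates are invented). An edge (u, v) survives only if no third node lies inside the circle whose diameter is the segment uv.

import itertools

def gabriel_edges(nodes):
    """nodes: dict id -> (x, y). Keep edge (u, v) only if no witness w
    lies strictly inside the circle whose diameter is the segment uv."""
    def d2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    edges = []
    for u, v in itertools.combinations(nodes, 2):
        if all(d2(nodes[u], nodes[w]) + d2(nodes[v], nodes[w]) >= d2(nodes[u], nodes[v])
               for w in nodes if w not in (u, v)):
            edges.append((u, v))
    return edges

# Invented coordinates: node c sits between a and b, so edge a-b is dropped
pts = {"a": (0, 0), "b": (2, 0), "c": (1, 0.2), "d": (0, 2)}
print(gabriel_edges(pts))

Removing the long a-b edge is exactly the behavior the paper describes: traffic then hops through c, unless, as proposed, the farthest reachable neighbor can be targeted directly.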
Motivational influences on controlled processing: moderating distractibility in older adults.
Germain, Cassandra M; Hess, Thomas M
2007-09-01
Research has suggested that aging is associated with a decline in the efficiency of controlled processing operations. Three studies examined the moderating impact of personal relevance on age differences in one index of such operations: the ability to ignore distracting information. Young (17-26) and older (58-86) adults read a series of passages interspersed with irrelevant, distracting information, with the relevance of the passage content to these two age groups being systematically varied. For both groups, processing was more efficient and comprehension enhanced when passage relevance was high. These effects were particularly strong among older adults, a finding consistent with a growing body of data highlighting the importance of motivational factors in determining age differences in cognitive performance.
The source of dual-task limitations: Serial or parallel processing of multiple response selections?
Marois, René
2014-01-01
Although it is generally recognized that the concurrent performance of two tasks incurs costs, the sources of these dual-task costs remain controversial. The serial bottleneck model suggests that serial postponement of task performance in dual-task conditions results from a central stage of response selection that can only process one task at a time. Cognitive-control models, by contrast, propose that multiple response selections can proceed in parallel, but that serial processing of task performance is predominantly adopted because its processing efficiency is higher than that of parallel processing. In the present study, we empirically tested this proposition by examining whether parallel processing would occur when it was more efficient and financially rewarded. The results indicated that even when parallel processing was more efficient and was incentivized by financial reward, participants still failed to process tasks in parallel. We conclude that central information processing is limited by a serial bottleneck. PMID:23864266
Shuttle Ground Operations Efficiencies/Technologies (SGOE/T) study. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Scholz, A. L.; Hart, M. T.; Lowry, D. J.
1987-01-01
Methods and technology were defined to reduce the overall operations cost of a major space program. Space Shuttle processing at Kennedy Space Center (KSC) was used as the working model and the source of the operational information. Methods of improving the efficiency of ground operations were assessed, and technology elements that could reduce cost were identified. Emphasis is on: (1) specific technology items and (2) management approaches required to develop and support efficient ground operations. The prime study results are recommendations on how to achieve more efficient operations and identification of existing or new technology that would make vehicle processing, in both the current program and future programs, more efficient and therefore less costly.
Classification of cognitive systems dedicated to data sharing
NASA Astrophysics Data System (ADS)
Ogiela, Lidia; Ogiela, Marek R.
2017-08-01
This paper presents a classification of new cognitive information systems dedicated to cryptographic data splitting and sharing processes. Cognitive processes of semantic data analysis and interpretation are used to describe new classes of intelligent information and vision systems. In addition, cryptographic data splitting algorithms and cryptographic threshold schemes are used to improve the secure and efficient management of information with such cognitive systems. The utility of the proposed cognitive sharing procedures and distributed data sharing algorithms is also presented, and a few possible applications of cognitive approaches to visual information management and encryption are described.
ERIC Educational Resources Information Center
Abdous, M'hammed; He, Wu
2009-01-01
During the past three years, we have developed and implemented an enterprise information system (EIS) to reengineer and facilitate the administrative process for preparing and teaching distance learning courses in a midsized-to-large university (with 23,000 students). The outcome of the implementation has been a streamlined and efficient process…
76 FR 37344 - Technology Evaluation Process
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-27
...-NOA-0039] Technology Evaluation Process AGENCY: Office of Energy Efficiency and Renewable Energy... is an extension of a prior RFI seeking comment on a proposed commercial buildings technology... seeks comments and information related to a commercial buildings technology evaluation process. DOE is...
76 FR 30696 - Technology Evaluation Process
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-26
...-NOA-0039] Technology Evaluation Process AGENCY: Office of Energy Efficiency and Renewable Energy... (DOE) seeks comments and information related to a commercial buildings technology evaluation process... technologies for commercial buildings based on the voluntary submittal of product test data. The program would...
Li, Shasha; Zhai, Guofang; Zhou, Shutian; Fan, Chenjing; Wu, Yunqing; Ren, Chongqiang
2017-01-01
Efficient risk communication is a vital way to reduce the vulnerability of individuals when facing emergency risks, especially regarding earthquakes. Efficient risk communication aims at improving the supply of risk information and fulfilling the need for risk information by individuals. Therefore, an investigation into individual-level information seeking behavior within earthquake risk contexts is very important for improved earthquake risk communication. However, at present there are very few studies that have explored the behavior of individuals seeking earthquake risk information. Under the guidance of the Risk Information Seeking and Processing model as well as relevant practical findings using the structural equation model, this study attempts to explore the main determinants of an individual’s earthquake risk information seeking behavior, and to validate the mediator effect of information need during the seeking process. A questionnaire-based survey of 918 valid respondents in Songyuan, China, who had been hit by a small earthquake swarm, was used to provide practical evidence for this study. Results indicated that information need played a noteworthy role in the earthquake risk information seeking process, and was detected both as an immediate predictor and as a mediator. Informational subjective norms drive the seeking behavior on earthquake risk information through both direct and indirect approaches. Perceived information gathering capacity, negative affective responses and risk perception have an indirect effect on earthquake risk information seeking behavior via information need. The implications for theory and practice regarding risk communication are discussed and concluded. PMID:28272359
Jenkins, Clinton N.; Flocks, J.; Kulp, M.; ,
2006-01-01
Information-processing methods are described that integrate the stratigraphic aspects of large and diverse collections of sea-floor sample data. They efficiently convert common types of sea-floor data into database and GIS (geographical information system) tables, visual core logs, stratigraphic fence diagrams and sophisticated stratigraphic statistics. The input data are held in structured documents, essentially written core logs that are particularly efficient to create from raw input datasets. Techniques are described that permit efficient construction of regional databases consisting of hundreds of cores. The sedimentological observations in each core are located by their downhole depths (metres below sea floor - mbsf) and also by a verbal term that describes the sample 'situation' - a special fraction of the sediment or position in the core. The main processing creates a separate output event for each instance of top, bottom and situation, assigning top-base mbsf values from numeric or, where possible, from word-based relative locational information such as 'core catcher' in reference to the sampler device, and recovery or penetration length. The processing outputs represent the sub-bottom as a sparse matrix of over 20 sediment properties of interest, such as grain size, porosity and colour. They can be plotted in a range of core-log programs, including an in-built facility better suited to the requirements of sea-floor data. Finally, a suite of stratigraphic statistics is computed, including volumetric grades, overburdens, thicknesses and degrees of layering. © The Geological Society of London 2006.
An Extreme Learning Machine-Based Neuromorphic Tactile Sensing System for Texture Recognition.
Rasouli, Mahdi; Chen, Yi; Basu, Arindam; Kukreja, Sunil L; Thakor, Nitish V
2018-04-01
Despite significant advances in computational algorithms and the development of tactile sensors, artificial tactile sensing is strikingly less efficient and capable than human tactile perception. Inspired by the efficiency of biological systems, we aim to develop a neuromorphic system for tactile pattern recognition. We particularly target texture recognition, as it is one of the most necessary and challenging tasks for artificial sensory systems. Our system consists of a piezoresistive fabric material as the sensor to emulate skin, an interface that produces spike patterns to mimic neural signals from mechanoreceptors, and an extreme learning machine (ELM) chip to analyze spiking activity. Benefiting from the intrinsic advantages of biologically inspired event-driven systems and the massively parallel and energy-efficient processing capabilities of the ELM chip, the proposed architecture offers a fast and energy-efficient alternative for processing tactile information. Moreover, it provides the opportunity for the development of low-cost tactile modules for large-area applications by integration of sensors and processing circuits. We demonstrate the recognition capability of our system in a texture discrimination task, where it achieves a classification accuracy of 92% for categorization of ten graded textures. Our results confirm that there exists a tradeoff between response time and classification accuracy (and information transfer rate). A faster decision can be achieved at early time steps or by using a shorter time window. This, however, results in deterioration of the classification accuracy and information transfer rate. We further observe that there exists a tradeoff between the classification accuracy and the input spike rate (and thus energy consumption). Our work substantiates the importance of developing efficient sparse codes for encoding sensory data to improve energy efficiency. These results have significance for a wide range of wearable, robotic, prosthetic, and industrial applications.
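For orientation, an extreme learning machine reduces training to a single least-squares solve after a fixed random hidden projection, which is what makes it attractive for low-power hardware. The sketch below is a toy numpy version with random stand-in features and invented dimensions, not the authors' spiking front end or ELM chip.

import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, Y, n_hidden=100):
    """Extreme learning machine: random hidden layer, closed-form readout."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # fixed random input weights
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # hidden activations
    beta, *_ = np.linalg.lstsq(H, Y, rcond=None)  # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy stand-in for spike-count features from ten textures (not real sensor data)
X = rng.normal(size=(200, 16))
labels = rng.integers(0, 10, size=200)
Y = np.eye(10)[labels]                            # one-hot targets
W, b, beta = elm_train(X, Y)
acc = np.mean(elm_predict(X, W, b, beta).argmax(1) == labels)
print(f"training accuracy: {acc:.2f}")

Only beta is learned; the random projection never changes, so training cost is one matrix factorization rather than iterative gradient descent.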
ERIC Educational Resources Information Center
Russell, Thyra K.
Morris Library at Southern Illinois University computerized its technical processes using the Library Computer System (LCS), which was implemented in the library to streamline order processing by: (1) providing up-to-date online files to track in-process items; (2) encouraging quick, efficient accessing of information; (3) reducing manual files;…
Information models of software productivity - Limits on productivity growth
NASA Technical Reports Server (NTRS)
Tausworthe, Robert C.
1992-01-01
Research into generalized information-metric models of software process productivity establishes quantifiable behavior and theoretical bounds. The models establish a fundamental mathematical relationship between software productivity and the human capacity for information traffic, the software product yield (system size), information efficiency, and tool and process efficiencies. An upper bound is derived that quantifies average software productivity and the maximum rate at which it may grow. This bound reveals that ultimately, when tools, methodologies, and automated assistants have reached their maximum effective state, further improvement in productivity can only be achieved through increasing software reuse. The reuse advantage is shown not to increase faster than logarithmically in the number of reusable features available. The reuse bound is further shown to be somewhat dependent on the reuse policy: a general 'reuse everything' policy can lead to a somewhat slower productivity growth than a specialized reuse policy.
Rismanchian, Farhood; Lee, Young Hoon
2017-07-01
This article proposes an approach to help designers analyze complex care processes and identify the optimal layout of an emergency department (ED) considering several objectives simultaneously. These objectives include minimizing the distances traveled by patients, maximizing design preferences, and minimizing relocation costs. Rising demand for healthcare services leads to increasing demand for new hospital buildings as well as for renovating existing ones. Operations management techniques have been successfully applied in both manufacturing and service industries to design more efficient layouts. However, the high complexity of healthcare processes makes it challenging to apply these techniques in healthcare environments. Process mining techniques were applied to address the problem of complexity and to enhance healthcare process analysis. Process-related information, such as information about the clinical pathways, was extracted from the information system of an ED. A goal programming approach was then employed to find a single layout that would simultaneously satisfy several objectives. The layout identified using the proposed method improved the distances traveled by noncritical and critical patients by 42.2% and 47.6%, respectively, and minimized the relocation costs. This study has shown that an efficient placement of the clinical units yields remarkable improvements in the distances traveled by patients.
Assessment of visual communication by information theory
NASA Astrophysics Data System (ADS)
Huck, Friedrich O.; Fales, Carl L.
1994-01-01
This assessment of visual communication integrates the optical design of the image-gathering device with the digital processing for image coding and restoration. Results show that informationally optimized image gathering ordinarily can be relied upon to maximize the information efficiency of decorrelated data and the visual quality of optimally restored images.
5 CFR 1320.9 - Agency certifications for proposed collections of information.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Agency certifications for proposed collections of information. 1320.9 Section 1320.9 Administrative Personnel OFFICE OF MANAGEMENT AND BUDGET OMB... efficient and effective management and use of the information to be collected, including the processing of...
Spike processing with a graphene excitable laser
Shastri, Bhavin J.; Nahmias, Mitchell A.; Tait, Alexander N.; Rodriguez, Alejandro W.; Wu, Ben; Prucnal, Paul R.
2016-01-01
Novel materials and devices in photonics have the potential to revolutionize optical information processing, beyond conventional binary-logic approaches. Laser systems offer a rich repertoire of useful dynamical behaviors, including the excitable dynamics also found in the time-resolved “spiking” of neurons. Spiking reconciles the expressiveness and efficiency of analog processing with the robustness and scalability of digital processing. We demonstrate a unified platform for spike processing with a graphene-coupled laser system. We show that this platform can simultaneously exhibit logic-level restoration, cascadability and input-output isolation—fundamental challenges in optical information processing. We also implement low-level spike-processing tasks that are critical for higher level processing: temporal pattern detection and stable recurrent memory. We study these properties in the context of a fiber laser system and also propose and simulate an analogous integrated device. The addition of graphene leads to a number of advantages which stem from its unique properties, including high absorption and fast carrier relaxation. These could lead to significant speed and efficiency improvements in unconventional laser processing devices, and ongoing research on graphene microfabrication promises compatibility with integrated laser platforms. PMID:26753897
Faghihi, Faramarz; Moustafa, Ahmed A
2016-09-01
The separation of input patterns received from the entorhinal cortex (EC) by the dentate gyrus (DG) is a well-known critical step of information processing in the hippocampus. Although the role of interneurons in the pattern separation efficiency of the DG is theoretically known, the balance of neurogenesis of excitatory neurons and interneurons, as well as its potential role in information processing in the DG, is not fully understood. In this work, we study the separation efficiency of the DG for different rates of neurogenesis of interneurons and excitatory neurons using a novel computational model in which we assume an increase in the synaptic efficacy between excitatory neurons and interneurons and then its decay over time. Information processing in the EC and DG was simulated as information flow in a two-layer feed-forward neural network. The neurogenesis rate was modeled as the percentage of newborn neurons added to the neuronal population in each time bin. The results show an important role of an optimal neurogenesis rate of interneurons and excitatory neurons in the DG in the efficient separation of inputs from the EC in pattern separation tasks. The model predicts that any deviation from the optimal values of the neurogenesis rates leads to different levels of separation deficit in the DG, which influences its ability to encode memory.
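A minimal sketch of the modeling idea, under loose assumptions of our own (random sparse EC-to-DG connectivity, global inhibition standing in for interneurons, winner-take-all sparsity; the neurogenesis and synaptic-decay dynamics of the actual model are omitted): pattern separation shows up as output overlap falling below input overlap.

import numpy as np

rng = np.random.default_rng(1)

def dg_response(ec_pattern, W, inhibition=0.7, k=50):
    """Feed-forward EC->DG projection with global inhibition:
    keep only the k most active granule cells (winner-take-all)."""
    drive = W @ ec_pattern
    drive -= inhibition * drive.mean()          # interneuron-like global inhibition
    out = np.zeros_like(drive)
    out[np.argsort(drive)[-k:]] = 1.0           # sparse winners
    return out

def overlap(a, b):
    return (a @ b) / np.sqrt((a @ a) * (b @ b))

# Illustrative sizes and sparsity, not the paper's parameters
n_ec, n_dg = 200, 1000
W = (rng.random((n_dg, n_ec)) < 0.1).astype(float)   # sparse random connectivity
x = (rng.random(n_ec) < 0.2).astype(float)
y = x.copy()
flip = rng.choice(n_ec, 20, replace=False)           # a correlated variant of x
y[flip] = 1 - y[flip]

print(f"input overlap:  {overlap(x, y):.2f}")
print(f"output overlap: {overlap(dg_response(x, W), dg_response(y, W)):.2f}")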
Designing quantum information processing via structural physical approximation.
Bae, Joonwoo
2017-10-01
In quantum information processing it may be possible to have efficient computation and secure communication beyond the limitations of classical systems. From a fundamental point of view, however, the evolution of quantum systems by the laws of quantum mechanics is more restrictive than that of classical systems, being identified with a specific form of dynamics, namely unitary transformations and, consequently, positive and completely positive maps on subsystems. This also characterizes classes of disallowed transformations on quantum systems, among which positive but not completely positive maps are of particular interest, as they characterize entangled states, a general resource in quantum information processing. Structural physical approximation offers a systematic way of approximating those non-physical maps, positive but not completely positive maps, with quantum channels. Since it was proposed as a method of detecting entangled states, it has stimulated fundamental problems on the classification of positive maps and the structure of Hermitian operators and quantum states, as well as on quantum measurement, such as quantum design in quantum information theory. It has led to efficient and feasible methods of directly detecting entangled states in practice, for which proof-of-principle experimental demonstrations have been performed with photonic qubit states. Here, we present a comprehensive review of quantum information processing with structural physical approximations and the related progress. The review mainly focuses on the properties of structural physical approximations and their applications toward practical information processing.
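A concrete instance of the construction, for the best-known case: the structural physical approximation of the qubit transpose map mixes it with the fully depolarizing map until its Choi matrix becomes positive semidefinite. The sketch below finds the mixing parameter numerically; for the transpose on a qubit this should reproduce the analytic value d/(d+1) = 2/3. This is a textbook example, not the review's full machinery.

import numpy as np

d = 2

def choi(map_on_matrix):
    """Choi matrix J = (id (x) L)|phi+><phi+| for a map L on d x d matrices."""
    J = np.zeros((d * d, d * d), dtype=complex)
    for i in range(d):
        for j in range(d):
            E = np.zeros((d, d), dtype=complex)
            E[i, j] = 1.0
            J += np.kron(E, map_on_matrix(E)) / d
    return J

J_T = choi(lambda X: X.T)                          # transpose: positive, not CP
J_D = choi(lambda X: np.trace(X) * np.eye(d) / d)  # fully depolarizing: CP

# SPA: smallest p for which (1-p)*T + p*D has a positive semidefinite Choi matrix
for p in np.linspace(0, 1, 1001):
    if np.linalg.eigvalsh((1 - p) * J_T + p * J_D).min() >= -1e-12:
        print(f"SPA mixing parameter p* ~ {p:.3f}")   # ~0.667 for the qubit transpose
        break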
A Comparative Analysis of Extract, Transformation and Loading (ETL) Process
NASA Astrophysics Data System (ADS)
Runtuwene, J. P. A.; Tangkawarow, I. R. H. T.; Manoppo, C. T. M.; Salaki, R. J.
2018-02-01
The current growth of data and information occurs rapidly, in varying amounts and media. This development will eventually produce large volumes of data, better known as Big Data. Business Intelligence (BI) utilizes large volumes of data and information for analysis so that important information can be obtained and used to support the decision-making process. In practice, a process integrating existing data and information into a data warehouse is needed. This data integration process is known as Extract, Transformation and Loading (ETL). Many applications have been developed to carry out the ETL process, but selecting the application that is most effective and efficient in terms of time, cost, and effort can be a challenge. Therefore, the objective of the study was to provide a comparative analysis of the ETL process using Microsoft SQL Server Integration Services (SSIS) and Pentaho Data Integration (PDI).
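Independent of the SSIS-versus-PDI comparison, the three ETL stages themselves can be shown in a few lines. The following Python stand-in (invented toy data; neither tool's engine) extracts records from a CSV source, transforms types and codes, and loads the result into a SQLite warehouse table.

import csv, io, sqlite3

# Extract: read raw records (a CSV string stands in for a source system)
raw = io.StringIO("id,amount,country\n1, 100 ,us\n2,250,DE\n")
rows = list(csv.DictReader(raw))

# Transform: clean types and normalize country codes
clean = [(int(r["id"]), float(r["amount"]), r["country"].strip().upper())
         for r in rows]

# Load: insert into the warehouse table
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL, country TEXT)")
db.executemany("INSERT INTO sales VALUES (?, ?, ?)", clean)
print(db.execute("SELECT * FROM sales").fetchall())

Tools like SSIS and PDI wrap exactly these stages in graphical pipelines, connectors, and scheduling; the comparison in the paper is about how efficiently each engine executes them.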
Allen, Thomas J.; Sherman, Jeffrey W.; Conrey, Frederica R.; Stroessner, Steven J.
2009-01-01
In two experiments, we investigated the relationships among stereotype strength, processing capacity, and the allocation of attention to stereotype-consistent versus stereotype-inconsistent information describing a target person. The results of both experiments showed that, with full capacity, greater stereotype strength was associated with increased attention toward stereotype-consistent versus stereotype-inconsistent information. However, when capacity was diminished, greater stereotype strength was associated with increased attention toward inconsistent versus consistent information. Thus, strong stereotypes may act as self-confirming filters when processing capacity is plentiful, but as efficient information gathering devices that maximize the acquisition of novel (disconfirming) information when capacity is depleted. Implications for models of stereotyping and stereotype change are discussed. PMID:20161043
Testing market informational efficiency of Constanta port operators
NASA Astrophysics Data System (ADS)
Roşca, E.; Popa, M.; Ruscă, F.; Burciu, Ş.
2015-11-01
The Romanian capital market is still an emergent one. Following the mass-privatization process and subsequent private investments, three of the most important handling and storage companies operating in Constantza Port (OIL Terminal, Comvex and SOCEP) are listed on the Romanian Stock Exchange. The paper investigates their evolution on the market, identifying the expected rate of return and the components of share risk (specific and systematic). The price evolution is also analyzed in terms of informational efficiency, that is, how quickly prices incorporate relevant information. The Jarque-Bera normality test on the distribution of share return rates and the Fama test for informational efficiency are completed for each company. The market price model is taken into consideration for price forecasting, computing the return-rate autocorrelations. The results are interpreted in light of additional managerial and financial information on the companies' activity.
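The two statistical checks named above are easy to reproduce. The sketch below runs scipy's Jarque-Bera test and a lag-1 autocorrelation of returns (a simple weak-form efficiency diagnostic in the spirit of the Fama test) on a synthetic price series, not on OIL Terminal, Comvex or SOCEP data.

import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.02, 500)))  # stand-in for daily closes
returns = np.diff(np.log(prices))

# Normality of returns (Jarque-Bera): H0 = skewness 0 and excess kurtosis 0
jb_stat, jb_p = stats.jarque_bera(returns)

# Weak-form efficiency check: lag-1 autocorrelation of returns should be ~0
r = returns - returns.mean()
rho1 = (r[:-1] @ r[1:]) / (r @ r)
print(f"JB statistic {jb_stat:.2f} (p = {jb_p:.2f}), lag-1 autocorrelation {rho1:.3f}")

Under informational efficiency, returns are close to unpredictable, so the autocorrelation hovers near zero; a persistent nonzero value would suggest exploitable structure in prices.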
Apply creative thinking of decision support in electrical nursing record.
Hao, Angelica Te-Hui; Hsu, Chien-Yeh; Li-Fang, Huang; Jian, Wen-Shan; Wu, Li-Bin; Kao, Ching-Chiu; Lu, Mei-Show; Chang, Her-Kung
2006-01-01
The nursing process consists of five interrelated steps: assessment, diagnosis, planning, intervention, and evaluation. In the nursing process, the nurse collects a great deal of data and information, and the amount may exceed what the nurse can process efficiently and correctly. Thus, the nurse needs assistance to become proficient in the planning of nursing care, owing to the difficulty of simultaneously processing a large set of information. Computer systems are viewed as tools to expand the capabilities of the nurse's mind. Using computer technology to support clinicians' decision making may provide high-quality, patient-centered, and efficient healthcare. Although some existing nursing information systems aid the nursing process, they provide only the most fundamental decision support, i.e., standard care plans associated with common nursing diagnoses. Such a computerized decision support system helps the nurse develop a care plan step by step, but it does not assist the nurse in the decision-making process itself: the decisions about how to generate nursing diagnoses from data and how to individualize care plans still remain with the nurse. The purpose of this study is to develop a pilot structure for an electronic nursing record system, integrated with international nursing standards, for improving the proficiency and accuracy of care planning in the clinical pathway process. The proposed pilot system assists not only student nurses and nurses who are novices in nursing practice, but also experts who must work in a practice area with which they are not familiar.
Deficits in context-dependent adaptive coding of reward in schizophrenia
Kirschner, Matthias; Hager, Oliver M; Bischof, Martin; Hartmann-Riemer, Matthias N; Kluge, Agne; Seifritz, Erich; Tobler, Philippe N; Kaiser, Stefan
2016-01-01
Theoretical principles of information processing and empirical findings suggest that to efficiently represent all possible rewards in the natural environment, reward-sensitive neurons have to adapt their coding range dynamically to the current reward context. Adaptation ensures that the reward system is most sensitive for the most likely rewards, enabling the system to efficiently represent a potentially infinite range of reward information. A deficit in neural adaptation would prevent precise representation of rewards and could have detrimental effects for an organism’s ability to optimally engage with its environment. In schizophrenia, reward processing is known to be impaired and has been linked to different symptom dimensions. However, despite the fundamental significance of coding reward adaptively, no study has elucidated whether adaptive reward processing is impaired in schizophrenia. We therefore studied patients with schizophrenia (n=27) and healthy controls (n=25), using functional magnetic resonance imaging in combination with a variant of the monetary incentive delay task. Compared with healthy controls, patients with schizophrenia showed less efficient neural adaptation to the current reward context, which leads to imprecise neural representation of reward. Importantly, the deficit correlated with total symptom severity. Our results suggest that some of the deficits in reward processing in schizophrenia might be due to inefficient neural adaptation to the current reward context. Furthermore, because adaptive coding is a ubiquitous feature of the brain, we believe that our findings provide an avenue in defining a general impairment in neural information processing underlying this debilitating disorder. PMID:27430009
Information technology: changing nursing processes at the point-of-care.
Courtney, Karen L; Demiris, George; Alexander, Greg L
2005-01-01
Changing societal demographics, increasing complexity in healthcare knowledge, and increasing nursing shortages have led healthcare strategists to call for a redesign of the healthcare system. Embedded within most redesign recommendations is the increased use of technology to make nursing practice more efficient. However, information technology (IT) has the potential to go beyond simple efficiency increases. If IT is perceived truly as a part of the redesign of healthcare delivery rather than simply the automation of existing processes, then it can change nursing processes within institutions and furthermore change the point-of-care between nurses and patients. Nursing adoption of technology within the workplace is a result of the interactions between technical skills, social acceptance, and workplace culture. Nursing needs for information not only influence their adoption of particular technologies but also shape their design. The objective of this article is to illustrate how IT can change not only nursing practice and processes but also the point-of-care. A case study of the use of IT by nurses in telehomecare is presented and administrative implications are discussed.
An efficiency improvement in warehouse operation using simulation analysis
NASA Astrophysics Data System (ADS)
Samattapapong, N.
2017-11-01
In general, industry requires an efficient system for warehouse operation. There are many important factors that must be considered when designing an efficient warehouse system, the most important being an effective warehouse operation system that can help transfer raw material, reduce costs, and support transportation. Given these factors, researchers are interested in studying warehouse work systems and distribution. We start by collecting the important data for storage, such as information on products, size and location, data collection, and production, and use all this information to build a simulation model in the Flexsim® simulation software. The simulation analysis found that the conveyor belt was a bottleneck in the warehouse operation. Therefore, many scenarios to alleviate that problem were generated and tested through simulation analysis. The results showed that the average queuing time was reduced from 89.8% to 48.7% and the ability to transport products increased from 10.2% to 50.9%. Thus, it can be stated that this is the best method for increasing efficiency in the warehouse operation.
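The bottleneck finding can be illustrated with a far smaller model than the Flexsim® one: a single-server queue standing in for the conveyor. The sketch below uses invented arrival and service rates; waiting explodes when the conveyor's service rate barely exceeds the arrival rate and shrinks as capacity is added.

import random

random.seed(4)

# Single conveyor modeled as one server: pallets arrive, queue, get transferred.
def simulate(n=10000, arrival_rate=1.0, service_rate=1.1):
    t, free_at, wait_total = 0.0, 0.0, 0.0
    for _ in range(n):
        t += random.expovariate(arrival_rate)     # next pallet arrival
        start = max(t, free_at)                   # wait if the conveyor is busy
        wait_total += start - t
        free_at = start + random.expovariate(service_rate)
    return wait_total / n

print(f"mean queue wait: {simulate():.2f}")
print(f"with faster conveyor: {simulate(service_rate=2.0):.2f}")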
ParaBTM: A Parallel Processing Framework for Biomedical Text Mining on Supercomputers.
Xing, Yuting; Wu, Chengkun; Yang, Xi; Wang, Wei; Zhu, En; Yin, Jianping
2018-04-27
A prevailing way of extracting valuable information from biomedical literature is to apply text mining methods on unstructured texts. However, the massive amount of literature that needs to be analyzed poses a big data challenge to the processing efficiency of text mining. In this paper, we address this challenge by introducing parallel processing on a supercomputer. We developed paraBTM, a runnable framework that enables parallel text mining on the Tianhe-2 supercomputer. It employs a low-cost yet effective load balancing strategy to maximize the efficiency of parallel processing. We evaluated the performance of paraBTM on several datasets, utilizing three types of named entity recognition tasks as demonstration. Results show that, in most cases, the processing efficiency can be greatly improved with parallel processing, and the proposed load balancing strategy is simple and effective. In addition, our framework can be readily applied to other tasks of biomedical text mining besides NER.
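ParaBTM's exact load-balancing strategy is not reproduced here, but the shape of the problem can be sketched with Python's multiprocessing: documents vary greatly in length, so scheduling the longest ones first (a longest-processing-time heuristic, assumed for illustration) keeps workers evenly busy. The entity list and documents are invented.

import multiprocessing as mp

def count_mentions(doc):
    """Stand-in for an NER task: count occurrences of known entity names."""
    entities = ("BRCA1", "TP53", "EGFR")
    return sum(doc.count(e) for e in entities)

if __name__ == "__main__":
    docs = ["TP53 and EGFR interact...", "BRCA1 variant reported. TP53 ..."] * 1000
    # Simple load balancing: schedule the longest documents first so no worker
    # is left holding one huge abstract at the end of the run.
    docs.sort(key=len, reverse=True)
    with mp.Pool(processes=4) as pool:
        counts = pool.map(count_mentions, docs, chunksize=64)
    print(sum(counts), "entity mentions")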
Yusof, Maryati Mohd; Khodambashi, Soudabeh; Mokhtar, Ariffin Marzuki
2012-12-21
There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy.
Sensitivity to the Sampling Process Emerges From the Principle of Efficiency.
Jara-Ettinger, Julian; Sun, Felix; Schulz, Laura; Tenenbaum, Joshua B
2018-05-01
Humans can seamlessly infer other people's preferences, based on what they do. Broadly, two types of accounts have been proposed to explain different aspects of this ability. The first account focuses on spatial information: Agents' efficient navigation in space reveals what they like. The second account focuses on statistical information: Uncommon choices reveal stronger preferences. Together, these two lines of research suggest that we have two distinct capacities for inferring preferences. Here we propose that this is not the case, and that spatial-based and statistical-based preference inferences can be explained by the assumption that agents are efficient alone. We show that people's sensitivity to spatial and statistical information when they infer preferences is best predicted by a computational model of the principle of efficiency, and that this model outperforms dual-system models, even when the latter are fit to participant judgments. Our results suggest that, as adults, a unified understanding of agency under the principle of efficiency underlies our ability to infer preferences. Copyright © 2018 Cognitive Science Society, Inc.
An Ada implementation of the network manager for the advanced information processing system
NASA Technical Reports Server (NTRS)
Nagle, Gail A.
1986-01-01
From an implementation standpoint, the Ada language provided many features which facilitated the data and procedure abstraction process. The language supported a design which was dynamically flexible (despite strong typing), modular, and self-documenting. Adequate training of programmers requires access to an efficient compiler which supports full Ada. When the performance issues for real time processing are finally addressed by more stringent requirements for tasking features and the development of efficient run-time environments for embedded systems, the full power of the language will be realized.
A new perspective on the perceptual selectivity of attention under load.
Giesbrecht, Barry; Sy, Jocelyn; Bundesen, Claus; Kyllingsbaek, Søren
2014-05-01
The human attention system helps us cope with a complex environment by supporting the selective processing of information relevant to our current goals. Understanding the perceptual, cognitive, and neural mechanisms that mediate selective attention is a core issue in cognitive neuroscience. One prominent model of selective attention, known as load theory, offers an account of how task demands determine when information is selected and an account of the efficiency of the selection process. However, load theory has several critical weaknesses that suggest that it is time for a new perspective. Here we review the strengths and weaknesses of load theory and offer an alternative biologically plausible computational account that is based on the neural theory of visual attention. We argue that this new perspective provides a detailed computational account of how bottom-up and top-down information is integrated to provide efficient attentional selection and allocation of perceptual processing resources. © 2014 New York Academy of Sciences.
FPGA implementation of sparse matrix algorithm for information retrieval
NASA Astrophysics Data System (ADS)
Bojanic, Slobodan; Jevtic, Ruzica; Nieto-Taladriz, Octavio
2005-06-01
Information text data retrieval requires a tremendous amount of processing time because of the size of the data and the complexity of information retrieval algorithms. In this paper, a solution to this problem is proposed via hardware-supported information retrieval algorithms. Reconfigurable computing accommodates frequent hardware modifications through its tailorable hardware and exploits parallelism for a given application through reconfigurable and flexible hardware units; the degree of parallelism can be tuned to the data. In this work we implemented the standard BLAS (basic linear algebra subprograms) sparse matrix algorithm named Compressed Sparse Row (CSR), which is shown to be more efficient in terms of storage space requirements and query-processing time than other sparse matrix algorithms for information retrieval applications. Although the inverted index has been treated as the de facto standard for information retrieval for years, an alternative approach that stores the index of a text collection in a sparse matrix structure has gained attention. This approach performs query processing using sparse matrix-vector multiplication and, due to parallelization, achieves substantial efficiency over the sequential inverted index. The parallel implementations of the information retrieval kernel presented in this work target the Virtex II Field Programmable Gate Array (FPGA) board from Xilinx. A recent development in scientific applications is the use of FPGAs to achieve high-performance results. Computational results are compared to implementations on other platforms. The design achieves a high level of parallelism for the overall function while retaining highly optimised hardware within the processing unit.
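For reference, the CSR kernel at the heart of this approach is compact. The sketch below is a plain scalar Python version; the point of the FPGA design is to run the per-row dot products in parallel. The tiny document-term matrix and query are invented.

import numpy as np

def csr_matvec(data, indices, indptr, x):
    """y = A @ x for A in compressed sparse row form:
    row i's nonzeros are data[indptr[i]:indptr[i+1]]
    in columns indices[indptr[i]:indptr[i+1]]."""
    y = np.zeros(len(indptr) - 1)
    for i in range(len(y)):
        lo, hi = indptr[i], indptr[i + 1]
        y[i] = data[lo:hi] @ x[indices[lo:hi]]
    return y

# Toy document-term matrix (rows = documents, columns = terms); the query
# vector weights terms, so A @ q scores every document in one pass.
data = np.array([1.0, 2.0, 1.0, 3.0])
indices = np.array([0, 2, 1, 2])          # column of each stored value
indptr = np.array([0, 2, 4])              # row i spans indptr[i]:indptr[i+1]
q = np.array([1.0, 0.0, 1.0])             # query hits terms 0 and 2
print(csr_matvec(data, indices, indptr, q))   # [3. 3.]

Because only nonzeros are stored and touched, both memory and work scale with the number of term occurrences rather than with the full matrix size.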
7 CFR 1219.15 - Industry information.
Code of Federal Regulations, 2010 CFR
2010-01-01
... efficiency in processing, enhance the development of new markets and marketing strategies, increase marketing... 7 Agriculture 10 2010-01-01 2010-01-01 false Industry information. 1219.15 Section 1219.15 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (MARKETING...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nimbalkar, Sachin U.; Guo, Wei; Wenning, Thomas J.
Smart manufacturing and advanced data analytics can help the manufacturing sector unlock energy efficiency from the equipment level to the entire manufacturing facility and the whole supply chain. These technologies can make manufacturing industries more competitive, with intelligent communication systems, real-time energy savings, and increased energy productivity. Smart manufacturing can give all employees in an organization the actionable information they need, when they need it, so that each person can contribute to the optimal operation of the corporation through informed, data-driven decision making. This paper examines smart technologies and data analytics approaches for improving energy efficiency and reducing energy costs in process-supporting energy systems. It dives into energy-saving improvement opportunities through smart manufacturing technologies and sophisticated data collection and analysis. The energy systems covered in this paper include those with motors and drives, fans, pumps, air compressors, steam, and process heating.
The Strategic Thinking Process: Efficient Mobilization of Human Resources for System Definition
Covvey, H. D.
1987-01-01
This paper describes the application of several group management techniques to the creation of needs specifications and information systems strategic plans in health care institutions. The overall process is called the “Strategic Thinking Process”. It is a formal methodology that can reduce the time and cost of creating key documents essential for the successful implementation of health care information systems.
Industrial application of semantic process mining
NASA Astrophysics Data System (ADS)
Espen Ingvaldsen, Jon; Atle Gulla, Jon
2012-05-01
Process mining relates to the extraction of non-trivial and useful information from information system event logs. It is a new research discipline that has evolved significantly since the early work on idealistic process logs. Over the last years, process mining prototypes have incorporated elements from semantics and data mining and targeted visualisation techniques that are more user-friendly to business experts and process owners. In this article, we present a framework for evaluating different aspects of enterprise process flows and address practical challenges of state-of-the-art industrial process mining. We also explore the inherent strengths of the technology for more efficient process optimisation.
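One of the simplest process mining primitives, and a useful concrete anchor here, is the directly-follows relation counted from an event log; discovery algorithms build process graphs from exactly this raw material. The log below is an invented order-handling example, not the industrial data discussed in the article.

from collections import Counter

# Event log: one trace of activity names per case (toy data)
log = [
    ["receive", "check", "approve", "ship"],
    ["receive", "check", "reject"],
    ["receive", "check", "approve", "ship"],
]

# Directly-follows relation: how often activity a is immediately followed by b
df = Counter((a, b) for trace in log for a, b in zip(trace, trace[1:]))
for (a, b), n in df.most_common():
    print(f"{a} -> {b}: {n}")

Semantic process mining, as described above, enriches these raw activity names with ontology concepts so that the counts can be aggregated and queried at the business level.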
Improving Working Memory Efficiency by Reframing Metacognitive Interpretation of Task Difficulty
ERIC Educational Resources Information Center
Autin, Frederique; Croizet, Jean-Claude
2012-01-01
Working memory capacity, our ability to manage incoming information for processing purposes, predicts achievement on a wide range of intellectual abilities. Three randomized experiments (N = 310) tested the effectiveness of a brief psychological intervention designed to boost working memory efficiency (i.e., state working memory capacity) by…
Information Processing of Remote-Sensing Data.
ERIC Educational Resources Information Center
Berry, P. A. M.; Meadows, A. J.
1987-01-01
Reviews the current status of satellite remote sensing data, including problems with efficient storage and rapid retrieval of the data, and appropriate computer graphics to process images. Areas of research concerned with overcoming these problems are described. (16 references) (CLB)
The quantum mitochondrion and optimal health
Nunn, Alistair V.W.; Guy, Geoffrey W.; Bell, Jimmy D.
2016-01-01
A sufficiently complex set of molecules, if subject to perturbation, will self-organize and show emergent behaviour. If such a system can take on information it will become subject to natural selection. This could explain how self-replicating molecules evolved into life and how intelligence arose. A pivotal step in this evolutionary process was of course the emergence of the eukaryote and the advent of the mitochondrion, which both enhanced energy production per cell and increased the ability to process, store and utilize information. Recent research suggests that from its inception life embraced quantum effects such as ‘tunnelling’ and ‘coherence’, while competition and stressful conditions provided a constant driver for natural selection. We believe that the biphasic adaptive response to stress described by hormesis, a process that captures information to enable adaptability, is central to this whole process. Critically, hormesis could improve mitochondrial quantum efficiency, improving the ATP/ROS ratio, whereas inflammation, which is tightly associated with the aging process, might do the opposite. This all suggests that to achieve optimal health and healthy aging, one has to sufficiently stress the system to ensure peak mitochondrial function, which itself could reflect selection of optimum efficiency at the quantum level. PMID:27528758
Palm, Günther
2016-01-01
Research in neural information processing has been successful in the past, providing useful approaches both to practical problems in computer science and to computational models in neuroscience. Recent developments in the area of cognitive neuroscience present new challenges for a computational or theoretical understanding, calling for neural information processing models that fulfill criteria or constraints from cognitive psychology, neuroscience, and computational efficiency. The most important of these criteria for the evaluation of present and future contributions to this new emerging field are listed at the end of this article. PMID:26858632
Distributed Load Shedding over Directed Communication Networks with Time Delays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Tao; Wu, Di
When generation is insufficient to support all loads under emergencies, effective and efficient load shedding needs to be deployed in order to maintain the supply-demand balance. This paper presents a distributed load shedding algorithm, which makes efficient decisions based on discovered global information. In the global information discovery process, each load communicates only with its neighboring loads via directed communication links, possibly with arbitrarily large but bounded time-varying communication delays. We propose a novel distributed information discovery algorithm based on ratio consensus. Simulation results are used to validate the proposed method.
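A minimal sketch of ratio consensus over a directed graph, under simplifying assumptions of our own (synchronous updates, no communication delays, which the paper explicitly handles): each node repeatedly pushes shares of a value y and a counter z to its out-neighbors, and every ratio y/z converges to the network-wide average that load-shedding decisions can then use. The network and values are invented.

import numpy as np

# Toy directed, strongly connected network: node j sends to out[j]
out = {0: [1], 1: [2], 2: [0, 3], 3: [0]}
n = 4
x = np.array([10.0, 4.0, 6.0, 20.0])     # local values (e.g. sheddable load)

y, z = x.copy(), np.ones(n)              # ratio-consensus state
for _ in range(200):
    y_new, z_new = np.zeros(n), np.zeros(n)
    for j in range(n):
        share = 1.0 / (len(out[j]) + 1)  # split mass among self + out-neighbors
        y_new[j] += share * y[j]
        z_new[j] += share * z[j]
        for i in out[j]:
            y_new[i] += share * y[j]
            z_new[i] += share * z[j]
    y, z = y_new, z_new

print(y / z)   # every entry converges to mean(x) = 10.0

The trick is that each node needs only its own out-degree, not global weights, which is what makes the scheme suitable for directed communication topologies.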
Efficient terrestrial laser scan segmentation exploiting data structure
NASA Astrophysics Data System (ADS)
Mahmoudabadi, Hamid; Olsen, Michael J.; Todorovic, Sinisa
2016-09-01
New technologies such as lidar enable the rapid collection of massive datasets to model a 3D scene as a point cloud. However, while hardware technology continues to advance, processing 3D point clouds into informative models remains complex and time consuming. A common approach to increase processing efficiency is to segment the point cloud into smaller sections. This paper proposes a novel approach for point cloud segmentation that uses computer vision algorithms to analyze panoramic representations of individual laser scans. These panoramas can be quickly created using an inherent neighborhood structure that is established during the scanning process, which proceeds at fixed angular increments in a cylindrical or spherical coordinate system. In the proposed approach, a selected image segmentation algorithm is applied to several input layers exploiting this angular structure, including laser intensity, range, normal vectors, and color information. These segments are then mapped back to the 3D point cloud so that modeling can be completed more efficiently. This approach does not depend on pre-defined mathematical models and consequently on setting parameters for them. Unlike common geometric point cloud segmentation methods, the proposed method employs the colorimetric and intensity data as another source of information. The proposed algorithm is demonstrated on several datasets encompassing a variety of scenes and objects. Results show a very high perceptual (visual) level of segmentation and thereby the feasibility of the proposed algorithm. The proposed method is also more efficient than Random Sample Consensus (RANSAC), a common approach for point cloud segmentation.
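The panoramic representation exploits exactly the angular structure described above. The sketch below (random points and arbitrary panorama dimensions, for illustration only) bins points by azimuth and elevation into a 2-D range image; intensity, color, and normal layers would be filled the same way before a 2-D segmentation algorithm is applied.

import numpy as np

def scan_to_panorama(points, h=64, w=128):
    """Project x,y,z points from a fixed scanner into a 2-D range panorama
    using the scan's natural spherical (azimuth, elevation) structure."""
    x, y, z = points.T
    r = np.linalg.norm(points, axis=1)
    az = np.arctan2(y, x)                       # -pi..pi      -> column
    el = np.arcsin(z / r)                       # -pi/2..pi/2  -> row
    col = ((az + np.pi) / (2 * np.pi) * (w - 1)).astype(int)
    row = ((el + np.pi / 2) / np.pi * (h - 1)).astype(int)
    pano = np.full((h, w), np.nan)
    pano[row, col] = r                          # one layer; intensity etc. analogous
    return pano

pts = np.random.default_rng(0).normal(size=(5000, 3))   # stand-in for a scan
pano = scan_to_panorama(pts)
print(np.nanmin(pano), np.nanmax(pano))

Segment labels computed on the panorama can be carried back to 3D simply by remembering which (row, col) cell each point fell into.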
ERIC Educational Resources Information Center
Olaniran, Bolanle A.
2010-01-01
The semantic web describes the process whereby information content is made available for machine consumption. With increased reliance on information communication technologies, the semantic web promises effective and efficient information acquisition and dissemination of products and services in the global economy, in particular, e-learning.…
On the assessment of visual communication by information theory
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.; Fales, Carl L.
1993-01-01
This assessment of visual communication integrates the optical design of the image-gathering device with the digital processing for image coding and restoration. Results show that informationally optimized image gathering ordinarily can be relied upon to maximize the information efficiency of decorrelated data and the visual quality of optimally restored images.
Choi, Inyoung; Choi, Ran; Lee, Jonghyun
2010-01-01
Objectives The objective of this research is to introduce the unique approach of the Catholic Medical Center (CMC) to integrating network hospitals, with the organizational and technical methodologies adopted for seamless implementation. Methods The Catholic Medical Center developed a new hospital information system to connect network hospitals and adopted a new information technology architecture that uses a single source for multiple distributed hospital systems. Results The hospital information system of the CMC was developed to integrate network hospitals following new system development principles: one source, one route, and one management. This information architecture has reduced the cost of system development and operation and has enhanced the efficiency of the management process. Conclusions Integrating network hospitals through an information system was not simple; it was much more complicated than a single-organization implementation. We are still looking for more efficient communication channels and decision-making processes, and we believe that our new system architecture will improve the CMC health care system and provide a much better quality of health care service to patients and customers. PMID:21818432
75 FR 42296 - Safe, Efficient Use and Preservation of the Navigable Airspace
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-21
... facilitates the aeronautical study process and has reduced the overall processing time for these cases. The... cases to be processed, particularly if additional information, via public comment period, was necessary... the permit application is not necessary. There are cases where circulating the proposal for public...
78 FR 53436 - Improving Performance of Federal Permitting and Review of Infrastructure Projects
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-29
... an efficient decision-making process within each agency; to the extent possible, unifying and... IIP Process, the developer is encouraged to inform DOE in writing as soon as possible of its decision... to improve the performance of Federal siting, permitting, and review processes for infrastructure...
Ho, Tiffany C; Zhang, Shunan; Sacchet, Matthew D; Weng, Helen; Connolly, Colm G; Henje Blom, Eva; Han, Laura K M; Mobayed, Nisreen O; Yang, Tony T
2016-01-01
While the extant literature has focused on major depressive disorder (MDD) as being characterized by abnormalities in processing affective stimuli (e.g., facial expressions), little is known regarding which specific aspects of cognition influence the evaluation of affective stimuli, and what are the underlying neural correlates. To investigate these issues, we assessed 26 adolescents diagnosed with MDD and 37 well-matched healthy controls (HCL) who completed an emotion identification task of dynamically morphing faces during functional magnetic resonance imaging (fMRI). We analyzed the behavioral data using a sequential sampling model of response time (RT) commonly used to elucidate aspects of cognition in binary perceptual decision making tasks: the Linear Ballistic Accumulator (LBA) model. Using a hierarchical Bayesian estimation method, we obtained group-level and individual-level estimates of LBA parameters on the facial emotion identification task. While the MDD and HCL groups did not differ in mean RT, accuracy, or group-level estimates of perceptual processing efficiency (i.e., drift rate parameter of the LBA), the MDD group showed significantly reduced responses in left fusiform gyrus compared to the HCL group during the facial emotion identification task. Furthermore, within the MDD group, fMRI signal in the left fusiform gyrus during affective face processing was significantly associated with greater individual-level estimates of perceptual processing efficiency. Our results therefore suggest that affective processing biases in adolescents with MDD are characterized by greater perceptual processing efficiency of affective visual information in sensory brain regions responsible for the early processing of visual information. The theoretical, methodological, and clinical implications of our results are discussed.
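For readers unfamiliar with the LBA, the sketch below simulates the basic model for a two-choice task: each accumulator starts at a uniform point in [0, A], rises linearly with a drift drawn from Normal(v_i, s), and the first to reach threshold b determines the response. Parameter values are illustrative only; this is a toy simulation, not the authors' hierarchical Bayesian fitting code.

```python
import numpy as np

def simulate_lba(n_trials, v=(1.2, 0.8), s=0.3, A=0.5, b=1.0, t0=0.2, seed=None):
    """Simulate two LBA accumulators racing to threshold b.

    RT = t0 + (b - start) / drift for the winning accumulator.
    """
    rng = np.random.default_rng(seed)
    starts = rng.uniform(0, A, size=(n_trials, 2))
    # floor rare negative drifts so both accumulators always finish
    drifts = np.maximum(rng.normal(v, s, size=(n_trials, 2)), 1e-6)
    times = t0 + (b - starts) / drifts
    choice = np.argmin(times, axis=1)            # 0 = first accumulator wins
    rt = times[np.arange(n_trials), choice]
    return choice, rt

choices, rts = simulate_lba(10_000, seed=0)
print("P(choice 0) =", (choices == 0).mean(), " mean RT =", rts.mean())
```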
Energy-efficient neural information processing in individual neurons and neuronal networks.
Yu, Lianchun; Yu, Yuguo
2017-11-01
Brains are composed of networks of an enormous number of neurons interconnected with synapses. Neural information is carried by the electrical signals within neurons and the chemical signals among neurons. Generating these electrical and chemical signals is metabolically expensive. The fundamental issue raised here is whether brains have evolved efficient ways of developing an energy-efficient neural code from the molecular level to the circuit level. Here, we summarize the factors and biophysical mechanisms that could contribute to the energy-efficient neural code for processing input signals. The factors include ion channel kinetics, body temperature, axonal propagation of action potentials, low-probability release of synaptic neurotransmitters, optimal input and noise, the size of neurons and neuronal clusters, excitation/inhibition balance, coding strategy, cortical wiring, and the organization of functional connectivity. Both experimental and computational evidence suggests that neural systems may use these factors to maximize the efficiency of energy consumption in processing neural signals. Studies indicate that efficient energy utilization may be universal in neuronal systems as an evolutionary consequence of the pressure of limited energy. As a result, neuronal connections may be wired in a highly economical manner to lower energy costs and space. Individual neurons within a network may encode independent stimulus components to allow a minimal number of neurons to represent whole stimulus characteristics efficiently. This basic principle may fundamentally change our view of how billions of neurons organize themselves into complex circuits to operate and generate the most powerful intelligent cognition in nature. © 2017 Wiley Periodicals, Inc.
45 CFR 205.36 - State plan requirements.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., of an automated statewide management information system designed effectively and efficiently, to assist management in the administration of an approved AFDC State plan. The submission process to amend... aid or services is changed. (c) To electronically refer and exchange information with programs under...
NASA Astrophysics Data System (ADS)
Dettmer, J.; Quijano, J. E.; Dosso, S. E.; Holland, C. W.; Mandolesi, E.
2016-12-01
Geophysical seabed properties are important for the detection and classification of unexploded ordnance. However, current surveying methods such as vertical seismic profiling, coring, or inversion are of limited use when surveying large areas with high spatial sampling density. We consider surveys based on a source and receiver array towed by an autonomous vehicle, which produce large volumes of seabed reflectivity data that contain unprecedented and detailed seabed information. The data are analyzed with a particle filter, which requires efficient reflection-coefficient computation, efficient inversion algorithms, and efficient use of computer resources. The filter quantifies information content of multiple sequential data sets by considering results from previous data along the survey track to inform the importance sampling at the current point. Challenges arise from environmental changes along the track where the number of sediment layers and their properties change. This is addressed by a trans-dimensional model in the filter which allows layering complexity to change along the track. Efficiency is improved by likelihood tempering of various particle subsets and including exchange moves (parallel tempering). The filter is implemented on a hybrid computer that combines central processing units (CPUs) and graphics processing units (GPUs) to exploit three levels of parallelism: (1) fine-grained parallel computation of spherical reflection coefficients with a GPU implementation of Levin integration; (2) updating particles by concurrent CPU processes which exchange information using automatic load balancing (coarse-grained parallelism); (3) overlapping CPU-GPU communication (a major bottleneck) with GPU computation by staggering CPU access to the multiple GPUs. The algorithm is applied to spherical reflection coefficients for data sets along a 14-km track on the Malta Plateau, Mediterranean Sea. We demonstrate substantial efficiency gains over previous methods. [This research was supported in part by the U.S. Dept. of Defense, through the Strategic Environmental Research and Development Program (SERDP).]
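The exchange (replica-swap) step mentioned above can be illustrated compactly. Below is a minimal sketch of the Metropolis acceptance rule for swapping states between two likelihood-tempered chains; the log-likelihood values and inverse temperatures are hypothetical, and the authors' GPU implementation is of course far more involved.

```python
import numpy as np

def exchange_move(logL_a, logL_b, beta_a, beta_b, rng):
    """Metropolis acceptance for swapping states between two chains at
    inverse temperatures beta_a and beta_b (likelihood tempering).

    Accept with probability min(1, exp((beta_a - beta_b) * (logL_b - logL_a))),
    which preserves the tempered target of each chain.
    """
    log_alpha = (beta_a - beta_b) * (logL_b - logL_a)
    return np.log(rng.uniform()) < log_alpha

rng = np.random.default_rng(0)
# hypothetical log-likelihoods of the current states of two chains
accepted = exchange_move(logL_a=-120.3, logL_b=-118.9, beta_a=1.0, beta_b=0.5, rng=rng)
print("swap accepted:", accepted)
```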
Sensitivity Analysis and Optimization of Enclosure Radiation with Applications to Crystal Growth
NASA Technical Reports Server (NTRS)
Tiller, Michael M.
1995-01-01
In engineering, simulation software is often used as a convenient means for carrying out experiments to evaluate physical systems. The benefit of using simulations as 'numerical' experiments is that the experimental conditions can be easily modified and repeated at much lower cost than the comparable physical experiment. The goal of these experiments is to 'improve' the process or result of the experiment. In most cases, the computational experiments employ the same trial-and-error approach as their physical counterparts. When using this approach for complex systems, the cause-and-effect relationship of the system may never be fully understood and efficient strategies for improvement never utilized. However, it is possible when running simulations to accurately and efficiently determine the sensitivity of the system results with respect to simulation parameters (e.g., initial conditions, boundary conditions, and material properties) by manipulating the underlying computations. This results in a better understanding of the system dynamics and gives us efficient means to improve processing conditions. We begin by discussing the steps involved in performing simulations. Then we consider how sensitivity information about simulation results can be obtained and ways this information may be used to improve the process or result of the experiment. Next, we discuss optimization and the efficient algorithms which use sensitivity information. We draw on all this information to propose a generalized approach for integrating simulation and optimization, with an emphasis on software programming issues. After discussing our approach to simulation and optimization, we consider an application involving crystal growth. This application is interesting because it includes radiative heat transfer. We discuss the computation of radiative view factors and the impact this mode of heat transfer has on our approach. Finally, we demonstrate the results of our optimization.
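A minimal sketch of the sensitivity idea follows, assuming a generic run_simulation(params) interface (hypothetical): perturb one parameter and form a central finite difference. Production codes instead differentiate the underlying computations directly, as the abstract describes, which is both more accurate and cheaper.

```python
def sensitivity(run_simulation, params, name, rel_step=1e-6):
    """Central finite-difference estimate of d(output)/d(param).

    run_simulation(params) -> scalar result is a stand-in for any
    simulation code; `name` selects the parameter to perturb.
    """
    base = params[name]
    h = rel_step * max(abs(base), 1.0)
    hi = dict(params, **{name: base + h})
    lo = dict(params, **{name: base - h})
    return (run_simulation(hi) - run_simulation(lo)) / (2 * h)

# toy stand-in for a heat-transfer simulation: result grows with conductivity k
toy_sim = lambda p: p["k"] ** 0.5 * p["T_boundary"]
print(sensitivity(toy_sim, {"k": 4.0, "T_boundary": 300.0}, "k"))  # ~75.0
```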
Light storage in a cold atomic ensemble with a high optical depth
NASA Astrophysics Data System (ADS)
Park, Kwang-Kyoon; Chough, Young-Tak; Kim, Yoon-Ho
2017-06-01
A quantum memory with a high storage efficiency and a long coherence time is an essential element in quantum information applications. Here, we report our recent development of an optical quantum memory with a rubidium-87 cold atom ensemble. By increasing the optical depth of the medium, we have achieved a storage efficiency of 65% and a coherence time of 51 μs for a weak laser pulse. The result of a numerical analysis based on the Maxwell-Bloch equations agrees well with the experimental results. Our result paves the way toward an efficient optical quantum memory and may find applications in photonic quantum information processing.
Efficient Saccade Planning Requires Time and Clear Choices
Ghahghaei, Saiedeh; Verghese, Preeti
2015-01-01
We use eye movements constantly to gather information. Saccades are efficient when they maximize the information required for the task; however, there is controversy regarding the efficiency of eye movement planning. For example, saccades are efficient when searching for a single target (Nature, 434 (2005) 387–91), but are inefficient when searching for an unknown number of targets in noise, particularly under time pressure (Vision Research 74 (2012), 61–71). In this study, we used a multiple-target search paradigm and explored whether altering the noise level or increasing saccadic latency improved efficiency. Experiments used stimuli with two levels of discriminability such that saccades to the less discriminable stimuli provided more information. When these two noise levels corresponded to low and moderate visibility, most observers did not preferentially select informative locations, but looked at uncertain and probable target locations equally often. We then examined whether eye movements could be made more efficient by increasing the discriminability of the two stimulus levels and by delaying the first saccade so that there was more time for decision processes to influence saccade choices. Some observers did indeed increase the proportion of their saccades to informative locations under these conditions. Others, however, made as many saccades as they could during the limited time and were unselective about the saccade goal. A clear trend that emerges across all experiments is that conditions with a greater proportion of efficient saccades are associated with a longer latency to initiate saccades, suggesting that the choice of informative locations requires deliberate planning. PMID:26037735
DESIGNING EFFICIENT, ECONOMIC AND ENVIRONMENTALLY FRIENDLY CHEMICAL PROCESSES
A catalytic reforming process has been studied using hierarchical design and simulation calculations. Approximations for the fugitive emissions indicate which streams allow the most value to be lost and which have the highest potential environmental impact. One can use this inform...
The forest, the trees, and the leaves: Differences of processing across development.
Krakowski, Claire-Sara; Poirel, Nicolas; Vidal, Julie; Roëll, Margot; Pineau, Arlette; Borst, Grégoire; Houdé, Olivier
2016-08-01
To act and think, children and adults are continually required to ignore irrelevant visual information to focus on task-relevant items. As real-world visual information is organized into structures, we designed a feature visual search task containing 3-level hierarchical stimuli (i.e., local shapes that constituted intermediate shapes that formed the global figure) that was presented to 112 participants aged 5, 6, 9, and 21 years. This task allowed us to explore (a) which level is perceptively the most salient at each age (i.e., the fastest detected level) and (b) what kind of attentional processing occurs for each level across development (i.e., efficient processing: detection time does not increase with the number of stimuli on the display; less efficient processing: detection time increases linearly with the growing number of distractors). Results showed that the global level was the most salient at 5 years of age, whereas the global and intermediate levels were both salient for 9-year-olds and adults. Interestingly, at 6 years of age, the intermediate level was the most salient level. Second, all participants showed an efficient processing of both intermediate and global levels of hierarchical stimuli, and a less efficient processing of the local level, suggesting a local disadvantage rather than a global advantage in visual search. The cognitive cost for selecting the local target was higher for 5- and 6-year-old children compared to 9-year-old children and adults. These results are discussed with regard to the development of executive control. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Lo Storto, Corrado
2013-11-01
This paper presents an integrative framework to evaluate ecommerce website efficiency from the user viewpoint using Data Envelopment Analysis (DEA). This framework is inspired by concepts drawn from theories of information processing and cognition and considers website efficiency as a measure of its quality and performance. When users interact with the website interfaces to perform a task, they are involved in a cognitive effort, sustaining a cognitive cost to search, interpret, and process information, and experiencing either a sense of satisfaction or dissatisfaction as a result. The amount of ambiguity and uncertainty they perceive, and the (over-)time spent searching during navigation, determine the size of the effort - and, as a consequence, the cognitive cost - they have to bear to perform their task. Conversely, task completion and result achievement provide the users with cognitive benefits, making interaction with the website potentially attractive, satisfying, and useful. In total, 9 variables are measured, classified into a set of 3 website macro-dimensions (user experience, site navigability, and structure). The framework is implemented to compare 52 ecommerce websites that sell products in the information technology and media market. A stepwise regression is performed to assess the influence of the cognitive costs and benefits that most affect website efficiency. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
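To make the DEA machinery concrete, here is a minimal sketch of the input-oriented CCR model solved with scipy.optimize.linprog; the website data below are hypothetical, with cognitive-cost measures as inputs and benefit measures as outputs. An efficiency of 1.0 marks a unit on the efficient frontier.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of unit o.

    X: (m inputs x n units), Y: (s outputs x n units).
    Solves  min theta  s.t.  X @ lam <= theta * X[:, o],
                             Y @ lam >= Y[:, o],  lam >= 0,
    with decision vector z = [theta, lam_1..lam_n].
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                   # minimize theta
    A_in = np.c_[-X[:, [o]], X]                   # X lam - theta * x_o <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y]           # -Y lam <= -y_o
    A_ub = np.r_[A_in, A_out]
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(None, None)] + [(0, None)] * n     # theta free, lam >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[0]

# hypothetical data: inputs = cognitive-cost measures, outputs = benefit measures
X = np.array([[2.0, 3.0, 4.0], [1.0, 2.0, 1.5]])  # 2 inputs x 3 websites
Y = np.array([[5.0, 6.0, 4.0]])                   # 1 output x 3 websites
print([round(dea_ccr_efficiency(X, Y, o), 3) for o in range(3)])
```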
Friedman, Carol; Hripcsak, George; Shagina, Lyuda; Liu, Hongfang
1999-01-01
Objective: To design a document model that provides reliable and efficient access to clinical information in patient reports for a broad range of clinical applications, and to implement an automated method using natural language processing that maps textual reports to a form consistent with the model. Methods: A document model that encodes structured clinical information in patient reports while retaining the original contents was designed using the extensible markup language (XML), and a document type definition (DTD) was created. An existing natural language processor (NLP) was modified to generate output consistent with the model. Two hundred reports were processed using the modified NLP system, and the XML output that was generated was validated using an XML validating parser. Results: The modified NLP system successfully processed all 200 reports. The output of one report was invalid, and 199 reports were valid XML forms consistent with the DTD. Conclusions: Natural language processing can be used to automatically create an enriched document that contains a structured component whose elements are linked to portions of the original textual report. This integrated document model provides a representation where documents containing specific information can be accurately and efficiently retrieved by querying the structured components. If manual review of the documents is desired, the salient information in the original reports can also be identified and highlighted. Using an XML model of tagging provides an additional benefit in that software tools that manipulate XML documents are readily available. PMID:9925230
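A minimal sketch of the general idea, not the paper's actual DTD, is shown below: a structured component whose elements carry character offsets pointing back into the retained original text, so queries on structure can still highlight the source passage. Tag and attribute names are hypothetical.

```python
import xml.etree.ElementTree as ET

report_text = "No evidence of pneumonia. Mild cardiomegaly is noted."

doc = ET.Element("document")
ET.SubElement(doc, "text").text = report_text   # original report retained verbatim

structured = ET.SubElement(doc, "structured")
# each finding points back to a span of the original text via offsets
finding = ET.SubElement(structured, "finding",
                        concept="cardiomegaly", certainty="moderate",
                        start="26", end="43")
finding.text = report_text[26:43]               # "Mild cardiomegaly"

print(ET.tostring(doc, encoding="unicode"))
```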
Oppliger, Joel; da Palma, Joel Ramos; Burri, Dominique J; Bergeron, Eric; Khatib, Abdel-Majid; Spiropoulou, Christina F; Pasquato, Antonella; Kunz, Stefan
2016-01-15
Arenaviruses are emerging viruses including several causative agents of severe hemorrhagic fevers in humans. The advent of next-generation sequencing technology has greatly accelerated the discovery of novel arenavirus species. However, for many of these viruses, only genetic information is available, and their zoonotic disease potential remains unknown. During the arenavirus life cycle, processing of the viral envelope glycoprotein precursor (GPC) by the cellular subtilisin kexin isozyme 1 (SKI-1)/site 1 protease (S1P) is crucial for productive infection. The ability of newly emerging arenaviruses to hijack human SKI-1/S1P appears, therefore, to be a requirement for efficient zoonotic transmission and human disease potential. Here we implement a newly developed cell-based molecular sensor for SKI-1/S1P to characterize the processing of arenavirus GPC-derived target sequences by human SKI-1/S1P in a quantitative manner. We show that only nine amino acids flanking the putative cleavage site are necessary and sufficient to accurately recapitulate the efficiency and subcellular location of arenavirus GPC processing. In a proof of concept, our sensor correctly predicts efficient processing of the GPC of the newly emergent pathogenic Lujo virus by human SKI-1/S1P and defines the exact cleavage site. Lastly, we employed our sensor to show efficient GPC processing of a panel of pathogenic and nonpathogenic New World arenaviruses, suggesting that GPC cleavage represents no barrier for zoonotic transmission of these pathogens. Our SKI-1/S1P sensor thus represents a rapid and robust test system for assessment of the processing of putative cleavage sites derived from the GPCs of newly discovered arenavirus by the SKI-1/S1P of humans or any other species, based solely on sequence information. Arenaviruses are important emerging human pathogens that can cause severe hemorrhagic fevers with high mortality in humans. A crucial step in productive arenavirus infection of human cells is the processing of the viral envelope glycoprotein by the cellular subtilisin kexin isozyme 1 (SKI-1)/site 1 protease (S1P). In order to break the species barrier during zoonotic transmission and cause severe disease in humans, newly emerging arenaviruses must be able to hijack human SKI-1/S1P efficiently. Here we implement a newly developed cell-based molecular sensor for human SKI-1/S1P to characterize the processing of arenavirus glycoproteins in a quantitative manner. We further use our sensor to correctly predict efficient processing of the glycoprotein of the newly emergent pathogenic Lujo virus by human SKI-1/S1P. Our sensor thus represents a rapid and robust test system with which to assess whether the glycoprotein of any newly emerging arenavirus can be efficiently processed by human SKI-1/S1P, based solely on sequence information. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
Basics for sensorimotor information processing: some implications for learning
Vidal, Franck; Meckler, Cédric; Hasbroucq, Thierry
2015-01-01
In sensorimotor activities, learning requires efficient information processing, whether in car driving, sport activities, or human–machine interactions. Several factors may affect the efficiency of such processing: they may be extrinsic (i.e., task-related) or intrinsic (i.e., subject-related). The effects of these factors are intimately related to the structure of human information processing. In the present article we focus on some of them, which are poorly taken into account, even when minimizing errors or their consequences is an essential issue at stake. Among the extrinsic factors, we discuss, first, the effects of the quantity and quality of information; second, the effects of instruction; and third, motor program learning. Among the intrinsic factors, we discuss, first, the influence of prior information; second, how individual strategies affect performance; and third, we stress the fact that although the human brain is not structured to function without errors (which is not new), humans are able to detect their errors very quickly and, in most cases, fast enough to correct them before they result in an overt failure. Extrinsic and intrinsic factors are important to take into account for learning because (1) they strongly affect performance, either in terms of speed or accuracy, which facilitates or impairs learning, (2) the effect of certain extrinsic factors may be strongly modified by learning, and (3) certain intrinsic factors might be exploited for learning strategies. PMID:25762944
Poor sleep quality predicts deficient emotion information processing over time in early adolescence.
Soffer-Dudek, Nirit; Sadeh, Avi; Dahl, Ronald E; Rosenblat-Stein, Shiran
2011-11-01
There is deepening understanding of the effects of sleep on emotional information processing. Emotion information processing is a key aspect of social competence, which undergoes important maturational and developmental changes in adolescence; however, most research in this area has focused on adults. Our aim was to test the links between sleep and emotion information processing during early adolescence. Sleep and facial information processing were assessed objectively during 3 assessment waves, separated by 1-year lags. Data were obtained in natural environments: sleep was assessed in home settings, and facial information processing was assessed at school. Participants were 94 healthy children (53 girls, 41 boys), aged 10 years at Time 1. Facial information processing was tested under neutral (gender identification) and emotional (emotional expression identification) conditions. Sleep was assessed in home settings using actigraphy for 7 nights at each assessment wave. Waking > 5 min was considered a night awakening. Using multilevel modeling, elevated night awakenings and decreased sleep efficiency significantly predicted poor performance only in the emotional information processing condition (e.g., b = -1.79, SD = 0.52, confidence interval: lower boundary = -2.82, upper boundary = -0.76, t(416.94) = -3.42, P = 0.001). Poor sleep quality is associated with compromised emotional information processing during early adolescence, a sensitive period in socio-emotional development.
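As an illustration of the multilevel modeling reported above, a random-intercept mixed-effects model can be fit with statsmodels; the data frame below is entirely hypothetical and only shows the shape of such an analysis, not the study's data or exact model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical long-format data: one row per child per assessment wave
data = pd.DataFrame({
    "child":        [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "wave":         [1, 2, 3, 1, 2, 3, 1, 2, 3],
    "sleep_eff":    [91, 88, 93, 84, 80, 82, 95, 94, 96],   # actigraphic sleep efficiency (%)
    "emo_accuracy": [.78, .74, .80, .66, .60, .64, .85, .83, .86],
})

# random intercept per child; fixed effects of sleep efficiency and wave
model = smf.mixedlm("emo_accuracy ~ sleep_eff + wave", data, groups=data["child"])
result = model.fit()
print(result.summary())
```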
HMI conventions for process control graphics.
Pikaar, Ruud N
2012-01-01
Process operators supervise and control complex processes. To enable the operator to do an adequate job, instrumentation and process control engineers need to address several related topics, such as console design, information design, navigation, and alarm management. In process control upgrade projects, a 1:1 conversion of existing graphics is usually proposed. This paper suggests another approach, efficiently leading to a reduced number of new, powerful process graphics, supported by permanent process overview displays. In addition, a road map for structuring content (process information) and conventions for the presentation of objects, symbols, and so on has been developed. The impact of the human factors engineering approach on process control upgrade projects is illustrated by several cases.
Reasoning and memory: People make varied use of the information available in working memory.
Hardman, Kyle O; Cowan, Nelson
2016-05-01
Working memory (WM) is used for storing information in a highly accessible state so that other mental processes, such as reasoning, can use that information. Some WM tasks require that participants not only store information, but also reason about that information to perform optimally on the task. In this study, we used visual WM tasks that had both storage and reasoning components to determine both how ideally people are able to reason about information in WM and if there is a relationship between information storage and reasoning. We developed novel psychological process models of the tasks that allowed us to estimate for each participant both how much information they had in WM and how efficiently they reasoned about that information. Our estimates of information use showed that participants are not all ideal information users or minimal information users, but rather that there are individual differences in the thoroughness of information use in our WM tasks. However, we found that our participants tended to be more ideal than minimal. One implication of this work is that to accurately estimate the amount of information in WM, it is important to also estimate how efficiently that information is used. This new analysis contributes to the theoretical premise that human rationality may be bounded by the complexity of task demands. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
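The authors' process models jointly estimate storage and use, which is beyond a short sketch; as a baseline point of contrast, the classic change-detection capacity estimate (Cowan's k) considers storage alone:

```python
def cowan_k(set_size, hit_rate, false_alarm_rate):
    """Classic working-memory capacity estimate for change detection:
    k = N * (H - F), where N is the number of items in the display."""
    return set_size * (hit_rate - false_alarm_rate)

print(cowan_k(set_size=6, hit_rate=0.80, false_alarm_rate=0.20))  # 3.6 items
```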
Mobile mammography: An evaluation of organizational, process, and information systems challenges.
Browder, Casey; Eberth, Jan M; Schooley, Benjamin; Porter, Nancy R
2015-03-01
The purpose of this case study was to evaluate the information systems, personnel, and processes involved in mobile mammography settings, and to offer recommendations to improve efficiency and satisfaction among patients and staff. Data include on-site observations, interviews, and an electronic medical record review at a hospital that offers both mobile and fixed-facility mammography services to its community. The optimal expectations for the process of mobile mammography from multiple perspectives were defined as (1) the patient receives a mammogram the day of their visit, (2) the patient has an efficient intake process with little wait time, (3) follow-up is completed and timely, (4) the site contact and van staff are satisfied with the van visit and choose to schedule future visits, and (5) the MMU is able to assess its performance and set goals for improvement. Challenges that prevent the realization of those expectations include a low patient pre-registration rate, difficulty obtaining required physician orders, frequent information system downtime/Internet connectivity issues, ill-defined organizational communication/roles, insufficient site host/patient education, and disparate organizational and information systems. Our recommendations include employing a dedicated mobile mammography team for end-to-end oversight, mitigating system connectivity issues, allowing for patient self-referrals, integrating scheduling and registration processes, and a focused approach to educating site hosts and their patients about expectations for the day of the visit. The MMU is an important community resource; we recommend simple process and information flow improvements to further enable the MMU's goals. Copyright © 2015 Elsevier Inc. All rights reserved.
Process and representation in graphical displays
NASA Technical Reports Server (NTRS)
Gillan, Douglas J.; Lewis, Robert; Rudisill, Marianne
1993-01-01
Our initial model of graphic comprehension has focused on statistical graphs. Like other models of human-computer interaction, models of graphical comprehension can be used by human-computer interface designers and developers to create interfaces that present information in an efficient and usable manner. Our investigation of graph comprehension addresses two primary questions: how do people represent the information contained in a data graph, and how do they process information from the graph? The topics of focus for graphic representation concern the features into which people decompose a graph and the representations of the graph in memory. The issue of processing can be further analyzed as two questions: what overall processing strategies do people use, and what are the specific processing skills required?
Intelligent Work Process Engineering System
NASA Technical Reports Server (NTRS)
Williams, Kent E.
2003-01-01
Optimizing performance on work activities and processes requires metrics of performance for management to monitor and analyze in order to support further improvements in efficiency, effectiveness, safety, reliability, and cost. Information systems are therefore required to assist management in making timely, informed decisions regarding these work processes and activities. Currently, information systems regarding Space Shuttle maintenance and servicing do not exist to support such timely decisions. The work presented here details a system which incorporates various automated and intelligent processes and analysis tools to capture, organize, and analyze work process related data, in order to make the necessary decisions to meet KSC organizational goals. The advantages and disadvantages of design alternatives for the development of such a system will be discussed, including technologies which would need to be designed, prototyped, and evaluated.
Automation of Digital Collection Access Using Mobile and Wireless Data Terminals
NASA Astrophysics Data System (ADS)
Leontiev, I. V.
Information technologies have become vital due to information processing needs, database access, data analysis, and decision support. Currently, many scientific projects are oriented toward database integration of heterogeneous systems. The problem of on-line, rapid access to large integrated systems of digital collections is also very important. Users usually move between different locations, either at work or at home. In most cases users need efficient, remote access to information stored in integrated data collections. Desktop computers are unable to fulfill these needs, so mobile and wireless devices become helpful. Handhelds and data terminals are necessary in medical assistance (they store detailed information about each patient and are helpful for nurses), and immediate access to data collections is used in highway patrol services (databanks of cars, owners, and driver licenses). Using mobile access, warehouse operations can be validated. Cyclecounting of library and museum items speeds up with online barcode scanning and central database access. That is why mobile devices - cell phones, PDAs, handheld computers with wireless access, WindowsCE and PalmOS terminals - have become popular. Generally, mobile devices have a relatively slow processor and limited display capabilities, but they are effective for storing and displaying textual data, recognize user handwriting with a stylus, and support a GUI. Users can perform operations on a handheld terminal and exchange data with the main system (using immediate radio access, or offline access during the synchronization process) for update. In our report, we give an approach for mobile access to data collections which raises the efficiency of data processing in a book library, helps to control available books and books in stock, validates service charges, eliminates staff mistakes, and generates requests for book delivery. Our system uses Symbol RF mobile devices (with radio-channel access) and Symbol Palm Terminal data terminals for batch processing and synchronization with remote library databases. We discuss the use of PalmOS-compatible devices and WindowsCE terminals. Our software system is based on a modular, scalable three-tier architecture. Additional functionality can be easily customized. Scalability is also supplied by Internet/Intranet technologies and radio-access points. The base module of the system supports generic warehouse operations: cyclecounting with handheld barcode scanners, efficient item delivery and issue, item movement, reserving, and report generation on finished and in-process operations. Movements are optimized using the worker's current location; operations are sorted in priority order and transmitted to mobile and wireless worker terminals. Mobile terminals improve the control of task processing, eliminate staff mistakes, display actual information about main processes, provide data for online reports, and significantly raise the efficiency of data exchange.
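A minimal sketch of the batch-processing mode described above, with hypothetical names: operations are logged on the terminal while offline and replayed to the central library database at synchronization time.

```python
import json, time
from collections import deque

class TerminalQueue:
    """Offline operation log for a handheld terminal; replayed on sync."""
    def __init__(self):
        self.pending = deque()

    def record(self, op, barcode):
        # e.g. op in {"issue", "return", "cyclecount"}
        self.pending.append({"op": op, "barcode": barcode, "ts": time.time()})

    def synchronize(self, send):
        """Flush queued operations to the central database via `send`."""
        while self.pending:
            send(json.dumps(self.pending.popleft()))

q = TerminalQueue()
q.record("issue", "978-3-16-148410-0")
q.record("return", "978-0-262-03384-8")
q.synchronize(print)   # stand-in for a network call to the library server
```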
Devine, Sean D
2016-02-01
Replication can be envisaged as a computational process that is able to generate and maintain order far from equilibrium. Replication processes can self-regulate, as the drive to replicate can counter degradation processes that impact on a system. The capability of replicated structures to access high-quality energy and eject disorder allows Landauer's principle, in conjunction with Algorithmic Information Theory, to quantify the entropy requirements to maintain a system far from equilibrium. Using Landauer's principle, where destabilising processes operating under the second law of thermodynamics change the information content or the algorithmic entropy of a system by ΔH bits, replication processes can access order, eject disorder, and counter the change without outside intervention. Both diversity in replicated structures and the coupling of different replicated systems increase the ability of the system (or systems) to self-regulate in a changing environment, as adaptation processes select those structures that use resources more efficiently. At the level of the structure, as selection processes minimise the information loss, the irreversibility is minimised. While each structure that emerges can be said to be more entropically efficient, as such replicating structures proliferate, the dissipation of the system as a whole is higher than would be the case for inert or simpler structures. While a detailed application to most real systems would be difficult, the approach may well be useful in understanding incremental changes to real systems and in providing broad descriptions of system behaviour. Copyright © 2016 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
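Landauer's principle, invoked above, sets a concrete floor on the energy a system must dissipate to erase ΔH bits of disorder, namely k_B T ln 2 per bit; a one-line calculation:

```python
from math import log

K_B = 1.380649e-23          # Boltzmann constant, J/K

def landauer_cost(delta_h_bits, temperature_kelvin=300.0):
    """Minimum energy (joules) to erase delta_h_bits of information."""
    return delta_h_bits * K_B * temperature_kelvin * log(2)

print(landauer_cost(1.0))   # ~2.87e-21 J per bit at room temperature
```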
Deterministic realization of collective measurements via photonic quantum walks.
Hou, Zhibo; Tang, Jun-Feng; Shang, Jiangwei; Zhu, Huangjun; Li, Jian; Yuan, Yuan; Wu, Kang-Da; Xiang, Guo-Yong; Li, Chuan-Feng; Guo, Guang-Can
2018-04-12
Collective measurements on identically prepared quantum systems can extract more information than local measurements, thereby enhancing information-processing efficiency. Although this nonclassical phenomenon has been known for two decades, it has remained a challenging task to demonstrate the advantage of collective measurements in experiments. Here, we introduce a general recipe for performing deterministic collective measurements on two identically prepared qubits based on quantum walks. Using photonic quantum walks, we realize experimentally an optimized collective measurement with fidelity 0.9946 without post selection. As an application, we achieve the highest tomographic efficiency in qubit state tomography to date. Our work offers an effective recipe for beating the precision limit of local measurements in quantum state tomography and metrology. In addition, our study opens an avenue for harvesting the power of collective measurements in quantum information-processing and for exploring the intriguing physics behind this power.
Correlated activity supports efficient cortical processing
Hung, Chou P.; Cui, Ding; Chen, Yueh-peng; Lin, Chia-pei; Levine, Matthew R.
2015-01-01
Visual recognition is a computational challenge that is thought to occur via efficient coding. An important concept is sparseness, a measure of coding efficiency. The prevailing view is that sparseness supports efficiency by minimizing redundancy and correlations in spiking populations. Yet, we recently reported that “choristers”, neurons that behave more similarly (have correlated stimulus preferences and spontaneous coincident spiking), carry more generalizable object information than uncorrelated neurons (“soloists”) in macaque inferior temporal (IT) cortex. The rarity of choristers (as low as 6% of IT neurons) indicates that they were likely missed in previous studies. Here, we report that correlation strength is distinct from sparseness (choristers are not simply broadly tuned neurons), that choristers are located in non-granular output layers, and that correlated activity predicts human visual search efficiency. These counterintuitive results suggest that a redundant correlational structure supports efficient processing and behavior. PMID:25610392
DOT National Transportation Integrated Search
1998-09-16
The Intermodal Surface Transportation Efficiency Act requires a proactive public involvement process that provides complete information, timely public notice, full public access to key decisions, and supports early and continuing involvement of...
ERIC Educational Resources Information Center
Abt, Clark C.
Educational games present the complex realities of simultaneous interactive processes more accurately and effectively than serial processes such as lecturing and reading. Objectives of educational gaming are to motivate students by presenting relevant and realistic problems and to induce more efficient and active understanding of information.…
Efficiency versus speed in quantum heat engines: Rigorous constraint from Lieb-Robinson bound
NASA Astrophysics Data System (ADS)
Shiraishi, Naoto; Tajima, Hiroyasu
2017-08-01
A long-standing open problem, whether a heat engine with finite power can achieve the Carnot efficiency, is investigated. We rigorously prove a general trade-off inequality between thermodynamic efficiency and the time interval of a cyclic process with quantum heat engines. As a first step, employing the Lieb-Robinson bound we establish an inequality on the change in a local observable caused by an operation far from the support of that local observable. This inequality provides a rigorous characterization of the following intuitive picture: most of the energy emitted from the engine to the cold bath remains near the engine when the cyclic process is finished. Using this description, we prove an upper bound on efficiency with the aid of quantum information geometry. Our result generally excludes the possibility of a process with finite speed at the Carnot efficiency in quantum heat engines. In particular, the obtained constraint covers engines evolving with non-Markovian dynamics, which almost all previous studies on this topic fail to address.
The Status and Promise of Advanced M&V: An Overview of “M&V 2.0” Methods, Tools, and Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franconi, Ellen; Gee, Matt; Goldberg, Miriam
Advanced measurement and verification (M&V) of energy efficiency savings, often referred to as M&V 2.0 or advanced M&V, is currently an object of much industry attention. Thus far, however, there has been a lack of clarity about what techniques M&V 2.0 includes, how those techniques differ from traditional approaches, what the key considerations are for their use, and what value propositions M&V 2.0 presents to different stakeholders. The objective of this paper is to provide background information and frame key discussion points related to advanced M&V. The paper identifies the benefits, methods, and requirements of advanced M&V and outlines key technical issues for applying these methods. It presents an overview of the distinguishing elements of M&V 2.0 tools and of how the industry is addressing needs for tool testing, consistency, and standardization, and it identifies opportunities for collaboration. In this paper, we consider two key features of M&V 2.0: (1) automated analytics that can provide ongoing, near-real-time savings estimates, and (2) increased data granularity in terms of frequency, volume, or end-use detail. Greater data granularity for large numbers of customers, such as that derived from comprehensive implementation of advanced metering infrastructure (AMI) systems, leads to very large data volumes. This drives interest in automated processing systems. It is worth noting, however, that automated processing can provide value even when applied to less granular data, such as monthly consumption data series. Likewise, more granular data, such as interval or end-use data, delivers value with or without automated processing, provided the processing is manageable. But it is the combination of greater data detail with automated processing that offers the greatest opportunity for value. Using M&V methods that capture load shapes together with automated processing can determine savings in near-real time to provide stakeholders with more timely and detailed information. This information can be used to inform ongoing building operations, provide early input on energy efficiency program design, or assess the impact of efficiency by location and time of day. Stakeholders who can make use of such information include regulators, energy efficiency program administrators, program evaluators, contractors and aggregators, building owners, the investment community, and grid planners. Although each stakeholder has its own priorities and challenges related to savings measurement and verification, the potential exists for all to draw from a single set of efficiency valuation data. Such an integrated approach could provide a base consistency across stakeholder uses.
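As a toy illustration of automated M&V analytics, a baseline model of consumption versus outdoor temperature can be fit on pre-retrofit data and compared with metered use in the reporting period; the numbers below are hypothetical, and real M&V 2.0 tools use much richer models (e.g., time-of-week and temperature on interval data).

```python
import numpy as np

# hypothetical daily data: outdoor temperature (F) and consumption (kWh)
t_base = np.array([40, 50, 60, 70, 80, 90])
kwh_base = np.array([820, 760, 700, 720, 790, 870])

# simple quadratic baseline in temperature (stand-in for richer models)
coef = np.polyfit(t_base, kwh_base, deg=2)

t_report = np.array([45, 65, 85])
kwh_metered = np.array([740, 650, 760])

predicted = np.polyval(coef, t_report)     # counterfactual baseline use
savings = predicted - kwh_metered          # avoided energy
print("avoided energy per day (kWh):", np.round(savings, 1))
```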
Conceptual net energy output for biofuel production from lignocellulosic biomass through biorefining
J.Y. Zhu; X.S. Zhuang
2012-01-01
There is a lack of comprehensive information in the retrievable literature on pilot scale process and energy data using promising process technologies and commercially scalable and available capital equipment for lignocellulosic biomass biorefining. This study conducted a comprehensive review of the energy efficiency of selected sugar platform biorefinery process...
An Evaluation of the Decennial Review Process.
ERIC Educational Resources Information Center
Barak, Robert; And Others
Information on a review of the decennial review process required by the Arizona Board of Regents (ABOR) for every academic department in each Arizona university is presented as part of the final report by ABOR's Task Force on Excellence, Efficiency and Competitiveness. Revision in the review process and eight policy statements are discussed, and…
Improving patient access and streamlining processes through enterprise intelligence systems.
Dunn, Ronald L
2014-01-01
This article demonstrates how enterprise intelligence systems can be used to improve operational efficiency in hospitals. Enterprise intelligence systems mine raw data from disparate systems and transform the data into actionable information which, when used appropriately, supports streamlined processes, optimizes resources, and positively affects staff efficiency and the quality of patient care. Case studies on the implementation of McKesson Performance Visibility and Capacity Planner enterprise intelligence solutions at the Southlake Regional Health Centre and the Lions Gate and Richmond Hospitals are provided.
Intensive care unit without walls: seeking patient safety by improving the efficiency of the system.
Gordo, F; Abella, A
2014-10-01
The term "ICU without walls" refers to innovative management in Intensive Care, based on two key elements: (1) collaboration of all medical and nursing staff involved in patient care during hospitalization and (2) technological support for severity early detection protocols by identifying patients at risk of deterioration throughout the hospital, based on the assessment of vital signs and/or laboratory test values, with the clear aim of improving critical patient safety in the hospitalization process. At present, it can be affirmed that there is important work to be done in the detection of severity and early intervention in patients at risk of organ dysfunction. Such work must be adapted to the circumstances of each center and should include training in the detection of severity, multidisciplinary work in the complete patient clinical process, and the use of technological systems allowing intervention on the basis of monitored laboratory and physiological parameters, with effective and efficient use of the information generated. Not only must information be generated, but also efficient management of such information must also be achieved. It is necessary to improve our activity through innovation in management procedures that facilitate the work of the intensivist, in collaboration with other specialists, throughout the hospital environment. Innovation is furthermore required in the efficient management of the information generated in hospitals, through intelligent and directed usage of the new available technology. Copyright © 2014 Elsevier España, S.L.U. and SEMICYUC. All rights reserved.
Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series
Vicente, Raul; Díaz-Pernas, Francisco J.; Wibral, Michael
2014-01-01
Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage, and modification. Especially the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy between two processes requires the observation of multiple realizations of these processes to estimate the associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of the processes to allow pooling of observations over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble of realizations is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation on a graphics processing unit to handle the most computationally demanding aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data. While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and artificial systems. PMID:25068489
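For orientation, a minimal plug-in (histogram) transfer entropy estimator with history length 1 is sketched below. It pools over time and thus assumes stationarity, exactly the simplification the ensemble method avoids, and the authors' estimator is a far more sophisticated nearest-neighbor method; this toy version only conveys the quantity being computed.

```python
import numpy as np

def transfer_entropy(x, y, bins=4):
    """Plug-in estimate of TE(X -> Y) in bits with history length 1.

    TE = sum p(y1, y0, x0) * log2[ p(y1 | y0, x0) / p(y1 | y0) ]
    """
    xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    y1, y0, x0 = yd[1:], yd[:-1], xd[:-1]

    def joint_prob(*cols):
        idx = np.ravel_multi_index(cols, [bins] * len(cols))
        p = np.bincount(idx, minlength=bins ** len(cols)).astype(float)
        return (p / p.sum()).reshape([bins] * len(cols))

    p_xyz = joint_prob(y1, y0, x0)              # p(y1, y0, x0)
    p_yz = p_xyz.sum(axis=2)                    # p(y1, y0)
    p_z = p_xyz.sum(axis=0)                     # p(y0, x0)
    p_y0 = p_yz.sum(axis=0)                     # p(y0)

    te = 0.0
    for i in range(bins):
        for j in range(bins):
            for k in range(bins):
                if p_xyz[i, j, k] > 0:
                    te += p_xyz[i, j, k] * np.log2(
                        p_xyz[i, j, k] * p_y0[j] / (p_z[j, k] * p_yz[i, j]))
    return te

rng = np.random.default_rng(1)
x = rng.normal(size=5000)
y = np.r_[0.0, 0.8 * x[:-1]] + 0.3 * rng.normal(size=5000)  # y driven by past x
print("TE(X->Y) =", round(transfer_entropy(x, y), 3))
```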
The effects of serotonin manipulations on emotional information processing and mood.
Merens, Wendelien; Willem Van der Does, A J; Spinhoven, Philip
2007-11-01
Serotonin is implicated in both mood and cognition. It has recently been shown that antidepressant treatment has immediate effects on emotional information processing, much faster than any clinically significant effects. This review aims to investigate whether the effects on emotional information processing are reliable, and whether these effects are related to eventual clinical outcome. Treatment efficiency may be greatly improved if early changes in emotional information processing are found to predict clinical outcome following antidepressant treatment. We review studies investigating the short-term effects of serotonin manipulations (including medication) on the processing of emotional information, using the PubMed and PsycInfo databases. Twenty-five studies were identified. Serotonin manipulations were found to affect attentional bias, facial emotion recognition, emotional memory, dysfunctional attitudes, and decision making. The sequential link between changes in emotional processing and mood remains to be further investigated. The number of studies on serotonin manipulations and emotional information processing in currently depressed subjects is small. No studies yet have directly tested the link between emotional information processing and clinical outcome during the course of antidepressant treatment. Serotonin function is related to several aspects of emotional information processing, but it is unknown whether these changes predict or have any relationship with clinical outcome. Suggestions for future research are provided.
Miyoshi, Kazuchika; Rzucidlo, S Jacek; Pratt, Scott L; Stice, Steven L
2003-04-01
The low efficiency of somatic cell cloning is the major obstacle to widespread use of this technology. Incomplete nuclear reprogramming following the transfer of donor nuclei into recipient oocytes has been implicated as a primary reason for the low efficiency of the cloning procedure. The mechanisms and factors that affect the progression of the nuclear reprogramming process have not been completely elucidated, but the identification of these factors and their subsequent manipulation would increase cloning efficiency. At present, many groups are studying donor nucleus reprogramming. Here, we present an approach in which the efficiency of producing viable offspring is improved by selecting recipient oocytes and donor cells that will produce cloned embryos with functionally reprogrammed nuclei. This approach will produce information useful in future studies aimed at further deciphering the nuclear reprogramming process.
75 FR 77847 - Proposed Information Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-14
... Corporation's web- based grants management system, eGrants. Current Action The Corporation seeks to renew the... Summary to improve the efficiency and effectiveness of the peer review process. The information collection...: Corporation for National and Community Service, Senior Corps, Attention: Mr. Zach Rhein, Program Officer, Room...
Information System for Educational Policy and Administration.
ERIC Educational Resources Information Center
Clayton, J. C., Jr.
Educational Information System (EIS) is a proposed computer-based data processing system to help schools solve current educational problems more efficiently. The system would allow for more effective administrative operations in student scheduling, financial accounting, and long range planning. It would also assist school trustees and others in…
Children and Adults Integrate Talker and Verb Information in Online Processing
ERIC Educational Resources Information Center
Borovsky, Arielle; Creel, Sarah C.
2014-01-01
Children seem able to efficiently interpret a variety of linguistic cues during speech comprehension, yet have difficulty interpreting sources of nonlinguistic and paralinguistic information that accompany speech. The current study asked whether (paralinguistic) voice-activated role knowledge is rapidly interpreted in coordination with a…
Non-Markovianity and reservoir memory of quantum channels: a quantum information theory perspective
Bylicka, B.; Chruściński, D.; Maniscalco, S.
2014-01-01
Quantum technologies rely on the ability to coherently transfer information encoded in quantum states along quantum channels. Decoherence induced by the environment sets limits on the efficiency of any quantum-enhanced protocol. Generally, the longer a quantum channel, the worse its capacity. We show that for non-Markovian quantum channels this is not always true: surprisingly, the capacity of a longer channel can be greater than that of a shorter one. We introduce a general theoretical framework linking non-Markovianity to the capacities of quantum channels and demonstrate how harnessing non-Markovianity may improve the efficiency of quantum information processing and communication. PMID:25043763
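A minimal numerical sketch of this capacity revival, assuming a qubit dephasing channel (whose quantum capacity is Q = 1 - h2(p), with h2 the binary entropy and p the effective phase-flip probability) and a hypothetical oscillatory decoherence function standing in for the non-Markovian dynamics:

```python
import numpy as np

def h2(p):
    """Binary entropy in bits."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def flip_probability(t, gamma=0.3, omega=2.0):
    # Hypothetical non-Markovian decoherence function: the off-diagonal
    # coherence exp(-gamma*t)*|cos(omega*t)| decays but periodically revives.
    coherence = np.exp(-gamma * t) * np.abs(np.cos(omega * t))
    return (1.0 - coherence) / 2.0

t = np.linspace(0.0, 5.0, 501)
Q = 1.0 - h2(flip_probability(t))   # capacity of the dephasing channel of "length" t

# Non-monotonic capacity: the longer channel here beats the shorter one.
i, j = 200, 310                      # t = 2.0 versus t = 3.1
print(f"Q(t={t[i]:.1f}) = {Q[i]:.3f}   Q(t={t[j]:.1f}) = {Q[j]:.3f}")
```

When the coherence revives, p moves back toward zero and the capacity recovers, which is the sense in which a longer non-Markovian channel can outperform a shorter one.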
NASA Astrophysics Data System (ADS)
Quirin, Sean Albert
The joint application of tailored optical Point Spread Functions (PSF) and estimation methods is an important tool for designing quantitative imaging and sensing solutions. By enhancing the information transfer encoded by the optical waves into an image, matched post-processing algorithms are able to complete tasks with improved performance relative to conventional designs. In this thesis, new engineered PSF solutions with image processing algorithms are introduced and demonstrated for quantitative imaging using information-efficient signal processing tools and/or optical-efficient experimental implementations. The use of a 3D engineered PSF, the Double-Helix (DH-PSF), is applied as one solution for three-dimensional, super-resolution fluorescence microscopy. The DH-PSF is a tailored PSF which was engineered to have enhanced information transfer for the task of localizing point sources in three dimensions. An implementation of the DH-PSF microscope that is both information- and optically efficient is demonstrated here for the first time. This microscope is applied to image single molecules and microtubules located within a biological sample. A joint imaging/axial-ranging modality is demonstrated for application to quantifying sources of extended transverse and axial extent. The proposed implementation has improved optical efficiency relative to prior designs due to the use of serialized cycling through select engineered PSFs. This system is demonstrated for passive ranging, extended Depth-of-Field imaging and digital refocusing of random objects under broadband illumination. Although the serialized engineered PSF solution is an improvement over prior designs for the joint imaging/passive-ranging modality, it requires the use of multiple PSFs, a potentially significant constraint. Therefore an alternative design is proposed, the Single-Helix PSF, where only one engineered PSF is necessary and the chromatic behavior of objects under broadband illumination provides the necessary information transfer. The matched estimation algorithms are introduced along with an optically efficient experimental system to image and passively estimate the distance to a test object. An engineered PSF solution is proposed for improving the sensitivity of optical wave-front sensing using a Shack-Hartmann Wave-front Sensor (SHWFS). The performance limits of the classical SHWFS design are evaluated and the engineered PSF system design is demonstrated to enhance performance. This system is fabricated and the mechanism for additional information transfer is identified.
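To make the ranging idea concrete, here is a minimal sketch of the estimation step behind a rotating PSF such as the DH-PSF: the orientation of the line joining the two lobes is mapped to axial position through a calibration curve. The linear calibration constant and the centroid coordinates below are hypothetical; in practice the curve is measured with point sources at known depths.

```python
import numpy as np

def lobe_angle(c1, c2):
    """Orientation (radians) of the line joining the two fitted lobe centroids."""
    (x1, y1), (x2, y2) = c1, c2
    return np.arctan2(y2 - y1, x2 - x1)

# Hypothetical linear calibration: ~180 degrees of lobe rotation across a
# +/- 1 um axial range, measured beforehand with beads at known z.
SLOPE_UM_PER_RAD = 2.0 / np.pi

def depth_from_angle(theta, theta_at_focus=0.0):
    return SLOPE_UM_PER_RAD * (theta - theta_at_focus)

# Lobe centroids fitted from a single-molecule image (invented pixel values).
theta = lobe_angle((10.2, 14.1), (16.8, 9.3))
print(f"estimated z = {depth_from_angle(theta):+.3f} um")
```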
Capotosto, Paolo; Perrucci, M Gianni; Brunetti, Marcella; Del Gratta, Cosimo; Doppelmayr, Michael; Grabner, Roland H; Klimesch, Wolfgang; Neubauer, Aljoscha; Neuper, Christa; Pfurtscheller, Gert; Romani, Gian Luca; Babiloni, Claudio
2009-12-28
More intelligent persons (high IQ) typically present higher cortical activity during tasks requiring the encoding of visuo-spatial information, namely higher alpha (about 10 Hz) event-related desynchronization (ERD; Doppelmayr et al., 2005). The opposite is true ("neural efficiency") during the retrieval of the encoded information, as revealed by both lower alpha ERD and/or lower theta (about 5 Hz) event-related synchronization (ERS; Grabner et al., 2004). To reconcile these contrasting results, here we evaluated the working hypothesis that more intelligent male subjects are characterized by high cortical activity during the encoding phase. This deep encoding would explain the relatively low cortical activity during retrieval of the encoded information. To test this hypothesis, electroencephalographic (EEG) data were recorded in 22 healthy young male volunteers during visuo-spatial information processing (encoding) and short-term retrieval of the encoded information. Cortical activity was indexed by theta ERS and alpha ERD. It was found that the higher the subjects' total IQ, the stronger the frontal theta ERS during the encoding task. Furthermore, the higher the subjects' total IQ, the lower the frontal high-frequency alpha ERD (about 10-12 Hz) during the retrieval task. This was not true for the parietal counterparts of these EEG rhythms. These results reconcile previously contrasting evidence, confirming that more intelligent persons do not always show event-related cortical responses compatible with the "neural efficiency" hypothesis. Rather, their cortical activity would depend on flexible and task-adapting features of frontal activation.
Process mining is an underutilized clinical research tool in transfusion medicine.
Quinn, Jason G; Conrad, David M; Cheng, Calvino K
2017-03-01
To understand inventory performance, transfusion services commonly use key performance indicators (KPIs) as summary descriptors of inventory efficiency that are graphed, trended, and used to benchmark institutions. Here, we summarize current limitations in KPI-based evaluation of blood bank inventory efficiency and propose process mining as an ideal methodology for application to inventory management research to improve inventory flows and performance. The transit of a blood product from inventory receipt to final disposition is complex and relates to many internal and external influences, and KPIs may be inadequate to fully understand the complexity of the blood supply chain and how units interact with its processes. Process mining lends itself well to analysis of blood bank inventories, and modern laboratory information systems can track nearly all of the complex processes that occur in the blood bank. Process mining is an analytical tool already used in other industries and can be applied to blood bank inventory management and research through laboratory information systems data using commercial applications. Although the current understanding of real blood bank inventories is value-centric through KPIs, it potentially can be understood from a process-centric lens using process mining. © 2017 AABB.
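As a concrete sketch of this process-centric view, the example below uses pm4py, an open-source process-mining library that the article does not name, to discover a directly-follows graph from a hypothetical LIMS event log; the column names, activities and timestamps are invented for illustration.

```python
import pandas as pd
import pm4py

# Hypothetical event log extracted from a LIMS: one row per blood-unit event.
df = pd.DataFrame({
    "unit_id":   ["U1", "U1", "U1", "U2", "U2"],
    "activity":  ["receipt", "crossmatch", "issue", "receipt", "discard"],
    "timestamp": pd.to_datetime([
        "2017-01-02 08:00", "2017-01-03 10:30", "2017-01-03 12:00",
        "2017-01-02 09:15", "2017-02-10 16:40"]),
})

log = pm4py.format_dataframe(df, case_id="unit_id",
                             activity_key="activity",
                             timestamp_key="timestamp")

# Directly-follows graph: how units actually flow from receipt to final
# disposition, with a frequency attached to every observed transition.
dfg, start_acts, end_acts = pm4py.discover_dfg(log)
print(dfg)   # e.g. {('receipt', 'crossmatch'): 1, ('crossmatch', 'issue'): 1, ...}
```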
78 FR 65674 - Agency Information Collection Activities: Proposed Collection: Public Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-01
... will streamline the application submission process, enable an efficient award determination process... nursing and other qualified academic departments offering eligible advanced master's and/or doctoral degree nursing education programs that will prepare students to teach. Burden Statement: Burden in this...
45 CFR 205.36 - State plan requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
..., of an automated statewide management information system designed effectively and efficiently, to assist management in the administration of an approved AFDC State plan. The submission process to amend... account for— (1) All the factors in the total eligibility determination process under the plan for aid...
NASA Astrophysics Data System (ADS)
Bouty, A. A.; Koniyo, M. H.; Novian, D.
2018-02-01
This study aims to determine the maturity level of information technology governance in the Gorontalo city government by applying the COBIT 4.1 framework. The research method is the case study method, conducting surveys and data collection at 25 institutions in Gorontalo City. The result of this study is an analysis of information technology needs based on the measurement of maturity level. The measurement of the maturity level of information technology governance shows that many business processes still run at a low level: of the 9 existing business processes, 4 are at level 2 (repeatable but intuitive) and 3 are at level 1 (initial/ad hoc). Given these results, it is expected that the government of Gorontalo city will promptly improve its information technology governance so that it can run more effectively and efficiently.
A Semantic Approach for Geospatial Information Extraction from Unstructured Documents
NASA Astrophysics Data System (ADS)
Sallaberry, Christian; Gaio, Mauro; Lesbegueries, Julien; Loustau, Pierre
Local cultural heritage document collections are characterized by their content, which is strongly attached to a territory and its land history (i.e., geographical references). Our contribution aims at making the content retrieval process more efficient whenever a query includes geographic criteria. We propose a core model for a formal representation of geographic information. It takes into account characteristics of different modes of expression, such as written language, captures of drawings, maps, photographs, etc. We have developed a prototype that fully implements geographic information extraction (IE) and geographic information retrieval (IR) processes. All PIV prototype processing resources are designed as Web Services. We propose a geographic IE process based on semantic treatment as a supplement to classical IE approaches. We implement geographic IR by using intersection computing algorithms that seek out any intersection between formal geocoded representations of geographic information in a user query and similar representations in document collection indexes.
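The intersection step of the geographic IR process can be sketched in a few lines; the example below uses shapely and represents footprints as bounding boxes, one possible formal geocoded representation (the coordinates and document identifiers are invented):

```python
from shapely.geometry import box

# A query area and indexed document footprints (lon/lat bounding boxes).
query_area = box(-0.60, 43.25, -0.30, 43.45)
doc_index = {
    "doc_12": box(-0.45, 43.28, -0.38, 43.33),   # overlaps the query area
    "doc_27": box(1.40, 43.55, 1.50, 43.65),     # no overlap
}

# Geographic IR: keep documents whose footprint intersects the query footprint.
hits = [d for d, g in doc_index.items() if g.intersects(query_area)]
print(hits)   # -> ['doc_12']
```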
Liu, Fei; Zhang, Xi; Jia, Yan
2015-01-01
In this paper, we propose a computer information processing algorithm that can be used for biomedical image processing and disease prediction. A biomedical image is considered a data object in a multi-dimensional space. Each dimension is a feature that can be used for disease diagnosis. We introduce a new concept of the top (k1,k2) outlier. It can be used to detect abnormal data objects in the multi-dimensional space. This technique focuses on uncertain space, where each data object has several possible instances with distinct probabilities. We design an efficient sampling algorithm for the top (k1,k2) outlier in uncertain space. Some improvement techniques are used for acceleration. Experiments show our methods' high accuracy and high efficiency.
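The abstract does not spell out the top (k1,k2) definition, so the sketch below adopts one plausible reading: across Monte Carlo samples of possible worlds, count how often each uncertain object ranks among the top k1 outliers (scored here by distance to the k-th nearest neighbour) and report the k2 most frequent. The toy objects, instances and probabilities are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Uncertain objects: possible 2-D feature instances with probabilities.
objects = {
    "img_a": ([(0.1, 0.2), (0.2, 0.1)], [0.5, 0.5]),
    "img_b": ([(0.0, 0.0), (0.1, 0.0)], [0.7, 0.3]),
    "img_c": ([(3.0, 3.1), (0.2, 0.2)], [0.8, 0.2]),   # probable outlier
    "img_d": ([(0.1, 0.1), (0.0, 0.2)], [0.6, 0.4]),
}

def knn_score(points, i, k=1):
    d = np.linalg.norm(points - points[i], axis=1)
    return np.sort(d)[k]          # distance to k-th nearest neighbour

def top_k1_k2_outliers(objects, k1=1, k2=1, n_samples=2000):
    names = list(objects)
    counts = dict.fromkeys(names, 0)
    for _ in range(n_samples):
        # Sample one possible world: one instance per uncertain object.
        pts = np.array([objects[n][0][rng.choice(len(objects[n][1]),
                                                 p=objects[n][1])]
                        for n in names])
        scores = [knn_score(pts, i) for i in range(len(names))]
        for i in np.argsort(scores)[-k1:]:
            counts[names[i]] += 1
    return sorted(names, key=counts.get, reverse=True)[:k2]

print(top_k1_k2_outliers(objects))   # -> ['img_c'] with high probability
```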
The quantum mitochondrion and optimal health.
Nunn, Alistair V W; Guy, Geoffrey W; Bell, Jimmy D
2016-08-15
A sufficiently complex set of molecules, if subject to perturbation, will self-organize and show emergent behaviour. If such a system can take on information it will become subject to natural selection. This could explain how self-replicating molecules evolved into life and how intelligence arose. A pivotal step in this evolutionary process was of course the emergence of the eukaryote and the advent of the mitochondrion, which both enhanced energy production per cell and increased the ability to process, store and utilize information. Recent research suggests that from its inception life embraced quantum effects such as 'tunnelling' and 'coherence', while competition and stressful conditions provided a constant driver for natural selection. We believe that the biphasic adaptive response to stress described by hormesis, a process that captures information to enable adaptability, is central to this whole process. Critically, hormesis could improve mitochondrial quantum efficiency, improving the ATP/ROS ratio, whereas inflammation, which is tightly associated with the aging process, might do the opposite. This all suggests that to achieve optimal health and healthy aging, one has to sufficiently stress the system to ensure peak mitochondrial function, which itself could reflect selection of optimum efficiency at the quantum level. © 2016 The Author(s), published by Portland Press Limited on behalf of the Biochemical Society.
Homo heuristicus: why biased minds make better inferences.
Gigerenzer, Gerd; Brighton, Henry
2009-01-01
Heuristics are efficient cognitive processes that ignore information. In contrast to the widely held view that less processing reduces accuracy, the study of heuristics shows that less information, computation, and time can in fact improve accuracy. We review the major progress made so far: (a) the discovery of less-is-more effects; (b) the study of the ecological rationality of heuristics, which examines in which environments a given strategy succeeds or fails, and why; (c) an advancement from vague labels to computational models of heuristics; (d) the development of a systematic theory of heuristics that identifies their building blocks and the evolved capacities they exploit, and views the cognitive system as relying on an "adaptive toolbox;" and (e) the development of an empirical methodology that accounts for individual differences, conducts competitive tests, and has provided evidence for people's adaptive use of heuristics. Homo heuristicus has a biased mind and ignores part of the available information, yet a biased mind can handle uncertainty more efficiently and robustly than an unbiased mind relying on more resource-intensive and general-purpose processing strategies. Copyright © 2009 Cognitive Science Society, Inc.
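One of the best-known computational models from this research program is the take-the-best heuristic: try cues in order of validity and let the first discriminating cue decide, ignoring all remaining information. A minimal sketch with invented cue values:

```python
def take_the_best(obj_a, obj_b, cues):
    """Decide between two objects using cues sorted by validity, best first."""
    for cue in cues:
        a, b = cue(obj_a), cue(obj_b)
        if a != b:                    # first discriminating cue decides
            return obj_a if a > b else obj_b
    return None                       # nothing discriminates: guess

# Toy task: which city is larger? (cue values are invented)
cities = {
    "Hamburg": {"capital": 0, "team": 1, "river": 1},
    "Bonn":    {"capital": 0, "team": 0, "river": 1},
}
cues = [lambda c: cities[c]["capital"],   # ordered by assumed validity
        lambda c: cities[c]["team"],
        lambda c: cities[c]["river"]]

print(take_the_best("Hamburg", "Bonn", cues))   # -> 'Hamburg'
```

The heuristic never retrieves the remaining cues, which is exactly the sense in which ignoring information buys efficiency and, in suitable environments, robustness.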
The integration of emotional and symbolic components in multimodal communication
Mehu, Marc
2015-01-01
Human multimodal communication can be said to serve two main purposes: information transfer and social influence. In this paper, I argue that different components of multimodal signals play different roles in the processes of information transfer and social influence. Although the symbolic components of communication (e.g., verbal and denotative signals) are well suited to transfer conceptual information, emotional components (e.g., non-verbal signals that are difficult to manipulate voluntarily) likely take a function that is closer to social influence. I suggest that emotion should be considered a property of communicative signals, rather than an entity that is transferred as content by non-verbal signals. In this view, the effect of emotional processes on communication serves to change the quality of social signals, making them more efficient at producing responses in perceivers, whereas symbolic components increase the signals' efficiency at interacting with the cognitive processes dedicated to the assessment of relevance. The interaction between symbolic and emotional components will be discussed in relation to the need for perceivers to evaluate the reliability of multimodal signals. PMID:26217280
2005-05-01
efficiencies similar to those in the private sector. However, along the way, Government and private sector industry have begun to disagree about how PPI is... double that of the private sector due to an evaluation process that is cumbersome, time-consuming, and lacking the efficiencies enjoyed by private
A Prudent Access Control Behavioral Intention Model for the Healthcare Domain
ERIC Educational Resources Information Center
Mussa, Constance C.
2011-01-01
In recent years, many health care organizations have begun to take advantage of computerized information systems to facilitate more effective and efficient management and processing of information. However, commensurate with the vastly innovative enhancements that computer technology has contributed to traditional paper-based health care…
What's ahead in automated lumber grading
D. Earl Kline; Richard Conners; Philip A. Araman
1998-01-01
This paper discusses how present scanning technologies are being applied to automatic lumber grading. The presentation focuses on 1) what sensing and scanning devices are needed to measure information for accurate grading feature detection, 2) the hardware and software needed to efficiently process this information, and 3) specific issues related to softwood lumber...
2000-06-01
As the number of sensors, platforms, exploitation sites, and command and control nodes continues to grow in response to Joint Vision 2010 information ... dominance requirements, Commanders and analysts will have an ever increasing need to collect and process vast amounts of data over wide areas using a large number of disparate sensors and information gathering sources.
2017-09-17
design process requires teams to analyze and organize information in a manner that communicates efficiently with stakeholders. This communication is... share information (with each other and local school districts) on available/applicable grants... how to help school districts identify and/or... [graph: each advance's relative impact on the Air Force's ability to maintain information and decision dominance]
Lu, Xinyan
2016-01-01
There is a clear requirement for enhancing laboratory information management during early absorption, distribution, metabolism and excretion (ADME) screening. The application of a commercial laboratory information management system (LIMS) is limited by complexity, insufficient flexibility, high costs and extended timelines. An improved custom in-house LIMS for ADME screening was developed using Excel. All Excel templates were generated through macros and formulae, and information flow was streamlined as much as possible. This system has been successfully applied in task generation, process control and data management, with a reduction in both labor time and human error rates. An Excel-based LIMS can provide a simple, flexible and cost/time-saving solution for improving workflow efficiencies in early ADME screening.
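The abstract does not show the Excel mechanics, but the flavor of macro- and formula-driven template generation can be sketched with openpyxl (a Python library used here purely for illustration; the sheet layout, assay names and validation rule are hypothetical):

```python
from openpyxl import Workbook
from openpyxl.worksheet.datavalidation import DataValidation

# Hypothetical task-request template for an ADME screening workflow.
wb = Workbook()
ws = wb.active
ws.title = "ADME_Tasks"
ws.append(["compound_id", "assay", "concentration_uM", "status"])

# Restrict the assay column to a fixed list, emulating simple process control.
assays = DataValidation(type="list",
                        formula1='"microsomal_stability,Caco-2,PPB"')
ws.add_data_validation(assays)
assays.add("B2:B1000")

ws.append(["CPD-0001", "Caco-2", 10, "queued"])
wb.save("adme_task_template.xlsx")
```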
Methods and Apparatus for Autonomous Robotic Control
NASA Technical Reports Server (NTRS)
Gorshechnikov, Anatoly (Inventor); Livitz, Gennady (Inventor); Versace, Massimiliano (Inventor); Palma, Jesse (Inventor)
2017-01-01
Sensory processing of visual, auditory, and other sensor information (e.g., visual imagery, LIDAR, RADAR) is conventionally based on "stovepiped," or isolated processing, with little interactions between modules. Biological systems, on the other hand, fuse multi-sensory information to identify nearby objects of interest more quickly, more efficiently, and with higher signal-to-noise ratios. Similarly, examples of the OpenSense technology disclosed herein use neurally inspired processing to identify and locate objects in a robot's environment. This enables the robot to navigate its environment more quickly and with lower computational and power requirements.
Real-time encoding and compression of neuronal spikes by metal-oxide memristors
NASA Astrophysics Data System (ADS)
Gupta, Isha; Serb, Alexantrou; Khiat, Ali; Zeitler, Ralf; Vassanelli, Stefano; Prodromakis, Themistoklis
2016-09-01
Advanced brain-chip interfaces with numerous recording sites bear great potential for investigation of neuroprosthetic applications. The bottleneck towards achieving an efficient bio-electronic link is the real-time processing of neuronal signals, which imposes excessive requirements on bandwidth, energy and computation capacity. Here we present a unique concept where the intrinsic properties of memristive devices are exploited to compress information on neural spikes in real-time. We demonstrate that the inherent voltage thresholds of metal-oxide memristors can be used for discriminating recorded spiking events from background activity and without resorting to computationally heavy off-line processing. We prove that information on spike amplitude and frequency can be transduced and stored in single devices as non-volatile resistive state transitions. Finally, we show that a memristive device array allows for efficient data compression of signals recorded by a multi-electrode array, demonstrating the technology's potential for building scalable, yet energy-efficient on-node processors for brain-chip interfaces.
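A purely software analogue of the thresholding idea, assuming a synthetic trace and an arbitrary threshold: the memristive devices do this intrinsically in the analogue domain, so the sketch below only illustrates the data-reduction principle of keeping one (time, amplitude) pair per supra-threshold event.

```python
import numpy as np

rng = np.random.default_rng(1)

fs = 25_000                                   # sampling rate (Hz)
trace = rng.normal(0.0, 0.02, fs)             # 1 s of background noise
for t0 in (3_000, 9_500, 17_200):             # inject three spikes
    trace[t0:t0 + 30] += -0.4 * np.hanning(30)

# Threshold discrimination, loosely analogous to the device's inherent
# voltage threshold: only supra-threshold events change the stored state.
threshold = -0.15
crossings = np.flatnonzero((trace[1:] < threshold) &
                           (trace[:-1] >= threshold))

# Compressed record: one (sample index, trough amplitude) pair per event.
events = [(int(i), float(trace[i:i + 30].min())) for i in crossings]
print(f"{trace.size} samples -> {len(events)} events")
```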
Tang, Shiming; Zhang, Yimeng; Li, Zhihao; Li, Ming; Liu, Fang; Jiang, Hongfei; Lee, Tai Sing
2018-04-26
One general principle of sensory information processing is that the brain must optimize efficiency by reducing the number of neurons that process the same information. The sparseness of the sensory representations in a population of neurons reflects the efficiency of the neural code. Here, we employ large-scale two-photon calcium imaging to examine the responses of a large population of neurons within the superficial layers of area V1 with single-cell resolution, while simultaneously presenting a large set of natural visual stimuli, to provide the first direct measure of population sparseness in awake primates. The results show that only 0.5% of neurons respond strongly to any given natural image, indicating a ten-fold increase in the inferred sparseness over previous measurements. These population activities are nevertheless necessary and sufficient to discriminate visual stimuli with high accuracy, suggesting that the neural code in the primary visual cortex is both super-sparse and highly efficient. © 2018, Tang et al.
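A standard way to quantify such sparseness, though not necessarily the exact estimator used in the study, is the Treves-Rolls measure of a population response vector; it approaches 1/N for a one-hot response and 1 for a uniform one:

```python
import numpy as np

def population_sparseness(r):
    """Treves-Rolls sparseness of a non-negative response vector."""
    r = np.asarray(r, dtype=float)
    n = r.size
    return (r.sum() / n) ** 2 / (np.square(r).sum() / n)

dense = np.ones(1000)                     # every neuron responds equally
sparse = np.zeros(1000)
sparse[:5] = 1.0                          # 0.5% of neurons respond strongly

print(population_sparseness(dense))       # -> 1.0
print(population_sparseness(sparse))      # -> 0.005
```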
Research on pre-processing of QR Code
NASA Astrophysics Data System (ADS)
Sun, Haixing; Xia, Haojie; Dong, Ning
2013-10-01
QR code encodes many kinds of information thanks to its advantages: large storage capacity, high reliability, omnidirectional ultra-high-speed reading, small printing size, highly efficient representation of Chinese characters, etc. In order to obtain a clearer binarized image from a complex background, and to improve the recognition rate of QR code, this paper researches pre-processing methods for QR code (Quick Response Code), and shows algorithms and results of image pre-processing for QR code recognition. The conventional method is improved by modifying Sauvola's adaptive binarization method. Additionally, we introduce QR code extraction that adapts to different image sizes and a flexible image correction approach, improving the efficiency and accuracy of QR code image processing.
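Sauvola's method computes a local threshold from the mean and standard deviation inside a sliding window, which copes with uneven illumination far better than a single global threshold. A minimal sketch using scikit-image's implementation (the synthetic test image is invented for the example):

```python
import numpy as np
from skimage.filters import threshold_sauvola

def binarize(gray, window_size=25, k=0.2):
    """Adaptive binarization of a grayscale code image (True = background)."""
    return gray > threshold_sauvola(gray, window_size=window_size, k=k)

# Synthetic demo: a dark bar on a strong left-to-right illumination gradient.
img = np.linspace(0.3, 0.9, 64)[None, :] * np.ones((64, 64))
img[16:20, 8:40] = 0.05

binary = binarize(img)
print(binary[18, 10], binary[30, 10])   # dark module -> False, background -> True
```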
Enhanced intelligence through optimized TCPED concepts for airborne ISR
NASA Astrophysics Data System (ADS)
Spitzer, M.; Kappes, E.; Böker, D.
2012-06-01
Current multinational operations show an increased demand for high-quality, actionable intelligence at different operational levels and for different users. To achieve sufficient availability, quality and reliability of information, various ISR assets are orchestrated within operational theatres. Airborne Intelligence, Surveillance and Reconnaissance (ISR) assets in particular provide significant intelligence coverage of areas of interest, owing to their endurance, non-intrusiveness, robustness, wide spectrum of sensors and flexibility to mission changes. An efficient and balanced utilization of airborne ISR assets calls for advanced concepts for the entire ISR process framework, including Tasking, Collection, Processing, Exploitation and Dissemination (TCPED). Beyond this, combining current visualization concepts, shared information bases and information customer profiles with an adequate mix of ISR sensors of differing information age and dynamic (online) re-tasking process elements allows interlinked TCPED processes to be optimized towards higher process robustness, shorter process duration, more flexibility between ISR missions and, finally, adequate "entry points" for information requirements from operational users and commands. In addition, relevant trade-offs of distributed and dynamic TCPED processes are examined and future trends are depicted.
NASA Astrophysics Data System (ADS)
Javidi, Bahram
The present conference discusses topics in the fields of neural networks, acoustooptic signal processing, pattern recognition, phase-only processing, nonlinear signal processing, image processing, optical computing, and optical information processing. Attention is given to the optical implementation of an inner-product neural associative memory, optoelectronic associative recall via motionless-head/parallel-readout optical disk, a compact real-time acoustooptic image correlator, a multidimensional synthetic estimation filter, and a light-efficient joint transform optical correlator. Also discussed are a high-resolution spatial light modulator, compact real-time interferometric Fourier-transform processors, a fast decorrelation algorithm for permutation arrays, the optical interconnection of optical modules, and carry-free optical binary adders.
Optimal physiological structure of small neurons to guarantee stable information processing
NASA Astrophysics Data System (ADS)
Zeng, S. Y.; Zhang, Z. Z.; Wei, D. Q.; Luo, X. S.; Tang, W. Y.; Zeng, S. W.; Wang, R. F.
2013-02-01
The spike is the basic element of neuronal information processing, and the spontaneous spiking frequency should be less than 1 Hz for stable information processing. If the neuronal membrane area is small, the frequency of neuronal spontaneous spiking caused by ion channel noise may be high. Therefore, it is important to suppress the deleterious spontaneous spiking of small neurons. By simulation of stochastic neurons with Hodgkin-Huxley-type channels, we find that the leakage system is critical and extremely efficient at suppressing spontaneous spiking and guaranteeing stable information processing in small neurons. However, within the physiological limit the potassium system cannot do so. The suppression effect of the leakage system is super-exponential, but that of the potassium system is quasi-linear. With minor physiological cost and minimal consumption of metabolic energy, a slightly lower reversal potential and a relatively larger conductance of the leakage system give the optimal physiological structure to suppress deleterious spontaneous spiking and guarantee stable information processing in small neurons, dendrites and axons.
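The qualitative role of the leak conductance can be illustrated with a far simpler model than the paper's stochastic Hodgkin-Huxley neurons: the sketch below drives a leaky integrate-and-fire unit with current noise (all parameter values invented) and shows the noise-driven spontaneous rate collapsing as the leak conductance grows.

```python
import numpy as np

rng = np.random.default_rng(2)

def spontaneous_rate(g_leak, n_steps=200_000, dt=0.1):
    """Spontaneous rate (Hz) of a noise-driven leaky integrate-and-fire unit."""
    E_leak, v_th, v_reset = -70.0, -55.0, -70.0      # mV
    v, spikes = E_leak, 0
    for _ in range(n_steps):
        noise = rng.normal(0.0, 6.0)                 # channel-noise current
        v += dt * (g_leak * (E_leak - v) + noise)    # dt in ms, capacitance = 1
        if v >= v_th:
            v, spikes = v_reset, spikes + 1
    return spikes / (n_steps * dt * 1e-3)

for g in (0.05, 0.1, 0.3):       # leak conductance (arbitrary units)
    print(f"g_leak = {g:<4} -> {spontaneous_rate(g):6.2f} Hz")
```

A stronger leak pulls the membrane potential back toward rest faster than the noise can accumulate, so the spontaneous rate falls off drastically, in line with the super-exponential suppression described above.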
Information Network Model Query Processing
NASA Astrophysics Data System (ADS)
Song, Xiaopu
Information Networking Model (INM) [31] is a novel database model for managing real-world objects and relationships. It naturally and directly supports various kinds of static and dynamic relationships between objects. In INM, objects are networked through various natural and complex relationships. INM Query Language (INM-QL) [30] is designed to explore such information networks, retrieve information about schemas, instances, their attributes, relationships, and context-dependent information, and process query results in the user-specified form. An INM database management system has been implemented using Berkeley DB, and it supports INM-QL. This thesis is mainly focused on the implementation of the subsystem that effectively and efficiently processes INM-QL. The subsystem provides a lexical and syntactic analyzer for INM-QL, and it is able to choose appropriate evaluation strategies and index mechanisms to process queries in INM-QL without user intervention. It also uses an intermediate result structure to hold intermediate query results and other helper structures to reduce the complexity of query processing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schiller, Steven R.; Schwartz, Lisa C.
Demand-side energy efficiency (efficiency) represents a low-cost opportunity to reduce electricity consumption and demand and provide a wide range of non-energy benefits, including avoiding air pollution. Efficiency-related energy and non-energy impacts are determined and documented by implementing evaluation, measurement and verification (EM&V) systems. This technical brief describes efficiency EM&V coordination strategies that Western states can consider taking on together, outlines EM&V-related products that might be appropriate for multistate coordination, and identifies some implications of coordination. Coordinating efficiency EM&V activities can save both time and costs for state agencies and stakeholders engaged in efficiency activities and can be particularly beneficial for multiple states served by the same utility. First, the brief summarizes basic information on efficiency, its myriad potential benefits and EM&V for assessing those benefits. Second, the brief introduces the concept of multistate EM&V coordination in the context of assessing such benefits, including achievement of state and federal goals to reduce air pollutants. Next, the brief presents three coordination strategy options for efficiency EM&V: information clearinghouse/exchange, EM&V product development, and a regional energy efficiency tracking system platform. The brief then describes five regional EM&V products that could be developed on a multistate basis: EM&V reporting formats, database of consistent deemed electricity savings values, glossary of definitions and concepts, efficiency EM&V methodologies, and EM&V professional standards or accreditation processes. Finally, the brief discusses options for next steps that Western states can take to consider multistate coordination on efficiency EM&V. Appendices provide background information on efficiency and EM&V, as well as definitions and suggested resources on the covered topics. This brief is intended to inform state public utility commissions, boards for public and consumer-owned utilities, state energy offices and air agencies, and other organizations involved in discussions about the use of efficiency EM&V.
Strategic, Organizational and Standardization Aspects of Integrated Information Systems. Volume 6.
1987-12-01
... reasons (such as the desired level of processing power and the amount of storage space), organizational reasons (such as each department obtaining its... of processing power falls, Abbott can afford to subordinate efficient processing for organizational effectiveness. 4. Steps in an Analytical Process
[Meaningful words? Cancer screening communication in Italy].
Cogo, Carla; Petrella, Marco
2012-01-01
Over the last ten years, Italian communication work groups within the National Centre for Screening Monitoring have been working on various aspects of communication in screening: quality surveys, information materials, guidelines, websites, and training. This has been done with the understanding that good quality information must be clear, accessible, up to date, evidence based, clear about its limitations and capable of indicating further sources of information. Whenever possible, information has been developed in collaboration with the target groups: citizens, but also health professionals. However, if good quality information must be clear about benefits and harms, the communication of quantitative information is particularly complex in cancer screening. Moreover, receiving more information on risks and benefits does not seem to modify participation. In addition, more balanced information does not entail that a person will include it in the decision process. Throughout several focus groups, citizens have made it clear that the information received from the programmes was only one part of the decisional process, in which other elements were just as, if not more, important: trust in doctors, family and friends, perception of health authority efficiency, personal experiences, inconsistencies in information, or public disagreements with other credible sources. Such elements can be seen as an opportunity to strengthen partnerships with professional and advocacy groups and to cooperate more efficiently with media and specialists from different fields.
Age differences in dual information-processing modes: implications for cancer decision making.
Peters, Ellen; Diefenbach, Michael A; Hess, Thomas M; Västfjäll, Daniel
2008-12-15
Age differences in affective/experiential and deliberative processes have important theoretical implications for cancer decision making, as cancer is often a disease of older adulthood. The authors examined evidence for adult age differences in affective and deliberative information processes, reviewed the sparse evidence about age differences in decision making, and introduced how dual process theories and their findings might be applied to cancer decision making. Age-related declines in the efficiency of deliberative processes predict poorer-quality decisions as we age, particularly when decisions are unfamiliar and the information is numeric. However, age-related adaptive processes, including an increased focus on emotional goals and greater experience, can influence decision making and potentially offset age-related declines. A better understanding of the mechanisms that underlie cancer decision processes in our aging population should ultimately allow us to help older adults to better help themselves.
Age Differences in Dual Information-Processing Modes: Implications for Cancer Decision Making
Peters, Ellen; Diefenbach, Michael A.; Hess, Thomas M.; Västfjäll, Daniel
2008-01-01
Age differences in affective/experiential and deliberative processes have important theoretical implications for cancer decision making as cancer is often a disease of older adulthood. We examine evidence for adult age differences in affective and deliberative information processes, review the sparse evidence about age differences in decision making and introduce how dual process theories and their findings might be applied to cancer decision making. Age-related declines in the efficiency of deliberative processes predict poorer-quality decisions as we age, particularly when decisions are unfamiliar and the information is numeric. However, age-related adaptive processes, including an increased focus on emotional goals and greater experience, can influence decision making and potentially offset age-related declines. A better understanding of the mechanisms that underlie cancer decision processes in our aging population should ultimately allow us to help older adults to better help themselves. PMID:19058148
LESS SKILLED READERS HAVE LESS EFFICIENT SUPPRESSION MECHANISMS.
Gernsbacher, Morton Ann
1993-09-01
One approach to understanding the component processes and mechanisms underlying adult reading skill is to compare the performance of more skilled and less skilled readers in laboratory experiments. The results of some recent experiments employing this approach demonstrate that less skilled adult readers suppress less efficiently the inappropriate meanings of ambiguous words (e.g., the playing-card vs. garden-tool meanings of spade), the incorrect forms of homophones (e.g., patients vs. patience), the typical-but-absent members of scenes (e.g., a tractor in a farm scene), and words superimposed on pictures. Less skilled readers are not less efficient in activating contextually appropriate information; in fact, they activate contextually appropriate information more strongly than more skilled readers do. Therefore, one conclusion that can be drawn from these experiments is that less skilled adult readers suffer from less efficient suppression mechanisms.
Heterogeneous delivering capability promotes traffic efficiency in complex networks
NASA Astrophysics Data System (ADS)
Zhu, Yan-Bo; Guan, Xiang-Min; Zhang, Xue-Jun
2015-12-01
Traffic is one of the most fundamental dynamical processes in networked systems. With homogeneous delivery capability across nodes, the global dynamic routing strategy proposed by Ling et al. [Phys. Rev. E 81, 016113 (2010)] makes good use of dynamic information during the routing process and thus reaches a quite high network capacity. In this paper, building on the global dynamic routing strategy, we propose a heterogeneous delivery-allocation strategy for nodes on scale-free networks that takes node degree into account. We find that the network capacity, as well as other indexes reflecting transportation efficiency, is further improved. Our work may be useful for the design of more efficient routing strategies in communication or transportation systems.
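The abstract does not give the exact allocation rule, so the sketch below assumes one plausible degree-based form: total delivery capability shared in proportion to degree**phi, with phi = 0 recovering the homogeneous case. networkx supplies the scale-free topology.

```python
import numpy as np
import networkx as nx

def delivery_capabilities(g, c_total, phi=1.0):
    """Share c_total packets/step among nodes in proportion to degree**phi."""
    k = np.array([d for _, d in g.degree()], dtype=float)
    w = k ** phi
    return c_total * w / w.sum()

g = nx.barabasi_albert_graph(1000, 3, seed=42)      # scale-free network
caps = delivery_capabilities(g, c_total=1000.0, phi=1.0)

# Hubs lie on many dynamic routes; giving them proportionally more delivery
# capability relieves the usual hub bottleneck.
hub = max(g.degree(), key=lambda nd: nd[1])[0]
print(f"hub degree {g.degree(hub)} -> capability {caps[hub]:.1f} packets/step")
```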
Epidemics in Complex Networks: The Diversity of Hubs
NASA Astrophysics Data System (ADS)
Kitsak, Maksim; Gallos, Lazaros K.; Havlin, Shlomo; Stanley, H. Eugene; Makse, Hernan A.
2009-03-01
Many complex systems are believed to be vulnerable to the spread of viruses and information owing to their high level of interconnectivity. Even viruses of low contagiousness easily proliferate across the Internet. Rumors, fads, and innovative ideas are prone to efficient spreading in various social systems. Another commonly accepted standpoint is the importance of the most connected elements (hubs) in spreading processes. We address the following questions. Do all hubs conduct epidemics in the same manner? How does the spread of an epidemic depend on the structure of the network? What is the most efficient way to spread information over the system? We analyze several large-scale systems in the framework of the susceptible/infective/removed (SIR) disease spread model, which can also be mapped to the problem of rumor or fad spreading. We show that hubs are often ineffective in the transmission of viruses or information owing to the highly heterogeneous topology of most networks. We also propose a new tool to evaluate the efficiency of nodes in spreading viruses or information.
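A minimal SIR simulation on a scale-free graph makes the first question concrete: seed the infection at different nodes and compare final outbreak sizes. The topology and parameters below are invented for illustration; the study itself analyzes real large-scale networks.

```python
import random
import networkx as nx

def sir_outbreak(g, seed_node, beta=0.15, rng=random.Random(7)):
    """Discrete-time SIR with one-step recovery; returns final outbreak size."""
    infected, recovered = {seed_node}, set()
    while infected:
        new = {v for u in infected for v in g[u]
               if v not in infected and v not in recovered
               and rng.random() < beta}
        recovered |= infected
        infected = new - recovered
    return len(recovered)

g = nx.barabasi_albert_graph(5000, 3, seed=1)
hub = max(g.degree(), key=lambda nd: nd[1])[0]
leaf = min(g.degree(), key=lambda nd: nd[1])[0]
print("seeded at hub :", sir_outbreak(g, hub))
print("seeded at leaf:", sir_outbreak(g, leaf))
```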
Künstler, E C S; Finke, K; Günther, A; Klingner, C; Witte, O; Bublak, P
2018-01-01
Dual tasking, or the simultaneous execution of two continuous tasks, is frequently associated with a performance decline that can be explained within a capacity sharing framework. In this study, we assessed the effects of a concurrent motor task on the efficiency of visual information uptake based on the 'theory of visual attention' (TVA). TVA provides parameter estimates reflecting distinct components of visual processing capacity: perceptual threshold, visual processing speed, and visual short-term memory (VSTM) storage capacity. Moreover, goodness-of-fit values and bootstrapping estimates were derived to test whether the TVA-model is validly applicable also under dual task conditions, and whether the robustness of parameter estimates is comparable in single- and dual-task conditions. 24 subjects of middle to higher age performed a continuous tapping task, and a visual processing task (whole report of briefly presented letter arrays) under both single- and dual-task conditions. Results suggest a decline of both visual processing capacity and VSTM storage capacity under dual-task conditions, while the perceptual threshold remained unaffected by a concurrent motor task. In addition, goodness-of-fit values and bootstrapping estimates support the notion that participants processed the visual task in a qualitatively comparable, although quantitatively less efficient way under dual-task conditions. The results support a capacity sharing account of motor-cognitive dual tasking and suggest that even performing a relatively simple motor task relies on central attentional capacity that is necessary for efficient visual information uptake.
An effective and efficient assessment process
Russell T. Graham; Theresa B. Jain
1999-01-01
Depending on the agency, discipline, or audience, assessments supply data and information to address relevant policy questions and to help make decisions. If properly executed, assessment processes can draw conclusions and make recommendations on how to manage natural resources. Assessments, especially large ones, can be easily influenced by internal and external...
Process-Oriented Worked Examples: Improving Transfer Performance through Enhanced Understanding
ERIC Educational Resources Information Center
van Gog, Tamara; Paas, Fred; van Merrienboer, Jeroen J. G.
2004-01-01
The research on worked examples has shown that for novices, studying worked examples is often a more effective and efficient way of learning than solving conventional problems. This theoretical paper argues that adding process-oriented information to worked examples can further enhance transfer performance, especially for complex cognitive skills…
Combined process automation for large-scale EEG analysis.
Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E
2012-01-01
Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
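A minimal stand-in for steps 2 and 3 of such a pipeline, assuming generic zero-phase band-pass filtering and amplitude-threshold spike detection with scipy (the paper's actual algorithm and parameter choices are not reproduced here):

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def band_waveform(eeg, fs, lo, hi, order=4):
    """Step 2: user-defined band frequency waveform via zero-phase filtering."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg)

def detect_spikes(band, fs, thresh_sd=4.0, min_gap_s=0.05):
    """Step 3 (simplified): amplitude-threshold crossings as candidate spikes."""
    peaks, _ = find_peaks(np.abs(band), height=thresh_sd * band.std(),
                          distance=int(min_gap_s * fs))
    return peaks

fs = 500
eeg = np.random.default_rng(3).normal(0, 10, 10 * fs)   # 10 s, microvolts
eeg[2500:2510] += 120                                    # injected "spike"

wide = band_waveform(eeg, fs, 1, 70)
spikes = detect_spikes(wide, fs)
print(f"{spikes.size} candidate spikes, rate = {spikes.size / 10:.1f}/s")
```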
NASA Astrophysics Data System (ADS)
Bykovskii, Yurii A.; Eloev, E. N.; Kukharenko, K. L.; Panin, A. M.; Solodovnikov, N. P.; Torgashin, A. N.; Arestova, E. L.
1995-10-01
An acousto-optical system for input, display, and coherent-optical processing of information was implemented experimentally. The information transmission capacity, the structure of the information fluxes, and the efficiency of spaceborne telemetric systems were taken into account. The number of equivalent frequency-resolved channels corresponded to the structure of a telemetric frame of a two-step switch. The number of intensity levels of laser radiation corresponded to the scale of changes in the parameters. Use was made of the technology of a liquid optical contact between a wedge-shaped piezoelectric transducer made of lithium niobate and an anisotropic light-and-sound guide made of paratellurite with asymmetric scattering geometry. The simplest technique for optical filtering of multiparameter signals was analysed.
Exploitation and Benefits of BIM in Construction Project Management
NASA Astrophysics Data System (ADS)
Mesároš, Peter; Mandičák, Tomáš
2017-10-01
BIM is increasingly entering the awareness of the construction industry. BIM is the process of creating and managing data about a building during its life cycle, and it has become part of the management toolset in modern construction companies. Construction projects have many participants, which makes construction project management difficult and imposes serious requirements for processing huge amounts of information, including design, construction, time and cost parameters, economic efficiency and sustainability. Progressive information and communication technologies support cost management and the management of construction projects. One of them is Building Information Modelling. The aim of the paper is to examine the impact of BIM exploitation and its benefits on construction project management in Slovak companies.
Automaticity in Anxiety Disorders and Major Depressive Disorder
Teachman, Bethany A.; Joormann, Jutta; Steinman, Shari; Gotlib, Ian H.
2012-01-01
In this paper we examine the nature of automatic cognitive processing in anxiety disorders and Major Depressive Disorder (MDD). Rather than viewing automaticity as a unitary construct, we follow a social cognition perspective (Bargh, 1994) that argues for four theoretically independent features of automaticity: unconscious (processing of emotional stimuli occurs outside awareness), efficient (processing emotional meaning uses minimal attentional resources), unintentional (no goal is needed to engage in processing emotional meaning), and uncontrollable (limited ability to avoid, alter or terminate processing emotional stimuli). Our review of the literature suggests that most anxiety disorders are characterized by uncontrollable, and likely also unconscious and unintentional, biased processing of threat-relevant information. In contrast, MDD is most clearly typified by uncontrollable, but not unconscious or unintentional, processing of negative information. For the anxiety disorders and for MDD, there is not sufficient evidence to draw firm conclusions about efficiency of processing, though early indications are that neither anxiety disorders nor MDD are characterized by this feature. Clinical and theoretical implications of these findings are discussed and directions for future research are offered. In particular, it is clear that paradigms that more directly delineate the different features of automaticity are required to gain a more comprehensive and systematic understanding of the importance of automatic processing in emotion dysregulation. PMID:22858684
How to design a horizontal patient-focused hospital.
Murphy, E C; Ruflin, P
1993-05-01
Work Imaging is an executive information system for analyzing the cost effectiveness and efficiency of work processes and structures in health care. Advanced Work Imaging relational database technology allows managers and employees to sample work-activity profiles organization-wide. These profiles are married to financial and organizational data to produce images of work within and across all functions, departments, and levels. The images are benchmarked against best-practice data to provide insight into the quality and cost efficiency of work practice patterns, from individual roles to departmental skill mix to organization-wide service processes.
Quantum Image Processing and Its Application to Edge Detection: Theory and Experiment
NASA Astrophysics Data System (ADS)
Yao, Xi-Wei; Wang, Hengyan; Liao, Zeyang; Chen, Ming-Cheng; Pan, Jian; Li, Jun; Zhang, Kechao; Lin, Xingcheng; Wang, Zhehui; Luo, Zhihuang; Zheng, Wenqiang; Li, Jianzhong; Zhao, Meisheng; Peng, Xinhua; Suter, Dieter
2017-07-01
Processing of digital images is continuously gaining in volume and relevance, with concomitant demands on data storage, transmission, and processing power. Encoding the image information in quantum-mechanical systems instead of classical ones and replacing classical with quantum information processing may alleviate some of these challenges. By encoding and processing the image information in quantum-mechanical systems, we here demonstrate the framework of quantum image processing, where a pure quantum state encodes the image information: we encode the pixel values in the probability amplitudes and the pixel positions in the computational basis states. Our quantum image representation reduces the required number of qubits compared to existing implementations, and we present image processing algorithms that provide exponential speed-up over their classical counterparts. For the commonly used task of detecting the edge of an image, we propose and implement a quantum algorithm that completes the task with only one single-qubit operation, independent of the size of the image. This demonstrates the potential of quantum image processing for highly efficient image and video processing in the big data era.
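The edge-detection step can be emulated classically in a few lines: encode normalized pixel values as amplitudes and apply a Hadamard to the least significant qubit, so that the odd amplitudes become differences of adjacent pixels. The sketch below applies the 2x2 Hadamard directly to amplitude pairs (a fuller treatment also processes a one-pixel-shifted copy to catch edges that fall on pair boundaries):

```python
import numpy as np

def hadamard_edge(pixels):
    """Differences of adjacent pixel pairs via a single (emulated) Hadamard."""
    c = np.asarray(pixels, dtype=float).ravel()
    c = c / np.linalg.norm(c)                 # amplitude encoding
    h = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    out = c.reshape(-1, 2) @ h                # H on the least significant qubit
    return out[:, 1]                          # difference components = edges

row = [0, 0, 0, 1, 1, 1, 0, 0]                # a bright bar in a 1-D "image"
print(np.round(hadamard_edge(row), 3))        # nonzero only at the left edge
```

On quantum hardware the Hadamard is a single one-qubit gate regardless of image size, which is where the claimed size-independent cost of the edge-detection step comes from.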
Executive Guide: Information Security Management. Learning From Leading Organizations
1998-05-01
data. In September 1996, we reported that audit reports and agency self-assessments issued during the previous 2 years showed that weak information... company has developed an efficient and disciplined process for ensuring that information security-related risks to business operations are considered and... protection group at the utility was required to approve all new applications to indicate that risks had been adequately considered. Providing self
Federal roles to realize national energy-efficiency opportunities in the 1990s
NASA Astrophysics Data System (ADS)
Hirst, Eric
1989-10-01
Improving energy efficiency throughout the U.S. economy is a vital component of our nation's energy future, with many benefits. Improving efficiency can: save consumers money; increase economic productivity and international competitiveness; reduce oil and gas prices by reducing the demand for foreign oil; enhance national security by lowering oil imports; reduce the adverse environmental consequences of fuel cycles, especially acid rain and global warming; add diversity and flexibility to the nation's portfolio of energy resources; and respond to public interest in, and support of, energy efficiency. The primary purpose of this report is to suggest expanded roles for the U.S. Department of Energy (DOE) in improving energy efficiency during the 1990s. In an ideal world, the normal workings of the marketplace would yield optimal energy-efficiency purchase and operating decisions. Unfortunately, distortions in fuel prices, limited access to capital, misplaced incentives, lack of information, and difficulty in processing information complicate energy-related decision making. Thus, consumers in all sectors of the economy underinvest in energy-efficient systems. These market barriers, coupled with growing concern about environmental quality, justify a larger Federal role.
Learning To Learn: A Guide to Becoming Information Literate.
ERIC Educational Resources Information Center
Riedling, Ann Marlow
This guide is designed to help students from middle school through the beginning college level master the essential information literacy skills and become effective, efficient learners. It covers the entire process of the research experience from choosing a topic and learning how to explore it effectively, to using the library and its resources,…
Object-Based Attention Overrides Perceptual Load to Modulate Visual Distraction
ERIC Educational Resources Information Center
Cosman, Joshua D.; Vecera, Shaun P.
2012-01-01
The ability to ignore task-irrelevant information and overcome distraction is central to our ability to efficiently carry out a number of tasks. One factor shown to strongly influence distraction is the perceptual load of the task being performed; as the perceptual load of task-relevant information processing increases, the likelihood that…
The Effect of User Characteristics on the Efficiency of Visual Querying
ERIC Educational Resources Information Center
Bak, Peter; Meyer, Joachim
2011-01-01
Information systems increasingly provide options for visually inspecting data during the process of information discovery and exploration. Little research has dealt so far with user interactions with these systems, and specifically with the effects of characteristics of the displayed data and the user on performance with such systems. The study…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-27
... Radiological Health (CDRH) to easily and efficiently elicit and review information from students and health care professionals who are interested in becoming involved in CDRH activities. The process will reduce... protecting the public health by encouraging outside persons to share their expertise with CDRH. In the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mcdonald, Kathleen Herrera
2016-02-29
KIVA is a family of Fortran-based computational fluid dynamics software developed by LANL. The software predicts complex fuel and air flows as well as ignition, combustion, and pollutant-formation processes in engines. The KIVA models have been used to understand combustion chemistry processes, such as auto-ignition of fuels, and to optimize diesel engines for high efficiency and low emissions. Fuel economy is heavily dependent upon engine efficiency, which in turn depends to a large degree on how fuel is burned within the cylinders of the engine. Higher in-cylinder pressures and temperatures lead to increased fuel economy, but they also create more difficulty in controlling the combustion process. Poorly controlled and incomplete combustion can cause higher levels of emissions and lower engine efficiencies.
[Cognitive experimental approach to anxiety disorders].
Azaïs, F
1995-01-01
Cognitive psychology proposes a functional model to explain the mental organisation leading to emotional disorders. Among these disorders, the anxiety spectrum represents a domain in which this model seems promising for an efficient and comprehensive approach to the pathology. A number of behavioral or cognitive psychotherapeutic methods relate to these cognitive references, but the theoretical concepts of cognitive 'schemata' or cognitive 'processes' evoked to describe mental functioning in anxiety need an experimental approach for a better rational understanding. Cognitive functions such as perception, attention or memory can be explored efficiently in this domain, allowing a more precise study of each stage of information processing. The cognitive model proposed in the psychopathology of anxiety suggests that anxious subjects are characterized by biases in the processing of emotionally valenced information. This hypothesis suggests functional interference in information processing in these subjects, leading to an anxious response to most kinds of stimuli. The experimental approach permits exploring this hypothesis, using many tasks to test the different cognitive dysfunctions evoked in anxious cognitive organisation. Impairments revealed in anxiety disorders seem to result from specific biases in threat-related information processing, involving several stages of cognitive processing. Semantic interference, attentional bias, implicit memory bias and priming effects are the disorders most often observed in anxious pathology, such as simple phobia, generalised anxiety, panic disorder or post-traumatic stress disorder. These results suggest a top-down organisation of information processing in anxious subjects, who tend to detect, perceive and label many situations as threatening experiences. The processes of reasoning and elaboration are consequently impaired in their adaptive function toward threat, leading to the anxious response observed in clinical conditions. The cognitive, behavioral and emotional components of this anxious reaction maintain the stressful experience for the subject, in which the sense of cognitive competence remains pathologically decreased. Cognitive psychology proposes an interesting model for the understanding of anxiety, in a domain in which subjectivity could benefit from an experimental approach. (ABSTRACT TRUNCATED AT 400 WORDS)
Workflow management systems in radiology
NASA Astrophysics Data System (ADS)
Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim
1998-07-01
In a situation of shrinking health care budgets, increasing cost pressure and growing demands to increase the efficiency and the quality of medical services, health care enterprises are forced to optimize or completely re-design their processes. Although information technology is agreed to potentially contribute to cost reduction and efficiency improvement, the real success factors are the re-definition and automation of processes: Business Process Re-engineering and Workflow Management. In this paper we discuss architectures for the use of workflow management systems in radiology. We propose to move forward from information systems in radiology (RIS, PACS) to Radiology Management Systems, in which workflow functionality (process definitions and process automation) is implemented through autonomous workflow management systems (WfMS). In a workflow-oriented architecture, an autonomous workflow enactment service communicates with workflow client applications via standardized interfaces. We discuss the need for and the benefits of such an approach, emphasizing the separation of workflow management system and application systems and the consequences this separation has for the architecture of workflow-oriented information systems, including an appropriate workflow terminology and the definition of standard interfaces for workflow-aware application systems. Workflow studies in various institutions have shown that most of the processes in radiology are well structured and suited to a workflow management approach. Numerous commercially available Workflow Management Systems (WfMS) were investigated, and some of them, which are process-oriented and application-independent, appear suitable for use in radiology.
Measure Guideline: High Efficiency Natural Gas Furnaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brand, L.; Rose, W.
2012-10-01
This Measure Guideline covers installation of high-efficiency gas furnaces. Topics covered include when to install a high-efficiency gas furnace as a retrofit measure, how to identify and address risks, and the steps to be used in the selection and installation process. The guideline is written for Building America practitioners and HVAC contractors and installers. It includes a compilation of information provided by manufacturers, researchers, and the Department of Energy as well as recent research results from the Partnership for Advanced Residential Retrofit (PARR) Building America team.
Marchman, Virginia A.; Fernald, Anne; Hurtado, Nereyda
2010-01-01
Research using online comprehension measures with monolingual children shows that speed and accuracy of spoken word recognition are correlated with lexical development. Here we examined speech processing efficiency in relation to vocabulary development in bilingual children learning both Spanish and English (n=26; 2;6 yrs). Between-language associations were weak: vocabulary size in Spanish was uncorrelated with vocabulary in English, and children’s facility in online comprehension in Spanish was unrelated to their facility in English. Instead, efficiency of online processing in one language was significantly related to vocabulary size in that language, after controlling for processing speed and vocabulary size in the other language. These links between efficiency of lexical access and vocabulary knowledge in bilinguals parallel those previously reported for Spanish and English monolinguals, suggesting that children’s ability to abstract information from the input in building a working lexicon relates fundamentally to mechanisms underlying the construction of language. PMID:19726000
Pyramidal neurovision architecture for vision machines
NASA Astrophysics Data System (ADS)
Gupta, Madan M.; Knopf, George K.
1993-08-01
The vision system employed by an intelligent robot must be active; active in the sense that it must be capable of selectively acquiring the minimal amount of relevant information for a given task. An efficient active vision system architecture that is based loosely upon the parallel-hierarchical (pyramidal) structure of the biological visual pathway is presented in this paper. Although the computational architecture of the proposed pyramidal neuro-vision system is far less sophisticated than the architecture of the biological visual pathway, it does retain some essential features such as the converging multilayered structure of its biological counterpart. In terms of visual information processing, the neuro-vision system is constructed from a hierarchy of several interactive computational levels, whereupon each level contains one or more nonlinear parallel processors. Computationally efficient vision machines can be developed by utilizing both the parallel and serial information processing techniques within the pyramidal computing architecture. A computer simulation of a pyramidal vision system for active scene surveillance is presented.
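As a rough illustration of the converging multilayered idea, the following sketch alternates a nonlinear parallel processor (here simply tanh, an assumption for the example) with 2x2 pooling, so each higher level operates on a coarser, converged representation; it is not the architecture described in the paper.

```python
# Toy sketch of a converging multilayer (pyramidal) vision hierarchy,
# assuming 2x2 average pooling and a tanh nonlinearity per level.
import numpy as np

def level_processor(image):
    """One computational level: nonlinear parallel processing of all units."""
    return np.tanh(image)

def downsample(image):
    """Converge information: average each 2x2 block into one unit."""
    h, w = image.shape
    return image[:h//2*2, :w//2*2].reshape(h//2, 2, w//2, 2).mean(axis=(1, 3))

def pyramid(image, levels=3):
    outputs = [image]
    for _ in range(levels):
        image = downsample(level_processor(image))
        outputs.append(image)
    return outputs  # coarse levels summarize; fine levels retain detail

layers = pyramid(np.random.rand(64, 64))
print([layer.shape for layer in layers])  # (64,64) -> (32,32) -> (16,16) -> (8,8)
```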
[A medical consumable material management information system].
Tang, Guoping; Hu, Liang
2014-05-01
Medical consumables are essential supplies for carrying out medical work; they come in a wide range of varieties and are used in large quantities. How to manage them feasibly and efficiently has been a topic of concern to everyone. This article discusses how to design a medical consumable material management information system with a set of standardized processes that brings together medical supplies administrators, suppliers and clinical departments. An advanced management mode, enterprise resource planning (ERP), was applied throughout the system design process.
Enhanced analysis of real-time PCR data by using a variable efficiency model: FPK-PCR
Lievens, Antoon; Van Aelst, S.; Van den Bulcke, M.; Goetghebeur, E.
2012-01-01
Current methodology in real-time polymerase chain reaction (PCR) analysis performs well provided PCR efficiency remains constant over reactions. Yet, small changes in efficiency can lead to large quantification errors. Particularly in biological samples, the possible presence of inhibitors forms a challenge. We present a new approach to single-reaction efficiency calculation, called Full Process Kinetics-PCR (FPK-PCR). It combines a kinetically more realistic model with flexible adaptation to the full range of data. By reconstructing the entire chain of cycle efficiencies, rather than restricting the focus to a 'window of application', one extracts additional information and removes a level of arbitrariness. The maximal efficiency estimates returned by the model are comparable in accuracy and precision to both the gold standard of serial dilution and other single-reaction efficiency methods. The cycle-to-cycle changes in efficiency, as described by the FPK-PCR procedure, stay considerably closer to the data than those from other S-shaped models. The assessment of individual cycle efficiencies returns more information than other single-efficiency methods. It allows in-depth interpretation of real-time PCR data and reconstruction of the fluorescence data, providing quality control. Finally, by implementing a global efficiency model, reproducibility is improved as the selection of a window of application is avoided. PMID:22102586
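A toy illustration of the underlying idea, not the authors' FPK-PCR code: if fluorescence F_n is recorded at every cycle, the entire chain of cycle efficiencies can be reconstructed as E_n = F_{n+1}/F_n - 1. The sigmoidal efficiency decay below is an assumed form standing in for reagent depletion.

```python
# Illustrative sketch: simulate a fluorescence curve with cycle-dependent
# efficiency, then recover per-cycle efficiencies E_n = F[n+1]/F[n] - 1.
import numpy as np

def simulate_pcr(cycles=40, f0=1e-6, e_max=0.95, half_cycle=25, slope=0.2):
    """Efficiency decays sigmoidally as reagents deplete (assumed form)."""
    f = [f0]
    for n in range(cycles):
        eff = e_max / (1.0 + np.exp(slope * (n - half_cycle)))
        f.append(f[-1] * (1.0 + eff))
    return np.array(f)

f = simulate_pcr()
eff_hat = f[1:] / f[:-1] - 1.0   # reconstructed chain of cycle efficiencies
print(f"max efficiency estimate: {eff_hat.max():.3f}")
```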
Organisation of biotechnological information into knowledge.
Boh, B
1996-09-01
The success of biotechnological research, development and marketing depends to a large extent on the international transfer of information and on the ability to organise biotechnology information into knowledge. To increase the efficiency of information-based approaches, an information strategy has been developed and consists of the following stages: definition of the problem, its structure and sub-problems; acquisition of data by targeted processing of computer-supported bibliographic, numeric, textual and graphic databases; analysis of data and building of specialized in-house information systems; information processing for structuring data into systems, recognition of trends and patterns of knowledge, particularly by information synthesis using the concept of information density; design of research hypotheses; testing hypotheses in the laboratory and/or pilot plant; repeated evaluation and optimization of hypotheses by information methods and testing them by further laboratory work. The information approaches are illustrated by examples from the university-industry joint projects in biotechnology, biochemistry and agriculture.
Using Analytics to Support Petabyte-Scale Science on the NASA Earth Exchange (NEX)
NASA Astrophysics Data System (ADS)
Votava, P.; Michaelis, A.; Ganguly, S.; Nemani, R. R.
2014-12-01
NASA Earth Exchange (NEX) is a data, supercomputing and knowledge collaboratory that houses NASA satellite, climate and ancillary data where a focused community can come together to address large-scale challenges in Earth sciences. Analytics within NEX occurs at several levels - data, workflows, science and knowledge. At the data level, we are focusing on collecting and analyzing any information that is relevant to efficient acquisition, processing and management of data at the smallest granularity, such as files or collections. This includes processing and analyzing all local and many external metadata that are relevant to data quality, size, provenance, usage and other attributes. This then helps us better understand usage patterns and improve efficiency of data handling within NEX. When large-scale workflows are executed on NEX, we capture information that is relevant to processing and that can be analyzed in order to improve efficiencies in job scheduling, resource optimization, or data partitioning that would improve processing throughput. At this point we also collect data provenance as well as basic statistics of intermediate and final products created during the workflow execution. These statistics and metrics form basic process and data QA that, when combined with analytics algorithms, helps us identify issues early in the production process. We have already seen impact in some petabyte-scale projects, such as global Landsat processing, where we were able to reduce processing times from days to hours and enhance process monitoring and QA. While the focus so far has been mostly on support of NEX operations, we are also building a web-based infrastructure that enables users to perform direct analytics on science data - such as climate predictions or satellite data. Finally, as one of the main goals of NEX is knowledge acquisition and sharing, we began gathering and organizing information that associates users and projects with data, publications, locations and other attributes that can then be analyzed as a part of the NEX knowledge graph and used to greatly improve advanced search capabilities. Overall, we see data analytics at all levels as an important part of NEX as we are continuously seeking improvements in data management, workflow processing, use of resources, usability and science acceleration.
Wang, Julia; Al-Ouran, Rami; Hu, Yanhui; Kim, Seon-Young; Wan, Ying-Wooi; Wangler, Michael F; Yamamoto, Shinya; Chao, Hsiao-Tuan; Comjean, Aram; Mohr, Stephanie E; Perrimon, Norbert; Liu, Zhandong; Bellen, Hugo J
2017-06-01
One major challenge encountered in interpreting human genetic variants is the limited understanding of the functional impact of genetic alterations on biological processes. Furthermore, there remains an unmet demand for an efficient survey of the wealth of information on human homologs in model organisms across numerous databases. To efficiently assess the large volume of publicly available information, it is important to provide a concise summary of the most relevant information in a rapid, user-friendly format. To this end, we created MARRVEL (model organism aggregated resources for rare variant exploration). MARRVEL is a publicly available website that integrates information from six human genetic databases and seven model organism databases. For any given variant or gene, MARRVEL displays information from OMIM, ExAC, ClinVar, Geno2MP, DGV, and DECIPHER. Importantly, it curates model organism-specific databases to concurrently display a concise summary regarding the human gene homologs in budding and fission yeast, worm, fly, fish, mouse, and rat on a single webpage. Experiment-based information on tissue expression, protein subcellular localization, biological process, and molecular function for the human gene and homologs in the seven model organisms is arranged into a concise output. Hence, rather than visiting multiple separate databases for variant and gene analysis, users can obtain important information by searching once through MARRVEL. Altogether, MARRVEL dramatically improves the efficiency and accessibility of data collection and facilitates analysis of human genes and variants by cross-disciplinary integration of 18 million records available in public databases, supporting clinical diagnosis and basic research. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
Aguilar, I; Misztal, I; Legarra, A; Tsuruta, S
2011-12-01
Genomic evaluations can be calculated using a unified procedure that combines phenotypic, pedigree and genomic information. Implementation of such a procedure requires the inverse of the relationship matrix based on pedigree and genomic relationships. The objective of this study was to investigate efficient computing options for creating relationship matrices based on genomic markers and pedigree information, as well as their inverses. SNP marker information was simulated for a panel of 40K SNPs, with the number of genotyped animals up to 30 000. Matrix multiplication in the computation of the genomic relationship was by a simple 'do' loop, by two optimized versions of the loop, and by a specific matrix multiplication subroutine. Inversion was by a generalized inverse algorithm and by a LAPACK subroutine. With the most efficient choices and parallel processing, creation of matrices for 30 000 animals would take a few hours. Matrices required to implement a unified approach can be computed efficiently. Optimizations can be made either by modifying existing code or by using the efficient automatic optimizations provided by open source or third-party libraries. © 2011 Blackwell Verlag GmbH.
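As a sketch of the kind of computation involved, the snippet below builds a genomic relationship matrix following VanRaden's commonly used centering approach and inverts it with LAPACK-backed NumPy routines. The genotypes are simulated, and the dimensions are reduced from the study's 30 000 genotyped animals so the example runs quickly.

```python
# Sketch of a genomic relationship matrix (VanRaden's first method) from
# 0/1/2 SNP genotype codes; the study benchmarks exactly this kind of
# dense matrix product against loop variants and optimized libraries.
import numpy as np

rng = np.random.default_rng(0)
n_animals, n_snps = 1000, 40_000
m = rng.integers(0, 3, size=(n_animals, n_snps)).astype(np.float64)

p = m.mean(axis=0) / 2.0                          # allele frequency per SNP
z = m - 2.0 * p                                   # center the genotypes
g = (z @ z.T) / (2.0 * np.sum(p * (1.0 - p)))     # BLAS-backed multiplication

g_inv = np.linalg.inv(g + 1e-6 * np.eye(n_animals))  # jitter for stability
```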
Effects of fundamentals acquisition and strategy switch on stock price dynamics
NASA Astrophysics Data System (ADS)
Wu, Songtao; He, Jianmin; Li, Shouwei
2018-02-01
An agent-based artificial stock market is developed to simulate the trading behavior of investors. In the market, the acquisition and employment of information about fundamentals, together with strategy switching, are investigated to explain stock price dynamics. Investors can obtain the information from both the market and neighbors on their social networks. Depending on information status and the performance of different strategies, an informed investor may switch to the fundamentalist strategy. This in turn affects the information acquisition process, since fundamentalists are more inclined to search for and spread the information than chartists. Further investigation into price dynamics generated from three typical networks, i.e. regular lattice, small-world network and random graph, is conducted after the general relation between network structures and price dynamics is revealed. In each network, the integrated effects of different combinations of information efficiency and switch intensity are investigated. Results show that, along with increasing switch intensity, market and social information efficiency play different roles in the formation of price distortion, standard deviation and kurtosis of returns.
NASA Technical Reports Server (NTRS)
Muellerschoen, R. J.
1988-01-01
A unified method for permuting vector-stored upper-triangular-diagonal (UD) factorized covariance arrays and vector-stored upper-triangular square-root information arrays is presented. The method involves cyclic permutation of the rows and columns of the arrays and retriangularization with fast (slow) Givens rotations (reflections). Minimal computation is performed, and only a one-dimensional scratch array is required. To make the method efficient for large arrays on a virtual memory machine, computations are arranged so as to avoid expensive paging faults. This method is potentially important for processing large volumes of radio metric data in the Deep Space Network.
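A conceptual sketch of the permute-and-retriangularize step, assuming NumPy's QR factorization in place of the paper's explicit fast/slow Givens sweeps: permuting the columns of an upper-triangular square-root information array R destroys triangularity, and an orthogonal retriangularization restores it while preserving the information matrix R'R.

```python
# Permute the parameters of an upper-triangular square-root information
# array, then restore triangularity; np.linalg.qr stands in for an
# explicit Givens-rotation sweep.
import numpy as np

def permute_and_retriangularize(r, new_order):
    rp = r[:, new_order]          # reorder parameter columns
    _, r_new = np.linalg.qr(rp)   # orthogonal retriangularization
    signs = np.sign(np.diag(r_new))
    return signs[:, None] * r_new # enforce a positive diagonal

r = np.triu(np.random.rand(5, 5) + np.eye(5))
r2 = permute_and_retriangularize(r, [1, 2, 3, 4, 0])   # cyclic shift

# the information matrix is preserved under the permutation:
perm = np.eye(5)[:, [1, 2, 3, 4, 0]]
assert np.allclose(r2.T @ r2, perm.T @ (r.T @ r) @ perm)
```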
Microalgal drying and cell disruption--recent advances.
Show, Kuan-Yeow; Lee, Duu-Jong; Tay, Joo-Hwa; Lee, Tse-Min; Chang, Jo-Shu
2015-05-01
Production of intracellular metabolites or biofuels from algae involves various processing steps, and extensive work on laboratory- and pilot-scale algae cultivation, harvesting and processing has been reported. As algal drying and cell disruption are integral unit operations, this review examines recent advances in algal drying and disruption for nutrition or biofuel production. Challenges and prospects of the processing are also outlined. Engineering improvements that address the challenge of energy efficiency, together with cost-effective and rigorous techno-economic analyses enabling a clearer comparison of prospects between different processing methods, are highlighted. Holistic life cycle assessments need to be conducted to assess the energy balance and the potential environmental impacts of algal processing. The review aims to provide useful information for the future development of efficient and commercially viable algal food products and biofuel production. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Ammann, C. M.; Vigh, J. L.; Lee, J. A.
2016-12-01
Society's growing needs for robust and relevant climate information have fostered an explosion in tools and frameworks for processing climate projections. Many top-down workflows might be employed to generate sets of pre-computed data and plots, frequently served in a "loading-dock style" through a metadata-enabled search and discovery engine. Despite these increasing resources, the diverse needs of applications-driven projects often result in data processing workflow requirements that cannot be fully satisfied using past approaches. In parallel to the data processing challenges, the provision of climate information to users in a form that is also usable represents a formidable challenge of its own. Finally, many users have neither the time nor the desire to synthesize and distill massive volumes of climate information to find what is relevant to their particular application. All of these considerations call for new approaches to developing actionable climate information. CRMe seeks to bridge the gap between the diverse, rich bottom-up needs of practitioners and the discrete, structured top-down workflows typically implemented for rapid delivery. Additionally, CRMe has implemented web-based data services capable of providing focused climate information in usable form for a given location, or as spatially aggregated information for entire regions or countries, following the needs of users and sectors. Making climate data actionable also involves summarizing and presenting it in concise and approachable ways. CRMe is developing the concept of dashboards, co-developed with the users, to condense the key information into a quick summary of the most relevant, curated climate data for a given discipline, application, or location, while still enabling users to efficiently conduct deeper discovery into rich datasets on an as-needed basis.
Framework for the Intelligent Transportation System (ITS) Evaluation : ITS Integration Activities
DOT National Transportation Integrated Search
2006-08-01
Intelligent Transportation Systems (ITS) represent a significant opportunity to improve the efficiency and safety of the surface transportation system. ITS includes technologies to support information processing, communications, surveillance and cont...
Concepts for a global resources information system
NASA Technical Reports Server (NTRS)
Billingsley, F. C.; Urena, J. L.
1984-01-01
The objective of the Global Resources Information System (GRIS) is to establish an effective and efficient information management system to meet the data access requirements of NASA and NASA-related scientists conducting large-scale, multi-disciplinary, multi-mission scientific investigations. Using standard interfaces and operating guidelines, diverse data systems can be integrated to provide the capabilities to access and process multiple geographically dispersed data sets and to develop the necessary procedures and algorithms to derive global resource information.
New infrared-sensitive photopolymer materials for information storage and processing
NASA Astrophysics Data System (ADS)
Nagtegaele, Patrice; Galstian, Tigran V.
2001-11-01
In response to the increasing demands of information systems, new materials with high performance are needed for storage and processing applications. Optical storage materials available on the market offer very useful characteristics but are still limited to the visible spectrum and are expensive. Recently, we have developed holographic polymer-dispersed liquid crystal (H-PDLC) materials sensitive in the near-infrared region (800 nm to 850 nm). These compounds are based on acrylate monomers and different liquid crystals (LC) and allow highly efficient in-situ recording of holographic optical elements using infrared lasers. Diffraction efficiency above 95% is demonstrated. The photosensitivity of the material, its dark development and the photochemical stability of the recorded gratings are investigated. The angular and spectral selectivities of gratings recorded in these films are examined to recover the refractive-index modulation profile.
Scaling of ratings: Concepts and methods
Thomas C. Brown; Terry C. Daniel
1990-01-01
Rating scales provide an efficient and widely used means of recording judgments. This paper reviews scaling issues within the context of a psychometric model of the rating process, describes several methods of scaling rating data, and compares the methods in terms of the assumptions they require about the rating process and the information they provide about the...
76 FR 13989 - Information Collection; Submission for OMB Review, Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-15
... of publication in the Federal Register: (1) By fax to: (202) 395-6974, Attention: Ms. Sharon Mar, OMB... change and noted that an Executive Summary would add minimal burden to the application process. (b) Five... efficiency of the grant review process. Therefore, we do not intend to remove another portion of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-09
... the efficiency and effectiveness of FHA's quality assurance process (QAP). The objective of FHA's QAP... control plan (QCP).\\1\\ A copy of the plan must be submitted by the lender when applying for FHA lender... processes: post-endorsement technical reviews, Quality Assurance Division reviews and targeted lender...
Cao, Yuansheng; Gong, Zongping; Quan, H T
2015-06-01
Motivated by the recently proposed models of the information engine [Proc. Natl. Acad. Sci. USA 109, 11641 (2012)] and the information refrigerator [Phys. Rev. Lett. 111, 030602 (2013)], we propose a minimal model of an information pump and an information eraser based on enzyme kinetics. This device can either pump molecules against the chemical potential gradient by consuming the information to be encoded in the bit stream, or (partially) erase the information initially encoded in the bit stream by consuming Gibbs free energy. The dynamics of this model is solved exactly, and the "phase diagram" of the operation regimes is determined. The efficiency and the power of the information machine are analyzed. The validity of the second law of thermodynamics within our model is clarified. Our model offers a simple paradigm for investigating the thermodynamics of information processing involving the chemical potential in small systems.
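A toy illustration (not the authors' enzyme-kinetics model): a low-entropy bit stream arrives at the machine, and each 0-bit can be "spent", i.e. flipped to 1, to move one molecule against the chemical potential gradient, so the pumping rate is bounded by the information written into the stream. The success probability below is an arbitrary stand-in for the model's kinetic parameters.

```python
# Toy information pump: pumping is paid for by raising the entropy of the
# bit stream; parameters are illustrative assumptions.
import random

def run_pump(n_bits=100_000, p_pump=0.8, seed=1):
    """p_pump stands in for the kinetic parameters of the real model."""
    random.seed(seed)
    bits = [0] * n_bits              # all-zero stream carries maximal "fuel"
    pumped = 0
    for i in range(n_bits):
        if bits[i] == 0 and random.random() < p_pump:
            bits[i] = 1              # information is written (entropy rises)
            pumped += 1              # one molecule moved uphill
    return pumped / n_bits           # molecules pumped per incoming bit

print(f"pumping rate: {run_pump():.3f} molecules per bit")
```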
Boiler MACT Technical Assistance (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-03-01
Fact sheet describing changes to U.S. Environmental Protection Agency (EPA) process standards. The EPA is expected to finalize the reconsideration process for its Clean Air Act pollution standards, National Emissions Standards for Hazardous Air Pollutants for Major Sources: Industrial, Commercial, and Institutional Boilers and Process Heaters (known as Boiler Maximum Achievable Control Technology (MACT)), in Spring 2012. This rule applies to large and small boilers in a wide range of industrial facilities and institutions. The U.S. Department of Energy (DOE) will offer technical assistance to ensure that major sources burning coal or oil have information on cost-effective clean energy strategies for compliance, including combined heat and power, and to promote cleaner, more efficient boilers to cut harmful pollution and reduce operational costs.
When "Veps" Cry: Two-Year-Olds Efficiently Learn Novel Words from Linguistic Contexts Alone
ERIC Educational Resources Information Center
Ferguson, Brock; Graf, Eileen; Waxman, Sandra R.
2018-01-01
We assessed 24-month-old infants' lexical processing efficiency for both novel and familiar words. Prior work documented that 19-month-olds successfully identify referents of familiar words (e.g., The dog is so little) as well as novel words whose meanings were informed only by the surrounding sentence (e.g., The vep is crying), but that the speed…
E-loyalty towards a cancer information website: applying a theoretical framework.
Crutzen, Rik; Beekers, Nienke; van Eenbergen, Mies; Becker, Monique; Jongen, Lilian; van Osch, Liesbeth
2014-06-01
To provide more insight into user perceptions related to e-loyalty towards a cancer information website. This is needed to assure adequate provision of high-quality information during the full process of cancer treatment, from diagnosis to aftercare, and is an important first step towards optimizing cancer information websites in order to promote e-loyalty. Participants were cancer patients (n = 63) and informal caregivers (n = 202) who visited a website providing regional information about cancer care for all types of cancer. Subsequently, they filled out a questionnaire assessing e-loyalty towards the website and user perceptions (efficiency, effectiveness, active trust and enjoyment) based on a theoretical framework derived from the field of e-commerce. A structural equation model was constructed to test the relationships between user perceptions and e-loyalty. Participants in general could find the information they were looking for (efficiency), thought it was relevant (effectiveness), felt they could act upon it (active trust) and found the visit itself pleasant (enjoyment). Effectiveness and enjoyment were both positively related to e-loyalty, but these relations were mediated by active trust. Efficiency was positively related to e-loyalty. The explained variance of e-loyalty was high (R(2) = 0.70). This study demonstrates that the importance of user perceptions is not limited to fields such as e-commerce but extends to cancer information websites. The high information need among participants might explain the positive relationship between efficiency and e-loyalty. Cancer information websites therefore need to foster easy search and access of the information provided. Copyright © 2014 John Wiley & Sons, Ltd.
Changes in Brain Network Efficiency and Working Memory Performance in Aging
Stanley, Matthew L.; Simpson, Sean L.; Dagenbach, Dale; Lyday, Robert G.; Burdette, Jonathan H.; Laurienti, Paul J.
2015-01-01
Working memory is a complex psychological construct referring to the temporary storage and active processing of information. We used functional connectivity brain network metrics quantifying local and global efficiency of information transfer for predicting individual variability in working memory performance on an n-back task in both young (n = 14) and older (n = 15) adults. Individual differences in both local and global efficiency during the working memory task were significant predictors of working memory performance in addition to age (and an interaction between age and global efficiency). Decreases in local efficiency during the working memory task were associated with better working memory performance in both age cohorts. In contrast, increases in global efficiency were associated with much better working performance for young participants; however, increases in global efficiency were associated with a slight decrease in working memory performance for older participants. Individual differences in local and global efficiency during resting-state sessions were not significant predictors of working memory performance. Significant group whole-brain functional network decreases in local efficiency also were observed during the working memory task compared to rest, whereas no significant differences were observed in network global efficiency. These results are discussed in relation to recently developed models of age-related differences in working memory. PMID:25875001
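For readers unfamiliar with the two metrics, the sketch below computes global and local efficiency with NetworkX on a toy small-world graph; actual analyses of this kind build the graph from thresholded fMRI correlation matrices, which is not shown here.

```python
# Sketch of the two graph metrics used in the study, computed on a toy
# 90-node network standing in for a parcellated functional brain graph.
import networkx as nx

g = nx.watts_strogatz_graph(n=90, k=6, p=0.1, seed=42)  # toy brain network

print(f"global efficiency: {nx.global_efficiency(g):.3f}")  # integration
print(f"local efficiency:  {nx.local_efficiency(g):.3f}")   # segregation
```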
ERIC Educational Resources Information Center
Chase, Justin P.; Yan, Zheng
2017-01-01
The ability to effectively learn, process, and retain new information is critical to the success of any student. Since mathematics is becoming increasingly important in our educational systems, it is imperative that we devise an efficient system to measure these types of information recall. "Assessing and Measuring Statistics Cognition in…
Sroubek, Jakub; Krishnan, Yamini; McDonald, Thomas V.
2013-01-01
Human ether-à-go-go-related gene (HERG) encodes a potassium channel that is highly susceptible to deleterious mutations resulting in susceptibility to fatal cardiac arrhythmias. Most mutations adversely affect HERG channel assembly and trafficking. Why the channel is so vulnerable to missense mutations is not well understood. Since nothing is known of how mRNA structural elements factor in channel processing, we synthesized a codon-modified HERG cDNA (HERG-CM) where the codons were synonymously changed to reduce GC content, secondary structure, and rare codon usage. HERG-CM produced typical IKr-like currents; however, channel synthesis and processing were markedly different. Translation efficiency was reduced for HERG-CM, as determined by heterologous expression, in vitro translation, and polysomal profiling. Trafficking efficiency to the cell surface was greatly enhanced, as assayed by immunofluorescence, subcellular fractionation, and surface labeling. Chimeras of HERG-NT/CM indicated that trafficking efficiency was largely dependent on 5′ sequences, while translation efficiency involved multiple areas. These results suggest that HERG translation and trafficking rates are independently governed by noncoding information in various regions of the mRNA molecule. Noncoding information embedded within the mRNA may play a role in the pathogenesis of hereditary arrhythmia syndromes and could provide an avenue for targeted therapeutics.—Sroubek, J., Krishnan, Y., McDonald, T. V. Sequence- and structure-specific elements of HERG mRNA determine channel synthesis and trafficking efficiency. PMID:23608144
Comparing capacity coefficient and dual task assessment of visual multitasking workload
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaha, Leslie M.
Capacity coefficient analysis could offer a theoretically grounded alternative to subjective measures and dual-task assessment of cognitive workload. Workload capacity, or workload efficiency, is a human information processing modeling construct defined as the amount of information that can be processed by the visual cognitive system in a specified amount of time. In this paper, I explore the relationship between capacity coefficient analysis of workload efficiency and dual-task response time measures. To capture multitasking performance, I examine how the relatively simple assumptions underlying the capacity construct generalize beyond single visual decision-making tasks. The fundamental tools for measuring workload efficiency are the integrated hazard and reverse hazard functions of response times, which are defined by log transforms of the response time distribution. These functions are used in capacity coefficient analysis to provide a functional assessment of the amount of work completed by the cognitive system over the entire range of response times. For the study of visual multitasking, capacity coefficient analysis enables a comparison of visual information throughput as the number of tasks increases from one to two to any number of simultaneous tasks. I illustrate the use of capacity coefficients for visual multitasking on sample data from dynamic multitasking in the modified Multi-attribute Task Battery.
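A minimal sketch of the capacity coefficient for an OR design, assuming the standard form C(t) = H_AB(t) / (H_A(t) + H_B(t)) with integrated hazard H(t) = -log S(t) estimated from empirical response-time survival functions; the gamma-distributed response times are simulated stand-ins, not the paper's data.

```python
# Estimate integrated hazards from response times and form the OR-design
# capacity coefficient; C(t) < 1 indicates limited-capacity processing.
import numpy as np

def integrated_hazard(rts, t_grid):
    rts = np.sort(np.asarray(rts))
    survival = 1.0 - np.searchsorted(rts, t_grid, side="right") / len(rts)
    survival = np.clip(survival, 1e-12, 1.0)   # avoid log(0)
    return -np.log(survival)

rng = np.random.default_rng(0)
rt_a = rng.gamma(5.0, 60.0, 500)    # single-task A response times (ms)
rt_b = rng.gamma(5.0, 65.0, 500)    # single-task B
rt_ab = rng.gamma(5.0, 75.0, 500)   # dual-task (slower: workload cost)

t = np.linspace(250, 800, 50)
c_t = integrated_hazard(rt_ab, t) / (
    integrated_hazard(rt_a, t) + integrated_hazard(rt_b, t))
print(f"mean C(t) over the grid: {c_t.mean():.2f}")  # < 1 here
```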
Content-Based Discovery for Web Map Service using Support Vector Machine and User Relevance Feedback
Cheng, Xiaoqiang; Qi, Kunlun; Zheng, Jie; You, Lan; Wu, Huayi
2016-01-01
Many discovery methods for geographic information services have been proposed. There are approaches for finding and matching geographic information services, methods for constructing geographic information service classification schemes, and automatic geographic information discovery. Overall, the efficiency of geographic information discovery keeps improving. There are, however, still two problems in Web Map Service (WMS) discovery that must be solved. Mismatches between the graphic contents of a WMS and the semantic descriptions in the metadata make discovery difficult for human users. End-users and computers comprehend WMSs differently, creating semantic gaps in human-computer interactions. To address these problems, we propose an improved query process for WMSs based on the graphic contents of WMS layers, combining a Support Vector Machine (SVM) and user relevance feedback. Our experiments demonstrate that the proposed method can improve the accuracy and efficiency of WMS discovery. PMID:27861505
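A hedged sketch of the general recipe, under the assumption that WMS layer thumbnails have already been reduced to feature vectors: an SVM ranks layers by relevance, and user feedback is folded back in as additional labeled examples before refitting. The feature extraction step is elided for brevity.

```python
# SVM ranking with a simple relevance-feedback loop; features and labels
# are simulated placeholders for real WMS layer content descriptors.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
features = rng.random((200, 256))       # stand-in visual features of layers
labels = rng.integers(0, 2, 200)        # relevant / not relevant to a query

clf = SVC(kernel="rbf", probability=True).fit(features, labels)

# user marks two retrieved layers as relevant -> add as positives, refit
feedback_x, feedback_y = features[:2], np.ones(2, dtype=int)
clf = SVC(kernel="rbf", probability=True).fit(
    np.vstack([features, feedback_x]), np.concatenate([labels, feedback_y]))

scores = clf.predict_proba(features)[:, 1]   # re-ranked relevance scores
```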
Quantum tomography of near-unitary processes in high-dimensional quantum systems
NASA Astrophysics Data System (ADS)
Lysne, Nathan; Sosa Martinez, Hector; Jessen, Poul; Baldwin, Charles; Kalev, Amir; Deutsch, Ivan
2016-05-01
Quantum Tomography (QT) is often considered the ideal tool for experimental debugging of quantum devices, capable of delivering complete information about quantum states (QST) or processes (QPT). In practice, the protocols used for QT are resource intensive and scale poorly with system size. In this situation, a well-behaved model system with access to large state spaces (qudits) can serve as a useful platform for examining the tradeoffs between resource cost and accuracy inherent in QT. In past years we have developed one such experimental testbed, consisting of the electron-nuclear spins in the electronic ground state of individual Cs atoms. Our available toolkit includes high-fidelity state preparation, complete unitary control, arbitrary orthogonal measurements, and accurate and efficient QST in Hilbert space dimensions up to d = 16. Using these tools, we have recently completed a comprehensive study of QPT in 4, 7 and 16 dimensions. Our results show that QPT of near-unitary processes is quite feasible if one chooses optimal input states and efficient QST on the outputs. We further show that for unitary processes in high-dimensional spaces, one can use informationally incomplete QPT to achieve high-fidelity process reconstruction (90% in d = 16) with greatly reduced resource requirements.
Stress leads to aberrant hippocampal involvement when processing schema-related information.
Vogel, Susanne; Kluen, Lisa Marieke; Fernández, Guillén; Schwabe, Lars
2018-01-01
Prior knowledge, represented as a mental schema, has a critical impact on how we organize, interpret, and process incoming information. Recent findings indicate that the use of an existing schema is coordinated by the medial prefrontal cortex (mPFC), communicating with parietal areas. The hippocampus, however, is crucial for encoding schema-unrelated information but not for schema-related information. A recent study indicated that stress mediators may affect schema-related memory, but the underlying neural mechanisms are currently unknown. Here, we thus tested the impact of acute stress on the neural processing of schema-related information. We exposed healthy participants to a stress or control manipulation before they processed, in the MRI scanner, words related or unrelated to a preexisting schema activated by a specific cue. Participants' memory for the presented material was tested 3-5 d after encoding. Overall, the processing of schema-related information activated the mPFC, the precuneus, and the angular gyrus. Stress resulted in aberrant hippocampal activity and connectivity while participants processed schema-related information. This aberrant engagement of the hippocampus was linked to altered subsequent memory. These findings suggest that stress may interfere with the efficient use of prior knowledge during encoding and may have important practical implications, in particular for educational settings. © 2018 Vogel et al.; Published by Cold Spring Harbor Laboratory Press.
Control of coherent information via on-chip photonic-phononic emitter-receivers.
Shin, Heedeuk; Cox, Jonathan A; Jarecki, Robert; Starbuck, Andrew; Wang, Zheng; Rakich, Peter T
2015-03-05
Rapid progress in integrated photonics has fostered numerous chip-scale sensing, computing and signal processing technologies. However, many crucial filtering and signal delay operations are difficult to perform with all-optical devices. Unlike photons propagating at luminal speeds, GHz acoustic phonons moving at slower velocities allow information to be stored, filtered and delayed over comparatively smaller length scales with remarkable fidelity. Hence, controllable and efficient coupling between coherent photons and phonons enables new signal processing technologies that greatly enhance the performance and potential impact of integrated photonics. Here we demonstrate a mechanism for coherent information processing based on travelling-wave photon-phonon transduction, which achieves a phonon emit-and-receive process between distinct nanophotonic waveguides. Using this device physics, which supports GHz frequencies, we create wavelength-insensitive radiofrequency photonic filters with high frequency selectivity, narrow linewidth and high power handling in silicon. More generally, this emit-and-receive concept is the impetus for new signal processing schemes.
2013-01-01
Background Next generation sequencing technologies have greatly advanced many research areas of the biomedical sciences through their capability to generate massive amounts of genetic information at unprecedented rates. The advent of next generation sequencing has led to the development of numerous computational tools to analyze and assemble the millions to billions of short sequencing reads produced by these technologies. While these tools filled an important gap, current approaches for storing, processing, and analyzing short read datasets generally have remained simple and lack the complexity needed to efficiently model the produced reads and assemble them correctly. Results Previously, we presented an overlap graph coarsening scheme for modeling read overlap relationships on multiple levels. Most current read assembly and analysis approaches use a single graph or set of clusters to represent the relationships among a read dataset. Instead, we use a series of graphs to represent the reads and their overlap relationships across a spectrum of information granularity. At each information level our algorithm is capable of generating clusters of reads from the reduced graph, forming an integrated graph modeling and clustering approach for read analysis and assembly. Previously we applied our algorithm to simulated and real 454 datasets to assess its ability to efficiently model and cluster next generation sequencing data. In this paper we extend our algorithm to large simulated and real Illumina datasets to demonstrate that our algorithm is practical for both sequencing technologies. Conclusions Our overlap graph theoretic algorithm is able to model next generation sequencing reads at various levels of granularity through the process of graph coarsening. Additionally, our model allows for efficient representation of the read overlap relationships, is scalable for large datasets, and is practical for both Illumina and 454 sequencing technologies. PMID:24564333
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-18
... process for gathering the essential post-burn activity information to support emissions inventory and... considers visibility and is based on the criteria of efficiency, economics, law, emission reduction...
Women process multisensory emotion expressions more efficiently than men.
Collignon, O; Girard, S; Gosselin, F; Saint-Amour, D; Lepore, F; Lassonde, M
2010-01-01
Despite claims in the popular press, experiments investigating whether female observers are more efficient than male observers at processing expressions of emotion have produced inconsistent findings. In the present study, participants were asked to categorize fear and disgust expressions displayed auditorily, visually, or audio-visually. Results revealed an advantage for women in all conditions of stimulus presentation. We also observed more nonlinear probabilistic summation in the bimodal conditions in female than in male observers, indicating greater neural integration of different sensory-emotional information. These findings indicate robust differences between genders in the multisensory perception of emotion expression.
Coordinating patient care within radiology and across the enterprise.
McEnery, Kevin W
2014-12-01
For the practice of radiology, the transition to filmless imaging operations has resulted in a fundamental shift to more efficient clinical operations. In addition, the electronic delivery of diagnostic studies to the bedside has had a great impact on the care process throughout the health care enterprise. The radiology information system (RIS) has been at the core of the transition to filmless patient care. In a similar manner, the electronic medical record (EMR) is fundamentally and rapidly transforming the clinical enterprise into paperless, digital coordination of care. The widespread availability of EMR systems can be expected to further increase the level of coordination of clinical care within the EMR framework. For the radiologist, readily available clinical information at the point of interpretation will continue to drive the evolution of the interpretation process, leading to improved patient outcomes. Regardless of practice size, efficient workflow processes are required to best leverage the functionality of IT systems. The radiologist should be aware of the scope of RIS capabilities that allow for maximizing clinical benefit, and of EMR system capabilities for improving clinical imaging practice and care coordination across the enterprise. Radiology departments should be actively involved in forming practice patterns that allow efficient EMR-based clinical practice. This summary article is intended to assist radiologists in becoming active participants in the evolving role of both RIS and EMR systems in coordinating efficient and effective delivery of care across the clinical enterprise. Copyright © 2014 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Iterative filtering decomposition based on local spectral evolution kernel
Wang, Yang; Wei, Guo-Wei; Yang, Siyang
2011-01-01
Synthesizing information, achieving understanding, and deriving insight from increasingly massive, time-varying, noisy and possibly conflicting data sets are some of the most challenging tasks of the present information age. Traditional technologies, such as the Fourier transform and wavelet multi-resolution analysis, are inadequate to handle all of the above-mentioned tasks. Empirical mode decomposition (EMD) has emerged as a powerful new tool for resolving many challenging problems in data processing and analysis. Recently, an iterative filtering decomposition (IFD) has been introduced to address the stability and efficiency problems of the EMD. Another data analysis technique is the local spectral evolution kernel (LSEK), which provides a near-perfect low-pass filter with desirable time-frequency localizations. The present work utilizes the LSEK to further stabilize the IFD, and offers an efficient, flexible and robust scheme for information extraction, complexity reduction, and signal and image understanding. The performance of the present LSEK-based IFD is intensively validated over a wide range of data processing tasks, including mode decomposition, analysis of time-varying data, and information extraction from nonlinear dynamic systems. The utility, robustness and usefulness of the proposed LSEK-based IFD are demonstrated via a large number of applications, such as the analysis of stock market data, the decomposition of ocean wave magnitudes, the understanding of physiologic signals and information recovery from noisy images. The performance of the proposed method is compared with that of existing methods in the literature. Our results indicate that the LSEK-based IFD improves both the efficiency and the stability of conventional EMD algorithms. PMID:22350559
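A minimal sketch of the iterative filtering idea, assuming a plain moving-average low-pass mask where the paper substitutes the spectrally localized LSEK: an intrinsic mode is peeled off by repeatedly subtracting the low-pass component, and the residual carries the slower oscillations.

```python
# Iterative filtering on a two-tone signal; the moving-average mask is an
# assumed stand-in for the LSEK, and the iteration count is fixed for brevity.
import numpy as np

def low_pass(x, width=21):
    kernel = np.ones(width) / width
    return np.convolve(x, kernel, mode="same")

def extract_imf(x, iters=30):
    for _ in range(iters):
        x = x - low_pass(x)      # peel off the slow component
    return x

t = np.linspace(0, 1, 1000)
signal = np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 5 * t)

imf1 = extract_imf(signal)       # approximately the fast oscillation
residual = signal - imf1         # approximately the slow oscillation
```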
Age Differences in the Effects of Domain Knowledge on Reading Efficiency
Miller, Lisa M. Soederberg
2009-01-01
The present study investigated age differences in the effects of knowledge on the efficiency with which information is processed while reading. Individuals between 18 and 85 years of age, with varying levels of cooking knowledge, read and recalled a series of short passages within the domain of cooking. Reading efficiency was operationalized as time spent reading divided by the amount recalled for each passage. Results showed that reading efficiency increased with increasing levels of knowledge among older but not younger adults. Similarly, those with smaller working memory capacities showed increasing efficiency with increasing knowledge. These findings suggest that knowledge promotes a more efficient allocation policy which is particularly helpful in later life, perhaps due to age-related declines in working memory capacity. PMID:19290738
NASA Technical Reports Server (NTRS)
Zernic, Michael J.
2002-01-01
Broadband satellite communications for aeronautics marries communication and network technologies to address NASA's goals in information technology base research and development, thereby serving the safety and capacity needs of the National Airspace System. This marriage of technology increases the interactivity between airborne vehicles and ground systems. It improves decision-making and efficiency, reduces operation costs, and improves the safety and capacity of the National Airspace System. To this end, a collaborative project called the Aeronautical Satellite Assisted Process for Information Exchange through Network Technologies, or Aero-SAPIENT, was conducted out of Tinker AFB, Oklahoma, during November and December 2000.
RAPID Toolkit Creates Smooth Flow Toward New Projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levine, Aaron; Young, Katherine
2016-07-01
Uncertainty about the duration and outcome of the permitting process has historically been seen as a deterrent to investment in renewable energy projects, including new hydropower projects. What if the process were clearer, smoother, faster? That's the purpose of the Regulatory and Permitting Information Desktop (RAPID) Toolkit, developed by the National Renewable Energy Laboratory (NREL) with funding from the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy and the Western Governors' Association. Now, the RAPID Toolkit is being expanded to include information about developing and permitting hydropower projects, with initial outreach and information gathering occurring during 2015.
Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong
2015-02-01
Integration of heterogeneous systems is key to hospital information construction due to the complexity of the healthcare environment. Currently, during the process of healthcare information system integration, the people participating in an integration project usually communicate via free-format documents, which impairs the efficiency and adaptability of integration. This paper proposes a method that utilizes Business Process Model and Notation (BPMN) to model integration requirements and automatically transform them into an executable integration configuration. Based on the method, a tool was developed to model integration requirements and transform them into an integration configuration. In addition, an integration case in a radiology scenario was used to verify the method.
NASA Astrophysics Data System (ADS)
Okawa, Tsutomu; Kaminishi, Tsukasa; Kojima, Yoshiyuki; Hirabayashi, Syuichi; Koizumi, Hisao
Business process modeling (BPM) is gaining attention as a means of analyzing and improving business processes. BPM analyzes the current business process as an AS-IS model and solves problems to improve the current business; moreover, it aims to create a business process that produces value, as a TO-BE model. However, research on techniques that seamlessly connect the business process improvements obtained by BPM to the implementation of the information system is rarely reported. If the business model obtained by BPM is converted into UML and the implementation is carried out with UML techniques, we can expect improved efficiency in information system implementation. In this paper, we describe a system development method that converts the process model obtained by BPM into UML, and we evaluate the method by modeling a prototype of a parts procurement system. In the evaluation, we compare it with the case where the system is implemented by conventional UML techniques without going via BPM.
den Heeten, G J; Barneveld Binkhuysen, F H
2001-08-25
Determining the rate at which radiology must be digitalised has been a controversial issue for many years. Much radiological information is still obtained from the film-screen combination (X-rays), with all of its known inherent restrictions. The importance of imaging information in the healthcare process continues to increase for both radiologists and referring physicians, and the ongoing developments in information technology mean that it is possible to integrate imaging information and electronic patient files. The healthcare process can only become more effective and efficient when the appropriate information is in the right place at the right time, something that conventional methods, using photos that need to be physically moved, can scarcely satisfy. There is also a desire for integration with information obtained from nuclear medicine, pathology and endoscopy, and eventually of all stand-alone data systems relevant to individually oriented hospital healthcare. The transition from a conventional to a digital process is complex; it is accompanied by the transition from a data-oriented to a process-oriented system. Many years have already been invested in the integration of information systems and the development of digital systems within radiology, the current performance of which is such that many hospitals are considering the digitalisation process or are already implementing parts of it.
Integrating policy-based management and SLA performance monitoring
NASA Astrophysics Data System (ADS)
Liu, Tzong-Jye; Lin, Chin-Yi; Chang, Shu-Hsin; Yen, Meng-Tzu
2001-10-01
Policy-based management systems provide configuration capabilities that let system administrators focus on the requirements of customers. The service-level agreement (SLA) performance monitoring mechanism helps system administrators verify the correctness of policies. However, it is difficult for a device to process policies directly because policies are management-level concepts. This paper proposes a mechanism to decompose a policy into rules that can be efficiently processed by a device. The device can then process the rules and collect performance statistics efficiently, and the policy-based management system can collect these statistics and report SLA performance monitoring information to the system administrator. The proposed policy-based management system thus meets both the policy configuration and the SLA performance monitoring requirements.

A policy consists of a condition part and an action part. The condition part is a Boolean expression over a source host IP group, a destination host IP group, etc.; the action part holds the parameters of services. We say that an address group is compact if it consists of a range of IP addresses that can be denoted by a pair of an IP address and a corresponding IP mask. If the condition part of a policy consists only of compact address groups, we say that the policy is a rule. Since a device can efficiently process a compact address while a system administrator prefers to define a range of IP addresses, the policy-based management system has to translate policies into rules and bridge the gaps between them.

The proposed system builds the relationships between VPNs and policies, and between policies and rules. Since the system administrator wants to monitor the performance information of VPNs and policies, the system downloads the relationships among VPNs, policies and rules to the SNMP agents. The SNMP agents build the management information base (MIB) of all VPNs, policies and rules according to the relationships obtained from the management server, so the management system can retrieve all performance monitoring information of VPNs and policies from the agents.

The proposed policy-based manager achieves two goals: (a) it provides a management environment in which the system administrator configures the network considering only policy requirements, and (b) it lets the device simply process packets and collect the required performance information, thereby satisfying both user and device requirements.
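The policy-to-rule decomposition described here corresponds directly to splitting an arbitrary IP address range into compact blocks, each expressible as an address plus mask. Python's standard ipaddress module performs exactly this summarization, as the short sketch below shows; the sample range is illustrative.

```python
# Decompose a policy's IP range into "compact" blocks (address + mask),
# i.e. the device-processable rules described in the paper.
import ipaddress

policy_range = (ipaddress.IPv4Address("10.0.0.3"),
                ipaddress.IPv4Address("10.0.0.40"))

rules = list(ipaddress.summarize_address_range(*policy_range))
for net in rules:
    print(net)   # 10.0.0.3/32, 10.0.0.4/30, 10.0.0.8/29, ...
```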
Symbolic Knowledge Processing for the Acquisition of Expert Behavior: A Study in Medicine.
1984-05-01
…information. It provides a model for this type of study, suggesting a different approach to the problem of learning and efficiency of knowledge-based… flow of information. 2.2. Scope and description of the subsystems: three subsystems perform distinct operations using the preceding knowledge sources… which actually yields a new knowledge representation where new external information is encoded in the combination and ordering of elements of the…
IRB Process Improvements: A Machine Learning Analysis.
Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A
2017-06-01
Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single-variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on initially identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process, including the type of IRB review to be conducted, whether a protocol falls under Veterans Administration purview, and the specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
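The specific models are not described in the abstract; a minimal sketch of the general approach, predicting processing time from protocol characteristics with a regression forest (feature names and data are hypothetical, not the authors' code):

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Hypothetical protocol records: review type, VA purview, assigned staff.
data = pd.DataFrame({
    "review_type": ["full_board", "expedited", "exempt", "full_board"],
    "va_purview":  [1, 0, 0, 1],
    "staff_id":    ["A", "B", "A", "C"],
    "days_to_approval": [92, 30, 14, 120],   # outcome to predict
})
X = pd.get_dummies(data.drop(columns="days_to_approval"))
y = data["days_to_approval"]

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
# Feature importances hint at which characteristics drive delays.
print(sorted(zip(model.feature_importances_, X.columns), reverse=True))
```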
The Development of Word Processing and Its Implications for the Business Education Profession.
ERIC Educational Resources Information Center
Ober, B. Scot
In an attempt to deal with the paperwork explosion occurring in business offices, administrative management has developed the concept of word processing as a means of increasing office efficiency. Thus, the purpose of this study was to provide business educators with information on this new management tool and to identify those skills needed by…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knittel, Christopher; Wolfran, Catherine; Gandhi, Raina
A wide range of climate plans rely on energy efficiency to generate energy and carbon emissions reductions, but conventional wisdom holds that consumers have historically underinvested in energy efficiency upgrades. This underinvestment may occur for a variety of reasons, one of which is that consumers are not adequately informed about the benefits of energy efficiency. To address this, the U.S. Department of Energy created a tool called the Home Energy Score (HEScore) to act as a simple, low-cost means to provide clear information about a home's energy efficiency and motivate homeowners and homebuyers to invest in energy efficiency. The Department of Energy is in the process of conducting four evaluations assessing the impact of the Home Energy Score on residential energy efficiency investments and program participation. This paper describes one of these evaluations: a randomized controlled trial conducted in New Jersey in partnership with New Jersey Natural Gas. The evaluation randomly provides the Home Energy Score to homeowners who received an audit between May 2014 and October 2015, either because they had recently replaced their furnace, boiler, and/or gas water heater with a high-efficiency model and participated in a free audit to access an incentive, or because they requested an independent audit.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ji Zhengfeng; Feng Yuan; Ying Mingsheng
Local quantum operations and classical communication (LOCC) put considerable constraints on many quantum information processing tasks such as cloning and discrimination. Surprisingly, however, discrimination of any two pure states survives such constraints in some sense. We show that cloning is not that lucky; namely, probabilistic LOCC cloning of two product states is strictly less efficient than global cloning. We prove our result by giving explicitly the efficiency formula of local cloning of any two product states.
A GPU-Accelerated Approach for Feature Tracking in Time-Varying Imagery Datasets.
Peng, Chao; Sahani, Sandip; Rushing, John
2017-10-01
We propose a novel parallel connected component labeling (CCL) algorithm along with efficient out-of-core data management to detect and track feature regions of large time-varying imagery datasets. Our approach contributes to the big data field with parallel algorithms tailored for GPU architectures. We remove the data dependency between frames and achieve pixel-level parallelism. Due to the large size, the entire dataset cannot fit into cached memory. Frames have to be streamed through the memory hierarchy (disk to CPU main memory and then to GPU memory), partitioned, and processed as batches, where each batch is small enough to fit into the GPU. To reconnect the feature regions that are separated due to data partitioning, we present a novel batch merging algorithm to extract the region connection information across multiple batches in a parallel fashion. The information is organized in a memory-efficient structure and supports fast indexing on the GPU. Our experiment uses a commodity workstation equipped with a single GPU. The results show that our approach can efficiently process a weather dataset composed of terabytes of time-varying radar images. The advantages of our approach are demonstrated by comparing to the performance of an efficient CPU cluster implementation which is being used by the weather scientists.
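A CPU-side sketch of the batch-merging idea (the paper's implementation is GPU-parallel): labels assigned independently within two batches are reconnected with a union-find wherever foreground pixels touch across the shared boundary row.

```python
import numpy as np

parent = {}
def find(x):
    while parent.setdefault(x, x) != x:
        parent[x] = parent[parent[x]]   # path halving keeps trees shallow
        x = parent[x]
    return x
def union(a, b):
    parent[find(a)] = find(b)

def merge_batches(last_row_labels, first_row_labels):
    """last_row_labels: labels of the final row of batch k (0 = background);
    first_row_labels: labels of the first row of batch k+1."""
    for a, b in zip(last_row_labels, first_row_labels):
        if a and b:                      # touching foreground pixels
            union(("k", a), ("k+1", b))  # same physical feature region

merge_batches(np.array([0, 3, 3, 0, 7]), np.array([0, 1, 1, 1, 0]))
print(find(("k", 3)) == find(("k+1", 1)))   # True: one region spans batches
```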
System analysis through bond graph modeling
NASA Astrophysics Data System (ADS)
McBride, Robert Thomas
2005-07-01
Modeling and simulation form an integral role in the engineering design process. An accurate mathematical description of a system provides the design engineer the flexibility to perform trade studies quickly and accurately to expedite the design process. Most often, the mathematical model of the system contains components of different engineering disciplines. A modeling methodology that can handle these types of systems might be used in an indirect fashion to extract added information from the model. This research examines the ability of a modeling methodology to provide added insight into system analysis and design. The modeling methodology used is bond graph modeling. An investigation into the creation of a bond graph model using the Lagrangian of the system is provided. Upon creation of the bond graph, system analysis is performed. To aid in the system analysis, an object-oriented approach to bond graph modeling is introduced. A framework is provided to simulate the bond graph directly. Through object-oriented simulation of a bond graph, the information contained within the bond graph can be exploited to create a measurement of system efficiency. A definition of system efficiency is given. This measurement of efficiency is used in the design of different controllers of varying architectures. Optimal control of a missile autopilot is discussed within the framework of the calculated system efficiency.
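The dissertation's precise efficiency measure is not given in the abstract; as an illustrative sketch of the underlying bond-graph convention, every bond carries a power flow equal to effort times flow (voltage times current, force times velocity), from which a crude input/output efficiency can be computed:

```python
# Toy sketch only; the dissertation's actual efficiency definition may differ.
class Bond:
    def __init__(self, effort, flow):
        self.effort, self.flow = effort, flow
    @property
    def power(self):
        return self.effort * self.flow   # power carried by the bond

source = Bond(effort=24.0, flow=2.0)    # e.g. a 24 V supply at 2 A
load   = Bond(effort=20.0, flow=2.0)    # effort dropped across losses
print("efficiency:", load.power / source.power)   # 0.833...
```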
Info/information theory: speakers choose shorter words in predictive contexts.
Mahowald, Kyle; Fedorenko, Evelina; Piantadosi, Steven T; Gibson, Edward
2013-02-01
A major open question in natural language research is the role of communicative efficiency in the origin and on-line processing of language structures. Here, we use word pairs like chimp/chimpanzee, which differ in length but have nearly identical meanings, to investigate the communicative properties of lexical systems and the communicative pressures on language users. If language is designed to be information-theoretically optimal, then shorter words should convey less information than their longer counterparts, when controlling for meaning. Consistent with this prediction, a corpus analysis revealed that the short form of our meaning-matched pairs occurs in more predictive contexts than the longer form. Second, a behavioral study showed that language users choose the short form more often in predictive contexts, suggesting that tendencies to be information-theoretically efficient manifest in explicit behavioral choices. Our findings, which demonstrate the prominent role of communicative efficiency in the structure of the lexicon, complement and extend the results of Piantadosi, Tily, and Gibson (2011), who showed that word length is better correlated with Shannon information content than with frequency. Crucially, we show that this effect arises at least in part from active speaker choice. Copyright © 2012 Elsevier B.V. All rights reserved.
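As an illustrative sketch (not the authors' pipeline, and with a toy corpus), contextual predictability can be approximated by bigram probabilities and compared across the contexts in which each member of a meaning-matched pair occurs:

```python
from collections import Counter

corpus = ("yesterday the chimp ate ; later the chimp slept ; "
          "then the chimpanzee and the gorilla played").split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)

def predictability(prev_word, word):
    """P(word | previous word), estimated from bigram counts."""
    return bigrams[(prev_word, word)] / unigrams[prev_word]

# Compare the contexts preceding the short form vs. the long form.
for i, w in enumerate(corpus[1:], start=1):
    if w in ("chimp", "chimpanzee"):
        print(w, "after", corpus[i - 1], "->", predictability(corpus[i - 1], w))
```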
Optimal Learning Paths in Information Networks
Rodi, G. C.; Loreto, V.; Servedio, V. D. P.; Tria, F.
2015-01-01
Each sphere of knowledge and information could be depicted as a complex mesh of correlated items. By properly exploiting these connections, innovative and more efficient navigation strategies could be defined, possibly leading to a faster learning process and an enduring retention of information. In this work we investigate how the topological structure embedding the items to be learned can affect the efficiency of the learning dynamics. To this end we introduce a general class of algorithms that simulate the exploration of knowledge/information networks standing on well-established findings on educational scheduling, namely the spacing and lag effects. While constructing their learning schedules, individuals move along connections, periodically revisiting some concepts, and sometimes jumping on very distant ones. In order to investigate the effect of networked information structures on the proposed learning dynamics we focused both on synthetic and real-world graphs such as subsections of Wikipedia and word-association graphs. We highlight the existence of optimal topological structures for the simulated learning dynamics whose efficiency is affected by the balance between hubs and the least connected items. Interestingly, the real-world graphs we considered lead naturally to almost optimal learning performances. PMID:26030508
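A minimal sketch of the kind of exploration dynamics described (all parameters hypothetical): a walker on a knowledge graph mostly follows edges, periodically revisits already-seen items, and occasionally jumps to a distant node.

```python
import random
import networkx as nx

G = nx.barabasi_albert_graph(200, 2, seed=1)   # synthetic knowledge network
pos, visited, schedule = 0, {0}, [0]
random.seed(1)
for _ in range(500):
    r = random.random()
    if r < 0.15 and len(visited) > 1:          # periodic revisit (spacing)
        pos = random.choice(sorted(visited))
    elif r < 0.20:                             # rare long-range jump
        pos = random.randrange(G.number_of_nodes())
    else:                                      # local move along an edge
        pos = random.choice(list(G.neighbors(pos)))
    visited.add(pos)
    schedule.append(pos)
print(f"coverage after 500 steps: {len(visited) / G.number_of_nodes():.0%}")
```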
Energy-Saving Opportunities for Manufacturing Enterprises (International English Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This fact sheet provides information about the Industrial Technologies Program Save Energy Now energy audit process, software tools, training, energy management standards, and energy efficient technologies to help U.S. companies identify energy cost savings.
Optical implementation of the synthetic discriminant function
NASA Astrophysics Data System (ADS)
Butler, S.; Riggins, J.
1984-10-01
Much attention is focused on the use of coherent optical pattern recognition (OPR) using matched spatial filters for robotics and intelligent systems. The OPR problem consists of three aspects -- information input, information processing, and information output. This paper discusses the information processing aspect which consists of choosing a filter to provide robust correlation with high efficiency. The filter should ideally be invariant to image shift, rotation and scale, provide a reasonable signal-to-noise (S/N) ratio and allow high throughput efficiency. The physical implementation of a spatial matched filter involves many choices. These include the use of conventional holograms or computer-generated holograms (CGH) and utilizing absorption or phase materials. Conventional holograms inherently modify the reference image by non-uniform emphasis of spatial frequencies. Proper use of film nonlinearity provides improved filter performance by emphasizing frequency ranges crucial to target discrimination. In the case of a CGH, the emphasis of the reference magnitude and phase can be controlled independently of the continuous tone or binary writing processes. This paper describes computer simulation and optical implementation of a geometrical shape and a Synthetic Discriminant Function (SDF) matched filter. The authors chose the binary Allebach-Keegan (AK) CGH algorithm to produce actual filters. The performances of these filters were measured to verify the simulation results. This paper provides a brief summary of the matched filter theory, the SDF, CGH algorithms, Phase-Only-Filtering, simulation procedures, and results.
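For concreteness, a numerical sketch of the two frequency-plane filters discussed above, a classical matched filter H = F* and a phase-only filter H = F*/|F|, applied via FFT correlation (numpy only; the optical CGH implementation is not modeled):

```python
import numpy as np

ref = np.zeros((64, 64)); ref[20:40, 25:35] = 1.0       # reference shape
scene = np.zeros((64, 64)); scene[5:25, 40:50] = 1.0    # shifted copy

F = np.fft.fft2(ref)
S = np.fft.fft2(scene)
matched    = np.fft.ifft2(S * np.conj(F))                # H = F*
phase_only = np.fft.ifft2(S * np.conj(F) / (np.abs(F) + 1e-12))  # H = F*/|F|

# The correlation peak location recovers the target's shift; the phase-only
# filter gives a sharper, more discriminating peak and higher light
# throughput than the classical matched filter, at some cost in noise
# robustness.
print(np.unravel_index(np.abs(matched).argmax(), matched.shape))
print(np.unravel_index(np.abs(phase_only).argmax(), phase_only.shape))
```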
Quantum communication and information processing
NASA Astrophysics Data System (ADS)
Beals, Travis Roland
Quantum computers enable dramatically more efficient algorithms for solving certain classes of computational problems, but, in doing so, they create new problems. In particular, Shor's Algorithm allows for efficient cryptanalysis of many public-key cryptosystems. As public key cryptography is a critical component of present-day electronic commerce, it is crucial that a working, secure replacement be found. Quantum key distribution (QKD), first developed by C.H. Bennett and G. Brassard, offers a partial solution, but many challenges remain, both in terms of hardware limitations and in designing cryptographic protocols for a viable large-scale quantum communication infrastructure. In Part I, I investigate optical lattice-based approaches to quantum information processing. I look at details of a proposal for an optical lattice-based quantum computer, which could potentially be used for both quantum communications and for more sophisticated quantum information processing. In Part III, I propose a method for converting and storing photonic quantum bits in the internal state of periodically-spaced neutral atoms by generating and manipulating a photonic band gap and associated defect states. In Part II, I present a cryptographic protocol which allows for the extension of present-day QKD networks over much longer distances without the development of new hardware. I also present a second, related protocol which effectively solves the authentication problem faced by a large QKD network, thus making QKD a viable, information-theoretic secure replacement for public key cryptosystems.
Efficient coding explains the universal law of generalization in human perception.
Sims, Chris R
2018-05-11
Perceptual generalization and discrimination are fundamental cognitive abilities. For example, if a bird eats a poisonous butterfly, it will learn to avoid preying on that species again by generalizing its past experience to new perceptual stimuli. In cognitive science, the "universal law of generalization" seeks to explain this ability and states that generalization between stimuli will follow an exponential function of their distance in "psychological space." Here, I challenge existing theoretical explanations for the universal law and offer an alternative account based on the principle of efficient coding. I show that the universal law emerges inevitably from any information processing system (whether biological or artificial) that minimizes the cost of perceptual error subject to constraints on the ability to process or transmit information. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.
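For reference, the exponential form of the universal law discussed above, and the efficient-coding objective from which the abstract says it emerges (the perceptual cost function is left abstract; the constraint is an information-rate bound C):

```latex
g(x, y) = e^{-\lambda\, d(x, y)}, \qquad
\min_{p(\hat{x}\mid x)} \; \mathbb{E}\!\left[\,\mathrm{cost}(x, \hat{x})\,\right]
\quad \text{subject to} \quad I(X; \hat{X}) \le C
```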
Scalable ion-photon quantum interface based on integrated diffractive mirrors
NASA Astrophysics Data System (ADS)
Ghadimi, Moji; Blūms, Valdis; Norton, Benjamin G.; Fisher, Paul M.; Connell, Steven C.; Amini, Jason M.; Volin, Curtis; Hayden, Harley; Pai, Chien-Shing; Kielpinski, David; Lobino, Mirko; Streed, Erik W.
2017-12-01
Quantum networking links quantum processors through remote entanglement for distributed quantum information processing and secure long-range communication. Trapped ions are a leading quantum information processing platform, having demonstrated universal small-scale processors and roadmaps for large-scale implementation. Overall rates of ion-photon entanglement generation, essential for remote trapped ion entanglement, are limited by coupling efficiency into single mode fibers and scaling to many ions. Here, we show a microfabricated trap with integrated diffractive mirrors that couples 4.1(6)% of the fluorescence from a 174Yb+ ion into a single mode fiber, nearly triple the demonstrated bulk optics efficiency. The integrated optic collects 5.8(8)% of the π transition fluorescence, images the ion with sub-wavelength resolution, and couples 71(5)% of the collected light into the fiber. Our technology is suitable for entangling multiple ions in parallel and overcomes mode quality limitations of existing integrated optical interconnects.
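As a consistency check on the quoted figures, the overall fiber-coupled fraction is the product of the collection efficiency and the fiber-coupling efficiency:

```latex
\eta_{\mathrm{total}} = \eta_{\mathrm{collect}} \times \eta_{\mathrm{fiber}}
\approx 0.058 \times 0.71 \approx 0.041
```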
Query-Based Outlier Detection in Heterogeneous Information Networks.
Kuck, Jonathan; Zhuang, Honglei; Yan, Xifeng; Cam, Hasan; Han, Jiawei
2015-03-01
Outlier or anomaly detection in large data sets is a fundamental task in data science, with broad applications. However, in real data sets with high-dimensional space, most outliers are hidden in certain dimensional combinations and are relative to a user's search space and interest. It is often more effective to give power to users and allow them to specify outlier queries flexibly, and the system will then process such mining queries efficiently. In this study, we introduce the concept of query-based outlier in heterogeneous information networks, design a query language to facilitate users to specify such queries flexibly, define a good outlier measure in heterogeneous networks, and study how to process outlier queries efficiently in large data sets. Our experiments on real data sets show that following such a methodology, interesting outliers can be defined and uncovered flexibly and effectively in large heterogeneous networks.
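The paper's query language and outlier measure are network-specific, but the core idea, outlierness relative to a user-chosen combination of dimensions, can be illustrated with a simple z-score query over a numeric table (illustrative substitute, not the paper's measure):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(1000, 5))
data[7, [1, 3]] = [6.0, -5.5]          # outlier hidden in dimensions (1, 3)

def outlier_query(X, dims, k=4.0):
    """Return row indices outlying with respect to user-specified dimensions."""
    sub = X[:, dims]
    z = (sub - sub.mean(axis=0)) / sub.std(axis=0)
    return np.where(np.abs(z).max(axis=1) > k)[0]

print(outlier_query(data, dims=[1, 3]))   # finds row 7
print(outlier_query(data, dims=[0]))      # typically empty: query-dependent
```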
Grover, Ginni; DeLuca, Keith; Quirin, Sean; DeLuca, Jennifer; Piestun, Rafael
2012-01-01
Super-resolution imaging with photo-activatable or photo-switchable probes is a promising tool in biological applications to reveal previously unresolved intra-cellular details with visible light. This field benefits from developments in the areas of molecular probes, optical systems, and computational post-processing of the data. The joint design of optics and reconstruction processes using double-helix point spread functions (DH-PSF) provides high resolution three-dimensional (3D) imaging over a long depth-of-field. We demonstrate for the first time a method integrating a Fisher information efficient DH-PSF design, a surface relief optical phase mask, and an optimal 3D localization estimator. 3D super-resolution imaging using photo-switchable dyes reveals the 3D microtubule network in mammalian cells with localization precision approaching the information theoretical limit over a depth of 1.2 µm. PMID:23187521
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasanbeigi, Ali; Price, Lynn; Lin, Elina
2012-04-06
Globally, the cement industry accounts for approximately 5 percent of current anthropogenic carbon dioxide (CO₂) emissions. World cement demand and production are increasing significantly, leading to an increase in this industry's absolute energy use and CO₂ emissions. Development of new energy-efficiency and CO₂ emission-reduction technologies and their deployment in the market will be key for the cement industry's mid- and long-term climate change mitigation strategies. This report is an initial effort to compile available information on process description, energy savings, environmental and other benefits, costs, commercialization status, and references for emerging technologies to reduce the cement industry's energy use and CO₂ emissions. Although studies from around the world identify a variety of sector-specific and cross-cutting energy-efficiency technologies for the cement industry that have already been commercialized, information is scarce and/or scattered regarding emerging or advanced energy-efficiency and low-carbon technologies that are not yet commercialized. This report consolidates available information on nineteen emerging technologies for the cement industry, with the goal of providing engineers, researchers, investors, cement companies, policy makers, and other interested parties with easy access to a well-structured database of information on these technologies.
Knowledge sifters in MDA technologies
NASA Astrophysics Data System (ADS)
Kravchenko, Yuri; Kursitys, Ilona; Bova, Victoria
2018-05-01
The article considers a new approach to efficient management of information processes on the basis of object models. With the help of special design tools, a generic, application-independent model of the application is created, and the program is then implemented in a specific development environment. The development process is based entirely on the model, which must therefore contain all the information necessary for programming. The presence of a detailed model enables the automatic creation of the typical parts of the application whose development is amenable to automation.
A Graph Based Interface for Representing Volume Visualization Results
NASA Technical Reports Server (NTRS)
Patten, James M.; Ma, Kwan-Liu
1998-01-01
This paper discusses a graph based user interface for representing the results of the volume visualization process. As images are rendered, they are connected to other images in a graph based on their rendering parameters. The user can take advantage of the information in this graph to understand how certain rendering parameter changes affect a dataset, making the visualization process more efficient. Because the graph contains more information than is contained in an unstructured history of images, the image graph is also helpful for collaborative visualization and animation.
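A minimal sketch of the image-graph bookkeeping described above (names hypothetical): each rendered image becomes a node keyed by its rendering parameters, with an edge recording which parameters changed relative to the image it was derived from.

```python
import networkx as nx

graph = nx.DiGraph()

def render_and_record(params, derived_from=None):
    node = tuple(sorted(params.items()))      # parameters identify the image
    graph.add_node(node, thumbnail=f"render({params})")  # stand-in for pixels
    if derived_from is not None:
        changed = {k for k in params if params[k] != dict(derived_from)[k]}
        graph.add_edge(derived_from, node, changed=changed)
    return node

a = render_and_record({"opacity": 0.2, "colormap": "bone"})
b = render_and_record({"opacity": 0.6, "colormap": "bone"}, derived_from=a)
print(graph.edges[a, b]["changed"])   # {'opacity'}: what this edge teaches
```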
Optical recording of information on paper by CO2 and YAG-lasers
NASA Astrophysics Data System (ADS)
Bayev, S. G.; Bessemltsev, V. P.; Koronkevich, D. V.; Tkachuk, Y. N.
1984-09-01
Methods for outputting information from computers that offer the advantages of typographic printing processes but dispense with an intermediate medium are investigated. Methods for recording graphic and half-tone images are investigated that are based on layers of ink deposited on the paper in advance, as well as on fixing a temperature-sensitive dye on the paper by using a focused laser beam with a radiation power density of 0.000001 W/cm² to heat the surface. IR process lasers provide good efficiency and resolution.
2012-01-01
This paper describes a modification of the basic directions of state accounting and control of radioactive substances and radioactive waste, whose implementation will significantly improve the efficiency of operation at the regional level. The selected areas are intended to improve the accounting and control system with respect to the reporting forms submitted by enterprises, the quality of the information contained in them, and the structures and processes for collecting, analyzing and processing data on radioactive substances and waste.
[Development of Hospital Equipment Maintenance Information System].
Zhou, Zhixin
2015-11-01
Hospital equipment maintenance information systems play an important role in improving the quality and efficiency of medical treatment. Based on a requirement analysis of hospital equipment maintenance, the system function diagram is drawn. From an analysis of the input and output data, and of the tables and reports connected with the equipment maintenance process, the relationships between entities and attributes are identified, the E-R diagram is drawn and the relational database tables are established. The software development meets the actual process requirements of maintenance and has a friendly user interface and flexible operation. The software can also analyze failure causes by statistical analysis.
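A sketch of the relational core implied by the E-R analysis (table and column names are illustrative, not taken from the paper), including the failure-cause statistic the software is said to compute:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE equipment (id INTEGER PRIMARY KEY, name TEXT, dept TEXT);
CREATE TABLE repair (
    id INTEGER PRIMARY KEY,
    equipment_id INTEGER REFERENCES equipment(id),
    reported_on TEXT, failure_cause TEXT, downtime_hours REAL);
""")
db.executemany("INSERT INTO equipment VALUES (?,?,?)",
               [(1, "ventilator", "ICU"), (2, "infusion pump", "ICU")])
db.executemany("INSERT INTO repair VALUES (?,?,?,?,?)",
               [(1, 1, "2015-01-10", "power supply", 6.0),
                (2, 1, "2015-03-02", "power supply", 4.5),
                (3, 2, "2015-03-09", "sensor drift", 2.0)])
# Failure-cause statistics: count and total downtime per cause.
for row in db.execute("""SELECT failure_cause, COUNT(*), SUM(downtime_hours)
                         FROM repair GROUP BY failure_cause
                         ORDER BY COUNT(*) DESC"""):
    print(row)
```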
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-09
... members for this purpose. To assist the FAA in accurately and efficiently processing the number of... information. The receipt of this information could influence whether the FAA can add or delete aircraft from... blocking at the FAA source or at the industry level. The distinction between blocking ASDI data at the FAA...
ERIC Educational Resources Information Center
Kutuev, Ruslan A.; Nuriyeva, Elvira N.; Safiullina, Tatyana R.; Kryukova, Nina I.; Tagirova, Nataliya P.; Karpenko, Galina V.
2016-01-01
The relevance of the study is conditioned by the radical impact of information technology on the learning process of the university, which has started a new phase in its transformation. According to experts, at the present time the main factor in the efficiency of a university's activity is becoming the expansion of students' learning activities, realized on the…
The role of information and communication technology in planning the digital hospital.
Lacanna, Giuseppe
2013-01-01
Hospital structure is undergoing radical changes, forced by contemporary market trends, new demands from different stakeholders and a common interest in innovation. Health care expenditure around the globe continues to rise at unsustainable levels. In this context efficiency and optimization become the keywords of the process aimed at lowering costs and increasing the quality of care services. Efficiency and optimization leads to innovation, and innovation in the contemporary age leads to the power of information and communication technology (ICT). This paper discusses how ICT became the new shaping tool for hospital environments and highlights one of the best examples of its implementation.
I/O efficient algorithms and applications in geographic information systems
NASA Astrophysics Data System (ADS)
Danner, Andrew
Modern remote sensing methods such as laser altimetry (lidar) and Interferometric Synthetic Aperture Radar (IfSAR) produce georeferenced elevation data at unprecedented rates. Many Geographic Information System (GIS) algorithms designed for terrain modelling applications cannot process these massive data sets. The primary problem is that these data sets are too large to fit in the main internal memory of modern computers and must therefore reside on larger, but considerably slower, disks. In these applications, the transfer of data between disk and main memory, or I/O, becomes the primary bottleneck. Working in a theoretical model that more accurately represents this two-level memory hierarchy, we can develop algorithms that are I/O-efficient and reduce the amount of disk I/O needed to solve a problem. In this thesis we aim to modernize GIS algorithms and develop a number of I/O-efficient algorithms for processing geographic data derived from massive elevation data sets. For each application, we convert a geographic question to an algorithmic question, develop an I/O-efficient algorithm that is theoretically efficient, implement our approach, and verify its performance using real-world data. The applications we consider include constructing a gridded digital elevation model (DEM) from an irregularly spaced point cloud, removing topological noise from a DEM, modeling surface water flow over a terrain, extracting river networks and watershed hierarchies from the terrain, and locating polygons containing query points in a planar subdivision. We initially developed solutions to each of these applications individually. However, we also show how to combine individual solutions to form a scalable geo-processing pipeline that seamlessly solves a sequence of sub-problems with little or no manual intervention. We present experimental results that demonstrate orders of magnitude improvement over previously known algorithms.
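For reference, the standard bounds in the two-level I/O model of Aggarwal and Vitter, against which such algorithms are usually measured (N = input size, M = internal memory size, B = disk block size, all in elements):

```latex
\mathrm{scan}(N) = \Theta\!\left(\frac{N}{B}\right), \qquad
\mathrm{sort}(N) = \Theta\!\left(\frac{N}{B}\,\log_{M/B}\frac{N}{B}\right)
```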
Preparatory neural activity predicts performance on a conflict task.
Stern, Emily R; Wager, Tor D; Egner, Tobias; Hirsch, Joy; Mangels, Jennifer A
2007-10-24
Advance preparation has been shown to improve the efficiency of conflict resolution. Yet, with little empirical work directly linking preparatory neural activity to the performance benefits of advance cueing, it is not clear whether this relationship results from preparatory activation of task-specific networks, or from activity associated with general alerting processes. Here, fMRI data were acquired during a spatial Stroop task in which advance cues either informed subjects of the upcoming relevant feature of conflict stimuli (spatial or semantic) or were neutral. Informative cues decreased reaction time (RT) relative to neutral cues, and cues indicating that spatial information would be task-relevant elicited greater activity than neutral cues in multiple areas, including right anterior prefrontal and bilateral parietal cortex. Additionally, preparatory activation in bilateral parietal cortex and right dorsolateral prefrontal cortex predicted faster RT when subjects responded to spatial location. No regions were found to be specific to semantic cues at conventional thresholds, and lowering the threshold further revealed little overlap between activity associated with spatial and semantic cueing effects, thereby demonstrating a single dissociation between activations related to preparing a spatial versus semantic task-set. This relationship between preparatory activation of spatial processing networks and efficient conflict resolution suggests that advance information can benefit performance by leading to domain-specific biasing of task-relevant information.
Coherent spin control of a nanocavity-enhanced qubit in diamond
Li, Luozhou; Lu, Ming; Schroder, Tim; ...
2015-01-28
A central aim of quantum information processing is the efficient entanglement of multiple stationary quantum memories via photons. Among solid-state systems, the nitrogen-vacancy centre in diamond has emerged as an excellent optically addressable memory with second-scale electron spin coherence times. Recently, quantum entanglement and teleportation have been shown between two nitrogen-vacancy memories, but scaling to larger networks requires more efficient spin-photon interfaces such as optical resonators. Here we report such nitrogen-vacancy nanocavity systems in the strong Purcell regime with optical quality factors approaching 10,000 and electron spin coherence times exceeding 200 µs using a silicon hard-mask fabrication process. This spin-photon interface is integrated with on-chip microwave striplines for coherent spin control, providing an efficient quantum memory for quantum networks.
Zhang, Wenhai; Li, Hong; Pan, Xiaohong
2015-02-01
Recent resting-state functional magnetic resonance imaging (fMRI) studies using graph theory metrics have revealed that the functional network of the human brain possesses small-world characteristics and comprises several functional hub regions. However, it is unclear how the affective functional network is organized in the brain during the processing of affective information. In this study, the fMRI data were collected from 25 healthy college students as they viewed a total of 81 positive, neutral, and negative pictures. The results indicated that affective functional networks exhibit weaker small-worldness properties with higher local efficiency, implying that local connections increase during viewing affective pictures. Moreover, positive and negative emotional processing exhibit dissociable functional hubs, emerging mainly in task-positive regions. These functional hubs, which are the centers of information processing, have nodal betweenness centrality values that are at least 1.5 times larger than the average betweenness centrality of the network. Positive affect scores correlated with the betweenness values of the right orbital frontal cortex (OFC) and the right putamen in the positive emotional network; negative affect scores correlated with the betweenness values of the left OFC and the left amygdala in the negative emotional network. The local efficiencies in the left superior and inferior parietal lobe correlated with subsequent arousal ratings of positive and negative pictures, respectively. These observations provide important evidence for the organizational principles of the human brain functional connectome during the processing of affective information. © 2014 Wiley Periodicals, Inc.
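For concreteness, the two graph metrics used in the study, nodal betweenness centrality (with the paper's hub criterion of at least 1.5 times the network mean) and local efficiency, computed here on a toy small-world graph rather than fMRI data:

```python
import networkx as nx

G = nx.watts_strogatz_graph(30, 4, 0.1, seed=2)   # toy small-world network
bc = nx.betweenness_centrality(G)
mean_bc = sum(bc.values()) / len(bc)
hubs = [v for v, b in bc.items() if b >= 1.5 * mean_bc]  # paper's criterion
print("hubs:", hubs)
print("average local efficiency:", nx.local_efficiency(G))
```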
Total quality management - It works for aerospace information services
NASA Technical Reports Server (NTRS)
Erwin, James; Eberline, Carl; Colquitt, Wanda
1993-01-01
Today we are in the midst of information and 'total quality' revolutions. At the NASA STI Program's Center for AeroSpace Information (CASI), we are focused on using continuous improvements techniques to enrich today's services and products and to ensure that tomorrow's technology supports the TQM-based improvement of future STI program products and services. The Continuous Improvements Program at CASI is the foundation for Total Quality Management in products and services. The focus is customer-driven; its goal, to identify processes and procedures that can be improved and new technologies that can be integrated with the processes to gain efficiencies, provide effectiveness, and promote customer satisfaction. This Program seeks to establish quality through an iterative defect prevention approach that is based on the incorporation of standards and measurements into the processing cycle.
[Development of a medical equipment support information system based on PDF portable document].
Cheng, Jiangbo; Wang, Weidong
2010-07-01
According to the organizational structure and management system of hospital medical engineering support, the medical engineering support workflow is integrated to ensure that medical engineering data are collected effectively, accurately and comprehensively and kept in electronic archives. The workflow of medical equipment support is analyzed, and all work processes are recorded in portable electronic documents (PDF). Using XML middleware technology and an SQL Server database, the system completes process management, data calculation, submission, storage and other functions. Practical application shows that the medical equipment support information system optimizes the existing work process in a standardized, digital, automatic, efficient, orderly and controllable way. The system based on portable electronic documents can effectively optimize and improve hospital medical engineering support work, improve performance, reduce costs, and provide complete and accurate digital data.
Method for Evaluating Information to Solve Problems of Control, Monitoring and Diagnostics
NASA Astrophysics Data System (ADS)
Vasil'ev, V. A.; Dobrynina, N. V.
2017-06-01
The article describes a method for evaluating information to solve problems of control, monitoring and diagnostics. The method is needed to reduce the dimensionality of informational indicators of situations, bring them to relative units, calculate generalized information indicators on their basis, rank them by characteristic levels, and calculate the efficiency criterion of a system functioning in real time. On this basis, the design of an information evaluation system has been developed that allows analyzing, processing and assessing information about an object; such an object can be a complex technical, economic or social system. The method, and the system based on it, can find wide application in the analysis, processing and evaluation of information on the functioning of systems, regardless of their purpose, goals, tasks and complexity. For example, they can be used to assess the innovation capacities of industrial enterprises and management decisions.
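An illustrative sketch of the pipeline the method describes (weights, ranges and level thresholds are hypothetical): indicators are brought to relative units, aggregated into a generalized indicator, and ranked against characteristic levels.

```python
def normalize(value, lo, hi):
    return (value - lo) / (hi - lo)          # dimensionless, in [0, 1]

indicators = {"temperature": (78.0, 20.0, 100.0),   # (value, min, max)
              "vibration":   (0.3, 0.0, 1.0),
              "load":        (0.9, 0.0, 1.0)}
weights = {"temperature": 0.5, "vibration": 0.3, "load": 0.2}

# Generalized indicator: weighted sum of normalized indicators.
generalized = sum(weights[k] * normalize(*v) for k, v in indicators.items())

# Rank against characteristic levels (thresholds hypothetical).
levels = [(0.33, "normal"), (0.66, "warning"), (1.01, "critical")]
state = next(name for bound, name in levels if generalized < bound)
print(f"generalized indicator = {generalized:.2f} -> {state}")
```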
Miura, Asako; Kobayashi, Tetsuro
2016-01-01
Though survey satisficing, grudging cognitive efforts required to provide optimal answers in the survey response process, poses a serious threat to the validity of online experiments, a detailed explanation of the mechanism has yet to be established. Focusing on attitudes toward immigrants, we examined the mechanism by which survey satisficing distorts treatment effect estimates in online experiments. We hypothesized that satisficers would display more stereotypical responses than non-satisficers would when presented with stereotype-disconfirming information about an immigrant. Results of two experiments largely supported our hypotheses. Satisficers, whom we identified through an instructional manipulation check (IMC), processed information about immigrants' personality traits congruently with the stereotype activated by information provided about nationality. The significantly shorter vignette reading time of satisficers corroborates their time-efficient impression formation based on stereotyping. However, the shallow information processing of satisficers can be rectified by alerting them to their inattentiveness through use of a repeated IMC. PMID:27803680
Medical Devices Transition to Information Systems: Lessons Learned
Charters, Kathleen G.
2012-01-01
Medical devices designed to network can share data with a Clinical Information System (CIS), making that data available within clinician workflow. Some lessons learned by transitioning anesthesia reporting and monitoring devices (ARMDs) on a local area network (LAN) to integration of anesthesia documentation within a CIS fall into the following categories: access, contracting, deployment, implementation, planning, security, support, training and workflow integration. Areas identified for improvement include: reconciling vendor requirements for access with the organization's security policies and procedures; including clauses in the medical device procurement contract that support the transition from stand-alone devices to information integrated into clinical workflow; resolving deployment and implementation barriers that make the process less efficient and more costly; including effective field communication and creative alternatives in planning; building training on the baseline knowledge of trainees; including effective help desk processes and metrics; and having a process for determining where problems originate when systems share information. PMID:24199054
Ruiter, R A; Kok, G; Verplanken, B; Brug, J
2001-06-01
The effect of fear arousal on attitude toward participating in early detection activities [i.e. breast self-examination (BSE)] was studied from an information-processing perspective. It was hypothesized that fear arousal motivates respondents to more argument-based processing of fear-relevant persuasive information. Respondents first read information about breast cancer in which fear was manipulated. After measuring fear arousal, respondents read a persuasive message about performing BSE. Analyses with reported fear, but not manipulated fear, found support for the hypothesis. Respondents who reported mild fear of breast cancer based their attitude toward BSE more on the arguments provided than respondents who reported low fear of breast cancer. This finding suggests that the use of fear arousal may be an efficient tool in health education practice. However, alternative interpretations are provided, in addition to the suggestion to be careful with using fear arousal in health education messages.
Liu, Shih-Chii; Delbruck, Tobi
2010-06-01
Biology provides examples of efficient machines which greatly outperform conventional technology. Designers in neuromorphic engineering aim to construct electronic systems with the same efficient style of computation. This task requires a melding of novel engineering principles with knowledge gleaned from neuroscience. We discuss recent progress in realizing neuromorphic sensory systems which mimic the biological retina and cochlea, and subsequent sensor processing. The main trends are the increasing number of sensors and sensory systems that communicate through asynchronous digital signals analogous to neural spikes; the improved performance and usability of these sensors; and novel sensory processing methods which capitalize on the timing of spikes from these sensors. Experiments using these sensors can impact how we think the brain processes sensory information. 2010 Elsevier Ltd. All rights reserved.
Quantitative optical diagnostics in pathology recognition and monitoring of tissue reaction to PDT
NASA Astrophysics Data System (ADS)
Kirillin, Mikhail; Shakhova, Maria; Meller, Alina; Sapunov, Dmitry; Agrba, Pavel; Khilov, Alexander; Pasukhin, Mikhail; Kondratieva, Olga; Chikalova, Ksenia; Motovilova, Tatiana; Sergeeva, Ekaterina; Turchin, Ilya; Shakhova, Natalia
2017-07-01
Optical coherence tomography (OCT) is currently being actively introduced into clinical practice. Besides diagnostics, it can be efficiently employed for treatment monitoring, allowing for timely correction of the treatment procedure. In monitoring of photodynamic therapy (PDT), the traditionally employed fluorescence imaging (FI) can benefit from complementary use of OCT. Additional diagnostic efficiency can be derived from numerical processing of optical diagnostics data, which provides more information than visual evaluation. In this paper we report on the application of OCT together with numerical processing for clinical diagnostics in gynecology and otolaryngology, for monitoring of PDT in otolaryngology, and on OCT and FI applications in clinical and aesthetic dermatology. Numerical image processing and quantification increase diagnostic accuracy. Keywords: optical coherence tomography, fluorescence imaging, photodynamic therapy
Changes in Efficiency and Safety Culture After Integration of an I-PASS-Supported Handoff Process.
Sheth, Shreya; McCarthy, Elisa; Kipps, Alaina K; Wood, Matthew; Roth, Stephen J; Sharek, Paul J; Shin, Andrew Y
2016-02-01
Recent publications have shown improved outcomes associated with resident-to-resident handoff processes. However, the implementation of similar handoff processes for patients moving between units and teams with expansive responsibilities presents unique challenges. We sought to determine the impact of a multidisciplinary standardized handoff process on efficiency, safety culture, and satisfaction. A prospective improvement initiative to standardize handoffs during patient transitions from the cardiovascular ICU to the acute care unit was implemented in a university-affiliated children's hospital. Time between verbal handoff and patient transfer decreased from baseline (397 ± 167 minutes) to the postintervention period (24 ± 21 minutes) (P < .01). Percentage positive scores for the handoff/transitions domain of a national culture of safety survey improved (39.8% vs 15.2% and 38.8% vs 19.6%; P = .005 and P = .03, respectively). Provider satisfaction improved related to the information conveyed (34% to 41%; P = .03), time to transfer (5% to 34%; P < .01), and overall experience (3% to 24%; P < .01). Family satisfaction improved for several questions, including "satisfaction with the information conveyed" (42% to 70%; P = .02), "opportunities to ask questions" (46% to 74%; P < .01), and "the Acute Care team's knowledge about my child's issues" (50% to 73%; P = .04). No differences in rates of readmission, rapid response team calls, or mortality were observed. Implementation of a multidisciplinary I-PASS-supported handoff process for patients transferring from the cardiovascular ICU to the acute care unit resulted in improved transfer efficiency, safety culture scores, and satisfaction of providers and families. Copyright © 2016 by the American Academy of Pediatrics.
Ying, William; Levons, Jaquan K; Carney, Andrea; Gandhi, Rajesh; Vydra, Vicky; Rubin, A Erik
2016-06-01
A novel semiautomated buffer exchange process workflow was developed to enable efficient early protein formulation screening. An antibody fragment protein, BMSdab, was used to demonstrate the workflow. The process afforded 60% to 80% cycle time and scientist time savings and significant material efficiencies. These efficiencies ultimately facilitated execution of this stability work earlier in the drug development process, allowing this tool to inform the developability of potential candidates for development from a formulation perspective. To overcome the key technical challenges, the protein solution was buffer-exchanged by centrifuge filtration into formulations for stability screening in a 96-well plate with an ultrafiltration membrane, leveraging automated liquid handling and acoustic volume measurements to allow several cycles of exchanges. The formulations were transferred into a vacuum manifold and sterile filtered into a rack holding 96 glass vials. The vials were sealed with a capmat of individual caps and placed in stability stations. Stability of the samples prepared by this process and by the standard process was demonstrated to be comparable. This process enabled screening a number of formulations of a protein at an early pharmaceutical development stage with a short sample preparation time. © 2015 Society for Laboratory Automation and Screening.
Deputy Administrator Robert Perciasepe requested that a workgroup develop options and recommendations to ensure that the Agency's administration of FOIA and related processes is effective and efficient and promotes open government and transparency policies.
Haugh, Richard
2004-02-01
Thanks to HIPAA, banks stand to earn billions of dollars in new business by processing electronic claims for health care providers and payers. And the health care industry could realize $35 billion a year in efficiency gains and cost savings. But overshadowing it all is the question of how protected patient information will be--and how liable hospitals will be for any breach of that information by their business partners.
Gvozdanović, Darko; Koncar, Miroslav; Kojundzić, Vinko; Jezidzić, Hrvoje
2007-01-01
In order to improve the quality of patient care, while at the same time keeping up with the pace of increased needs of the population for healthcare services that directly impacts on the cost of care delivery processes, the Republic of Croatia, under the leadership of the Ministry of Health and Social Welfare, has formed a strategy and campaign for national public healthcare system reform. The strategy is very comprehensive and addresses all niches of care delivery processes; it is founded on the enterprise information systems that will aim to support end-to-end business processes in the healthcare domain. Two major requirements are in focus: (1) to provide efficient healthcare-related data management in support of decision-making processes; (2) to support a continuous process of healthcare resource spending optimisation. The first project is the Integrated Healthcare Information System (IHCIS) on the primary care level; this encompasses the integration of all primary point-of-care facilities and subjects with the Croatian Institute for Health Insurance and Croatian National Institute of Public Health. In years to come, IHCIS will serve as the main integration platform for connecting all other stakeholders and levels of health care (that is, hospitals, pharmacies, laboratories) into a single enterprise healthcare network. This article gives an overview of Croatian public healthcare system strategy aims and goals, and focuses on properties and characteristics of the primary care project implementation that started in 2003; it achieved a major milestone in early 2007 - the official grand opening of the project with 350 GPs already fully connected to the integrated healthcare information infrastructure based on the IHCIS solution.
Evidence for a neural dual-process account for adverse effects of cognitive control.
Zink, Nicolas; Stock, Ann-Kathrin; Colzato, Lorenza; Beste, Christian
2018-06-09
Advantageous effects of cognitive control are well-known, but cognitive control may also have adverse effects, for example when it suppresses the implicit processing of stimulus-response (S-R) bindings that could benefit task performance. Yet, the neurophysiological and functional neuroanatomical structures associated with adverse effects of cognitive control are poorly understood. We used an extreme group approach to compare individuals who exhibit adverse effects of cognitive control to individuals who do not by combining event-related potentials (ERPs), source localization, time-frequency analysis and network analysis methods. While neurophysiological correlates of cognitive control (i.e. N2, N450, theta power and theta-mediated neuronal network efficiency) and task-set updating (P3) both reflect control demands and implicit information processing, differences in the degree of adverse cognitive control effects are associated with two independent neural mechanisms: Individuals, who show adverse behavioral effects of cognitive control, show reduced small-world properties and thus reduced efficiency in theta-modulated networks when they fail to effectively process implicit information. In contrast to this, individuals who do not display adverse control effects show enhanced task-set updating mechanism when effectively processing implicit information, which is reflected by the P3 ERP component and associated with the temporo-parietal junction (TPJ, BA 40) and medial frontal gyrus (MFG; BA 8). These findings suggest that implicit S-R contingencies, which benefit response selection without cognitive control, are always 'picked up', but may fail to be integrated with task representations to guide response selection. This provides evidence for a neurophysiological and functional neuroanatomical "dual-process" account of adverse cognitive control effects.
Jadhav, Pravin R; Neal, Lauren; Florian, Jeff; Chen, Ying; Naeger, Lisa; Robertson, Sarah; Soon, Guoxing; Birnkrant, Debra
2010-09-01
This article presents a prototype for an operational innovation in knowledge management (KM). These operational innovations are geared toward managing knowledge efficiently and accessing all available information by embracing advances in bioinformatics and allied fields. The specific components of the proposed KM system are (1) a database to archive hepatitis C virus (HCV) treatment data in a structured format and retrieve information in a query-capable manner and (2) an automated analysis tool to inform trial design elements for HCV drug development. The proposed framework is intended to benefit drug development by increasing efficiency of dose selection and improving the consistency of advice from US Food and Drug Administration (FDA). It is also hoped that the framework will encourage collaboration among FDA, industry, and academic scientists to guide the HCV drug development process using model-based quantitative analysis techniques.
Singh, Prabal Vikram; Tatambhotla, Anand; Kalvakuntla, Rohini; Chokshi, Maulik
2013-01-01
Objective To perform an initial qualitative comparison of the different procurement models in India to frame questions for future research in this area; to capture the finer differences between the state models through 53 process and price parameters to determine their functional efficiencies. Design Qualitative analysis is performed for the study. Five states: Tamil Nadu, Kerala, Odisha, Punjab and Maharashtra were chosen to ensure heterogeneity in a number of factors such as procurement type (centralised, decentralised or mixed); autonomy of the procurement organisation; state of public health infrastructure; geography and availability of data through Right to Information Act (RTI). Data on procurement processes were collected through key informant analysis by way of semistructured interviews with leadership teams of procuring organisations. These process data were validated through interviews with field staff (stakeholders of district hospitals, taluk hospitals, community health centres and primary health centres) in each state. A total of 30 actors were interviewed in all five states. The data collected are analysed against 52 process and price parameters to determine the functional efficiency of the model. Results The analysis indicated that autonomous procurement organisations were more efficient in relation to payments to suppliers, had relatively lower drug procurement prices and managed their inventory more scientifically. Conclusions The authors highlight critical success factors that significantly influence the outcome of any procurement model. In a way, this study raises more questions and seeks the need for further research in this arena to aid policy makers. PMID:23388196
Beer, Sebastian; Dobler, Dorota; Gross, Alexander; Ost, Martin; Elseberg, Christiane; Maeder, Ulf; Schmidts, Thomas Michael; Keusgen, Michael; Fiebich, Martin; Runkel, Frank
2013-01-30
Multiple emulsions offer various applications in a wide range of fields such as pharmaceutical, cosmetics and food technology. Two features, encapsulation efficiency and prolonged stability, are known to have a great influence on multiple emulsion quality and utility. To achieve prolonged stability, the production of the emulsions has to be observed and controlled, preferably in line. In line measurements provide the relevant parameters in a short time frame without the need for the sample to be removed from the process stream, thereby enabling continuous process control. In this study, information about the physical state of multiple emulsions obtained from dielectric spectroscopy (DS) is evaluated for this purpose. Results from dielectric measurements performed in line during the production cycle are compared to theoretically expected results and to well-established off line measurements. Thus, a first step to include the production of multiple emulsions into the process analytical technology (PAT) guidelines of the Food and Drug Administration (FDA) is achieved. DS proved to be beneficial in determining the crucial stopping criterion, which is essential in the production of multiple emulsions. Stopping the process at a less-than-ideal point can severely lower the encapsulation efficiency and the stability, thereby lowering the quality of the emulsion. DS is also expected to provide further information about the multiple emulsion, such as encapsulation efficiency. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Lloyd, J. F., Sr.
1987-01-01
Industrial radiography is a well established, reliable means of providing nondestructive structural integrity information. The majority of industrial radiographs are interpreted by trained human eyes using transmitted light and various visual aids. Hundreds of miles of radiographic information are evaluated, documented and archived annually. In many instances, there are serious considerations in terms of interpreter fatigue, subjectivity and limited archival space. Quite often it is difficult to quickly retrieve radiographic information for further analysis or investigation. Methods of improving the quality and efficiency of the radiographic process are being explored, developed and incorporated whenever feasible. High resolution cameras, digital image processing, and mass digital data storage offer interesting possibilities for improving the industrial radiographic process. A review is presented of computer aided radiographic interpretation technology in terms of how it could be used to enhance the radiographic interpretation process in evaluating radiographs of aluminum welds.
Key Drivers of Marines' Willingness to Adopt Energy-Efficient Technologies
2013-12-01
influences the rate of adoption. Communication is “the process by which participants create and share information with one another in order to reach a...more likely to assess the value of the innovation themselves rather than the value of the implementer’s market. Kleijnen, Lee, and Wetzels (2009...willingness to accept energy-efficient technologies. The adoption of energy-efficient technologies will significantly reduce fossil fuel dependency
Mutemwa, Richard I
2006-01-01
At the onset of health system decentralization as a primary health care strategy, which constituted a key feature of health sector reforms across the developing world, efficient and effective health management information systems (HMIS) were widely acknowledged and adopted as a critical element of district health management strengthening programmes. The focal concern was about the performance and long-term sustainability of decentralized district health systems. The underlying logic was that effective and efficient HMIS would provide district health managers with the information required to make effective strategic decisions that are the vehicle for district performance and sustainability in these decentralized health systems. However, this argument is rooted in normative management and decision theory without significant unequivocal empirical corroboration. Indeed, extensive empirical evidence continues to indicate that managers' decision-making behaviour and the existence of other forms of information outside the HMIS, within the organizational environment, suggest a far more tenuous relationship between the presence of organizational management information systems (such as HMIS) and effective strategic decision-making. This qualitative comparative case-study conducted in two districts of Zambia focused on investigating the presence and behaviour of five formally identified, different information forms, including that from HMIS, in the strategic decision-making process. The aim was to determine the validity of current arguments for HMIS, and establish implications for current HMIS policies. Evidence from the eight strategic decision-making processes traced in the study confirmed the existence of different forms of information in the organizational environment, including that provided by the conventional HMIS. These information forms attach themselves to various organizational management processes and key aspects of organizational routine. The study results point to the need for a radical re-think of district health management information solutions in ways that account for the existence of other information forms outside the formal HMIS in the district health system.
ERIC Educational Resources Information Center
Jewer, Jennifer; Evermann, Joerg
2015-01-01
Enterprise systems and business process management are the two key information technologies to integrate the functions of a modern business into a coherent and efficient system. While the benefits of these systems are easy to describe, students, especially those without business experience, have difficulty appreciating how these systems are used…
Innovation in managing the referral process at a Canadian pediatric hospital.
MacGregor, Daune; Parker, Sandra; MacMillan, Sharon; Blais, Irene; Wong, Eugene; Robertson, Chris J; Bruce-Barrett, Cindy
2009-01-01
The provision of timely and optimal patient care is a priority in pediatric academic health science centres. Timely access to care is optimized when there is an efficient and consistent referral system in place. In order to improve the patient referral process and, therefore, access to care, an innovative web-based system was developed and implemented. The Ambulatory Referral Management System enables the electronic routing for submission, review, triage and management of all outpatient referrals. The implementation of this system has provided significant metrics that have informed how processes can be improved to increase access to care. Use of the system has improved efficiency in the referral process and has reduced the work associated with the previous paper-based referral system. It has also enhanced communication between the healthcare provider and the patient and family and has improved the security and confidentiality of patient information management. Referral guidelines embedded within the system have helped to ensure that referrals are more complete and that the patient being referred meets the criteria for assessment and treatment in an ambulatory setting. The system calculates and reports on wait times, as well as other measures.
Nested polynomial trends for the improvement of Gaussian process-based predictors
NASA Astrophysics Data System (ADS)
Perrin, G.; Soize, C.; Marque-Pucheu, S.; Garnier, J.
2017-10-01
The role of simulation keeps increasing for the sensitivity analysis and the uncertainty quantification of complex systems. Such numerical procedures are generally based on the processing of a huge number of code evaluations. When the computational cost associated with one particular evaluation of the code is high, such direct approaches, based on the computer code alone, are not affordable. Surrogate models therefore have to be introduced to interpolate the information given by a fixed set of code evaluations to the whole input space. When confronted with deterministic mappings, Gaussian process regression (GPR), or kriging, presents a good compromise between complexity, efficiency and error control. Such a method considers the quantity of interest of the system as a particular realization of a Gaussian stochastic process, whose mean and covariance functions have to be identified from the available code evaluations. In this context, this work proposes an innovative parametrization of this mean function, which is based on the composition of two polynomials. This approach is particularly relevant for the approximation of strongly nonlinear quantities of interest from very little information. After presenting the theoretical basis of this method, this work compares its efficiency to alternative approaches on a series of examples.
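To make the surrogate idea concrete, here is a minimal Python sketch of kriging with a plain polynomial trend: a least-squares polynomial supplies the GP mean and an RBF-kernel GP interpolates the residuals. The data, kernel length-scale and degree are invented, and this shows an ordinary polynomial trend, not the paper's nested-polynomial parametrization.

import numpy as np

def rbf(a, b, length=0.3, var=1.0):
    # squared-exponential kernel between two 1-D point sets
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def fit_predict(x, y, x_new, degree=2, noise=1e-6):
    # 1) least-squares polynomial trend acts as the GP mean
    coef = np.polyfit(x, y, degree)
    trend = np.polyval(coef, x)
    # 2) zero-mean GP on the residuals with an RBF kernel
    K = rbf(x, x) + noise * np.eye(len(x))
    alpha = np.linalg.solve(K, y - trend)
    return np.polyval(coef, x_new) + rbf(x_new, x) @ alpha

x = np.linspace(0, 1, 8)
y = np.sin(6 * x) + 0.5 * x ** 2          # toy "code evaluations"
print(fit_predict(x, y, np.array([0.25, 0.8])))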
Topology-dependent density optima for efficient simultaneous network exploration
NASA Astrophysics Data System (ADS)
Wilson, Daniel B.; Baker, Ruth E.; Woodhouse, Francis G.
2018-06-01
A random search process in a networked environment is governed by the time it takes to visit every node, termed the cover time. Often, a networked process does not proceed in isolation but competes with many instances of itself within the same environment. A key unanswered question is how to optimize this process: How many concurrent searchers can a topology support before the benefits of parallelism are outweighed by competition for space? Here, we introduce the searcher-averaged parallel cover time (APCT) to quantify these economies of scale. We show that the APCT of the networked symmetric exclusion process is optimized at a searcher density that is well predicted by the spectral gap. Furthermore, we find that nonequilibrium processes, realized through the addition of bias, can support significantly increased density optima. Our results suggest alternative hybrid strategies of serial and parallel search for efficient information gathering in social interaction and biological transport networks.
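As a toy illustration of the cover-time trade-off, the Python sketch below runs k simultaneous random walkers on a cycle graph and records how long they take to visit every node. Unlike the paper's model, the walkers here are independent (no exclusion constraint, no bias, no searcher-averaging), and all parameters are invented.

import random

def cover_time(n=50, k=4, seed=1):
    random.seed(seed)
    adj = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}  # cycle graph
    pos = random.sample(range(n), k)          # k walkers at distinct nodes
    visited = set(pos)
    t = 0
    while len(visited) < n:
        pos = [random.choice(adj[p]) for p in pos]  # all walkers step at once
        visited.update(pos)
        t += 1
    return t

for k in (1, 2, 4, 8):
    print(k, "searchers:", cover_time(k=k), "steps to cover the graph")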
Wang, Siqi; Wang, Hengwei; Lv, Jiyang; Deng, Zixin; Cheng, Hairong
2017-12-20
Erythritol, a natural sugar alcohol, is produced industrially by fermentation and crystallization, but this process leaves a large amount of waste erythritol mother liquor (WEML), which contains more than 200 g/L erythritol as well as other polyol byproducts. These impurities make it very difficult to crystallize more erythritol. In our study, an efficient process for the recovery of erythritol from the WEML is described. The polyol impurities were first identified by high-performance liquid chromatography and gas chromatography-mass spectrometry, and a yeast strain, Candida maltosa CGMCC 7323, was then isolated to metabolize these impurities and thereby purify the erythritol. Our results demonstrated that the process could markedly improve the purity of erythritol and thus make the subsequent crystallization easier. This newly developed strategy is expected to have advantages in WEML treatment and provide helpful information with regard to green cell factories and zero-waste processing.
NASA Astrophysics Data System (ADS)
Cao, Yuansheng; Gong, Zongping; Quan, H. T.
2015-06-01
Motivated by the recently proposed models of the information engine [Proc. Natl. Acad. Sci. USA 109, 11641 (2012), 10.1073/pnas.1204263109] and the information refrigerator [Phys. Rev. Lett. 111, 030602 (2013), 10.1103/PhysRevLett.111.030602], we propose a minimal model of the information pump and the information eraser based on enzyme kinetics. This device can either pump molecules against the chemical potential gradient by consuming the information encoded in the bit stream or (partially) erase the information initially encoded in the bit stream by consuming Gibbs free energy. The dynamics of this model is solved exactly, and the "phase diagram" of the operation regimes is determined. The efficiency and the power of the information machine are analyzed. The validity of the second law of thermodynamics within our model is clarified. Our model offers a simple paradigm for investigating the thermodynamics of information processing involving the chemical potential in small systems.
Wasike, Chrilukovian B; Magothe, Thomas M; Kahi, Alexander K; Peters, Kurt J
2011-01-01
Animal recording in Kenya is characterised by erratic producer participation and high drop-out rates from the national recording scheme. This study evaluates factors influencing the efficiency of the beef and dairy cattle recording system. Factors influencing the efficiency of animal identification and registration, pedigree and performance recording, and genetic evaluation and information utilisation were generated using qualitative and participatory methods. Pairwise comparison of factors was done by a combined strengths, weaknesses, opportunities and threats (SWOT) and analytical hierarchy process (AHP) analysis, and priority scores determining their relative importance to the system were calculated using the eigenvalue method. For identification and registration, and for evaluation and information utilisation, external factors had high priority scores. For pedigree and performance recording, threats and weaknesses had the highest priority scores. Strength factors could not sustain the required efficiency of the system, and the weaknesses of the system predisposed it to threats. Available opportunities could be explored as interventions to restore efficiency in the system. Defensive strategies appeared feasible, such as reorienting the system to offer utility benefits to recording, forming symbiotic and binding collaboration between recording organisations and NARS, and developing institutions to support recording.
Fernández-Berni, Jorge; Carmona-Galán, Ricardo; del Río, Rocío; Kleihorst, Richard; Philips, Wilfried; Rodríguez-Vázquez, Ángel
2014-01-01
The capture, processing and distribution of visual information is one of the major challenges for the paradigm of the Internet of Things. Privacy emerges as a fundamental barrier to overcome. The idea of networked image sensors pervasively collecting data generates social rejection in the face of sensitive information being tampered with by hackers or misused by legitimate users. Power consumption also constitutes a crucial aspect. Images contain a massive amount of data to be processed under strict timing requirements, demanding high-performance vision systems. In this paper, we describe a hardware-based strategy to concurrently address these two key issues. By conveying processing capabilities to the focal plane in addition to sensing, we can implement privacy protection measures just at the point where sensitive data are generated. Furthermore, such measures can be tailored for efficiently reducing the computational load of subsequent processing stages. As a proof of concept, a full-custom QVGA vision sensor chip is presented. It incorporates a mixed-signal focal-plane sensing-processing array providing programmable pixelation of multiple image regions in parallel. In addition to this functionality, the sensor exploits reconfigurability to implement other processing primitives, namely block-wise dynamic range adaptation, integral image computation and multi-resolution filtering. The proposed circuitry is also suitable to build a granular space, becoming the raw material for subsequent feature extraction and recognition of categorized objects. PMID:25195849
NASA Astrophysics Data System (ADS)
Zheng, Yan
2015-03-01
The Internet of things (IoT), which focuses on providing users with information exchange and intelligent control, has attracted the attention of researchers all over the world since the beginning of this century. The IoT consists of a large number of sensor nodes and data processing units, and its most important characteristics are energy constraints, efficient communication and high redundancy. As the number of sensor nodes grows, communication efficiency and available communication bandwidth become bottlenecks. Much existing research addresses the case where the number of joins is small, which is not adequate for the growing number of multi-join queries across the whole Internet of things. To improve the communication efficiency between parallel units in a distributed sensor network, this paper proposes a parallel query optimization algorithm based on a distribution-attribute cost graph. The algorithm takes the stored information relations and the network communication cost into account, and establishes an optimized information-exchange rule. Experimental results show that the algorithm performs well and makes effective use of the resources of each node in the distributed sensor network, improving the execution efficiency of multi-join queries across nodes.
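The abstract does not spell out the distribution-attribute cost graph itself, so the Python sketch below falls back on the classic dynamic-programming formulation of cost-based join ordering (Selinger-style) as a generic stand-in; relation names, cardinalities and selectivities are all invented.

from itertools import combinations

card = {"A": 1000, "B": 200, "C": 50}                # hypothetical relation sizes
sel = {("A", "B"): 0.01, ("B", "C"): 0.05, ("A", "C"): 0.02}

def join_size(s):
    # crude size estimate: product of cardinalities times pairwise selectivities
    size = 1.0
    for r in s:
        size *= card[r]
    for pair in combinations(sorted(s), 2):
        size *= sel.get(pair, 1.0)
    return size

best = {frozenset([r]): (0.0, r) for r in card}      # subset -> (cost, plan)
for k in range(2, len(card) + 1):
    for s in map(frozenset, combinations(card, k)):
        for left in map(frozenset, combinations(s, k - 1)):
            right = s - left
            cost = best[left][0] + join_size(left) + join_size(s)
            if s not in best or cost < best[s][0]:
                best[s] = (cost, f"({best[left][1]} JOIN {next(iter(right))})")
print(best[frozenset(card)])                          # cheapest plan for A,B,C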
Efficient and secure outsourcing of genomic data storage.
Sousa, João Sá; Lefebvre, Cédric; Huang, Zhicong; Raisaro, Jean Louis; Aguilar-Melchor, Carlos; Killijian, Marc-Olivier; Hubaux, Jean-Pierre
2017-07-26
Cloud computing is becoming the preferred solution for efficiently dealing with the increasing amount of genomic data. Yet, outsourcing the storage and processing of sensitive information, such as genomic data, comes with important concerns related to privacy and security. This calls for new sophisticated techniques that ensure data protection from untrusted cloud providers and that still enable researchers to obtain useful information. We present a novel privacy-preserving algorithm for fully outsourcing the storage of large genomic data files to a public cloud and enabling researchers to efficiently search for variants of interest. In order to protect data and query confidentiality from possible leakage, our solution exploits optimal encoding for genomic variants and combines it with homomorphic encryption and private information retrieval. Our proposed algorithm is implemented in C++ and was evaluated on real data as part of the 2016 iDash Genome Privacy-Protection Challenge. Results show that our solution outperforms the state-of-the-art solutions and enables researchers to search over millions of encrypted variants in a few seconds. As opposed to prior beliefs that sophisticated privacy-enhancing technologies (PETs) are impractical for real operational settings, our solution demonstrates that, in the case of genomic data, PETs are very efficient enablers.
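One ingredient named above, private information retrieval (PIR), can be illustrated with a textbook two-server XOR scheme: the client sends each server a random-looking index set, and neither server alone learns which record was fetched. This Python toy is a generic construction, not the paper's combination of optimal variant encoding with homomorphic encryption; the records are invented.

import secrets

db = [b"var1", b"var2", b"var3", b"var4"]      # equal-length variant records

def server_answer(database, subset):
    # XOR together the requested blocks; reveals nothing about any single index
    out = bytes(len(database[0]))
    for i in subset:
        out = bytes(a ^ b for a, b in zip(out, database[i]))
    return out

def query(i, n):
    s1 = {j for j in range(n) if secrets.randbelow(2)}   # random subset
    s2 = s1 ^ {i}                                        # differs only at i
    return s1, s2

i = 2                                           # record the client wants
s1, s2 = query(i, len(db))
a1, a2 = server_answer(db, s1), server_answer(db, s2)
print(bytes(x ^ y for x, y in zip(a1, a2)))     # reconstructs b'var3'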
Problems and Processes in Medical Encounters: The CASES method of dialogue analysis
Laws, M. Barton; Taubin, Tatiana; Bezreh, Tanya; Lee, Yoojin; Beach, Mary Catherine; Wilson, Ira B.
2013-01-01
Objective To develop methods to reliably capture structural and dynamic temporal features of clinical interactions. Methods Observational study of 50 audio-recorded routine outpatient visits to HIV specialty clinics, using innovative analytic methods. The Comprehensive Analysis of the Structure of Encounters System (CASES) uses transcripts coded for speech acts, then imposes larger-scale structural elements: threads – the problems or issues addressed; and processes within threads –basic tasks of clinical care labeled Presentation, Information, Resolution (decision making) and Engagement (interpersonal exchange). Threads are also coded for the nature of resolution. Results 61% of utterances are in presentation processes. Provider verbal dominance is greatest in information and resolution processes, which also contain a high proportion of provider directives. About half of threads result in no action or decision. Information flows predominantly from patient to provider in presentation processes, and from provider to patient in information processes. Engagement is rare. Conclusions In this data, resolution is provider centered; more time for patient participation in resolution, or interpersonal engagement, would have to come from presentation. Practice Implications Awareness of the use of time in clinical encounters, and the interaction processes associated with various tasks, may help make clinical communication more efficient and effective. PMID:23391684
NASA Technical Reports Server (NTRS)
1994-01-01
The ChemScan UV-6100 is a spectrometry system originally developed by Biotronics Technologies, Inc. under a Small Business Innovation Research (SBIR) contract. It is marketed to the water and wastewater treatment industries, replacing "grab sampling" with on-line data collection. It analyzes the light absorbance characteristics of a water sample, simultaneously detects hundreds of individual wavelengths absorbed by chemical substances in a process solution, and quantifies the information. Spectral data are then processed by the ChemScan analyzer and compared with calibration files in the system's memory in order to calculate the concentrations of chemical substances that cause UV light absorbance in specific patterns. Monitored substances can be analyzed for quality and quantity. Applications include detection of a variety of substances, and the information provided enables an operator to control a process more efficiently.
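The calibration step described above amounts to inverting the Beer-Lambert law for a mixture: absorbance at many wavelengths is, to first order, a linear combination of component concentrations, so the concentrations can be recovered by least squares. A minimal Python sketch, with an invented absorptivity matrix:

import numpy as np

# rows = wavelengths, columns = substances (molar absorptivity x path length)
E = np.array([[0.9, 0.1],
              [0.4, 0.7],
              [0.1, 1.2]])
c_true = np.array([2.0, 0.5])                    # hypothetical concentrations
rng = np.random.default_rng(0)
A = E @ c_true + 0.01 * rng.normal(size=3)       # "measured" absorbance spectrum

c_est, *_ = np.linalg.lstsq(E, A, rcond=None)    # solve A ~= E @ c
print(c_est)                                     # close to [2.0, 0.5]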
Garcia, Macarena C; Garrett, Nedra Y; Singletary, Vivian; Brown, Sheereen; Hennessy-Burt, Tamara; Haney, Gillian; Link, Kimberly; Tripp, Jennifer; Mac Kenzie, William R; Yoon, Paula
2017-12-07
State and local public health agencies collect and use surveillance data to identify outbreaks, track cases, investigate causes, and implement measures to protect the public's health through various surveillance systems and data exchange practices. The purpose of this assessment was to better understand current practices at state and local public health agencies for collecting, managing, processing, reporting, and exchanging notifiable disease surveillance information. Over an 18-month period (January 2014-June 2015), we evaluated the process of data exchange between surveillance systems, reporting burdens, and challenges within 3 states (California, Idaho, and Massachusetts) that were using 3 different reporting systems. All 3 states use a combination of paper-based and electronic information systems for managing and exchanging data on reportable conditions within the state. The flow of data from local jurisdictions to the state health departments varies considerably. When state and local information systems are not interoperable, manual duplicative data entry and other work-arounds are often required. The results of the assessment show the complexity of disease reporting at the state and local levels and the multiple systems, processes, and resources engaged in preparing, processing, and transmitting data that limit interoperability and decrease efficiency. Through this structured assessment, the Centers for Disease Control and Prevention (CDC) has a better understanding of the complexities of conducting surveillance using commercial off-the-shelf data systems (California and Massachusetts) and the CDC-developed National Electronic Disease Surveillance System Base System. More efficient data exchange and use of data will help facilitate interoperability within the National Notifiable Diseases Surveillance System.
Implementation of Systematic Review Tools in IRIS
Currently, the number of chemicals present in the environment exceeds the ability of public health scientists to efficiently screen the available data in order to produce well-informed human health risk assessments in a timely manner. For this reason, the US EPA's Integrated Risk Information System (IRIS) program has started implementing new software tools in the hazard characterization workflow. These automated tools aid in multiple phases of the systematic review process, including scoping and problem formulation, literature search, and identification and screening of available published studies. The increased availability of these tools lays the foundation for automating or semi-automating multiple phases of the systematic review process. Some of these software tools include modules to facilitate a structured approach to study quality evaluation of human and animal data, although approaches are generally lacking for assessing complex mechanistic information, in particular "omics"-based evidence; tools to evaluate these types of studies are starting to become available. We will highlight how new software programs, online tools, and approaches for assessing study quality can be better integrated to allow for a more efficient and transparent workflow of the risk assessment process, as well as identify tool gaps that would benefit future risk assessments. Disclaimer: The views expressed here are those of the authors and do not necessarily represent the views of the US EPA.
Representation control increases task efficiency in complex graphical representations.
Moritz, Julia; Meyerhoff, Hauke S; Meyer-Dernbecher, Claudia; Schwan, Stephan
2018-01-01
In complex graphical representations, the relevant information for a specific task is often distributed across multiple spatial locations. In such situations, understanding the representation requires internal transformation processes in order to extract the relevant information. However, digital technology enables observers to alter the spatial arrangement of depicted information and therefore to offload the transformation processes. The objective of this study was to investigate the use of such a representation control (i.e. the users' option to decide how information should be displayed) in order to accomplish an information extraction task, in terms of solution time and accuracy. In the representation control condition, the participants were allowed to reorganize the graphical representation and reduce information density. In the control condition, no interactive features were offered. We observed that participants in the representation control condition solved tasks that required reorganization of the maps faster and more accurately than participants without representation control. The present findings demonstrate how processes of cognitive offloading, spatial contiguity, and information coherence interact in knowledge media intended for broad and diverse groups of recipients.
Hadoop-based implementation of processing medical diagnostic records for visual patient system
NASA Astrophysics Data System (ADS)
Yang, Yuanyuan; Shi, Liehang; Xie, Zhe; Zhang, Jianguo
2018-03-01
At last year's SPIE Medical Imaging conference (SPIE MI 2017), we introduced the Visual Patient (VP) concept and method to visually represent and index patient imaging diagnostic records (IDR), enabling a doctor to review a large amount of IDR of a patient in a limited appointed time slot. In this presentation, we present a new data processing architecture for the VP system (VPS) that acquires, processes and stores various kinds of IDR in order to build a VP instance for each patient in a hospital environment, based on a Hadoop distributed processing structure. This architecture, called the Medical Information Processing System (MIPS), combines the Hadoop batch processing architecture with the Storm stream processing architecture. MIPS implements highly efficient parallel processing of various kinds of clinical data coming from disparate hospital information systems such as PACS, RIS, LIS and HIS.
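A minimal map/reduce-style pass, standing in for the MIPS batch layer, makes the grouping idea concrete: raw diagnostic records are grouped by patient ID so that a per-patient timeline (the raw material for a VP instance) can be assembled. The Python below simulates the pattern in-process; the record fields are hypothetical.

from collections import defaultdict

records = [
    {"patient": "P1", "modality": "CT", "date": "2017-03-01"},
    {"patient": "P2", "modality": "MR", "date": "2017-04-11"},
    {"patient": "P1", "modality": "DX", "date": "2017-05-20"},
]

def map_phase(rec):
    # emit (key, value) pairs keyed by patient ID
    yield rec["patient"], (rec["date"], rec["modality"])

def reduce_phase(key, values):
    # build one chronologically sorted timeline per patient
    return key, sorted(values)

shuffled = defaultdict(list)          # the "shuffle" step of MapReduce
for rec in records:
    for k, v in map_phase(rec):
        shuffled[k].append(v)
print([reduce_phase(k, vs) for k, vs in shuffled.items()])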
Automated global structure extraction for effective local building block processing in XCS.
Butz, Martin V; Pelikan, Martin; Llorà, Xavier; Goldberg, David E
2006-01-01
Learning Classifier Systems (LCSs), such as the accuracy-based XCS, evolve distributed problem solutions represented by a population of rules. During evolution, features are specialized, propagated, and recombined to provide increasingly accurate subsolutions. Recently, it was shown that, as in conventional genetic algorithms (GAs), some problems require efficient processing of subsets of features to find problem solutions efficiently. In such problems, standard variation operators of genetic and evolutionary algorithms used in LCSs suffer from potential disruption of groups of interacting features, resulting in poor performance. This paper introduces efficient crossover operators to XCS by incorporating techniques derived from competent GAs: the extended compact GA (ECGA) and the Bayesian optimization algorithm (BOA). Instead of simple crossover operators such as uniform crossover or one-point crossover, ECGA or BOA-derived mechanisms are used to build a probabilistic model of the global population and to generate offspring classifiers locally using the model. Several offspring generation variations are introduced and evaluated. The results show that it is possible to achieve performance similar to runs with an informed crossover operator that is specifically designed to yield ideal problem-dependent exploration, exploiting provided problem structure information. Thus, we create the first competent LCSs, XCS/ECGA and XCS/BOA, that detect dependency structures online and propagate corresponding lower-level dependency structures effectively without any information about these structures given in advance.
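The model-based offspring generation described above can be sketched in a reduced form: estimate a marginal product model over known linkage groups from the parent population and sample offspring group-wise, so that interacting bits always travel together. In this Python illustration the population and the grouping are invented and the groups are hand-given; ECGA itself learns the grouping by minimizing a description-length score, and BOA learns a Bayesian network instead.

import random

population = ["110011", "110000", "001111", "001100", "110011"]
groups = [(0, 1, 2), (3, 4, 5)]          # hypothetical linkage groups

def sample_offspring(pop, groups, rng=random):
    # copy each linkage group jointly from one randomly chosen parent,
    # which samples group configurations with their population frequencies
    child = [None] * len(pop[0])
    for g in groups:
        donor = rng.choice(pop)
        for i in g:
            child[i] = donor[i]
    return "".join(child)

random.seed(0)
print([sample_offspring(population, groups) for _ in range(4)])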
Brain white matter structure and information processing speed in healthy older age.
Kuznetsova, Ksenia A; Maniega, Susana Muñoz; Ritchie, Stuart J; Cox, Simon R; Storkey, Amos J; Starr, John M; Wardlaw, Joanna M; Deary, Ian J; Bastin, Mark E
2016-07-01
Cognitive decline, especially the slowing of information processing speed, is associated with normal ageing. This decline may be due to brain cortico-cortical disconnection caused by age-related white matter deterioration. We present results from a large, narrow age range cohort of generally healthy, community-dwelling subjects in their seventies who also had their cognitive ability tested in youth (age 11 years). We investigate associations between older age brain white matter structure, several measures of information processing speed and childhood cognitive ability in 581 subjects. Analysis of diffusion tensor MRI data using Tract-based Spatial Statistics (TBSS) showed that all measures of information processing speed, as well as a general speed factor composed from these tests (g speed), were significantly associated with fractional anisotropy (FA) across the white matter skeleton rather than in specific tracts. Cognitive ability measured at age 11 years was not associated with older age white matter FA, except for the g speed-independent components of several individual processing speed tests. These results indicate that quicker and more efficient information processing requires global connectivity in older age, and that associations between white matter FA and information processing speed (both individual test scores and g speed), unlike some other aspects of later life brain structure, are generally not accounted for by cognitive ability measured in youth.
Utility-based early modulation of processing distracting stimulus information.
Wendt, Mike; Luna-Rodriguez, Aquiles; Jacobsen, Thomas
2014-12-10
Humans are selective information processors who efficiently prevent goal-inappropriate stimulus information from gaining control over their actions. Nonetheless, stimuli which are both unnecessary for solving a current task and liable to cue an incorrect response (i.e., "distractors") frequently modulate task performance, even when consistently paired with a physical feature that makes them easily discernible from target stimuli. Current models of cognitive control assume adjustment of the processing of distractor information based on the overall distractor utility (e.g., predictive value regarding the appropriate response, likelihood to elicit conflict with target processing). Although studies on distractor interference have supported the notion of utility-based processing adjustment, previous evidence is inconclusive regarding the specificity of this adjustment for distractor information and the stage(s) of processing affected. To assess the processing of distractors during sensory-perceptual phases, we applied EEG recording in a stimulus identification task, involving successive distractor-target presentation, and manipulated the overall distractor utility. Behavioral measures replicated previously found utility modulations of distractor interference. Crucially, distractor-evoked visual potentials (i.e., posterior N1) were more pronounced in high-utility than low-utility conditions. This effect generalized to distractors unrelated to the utility manipulation, providing evidence for item-unspecific adjustment of early distractor processing to the experienced utility of distractor information. Copyright © 2014 the authors.
Ju, Feng; Lee, Hyo Kyung; Yu, Xinhua; Faris, Nicholas R; Rugless, Fedoria; Jiang, Shan; Li, Jingshan; Osarogiagbon, Raymond U
2017-12-01
The process of lung cancer care from initial lesion detection to treatment is complex, involving multiple steps, each introducing the potential for substantial delays. Identifying the steps with the greatest delays enables a focused effort to improve the timeliness of care delivery without sacrificing quality. We retrospectively reviewed clinical events from initial detection, through histologic diagnosis, radiologic and invasive staging, and medical clearance, to surgery for all patients who had an attempted resection of a suspected lung cancer in a community healthcare system. We used a computer process modeling approach to evaluate delays in care delivery, in order to identify potential 'bottlenecks' in waiting time, the reduction of which could produce greater care efficiency. We also conducted 'what-if' analyses to predict the relative impact of simulated changes in the care delivery process to determine the most efficient pathways to surgery. The waiting time between radiologic lesion detection and diagnostic biopsy and the waiting time from radiologic staging to surgery were the two most critical bottlenecks impeding efficient care delivery (reducing these had more than 3 times the impact of reducing other waiting times). Additionally, instituting surgical consultation prior to cardiac consultation for medical clearance, and decreasing the waiting time between CT scans and diagnostic biopsies, were potentially the most impactful measures to reduce care delays before surgery. Rigorous computer simulation modeling, using clinical data, can provide useful information to identify areas for improving the efficiency of care delivery by process engineering, for patients who receive surgery for lung cancer.
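A back-of-envelope version of such a 'what-if' analysis treats each care step as an M/M/1 queue, where the mean time in system is W = 1/(mu - lambda); speeding up the step with the tightest margin saves the most waiting time. The Python sketch below uses invented arrival and service rates and is far simpler than the paper's process model.

arrival = 2.0                                    # patients per week (invented)
steps = {"biopsy": 2.5, "staging": 3.0, "clearance": 4.0, "surgery": 2.2}

def total_wait(service_rates, lam):
    # sum of M/M/1 mean times in system, one queue per care step
    return sum(1.0 / (mu - lam) for mu in service_rates.values())

base = total_wait(steps, arrival)
for name in steps:
    faster = dict(steps, **{name: steps[name] * 1.2})   # 20% speed-up of one step
    saved = base - total_wait(faster, arrival)
    print(f"speeding up {name}: {saved:.2f} weeks saved per patient")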
Process-driven information management system at a biotech company: concept and implementation.
Gobbi, Alberto; Funeriu, Sandra; Ioannou, John; Wang, Jinyi; Lee, Man-Ling; Palmer, Chris; Bamford, Bob; Hewitt, Robin
2004-01-01
While established pharmaceutical companies have chemical information systems in place to manage their compounds and the associated data, new startup companies need to implement these systems from scratch. Decisions made early in the design phase usually have long lasting effects on the expandability, maintenance effort, and costs associated with the information management system. Careful analysis of work and data flows, both inter- and intradepartmental, and identification of existing dependencies between activities are important. This knowledge is required to implement an information management system, which enables the research community to work efficiently by avoiding redundant registration and processing of data and by timely provision of the data whenever needed. This paper first presents the workflows existing at Anadys, then ARISE, the research information management system developed in-house at Anadys. ARISE was designed to support the preclinical drug discovery process and covers compound registration, analytical quality control, inventory management, high-throughput screening, lower throughput screening, and data reporting.
NASA Technical Reports Server (NTRS)
Bryant, N. A.; Zobrist, A. L.
1978-01-01
The paper describes the development of an image based information system and its use to process a Landsat thematic map showing land use or land cover in conjunction with a census tract polygon file to produce a tabulation of land use acreages per census tract. The system permits the efficient cross-tabulation of two or more geo-coded data sets, thereby setting the stage for the practical implementation of models of diffusion processes or cellular transformation. Characteristics of geographic information systems are considered, and functional requirements, such as data management, geocoding, image data management, and data analysis are discussed. The system is described, and the potentialities of its use are examined.
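The core cross-tabulation is essentially a two-dimensional histogram over co-registered rasters: for every (census tract, land-cover class) pair, count cells and multiply by the cell area. A minimal Python sketch with invented rasters and cell size:

import numpy as np

tracts = np.array([[1, 1, 2], [1, 2, 2], [3, 3, 2]])   # tract ID per grid cell
cover  = np.array([[0, 1, 1], [0, 0, 1], [1, 1, 0]])   # land-cover class per cell
acres_per_cell = 1.1                                   # hypothetical cell area

for t in np.unique(tracts):
    counts = np.bincount(cover[tracts == t], minlength=2)
    print(f"tract {t}: {counts * acres_per_cell} acres per land-cover class")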
Applications for radio-frequency identification technology in the perioperative setting.
Zhao, Tiyu; Zhang, Xiaoxiang; Zeng, Lili; Xia, Shuyan; Hinton, Antentor Othrell; Li, Xiuyun
2014-06-01
We implemented a two-year project to develop a security-gated management system for the perioperative setting using radio-frequency identification (RFID) technology to enhance the management efficiency of the OR. We installed RFID readers beside the entrances to the OR and changing areas to receive and process signals from the RFID tags that we sewed into surgical scrub attire and shoes. The system also required integrating automatic access control panels, computerized lockers, light-emitting diode (LED) information screens, wireless networks, and an information system. By doing this, we are able to control the flow of personnel and materials more effectively, reduce OR costs, optimize the registration and attire-changing process for personnel, and improve management efficiency. We also anticipate this system will improve patient safety by reducing the risk of surgical site infection. Application of security-gated management systems is an important and effective way to help ensure a clean, convenient, and safe management process to manage costs in the perioperative area and promote patient safety. Copyright © 2014 AORN, Inc. Published by Elsevier Inc. All rights reserved.
[Traditional Chinese Medicine data management policy in big data environment].
Liang, Yang; Ding, Chang-Song; Huang, Xin-di; Deng, Le
2018-02-01
As traditional data management models cannot effectively manage the massive data of traditional Chinese medicine (TCM), owing to the uncertainty of data object attributes and the diversity and abstraction of data representation, a management strategy for TCM data based on big data technology is proposed. Grounded in the true characteristics of TCM data, this strategy addresses the uncertainty of data object attributes in TCM information and the non-uniformity of data representation by exploiting the schema-free (modeless) storage of objects in big data technology. A hybrid indexing mode is used to resolve the conflicts brought by different storage modes during the indexing process, while efficient parallel MapReduce processing provides powerful query processing over massive data. A theoretical analysis provides the management framework and its key technology, and performance was tested on Hadoop using several common traditional Chinese medicines and prescriptions from a practical TCM data source. Results showed that this strategy can effectively solve the storage problem of TCM information, with good performance in query efficiency, completeness and robustness. Copyright© by the Chinese Pharmaceutical Association.
ERIC Educational Resources Information Center
Wilkinson, Krista M.; O'Neill, Tara; McIlvane, William J.
2014-01-01
Purpose: Many individuals with communication impairments use aided augmentative and alternative communication (AAC) systems involving letters, words, or line drawings that rely on the visual modality. It seems reasonable to suggest that display design should incorporate information about how users attend to and process visual information. The…
Efficient Caption-Based Retrieval of Multimedia Information
1993-10-09
in the design of transportable natural language interfaces. Artificial Intelligence, 32 (1987), 173-243. ...Jones, M. and Eisner, J. A...systems for multimedia data. They exploit captions on the data and perform natural-language processing of them and English retrieval requests. Some content analysis of the data is also performed to obtain additional descriptive information. The key to getting this approach to work is sufficiently
Office of the CIO: Setting the Vision
NASA Technical Reports Server (NTRS)
Rinaldi, James J.
2006-01-01
This slide presentation reviews the vision of the Office of JPL's Chief Information Officer for the future of information technology (IT) at JPL. This includes a strong working relationship with industry to provide cost-efficient and effective IT services, a vision of taking the desktop to the next level and the process to achieve it, and ensuring that JPL becomes a world-class IT provider.
Unmanned Underwater Vehicle (UUV) Information Study
2014-11-28
Maritime Unmanned System; NATO North Atlantic Treaty Organization; ...Unmanned Aerial System; UDA Underwater Domain Awareness; UNISIPS Unified Sonar Image Processing System; USV Unmanned Surface Vehicle; UUV Unmanned Underwater...data distribution to ashore systems, such as the delay, its impact and the benefits to the overall MDA and required metadata for efficient search and
Test of VPHGS in SHSG for use at cryogenic temperatures
NASA Astrophysics Data System (ADS)
Insaustia, Maider; Garzón, Francisco; Mas-Abellán, P.; Madrigal, R.; Fimia, A.
2017-05-01
Silver halide sensitized gelatin (SHSG) processes are interesting because they combine the spectral and energetic sensitivity of photographic emulsions with the good optical quality and high diffraction efficiency of dichromated gelatin (DCG). Previous papers demonstrated that it is possible to obtain diffraction efficiencies near 90% with Agfa-Gevaert plates and Colour Holographic plates in SHSG transmission gratings. In this communication, we report on the performance, measured at room temperature and in cryogenic conditions, of a set of volume phase holographic gratings (VPHGs) manufactured with the SHSG process and aimed at use in astronomical instrumentation. Two sets of diffraction gratings were manufactured using different processing: the first with the SHSG process and the second with a typical bleach process (developed with AAC and bleached in R-10). In both cases the plates were BB640 ultrafine-grain emulsions with a nominal thickness of 9 μm. The recording was performed in an asymmetric geometry with 30° between the light beams at a wavelength of 632.8 nm (He-Ne laser), giving rise to a spatial frequency of 800 l/mm. The exposure ranged from 46 to 2048 μJ/cm². The results give us information about Bragg-plane modification and the reduction of diffraction efficiency when the VPHGs are brought to 77 K. In the case of the SHSG process, the final diffraction efficiency after exposure to cryogenic temperature is, at some exposure energies, better than the previous measurements at room temperature. This experimental result opens the possibility of applying the SHSG process in astrophysics applications.
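For context, the diffraction efficiency of such gratings is usually discussed with Kogelnik's coupled-wave result; for a lossless, unslanted volume phase transmission grating read out at the Bragg angle it reduces to

\eta = \sin^{2}\!\left(\frac{\pi\,\Delta n\,d}{\lambda\,\cos\theta_{B}}\right)

where \Delta n is the refractive-index modulation, d the grating thickness (nominally 9 μm here), \lambda the replay wavelength and \theta_{B} the Bragg angle inside the medium. This is the standard textbook expression rather than a formula from the paper itself; it makes plausible why cooling-induced changes in \Delta n or in the Bragg-plane geometry would shift the measured efficiency.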
Pan, Jinger; Laubrock, Jochen; Yan, Ming
2016-08-01
We examined how reading mode (i.e., silent vs. oral reading) influences parafoveal semantic and phonological processing during the reading of Chinese sentences, using the gaze-contingent boundary paradigm. In silent reading, we found in 2 experiments that reading times on target words were shortened with semantic previews in early and late processing, whereas phonological preview effects mainly occurred in gaze duration or second-pass reading. In contrast, results showed that phonological preview information is obtained early on in oral reading. Strikingly, in oral reading, we observed a semantic preview cost on the target word in Experiment 1 and a decrease in the effect size of preview benefit from first- to second-pass measures in Experiment 2, which we hypothesize to result from increased preview duration. Taken together, our results indicate that parafoveal semantic information can be obtained irrespective of reading mode, whereas readers more efficiently process parafoveal phonological information in oral reading. We discuss implications for notions of information processing priority and saccade generation during silent and oral reading. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Analysis of a document/reporting system
NASA Technical Reports Server (NTRS)
Narrow, B.
1971-01-01
An in-depth analysis of the information system within the Data Processing Branch is presented. Quantitative measures are used to evaluate the efficiency and effectiveness of the information system. It is believed that this is the first documented study which utilizes quantitative measures for full scale system analysis. The quantitative measures and techniques for collecting and qualifying the basic data, as described, are applicable to any information system. Therefore this report is considered to be of interest to any persons concerned with the management design, analysis or evaluation of information systems.
NASA Technical Reports Server (NTRS)
Scholz, A. L.; Hart, M. T.; Lowry, D. J.
1987-01-01
The Technology Information Sheet was assembled in database format during Phase I. This document was designed to provide a repository for information pertaining to 144 Operations and Maintenance Instructions (OMI) controlled operations in the Orbiter Processing Facility (OPF), Vehicle Assembly Building (VAB), and PAD. It provides a way to accumulate information about required crew sizes, operations task time duration (serial and/or parallel), special Ground Support Equipment (GSE) required, and identification of a potential application of existing technology or the need for the development of a new technology item.
Hernández-Sancho, Francesc; Molinos-Senante, María; Sala-Garrido, Ramón
2010-01-15
Economic research into the design and implementation of policies for the efficient management of water resources has been emphasized by the European Water Framework Directive (Directive 2000/60/EC). The efficient implementation of policies to prevent the degradation and depletion of water resources requires determining their value in social and economic terms and incorporating this information into the decision-making process. A process of wastewater treatment has many associated environmental benefits. However, these benefits are often not calculated because they are not set by the market, due to inadequate property rights, the presence of externalities, and the lack of perfect information. Nevertheless, the valuation of these benefits is necessary to justify a suitable investment policy and a limited number of studies exist on the subject of the economic valuation of environmental benefits. In this paper, we propose a methodology based on the estimation of shadow prices for the pollutants removed in a treatment process. This value represents the environmental benefit (avoided cost) associated with undischarged pollution. This is a pioneering approach to the economic valuation of wastewater treatment. The comparison of these benefits with the internal costs of the treatment process will provide a useful indicator for the feasibility of wastewater treatment projects. Copyright 2009 Elsevier B.V. All rights reserved.
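Schematically, and with symbols chosen here purely for illustration, the proposed feasibility indicator compares the environmental benefit, valued at shadow prices, with the internal cost of treatment:

EB = \sum_{i=1}^{n} p_{i}\,q_{i}, \qquad EB - C_{\mathrm{int}} > 0 \;\Rightarrow\; \text{the treatment project is economically justified}

where q_{i} is the quantity of pollutant i removed (undischarged), p_{i} its estimated shadow price (the avoided cost per unit), and C_{\mathrm{int}} the internal cost of the treatment process. The paper's contribution lies in estimating the p_{i}; this display only renders the final avoided-cost comparison.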
The usability axiom of medical information systems.
Pantazi, Stefan V; Kushniruk, Andre; Moehr, Jochen R
2006-12-01
In this article we begin by connecting the concept of simplicity of user interfaces of information systems with that of usability, and the concept of complexity of problem-solving in information systems with the concept of usefulness. We continue by stating "the usability axiom" of medical information technology: information systems must be, at the same time, usable and useful. We then try to show why, given existing technology, the axiom is a paradox, and we continue by analysing and reformulating it several times from more fundamental information processing perspectives. We underline the importance of the concept of representation and demonstrate the need for context-dependent representations. By means of thought experiments and examples, we advocate the need for context-dependent information processing and argue for the relevance of algorithmic information theory and case-based reasoning in this context. Further, we introduce the notion of concept spaces and offer a pragmatic perspective on context-dependent representations. We conclude that the efficient management of concept spaces may help with the solution to the medical information technology paradox. Finally, we propose a view of informatics centred on the concepts of context-dependent information processing and management of concept spaces that aligns well with existing knowledge-centric definitions of informatics in general and medical informatics in particular. In effect, our view extends M. Musen's proposal and proposes a definition of Medical Informatics as context-dependent medical information processing. The axiom that medical information systems must be, at the same time, useful and usable is a paradox, and its investigation by means of examples and thought experiments leads to the recognition of the crucial importance of context-dependent information processing. On the premise that context-dependent information processing equates to knowledge processing, this view defines Medical Informatics as context-dependent medical information processing, which aligns well with existing knowledge-centric definitions of our field.
Information spread of emergency events: path searching on social networks.
Dai, Weihui; Hu, Hongzhi; Wu, Tunan; Dai, Yonghui
2014-01-01
Emergencies attract global attention from governments and the public, and can easily trigger a series of serious social problems if the dissemination process is not supervised effectively. In the Internet world, people communicate with each other and form various virtual communities based on social networks, which leads to complex and fast patterns of information spread for emergency events. This paper collects Internet data using data acquisition and topic detection technology, analyzes the process of information spread on social networks, describes the diffusion and impact of that information from the perspective of random graphs, and finally seeks the key paths through an improved IBF algorithm. Application cases have shown that this algorithm can search the shortest spread paths efficiently, which may help to guide and control the information dissemination of emergency events for early warning.
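The abstract does not spell out the improved IBF algorithm, so as a generic stand-in the Python sketch below runs standard Dijkstra shortest-path search over a weighted propagation graph; node names and edge weights (propagation delays) are invented.

import heapq

graph = {"src": [("u1", 2), ("u2", 5)],
         "u1":  [("u2", 1), ("u3", 4)],
         "u2":  [("u3", 1)],
         "u3":  []}

def shortest_paths(g, start):
    # classic Dijkstra: pop the closest unsettled node, relax its edges
    dist, heap = {start: 0}, [(0, start)]
    while heap:
        d, v = heapq.heappop(heap)
        if d > dist.get(v, float("inf")):
            continue                      # stale heap entry
        for w, weight in g[v]:
            nd = d + weight
            if nd < dist.get(w, float("inf")):
                dist[w] = nd
                heapq.heappush(heap, (nd, w))
    return dist

print(shortest_paths(graph, "src"))       # fastest spread time to each user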
An Airborne Onboard Parallel Processing Testbed
NASA Technical Reports Server (NTRS)
Mandl, Daniel J.
2014-01-01
This presentation provides information on the progress of the Intelligent Payload Module (IPM) development effort. In addition, a vision is presented for integrating the IPM architecture with the GeoSocial Application Program Interface (API) architecture to enable efficient distribution of satellite data products.
Information needs to support state and local transportation decision making into the 21st century
DOT National Transportation Integrated Search
1997-03-01
The Intermodal Surface Transportation Efficiency Act of 1991 (ISTEA) established new requirements for data development and dissemination that have had an impact on federal, state, and local transportation planning processes across the United States. ...
Seven Steps for Success: Selecting IT Consultants
ERIC Educational Resources Information Center
Moriarty, Daniel F.
2004-01-01
Information technology (IT) presents community colleges with both powerful opportunities and formidable challenges. The prospects of expedited and more efficient business processes, greater student access through distance learning, improved communication, and strengthened relationships with students can embolden the most hesitant college…
The Performance and Registration Information Systems Management (PRISM) pilot demonstration project
DOT National Transportation Integrated Search
1999-12-01
The Intermodal Surface Transportation Efficiency Act of 1991 mandated a study to explore the potential of the commercial motor vehicle (CMV) registration process as a safety enforcement tool for reducing CMV accidents. The project sought to establish...
IEEE TRANSACTIONS ON CYBERNETICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craig R. Rieger; David H. Scheidt; William D. Smart
2014-11-01
Modern societies depend on complex and critical infrastructures for energy, transportation, sustenance, medical care, emergency response, communications, and security. As computers, automation, and information technology (IT) have advanced, these technologies have been exploited to enhance the efficiency of operating the processes that make up these infrastructures.
Trends towards Distance Education of Nursing Education in Turkey
ERIC Educational Resources Information Center
Senyuva, Emine
2011-01-01
In the contemporary world, developments, changes and transformations (globalization, advances in information and communication technologies, the diversification of educational environments, and the move toward compulsory life-long education) raise questions about the efficiency and effectiveness of the learning-teaching process, while the educational…
Gyurko, David M; Soti, Csaba; Stetak, Attila; Csermely, Peter
2014-05-01
During the last decade, network approaches became a powerful tool to describe protein structure and dynamics. Here, we first describe the protein structure networks of molecular chaperones, then characterize chaperone-containing sub-networks of interactomes, called chaperone networks or chaperomes. We review the role of molecular chaperones in the short-term adaptation of cellular networks in response to stress, and in long-term adaptation, discussing their putative functions in the regulation of evolvability. We provide a general overview of possible network mechanisms of adaptation, learning and memory formation. We propose that changes in network rigidity play a key role in learning and memory formation processes. A flexible network topology provides a 'learning-competent' state. Here, networks may have much weaker modular boundaries than locally rigid, highly modular networks, where the learnt information has already been consolidated in a memory formation process. Since modular boundaries are efficient filters of information, information filtering may be much weaker in the 'learning-competent' state than after memory formation. This mechanism restricts high information transfer to the 'learning-competent' state. After memory formation, modular boundary-induced segregation and information filtering protect the stored information. The flexible networks of young organisms are generally in a 'learning-competent' state. In contrast, the locally rigid networks of old organisms have lost their 'learning-competent' state, but store and protect their learnt information efficiently. We anticipate that the above mechanism may operate at the level of both protein-protein interaction and neuronal networks.
NASA Astrophysics Data System (ADS)
Hoffman, Kenneth J.
1995-10-01
Few information systems create a standardized clinical patient record in which there are discrete and concise observations of patient problems and their resolution. Clinical notes usually are narratives that do not support an aggregate and systematic outcome analysis. Many programs collect information on diagnoses and coded procedures but are not focused on patient problems. Integrated definition (IDEF) methodology has been accepted by the Department of Defense as part of the Corporate Information Management Initiative and serves as the foundation that establishes a need for automation. We used IDEF modeling to describe present and idealized patient care activities. A logical IDEF data model was created to support those activities. The modeling process allows for accurate cost estimates based upon performed activities, efficient collection of relevant information, and outputs that allow real-time assessments of process and outcomes. This model forms the foundation for a prototype automated clinical information system (ACIS).
Zubek, Julian; Denkiewicz, Michał; Barański, Juliusz; Wróblewski, Przemysław; Rączaszek-Leonardi, Joanna; Plewczynski, Dariusz
2017-01-01
This paper explores how the information flow properties of a network affect the formation of categories shared between individuals who communicate through that network. Our work is based on an established multi-agent model of the emergence of linguistic categories grounded in an external environment. We study how network information propagation efficiency and the direction of information flow affect categorization by performing simulations with idealized network topologies optimizing certain network centrality measures. We measure dynamic social adaptation when either the network topology or the environment is subject to change during the experiment and the system has to adapt to new conditions. We find that both a decentralized network topology that is efficient in information propagation and the presence of a central authority (information flow from the center to the peripheries) are beneficial for the formation of global agreement between agents. Systems with a central authority cope well with network topology change, but are less robust in the case of environment change. These findings help to understand which network properties affect processes of social adaptation, and they inform the debate on the advantages and disadvantages of centralized systems.
Kannampallil, Thomas G; Franklin, Amy; Mishra, Rashmi; Almoosa, Khalid F; Cohen, Trevor; Patel, Vimla L
2013-01-01
Information in critical care environments is distributed across multiple sources, such as paper charts, electronic records, and support personnel. For decision-making tasks, physicians have to seek, gather, filter and organize information from various sources in a timely manner. The objective of this research is to characterize the nature of physicians' information seeking process, and the content and structure of clinical information retrieved during this process. Eight medical intensive care unit physicians provided a verbal think-aloud as they performed a clinical diagnosis task. Verbal descriptions of physicians' activities, sources of information they used, time spent on each information source, and interactions with other clinicians were captured for analysis. The data were analyzed using qualitative and quantitative approaches. We found that the information seeking process was exploratory and iterative and driven by the contextual organization of information. While there were no significant differences in the overall time spent on paper versus electronic records, there was a marginally greater relative information gain (i.e., more unique information retrieved per unit time) from electronic records (t(6)=1.89, p=0.1). Additionally, information retrieved from electronic records was at a higher level (i.e., observations and findings) in the knowledge structure than that from paper records, reflecting differences in the nature of knowledge utilization across resources. A process of local optimization drove the information seeking process: physicians utilized information that maximized their information gain even though it required significantly more cognitive effort. Implications for the design of health information technology solutions that seamlessly integrate information seeking activities within the workflow, such as enriching the clinical information space and supporting efficient clinical reasoning and decision-making, are discussed. Copyright © 2012 Elsevier B.V. All rights reserved.
Modeling of information diffusion in Twitter-like social networks under information overload.
Li, Pei; Li, Wei; Wang, Hui; Zhang, Xin
2014-01-01
Due to the existence of information overload in social networks, it becomes increasingly difficult for users to find useful information according to their interests. This paper takes Twitter-like social networks into account and proposes models to characterize the process of information diffusion under information overload. Users are classified into different types according to their in-degrees and out-degrees, and user behaviors are generalized into two categories: generating and forwarding. A view scope is introduced to model the user information-processing capability under information overload, and the average number of times a message appears in view scopes after it is generated by a user of a given type is adopted to characterize the information diffusion efficiency, which is calculated theoretically. To verify the accuracy of the theoretical analysis, we conduct simulations and provide the simulation results, which are in excellent agreement with the theoretical results. These results are important for understanding the diffusion dynamics in social networks, and this analysis framework can be extended to consider more realistic situations.
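As a hedged illustration of the view-scope idea, the following minimal simulation (not the paper's code; the network, view-scope length, and forwarding probability are invented for illustration) counts how often a single message lands in users' view scopes when overload makes messages scroll out of view before being read:

import random

def simulate(n_users=500, n_follow=10, view_len=5, p_forward=0.2, seed=1):
    rng = random.Random(seed)
    # followers[u] = users who see what u posts
    followers = [rng.sample(range(n_users), n_follow) for _ in range(n_users)]
    received = [0] * n_users      # messages pushed into each user's view scope
    appearances = 0
    frontier, seen = [0], {0}     # user 0 generates the message
    while frontier:
        nxt = []
        for u in frontier:
            for f in followers[u]:
                received[f] += 1
                appearances += 1
                # Crude overload model: the chance the message is still in
                # f's view scope when f reads decays with how much f receives.
                p_seen = min(1.0, view_len / received[f])
                if f not in seen and rng.random() < p_seen * p_forward:
                    seen.add(f)
                    nxt.append(f)
        frontier = nxt
    return appearances            # proxy for diffusion efficiency

print(simulate())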
Accounting Information Systems in Healthcare: A Review of the Literature.
Hammour, Hadal; Househ, Mowafa; Razzak, Hira Abdul
2017-01-01
As information technology progresses in Saudi Arabia, manual accounting systems have become gradually inadequate for decision needs. Subsequently, private and public healthcare divisions in Saudi Arabia perceive computerized accounting information systems (CAIS) as a vehicle to safeguard the efficient and effective flow of information during the analysis, processing, and recording of financial data. An efficient and effective flow of information improves the decision making of staff, thereby improving the capability of health care sectors to reduce the cost of medical services. In this paper, we define computerized accounting systems from the point of view of health informatics and discuss the challenges and benefits of supporting CAIS applications in hospitals of Saudi Arabia. With these elements, we conclude that CAIS in Saudi Arabia can serve as a valuable tool for evaluating and controlling the cost of medical services in healthcare sectors. Supplementary education on the significance of having computerized accounting systems within hospitals for nurses, doctors, accountants and other health care staff is warranted in the future.
Chappell, Jackie; Demery, Zoe P; Arriola-Rios, Veronica; Sloman, Aaron
2012-02-01
Imagine a situation in which you had to design a physical agent that could collect information from its environment, then store and process that information to help it respond appropriately to novel situations. What kinds of information should it attend to? How should the information be represented so as to allow efficient use and re-use? What kinds of constraints and trade-offs would there be? There are no unique answers. In this paper, we discuss some of the ways in which the need to be able to address problems of varying kinds and complexity can be met by different information processing systems. We also discuss different ways in which relevant information can be obtained, and how different kinds of information can be processed and used, by both biological organisms and artificial agents. We analyse several constraints and design features, and show how they relate both to biological organisms, and to lessons that can be learned from building artificial systems. Our standpoint overlaps with Karmiloff-Smith (1992) in that we assume that a collection of mechanisms geared to learning and developing in biological environments are available in forms that constrain, but do not determine, what can or will be learnt by individuals. Copyright © 2011 Elsevier B.V. All rights reserved.
O'Connor, Sydney; Ayres, Alison; Cortellini, Lynelle; Rosand, Jonathan; Rosenthal, Eric; Kimberly, W Taylor
2012-08-01
Reliable and efficient data repositories are essential for the advancement of research in Neurocritical care. Various factors, such as the large volume of patients treated within the neuro ICU, their differing length and complexity of hospital stay, and the substantial amount of desired information can complicate the process of data collection. We adapted the tools of process improvement to the data collection and database design of a research repository for a Neuroscience intensive care unit. By the Shewhart-Deming method, we implemented an iterative approach to improve the process of data collection for each element. After an initial design phase, we re-evaluated all data fields that were challenging or time-consuming to collect. We then applied root-cause analysis to optimize the accuracy and ease of collection, and to determine the most efficient manner of collecting the maximal amount of data. During a 6-month period, we iteratively analyzed the process of data collection for various data elements. For example, the pre-admission medications were found to contain numerous inaccuracies after comparison with a gold standard (sensitivity 71% and specificity 94%). Also, our first method of tracking patient admissions and discharges contained higher than expected errors (sensitivity 94% and specificity 93%). In addition to increasing accuracy, we focused on improving efficiency. Through repeated incremental improvements, we reduced the number of subject records that required daily monitoring from 40 to 6 per day, and decreased daily effort from 4.5 to 1.5 h/day. By applying process improvement methods to the design of a Neuroscience ICU data repository, we achieved a threefold improvement in efficiency and increased accuracy. Although individual barriers to data collection will vary from institution to institution, a focus on process improvement is critical to overcoming these barriers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Margaret; Spurlock, C. Anna; Yang, Hung-Chia
The dual purpose of this project was to contribute to basic knowledge about the interaction between regulation and innovation and to inform the cost and benefit expectations related to technical change that are embedded in the rulemaking process of an important area of national regulation. The area of regulation focused on here is minimum efficiency performance standards (MEPS) for appliances and other energy-using products. Relevant to both U.S. climate policy and energy policy for buildings, MEPS remove from the market certain product models that do not meet specified efficiency thresholds.
White-Matter Structural Connectivity Underlying Human Laughter-Related Traits Processing.
Wu, Ching-Lin; Zhong, Suyu; Chan, Yu-Chen; Chen, Hsueh-Chih; Gong, Gaolang; He, Yong; Li, Ping
2016-01-01
Most research into the neural mechanisms of humor has not explicitly focused on the association between emotion and humor, or on the brain white-matter networks mediating this connection. However, this connection is especially salient in gelotophobia (the fear of being laughed at), which is regarded as the presentation of humorlessness, and in two related traits, gelotophilia (the enjoyment of being laughed at) and katagelasticism (the enjoyment of laughing at others). Here, we explored whether the topological properties of white-matter networks can account for individual differences in the laughter-related traits of 31 healthy adults. We observed a significant negative correlation between gelotophobia scores and the clustering coefficient, local efficiency and global efficiency, but a positive association between gelotophobia scores and path length in the brain's white-matter network. Moreover, the current study revealed that with increasing individual fear of being laughed at, the linking efficiencies in the superior frontal gyrus, anterior cingulate cortex, parahippocampal gyrus, and middle temporal gyrus decreased. However, there were no significant correlations between either gelotophilia or katagelasticism scores and the topological properties of the brain white-matter network. These findings suggest that the fear of being laughed at is directly related to the level of local and global information processing of the brain network, which might provide new insights into the neural mechanisms of humor information processing.
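The topological metrics named above are standard graph measures. A hedged sketch of how they are computed with NetworkX follows; the small-world random graph stands in for a subject's DTI-derived connectome, which is an assumption, not the study's data:

import networkx as nx

# Toy 90-node "connectome"; real analyses use white-matter networks from DTI
G = nx.connected_watts_strogatz_graph(n=90, k=6, p=0.1, seed=42)

metrics = {
    "clustering coefficient": nx.average_clustering(G),
    "local efficiency": nx.local_efficiency(G),
    "global efficiency": nx.global_efficiency(G),
    "characteristic path length": nx.average_shortest_path_length(G),
}
for name, value in metrics.items():
    print(f"{name}: {value:.3f}")

Correlating such per-subject metrics with trait scores (e.g., gelotophobia) would then be an ordinary correlation analysis over subjects.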
Wang, Xue; Bi, Dao-wei; Ding, Liang; Wang, Sheng
2007-01-01
The recent availability of low cost and miniaturized hardware has allowed wireless sensor networks (WSNs) to retrieve audio and video data in real world applications, which has fostered the development of wireless multimedia sensor networks (WMSNs). Resource constraints and challenging multimedia data volume make development of efficient algorithms to perform in-network processing of multimedia contents imperative. This paper proposes solving problems in the domain of WMSNs from the perspective of multi-agent systems. The multi-agent framework enables flexible network configuration and efficient collaborative in-network processing. The focus is placed on target classification in WMSNs where audio information is retrieved by microphones. To deal with the uncertainties related to audio information retrieval, the statistical approaches of power spectral density estimates, principal component analysis and Gaussian process classification are employed. A multi-agent negotiation mechanism is specially developed to efficiently utilize limited resources and simultaneously enhance classification accuracy and reliability. The negotiation is composed of two phases, where an auction based approach is first exploited to allocate the classification task among the agents and then individual agent decisions are combined by the committee decision mechanism. Simulation experiments with real world data are conducted and the results show that the proposed statistical approaches and negotiation mechanism not only reduce memory and computation requirements in WMSNs but also significantly enhance classification accuracy and reliability. PMID:28903223
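A hedged sketch of the statistical pipeline named in this abstract (power-spectral-density estimates, principal component analysis, Gaussian process classification) is given below; the synthetic two-class tones, sampling rate, and all parameters are placeholders for real microphone data from sensor nodes:

import numpy as np
from scipy.signal import welch
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessClassifier

rng = np.random.default_rng(0)
fs = 8000

def make_signal(f0):
    # Noisy tone as a stand-in for a target's acoustic signature
    t = np.arange(fs) / fs
    return np.sin(2 * np.pi * f0 * t) + 0.5 * rng.standard_normal(fs)

X_raw = [make_signal(f0) for f0 in [440] * 20 + [880] * 20]
y = np.array([0] * 20 + [1] * 20)

# Feature extraction: Welch PSD estimate per signal
X_psd = np.array([welch(x, fs=fs, nperseg=256)[1] for x in X_raw])

X_feat = PCA(n_components=5).fit_transform(X_psd)      # compress PSD features
clf = GaussianProcessClassifier().fit(X_feat[::2], y[::2])
print("held-out accuracy:", clf.score(X_feat[1::2], y[1::2]))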
NASA Technical Reports Server (NTRS)
Effinger, Michael; Beshears, Ron; Hufnagle, David; Walker, James; Russell, Sam; Stowell, Bob; Myers, David
2002-01-01
Nondestructive characterization techniques have been used to steer development and testing of CMCs. Computed tomography is used to determine the volumetric integrity of the CMC plates and components. Thermography is used to determine the near-surface integrity of the CMC plates and components. For process and material development, information such as density uniformity, part delamination, and dimensional tolerance conformity is generated. The information from the thermography and computed tomography is correlated, and then specimen cutting maps are superimposed on the thermography images. This enables tighter data correlation and potential explanation of off-nominal test data. Examples of using nondestructive characterization to make decisions in process and material development and testing are presented.
Open source cardiology electronic health record development for DIGICARDIAC implementation
NASA Astrophysics Data System (ADS)
Dugarte, Nelson; Medina, Rubén.; Huiracocha, Lourdes; Rojas, Rubén.
2015-12-01
This article presents the development of a Cardiology Electronic Health Record (CEHR) system. The software consists of a structured algorithm designed under Health Level-7 (HL7) international standards. The novelty of the system is the integration of high-resolution ECG (HRECG) signal acquisition and processing tools, patient information management tools and telecardiology tools. The acquisition tools manage and control the DIGICARDIAC electrocardiograph functions. The processing tools allow management of HRECG signal analysis, searching for patterns indicative of cardiovascular pathologies. The incorporation of telecardiology tools allows system communication with other health care centers, decreasing access time to patient information. The CEHR system was completely developed using open source software. Preliminary results of process validation showed the system's efficiency.
Site Suitability Analysis for Beekeeping via Analytical Hierarchy Process, Konya Example
NASA Astrophysics Data System (ADS)
Sarı, F.; Ceylan, D. A.
2017-11-01
Over the past decade, the importance of beekeeping activities has been emphasized in the fields of biodiversity, ecosystems, agriculture and human health. Thus, efficient management and correct decisions about beekeeping activities seem essential to maintain and improve productivity and efficiency. Due to this importance, and considering the economic contributions to rural areas, the need for a suitability analysis concept has been revealed. At this point, the integration of Multi Criteria Decision Analysis (MCDA) and Geographical Information Systems (GIS) provides efficient solutions to the complex structure of the decision-making process for beekeeping activities. In this study, a site suitability analysis via the Analytical Hierarchy Process (AHP) was carried out for the Konya province in Turkey. Slope, elevation, aspect, distance to water resources, roads and settlements, precipitation and flora criteria were included to determine suitability. The requirements, expectations and limitations of beekeeping activities were specified with the participation of experts and stakeholders. The final suitability map was validated with 117 existing beekeeping locations and Turkish Statistical Institute 2016 beekeeping statistics for the Konya province.
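The AHP weighting step used in such studies follows a standard recipe: a pairwise comparison matrix on Saaty's 1-9 scale is reduced to criterion weights via its principal eigenvector, and a consistency ratio checks the expert judgments. A hedged sketch follows; the 3x3 matrix (say, slope vs. water distance vs. flora) is purely illustrative, not the study's actual judgments:

import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])          # Saaty-scale pairwise judgments

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # criterion weights

n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)      # consistency index
RI = 0.58                                 # Saaty's random index for n = 3
print("weights:", w.round(3), "CR:", round(CI / RI, 3))  # CR < 0.1 is acceptable

The final suitability surface is then the weight-sum of the reclassified GIS criterion layers.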
Hurtado, Nereyda; Marchman, Virginia A.; Fernald, Anne
2010-01-01
It is well established that variation in caregivers' speech is associated with language outcomes, yet little is known about the learning principles that mediate these effects. This longitudinal study (n = 27) explores whether Spanish-learning children's early experiences with language predict efficiency in real-time comprehension and vocabulary learning. Measures of mothers' speech at 18 months were examined in relation to children's speech processing efficiency and reported vocabulary at 18 and 24 months. Children of mothers who provided more input at 18 months knew more words and were faster in word recognition at 24 months. Moreover, multiple regression analyses indicated that the influences of caregiver speech on speed of word recognition and vocabulary were largely overlapping. This study provides the first evidence that input shapes children's lexical processing efficiency and that vocabulary growth and increasing facility in spoken word comprehension work together to support the uptake of the information that rich input affords the young language learner. PMID:19046145
Control of coherent information via on-chip photonic–phononic emitter–receivers
Shin, Heedeuk; Cox, Jonathan A.; Jarecki, Robert; Starbuck, Andrew; Wang, Zheng; Rakich, Peter T.
2015-01-01
Rapid progress in integrated photonics has fostered numerous chip-scale sensing, computing and signal processing technologies. However, many crucial filtering and signal delay operations are difficult to perform with all-optical devices. Unlike photons propagating at luminal speeds, GHz-acoustic phonons moving at slower velocities allow information to be stored, filtered and delayed over comparatively smaller length-scales with remarkable fidelity. Hence, controllable and efficient coupling between coherent photons and phonons enables new signal processing technologies that greatly enhance the performance and potential impact of integrated photonics. Here we demonstrate a mechanism for coherent information processing based on travelling-wave photon–phonon transduction, which achieves a phonon emit-and-receive process between distinct nanophotonic waveguides. Using this device physics, which supports GHz frequencies, we create wavelength-insensitive radiofrequency photonic filters with frequency selectivity, narrow linewidth and high power-handling in silicon. More generally, this emit-receive concept is the impetus for enabling new signal processing schemes. PMID:25740405
An Integrated Information System for Supporting Quality Management Tasks
NASA Astrophysics Data System (ADS)
Beyer, N.; Helmreich, W.
2004-08-01
In a competitive environment, well defined processes become the strategic advantage of a company. Hence, targeted Quality Management ensures efficiency, transparency and, ultimately, customer satisfaction. In the particular context of a Space Test Centre, a number of specific Quality Management standards have to be applied. According to the revision of ISO 9001 during 2000, and due to the adaptation of ECSS-Q20-07, process orientation and data analysis are key tasks for ensuring and evaluating the efficiency of a company's processes. In line with these requirements, an integrated management system for accessing the necessary information to support Quality Management and other processes has been established. Some of its test-related features are presented here. Easy access to the integrated management system from any work place at IABG's Space Test Centre is ensured by means of an intranet portal. It comprises a full set of quality-related process descriptions, information on test facilities, emergency procedures, and other relevant information. The portal's web interface provides direct access to a couple of external applications. Moreover, easy updating of all information and low cost maintenance are features of this integrated information system. The timely and transparent management of non-conformances is covered by a dedicated NCR database which incorporates full documentation capability, electronic signature and e-mail notification of concerned staff. A search interface allows for queries across all documented non-conformances. Furthermore, print versions can be generated at any stage in the process, e.g. for distribution to customers. Feedback on customer satisfaction is sought through a web-based questionnaire. The process is initiated by the responsible test manager through submission of an e-mail that contains a hyperlink to a secure website, asking the customer to complete the brief online form, which is directly fed to a database for subsequent evaluation by the Quality Manager. All such information can be processed and presented in an appropriate manner for internal or external audits, as well as for regular management reviews.
Decentralized cooperative TOA/AOA target tracking for hierarchical wireless sensor networks.
Chen, Ying-Chih; Wen, Chih-Yu
2012-11-08
This paper proposes a distributed method for cooperative target tracking in hierarchical wireless sensor networks. The concept of leader-based information processing is adopted to achieve object positioning, considering a cluster-based network topology. Random timers and local information are applied to adaptively select a sub-cluster for the localization task. The proposed energy-efficient tracking algorithm allows each sub-cluster member to locally estimate the target position with a Bayesian filtering framework and a neural networking model, and further performs estimation fusion in the leader node with the covariance intersection algorithm. This paper evaluates the merits and trade-offs of the protocol design towards developing more efficient and practical algorithms for object position estimation.
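Covariance intersection, the fusion rule named above, combines two estimates with unknown cross-correlation via a convex combination of their inverse covariances. A hedged sketch follows; the two input estimates are invented, and the fusion weight is chosen here by a simple grid search minimizing the fused trace:

import numpy as np

def covariance_intersection(x1, P1, x2, P2, n_grid=101):
    best = None
    for w in np.linspace(0.0, 1.0, n_grid):       # candidate fusion weight
        Pinv = w * np.linalg.inv(P1) + (1 - w) * np.linalg.inv(P2)
        P = np.linalg.inv(Pinv)
        if best is None or np.trace(P) < best[0]:
            x = P @ (w * np.linalg.inv(P1) @ x1 + (1 - w) * np.linalg.inv(P2) @ x2)
            best = (np.trace(P), x, P)
    return best[1], best[2]

x1, P1 = np.array([1.0, 2.0]), np.diag([1.0, 4.0])   # estimate from member A
x2, P2 = np.array([1.5, 1.8]), np.diag([3.0, 1.0])   # estimate from member B
x, P = covariance_intersection(x1, P1, x2, P2)
print("fused state:", x.round(3))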
From ecological test site to geographic information system: lessons for the 1980's
Alexander, Robert H.
1981-01-01
Geographic information systems were common elements in two kinds of interdisciplinary regional demonstration projects in the 1970's. Ecological test sites attempted to provide for more efficient remote-sensing data delivery for regional environmental management. Regional environmental systems analysis attempted to formally describe and model the interacting regional social and environmental processes, including the resource-use decision making process. Lessons for the 1980's are drawn from recent evaluations and assessments of these programs, focusing on cost, rates of system development and technology transfer, program coordination, integrative analysis capability, and the involvement of system users and decision makers.
Mathematical programming for the efficient allocation of health care resources.
Stinnett, A A; Paltiel, A D
1996-10-01
Previous discussions of methods for the efficient allocation of health care resources subject to a budget constraint have relied on unnecessarily restrictive assumptions. This paper makes use of established optimization techniques to demonstrate that a general mathematical programming framework can accommodate much more complex information regarding returns to scale, partial and complete indivisibility and program interdependence. Methods are also presented for incorporating ethical constraints into the resource allocation process, including explicit identification of the cost of equity.
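To make the framework concrete, here is a hedged sketch of the simplest case the paper generalizes: maximizing total health benefit subject to a budget with divisible programs, i.e., a linear program. The costs, benefits, and budget are invented numbers; indivisible programs, returns to scale, and equity constraints would require integer variables and additional rows, which the paper's framework accommodates:

import numpy as np
from scipy.optimize import linprog

benefit = np.array([50.0, 30.0, 40.0])   # health benefit per fully funded program
cost = np.array([100.0, 40.0, 90.0])     # program costs
budget = 150.0

# linprog minimizes, so negate the benefits; x_i in [0, 1] is the funding level
res = linprog(c=-benefit, A_ub=cost[None, :], b_ub=[budget], bounds=[(0, 1)] * 3)
print("funding levels:", res.x.round(3), "total benefit:", -res.fun)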
Efficiency improvement of technological preparation of power equipment manufacturing
NASA Astrophysics Data System (ADS)
Milukov, I. A.; Rogalev, A. N.; Sokolov, V. P.; Shevchenko, I. V.
2017-11-01
Competitiveness of power equipment primarily depends on speeding up the development and mastering of new equipment samples and technologies and on enhancement of the organisation and management of design, manufacturing and operation. Current political, technological and economic conditions cause an acute need to change the strategy and tactics of process planning. At the same time, the issues of equipment maintenance, with simultaneous improvement of its efficiency and compatibility with domestically produced components, are considered. In order to solve these problems, using computer-aided process planning systems for process design at all stages of the power equipment life cycle is economically viable. Computer-aided process planning is developed for the purpose of improving process planning by using mathematical methods and optimisation of design and management processes on the basis of CALS technologies, which allows for simultaneous process design, process planning organisation and management based on mathematical and physical modelling of interrelated design objects and production systems. An integration of computer-aided systems providing the interaction of informative and material processes at all stages of the product life cycle is proposed as an effective solution to the challenges in new equipment design and process planning.
Smart Camera Technology Increases Quality
NASA Technical Reports Server (NTRS)
2004-01-01
When it comes to real-time image processing, everyone is an expert. People begin processing images at birth and rapidly learn to control their responses through the real-time processing of the human visual system. The human eye captures an enormous amount of information in the form of light images. In order to keep the brain from becoming overloaded with all the data, portions of an image are processed at a higher resolution than others, such as a traffic light changing colors. In the same manner, image processing products strive to extract the information stored in light in the most efficient way possible. Digital cameras available today capture millions of pixels worth of information from incident light. However, at frame rates of more than a few per second, existing digital interfaces are overwhelmed. All the user can do is store several frames to memory until that memory is full; subsequent information is then lost. New technology pairs existing digital interface technology with an off-the-shelf complementary metal oxide semiconductor (CMOS) imager to provide more than 500 frames per second of specialty image processing. The result is a cost-effective detection system unlike any other.
Effects of visual working memory on brain information processing of irrelevant auditory stimuli.
Qu, Jiagui; Rizak, Joshua D; Zhao, Lun; Li, Minghong; Ma, Yuanye
2014-01-01
Selective attention has traditionally been viewed as a sensory processing modulator that promotes cognitive processing efficiency by favoring relevant stimuli while inhibiting irrelevant stimuli. However, the cross-modal processing of irrelevant information during working memory (WM) has been rarely investigated. In this study, the modulation of irrelevant auditory information by the brain during a visual WM task was investigated. The N100 auditory evoked potential (N100-AEP) following an auditory click was used to evaluate the selective attention to auditory stimulus during WM processing and at rest. N100-AEP amplitudes were found to be significantly affected in the left-prefrontal, mid-prefrontal, right-prefrontal, left-frontal, and mid-frontal regions while performing a high WM load task. In contrast, no significant differences were found between N100-AEP amplitudes in WM states and rest states under a low WM load task in all recorded brain regions. Furthermore, no differences were found between the time latencies of N100-AEP troughs in WM states and rest states while performing either the high or low WM load task. These findings suggested that the prefrontal cortex (PFC) may integrate information from different sensory channels to protect perceptual integrity during cognitive processing.
Sim, Kyoung Mi; Park, Hyun-Seol; Bae, Gwi-Nam; Jung, Jae Hee
2015-11-15
In this study, we demonstrated an antimicrobial nanoparticle-coated electrostatic (ES) air filter. Antimicrobial natural-product Sophora flavescens nanoparticles were produced using an aerosol process, and were continuously deposited onto the surface of air filter media. For the electrostatic activation of the filter medium, a corona discharge electrification system was used before and after antimicrobial treatment of the filter. In the antimicrobial treatment process, the deposition efficiency of S. flavescens nanoparticles on the ES filter was ~12% higher than that on the pristine (Non-ES) filter. In the evaluation of filtration performance using test particles (a nanosized KCl aerosol and submicron-sized Staphylococcus epidermidis bioaerosol), the ES filter showed better filtration efficiency than the Non-ES filter. However, antimicrobial treatment with S. flavescens nanoparticles affected the filtration efficiency of the filter differently depending on the size of the test particles. While the filtration efficiency of the KCl nanoparticles was reduced on the ES filter after the antimicrobial treatment, the filtration efficiency was improved after the recharging process. In summary, we prepared an antimicrobial ES air filter with >99% antimicrobial activity, ~92.5% filtration efficiency (for a 300-nm KCl aerosol), and a ~0.8 mmAq pressure drop (at 13 cm/s). This study provides valuable information for the development of a hybrid air purification system that can serve various functions and be used in an indoor environment. Copyright © 2015 Elsevier B.V. All rights reserved.
Differential effects of ADORA2A gene variations in pre-attentive visual sensory memory subprocesses.
Beste, Christian; Stock, Ann-Kathrin; Ness, Vanessa; Epplen, Jörg T; Arning, Larissa
2012-08-01
The ADORA2A gene encodes the adenosine A(2A) receptor that is highly expressed in the striatum where it plays a role in modulating glutamatergic and dopaminergic transmission. Glutamatergic signaling has been suggested to play a pivotal role in cognitive functions related to the pre-attentive processing of external stimuli. Yet, the precise molecular mechanism of these processes is poorly understood. Therefore, we aimed to investigate whether ADORA2A gene variation has modulating effects on visual pre-attentive sensory memory processing. Studying two polymorphisms, rs5751876 and rs2298383, in 199 healthy control subjects who performed a partial-report paradigm, we find that ADORA2A variation is associated with differences in the efficiency of pre-attentive sensory memory sub-processes. We show that especially the initial visual availability of stimulus information is rendered more efficiently in the homozygous rare genotype groups. Processes related to the transfer of information into working memory and the duration of visual sensory (iconic) memory are compromised in the homozygous rare genotype groups. Our results show a differential genotype-dependent modulation of pre-attentive sensory memory sub-processes. Hence, we assume that this modulation may be due to differential effects of increased adenosine A(2A) receptor signaling on glutamatergic transmission and striatal medium spiny neuron (MSN) interaction. Copyright © 2011 Elsevier B.V. and ECNP. All rights reserved.
Efficient structure from motion on large scenes using UAV with position and pose information
NASA Astrophysics Data System (ADS)
Teng, Xichao; Yu, Qifeng; Shang, Yang; Luo, Jing; Wang, Gang
2018-04-01
In this paper, we exploit prior information from global positioning systems and inertial measurement units to speed up the process of large scene reconstruction from images acquired by unmanned aerial vehicles. We utilize weak pose information and intrinsic parameters to obtain the projection matrix for each view. Because topographic relief can usually be ignored relative to an unmanned aerial vehicle's flight altitude, we assume that the scene is flat and use a weak perspective camera model to obtain projective transformations between two views. Furthermore, we propose an overlap criterion and select potentially matching view pairs among the projectively transformed views. A robust global structure from motion method is used for image-based reconstruction. Our real-world experiments show that the approach is accurate, scalable and computationally efficient. Moreover, the projective transformations between views can also be used to eliminate false matches.
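One standard way to realize the flat-scene idea (a hedged sketch, not necessarily the paper's exact formulation) is the plane-induced homography: with pose priors R, t from GPS/IMU and intrinsics K, a plane with normal n at distance d induces H = K (R - t n^T / d) K^{-1} between two views, which can be used to warp image corners and score overlap for pair selection. All numbers below are invented for illustration:

import numpy as np

K = np.array([[1000.0, 0, 640], [0, 1000.0, 480], [0, 0, 1]])  # intrinsics
R = np.eye(3)                        # relative rotation (IMU prior)
t = np.array([[5.0], [0.0], [0.0]])  # relative translation, metres (GPS prior)
n = np.array([[0.0], [0.0], [1.0]])  # ground-plane normal in camera 1
d = 100.0                            # flight height above the plane, metres

H = K @ (R - (t @ n.T) / d) @ np.linalg.inv(K)   # plane-induced homography

# Project image-1 corners into image 2 to estimate the overlap region
corners = np.array([[0, 0, 1], [1280, 0, 1], [1280, 960, 1], [0, 960, 1]]).T
proj = H @ corners
proj /= proj[2]                      # dehomogenize
print(proj[:2].T.round(1))           # warped corner locations in image 2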
The algorithm of fast image stitching based on multi-feature extraction
NASA Astrophysics Data System (ADS)
Yang, Chunde; Wu, Ge; Shi, Jing
2018-05-01
This paper proposes an improved image registration method combining Hu-invariant-moment contour information with feature point detection, aiming to solve problems in traditional image stitching algorithms such as a time-consuming feature point extraction process, redundant invalid information, and inefficiency. First, the neighborhood of each pixel is used to extract contour information, employing the Hu invariant moments as a similarity measure so that SIFT feature points are extracted only in similar regions. Then the Euclidean distance is replaced with the Hellinger kernel function to improve the initial matching efficiency and obtain fewer mismatched points, and an affine transformation matrix between the images is estimated. Finally, a local color mapping method is adopted to correct uneven exposure, and an improved multiresolution fusion algorithm fuses the mosaic images to realize seamless stitching. Experimental results confirm the high accuracy and efficiency of the proposed method.
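A hedged sketch of two of these ingredients follows: Hu-moment contour signatures and Hellinger-kernel descriptor matching (realized via the well-known RootSIFT transform, under which Euclidean distance equals Hellinger distance on the original descriptors). The paper's exact pipeline may differ; the input images, Canny thresholds, and ratio-test value are assumptions:

import cv2
import numpy as np

def hu_signature(gray):
    # Log-scaled Hu invariant moments of the dominant contour
    contours, _ = cv2.findContours(
        cv2.Canny(gray, 50, 150), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    c = max(contours, key=cv2.contourArea)
    hu = cv2.HuMoments(cv2.moments(c)).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

def root_sift(desc):
    # L1-normalize then take the square root: Euclidean distance on the
    # result equals the Hellinger distance on the original descriptors.
    desc = desc / (np.abs(desc).sum(axis=1, keepdims=True) + 1e-12)
    return np.sqrt(desc).astype(np.float32)

img1 = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)    # hypothetical inputs
img2 = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)
sift = cv2.SIFT_create()
kp1, d1 = sift.detectAndCompute(img1, None)
kp2, d2 = sift.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(root_sift(d1), root_sift(d2), k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]  # Lowe ratio test
print(len(good), "putative matches; Hu signature gap:",
      np.abs(hu_signature(img1) - hu_signature(img2)).sum())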
Special Report: Part One. New Tools for Professionals.
ERIC Educational Resources Information Center
Liskin, Miriam; And Others
1984-01-01
This collection of articles includes an examination of word-processing software; project management software; new expert systems that turn microcomputers into logical, well-informed consultants; simulated negotiation software; telephone management systems; and the physical design of an efficient microcomputer work space. (MBR)
Report #13-P-0167, February 28, 2013. Rule development is one of the Agency’s principal tasks. EPA develops rules to carry out the environmental and public health protection laws passed by Congress.
Teaching English Phrases through SMS
ERIC Educational Resources Information Center
Cig, Enes Kurtay; Guvercin, Selim; Bayimbetov, Berdak; Dos, Bulent
2015-01-01
Achieving the maximum efficiency in teaching a second language (L2) has always been an important issue for educators. Current globalization processes, development of international business relations, political integrations among the various countries throughout the world, and the abilities of latest information and communications technologies…
Electron-driven processes in polyatomic molecules
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKoy, Vincent
2017-03-20
This project developed and applied scalable computational methods to obtain information about low-energy electron collisions with larger polyatomic molecules. Such collisions are important in modeling radiation damage to living systems, in spark ignition and combustion, and in plasma processing of materials. The focus of the project was to develop efficient methods that could be used to obtain both fundamental scientific insights and data of practical value to applications.
Signature neural networks: definition and application to multidimensional sorting problems.
Latorre, Roberto; de Borja Rodriguez, Francisco; Varona, Pablo
2011-01-01
In this paper we present a self-organizing neural network paradigm that is able to discriminate information locally using a strategy for information coding and processing inspired by recent findings in living neural systems. The proposed neural network uses: 1) neural signatures to identify each unit in the network; 2) local discrimination of input information during processing; and 3) a multicoding mechanism for information propagation regarding the who and the what of the information. The local discrimination implies distinct processing as a function of neural signature recognition and a local transient memory. In the context of artificial neural networks, none of these mechanisms has been analyzed in detail, and our goal is to demonstrate that they can be used to efficiently solve some specific problems. To illustrate the proposed paradigm, we apply it to the problem of multidimensional sorting, which can take advantage of local information discrimination. In particular, we compare the results of this new approach with traditional methods for solving jigsaw puzzles, and we analyze the situations in which the new paradigm improves performance.
Thermodynamic efficiency of learning a rule in neural networks
NASA Astrophysics Data System (ADS)
Goldt, Sebastian; Seifert, Udo
2017-11-01
Biological systems have to build models from their sensory input data that allow them to efficiently process previously unseen inputs. Here, we study a neural network learning a binary classification rule for these inputs from examples provided by a teacher. We analyse the ability of the network to apply the rule to new inputs, that is, to generalise from past experience. Using stochastic thermodynamics, we show that the thermodynamic costs of the learning process provide an upper bound on the amount of information that the network is able to learn from its teacher for both batch and online learning. This allows us to introduce a thermodynamic efficiency of learning. We analytically compute the dynamics and the efficiency of a noisy neural network performing online learning in the thermodynamic limit. In particular, we analyse three popular learning algorithms, namely Hebbian, Perceptron and AdaTron learning. Our work extends the methods of stochastic thermodynamics to a new type of learning problem and might form a suitable basis for investigating the thermodynamics of decision-making.
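For orientation, here is a hedged sketch of the teacher-student setup analysed in such work, using the classical perceptron rule (update only on mistakes); the Hebbian and AdaTron variants differ only in the update step, and the dimension and horizon below are illustrative:

import numpy as np

rng = np.random.default_rng(0)
N = 200                                    # input dimension
teacher = rng.standard_normal(N)           # defines the binary rule
student = np.zeros(N)

mistakes = 0
for step in range(5000):
    x = rng.standard_normal(N) / np.sqrt(N)
    label = np.sign(teacher @ x)           # teacher's label for this example
    if np.sign(student @ x) != label:      # perceptron: update on mistakes only
        student += label * x
        mistakes += 1

# Generalization error of a perceptron equals angle(teacher, student) / pi
overlap = (teacher @ student) / (np.linalg.norm(teacher) * np.linalg.norm(student))
print("mistakes:", mistakes, "generalization error:", np.arccos(overlap) / np.pi)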
Kapnoula, Efthymia C.; McMurray, Bob
2016-01-01
Language learning is generally described as a problem of acquiring new information (e.g., new words). However, equally important are changes in how the system processes known information. For example, a wealth of studies has suggested dramatic changes over development in how efficiently children recognize familiar words, but it is unknown what kind of experience-dependent mechanisms of plasticity give rise to such changes in real-time processing. We examined the plasticity of the language processing system by testing whether a fundamental aspect of spoken word recognition, lexical interference, can be altered by experience. Adult participants were trained on a set of familiar words over a series of 4 tasks. In the high-competition (HC) condition, tasks were designed to encourage coactivation of similar words (e.g., net and neck) and to require listeners to resolve this competition. Tasks were similar in the low-competition (LC) condition, but did not enhance this competition. Immediately after training, interlexical interference was tested using a visual world paradigm task. Participants in the HC group resolved interference to a fuller degree than those in the LC group, demonstrating that experience can shape the way competition between words is resolved. TRACE simulations showed that the observed late differences in the pattern of interference resolution can be attributed to differences in the strength of lexical inhibition. These findings inform cognitive models in many domains that involve competition/interference processes, and suggest an experience-dependent mechanism of plasticity that may underlie longer term changes in processing efficiency associated with both typical and atypical development. PMID:26709587
Eco-efficiency in extended supply chains: a case study of furniture production.
Michelsen, Ottar; Fet, Annik Magerholm; Dahlsrud, Alexander
2006-05-01
This paper presents a methodology about how eco-efficiency in extended supply chains (ESCs) can be understood and measured. The extended supply chain includes all processes in the life cycle of a product and the eco-efficiency is measured as the relative environmental and value performance in one ESC compared to other ESCs. The paper is based on a case study of furniture production in Norway. Nine different environmental performance indicators are identified. These are based on suggestions from the World Business Council for Sustainable Development and additional indicators that are shown to have significant impacts in the life cycle of the products. Value performance is measured as inverse life cycle costs. The eco-efficiency for six different chair models is calculated and the relative values are shown graphically in XY-diagrams. This provides information about the relative performance of the products, which is valuable in green procurement processes. The same method is also used for analysing changes in eco-efficiency when possible alterations in the ESC are introduced. Here, it is shown that a small and realistic change of end-of-life treatment significantly changes the eco-efficiency of a product.
Designing a place for automation.
Bazzoli, F
1995-05-01
Re-engineering is a hot topic in health care as market forces increase pressure to cut costs. Providers and payers that are redesigning their business processes are counting on information systems to help achieve simplification and make large gains in efficiency. But these same organizations say they're reluctant to make large upfront investments in information systems until they know exactly what role technology will play in the re-engineered entity.
Learning to Predict Social Influence in Complex Networks
2012-03-29
03/2010 – 17/03/2012. Abstract: First, we addressed the problem of analyzing the information diffusion process in a social network using two kinds... algorithm which avoids the inner-loop optimization during the search. We tested the performance using the structures of four real-world networks, and... result of information diffusion that starts from the node.
Informational system as an instrument for assessing the performance of the quality management system
NASA Astrophysics Data System (ADS)
Rohan, R.; Roşu, M. M.
2017-08-01
At present, a significant number of techniques and methods for diagnosis and management analysis are used to support the decision-making process. All these methods facilitate reaching the objectives of improving results through efficiency, quality and customer satisfaction. Developing a methodology for analysing the problems identified in large productive companies can bring outstanding benefits to management and offer new perspectives on the critical influencing factors within a system. Through this paper we present an effective management strategy, applicable to an organization with a productive profile, for designing an informational system aimed at managing one of its most important and complex systems, namely the coordination of the quality management system. The informational organisation of the quality management system on management principles ensures an optimization of informational energy consumption, allowing the management to ascertain the current situation; to seize the opportunities, as well as the potential risks, related to the organisation's policy; to observe the strengths and weaknesses; and to take appropriate decisions and then control the effects obtained. In this way, the decision makers are able to better understand the available opportunities and to base the process of choosing among alternatives more efficiently.
Fluctuation sensitivity of a transcriptional signaling cascade
NASA Astrophysics Data System (ADS)
Pilkiewicz, Kevin R.; Mayo, Michael L.
2016-09-01
The internal biochemical state of a cell is regulated by a vast transcriptional network that kinetically correlates the concentrations of numerous proteins. Fluctuations in protein concentration that encode crucial information about this changing state must compete with fluctuations caused by the noisy cellular environment in order to successfully transmit information across the network. Oftentimes, one protein must regulate another through a sequence of intermediaries, and conventional wisdom, derived from the data processing inequality of information theory, leads us to expect that longer sequences should lose more information to noise. Using the metric of mutual information to characterize the fluctuation sensitivity of transcriptional signaling cascades, we find, counter to this expectation, that longer chains of regulatory interactions can instead lead to enhanced informational efficiency. We derive an analytic expression for the mutual information from a generalized chemical kinetics model that we reduce to simple, mass-action kinetics by linearizing for small fluctuations about the basal biological steady state, and we find that at long times this expression depends only on a simple ratio of protein production to destruction rates and the length of the cascade. We place bounds on the values of these parameters by requiring that the mutual information be at least one bit—otherwise, any received signal would be indistinguishable from noise—and we find not only that nature has devised a way to circumvent the data processing inequality, but that it must be circumvented to attain this one-bit threshold. We demonstrate how this result places informational and biochemical efficiency at odds with one another by correlating high transcription factor binding affinities with low informational output, and we conclude with an analysis of the validity of our assumptions and propose how they might be tested experimentally.
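As a hedged, textbook illustration of the one-bit threshold invoked above (this is the standard Gaussian-channel expression, not the paper's derived formula for cascades): for a signal observed through additive Gaussian noise,

\[ I = \tfrac{1}{2}\log_{2}\!\left(1 + \mathrm{SNR}\right) \text{ bits}, \qquad I \ge 1 \text{ bit} \iff \mathrm{SNR} \ge 3, \]

so a concentration fluctuation must carry at least three times the noise power before a downstream reader can extract one full bit.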
Dexter, Franklin; Wachtel, Ruth E; Epstein, Richard H
2011-01-07
No systematic process has previously been described for a needs assessment that identifies the operating room (OR) management decisions made by the anesthesiologists and nurse managers at a facility that do not maximize the efficiency of use of OR time. We evaluated whether event-based knowledge elicitation can be used practically for rapid assessment of OR management decision-making at facilities, whether scenarios can be adapted automatically from information systems data, and the usefulness of the approach. A process of event-based knowledge elicitation was developed to assess OR management decision-making that may reduce the efficiency of use of OR time. Hypothetical scenarios addressing every OR management decision influencing OR efficiency were created from published examples. Scenarios are adapted, so that cues about conditions are accurate and appropriate for each facility (e.g., if OR 1 is used as an example in a scenario, the listed procedure is a type of procedure performed at the facility in OR 1). Adaptation is performed automatically using the facility's OR information system or anesthesia information management system (AIMS) data for most scenarios (43 of 45). Performing the needs assessment takes approximately 1 hour of local managers' time while they decide if their decisions are consistent with the described scenarios. A table of contents of the indexed scenarios is created automatically, providing a simple version of problem solving using case-based reasoning. For example, a new OR manager wanting to know the best way to decide whether to move a case can look in the chapter on "Moving Cases on the Day of Surgery" to find a scenario that describes the situation being encountered. Scenarios have been adapted and used at 22 hospitals. Few changes in decisions were needed to increase the efficiency of use of OR time. The few changes were heterogeneous among hospitals, showing the usefulness of individualized assessments. Our technical advance is the development and use of automated event-based knowledge elicitation to identify suboptimal OR management decisions that decrease the efficiency of use of OR time. The adapted scenarios can be used in future decision-making.
Dix, Annika; Wartenburger, Isabell; van der Meer, Elke
2016-10-01
This study on analogical reasoning evaluates the impact of fluid intelligence (FI) on adaptive changes in neural efficiency over the course of an experiment and specifies the underlying cognitive processes. Grade 10 students (N=80) solved unfamiliar geometric analogy tasks of varying difficulty. Neural efficiency was measured by the event-related desynchronization (ERD) in the alpha band, an indicator of cortical activity, and was defined as a low amount of cortical activity accompanying high performance during problem solving. The higher the students' FI, the faster and more accurately they solved the tasks. Moreover, while high FI was associated with greater cortical activity in the first half of the experiment, it was associated with neurally more efficient processing (i.e., better performance at the same amount of cortical activity) in the second half. Performance in difficult tasks improved over the course of the experiment for all students, while neural efficiency increased for students with higher FI but decreased for students with lower FI. Based on analyses of the alpha sub-bands, we argue that high FI was associated with a stronger investment of attentional resources in the integration of information and the encoding of relations in this unfamiliar task in the first half of the experiment (lower-2 alpha band). Students with lower FI seem to have adapted their strategies over the course of the experiment (i.e., focusing on task-relevant information; lower-1 alpha band). Thus, the initially lower cortical activity and its increase in students with lower FI might reflect the overcoming of a mental overload present in the first half of the experiment. Copyright © 2016 Elsevier Inc. All rights reserved.
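The ERD measure used above is conventionally computed as the percent change in band power relative to a pre-stimulus reference interval (Pfurtscheller's formulation). A minimal sketch of that computation; the band edges and the numbers are illustrative assumptions, not values reported by the study:

    def event_related_desynchronization(ref_power, event_power):
        """Percent power decrease relative to the reference interval.
        Positive values indicate desynchronization (more cortical activity);
        negative values indicate synchronization."""
        return 100.0 * (ref_power - event_power) / ref_power

    # Illustrative numbers only: mean alpha-band (8-12 Hz) power in a
    # pre-stimulus reference window vs. during problem solving.
    ref = 4.2    # e.g., microvolts^2, averaged over trials
    task = 2.9
    print(f"ERD = {event_related_desynchronization(ref, task):.1f}%")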
Mahoney, John R; Aghamohammadi, Cina; Crutchfield, James P
2016-02-15
A stochastic process' statistical complexity stands out as a fundamental property: the minimum information required to synchronize one process generator to another. How much information is required, though, when synchronizing over a quantum channel? Recent work demonstrated that representing causal similarity as quantum state-indistinguishability provides a quantum advantage. We generalize this to synchronization and offer a sequence of constructions that exploit extended causal structures, finding a substantial increase of the quantum advantage. We demonstrate that maximum compression is determined by the process' cryptic order, a classical, topological property closely allied to Markov order, itself a measure of historical dependence. We introduce an efficient algorithm that computes the quantum advantage and close by noting that the advantage comes at a cost: one trades off prediction for generation complexity.
Fifolt, Matthew; Blackburn, Justin; Rhodes, David J; Gillespie, Shemeka; Bennett, Aleena; Wolff, Paul; Rucks, Andrew
Historically, double data entry (DDE) has been considered the criterion standard for minimizing data entry errors. However, previous studies considered data entry alternatives through the limited lens of data accuracy. This study supplies information regarding data accuracy, operational efficiency, and cost for DDE and Optical Mark Recognition (OMR) for processing the Consumer Assessment of Healthcare Providers and Systems 5.0 survey. To assess data accuracy, we compared error rates for DDE and OMR by dividing the number of surveys that were arbitrated by the total number of surveys processed for each method. To assess operational efficiency, we tallied the cost of data entry for DDE and OMR after survey receipt. Costs were calculated on the basis of personnel, depreciation for capital equipment, and costs of noncapital equipment. The cost savings attributed to DDE were negated by the operational efficiency of OMR. The difference in arbitration rates between DDE and OMR was statistically significant, but this statistical significance did not translate into practical significance. The potential benefits of DDE in terms of data accuracy did not outweigh the operational efficiency, and thereby the financial savings, of OMR.
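The arbitration-based error rate described above is a simple ratio. A toy computation with hypothetical counts (the study's actual totals are not reproduced here):

    # Hypothetical counts; not the study's data.
    dde_arbitrated, dde_total = 120, 10000
    omr_arbitrated, omr_total = 150, 10000

    dde_rate = dde_arbitrated / dde_total   # error proxy for double data entry
    omr_rate = omr_arbitrated / omr_total   # error proxy for optical mark recognition
    print(f"DDE: {dde_rate:.2%}  OMR: {omr_rate:.2%}")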
NASA Astrophysics Data System (ADS)
Laforest, Martin
Quantum information processing has been the subject of countless discoveries since the early 1990s. It is believed to be the way of the future for computation: using quantum systems permits one to perform certain computations exponentially faster than on a regular classical computer. Unfortunately, quantum systems that are not isolated do not behave well: they tend to lose their quantum nature due to the presence of the environment. If key information is known about the noise present in the system, methods such as quantum error correction have been developed to reduce the errors introduced by the environment during a given quantum computation. In order to harness the quantum world and implement the theoretical ideas of quantum information processing and quantum error correction, it is imperative to understand and quantify the noise present in the quantum processor and benchmark the quality of the control over the qubits. The usual techniques to estimate the noise or the control are based on quantum process tomography (QPT), which, unfortunately, demands an exponential amount of resources. This thesis presents work towards the characterization of noisy processes in an efficient manner. The protocols are developed in a purely abstract setting with no system-dependent variables. To circumvent the exponential nature of QPT, three different efficient protocols are proposed and experimentally verified. The first protocol uses the idea of quantum error correction to extract relevant parameters about a given noise model, namely the correlation between the dephasing of two qubits. Next is a protocol using randomization and symmetrization to extract the probability that a given number of qubits are simultaneously corrupted in a quantum memory, regardless of the specifics of the error and of which qubits are affected. Finally, a last protocol, still using randomization ideas, is developed to estimate the average fidelity per computational gate for single- and multi-qubit systems. Even though liquid-state NMR is argued to be unsuitable for scalable quantum information processing, it remains the best test-bed system for experimentally implementing, verifying and developing protocols aimed at increasing the control over general quantum information processors. For this reason, all the protocols described in this thesis have been implemented in liquid-state NMR, which then led to further development of control and analysis techniques.
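The randomization-based estimation of average gate fidelity described above is closely related to what is now called randomized benchmarking. Whether the thesis uses exactly this estimator is an assumption here, but protocols of this kind typically fit the standard exponential decay

    F(m) = A\,p^{m} + B, \qquad r = \frac{(d-1)(1-p)}{d}, \qquad d = 2^{n}

where F(m) is the average sequence fidelity after m random gates, A and B absorb state-preparation and measurement errors, and r is the inferred average error per gate for an n-qubit system.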
Content-Aware DataGuide with Incremental Index Update using Frequently Used Paths
NASA Astrophysics Data System (ADS)
Sharma, A. K.; Duhan, Neelam; Khattar, Priyanka
2010-11-01
The size of the WWW is increasing day by day. Due to the absence of structured data on the Web, it is very difficult for information retrieval tools to fully utilize Web information. XML pages address this problem to some extent by providing structural information to users. Without efficient indexes, however, query processing can be quite inefficient due to exhaustive traversal of the XML data. In this paper, an improved content-centric version of the Content-Aware DataGuide, an indexing technique for XML databases, is proposed that uses frequently used paths from historical query logs to improve query performance. The index can be updated incrementally according to changes in the query workload, so the overhead of reconstruction is minimized. Frequently used paths are extracted by running any sequential pattern mining algorithm over subsequent queries in the query workload, after which the data structures are incrementally updated. This indexing technique proves efficient: partial matching queries can be executed efficiently, and users obtain more relevant documents in their results.
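A minimal sketch of the frequent-path extraction step, using plain prefix counting in place of a full sequential-pattern-mining algorithm; the path syntax and support threshold are illustrative assumptions, not the paper's specification:

    from collections import Counter

    def frequent_paths(query_log, min_support=3):
        """Count every path prefix seen in the query workload and keep those
        meeting the support threshold; these prefixes are the candidates to
        index in the DataGuide."""
        counts = Counter()
        for query in query_log:
            steps = [s for s in query.split("/") if s]
            for i in range(1, len(steps) + 1):
                counts["/" + "/".join(steps[:i])] += 1
        return {p: c for p, c in counts.items() if c >= min_support}

    log = ["/lib/book/title", "/lib/book/author", "/lib/book/title",
           "/lib/journal/title", "/lib/book/title"]
    print(frequent_paths(log))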
Herrero, Mario; Havlík, Petr; Valin, Hugo; Notenbaert, An; Rufino, Mariana C.; Thornton, Philip K.; Blümmel, Michael; Weiss, Franz; Grace, Delia; Obersteiner, Michael
2013-01-01
We present a unique, biologically consistent, spatially disaggregated global livestock dataset containing information on biomass use, production, feed efficiency, excretion, and greenhouse gas emissions for 28 regions, 8 livestock production systems, 4 animal species (cattle, small ruminants, pigs, and poultry), and 3 livestock products (milk, meat, and eggs). The dataset contains over 50 new global maps containing high-resolution information for understanding the multiple roles (biophysical, economic, social) that livestock can play in different parts of the world. The dataset highlights: (i) feed efficiency as a key driver of productivity, resource use, and greenhouse gas emission intensities, with vast differences between production systems and animal products; (ii) the importance of grasslands as a global resource, supplying almost 50% of biomass for animals while continuing to be at the epicentre of land conversion processes; and (iii) the importance of mixed crop–livestock systems, producing the greater part of animal production (over 60%) in both the developed and the developing world. These data provide critical information for developing targeted, sustainable solutions for the livestock sector and its widely ranging contribution to the global food system. PMID:24344273
Information Spread of Emergency Events: Path Searching on Social Networks
Hu, Hongzhi; Wu, Tunan
2014-01-01
Emergencies attract global attention from governments and the public, and an emergency can easily trigger a series of serious social problems if it is not supervised effectively during the dissemination process. In the Internet world, people communicate with each other and form various virtual communities based on social networks, which leads to a complex and fast information spread pattern for emergency events. This paper collects Internet data based on data acquisition and topic detection technology, analyzes the process of information spread on social networks, describes the diffusion and impact of that information from the perspective of random graphs, and finally seeks the key paths through an improved IBF algorithm. Application cases have shown that this algorithm can search the shortest spread paths efficiently, which may help to guide and control the information dissemination of emergency events for early warning. PMID:24600323
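The improved IBF algorithm itself is not specified in the abstract. As a baseline for the shortest-spread-path idea, a standard Dijkstra search over a weighted social graph (edge weights standing in for spread cost or delay) looks like the sketch below, with all node names and weights illustrative:

    import heapq

    def shortest_spread_path(graph, source, target):
        """Dijkstra over edge weights interpreted as spread cost;
        returns (cost, path). graph: {node: {neighbor: weight}}."""
        pq, seen = [(0.0, source, [source])], set()
        while pq:
            cost, node, path = heapq.heappop(pq)
            if node == target:
                return cost, path
            if node in seen:
                continue
            seen.add(node)
            for nbr, w in graph.get(node, {}).items():
                if nbr not in seen:
                    heapq.heappush(pq, (cost + w, nbr, path + [nbr]))
        return float("inf"), []

    g = {"A": {"B": 1, "C": 4}, "B": {"C": 1, "D": 5}, "C": {"D": 1}}
    print(shortest_spread_path(g, "A", "D"))  # (3.0, ['A', 'B', 'C', 'D'])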
Learning classification with auxiliary probabilistic information
Nguyen, Quang; Valizadegan, Hamed; Hauskrecht, Milos
2012-01-01
Finding ways of incorporating auxiliary information or auxiliary data into the learning process has been the topic of active data mining and machine learning research in recent years. In this work we study and develop a new framework for classification learning problem in which, in addition to class labels, the learner is provided with an auxiliary (probabilistic) information that reflects how strong the expert feels about the class label. This approach can be extremely useful for many practical classification tasks that rely on subjective label assessment and where the cost of acquiring additional auxiliary information is negligible when compared to the cost of the example analysis and labelling. We develop classification algorithms capable of using the auxiliary information to make the learning process more efficient in terms of the sample complexity. We demonstrate the benefit of the approach on a number of synthetic and real world data sets by comparing it to the learning with class labels only. PMID:25309141
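One simple way to exploit such auxiliary label confidence, not necessarily the authors' algorithm, is to weight each training example by the expert's reported probability. A scikit-learn sketch under that assumption, on toy data:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Toy data: X features, y class labels, conf = expert's confidence in y.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)
    conf = rng.uniform(0.5, 1.0, size=200)  # auxiliary probabilistic information

    # Confidently labelled examples pull harder on the decision boundary.
    clf = LogisticRegression().fit(X, y, sample_weight=conf)
    print(clf.score(X, y))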
NASA Astrophysics Data System (ADS)
Demigha, Souâd.
2016-03-01
The paper presents a Case-Based Reasoning tool for breast cancer knowledge management, intended to improve breast cancer screening. To develop this tool, we combine concepts and techniques from both Case-Based Reasoning (CBR) and Data Mining (DM). Physicians and radiologists ground their diagnoses in their expertise (past experience) built on clinical cases. Case-Based Reasoning is the process of solving new problems based on the solutions of similar past problems structured as cases, which makes CBR well suited for medical use. On the other hand, existing traditional hospital information systems (HIS), Radiological Information Systems (RIS), and Picture Archiving and Communication Systems (PACS) do not allow medical information to be managed efficiently because of its complexity and heterogeneity. Data Mining is the process of extracting information from a data set and transforming it into an understandable structure for further use. Combining CBR with Data Mining techniques will facilitate the diagnosis and decision-making of medical experts.
Binary video codec for data reduction in wireless visual sensor networks
NASA Astrophysics Data System (ADS)
Khursheed, Khursheed; Ahmad, Naeem; Imran, Muhammad; O'Nils, Mattias
2013-02-01
A Wireless Visual Sensor Network (WVSN) is formed by deploying many Visual Sensor Nodes (VSNs) in the field. Typical applications of WVSNs include environmental monitoring, health care, industrial process monitoring, and stadium/airport monitoring for security, among many others. The energy budget in outdoor applications of WVSNs is limited to batteries, and frequent battery replacement is usually undesirable, so the processing and communication energy consumption of each VSN must be optimized so that the network remains functional for a long duration. The images captured by a VSN contain a huge amount of data and require efficient computational resources for processing and wide communication bandwidth for transmitting the results. Image processing algorithms must therefore be computationally simple yet provide a high compression rate. For some WVSN applications the captured images can be segmented into bi-level images, so bi-level image coding methods can efficiently reduce the information amount in these segmented images; their compression rate, however, is limited by the underlying compression algorithm. Hence there is a need for other intelligent and efficient algorithms that are computationally simple and provide a better compression rate than bi-level image coding. Change coding is one such algorithm: it is computationally simple (requiring only exclusive-OR operations) and provides better compression efficiency than image coding, but it is effective only for applications with slight changes between adjacent frames of the video. Detecting and coding the Regions of Interest (ROIs) in the change frame further reduces the information amount in the change frame. But if the number of objects in the change frames rises above a certain level, the compression efficiency of both change coding and ROI coding becomes worse than that of image coding. This paper explores the compression efficiency of the Binary Video Codec (BVC) for data reduction in WVSNs. We propose to implement all three compression techniques, i.e., image coding, change coding, and ROI coding, at the VSN and then select the smallest bit stream among the results. In this way the compression performance of BVC can never become worse than that of image coding. We conclude that the compression efficiency of BVC is always better than that of change coding and always better than or equal to that of ROI coding and image coding.
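The select-the-smallest strategy described above can be sketched directly: encode the frame three ways (whole bi-level image, XOR change frame, ROI-cropped change frame) and transmit whichever bitstream is shortest. Run-length coding stands in for the unspecified underlying bi-level coder, and all names are illustrative:

    import numpy as np

    def run_length_encode(bits):
        """Toy stand-in for the underlying bi-level coder:
        run lengths of a flattened binary image."""
        flat = bits.ravel().astype(np.int8)
        change = np.flatnonzero(np.diff(flat)) + 1
        runs = np.diff(np.concatenate(([0], change, [flat.size])))
        return runs.astype(np.uint16).tobytes()

    def bvc_encode(frame, prev_frame):
        image_code = run_length_encode(frame)          # image coding
        change = np.logical_xor(frame, prev_frame)     # change coding (XOR only)
        change_code = run_length_encode(change)
        ys, xs = np.nonzero(change)                    # ROI coding
        if ys.size:
            roi = change[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
            roi_code = run_length_encode(roi)
        else:
            roi_code = b""                             # nothing changed: nothing to send
        # Pick the smallest bitstream, so BVC never does worse than image coding.
        return min((image_code, change_code, roi_code), key=len)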
Relatively Certain! Comparative Thinking Reduces Uncertainty
ERIC Educational Resources Information Center
Mussweiler, Thomas; Posten, Ann-Christin
2012-01-01
Comparison is one of the most ubiquitous and versatile mechanisms in human information processing. Previous research demonstrates that one consequence of comparative thinking is increased judgmental efficiency: Comparison allows for quicker judgments without a loss in accuracy. We hypothesised that a second potential consequence of comparative…
78 FR 77098 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-20
...: Regular submission (extension of a current information collection). Number of Respondents: 1,800. Average... Trustees in more efficiently carrying out the restoration planning phase of Natural Resource Damage... requiring restoration, during the restoration planning phase of the NRDA process. Affected Public: Not-for...
Vocational Education and Training in Denmark. Short Description
ERIC Educational Resources Information Center
Cedefop - European Centre for the Development of Vocational Training, 2012
2012-01-01
Vocational education and training in Denmark has embarked on a process of modernisation aiming at, primarily, increasing flexibility, and individualisation, quality and efficiency. Assessment and recognition of informal and non-formal learning, competence-based curricula, innovative approaches to teaching, and increased possibilities for partial…
A Preview of Coming Attractions: Classroom Teacher's Idea Notebook.
ERIC Educational Resources Information Center
Morin, Joy Ann
1995-01-01
Contends that it is important for students to be motivated and well prepared for class units and activities. Describes a "previews of coming attractions" instructional strategy that uses advance organizers to increase information processing efficiency. Includes a sample unit outline illustrating this approach. (CFR)
Representation and Integration of Scientific Information
NASA Technical Reports Server (NTRS)
1998-01-01
The objective of this Joint Research Interchange with NASA-Ames was to investigate how the Tsimmis technology could be used to represent and integrate scientific information. The main goal of the Tsimmis project is to allow a decision maker to find information of interest from diverse sources, fuse it, and process it (e.g., summarize it, visualize it, discover trends). Another important goal is the easy incorporation of new sources, as well as the ability to deal with sources whose structure or services evolve. During the Interchange we held research meetings approximately every month or two. The funds provided by NASA supported work that led to the following two papers: Fusion Queries over Internet Databases; Efficient Query Subscription Processing in a Multicast Environment.
Efficient embedding of complex networks to hyperbolic space via their Laplacian
Alanis-Lobato, Gregorio; Mier, Pablo; Andrade-Navarro, Miguel A.
2016-01-01
The different factors involved in the growth process of complex networks imprint valuable information in their observable topologies. How to exploit this information to accurately predict structural network changes is the subject of active research. A recent model of network growth sustains that the emergence of properties common to most complex systems is the result of certain trade-offs between node birth-time and similarity. This model has a geometric interpretation in hyperbolic space, where distances between nodes abstract this optimisation process. Current methods for network hyperbolic embedding search for node coordinates that maximise the likelihood that the network was produced by the afore-mentioned model. Here, a different strategy is followed in the form of the Laplacian-based Network Embedding, a simple yet accurate, efficient and data driven manifold learning approach, which allows for the quick geometric analysis of big networks. Comparisons against existing embedding and prediction techniques highlight its applicability to network evolution and link prediction. PMID:27445157
Applying the metro map to software development management
NASA Astrophysics Data System (ADS)
Aguirregoitia, Amaia; Dolado, J. Javier; Presedo, Concepción
2010-01-01
This paper presents MetroMap, a new graphical representation model for controlling and managing the software development process. MetroMap uses metaphors and visual representation techniques to explore several key indicators in order to support problem detection and resolution. The resulting visualization addresses diverse management tasks, such as tracking of deviations from the plan, analysis of patterns of failure detection and correction, overall assessment of change management policies, and estimation of product quality. The proposed visualization uses a metro map metaphor along with various interactive techniques to represent information concerning the software development process and to deal efficiently with multivariate visual queries. Finally, the paper shows the implementation of the tool in JavaFX with data from a real project and the results of testing the tool with the aforementioned data and users attempting several information retrieval tasks. The conclusion presents the results of analyzing user response time and efficiency using the MetroMap visualization system. The utility of the tool was positively evaluated.
Efficient embedding of complex networks to hyperbolic space via their Laplacian
NASA Astrophysics Data System (ADS)
Alanis-Lobato, Gregorio; Mier, Pablo; Andrade-Navarro, Miguel A.
2016-07-01
The different factors involved in the growth process of complex networks imprint valuable information in their observable topologies. How to exploit this information to accurately predict structural network changes is the subject of active research. A recent model of network growth sustains that the emergence of properties common to most complex systems is the result of certain trade-offs between node birth-time and similarity. This model has a geometric interpretation in hyperbolic space, where distances between nodes abstract this optimisation process. Current methods for network hyperbolic embedding search for node coordinates that maximise the likelihood that the network was produced by the afore-mentioned model. Here, a different strategy is followed in the form of the Laplacian-based Network Embedding, a simple yet accurate, efficient and data driven manifold learning approach, which allows for the quick geometric analysis of big networks. Comparisons against existing embedding and prediction techniques highlight its applicability to network evolution and link prediction.
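As a hedged sketch of the general idea (spectral coordinates derived from the graph Laplacian, not the authors' exact hyperbolic-embedding pipeline), one can take the leading non-trivial eigenvectors of the normalized Laplacian as low-dimensional node coordinates:

    import numpy as np
    from scipy.sparse.csgraph import laplacian
    from scipy.linalg import eigh

    def laplacian_coordinates(adjacency, dim=2):
        """Coordinates from the eigenvectors of the normalized Laplacian
        belonging to the smallest non-zero eigenvalues."""
        L = laplacian(adjacency, normed=True)
        vals, vecs = eigh(L)                # eigenvalues in ascending order
        return vecs[:, 1:dim + 1]           # skip the trivial constant eigenvector

    A = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    print(laplacian_coordinates(A))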
Symbolic Constraint Maintenance Grid
NASA Technical Reports Server (NTRS)
James, Mark
2006-01-01
Version 3.1 of Symbolic Constraint Maintenance Grid (SCMG) is a software system that provides a general conceptual framework for utilizing pre-existing programming techniques to perform symbolic transformations of data. SCMG also provides a language (and an associated communication method and protocol) for representing constraints on the original non-symbolic data. SCMG provides a facility for exchanging information between numeric and symbolic components without knowing the details of the components themselves. In essence, it integrates symbolic software tools (for diagnosis, prognosis, and planning) with non-artificial-intelligence software. SCMG executes a process of symbolic summarization and monitoring of continuous time series data that are being abstractly represented as symbolic templates of information exchange. This summarization process enables such symbolic-reasoning computing systems as artificial-intelligence planning systems to evaluate the significance and effects of channels of data more efficiently than would otherwise be possible. As a result of the increased efficiency in representation, reasoning software can monitor more channels and is thus able to perform monitoring and control functions more effectively.
Enas, G G; Andersen, J S
With the dawn of the 21st century, the pharmaceutical industry faces a dramatically different constellation of business and scientific predictors of success than those of just a few years ago. Significant advances in science at the genetic, molecular and cellular levels, combined with progress in drug regulation around the globe, have increased business and competitive opportunities. This has occurred in the search for better and cheaper medicines that reach patients with unmet medical needs as quickly as possible. Herein lie new opportunities for those who can help business and regulatory leaders make good decisions about drug development and market authorization as quickly and efficiently as possible in the presence of uncertainty. The statistician is uniquely trained and qualified to render such value. We show how the statistician can contribute to the process of drug innovation from the very early stages of drug discovery until patients, payers and regulators are satisfied. Indeed, the very nature of regulated innovation demands that efficient and effective processes are implemented which yield the right information for good decision making. The statistician can take the lead in setting a strategy that directs such processes in the direction of greatest value. This demands skills that enable one to identify important sources of variability and uncertainty and then leverage those skills to make decisions. If such decisions call for more information, then the statistician can render experimental designs which generate the right information needed to make the decision in an efficient, timely manner. To add value to the enterprise, statisticians will have to become more intimately associated with business and regulatory decisions by building on their traditional roles (for example, numerical analyst, tactician) and unique skill sets (for example, analysis, computation, logical thought and work process, precision, accuracy). Business and regulatory savvy, coupled with excellent communication and interpersonal skills, will allow statisticians to help create the knowledge needed to drive success in the future. Copyright 2001 John Wiley & Sons, Ltd.
Baddeley, Michelle; Tobler, Philippe N.; Schultz, Wolfram
2016-01-01
Given that the range of rewarding and punishing outcomes of actions is large but neural coding capacity is limited, efficient processing of outcomes by the brain is necessary. One mechanism to increase efficiency is to rescale neural output to the range of outcomes expected in the current context, and process only experienced deviations from this expectation. However, this mechanism comes at the cost of not being able to discriminate between unexpectedly low losses when times are bad versus unexpectedly high gains when times are good. Thus, too much adaptation would result in disregarding information about the nature and absolute magnitude of outcomes, preventing learning about the longer-term value structure of the environment. Here we investigate the degree of adaptation in outcome coding brain regions in humans, for directly experienced outcomes and observed outcomes. We scanned participants while they performed a social learning task in gain and loss blocks. Multivariate pattern analysis showed two distinct networks of brain regions adapt to the most likely outcomes within a block. Frontostriatal areas adapted to directly experienced outcomes, whereas lateral frontal and temporoparietal regions adapted to observed social outcomes. Critically, in both cases, adaptation was incomplete and information about whether the outcomes arose in a gain block or a loss block was retained. Univariate analysis confirmed incomplete adaptive coding in these regions but also detected nonadapting outcome signals. Thus, although neural areas rescale their responses to outcomes for efficient coding, they adapt incompletely and keep track of the longer-term incentives available in the environment. SIGNIFICANCE STATEMENT Optimal value-based choice requires that the brain precisely and efficiently represents positive and negative outcomes. One way to increase efficiency is to adapt responding to the most likely outcomes in a given context. However, too strong adaptation would result in loss of precise representation (e.g., when the avoidance of a loss in a loss-context is coded the same as receipt of a gain in a gain-context). We investigated an intermediate form of adaptation that is efficient while maintaining information about received gains and avoided losses. We found that frontostriatal areas adapted to directly experienced outcomes, whereas lateral frontal and temporoparietal regions adapted to observed social outcomes. Importantly, adaptation was intermediate, in line with influential models of reference dependence in behavioral economics. PMID:27683899
A new stationary gridline artifact suppression method based on the 2D discrete wavelet transform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, Hui, E-mail: corinna@seu.edu.cn; Key Laboratory of Computer Network and Information Integration; Centre de Recherche en Information Biomédicale sino-français, Laboratoire International Associé, Inserm, Université de Rennes 1, Rennes 35000
2015-04-15
Purpose: In digital x-ray radiography, an antiscatter grid is inserted between the patient and the image receptor to reduce scattered radiation. If the antiscatter grid is used in a stationary way, gridline artifacts will appear in the final image. In most of the gridline removal image processing methods, the useful information with spatial frequencies close to that of the gridline is usually lost or degraded. In this study, a new stationary gridline suppression method is designed to preserve more of the useful information. Methods: The method is as follows. The input image is first recursively decomposed into several smaller subimages using a multiscale 2D discrete wavelet transform. The decomposition process stops when the gridline signal is found to be greater than a threshold in one or several of these subimages using a gridline detection module. An automatic Gaussian band-stop filter is then applied to the detected subimages to remove the gridline signal. Finally, the restored image is achieved using the corresponding 2D inverse discrete wavelet transform. Results: The processed images show that the proposed method can remove the gridline signal efficiently while maintaining the image details. The spectra of a 1D Fourier transform of the processed images demonstrate that, compared with some existing gridline removal methods, the proposed method has better information preservation after the removal of the gridline artifacts. Additionally, the performance speed is relatively high. Conclusions: The experimental results demonstrate the efficiency of the proposed method. Compared with some existing gridline removal methods, the proposed method can preserve more information within an acceptable execution time.
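A condensed sketch of the pipeline described above, assuming PyWavelets for the 2D DWT and a Gaussian band-stop applied in the Fourier domain. The wavelet family, decomposition depth, notch frequency, and bandwidth are all illustrative assumptions, and for brevity the sketch filters the approximation subimage, whereas the paper's detection module selects which subimages to filter:

    import numpy as np
    import pywt

    def gaussian_band_stop(subimage, f0, sigma):
        """Attenuate a narrow frequency band around f0 (cycles/pixel) in the
        column direction, where a stationary gridline shows up."""
        spec = np.fft.fft(subimage, axis=0)
        freqs = np.fft.fftfreq(subimage.shape[0])
        notch = 1.0 - np.exp(-((np.abs(freqs) - f0) ** 2) / (2 * sigma ** 2))
        return np.real(np.fft.ifft(spec * notch[:, None], axis=0))

    def suppress_gridlines(image, level=2, f0=0.25, sigma=0.01):
        image = np.asarray(image, dtype=float)
        coeffs = pywt.wavedec2(image, "db2", level=level)
        approx, details = coeffs[0], list(coeffs[1:])
        # In the real method a detection module decides which subimages carry
        # the gridline; here we simply filter the approximation subimage.
        approx = gaussian_band_stop(approx, f0, sigma)
        return pywt.waverec2([approx] + details, "db2")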
Energy Efficient Digital Logic Using Nanoscale Magnetic Devices
NASA Astrophysics Data System (ADS)
Lambson, Brian James
Increasing demand for information processing in the last 50 years has been largely satisfied by the steadily declining price and improving performance of microelectronic devices. Much of this progress has been made by aggressively scaling the size of the semiconductor transistors and metal interconnects that microprocessors are built from. As devices shrink to the size regime in which quantum effects pose significant challenges, new physics may be required in order to continue historical scaling trends. A variety of new devices and physics are currently under investigation throughout the scientific and engineering community to meet these challenges. One of the more drastic proposals on the table is to replace the electronic components of information processors with magnetic components. Magnetic components are already commonplace in computers for their information storage capability. Unlike most electronic devices, magnetic materials can store data in the absence of a power supply. Today's magnetic hard disk drives routinely hold billions of bits of information and are in widespread commercial use. Their ability to function without a constant power source hints at an intrinsic energy efficiency. The question we investigate in this dissertation is whether this advantage can be extended from information storage to the notoriously energy-intensive task of information processing. Several proof-of-concept magnetic logic devices were proposed and tested in the past decade. In this dissertation, we build on the prior work by answering fundamental questions about how magnetic devices achieve such high energy efficiency and how they can best function in digital logic applications. The results of this analysis are used to suggest and test improvements to nanomagnetic computing devices. Two of our results are seen as especially important to the field of nanomagnetic computing: (1) we show that it is possible to operate nanomagnetic computers at the fundamental thermodynamic limits of computation, and (2) we develop a nanomagnet with a unique shape that is engineered to significantly improve the reliability of nanomagnetic logic.
Digital and biological computing in organizations.
Kampfner, Roberto R
2002-01-01
Michael Conrad unveiled many of the fundamental characteristics of biological computing. These characteristics underlie the behavioral variability and adaptability of biological systems, and include the ability of biological information processing to exploit quantum features at the atomic level, the powerful 3-D pattern recognition capabilities of macromolecules, computational efficiency, and the ability to support biological function. Among many other things, Conrad formalized and explicated the underlying principles of biological adaptability, characterized the differences between biological and digital computing in terms of a fundamental tradeoff between the adaptability and programmability of information processing, and discussed the challenges of interfacing digital computers and human society. This paper is about the encounter of biological and digital computing. The focus is on the nature of the biological information processing infrastructure of organizations and how it can be extended effectively with digital computing. To achieve this goal, however, we need to properly embed digital computing into the information processing aspects of human and social behavior and intelligence, which are fundamentally biological. Conrad's legacy provides a firm, strong, and inspiring foundation for this endeavor.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masanet, Eric; Therkelsen, Peter; Worrell, Ernst
The U.S. baking industry—defined in this Energy Guide as facilities engaged in the manufacture of commercial bakery products such as breads, rolls, frozen cakes, pies, pastries, and cookies and crackers—consumes over $800 million worth of purchased fuels and electricity per year. Energy efficiency improvement is an important way to reduce these costs and to increase predictable earnings, especially in times of high energy price volatility. There are a variety of opportunities available at individual plants to reduce energy consumption in a cost-effective manner. This Energy Guide discusses energy efficiency practices and energy-efficient technologies that can be implemented at the component, process, facility, and organizational levels. Many measure descriptions include expected savings in energy and energy-related costs, based on case study data from real-world applications in food processing facilities and related industries worldwide. Typical measure payback periods and references to further information in the technical literature are also provided, when available. A summary of basic, proven measures for improving plant-level water efficiency is also provided. The information in this Energy Guide is intended to help energy and plant managers in the U.S. baking industry reduce energy and water consumption in a cost-effective manner while maintaining the quality of products manufactured. Further research on the economics of all measures—as well as on their applicability to different production practices—is needed to assess their cost effectiveness at individual plants.
Technical Data Interoperability (TDI) Pathfinder Via Emerging Standards
NASA Technical Reports Server (NTRS)
Conroy, Mike; Gill, Paul; Hill, Bradley; Ibach, Brandon; Jones, Corey; Ungar, David; Barch, Jeffrey; Ingalls, John; Jacoby, Joseph; Manning, Josh;
2014-01-01
The Technical Data Interoperability (TDI) project investigates trending technical data standards for applicability to NASA vehicles, space stations, payloads, facilities, and equipment. TDI tested COTS software compatible with a suite of related industry standards, assessing both individual capabilities and interoperability. These standards not only enable Information Technology (IT) efficiencies, but also address efficient structures and standard content for business processes. We used source data from generic industry samples as well as NASA and European Space Agency (ESA) data from space systems.
Analysis of defect structure in silicon. Characterization of samples from UCP ingot 5848-13C
NASA Technical Reports Server (NTRS)
Natesh, R.; Guyer, T.; Stringfellow, G. B.
1982-01-01
Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848-13C. Important trends were noticed between the measured data, cell efficiency, and diffusion length. Grain boundary substructure appears to have an important effect on the conversion efficiency of solar cells made from Semix material. Quantitative microscopy measurements give statistically significant information compared to other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects suitable for QTM analysis was perfected.
Conducting Nursing Research to Advance and Inform Health Policy.
Ellenbecker, Carol Hall; Edward, Jean
2016-11-01
The primary roles of nurse scientists in conducting health policy research are to increase knowledge in the discipline and provide evidence for informing and advancing health policies with the goal of improving the health outcomes of society. Health policy research informs, characterizes, explains, or tests hypotheses by employing a variety of research designs. Health policy research focuses on improving the access to care, the quality and cost of care, and the efficiency with which care is delivered. In this article, we explain how nurses might envision their research in a policy process framework, describe research designs that nurse researchers might use to inform and advance health policies, and provide examples of research conducted by nurse researchers to explicate key concepts in the policy process framework. Health policies are well informed and advanced when nurse researchers have a good understanding of the political process. The policy process framework provides a context for improving the focus and design of research and better explicating the connection between research evidence and policy. Nurses should focus their research on addressing problems of importance that are on the healthcare agenda, work with interdisciplinary teams of researchers, synthesize, and widely disseminate results.
NASA Astrophysics Data System (ADS)
Wu, Zikai; Hou, Baoyu; Zhang, Hongjuan; Jin, Feng
2014-04-01
Deterministic network models have been attractive media for discussing the dependence of dynamical processes on network structural features. On the other hand, the heterogeneity of weights affects dynamical processes taking place on networks. In this paper, we present a family of weighted expanded Koch networks based on Koch networks. They originate from an r-polygon, and in each subsequent evolutionary step every node of the current generation produces m r-polygons that include the node, with weighted edges scaled by a factor w. We derive closed-form expressions for the average weighted shortest path length (AWSP). In large networks, the AWSP stays bounded as the network order grows (0 < w < 1). We then focus on a special random walk and trapping issue on these networks. In more detail, we calculate exactly the average receiving time (ART). The ART exhibits a sub-linear dependence on network order (0 < w < 1), which implies that nontrivial weighted expanded Koch networks are more efficient than un-weighted expanded Koch networks in receiving information. Besides, the efficiency of receiving information at hub nodes also depends on the parameters m and r. These findings may pave the way for controlling information transportation on general weighted networks.
A Survey on Data Storage and Information Discovery in the WSANs-Based Edge Computing Systems
Liang, Junbin; Liu, Renping; Ni, Wei; Li, Yin; Li, Ran; Ma, Wenpeng; Qi, Chuanda
2018-01-01
In the post-Cloud era, the proliferation of Internet of Things (IoT) has pushed the horizon of Edge computing, which is a new computing paradigm with data processed at the edge of the network. As the important systems of Edge computing, wireless sensor and actuator networks (WSANs) play an important role in collecting and processing the sensing data from the surrounding environment as well as taking actions on the events happening in the environment. In WSANs, in-network data storage and information discovery schemes with high energy efficiency, high load balance and low latency are needed because of the limited resources of the sensor nodes and the real-time requirement of some specific applications, such as putting out a big fire in a forest. In this article, the existing schemes of WSANs on data storage and information discovery are surveyed with detailed analysis on their advancements and shortcomings, and possible solutions are proposed on how to achieve high efficiency, good load balance, and perfect real-time performances at the same time, hoping that it can provide a good reference for the future research of the WSANs-based Edge computing systems. PMID:29439442
A ganglion-cell-based primary image representation method and its contribution to object recognition
NASA Astrophysics Data System (ADS)
Wei, Hui; Dai, Zhi-Long; Zuo, Qing-Song
2016-10-01
A visual stimulus is represented by the biological visual system at several levels, in order from low to high: photoreceptor cells, ganglion cells (GCs), lateral geniculate nucleus cells, and visual cortical neurons. Retinal GCs at the early level need to represent raw data only once, yet must meet a wide range of diverse requests from different vision-based tasks. This means the information representation at this level is general and not task-specific. Neurobiological findings have attributed this universal adaptability to the GCs' receptive field (RF) mechanisms. For the purpose of developing a highly efficient image representation method that can facilitate information processing and interpretation at later stages, we design a computational model to simulate the GC's non-classical RF. This new image representation method can extract major structural features from raw data and is consistent with other statistical measures of the image. Based on the new representation, the performance of other state-of-the-art algorithms in contour detection and segmentation can be upgraded remarkably. This work concludes that applying a sophisticated representation schema at an early stage is an efficient and promising strategy in visual information processing.
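The classical GC receptive field is commonly modeled as a difference of Gaussians (excitatory center minus inhibitory surround); the paper's non-classical RF model adds further surround interactions not reproduced here. A sketch of that baseline, with the spatial scales as illustrative parameters:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def ganglion_response(image, center_sigma=1.0, surround_sigma=3.0, k=0.9):
        """Difference-of-Gaussians model of a GC receptive field:
        a narrow excitatory center minus a broader inhibitory surround."""
        center = gaussian_filter(image.astype(float), center_sigma)
        surround = gaussian_filter(image.astype(float), surround_sigma)
        return center - k * surround

    # Edges and other structural features survive; uniform regions cancel out.
    img = np.zeros((64, 64)); img[:, 32:] = 1.0   # a step edge
    response = ganglion_response(img)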
A Survey on Data Storage and Information Discovery in the WSANs-Based Edge Computing Systems.
Ma, Xingpo; Liang, Junbin; Liu, Renping; Ni, Wei; Li, Yin; Li, Ran; Ma, Wenpeng; Qi, Chuanda
2018-02-10
In the post-Cloud era, the proliferation of the Internet of Things (IoT) has pushed the horizon of Edge computing, a new computing paradigm in which data are processed at the edge of the network. As the important systems of Edge computing, wireless sensor and actuator networks (WSANs) play an important role in collecting and processing the sensing data from the surrounding environment as well as taking actions on the events happening in the environment. In WSANs, in-network data storage and information discovery schemes with high energy efficiency, high load balance and low latency are needed because of the limited resources of the sensor nodes and the real-time requirement of some specific applications, such as putting out a big fire in a forest. In this article, the existing schemes of WSANs on data storage and information discovery are surveyed with detailed analysis on their advancements and shortcomings, and possible solutions are proposed on how to achieve high efficiency, good load balance, and perfect real-time performances at the same time, hoping that it can provide a good reference for the future research of the WSANs-based Edge computing systems.
Persistent maritime traffic monitoring for the Canadian Arctic
NASA Astrophysics Data System (ADS)
Ulmke, M.; Battistello, G.; Biermann, J.; Mohrdieck, C.; Pelot, R.; Koch, W.
2017-05-01
This paper presents results of the Canadian-German research project PASSAGES (Protection and Advanced Surveillance System for the Arctic: Green, Efficient, Secure) on an advanced surveillance system for the safety and security of maritime operations in Arctic areas. The motivation for a surveillance system for the Northwest Passage is the projected growth of maritime traffic along Arctic sea routes and the need to secure Canada's sovereignty by controlling its Arctic waters, as well as to protect the safety of international shipping and the intactness of the Arctic marine environment. To ensure border security and to detect and prevent illegal activities, it is necessary to develop a system for surveillance and reconnaissance that brings together all related means, assets, organizations, processes and structures to build one homogeneous and integrated system. The harsh Arctic conditions require a new surveillance concept that fuses heterogeneous sensor data, contextual information, and available pre-processed surveillance data, and combines all components to efficiently extract and provide the maximum available amount of information. The fusion of all these heterogeneous data and information will provide improved and comprehensive situation awareness for risk assessment and decision support for different stakeholder groups such as governmental authorities, commercial users and Northern communities.
NASA Astrophysics Data System (ADS)
Pop, Florin; Dobre, Ciprian; Mocanu, Bogdan-Costel; Citoteanu, Oana-Maria; Xhafa, Fatos
2016-11-01
Managing the large volumes of data processed in distributed systems formed by datacentres and mobile devices has become a challenging issue with an important impact on the end user. The management process for such systems can be carried out efficiently by using uniform overlay networks, interconnected through secure and efficient routing protocols. The aim of this article is to advance our previous work with a novel trust model based on a reputation metric that actively uses the social links between users and the model of interaction between them. We present and evaluate an adaptive model for trust management in structured overlay networks, based on a Mobile Cloud architecture and considering a honeycomb overlay. Such a model can be useful for supporting advanced mobile market-share e-Commerce platforms, where users collaborate and exchange reliable information about, for example, products of interest, and for supporting ad-hoc business campaigns.
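The abstract does not give the reputation metric's form. A common baseline that trust models of this kind build on is an exponentially weighted update mixing past reputation with new interaction outcomes, optionally discounted by social-link strength; all parameters below are illustrative assumptions, not the authors' metric:

    def update_reputation(old_rep, outcome, tie_strength, alpha=0.8):
        """EWMA-style reputation update: outcome in [0, 1] is the rating from
        the latest interaction, discounted by the social tie strength in [0, 1]."""
        weight = (1 - alpha) * tie_strength
        return (1 - weight) * old_rep + weight * outcome

    rep = 0.5
    for outcome, tie in [(1.0, 0.9), (1.0, 0.4), (0.0, 0.9)]:
        rep = update_reputation(rep, outcome, tie)
    print(round(rep, 3))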
Dent, Kevin; Lestou, Vaia; Humphreys, Glyn W
2010-02-01
It has been argued that area hMT+/V5 in humans acts as a motion filter, enabling targets defined by a conjunction of motion and form to be efficiently selected. We present data indicating (a) that damage to parietal cortex leads to a selective problem in processing motion-form conjunctions, and (b) that the presence of a structurally and functionally intact hMT+/V5 is not sufficient for efficient search for motion-form conjunctions. We suggest that, in addition to motion-processing areas (e.g., hMT+/V5), the posterior parietal cortex is necessary for efficient search with motion-form conjunctions, so that damage to either brain region may bring about deficits in search. We discuss the results in terms of the involvement of the posterior parietal cortex in the top-down guidance of search or in the binding of motion and form information.
Dynamic CDM strategies in an EHR environment.
Bieker, Michael; Bailey, Spencer
2012-02-01
A dynamic charge description master (CDM) integrates information from clinical ancillary systems into the charge-capture process, so an organization can reduce its reliance on the patient accounting system as the sole source of billing information. By leveraging the information from electronic ancillary systems, providers can eliminate the need for paper charge-capture forms and see increased accuracy and efficiency in the maintenance of billing information. Before embarking on a dynamic CDM strategy, organizations should first determine their goals for implementing an EHR system, include revenue cycle leaders on the EHR implementation team, and carefully weigh the pros and cons of CDM design decisions.
Exploring information technology adoption by family physicians: survey instrument validation.
Dixon, D. R.; Stewart, M.
2000-01-01
As the information needs of family physicians become more complex, there will be a greater need to successfully implement the technologies needed to manage that information. The ability to stratify primary care physicians can enable the implementation process to be more efficient. This research tested a new instrument on 101 family physicians, and was able to stratify physicians into high, intermediate, and low information technology (IT) usage groups. It is expected that this stratification would allow managers of IT implementation to target specific adoption strategies for each group. The instrument is available from ddixon@julian.uwo.ca. PMID:11079870
NASA Astrophysics Data System (ADS)
Ercan, İlke; Suyabatmaz, Enes
2018-06-01
The saturation in the efficiency and performance scaling of conventional electronic technologies brings about the development of novel computational paradigms. Brownian circuits are among the promising alternatives that can exploit fluctuations to increase the efficiency of information processing in nanocomputing. A Brownian cellular automaton, where signals propagate randomly and are driven by local transition rules, can be made computationally universal by embedding arbitrary asynchronous circuits on it. One of the potential realizations of such circuits is via single electron tunneling (SET) devices, since SET technology enables simulation of noise and fluctuations in a fashion similar to Brownian search. In this paper, we perform a physical-information-theoretic analysis of the efficiency limitations in Brownian NAND and half-adder circuits implemented using SET technology. The method we employ here establishes a solid ground for studying the computational and physical features of this emerging technology on an equal footing, and yields fundamental lower bounds that provide valuable insights into how far its efficiency can be improved in principle. To provide a basis for comparison, we also analyze a NAND gate and half-adder circuit implemented in complementary metal-oxide-semiconductor technology to show how the fundamental bounds of the Brownian circuits compare against a conventional paradigm.
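For context on the phrase "fundamental lower bounds": the canonical floor for an irreversible bit operation is Landauer's kT ln 2 of dissipation per erased bit. Whether the paper's bounds coincide with it is not stated in the abstract; the arithmetic at room temperature is simply:

    import math

    k_B = 1.380649e-23          # Boltzmann constant, J/K (exact since SI 2019)
    T = 300.0                   # assumed room temperature, K
    landauer_limit = k_B * T * math.log(2)
    print(f"{landauer_limit:.3e} J per erased bit")   # ~2.87e-21 J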
Identifying Home Care Clinicians’ Information Needs for Managing Fall Risks
Alhuwail, Dari
2016-01-01
Objectives: To help manage the risk of falls in home care, this study aimed to (i) identify home care clinicians’ information needs and how they manage missing or inaccurate data, (ii) identify problems that impact effectiveness and efficiency associated with retaining, exchanging, or processing information about fall risks in existing workflows and currently adopted health information technology (IT) solutions, and (iii) offer informatics-based recommendations to improve fall risk management interventions. Methods: A case study was carried out in a single not-for-profit suburban Medicare-certified home health agency with three branches. Qualitative data were collected over a six-month period through observations, semi-structured interviews, and focus groups. The Framework method was used for analysis. Maximum variation sampling was adopted to recruit a diverse sample of clinicians. Results: Overall, the information needs for fall risk management were categorized into physiological, care delivery, educational, social, environmental, and administrative domains. Examples include a brief fall-related patient history, weight-bearing status, medications that affect balance, availability of caregivers at home, and the influence of patients’ cultures on fall management interventions. The unavailability and inaccuracy of critical information related to fall risks can delay necessary therapeutic services aimed at reducing patients’ risk for falling, thereby jeopardizing their safety. Currently adopted IT solutions did not adequately accommodate data related to fall risk management. Conclusion: The results highlight the essential information for fall risk management in home care. Home care workflows and health IT solutions must effectively and efficiently retain, exchange, and process information necessary for fall risk management. Interoperability and integration of the various health IT solutions to make data sharing accessible to all clinicians is critical for fall risk management. Findings from this study can help home health agencies better understand their information needs to manage fall risks. PMID:27437035
GUILD: GUidance for Information about Linking Data sets
Gilbert, Ruth; Lafferty, Rosemary; Hagger-Johnson, Gareth; Zhang, Li-Chun; Smith, Peter; Dibben, Chris; Goldstein, Harvey
2018-01-01
Record linkage of administrative and survey data is increasingly used to generate evidence to inform policy and services. Although a powerful and efficient way of generating new information from existing data sets, errors related to data processing before, during and after linkage can bias results. However, researchers and users of linked data rarely have access to information that can be used to assess these biases or take them into account in analyses. As linked administrative data are increasingly used to provide evidence to guide policy and services, linkage error, which disproportionately affects disadvantaged groups, can undermine evidence for public health. We convened a group of researchers and experts from government data providers to develop guidance about the information that needs to be made available about the data linkage process, by data providers, data linkers, analysts and the researchers who write reports. The guidance goes beyond recommendations for information to be included in research reports. Our aim is to raise awareness of information that may be required at each step of the linkage pathway to improve the transparency, reproducibility, and accuracy of linkage processes, and the validity of analyses and interpretation of results. PMID:28369581
Edge-Based Efficient Search over Encrypted Data Mobile Cloud Storage
Liu, Fang; Cai, Zhiping; Xiao, Nong; Zhao, Ziming
2018-01-01
Smart sensor-equipped mobile devices sense, collect, and process data generated by the edge network to achieve intelligent control, but such mobile devices usually have limited storage and computing resources. Mobile cloud storage provides a promising solution owing to its rich storage resources, great accessibility, and low cost. But it also brings a risk of information leakage. The encryption of sensitive data is the basic step to resist this risk. However, deploying a high-complexity encryption and decryption algorithm on mobile devices greatly increases the burden of terminal operation and the difficulty of implementing the necessary privacy protection algorithm. In this paper, we propose ENSURE (EfficieNt and SecURE), an efficient and secure encrypted search architecture over mobile cloud storage. ENSURE is inspired by edge computing. It allows mobile devices to offload the computation-intensive task onto the edge server to achieve high efficiency. In addition, to protect data security, it reduces the information acquisition of the untrusted cloud by hiding the relevance between query keywords and search results from the cloud. Experiments on a real data set show that ENSURE reduces the computation time by 15% to 49% and saves energy consumption by 38% to 69% per query. PMID:29652810
2016-02-23
supports the warfighter; promotes accountability, integrity, and efficiency; advises the Secretary of Defense and Congress; and informs the public...cause correction, and status accounting of individual product quality deficiencies. The process primarily focuses on the following roles...the contractor incorrectly manufactured all 100 parts and the contractor agreed to replace them if returned. To properly account for all 100
What makes a competent clinical teacher?
Wealthall, Stephen; Henning, Marcus
2012-01-01
Background Clinical teaching competency is a professional necessity ensuring that clinicians’ knowledge, skills and attitudes are effectively transmitted from experts to novices. The aim of this paper is to consider how clinical skills are transmitted from a historical and reflective perspective and to link these ideas with student and teacher perceptions of competence in clinical teaching. Methods The reflections are informed by a Delphi process and professional development survey designed to capture students’ and clinicians’ ideas about the attributes of a competent clinical teacher. In addition, the survey process obtained information on the importance and ‘teachability’ of these characteristics. Results Four key characteristics of the competent teacher emerged from the Delphi process: clinically competent, efficient organizer, group communicator and person-centred. In a subsequent survey, students were found to be more optimistic about the ‘teachability’ of these characteristics than clinicians and scored the attribute of person-centredness higher than clinicians. Clinicians, on the other hand, ascribed higher levels of importance to clinical competency, efficient organization and group communication than students. Conclusions The Delphi process created a non-threatening system for gathering student and clinician expectations of teachers and created a foundation for developing methods for evaluating clinical competency. This provided insights into differences between teachers’ and students’ expectations, their importance, and professional development. PMID:26451184
Lee, Young Han
2012-01-01
The objectives are (1) to introduce an easy open-source macro program as connection software and (2) to illustrate its practical usage in the radiologic reading environment by simulating the radiologic reading process. The simulation is a set of radiologic reading processes that perform practical tasks in the radiologic reading room. The principal processes are: (1) to view radiologic images on the Picture Archiving and Communicating System (PACS), (2) to connect to the HIS/EMR (Hospital Information System/Electronic Medical Record) system, (3) to make an automatic radiologic reporting system, and (4) to record and recall information on interesting cases. This simulation environment was designed using an open-source macro program as connection software. The simulation performed well on the Windows-based PACS workstation. Radiologists practiced the steps of the simulation comfortably by utilizing the macro-powered radiologic environment. This macro program could automate several cumbersome manual steps in the radiologic reading process. The program successfully acts as connection software for the PACS software, EMR/HIS, spreadsheets, and various other input devices in the radiologic reading environment. A user-friendly, efficient radiologic reading environment can be established by utilizing an open-source macro program as connection software.
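As a rough sketch of the glue logic such connection software provides (the function names and fields below are hypothetical; the actual program drives the PACS and HIS/EMR applications directly rather than through a Python API), the core idea is to pull identifiers from one system and fill a report template for another:

```python
from string import Template

# Hypothetical report template; the field names are illustrative only.
REPORT = Template(
    "Patient: $patient_id\nStudy: $study\nFindings: $findings\nImpression: $impression\n"
)

def fetch_from_his(patient_id: str) -> dict:
    """Placeholder for the HIS/EMR lookup the macro automates (assumed interface)."""
    return {"patient_id": patient_id, "study": "Knee MRI"}

def draft_report(patient_id: str, findings: str, impression: str) -> str:
    """Combine HIS/EMR data with dictated text into a ready-to-paste report."""
    record = fetch_from_his(patient_id)
    return REPORT.substitute(record, findings=findings, impression=impression)

print(draft_report("12345", "Intact ACL.", "No acute abnormality."))
```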
Photogrammetric Processing of Planetary Linear Pushbroom Images Based on Approximate Orthophotos
NASA Astrophysics Data System (ADS)
Geng, X.; Xu, Q.; Xing, S.; Hou, Y. F.; Lan, C. Z.; Zhang, J. J.
2018-04-01
Efficiently producing planetary mapping products from orbital remote sensing images remains a highly challenging task. Photogrammetric processing of planetary stereo images suffers from many disadvantages, such as a lack of ground control information and of informative features; among these, image matching is the most difficult job in planetary photogrammetry. This paper presents a photogrammetric processing framework for planetary remote sensing images based on approximate orthophotos. Both tie-point extraction for bundle adjustment and dense image matching for generating a digital terrain model (DTM) are performed on approximate orthophotos. Since most planetary remote sensing images are acquired by linear scanner cameras, we mainly deal with linear pushbroom images. In order to improve the computational efficiency of orthophoto generation and coordinate transformation, a fast back-projection algorithm for linear pushbroom images is introduced. Moreover, an iteratively refined DTM-and-orthophoto scheme is adopted in the DTM generation process, which helps reduce the search space of image matching and improve the matching accuracy of conjugate points. With the advantages of approximate orthophotos, the matching results for planetary remote sensing images can be greatly improved. We tested the proposed approach with Mars Express (MEX) High Resolution Stereo Camera (HRSC) and Lunar Reconnaissance Orbiter (LRO) Narrow Angle Camera (NAC) images. The preliminary experimental results demonstrate the feasibility of the proposed approach.
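For readers unfamiliar with pushbroom geometry, the sketch below shows one simple way to back-project a ground point: because each scanline has its own exterior orientation, back-projection reduces to finding the scanline whose sensor plane contains the point. This is an illustrative simplification, not the paper's specific fast algorithm, and it assumes the along-track coordinate varies monotonically with scanline index (true for a smooth orbital trajectory).

```python
import numpy as np

def back_project(X, C, R, focal):
    """Return (scanline index, cross-track coordinate) for ground point X.

    C: (n, 3) per-line projection centres; R: (n, 3, 3) world-to-sensor
    rotations; focal: focal length. Simplified illustrative model.
    """
    def along_track(i):
        v = R[i] @ (X - C[i])        # point in the sensor frame of line i
        return -focal * v[0] / v[2]  # along-track image coordinate x(i)

    lo, hi = 0, len(C) - 1
    x_lo = along_track(lo)
    while hi - lo > 1:               # bisection on the sign change of x(i)
        mid = (lo + hi) // 2
        if along_track(mid) * x_lo > 0:
            lo = mid                 # same sign as x(lo): root lies above mid
        else:
            hi = mid
    i = lo if abs(along_track(lo)) <= abs(along_track(hi)) else hi
    v = R[i] @ (X - C[i])
    y = -focal * v[1] / v[2]         # cross-track (sample) coordinate
    return i, y
```

The binary search over scanlines is what makes back-projection fast here: it avoids evaluating the collinearity condition against every line's orientation.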
Electrode/workpiece combinations
NASA Astrophysics Data System (ADS)
Benedict, J. J.
1989-10-01
Of the many machine tool operations available in the shop today, plunge-cut Electrical Discharge Machining (EDM) has become an increasingly useful method of materials fabrication. It is a necessary tool for the research and development type of work performed at the Lawrence Livermore National Laboratory (LLNL). With advancing technology, plunge-cut EDMs are more efficient, faster, have greater accuracy, and are able to produce better surface finishes. They have been in the past and will continue to be an important part of the production of quality parts in both the Precision and NC Shop. It should be kept in mind that, as a non-traditional machining process, EDMing is a time-consuming process that can be a very expensive method of producing parts. For this reason, it must be used in the most efficient manner to make it a cost-effective means of fabrication. Although technology has advanced to the point of state-of-the-art equipment, there is currently a void in the technical information available for use with this process. The information sought concerns electrode/workpiece combinations, that is, the task of choosing the correct electrode material for the specific workpiece material encountered. A brief description of the EDM process will help in understanding the electrode/workpiece relationship.
Do humans make good decisions?
Summerfield, Christopher; Tsetsos, Konstantinos
2014-01-01
Human performance on perceptual classification tasks approaches that of an ideal observer, but economic decisions are often inconsistent and intransitive, with preferences reversing according to the local context. We discuss the view that suboptimal choices may result from the efficient coding of decision-relevant information, a strategy that allows expected inputs to be processed with higher gain than unexpected inputs. Efficient coding leads to ‘robust’ decisions that depart from optimality but maximise the information transmitted by a limited-capacity system in a rapidly changing world. We review recent work showing that when perceptual environments are variable or volatile, perceptual decisions exhibit the same suboptimal context-dependence as economic choices, and propose a general computational framework that accounts for findings across the two domains. PMID:25488076
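A minimal numerical sketch of this idea (our illustration, not the authors' model): if the gain of a saturating, limited-capacity code adapts to the spread of recent inputs, the same stimulus evokes different responses in stable versus volatile contexts, which is the kind of context dependence described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def efficient_response(stimulus, context, capacity=1.0):
    """Divisive-normalisation sketch: gain adapts to the recent input spread,
    so expected inputs are encoded with higher resolution (illustrative only)."""
    gain = capacity / (np.std(context) + 1e-9)
    # tanh gives a saturating, limited-capacity response
    return np.tanh(gain * (stimulus - np.mean(context)))

narrow = rng.normal(10, 1, 500)   # stable environment
wide   = rng.normal(10, 5, 500)   # volatile environment
print(efficient_response(12, narrow))  # large response: +2 units is surprising
print(efficient_response(12, wide))    # small response: +2 units is expected noise
```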
No-Regrets Remodeling, 2nd Edition
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2013-12-01
No-Regrets Remodeling, sponsored by Oak Ridge National Laboratory, is an informative publication that walks homeowners and/or remodelers through various home remodeling projects. In addition to remodeling information, the publication provides instruction on how to incorporate energy efficiency into the remodeling process. The goal of the publication is to improve homeowner satisfaction after completing a remodeling project and to provide the homeowner with a home that saves energy and is comfortable and healthy.
Sanocki, Thomas; Dyson, Mary C
2012-01-01
Letter identification is a critical front end of the reading process. In general, conceptualizations of the identification process have emphasized arbitrary sets of distinctive features. However, a richer view of letter processing incorporates principles from the field of type design, including an emphasis on uniformities across letters within a font. The importance of uniformities is supported by a small body of research indicating that consistency of font increases letter identification efficiency. We review design concepts and the relevant literature, with the goal of stimulating further thinking about letter processing during reading.
75 FR 57438 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-21
... collection of this information is to assist state and federal Natural Resource Trustees in more efficiently... alternatives for natural resource injuries and service losses requiring restoration during the restoration planning phase of the NRDA process. Affected Public: State, local or tribal government; business or other...
Systems and Cascades in Cognitive Development and Academic Achievement
ERIC Educational Resources Information Center
Bornstein, Marc H.; Hahn, Chun-Shin; Wolke, Dieter
2013-01-01
A large-scale (N = 552) controlled multivariate prospective 14-year longitudinal study of a developmental cascade embedded in a developmental system showed that information-processing efficiency in infancy (4 months), general mental development in toddlerhood (18 months), behavior difficulties in early childhood (36 months),…
Characterizing Mechanical and Flow Properties using Injection Falloff Tests, March 28, 2011
This presentation asserts that injection fall-off testing is an efficient way to derive in-situ information for most rock types, that after-closure analysis can derive rock transmissibility and pore fluid pressure, and that these results are used to assist in the hydraulic fracturing (HF) process.
76 FR 58252 - Applications for New Awards; Statewide, Longitudinal Data Systems Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-20
... DEPARTMENT OF EDUCATION Applications for New Awards; Statewide, Longitudinal Data Systems Program... analysis and informed decision- making at all levels of the education system, increase the efficiency with... accountability systems, and simplify the processes used by SEAs to make education data transparent through...
Technical Note: scuda: A software platform for cumulative dose assessment.
Park, Seyoun; McNutt, Todd; Plishker, William; Quon, Harry; Wong, John; Shekhar, Raj; Lee, Junghoon
2016-10-01
Accurate tracking of anatomical changes and computation of the actually delivered dose to the patient are critical for successful adaptive radiation therapy (ART). Additionally, efficient data management and fast processing are practically important for adoption in the clinic, as ART involves a large amount of image and treatment data. The purpose of this study was to develop an accurate and efficient Software platform for CUmulative Dose Assessment (scuda) that can be seamlessly integrated into the clinical workflow. scuda consists of deformable image registration (DIR), segmentation, and dose computation modules, and a graphical user interface. It is connected to our image PACS and radiotherapy informatics databases, from which it automatically queries/retrieves patient images, the radiotherapy plan, beam data, and daily treatment information, thus providing an efficient and unified workflow. For accurate registration of the planning CT and daily CBCTs, the authors iteratively correct CBCT intensities by matching local intensity histograms during the DIR process. Contours of the target tumor and critical structures are then propagated from the planning CT to daily CBCTs using the computed deformations. The actual delivered daily dose is computed using the registered CT and patient setup information by a superposition/convolution algorithm, and accumulated using the computed deformation fields. Both the DIR and dose computation modules are accelerated by a graphics processing unit. The cumulative dose computation process has been validated on 30 head and neck (HN) cancer cases, showing 3.5 ± 5.0 Gy (mean ± STD) absolute mean dose differences between the planned and the actually delivered doses in the parotid glands. On average, DIR, dose computation, and segmentation take 20 s/fraction and 17 min for a 35-fraction treatment including additional computation for dose accumulation. The authors developed a unified software platform that provides accurate and efficient monitoring of anatomical changes and computation of the actually delivered dose to the patient, thus realizing an efficient cumulative dose computation workflow. Evaluation on HN cases demonstrated the utility of the platform for monitoring treatment quality and detecting significant dosimetric variations that are key to successful ART.
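The dose-accumulation step lends itself to a compact illustration. The following Python sketch (not the scuda implementation; the array layouts are assumed) warps each fraction's dose into the planning-CT frame using the DIR-derived deformation field and sums the results:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def accumulate_dose(daily_doses, deformations):
    """Warp each daily dose into the planning-CT frame and sum (illustrative
    sketch of dose accumulation, not the scuda implementation).

    daily_doses:  list of 3-D dose grids, one per fraction
    deformations: list of (3, nz, ny, nx) arrays giving, for each planning-CT
                  voxel, the corresponding coordinate in the daily image
    """
    total = np.zeros_like(daily_doses[0])
    for dose, dvf in zip(daily_doses, deformations):
        # trilinear resampling of the daily dose at the deformed coordinates
        total += map_coordinates(dose, dvf, order=1, mode="nearest")
    return total
```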
Telematic integration of health data: a practicable contribution.
Guerriero, Lorenzo; Ferdeghini, Ezio M; Viola, Silvia R; Porro, Ivan; Testi, Angela; Bedini, Remo
2011-09-01
Patients' clinical and healthcare data should be available virtually everywhere, both to provide a more efficient and effective medical approach to their pathologies and to enable public healthcare decision makers to verify the efficacy and efficiency of the adopted healthcare processes. Unfortunately, the customised solutions adopted by many local Health Information Systems in Italy make it difficult to share the stored data outside their own environment. In recent years, worldwide initiatives have aimed to overcome this sharing limitation. An important issue in the transition towards standardised, integrated information systems is the possible loss of previously collected data. The project presented here realises a suitable architecture able to guarantee reliable, automatic, user-transparent storage and retrieval of information from both modern and legacy systems. The technical and management solutions provided by the project avoid data loss and overlapping, and allow data integration and organisation suitable for data-mining and data-warehousing analysis.
Chen, Qiaomei; Pei, Zhiqiang; Xu, Yanshuang; Li, Zhen; Yang, Yang
2017-01-01
Efficient and cost-effective solar steam generation requires self-floating evaporators which can convert light into heat, prevent unnecessary heat loss and greatly accelerate evaporation without solar concentrators. Currently, the most efficient evaporators (efficiency of ∼80% under 1 sun) are invariably built from inorganic materials, which are difficult to mold into monolithic sheets. Here, we present a new polymer which can be easily solution processed into a self-floating monolithic foam. The single-component foam can be used as an evaporator with an efficiency at 1 sun comparable to that of the best graphene-based evaporators. Even at 0.5 sun, the efficiency can reach 80%. Moreover, the foam is mechanically strong, thermally stable to 300 °C and chemically resistant to organic solvents. PMID:29629127
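As a sanity check on the quoted figures (our back-of-envelope calculation, using the textbook latent heat of vaporisation of water and the 1 kW/m² definition of 1 sun, neither of which is stated in the abstract), the implied steam flux is about 1.3 kg m⁻² h⁻¹ at 1 sun and 80% efficiency:

```python
# Back-of-envelope check of the quoted efficiencies (assumed constants:
# 1 sun = 1000 W/m^2, latent heat of vaporisation h_lv ~ 2.256e6 J/kg).
H_LV = 2.256e6      # J/kg
ONE_SUN = 1000.0    # W/m^2

def evaporation_rate(efficiency, suns):
    """Steam mass flux (kg/m^2/h) implied by a given conversion efficiency."""
    return efficiency * suns * ONE_SUN / H_LV * 3600

print(evaporation_rate(0.80, 1.0))  # ~1.28 kg/m^2/h at 1 sun, 80% efficiency
print(evaporation_rate(0.80, 0.5))  # ~0.64 kg/m^2/h at 0.5 sun
```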