Sample records for information processing methods

  1. Methods, media and systems for managing a distributed application running in a plurality of digital processing devices

    DOEpatents

    Laadan, Oren; Nieh, Jason; Phung, Dan

    2012-10-02

    Methods, media and systems for managing a distributed application running in a plurality of digital processing devices are provided. In some embodiments, a method includes running one or more processes associated with the distributed application in virtualized operating system environments on a plurality of digital processing devices, suspending the one or more processes, and saving network state information relating to network connections among the one or more processes. The method further includes storing process information relating to the one or more processes, recreating the network connections using the saved network state information, and restarting the one or more processes using the stored process information.
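
    The checkpoint/restart cycle this abstract describes can be sketched roughly as follows. This is a minimal illustration assuming duck-typed process and host objects with suspend/serialize/reconnect interfaces invented here; real implementations of this kind of patent operate at the operating-system level, not in Python.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class NetworkState:
        # (local endpoint, peer endpoint) for every open connection
        connections: list = field(default_factory=list)

    @dataclass
    class ProcessState:
        pid: int
        image: bytes            # serialized process state
        net: NetworkState

    def checkpoint(processes):
        """Suspend all processes first, then record process and network state."""
        for p in processes:
            p.suspend()                          # assumed interface
        return [ProcessState(p.pid, p.serialize(),
                             NetworkState(list(p.open_connections())))
                for p in processes]

    def restart(saved, host):
        """Recreate the saved network connections, then resume each process."""
        for s in saved:
            for local, peer in s.net.connections:
                host.reconnect(local, peer)      # assumed interface
            host.restore_process(s.pid, s.image)
    ```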

  2. New Method for Knowledge Management Focused on Communication Pattern in Product Development

    NASA Astrophysics Data System (ADS)

    Noguchi, Takashi; Shiba, Hajime

    In the field of manufacturing, the importance of utilizing knowledge and know-how has been growing. Against this background, there is a need for new methods to efficiently accumulate and extract effective knowledge and know-how. To facilitate the extraction of the knowledge and know-how needed by engineers, we first defined business process information, which includes schedule/progress information, document data, information about communication among the parties concerned, and correspondence information that links these three types of information together. Based on our definitions, we proposed an IT system (FlexPIM: Flexible and collaborative Process Information Management) to register and accumulate business process information with the least effort. To efficiently extract effective information from huge volumes of accumulated business process information, focusing attention on “actions” and communication patterns, we propose a new extraction method using communication patterns. The validity of this method has been verified for several communication patterns.

  3. Information Integration for Concurrent Engineering (IICE) IDEF3 Process Description Capture Method Report

    DTIC Science & Technology

    1995-09-01

    vital processes of a business. process, IDEF, method, methodology, modeling, knowledge acquisition, requirements definition, information systems... knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be leveraged to... integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key enablers for high quality systems

  4. Energy Survey of Machine Tools: Separating Power Information of the Main Transmission System During Machining Process

    NASA Astrophysics Data System (ADS)

    Liu, Shuang; Liu, Fei; Hu, Shaohua; Yin, Zhenbiao

    The major power information of the main transmission system in machine tools (MTSMT) during the machining process includes the effective output power (i.e. cutting power), the input power and power loss of the mechanical transmission system, and the main motor power loss. This information is easy to obtain in the lab but difficult to evaluate in a manufacturing process. To solve this problem, a separation method is proposed here to extract the MTSMT power information during the machining process. In this method, the energy flow and the mathematical models of the major power information of the MTSMT during the machining process are set up first. Based on the mathematical models and the basic data tables obtained from experiments, the above-mentioned power information during the machining process can be separated just by measuring the real-time total input power of the spindle motor. The operating procedure of this method is also given.
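
    Read as equations, the separation amounts to subtracting calibrated loss terms from the one quantity that is measured online. The notation below is our own paraphrase of the power balance, not the paper's exact model:

    ```latex
    % Power balance of the main transmission system (notation assumed here):
    %   P_in(t)   real-time input power of the spindle motor (the one measured signal)
    %   P_em(t)   main motor electrical loss, from calibrated data tables
    %   P_mech(n) mechanical transmission loss at spindle speed n, from data tables
    %   P_cut(t)  effective output (cutting) power -- the unknown to be separated
    P_{\mathrm{in}}(t) \;=\; P_{\mathrm{em}}(t) + P_{\mathrm{mech}}(n) + P_{\mathrm{cut}}(t)
    \quad\Longrightarrow\quad
    P_{\mathrm{cut}}(t) \;=\; P_{\mathrm{in}}(t) - P_{\mathrm{em}}(t) - P_{\mathrm{mech}}(n)
    ```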

  5. Pilot evaluation of a method to assess prescribers' information processing of medication alerts.

    PubMed

    Russ, Alissa L; Melton, Brittany L; Daggy, Joanne K; Saleem, Jason J

    2017-02-01

    Prescribers commonly receive alerts during medication ordering. Prescribers work in a complex, time-pressured environment; to enhance the effectiveness of safety alerts, the effort needed to cognitively process these alerts should be minimized. Methods to evaluate the extent to which computerized alerts support prescribers' information processing are lacking. To develop a methodological protocol to assess the extent to which alerts support prescribers' information processing at-a-glance; specifically, the incorporation of information into their working memory. We hypothesized that the method would be feasible and that we would be able to detect a significant difference in prescribers' information processing with a revised alert display that incorporates warning design guidelines compared to the original alert display. A counterbalanced, within-subject study was conducted with 20 prescribers in a human-computer interaction laboratory. We tested a single alert that was displayed in two different ways. Prescribers were informed that an alert would appear for 10 s. After the alert was shown, a white screen was displayed, and prescribers were asked to verbally describe what they saw; indicate how many warnings were shown in total; and describe anything else they remembered about the alert. We measured information processing via the accuracy of prescribers' free recall and their ability to identify that three warning messages were present. Two analysts independently evaluated participants' responses against a comprehensive catalog of alert elements and then discussed discrepancies until reaching consensus. This feasibility study demonstrated that the method seemed to be effective for evaluating prescribers' information processing of medication alert displays. With this method, we were able to detect significant differences in prescribers' recall of alert information. The proportion of total data elements that prescribers were able to accurately recall was significantly greater for the revised versus original alert display (p=0.006). With the revised display, more prescribers accurately reported that three warnings were shown (p=0.002). The methodological protocol was feasible for evaluating the alert display and yielded important findings on prescribers' information processing. Study methods supplement traditional usability evaluation methods and may be useful for evaluating information processing of other healthcare technologies. Published by Elsevier Inc.

  6. Evaluation of a Spatial Data Management System for Basic Skills Education

    DTIC Science & Technology

    1986-03-01

    levels (see Craik & Lockhart, 1972). These methods include verbal and imaginal elaboration (Weinstein, 1978; Weinstein et al., 1979), and a variety of... strategies at a more specific level... Information processing strategies are methods to aid acquisition, retention, or retrieval of information. These... methods generally are designed to force students to process information at deeper, semantic or imaginal, levels of processing, rather than at shallower

  7. Talk as a Metacognitive Strategy during the Information Search Process of Adolescents

    ERIC Educational Resources Information Center

    Bowler, Leanne

    2010-01-01

    Introduction: This paper describes a metacognitive strategy related to the social dimension of the information search process of adolescents. Method: A case study that used naturalistic methods to explore the metacognitive thinking and associated emotions of ten adolescents. The study was framed by Kuhlthau's Information Search Process model and…

  8. The Effects of Presentation Method and Information Density on Visual Search Ability and Working Memory Load

    ERIC Educational Resources Information Center

    Chang, Ting-Wen; Kinshuk; Chen, Nian-Shing; Yu, Pao-Ta

    2012-01-01

    This study investigates the effects of successive and simultaneous information presentation methods on learner's visual search ability and working memory load for different information densities. Since the processing of information in the brain depends on the capacity of visual short-term memory (VSTM), the limited information processing capacity…

  9. Composing Models of Geographic Physical Processes

    NASA Astrophysics Data System (ADS)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central to geographic information science; yet geographic information systems (GIS) lack capabilities to represent process-related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, which is shown by means of an example.
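
    As a concrete instance of the two equivalent descriptions, a diffusion-type process (a common prototype for physical processes in geography; the specific equation is our illustrative choice, not taken from the paper) can be written either way:

    ```latex
    % Continuous description: a diffusion-type PDE for a quantity u(x, t)
    \frac{\partial u}{\partial t} \;=\; D\,\frac{\partial^{2} u}{\partial x^{2}}

    % Equivalent discrete description: the corresponding partial difference
    % equation on a grid, the form suited to graphical entry
    \frac{u_{i}^{t+1} - u_{i}^{t}}{\Delta t}
      \;=\; D\,\frac{u_{i+1}^{t} - 2\,u_{i}^{t} + u_{i-1}^{t}}{(\Delta x)^{2}}
    ```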

  10. An analytical approach to customer requirement information processing

    NASA Astrophysics Data System (ADS)

    Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong

    2013-11-01

    'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.

  11. Characterizing the Processes for Navigating Internet Health Information Using Real-Time Observations: A Mixed-Methods Approach

    PubMed Central

    Paterniti, Debora A; Wilson, Machelle; Bell, Robert A; Chan, Man Shan; Villareal, Chloe C; Nguyen, Hien Huy; Kravitz, Richard L

    2015-01-01

    Background Little is known about the processes people use to find health-related information on the Internet or the individual characteristics that shape selection of information-seeking approaches. Objective Our aim was to describe the processes by which users navigate the Internet for information about a hypothetical acute illness and to identify individual characteristics predictive of their information-seeking strategies. Methods Study participants were recruited from public settings and agencies. Interested individuals were screened for eligibility using an online questionnaire. Participants listened to one of two clinical scenarios—consistent with influenza or bacterial meningitis—and then conducted an Internet search. Screen-capture video software captured Internet search mouse clicks and keystrokes. Each step of the search was coded as hypothesis testing (etiology), evidence gathering (symptoms), or action/treatment seeking (behavior). The coded steps were used to form a step-by-step pattern of each participant’s information-seeking process. A total of 78 Internet health information seekers ranging from 21-35 years of age and who experienced barriers to accessing health care services participated. Results We identified 27 unique patterns of information seeking, which were grouped into four overarching classifications based on the number of steps taken during the search, whether a pattern consisted of developing a hypothesis and exploring symptoms before ending the search or searching an action/treatment, and whether a pattern ended with action/treatment seeking. Applying dual-processing theory, we categorized the four overarching pattern classifications as either System 1 (41%, 32/78), unconscious, rapid, automatic, and high capacity processing; or System 2 (59%, 46/78), conscious, slow, and deliberative processing. Using multivariate regression, we found that System 2 processing was associated with higher education and younger age. Conclusions We identified and classified two approaches to processing Internet health information. System 2 processing, a methodical approach, most resembles the strategies for information processing that have been found in other studies to be associated with higher-quality decisions. We conclude that the quality of Internet health-information seeking could be improved through consumer education on methodical Internet navigation strategies and the incorporation of decision aids into health information websites. PMID:26194787
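
    The coding and classification scheme can be sketched as follows; the threshold and rule used here are illustrative assumptions, not the study's actual grouping criteria:

    ```python
    # Codes: H = hypothesis testing (etiology), E = evidence gathering (symptoms),
    #        A = action/treatment seeking (behavior)
    def classify_pattern(steps: list) -> str:
        """Classify a coded search pattern as System 1 or System 2 processing.

        Illustrative rule only: short searches that jump straight to
        action/treatment look like fast, automatic System 1 processing;
        longer searches that develop a hypothesis and gather evidence first
        look like slow, deliberative System 2 processing.
        """
        pattern = "".join(steps)
        if len(steps) <= 3 or pattern.startswith("A"):
            return "System 1"  # rapid, automatic, high capacity
        if "H" in pattern and "E" in pattern:
            return "System 2"  # conscious, slow, deliberative
        return "System 1"

    print(classify_pattern(["A"]))                      # System 1
    print(classify_pattern(["H", "E", "E", "H", "A"]))  # System 2
    ```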

  12. System approach to modeling of industrial technologies

    NASA Astrophysics Data System (ADS)

    Toropov, V. S.; Toropov, E. S.

    2018-03-01

    The authors presented a system of methods for modeling and improving industrial technologies. The system consists of an information part and a software part. The information part is structured information about industrial technologies; the structure follows a template with several essential categories used to improve the technological process and eliminate weaknesses in the process chain. The base category is the physical effect that takes place as the technological process proceeds. The software part of the system can apply various methods of creative search to the content stored in the information part. These methods pay particular attention to energy transformations in the technological process. Applying the system will allow the approach to improving technologies and obtaining new technical solutions to be systematized.

  13. Improving Informed Consent with Minority Participants: Results from Researcher and Community Surveys

    PubMed Central

    Quinn, Sandra Crouse; Garza, Mary A.; Butler, James; Fryer, Craig S.; Casper, Erica T.; Thomas, Stephen B.; Barnard, David; Kim, Kevin H.

    2013-01-01

    Strengthening the informed consent process is one avenue for improving recruitment of minorities into research. This study examines that process from two different perspectives, that of researchers and that of African American and Latino community members. Through the use of two separate surveys, we compared strategies used by researchers with the preferences and attitudes of community members during the informed consent process. Our data suggest that researchers can improve the informed consent process by incorporating methods preferred by the community members along with methods shown in the literature for increasing comprehension. With this approach, the informed consent process may increase both participants’ comprehension of the material and overall satisfaction, fostering greater trust in research and openness to future research opportunities. PMID:23324203

  14. "Glitch Logic" and Applications to Computing and Information Security

    NASA Technical Reports Server (NTRS)

    Stoica, Adrian; Katkoori, Srinivas

    2009-01-01

    This paper introduces a new method of information processing in digital systems and discusses its potential benefits for computing and information security. The new method exploits glitches caused by delays in logic circuits for carrying and processing information. Glitch processing is hidden from conventional logic analyses and undetectable by traditional reverse engineering techniques. It enables the creation of new logic design methods that allow for an additional controllable "glitch logic" processing layer embedded into conventional synchronous digital circuits as a hidden/covert information flow channel. The combination of synchronous logic with a specific glitch logic design acting as an additional computing channel reduces the number of equivalent logic designs resulting from synthesis, thus implicitly reducing the possibility of modification and/or tampering with the design. The hidden information channel produced by the glitch logic can be used: 1) for covert computing/communication, 2) to prevent reverse engineering, tampering, and alteration of the design, and 3) to act as a channel for information infiltration/exfiltration and propagation of viruses/spyware/Trojan horses.
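
    To see where the exploited glitches come from, consider the classic static hazard: with unequal gate delays, the nominally constant-0 circuit a AND (NOT a) emits a brief pulse when its input rises. The toy discrete-time simulation below is our own illustration, not the paper's design method:

    ```python
    # Simulate y = a AND (NOT a) with a 2-tick inverter delay.
    # Ideally y is always 0; the delay makes a transient glitch appear.
    def simulate(a_wave, inverter_delay=2):
        y = []
        for t, a in enumerate(a_wave):
            # the inverter output lags its input by `inverter_delay` ticks
            b = 1 - a_wave[max(t - inverter_delay, 0)]
            y.append(a & b)
        return y

    a = [0, 0, 0, 1, 1, 1, 1, 1]       # input rises at t = 3
    print(simulate(a))                 # [0, 0, 0, 1, 1, 0, 0, 0] <- glitch at t = 3..4
    ```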

  15. A Study on Improving Information Processing Abilities Based on PBL

    ERIC Educational Resources Information Center

    Kim, Du Gyu; Lee, JaeMu

    2014-01-01

    This study examined an instruction method for the improvement of information processing abilities in elementary school students. Current elementary students are required to develop information processing abilities to create new knowledge for this digital age. There is, however, a shortage of instruction strategies for these information processing…

  16. Improving informed consent: Stakeholder views

    PubMed Central

    Anderson, Emily E.; Newman, Susan B.; Matthews, Alicia K.

    2017-01-01

    Purpose Innovation will be required to improve the informed consent process in research. We aimed to obtain input from key stakeholders—research participants and those responsible for obtaining informed consent—to inform potential development of a multimedia informed consent “app.” Methods This descriptive study used a mixed-methods approach. Five 90-minute focus groups were conducted with volunteer samples of former research participants and researchers/research staff responsible for obtaining informed consent. Participants also completed a brief survey that measured background information and knowledge and attitudes regarding research and the use of technology. Established qualitative methods were used to conduct the focus groups and data analysis. Results We conducted five focus groups with 41 total participants: three groups with former research participants (total n = 22), and two groups with researchers and research coordinators (total n = 19). Overall, individuals who had previously participated in research had positive views regarding their experiences. However, further discussion elicited that the informed consent process often did not meet its intended objectives. Findings from both groups are presented according to three primary themes: content of consent forms, experience of the informed consent process, and the potential of technology to improve the informed consent process. A fourth theme, need for lay input on informed consent, emerged from the researcher groups. Conclusions Our findings add to previous research that suggests that the use of interactive technology has the potential to improve the process of informed consent. However, our focus-group findings provide additional insight that technology cannot replace the human connection that is central to the informed consent process. More research that incorporates the views of key stakeholders is needed to ensure that multimedia consent processes do not repeat the mistakes of paper-based consent forms. PMID:28949896

  17. Problems and Processes in Medical Encounters: The CASES method of dialogue analysis

    PubMed Central

    Laws, M. Barton; Taubin, Tatiana; Bezreh, Tanya; Lee, Yoojin; Beach, Mary Catherine; Wilson, Ira B.

    2013-01-01

    Objective To develop methods to reliably capture structural and dynamic temporal features of clinical interactions. Methods Observational study of 50 audio-recorded routine outpatient visits to HIV specialty clinics, using innovative analytic methods. The Comprehensive Analysis of the Structure of Encounters System (CASES) uses transcripts coded for speech acts, then imposes larger-scale structural elements: threads – the problems or issues addressed; and processes within threads – basic tasks of clinical care labeled Presentation, Information, Resolution (decision making) and Engagement (interpersonal exchange). Threads are also coded for the nature of resolution. Results 61% of utterances are in presentation processes. Provider verbal dominance is greatest in information and resolution processes, which also contain a high proportion of provider directives. About half of threads result in no action or decision. Information flows predominantly from patient to provider in presentation processes, and from provider to patient in information processes. Engagement is rare. Conclusions In these data, resolution is provider centered; more time for patient participation in resolution, or interpersonal engagement, would have to come from presentation. Practice Implications Awareness of the use of time in clinical encounters, and the interaction processes associated with various tasks, may help make clinical communication more efficient and effective. PMID:23391684
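
    The coding hierarchy that CASES imposes on a transcript can be pictured as a small data model; the field names here are illustrative, not taken from the coding manual:

    ```python
    from dataclasses import dataclass, field
    from enum import Enum

    class Process(Enum):            # basic tasks of clinical care within a thread
        PRESENTATION = "presentation"
        INFORMATION = "information"
        RESOLUTION = "resolution"   # decision making
        ENGAGEMENT = "engagement"   # interpersonal exchange

    @dataclass
    class Utterance:
        speaker: str                # "patient" or "provider"
        speech_act: str             # e.g., "question", "directive"
        process: Process

    @dataclass
    class Thread:
        topic: str                  # the problem or issue addressed
        resolution: str             # nature of resolution, e.g., "no action"
        utterances: list = field(default_factory=list)
    ```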

  18. An Information System Development Method Combining Business Process Modeling with Executable Modeling and its Evaluation by Prototyping

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Hirabayashi, Syuichi; Suzuki, Ryo; Mitsui, Hiroyasu; Koizumi, Hisao

    Business in the enterprise is so closely tied to the information system that business activities are difficult to carry out without it. A system design technique that takes proper account of the business process and enables quick system development is therefore requested. In addition, demands on development cost are more severe than before. To cope with this situation, the modeling technology named BPM (Business Process Management/Modeling) is drawing attention and becoming important as a key technology. BPM is a technology to model business activities as business processes and visualize them to improve business efficiency. However, a general methodology to develop information systems using the analysis results of BPM does not exist, and few development cases are reported. This paper proposes an information system development method combining business process modeling with executable modeling. We describe a guideline to support consistency and efficiency of development, and a framework enabling the information system to be developed from the model. We have prototyped an information system with the proposed method, and our experience has shown that the methodology is valuable.

  19. PD-atricians: Leveraging Physicians and Participatory Design to Develop Novel Clinical Information Tools

    PubMed Central

    Pollack, Ari H; Miller, Andrew; Mishra, Sonali R.; Pratt, Wanda

    2016-01-01

    Participatory design, a method by which system users and stakeholders meaningfully contribute to the development of a new process or technology, has great potential to revolutionize healthcare technology, yet has seen limited adoption. We conducted a design session with eleven physicians working to create a novel clinical information tool utilizing participatory design methods. During the two-hour session, the physicians quickly engaged in the process and generated a large quantity of information, informing the design of a future tool. By utilizing facilitators experienced in design methodology, with detailed domain expertise, and well integrated into the healthcare organization, the participatory design session engaged a group of users who are often disenfranchised with existing processes as well as health information technology in general. We provide insight into why participatory design works with clinicians and provide guiding principles for how to implement these methods in healthcare organizations interested in advancing health information technology. PMID:28269900

  20. Method and Application for Dynamic Comprehensive Evaluation with Subjective and Objective Information

    PubMed Central

    Liu, Dinglin; Zhao, Xianglian

    2013-01-01

    In an effort to deal with more complicated evaluation situations, scientists have focused their efforts on dynamic comprehensive evaluation research. How to make full use of subjective and objective information has become a noteworthy issue. In this paper, a dynamic comprehensive evaluation method using both subjective and objective information is proposed. We use a combination weighting method to determine the index weights: the analytic hierarchy process (AHP) method is applied to handle the subjective information, and the criteria importance through intercriteria correlation (CRITIC) method is used to handle the objective information. For the time weight determination, we consider both time distance and information size to embody the principle of esteeming the present over the past. A linear weighted average model is then constructed to make the evaluation process more practicable. Finally, an example is presented to illustrate the effectiveness of this method. Overall, the results suggest that the proposed method is reasonable and effective.
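
    A numerical sketch of the aggregation pipeline described above. The AHP principal-eigenvector step, the CRITIC weighting and the recency-biased time weights follow their standard textbook forms; the blend factor, data and matrices are invented for illustration:

    ```python
    import numpy as np

    def ahp_weights(pairwise):
        """Subjective weights: principal eigenvector of an AHP pairwise matrix."""
        vals, vecs = np.linalg.eig(pairwise)
        w = np.abs(vecs[:, np.argmax(vals.real)].real)
        return w / w.sum()

    def critic_weights(X):
        """Objective weights: std. dev. times total conflict with other criteria."""
        std = X.std(axis=0)
        corr = np.corrcoef(X, rowvar=False)
        info = std * (1 - corr).sum(axis=0)
        return info / info.sum()

    rng = np.random.default_rng(1)
    X_t = [rng.random((3, 3)) for _ in range(4)]   # 3 alternatives x 3 criteria, T = 4
    pairwise = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])

    w = 0.5 * ahp_weights(pairwise) + 0.5 * critic_weights(X_t[-1])
    time_w = np.array([1.0, 2.0, 3.0, 4.0])        # "esteem the present over the past"
    time_w /= time_w.sum()

    # Linear weighted average over criteria, then over time
    scores = sum(tw * (X @ w) for tw, X in zip(time_w, X_t))
    print(scores)                                  # one dynamic score per alternative
    ```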

  1. Evaluation of the clinical process in a critical care information system using the Lean method: a case study

    PubMed Central

    2012-01-01

    Background There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows, by removing waste and shortening the delivery cycle time. There are a limited number of studies on Lean applications related to HIS. Therefore, we applied the Lean method to evaluate the clinical processes related to HIS, in order to evaluate its efficiency in removing waste and optimizing the process flow. This paper presents the evaluation findings of these clinical processes, with regards to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems that were identified during the study. Methods We conducted a case study under actual clinical settings, to investigate how the Lean method can be used to improve the clinical process. We used observations, interviews, and document analysis, to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and the A3 problem-solving tools. We used eVSM software to plot the Value Stream Map and A3 reports. Results We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. Conclusions The case study findings show that the Value Stream Mapping and the A3 reports can be used as tools to identify waste and integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy. PMID:23259846

  2. Modular error embedding

    DOEpatents

    Sandford, II, Maxwell T.; Handel, Theodore G.; Ettinger, J. Mark

    1999-01-01

    A method of embedding auxiliary information into the digital representation of host data containing noise in the low-order bits. The method applies to digital data representing analog signals, for example digital images. The method reduces the error introduced by other methods that replace the low-order bits with auxiliary information. By a substantially reverse process, the embedded auxiliary data can be retrieved easily by an authorized user through use of a digital key. The modular error embedding method includes a process to permute the order in which the host data values are processed. The method doubles the amount of auxiliary information that can be added to host data values, in comparison with bit-replacement methods for high bit-rate coding. The invention preserves human perception of the meaning and content of the host data, permitting the addition of auxiliary data in the amount of 50% or greater of the original host data.
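
    The arithmetic core of such modular embedding can be sketched as follows: rather than overwriting the k low-order bits, choose the value congruent to the payload modulo 2^k that lies nearest the host value. The key-driven permutation of processing order is omitted; this is a hedged reconstruction from the abstract, not the patented procedure itself:

    ```python
    def embed_modular(v: int, s: int, k: int, vmax: int = 255) -> int:
        """Embed k payload bits s into host value v so that result % 2**k == s,
        picking the in-range candidate nearest to v. Away from the range
        boundary the error is at most 2**(k-1), versus up to 2**k - 1 for
        plain low-order-bit replacement."""
        m = 1 << k
        base = v - (v % m) + s                     # congruent to s modulo 2**k
        cands = [c for c in (base - m, base, base + m) if 0 <= c <= vmax]
        return min(cands, key=lambda c: abs(c - v))

    def extract(v_stego: int, k: int) -> int:
        return v_stego % (1 << k)                  # the payload is just the residue

    v, payload, k = 200, 7, 3
    stego = embed_modular(v, payload, k)
    assert extract(stego, k) == payload
    print(v, "->", stego)                          # 200 -> 199, an error of only 1
    ```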

  3. Method for Examination and Documentation of Basic Information and Metadata from Published Reports Relevant to the Study of Stormwater Runoff Quality

    USGS Publications Warehouse

    Dionne, Shannon G.; Granato, Gregory E.; Tana, Cameron K.

    1999-01-01

    A readily accessible archive of information that is valid, current, and technically defensible is needed to make informed highway-planning, design, and management decisions. The National Highway Runoff Water-Quality Data and Methodology Synthesis (NDAMS) is a cataloging and assessment of the documentation of information relevant to highway-runoff water quality available in published reports. The report review process is based on the NDAMS review sheet, which was designed by the USGS with input from the FHWA, State transportation agencies, and the regulatory community. The report-review process is designed to determine the technical merit of the existing literature in terms of current requirements for data documentation, data quality, quality assurance and quality control (QA/QC), and technical issues that may affect the use of historical data. To facilitate the review process, the NDAMS review sheet is divided into 12 sections: (1) administrative review information, (2) investigation and report information, (3) temporal information, (4) location information, (5) water-quality-monitoring information, (6) sample-handling methods, (7) constituent information, (8) sampling focus and matrix, (9) flow monitoring methods, (10) field QA/QC, (11) laboratory, and (12) uncertainty/error analysis. This report describes the NDAMS report reviews and metadata documentation methods and provides an overview of the approach and of the quality-assurance and quality-control program used to implement the review process. Detailed information, including a glossary of relevant terms, a copy of the report-review sheets, and report-review instructions, is completely documented in a series of three appendixes included with this report. Therefore, the reviews are repeatable and the methods can be used by transportation research organizations to catalog new reports as they are published.

  4. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.
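
    Steps (3) through (5) are essentially chained table lookups, which can be pictured as below; the table entries are invented placeholders, not the method's actual mappings:

    ```python
    # Step 3: quality needs -> quality criteria (placeholder entries)
    needs_to_criteria = {"high reliability": ["fault tolerance", "testability"]}
    # Step 4: quality criterion -> accepted processes/products
    criteria_to_products = {"fault tolerance": ["FMEA report"],
                            "testability":     ["unit test plan"]}

    # Step 5: select the information products supporting the accepted products
    def select_products(needs):
        return sorted({p
                       for n in needs
                       for c in needs_to_criteria.get(n, [])
                       for p in criteria_to_products.get(c, [])})

    print(select_products(["high reliability"]))   # ['FMEA report', 'unit test plan']
    ```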

  5. Unconscious neural processing differs with method used to render stimuli invisible

    PubMed Central

    Fogelson, Sergey V.; Kohler, Peter J.; Miller, Kevin J.; Granger, Richard; Tse, Peter U.

    2014-01-01

    Visual stimuli can be kept from awareness using various methods. The extent of processing that a given stimulus receives in the absence of awareness is typically used to make claims about the role of consciousness more generally. The neural processing elicited by a stimulus, however, may also depend on the method used to keep it from awareness, and not only on whether the stimulus reaches awareness. Here we report that the method used to render an image invisible has a dramatic effect on how category information about the unseen stimulus is encoded across the human brain. We collected fMRI data while subjects viewed images of faces and tools that were rendered invisible using either continuous flash suppression (CFS) or chromatic flicker fusion (CFF). In a third condition, we presented the same images under normal fully visible viewing conditions. We found that category information about visible images could be extracted from patterns of fMRI responses throughout areas of neocortex known to be involved in face or tool processing. However, category information about stimuli kept from awareness using CFS could be recovered exclusively within occipital cortex, whereas information about stimuli kept from awareness using CFF was also decodable within temporal and frontal regions. We conclude that unconsciously presented objects are processed differently depending on how they are rendered subjectively invisible. Caution should therefore be used in making generalizations on the basis of any one method about the neural basis of consciousness or the extent of information processing without consciousness. PMID:24982647
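
    The decoding analysis described is standard multivoxel pattern analysis. A minimal sketch on synthetic data (the classifier choice, array shapes and injected signal are our assumptions, not the paper's pipeline):

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(0)
    n_trials, n_voxels = 80, 200
    X = rng.standard_normal((n_trials, n_voxels))   # fMRI response patterns
    y = np.repeat([0, 1], n_trials // 2)            # 0 = face, 1 = tool
    X[y == 1, :20] += 0.5                           # weak category signal

    # Cross-validated decoding accuracy within one region of interest;
    # repeating this per region shows where category information survives.
    acc = cross_val_score(LinearSVC(dual=False), X, y, cv=5).mean()
    print(f"decoding accuracy: {acc:.2f}")          # well above the 0.5 chance level
    ```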

  6. Unconscious neural processing differs with method used to render stimuli invisible.

    PubMed

    Fogelson, Sergey V; Kohler, Peter J; Miller, Kevin J; Granger, Richard; Tse, Peter U

    2014-01-01

    Visual stimuli can be kept from awareness using various methods. The extent of processing that a given stimulus receives in the absence of awareness is typically used to make claims about the role of consciousness more generally. The neural processing elicited by a stimulus, however, may also depend on the method used to keep it from awareness, and not only on whether the stimulus reaches awareness. Here we report that the method used to render an image invisible has a dramatic effect on how category information about the unseen stimulus is encoded across the human brain. We collected fMRI data while subjects viewed images of faces and tools that were rendered invisible using either continuous flash suppression (CFS) or chromatic flicker fusion (CFF). In a third condition, we presented the same images under normal fully visible viewing conditions. We found that category information about visible images could be extracted from patterns of fMRI responses throughout areas of neocortex known to be involved in face or tool processing. However, category information about stimuli kept from awareness using CFS could be recovered exclusively within occipital cortex, whereas information about stimuli kept from awareness using CFF was also decodable within temporal and frontal regions. We conclude that unconsciously presented objects are processed differently depending on how they are rendered subjectively invisible. Caution should therefore be used in making generalizations on the basis of any one method about the neural basis of consciousness or the extent of information processing without consciousness.

  7. Characterizing the Processes for Navigating Internet Health Information Using Real-Time Observations: A Mixed-Methods Approach.

    PubMed

    Perez, Susan L; Paterniti, Debora A; Wilson, Machelle; Bell, Robert A; Chan, Man Shan; Villareal, Chloe C; Nguyen, Hien Huy; Kravitz, Richard L

    2015-07-20

    Little is known about the processes people use to find health-related information on the Internet or the individual characteristics that shape selection of information-seeking approaches. Our aim was to describe the processes by which users navigate the Internet for information about a hypothetical acute illness and to identify individual characteristics predictive of their information-seeking strategies. Study participants were recruited from public settings and agencies. Interested individuals were screened for eligibility using an online questionnaire. Participants listened to one of two clinical scenarios—consistent with influenza or bacterial meningitis—and then conducted an Internet search. Screen-capture video software captured Internet search mouse clicks and keystrokes. Each step of the search was coded as hypothesis testing (etiology), evidence gathering (symptoms), or action/treatment seeking (behavior). The coded steps were used to form a step-by-step pattern of each participant's information-seeking process. A total of 78 Internet health information seekers ranging from 21-35 years of age and who experienced barriers to accessing health care services participated. We identified 27 unique patterns of information seeking, which were grouped into four overarching classifications based on the number of steps taken during the search, whether a pattern consisted of developing a hypothesis and exploring symptoms before ending the search or searching an action/treatment, and whether a pattern ended with action/treatment seeking. Applying dual-processing theory, we categorized the four overarching pattern classifications as either System 1 (41%, 32/78), unconscious, rapid, automatic, and high capacity processing; or System 2 (59%, 46/78), conscious, slow, and deliberative processing. Using multivariate regression, we found that System 2 processing was associated with higher education and younger age. We identified and classified two approaches to processing Internet health information. System 2 processing, a methodical approach, most resembles the strategies for information processing that have been found in other studies to be associated with higher-quality decisions. We conclude that the quality of Internet health-information seeking could be improved through consumer education on methodical Internet navigation strategies and the incorporation of decision aids into health information websites.

  8. Systems and Methods for Radar Data Communication

    NASA Technical Reports Server (NTRS)

    Bunch, Brian (Inventor); Szeto, Roland (Inventor); Miller, Brad (Inventor)

    2013-01-01

    A radar information processing system is operable to process high bandwidth radar information received from a radar system into low bandwidth radar information that may be communicated over a low bandwidth connection coupled to an electronic flight bag (EFB). An exemplary embodiment receives radar information from a radar system, the radar information communicated from the radar system at a first bandwidth; processes the received radar information into processed radar information, the processed radar information configured for communication over a connection operable at a second bandwidth, the second bandwidth lower than the first bandwidth; and communicates the processed radar information over the connection to the electronic flight bag.
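
    The bandwidth-reducing step might look schematically like the following toy example, which decimates and coarsely quantizes a reflectivity frame before it crosses the slow link to the EFB; the factors are illustrative, not taken from the patent:

    ```python
    import numpy as np

    def reduce_bandwidth(frame, decimate=4, levels=16):
        """Downsample a radar reflectivity frame and quantize it to 4-bit levels."""
        small = frame[::decimate, ::decimate]                 # spatial decimation
        lo, hi = float(small.min()), float(small.max())
        q = np.round((small - lo) / (hi - lo + 1e-9) * (levels - 1))
        return q.astype(np.uint8).tobytes()                   # compact payload

    frame = np.random.rand(256, 256).astype(np.float32)       # high-bandwidth input
    payload = reduce_bandwidth(frame)
    print(len(frame.tobytes()), "->", len(payload), "bytes")  # 262144 -> 4096
    ```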

  9. Mutual information estimation reveals global associations between stimuli and biological processes

    PubMed Central

    Suzuki, Taiji; Sugiyama, Masashi; Kanamori, Takafumi; Sese, Jun

    2009-01-01

    Background Although microarray gene expression analysis has become popular, it remains difficult to interpret the biological changes caused by stimuli or variation of conditions. Clustering of genes and associating each group with biological functions are often used methods. However, such methods only detect partial changes within cell processes. Herein, we propose a method for discovering global changes within a cell by associating observed conditions of gene expression with gene functions. Results To elucidate the association, we introduce a novel feature selection method called Least-Squares Mutual Information (LSMI), which computes mutual information without density estimation, and therefore can detect nonlinear associations within a cell. We demonstrate the effectiveness of LSMI through comparison with existing methods. The results of the application to yeast microarray datasets reveal that non-natural stimuli affect various biological processes, whereas others show no significant relation to specific cell processes. Furthermore, we discover that biological processes can be categorized into four types according to the responses to various stimuli: DNA/RNA metabolism, gene expression, protein metabolism, and protein localization. Conclusion We proposed a novel feature selection method called LSMI, and applied LSMI to mining the association between conditions of yeast and biological processes through microarray datasets. In fact, LSMI allows us to elucidate the global organization of cellular process control. PMID:19208155
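
    For reference, the quantity estimated and the density-ratio idea behind LSMI can be written as follows (standard definitions in our notation, not the paper's exact derivation):

    ```latex
    % Mutual information between variables x and y
    I(X;Y) \;=\; \iint p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}\,dx\,dy

    % LSMI avoids estimating p(x,y), p(x), p(y) separately: it fits the
    % density ratio r(x,y) = p(x,y) / (p(x) p(y)) directly by least squares
    \min_{\hat{r}}\; \tfrac{1}{2} \iint \bigl(\hat{r}(x,y) - r(x,y)\bigr)^{2}\, p(x)\,p(y)\,dx\,dy
    ```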

  10. A consensus reaching model for 2-tuple linguistic multiple attribute group decision making with incomplete weight information

    NASA Astrophysics Data System (ADS)

    Zhang, Wancheng; Xu, Yejun; Wang, Huimin

    2016-01-01

    The aim of this paper is to put forward a consensus reaching method for multi-attribute group decision-making (MAGDM) problems with linguistic information, in which the weight information of experts and attributes is unknown. First, some basic concepts and operational laws of 2-tuple linguistic labels are introduced. Then, a grey relational analysis method and a maximising deviation method are proposed to calculate the incomplete weight information of experts and attributes, respectively. To eliminate conflict in the group, a weight-updating model is employed to derive the weights of experts based on their contribution to the consensus reaching process. After conflict elimination, the final group preference can be obtained, which gives the ranking of the alternatives. The model can effectively avoid the information distortion that occurs regularly in linguistic information processing. Finally, an illustrative example is given to illustrate the application of the proposed method, and comparative analyses with existing methods are offered to show the advantages of the proposed method.
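
    For the objective side, the maximising deviation method weights an attribute by how strongly the alternatives disagree on it; in the usual formulation (our notation):

    ```latex
    % r_{ij}: normalised rating of alternative i on attribute j
    % d(.,.): a deviation measure, e.g. the distance between 2-tuple linguistic values
    w_{j} \;=\; \frac{\sum_{i}\sum_{k} d\bigl(r_{ij},\, r_{kj}\bigr)}
                     {\sum_{j}\sum_{i}\sum_{k} d\bigl(r_{ij},\, r_{kj}\bigr)}
    ```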

  11. Radiology information system: a workflow-based approach.

    PubMed

    Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; van der Aalst, W M P

    2009-09-01

    Introducing workflow management technology into healthcare appears promising for dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study the method of developing a workflow-based information system in the radiology department as a use case. First, a workflow model of the typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. The legacy systems could be taken as special components, which also corresponded to tasks and were integrated by transferring non-workflow-aware interfaces to the standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes, and enhance process management in the department. It can also provide a more workflow-aware integration method, compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing radiology information systems with more flexibility, more functionalities of process management and more workflow-aware integration. The work of this paper is an initial endeavor for introducing workflow management technology into healthcare.

  12. Investigating the impact of spatial priors on the performance of model-based IVUS elastography

    PubMed Central

    Richards, M S; Doyley, M M

    2012-01-01

    This paper describes methods that provide prerequisite information for computing circumferential stress in modulus elastograms recovered from vascular tissue—information that could help cardiologists detect life-threatening plaques and predict their propensity to rupture. The modulus recovery process is an ill-posed problem; therefore, additional information is needed to produce useful elastograms. In this work, prior geometrical information was used to impose hard or soft constraints on the reconstruction process. We conducted simulation and phantom studies to evaluate and compare modulus elastograms computed with soft and hard constraints versus those computed without any prior information. The results revealed that (1) the contrast-to-noise ratio of modulus elastograms achieved using the soft prior and hard prior reconstruction methods exceeded that of elastograms computed without any prior information; (2) the soft prior and hard prior reconstruction methods could tolerate up to 8% measurement noise; and (3) the performance of soft and hard prior modulus elastograms degraded when incomplete spatial priors were employed. This work demonstrates that including spatial priors in the reconstruction process should improve the performance of model-based elastography, and the soft prior approach should enhance the robustness of the reconstruction process to errors in the geometrical information. PMID:22037648
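
    One common way to formalize the two kinds of priors (our notation; the paper's exact formulation may differ) is that a soft prior penalizes modulus variation within the segmented geometry, while a hard prior builds the geometry into the parametrization:

    ```latex
    % Soft prior: penalise modulus variation inside each segmented region R_k
    \hat{E} \;=\; \arg\min_{E}\; \bigl\| u_{\mathrm{meas}} - u(E) \bigr\|^{2}
              \;+\; \alpha \sum_{k}\ \sum_{(i,j) \in R_k} \bigl(E_i - E_j\bigr)^{2}

    % Hard prior: one unknown modulus per segmented region, enforced exactly
    E(\mathbf{x}) = E_k \quad \text{for } \mathbf{x} \in R_k
    ```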

  13. A Novel College Network Resource Management Method using Cloud Computing

    NASA Astrophysics Data System (ADS)

    Lin, Chen

    At present, the information construction of colleges mainly comprises the construction of college networks and management information systems, and many problems arise during the informatization process. Cloud computing is a development of distributed processing, parallel processing and grid computing; it stores data in the cloud, places software and services in the cloud, and builds on top of various standards and protocols, so that resources can be accessed through all kinds of equipment. This article introduces cloud computing and its functions, then analyzes the existing problems of college network resource management; cloud computing technology and methods are then applied in the construction of a college information sharing platform.

  14. An Information System Development Method Connecting Business Process Modeling and its Experimental Evaluation

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Kojima, Yoshiyuki; Hirabayashi, Syuichi; Koizumi, Hisao

    Business process modeling (BPM) is gaining attention as a means of analyzing and improving the business process. BPM analyses the current business process as an AS-IS model and solves problems to improve the current business; moreover, it aims to create a business process that produces value, as a TO-BE model. However, research on techniques that seamlessly connect the business process improvements acquired by BPM to the implementation of the information system is rarely reported. If the business model obtained by BPM is converted into UML and the implementation is carried out with UML techniques, we can expect an improvement in the efficiency of information system implementation. In this paper, we describe a method of system development which converts the process model obtained by BPM into UML, and the method is evaluated by modeling a prototype of a parts procurement system. In the evaluation, a comparison is performed with the case where the system is implemented by the conventional UML technique without going via BPM.

  15. Problems and processes in medical encounters: the CASES method of dialogue analysis.

    PubMed

    Laws, M Barton; Taubin, Tatiana; Bezreh, Tanya; Lee, Yoojin; Beach, Mary Catherine; Wilson, Ira B

    2013-05-01

    To develop methods to reliably capture structural and dynamic temporal features of clinical interactions. Observational study of 50 audio-recorded routine outpatient visits to HIV specialty clinics, using innovative analytic methods. The comprehensive analysis of the structure of encounters system (CASES) uses transcripts coded for speech acts, then imposes larger-scale structural elements: threads--the problems or issues addressed; and processes within threads--basic tasks of clinical care labeled presentation, information, resolution (decision making) and engagement (interpersonal exchange). Threads are also coded for the nature of resolution. 61% of utterances are in presentation processes. Provider verbal dominance is greatest in information and resolution processes, which also contain a high proportion of provider directives. About half of threads result in no action or decision. Information flows predominantly from patient to provider in presentation processes, and from provider to patient in information processes. Engagement is rare. In these data, resolution is provider centered; more time for patient participation in resolution, or interpersonal engagement, would have to come from presentation. Awareness of the use of time in clinical encounters, and the interaction processes associated with various tasks, may help make clinical communication more efficient and effective. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  16. A Multilevel Comprehensive Assessment of International Accreditation for Business Programmes-Based on AMBA Accreditation of GDUFS

    ERIC Educational Resources Information Center

    Jiang, Yong

    2017-01-01

    Traditional mathematical methods built around exactitude have limitations when applied to the processing of educational information, due to the uncertainty and imperfection of such information. Alternative mathematical methods, such as grey system theory, have been widely applied in processing incomplete information systems and have proven effective in a number of…

  17. Foreign Language Methods and an Information Processing Model of Memory.

    ERIC Educational Resources Information Center

    Willebrand, Julia

    The major approaches to language teaching (audiolingual method, generative grammar, Community Language Learning and Silent Way) are investigated to discover whether or not they are compatible in structure with an information-processing model of memory (IPM). The model of memory used was described by Roberta Klatzky in "Human Memory:…

  18. Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series

    PubMed Central

    Vicente, Raul; Díaz-Pernas, Francisco J.; Wibral, Michael

    2014-01-01

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. The measure of information transfer in particular, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of processes to allow pooling of observations over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble of realizations is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the computationally most heavy aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data. While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and artificial systems. PMID:25068489
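
    For reference, the estimated quantity is the standard transfer entropy (our notation, with history embedding lengths k and l); the ensemble method's contribution is to estimate the underlying densities by pooling over trials rather than over time:

    ```latex
    % Transfer entropy from X to Y with history embeddings y_t^{(k)}, x_t^{(l)};
    % the ensemble method estimates the densities from many trials (realizations).
    TE_{X \to Y} \;=\; \sum p\bigl(y_{t+1}, \mathbf{y}_t^{(k)}, \mathbf{x}_t^{(l)}\bigr)\,
        \log \frac{p\bigl(y_{t+1} \mid \mathbf{y}_t^{(k)}, \mathbf{x}_t^{(l)}\bigr)}
                  {p\bigl(y_{t+1} \mid \mathbf{y}_t^{(k)}\bigr)}
    ```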

  19. From user needs to system specifications: multi-disciplinary thematic seminars as a collaborative design method for development of health information systems.

    PubMed

    Scandurra, I; Hägglund, M; Koch, S

    2008-08-01

    This paper presents a new multi-disciplinary method for user needs analysis and requirements specification in the context of health information systems based on established theories from the fields of participatory design and computer supported cooperative work (CSCW). Whereas conventional methods imply a separate, sequential needs analysis for each profession, the "multi-disciplinary thematic seminar" (MdTS) method uses a collaborative design process. Application of the method in elderly homecare resulted in prototypes that were well adapted to the intended user groups. Vital information in the points of intersection between different care professions was elicited and a holistic view of the entire care process was obtained. Health informatics-usability specialists and clinical domain experts are necessary to apply the method. Although user needs acquisition can be time-consuming, MdTS was perceived to efficiently identify in-context user needs, and transformed these directly into requirements specifications. Consequently the method was perceived to expedite the entire ICT implementation process.

  1. A Time-Space Domain Information Fusion Method for Specific Emitter Identification Based on Dempster-Shafer Evidence Theory.

    PubMed

    Jiang, Wen; Cao, Ying; Yang, Lin; He, Zichang

    2017-08-28

    Specific emitter identification plays an important role in contemporary military affairs. However, most existing specific emitter identification methods have not taken into account the processing of uncertain information. Therefore, this paper proposes a time-space domain information fusion method based on Dempster-Shafer evidence theory, which is able to deal with uncertain information in the process of specific emitter identification. In this paper, each radar generates a group of evidence based on the information it obtains, and our main task is to fuse the multiple groups of evidence to get a reasonable result. Within the framework of a recursive centralized fusion model, the proposed method incorporates a correlation coefficient, which measures the relevance between bodies of evidence, and a quantum mechanical approach, which is based on the parameters of the radar itself. The simulation results of an illustrative example demonstrate that the proposed method can effectively deal with uncertain information and reach a reasonable recognition result.
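
    The backbone of such evidence fusion is Dempster's rule of combination. A minimal sketch over a two-emitter frame of discernment follows; the proposed method's correlation-coefficient and quantum-mechanical weightings are omitted:

    ```python
    from itertools import product

    def dempster_combine(m1: dict, m2: dict) -> dict:
        """Combine two mass functions whose focal elements are frozensets."""
        combined, conflict = {}, 0.0
        for (a, x), (b, y) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + x * y
            else:
                conflict += x * y                  # mass assigned to the empty set
        return {a: x / (1.0 - conflict) for a, x in combined.items()}

    # Two radars' evidence about emitter types {A, B}
    A, B, AB = frozenset("A"), frozenset("B"), frozenset("AB")
    m1 = {A: 0.6, B: 0.1, AB: 0.3}
    m2 = {A: 0.5, B: 0.2, AB: 0.3}
    print(dempster_combine(m1, m2))    # mass concentrates on emitter A
    ```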

  2. Evaluation of the clinical process in a critical care information system using the Lean method: a case study.

    PubMed

    Yusof, Maryati Mohd; Khodambashi, Soudabeh; Mokhtar, Ariffin Marzuki

    2012-12-21

    There are numerous applications for Health Information Systems (HIS) that support specific tasks in the clinical workflow. The Lean method has been used increasingly to optimize clinical workflows by removing waste and shortening the delivery cycle time, but there are only a limited number of studies on Lean applications related to HIS. We therefore applied the Lean method to evaluate the clinical processes related to HIS, in order to assess its efficiency in removing waste and optimizing process flow. This paper presents the evaluation findings for these clinical processes with regard to a critical care information system (CCIS), known as IntelliVue Clinical Information Portfolio (ICIP), and recommends solutions to the problems identified during the study. We conducted a case study in actual clinical settings to investigate how the Lean method can be used to improve the clinical process, using observations, interviews, and document analysis to achieve our stated goal. We also applied two tools from the Lean methodology, namely the Value Stream Mapping and A3 problem-solving tools, and used eVSM software to plot the Value Stream Maps and A3 reports. We identified a number of problems related to inefficiency and waste in the clinical process, and proposed an improved process model. The case study findings show that Value Stream Mapping and A3 reports can be used as tools to identify waste and to integrate the process steps more efficiently. We also proposed a standardized and improved clinical process model and suggested an integrated information system that combines database and software applications to reduce waste and data redundancy.

  3. [Development method of healthcare information system integration based on business collaboration model].

    PubMed

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is the key to hospital information construction, due to the complexity of the healthcare environment. Currently, during healthcare information system integration, the people participating in an integration project usually communicate via free-format documents, which impairs the efficiency and adaptability of integration. This paper proposes a method that uses Business Process Model and Notation (BPMN) to model integration requirements and automatically transforms them into an executable integration configuration. Based on the method, a tool was developed to model integration requirements and transform them into integration configurations. In addition, an integration case in a radiology scenario was used to verify the method.

  4. Methods for Improving Information from ’Undesigned’ Human Factors Experiments.

    DTIC Science & Technology

    Human factors engineering, Information processing, Regression analysis, Experimental design, Least squares method, Analysis of variance, Correlation techniques, Matrices (Mathematics), Multiple disciplines, Mathematical prediction

  5. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
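
    One way to read the claim is as a dispatch architecture: working memory holds ontology-typed individuals in a semantic graph, and each reasoning module processes only the abstractions of its own classification type. Below is a deliberately simplified Python sketch with hypothetical class and instance names; the patented system's semantic graph and ontology machinery are far richer.

    ```python
    class Abstraction:
        """An individual in the semantic-graph working memory, typed by
        an ontology classification."""
        def __init__(self, name, classification):
            self.name = name
            self.classification = classification

    class ReasoningModule:
        """Processes only abstractions of one ontology classification type."""
        def __init__(self, classification, rule):
            self.classification = classification
            self.rule = rule  # what this module does with a matching node

        def process(self, graph):
            return [self.rule(a) for a in graph
                    if a.classification == self.classification]

    # Working memory: a (flattened) semantic graph of typed individuals
    graph = [Abstraction("login_fail_23", "SecurityEvent"),
             Abstraction("alice", "Person"),
             Abstraction("usb_copy_7", "SecurityEvent")]

    # Two modules bound to two different classification types
    modules = [ReasoningModule("SecurityEvent", lambda a: f"score({a.name})"),
               ReasoningModule("Person", lambda a: f"profile({a.name})")]
    for m in modules:
        print(m.process(graph))
    ```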

  6. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  7. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  8. IDEF3 Formalization Report

    DTIC Science & Technology

    1991-10-01

    SUBJECT TERMS: engineering management, information systems, method formalization, information engineering, process modeling, information systems requirements definition methods, knowledge acquisition methods, systems engineering

  9. Image processing of metal surface with structured light

    NASA Astrophysics Data System (ADS)

    Luo, Cong; Feng, Chang; Wang, Congzheng

    2014-09-01

    In a structured-light vision measurement system, the ideal image of the light stripe contains, apart from a black background, only the gray-level information at the position of the stripe. The actual image, however, contains image noise, a complex background, and other content that does not belong to the stripe and interferes with the useful information. To extract the center of the light stripe on a metal surface accurately, a new processing method is presented. Adaptive median filtering first removes noise preliminarily, and the noise introduced by the CCD camera and the measurement environment is further removed with a difference-image method. To highlight fine details and enhance the blurred regions between the stripe and the background, a sharpening algorithm is used that combines the best features of the Laplacian and Sobel operators. Morphological opening and closing operations are then used to compensate for the loss of information. Experimental results show that this method is effective: it both suppresses noise and heightens contrast, which benefits the subsequent processing.
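
    The described pipeline maps onto standard operations. Here is a rough sketch using scipy.ndimage, assuming a captured stripe image plus a background frame for the difference step; a plain median filter stands in for the adaptive one, and the sharpening weights are arbitrary assumptions, not the paper's values.

    ```python
    import numpy as np
    from scipy import ndimage

    def stripe_preprocess(img, background):
        """Median filter -> difference image -> Laplacian/Sobel sharpening
        -> morphological open/close, as a stand-in for the paper's pipeline."""
        den = ndimage.median_filter(img.astype(float), size=3)   # denoise
        diff = np.clip(den - background.astype(float), 0, None)  # difference image

        lap = ndimage.laplace(diff)                              # fine details
        sob = np.hypot(ndimage.sobel(diff, 0), ndimage.sobel(diff, 1))  # edges
        sharp = diff - 0.5 * lap + 0.2 * sob                     # assumed weights

        mask = sharp > sharp.mean() + sharp.std()                # crude stripe mask
        mask = ndimage.binary_opening(mask, iterations=1)        # remove specks
        mask = ndimage.binary_closing(mask, iterations=2)        # bridge small gaps
        return mask

    # usage: mask = stripe_preprocess(frame_with_laser, frame_without_laser)
    ```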

  10. New Model of Information Technology Governance in the Government of Gorontalo City using Framework COBIT 4.1

    NASA Astrophysics Data System (ADS)

    Bouty, A. A.; Koniyo, M. H.; Novian, D.

    2018-02-01

    This study aims to determine the maturity level of information technology governance in the Gorontalo city government by applying the COBIT 4.1 framework. The research method is a case study, conducting surveys and data collection at 25 institutions in Gorontalo City. The result of this study is an analysis of information technology needs based on the measurement of maturity level. The measurement of the maturity level of information technology governance shows that many business processes still run at a low level: of the 9 business processes examined, 4 are at level 2 (repeatable but intuitive) and 3 are at level 1 (initial/ad hoc). Given these results, the government of Gorontalo city is expected to improve its information technology governance immediately so that it can run more effectively and efficiently.

  11. The role of production process and information on quality expectations and perceptions of sparkling wines.

    PubMed

    Vecchio, Riccardo; Lisanti, Maria Tiziana; Caracciolo, Francesco; Cembalo, Luigi; Gambuti, Angelita; Moio, Luigi; Siani, Tiziana; Marotta, Giuseppe; Nazzaro, Concetta; Piombino, Paola

    2018-05-28

    The present research aims to analyse, by combining sensory and experimental economics techniques, the extent to which the production process, and the information about it, may affect consumer preferences. Sparkling wines produced by the Champenoise and Charmat methods were the object of the study. A quantitative descriptive sensory analysis with a trained panel, and non-hypothetical auctions combined with hedonic ratings involving young wine consumers (N=100), under different information scenarios (Blind, Info and Info Taste), were performed. Findings show that the production process affects both the sensory profile of sparkling wines and consumer expectations. In particular, the hedonic ratings revealed that when tasting the products, both with no information on the production process (Blind) and with such information (Info Taste), the consumers preferred the Charmat wines. On the contrary, when detailed information on the production methods was given without tasting (Info), consumers preferred the two Champenoise wines. It can be concluded that both sensory and non-sensory attributes of sparkling wines affect consumers' preferences. Specifically, the study suggests that production process information strongly impacts liking expectations, while not affecting informed liking. This article is protected by copyright. All rights reserved.

  12. IDEF3 formalization report

    NASA Technical Reports Server (NTRS)

    Menzel, Christopher; Mayer, Richard J.; Edwards, Douglas D.

    1991-01-01

    The Process Description Capture Method (IDEF3) is one of several Integrated Computer-Aided Manufacturing (ICAM) DEFinition methods developed by the Air Force to support systems engineering activities, and in particular, to support information systems development. These methods have evolved as a distillation of 'good practice' experience by information system developers and are designed to raise the performance level of the novice practitioner to one comparable with that of an expert. IDEF3 is meant to serve as a knowledge acquisition and requirements definition tool that structures the user's understanding of how a given process, event, or system works around process descriptions. A special purpose graphical language accompanying the method serves to highlight temporal precedence and causality relationships relative to the process or event being described.

  13. A Review of Qualitative Data Gathering Methods and Their Applications To Support Organizational Strategic Planning Processes. Study Number Six.

    ERIC Educational Resources Information Center

    Wright, Phillip C.; Geroy, Gary D.

    Exploring existing methodologies to determine whether they can be adapted or adopted to support strategic goal setting, this paper focuses on information gathering techniques as they relate to the human resource development professional's input into strategic planning processes. The information gathering techniques are all qualitative methods and…

  14. Combination of uncertainty theories and decision-aiding methods for natural risk management in a context of imperfect information

    NASA Astrophysics Data System (ADS)

    Tacnet, Jean-Marc; Dupouy, Guillaume; Carladous, Simon; Dezert, Jean; Batton-Hubert, Mireille

    2017-04-01

    In mountain areas, natural phenomena such as snow avalanches, debris flows and rockfalls put people and property at risk, sometimes with dramatic consequences. Risk is classically considered a combination of hazard (the intensity and frequency of the phenomenon) and vulnerability (the consequences of the phenomenon for exposed people and material assets). Risk management consists in identifying the risk level and choosing the best strategies for risk prevention, i.e. mitigation. In the context of natural phenomena in mountainous areas, technical and scientific knowledge is often lacking, and risk management decisions are therefore based on imperfect information. This information comes from more or less reliable sources, ranging from historical data and expert assessments to numerical simulations. Risk management decisions are thus the result of complex knowledge management and reasoning processes, and tracing the information and propagating information quality from data acquisition to decisions are important steps in the decision-making process. One major goal today is therefore to assist decision-making while considering the availability, quality and reliability of information content and sources. A global integrated framework is proposed to improve the risk management process in a context of information imperfection provided by more or less reliable sources: uncertainty as well as imprecision, inconsistency and incompleteness are considered. Several methods are used and associated in an original way: sequential decision context description, development of specific multi-criteria decision-making methods, imperfection propagation in numerical modeling, and information fusion. This framework not only assists decision-making but also traces the process and evaluates the impact of information quality on decision-making. We focus on and present two main developments. The first relates to uncertainty and imprecision propagation in numerical modeling, using both a classical Monte-Carlo probabilistic approach and a so-called hybrid approach using possibility theory. The second deals with new multi-criteria decision-making methods that consider information imperfection, source reliability, importance and conflict, using fuzzy sets as well as possibility and belief function theories. The implemented methods consider imperfection propagation and information fusion in total aggregation methods such as AHP (Saaty, 1980), in partial aggregation methods such as the Electre outranking method (see Soft Electre Tri), and in decisions under certainty, risk or uncertainty (see the new COWA-ER and FOWA-ER, Cautious and Fuzzy Ordered Weighted Averaging with Evidential Reasoning). For example, the ER-MCDA methodology treats expert assessment as a multi-criteria decision process based on imperfect information provided by more or less heterogeneous, reliable and conflicting sources: it mixes AHP, fuzzy set theory, possibility theory and belief function theory within the DSmT (Dezert-Smarandache Theory) framework, which provides powerful fusion rules.
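
    The first development, propagating imprecision through a numerical model, can be illustrated with a toy example contrasting the Monte-Carlo approach with a possibilistic alpha-cut approach. Everything below (the runout model, the input distributions and fuzzy numbers) is a made-up stand-in for the paper's physical models.

    ```python
    import itertools
    import numpy as np

    def propagate_monte_carlo(f, dists, n=100_000, rng=None):
        """Probabilistic propagation: sample each imprecise input, push the
        samples through the model f, summarise the output distribution."""
        rng = rng or np.random.default_rng(0)
        out = f(*[d(rng, n) for d in dists])
        return np.percentile(out, [5, 50, 95])

    def propagate_alpha_cuts(f, triangles, alphas=np.linspace(0, 1, 11)):
        """Possibilistic propagation: each input is a triangular fuzzy number
        (lo, mode, hi); interval arithmetic on every alpha-cut by corner
        enumeration, which is valid for a monotone model f."""
        cuts = []
        for a in alphas:
            bounds = [(lo + a * (m - lo), hi - a * (hi - m))
                      for lo, m, hi in triangles]
            corners = [f(*p) for p in itertools.product(*bounds)]
            cuts.append((a, min(corners), max(corners)))
        return cuts

    # Toy runout model: distance = volume * mobility (hypothetical)
    f = lambda v, m: v * m
    print(propagate_monte_carlo(f, [lambda r, n: r.uniform(1e3, 5e3, n),
                                    lambda r, n: r.normal(0.4, 0.05, n)]))
    print(propagate_alpha_cuts(f, [(1e3, 3e3, 5e3), (0.3, 0.4, 0.5)])[0])
    ```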

  15. Performance Measurement of Location Enabled e-Government Processes: A Use Case on Traffic Safety Monitoring

    NASA Astrophysics Data System (ADS)

    Vandenbroucke, D.; Vancauwenberghe, G.

    2016-12-01

    The European Union Location Framework (EULF), part of the Interoperable Solutions for European Public Administrations (ISA) Programme of the EU (EC DG DIGIT), aims to enhance the interactions between governments, businesses and citizens by embedding location information into e-Government processes. The challenge remains to find scientifically sound and at the same time practicable approaches to estimate or measure the impact of location enablement on the performance of these processes. A method has been defined to estimate process performance in terms of variables describing the efficiency and effectiveness of the work processes, as well as the quality of their output. A series of use cases has been identified, corresponding to existing e-Government work processes in which location information could bring added value. In a first step, the processes are described by means of BPMN (Business Process Model and Notation) to better understand the process steps, the actors involved, the spatial data flows, and the required input and generated output. In a second step, the processes are assessed in terms of the (sub-optimal) use of location information and the potential to enhance the process by better integrating location information and services. Process performance is measured ex ante (before using location-enabled e-Government services) and ex post (after the integration of such services) in order to estimate and measure the impact of location information. The paper describes the method for performance measurement and highlights how the method is applied to one use case, the process of traffic safety monitoring. The use case is analysed and assessed in terms of location enablement and its potential impact on process performance. Applying the methodology to the use case revealed that performance is highly affected by factors such as the way location information is collected, managed and shared throughout the process, and the degree to which spatial data are harmonized. The work also led to recommendations to enrich the BPMN standard with additional methods for annotating processes, and to a proposal to develop tools for automated process performance measurement. In that context, some planned future work is highlighted as well.

  16. An Adaptive Altitude Information Fusion Method for Autonomous Landing Processes of Small Unmanned Aerial Rotorcraft

    PubMed Central

    Lei, Xusheng; Li, Jingjing

    2012-01-01

    This paper presents an adaptive information fusion method to improve the accuracy and reliability of altitude measurement information for small unmanned aerial rotorcraft during the landing process. To address the low measurement performance of the sensors mounted on small unmanned aerial rotorcraft, a wavelet filter is applied as a pre-filter to attenuate the high-frequency noise in the sensor output. Furthermore, to improve the altitude information, an adaptive extended Kalman filter based on a maximum a posteriori criterion is proposed to estimate the measurement noise covariance matrix in real time. Finally, the effectiveness of the proposed method is proved by static tests, hovering flight and autonomous landing flight tests. PMID:23201993
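
    The adaptation step can be sketched with an innovation-based estimate of the measurement-noise covariance, a common stand-in for the maximum a posteriori estimator the paper derives. All models below are hypothetical: a constant-velocity altitude state with a single scalar altitude measurement.

    ```python
    import numpy as np

    def adaptive_kf_step(x, P, z, F, H, Q, R, residuals, window=20):
        """One linear Kalman step that re-estimates the measurement noise
        covariance R from a sliding window of innovations."""
        x, P = F @ x, F @ P @ F.T + Q              # predict
        v = z - H @ x                              # innovation
        residuals.append(np.outer(v, v))
        if len(residuals) > window:
            residuals.pop(0)
        C = sum(residuals) / len(residuals)        # innovation covariance
        R = np.maximum(C - H @ P @ H.T, 1e-6)      # adapted R, kept positive
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
        x = x + K @ v
        P = (np.eye(len(x)) - K @ H) @ P           # update
        return x, P, R

    # constant-velocity altitude model, scalar (e.g. sonar) measurement
    dt = 0.02
    F = np.array([[1.0, dt], [0.0, 1.0]])
    H = np.array([[1.0, 0.0]])
    Q, R = 1e-4 * np.eye(2), np.array([[0.5]])
    x, P, residuals = np.zeros(2), np.eye(2), []
    for z in [10.2, 10.1, 9.8, 9.9, 9.5]:          # fake altitude readings
        x, P, R = adaptive_kf_step(x, P, np.array([z]), F, H, Q, R, residuals)
    print(x, R)
    ```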

  17. A method of demand-driven and data-centric Web service configuration for flexible business process implementation

    NASA Astrophysics Data System (ADS)

    Xu, Boyi; Xu, Li Da; Fei, Xiang; Jiang, Lihong; Cai, Hongming; Wang, Shuai

    2017-08-01

    Facing rapidly changing business environments, implementing flexible business processes is crucial but difficult, especially in data-intensive application areas. This study aims to provide scalable and easily accessible information resources to leverage business process management. In this article, with a resource-oriented approach, enterprise data resources are represented as data-centric Web services, grouped on demand according to business requirements, and configured dynamically to adapt to changing business processes. First, a configurable architecture, CIRPA, involving an information resource pool is proposed to act as a scalable and dynamic platform that virtualises enterprise information resources as data-centric Web services. By exposing data-centric resources as REST services at larger granularities, tenant-isolated information resources can be accessed during business process execution. Second, a dynamic information resource pool is designed to fulfil configurable, on-demand data access in business process execution. CIRPA also isolates transaction data from the business process while supporting the composition of diverse business processes. Finally, a case study applying the method in a logistics application shows that CIRPA provides enhanced performance in both static service encapsulation and dynamic service execution in a cloud computing environment.

  18. Method for Evaluating Information to Solve Problems of Control, Monitoring and Diagnostics

    NASA Astrophysics Data System (ADS)

    Vasil'ev, V. A.; Dobrynina, N. V.

    2017-06-01

    The article describes a method for evaluating information to solve problems of control, monitoring and diagnostics. The method is needed for reducing the dimensionality of informational indicators of situations, bringing them to relative units, calculating generalized information indicators on their basis, ranking them by characteristic levels, and calculating the efficiency criterion of a system functioning in real time. On this basis, the design of an information evaluation system has been developed that allows analyzing, processing and assessing information about an object, where the object can be a complex technical, economic or social system. The method, and the system based on it, can find wide application in the analysis, processing and evaluation of information on the functioning of systems, regardless of their purpose, goals, tasks and complexity. For example, they can be used to assess the innovation capacities of industrial enterprises and management decisions.

  19. How Qualitative Methods Can be Used to Inform Model Development.

    PubMed

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  20. Inference of Gene Regulatory Networks Using Bayesian Nonparametric Regression and Topology Information.

    PubMed

    Fan, Yue; Wang, Xiao; Peng, Qinke

    2017-01-01

    Gene regulatory networks (GRNs) play an important role in cellular systems and are important for understanding biological processes. Many algorithms have been developed to infer the GRNs. However, most algorithms only pay attention to the gene expression data but do not consider the topology information in their inference process, while incorporating this information can partially compensate for the lack of reliable expression data. Here we develop a Bayesian group lasso with spike and slab priors to perform gene selection and estimation for nonparametric models. B-spline basis functions are used to capture the nonlinear relationships flexibly and penalties are used to avoid overfitting. Further, we incorporate the topology information into the Bayesian method as a prior. We present the application of our method on DREAM3 and DREAM4 datasets and two real biological datasets. The results show that our method performs better than existing methods and the topology information prior can improve the result.

  1. Markov Processes in Image Processing

    NASA Astrophysics Data System (ADS)

    Petrov, E. P.; Kharina, N. L.

    2018-05-01

    Digital images are used as information carriers in different sciences and technologies, and there is a drive to increase the number of bits per image pixel for the purpose of obtaining more information. In the paper, methods of compression and contour detection on the basis of a two-dimensional Markov chain are offered. Increasing the number of bits per pixel allows fine object details to be resolved more precisely, but it significantly complicates image processing. The offered methods do not concede efficiency to well-known analogues, and surpass them in processing speed: an image is separated into binary images that are processed in parallel, so processing speed does not degrade as the number of bits per pixel grows. One more advantage of the methods is low consumption of energy resources, since only logical procedures are used and there are no computing operations. The methods can be useful for processing images of any class and purpose in processing systems with limited time and energy resources.
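
    The parallel-binary idea rests on bit-plane decomposition: an n-bit image is split into n binary images that can be processed independently. A minimal illustration follows; the Markov-chain compression and contour detection themselves are not reproduced here.

    ```python
    import numpy as np

    def bit_planes(img):
        """Split an n-bit grayscale image into binary bit-plane images
        that can be processed independently (and hence in parallel)."""
        n_bits = int(img.max()).bit_length()
        return [(img >> b) & 1 for b in range(n_bits)]

    def reassemble(planes):
        """Invert the decomposition by weighting each plane with 2**b."""
        return sum(p << b for b, p in enumerate(planes))

    img = np.random.default_rng(1).integers(0, 256, (4, 4), dtype=np.uint16)
    planes = bit_planes(img)      # 8 binary images for an 8-bit input
    assert np.array_equal(reassemble(planes), img)
    ```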

  2. Evaluation of Patient Handoff Methods on an Inpatient Teaching Service

    PubMed Central

    Craig, Steven R.; Smith, Hayden L.; Downen, A. Matthew; Yost, W. John

    2012-01-01

    Background The patient handoff process can be a highly variable and unstructured period at risk for communication errors. The morning sign-in process used by resident physicians at teaching hospitals typically involves less rigorous handoff protocols than the resident evening sign-out process. Little research has been conducted on best practices for handoffs during morning sign-in exchanges between resident physicians. Research must evaluate optimal protocols for the resident morning sign-in process. Methods Three morning handoff protocols consisting of written, electronic, and face-to-face methods were implemented over 3 study phases during an academic year. Study participants included all interns covering the internal medicine inpatient teaching service at a tertiary hospital. Study measures entailed intern survey-based interviews analyzed for failures in handoff protocols with or without missed pertinent information. Descriptive and comparative analyses examined study phase differences. Results A scheduled face-to-face handoff process had the fewest protocol deviations and demonstrated best communication of essential patient care information between cross-covering teams compared to written and electronic sign-in protocols. Conclusion Intern patient handoffs were more reliable when the sign-in protocol included scheduled face-to-face meetings. This method provided the best communication of patient care information and allowed for open exchanges of information. PMID:23267259

  3. Surveillance of industrial processes with correlated parameters

    DOEpatents

    White, Andrew M.; Gross, Kenny C.; Kubic, William L.; Wigeland, Roald A.

    1996-01-01

    A system and method for surveillance of an industrial process. The system and method include a plurality of sensors monitoring industrial process parameters, devices to convert the sensed data to computer compatible information and a computer which executes computer software directed to analyzing the sensor data to discern statistically reliable alarm conditions. The computer software is executed to remove serial correlation information and then calculate Mahalanobis distribution data to carry out a probability ratio test to determine alarm conditions.
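
    The flavour of the scheme can be sketched as follows: remove correlation between the monitored parameters, then run a sequential probability ratio test on the whitened residuals so that alarms have controlled false-alarm and missed-alarm probabilities. This simplified version tests for a mean shift per channel; the patent's Mahalanobis-distribution computation and serial-correlation removal are richer than shown.

    ```python
    import numpy as np

    def sprt_surveillance(train, stream, shift=2.0, alpha=0.01, beta=0.01):
        """Whiten the monitored parameters, then apply a sequential
        probability ratio test (SPRT) per channel: nominal vs degraded.

        train : (n, d) nominal observations; stream : iterable of d-vectors
        shift : degraded hypothesis = mean shifted by `shift` std devs
        alpha, beta : false-alarm and missed-alarm probabilities
        """
        mu = train.mean(axis=0)
        evals, evecs = np.linalg.eigh(np.cov(train, rowvar=False))
        W = evecs @ np.diag(evals ** -0.5) @ evecs.T  # whitening transform

        A = np.log((1 - beta) / alpha)    # upper (alarm) threshold
        B = np.log(beta / (1 - alpha))    # lower (nominal) threshold
        llr = np.zeros(len(mu))
        for obs in stream:
            z = W @ (np.asarray(obs) - mu)     # ~ N(0, I) when nominal
            llr += shift * (z - shift / 2.0)   # Gaussian mean-shift LLR step
            llr = np.maximum(llr, B)           # reset at the nominal bound
            if np.any(llr >= A):
                return "ALARM"
        return "OK"

    rng = np.random.default_rng(0)
    train = rng.normal(size=(2000, 3))                  # nominal history
    drifted = rng.normal(size=(200, 3)) + [0, 2.0, 0]   # sensor 2 shifts 2 sigma
    print(sprt_surveillance(train, drifted))            # -> "ALARM"
    ```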

  4. [The use of automated processing of information obtained during space flights for the monitoring and evaluation of airborne pollution].

    PubMed

    Bagmanov, B Kh; Mikhaĭlova, A Iu; Pavlov, S V

    1997-01-01

    The article describes experience in the use of automated processing of information obtained during spaceflights for the analysis of urban air pollution. The authors present a method for processing information obtained during spaceflights and show how to identify foci of industrial release and the areas of their spread within and beyond cities.

  5. Indicators and Metrics for Evaluating the Sustainability of Chemical Processes

    EPA Science Inventory

    A metric-based method, called GREENSCOPE, has been developed for evaluating process sustainability. Using lab-scale information and engineering assumptions, the method evaluates full-scale representations of processes in environmental, efficiency, energy and economic areas. The m...

  6. Terminology model discovery using natural language processing and visualization techniques.

    PubMed

    Zhou, Li; Tao, Ying; Cimino, James J; Chen, Elizabeth S; Liu, Hongfang; Lussier, Yves A; Hripcsak, George; Friedman, Carol

    2006-12-01

    Medical terminologies are important for unambiguous encoding and exchange of clinical information. The traditional manual method of developing terminology models is time-consuming and limited in the number of phrases that a human developer can examine. In this paper, we present an automated method for developing medical terminology models based on natural language processing (NLP) and information visualization techniques. Surgical pathology reports were selected as the testing corpus for developing a pathology procedure terminology model. The use of a general NLP processor for the medical domain, MedLEE, provides an automated method for acquiring semantic structures from a free text corpus and sheds light on a new high-throughput method of medical terminology model development. The use of an information visualization technique supports the summarization and visualization of the large quantity of semantic structures generated from medical documents. We believe that a general method based on NLP and information visualization will facilitate the modeling of medical terminologies.

  7. Approaching the Affective Factors of Information Seeking: The Viewpoint of the Information Search Process Model

    ERIC Educational Resources Information Center

    Savolainen, Reijo

    2015-01-01

    Introduction: The article contributes to the conceptual studies of affective factors in information seeking by examining Kuhlthau's information search process model. Method: This random-digit dial telephone survey of 253 people (75% female) living in a rural, medically under-serviced area of Ontario, Canada, follows-up a previous interview study…

  8. Improving informed consent: Stakeholder views.

    PubMed

    Anderson, Emily E; Newman, Susan B; Matthews, Alicia K

    2017-01-01

    Innovation will be required to improve the informed consent process in research. We aimed to obtain input from key stakeholders-research participants and those responsible for obtaining informed consent-to inform potential development of a multimedia informed consent "app." This descriptive study used a mixed-methods approach. Five 90-minute focus groups were conducted with volunteer samples of former research participants and researchers/research staff responsible for obtaining informed consent. Participants also completed a brief survey that measured background information and knowledge and attitudes regarding research and the use of technology. Established qualitative methods were used to conduct the focus groups and data analysis. We conducted five focus groups with 41 total participants: three groups with former research participants (total n = 22), and two groups with researchers and research coordinators (total n = 19). Overall, individuals who had previously participated in research had positive views regarding their experiences. However, further discussion elicited that the informed consent process often did not meet its intended objectives. Findings from both groups are presented according to three primary themes: content of consent forms, experience of the informed consent process, and the potential of technology to improve the informed consent process. A fourth theme, need for lay input on informed consent, emerged from the researcher groups. Our findings add to previous research that suggests that the use of interactive technology has the potential to improve the process of informed consent. However, our focus-group findings provide additional insight that technology cannot replace the human connection that is central to the informed consent process. More research that incorporates the views of key stakeholders is needed to ensure that multimedia consent processes do not repeat the mistakes of paper-based consent forms.

  9. System and method for integrating hazard-based decision making tools and processes

    DOEpatents

    Hodgin, C Reed [Westminster, CO

    2012-03-20

    A system and method for inputting, analyzing, and disseminating information necessary for identified decision-makers to respond to emergency situations. This system and method provides consistency and integration among multiple groups, and may be used for both initial consequence-based decisions and follow-on consequence-based decisions. The system and method in a preferred embodiment also provides tools for accessing and manipulating information that are appropriate for each decision-maker, in order to achieve more reasoned and timely consequence-based decisions. The invention includes processes for designing and implementing a system or method for responding to emergency situations.

  10. The Research on Linux Memory Forensics

    NASA Astrophysics Data System (ADS)

    Zhang, Jun; Che, ShengBing

    2018-03-01

    Memory forensics is a branch of computer forensics. It does not depend on the operating system API, but instead analyzes operating system information from binary memory data. Based on the 64-bit Linux operating system, this work analyzes system process and thread information from physical memory data. Using ELF-file debugging information, a method for locating kernel-structure member variables is proposed that can be applied to different versions of the Linux operating system. The experimental results show that the method can successfully obtain the system process information from physical memory data, and that it is compatible with multiple versions of the Linux kernel.

  11. A new stationary gridline artifact suppression method based on the 2D discrete wavelet transform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, Hui, E-mail: corinna@seu.edu.cn; Key Laboratory of Computer Network and Information Integration; Centre de Recherche en Information Biomédicale sino-français, Laboratoire International Associé, Inserm, Université de Rennes 1, Rennes 35000

    2015-04-15

    Purpose: In digital x-ray radiography, an antiscatter grid is inserted between the patient and the image receptor to reduce scattered radiation. If the antiscatter grid is used in a stationary way, gridline artifacts will appear in the final image. In most of the gridline removal image processing methods, the useful information with spatial frequencies close to that of the gridline is usually lost or degraded. In this study, a new stationary gridline suppression method is designed to preserve more of the useful information. Methods: The method is as follows. The input image is first recursively decomposed into several smaller subimages using a multiscale 2D discrete wavelet transform. The decomposition process stops when the gridline signal is found to be greater than a threshold in one or several of these subimages using a gridline detection module. An automatic Gaussian band-stop filter is then applied to the detected subimages to remove the gridline signal. Finally, the restored image is achieved using the corresponding 2D inverse discrete wavelet transform. Results: The processed images show that the proposed method can remove the gridline signal efficiently while maintaining the image details. The spectra of a 1D Fourier transform of the processed images demonstrate that, compared with some existing gridline removal methods, the proposed method has better information preservation after the removal of the gridline artifacts. Additionally, the performance speed is relatively high. Conclusions: The experimental results demonstrate the efficiency of the proposed method. Compared with some existing gridline removal methods, the proposed method can preserve more information within an acceptable execution time.
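
    Here is a rough PyWavelets rendering of the idea, assuming vertical gridlines and a fixed decomposition depth; the paper's recursive stop criterion, gridline-detection module and filter parameterisation are simplified into a spectral peak test and a fixed-width Gaussian notch.

    ```python
    import numpy as np
    import pywt  # PyWavelets, an assumed dependency

    def _notch_rows(band, thresh=5.0, width=1.5):
        """Detect a dominant spectral peak along rows and apply a Gaussian
        band-stop around it; `thresh` is the peak-to-median power ratio."""
        F = np.fft.rfft(band, axis=1)
        power = np.abs(F).mean(axis=0)
        k = power[1:].argmax() + 1                  # skip the DC bin
        if power[k] < thresh * np.median(power):
            return band                             # no gridline in this band
        freqs = np.arange(power.size)
        stop = 1.0 - np.exp(-((freqs - k) ** 2) / (2 * width ** 2))
        return np.fft.irfft(F * stop, n=band.shape[1], axis=1)

    def remove_gridlines(img, wavelet="db4", level=4):
        """Wavelet-decompose, notch the vertical-detail subbands, rebuild."""
        coeffs = pywt.wavedec2(img.astype(float), wavelet, level=level)
        cleaned = [coeffs[0]]
        for (cH, cV, cD) in coeffs[1:]:
            cleaned.append((cH, _notch_rows(cV), cD))
        return pywt.waverec2(cleaned, wavelet)
    ```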

  12. Information Loss: Exploring the Information Systems Management's Neglect Affecting Softcopy Reproduction of Heritage-Data

    ERIC Educational Resources Information Center

    Oskooie, Kamran Rezai

    2012-01-01

    This exploratory mixed methods study quantified and explored leadership interest in legacy-data conversion and information processing. Questionnaires were administered electronically to 92 individuals in design, manufacturing, and other professions from the manufacturing, processing, Internet, computing, software and technology divisions. Research…

  13. Using Teacher Effectiveness Data for Information-Rich Hiring

    ERIC Educational Resources Information Center

    Cannata, Marisa; Rubin, Mollie; Goldring, Ellen; Grissom, Jason A.; Neumerski, Christine M.; Drake, Timothy A.; Schuermann, Patrick

    2017-01-01

    Purpose: New teacher effectiveness measures have the potential to influence how principals hire teachers as they provide new and richer information about candidates to a traditionally information-poor process. This article examines how the hiring process is changing as a result of teacher evaluation reforms. Research Methods: Data come from…

  14. Improving Program Performance through Management Information. A Workbook.

    ERIC Educational Resources Information Center

    Bienia, Nancy

    Designed specifically for state and local managers and supervisors who plan, direct, and operate child support enforcement programs, this workbook provides a four-part, step-by-step process for identifying needed information and methods of using the information to operate an effective program. The process consists of: (1) determining what…

  15. Bringing in the "CIA": A New Process to Improve Staff Communication.

    PubMed

    Hunter, Rebecca; Mitchell, Jinjer; Loomis, Elena

    2015-01-01

    Nurses consistently express dissatisfaction with the overwhelming amount and rate of change in health care today. Nurse educators identified this as a problem at a 475-bed hospital and developed a process to present changes in information in a new and exciting method. This article reports on the identification and implementation of the new communication model and the lessons learned during the process. A new method for communication dissemination was designed utilizing a "Coordinator Information Advisory Group" concept.

  16. Surveillance of industrial processes with correlated parameters

    DOEpatents

    White, A.M.; Gross, K.C.; Kubic, W.L.; Wigeland, R.A.

    1996-12-17

    A system and method for surveillance of an industrial process are disclosed. The system and method include a plurality of sensors monitoring industrial process parameters, devices to convert the sensed data to computer compatible information and a computer which executes computer software directed to analyzing the sensor data to discern statistically reliable alarm conditions. The computer software is executed to remove serial correlation information and then calculate Mahalanobis distribution data to carry out a probability ratio test to determine alarm conditions. 10 figs.

  17. A collaborative design method to support integrated care. An ICT development method containing continuous user validation improves the entire care process and the individual work situation

    PubMed Central

    Scandurra, Isabella; Hägglund, Maria

    2009-01-01

    Introduction Integrated care involves different professionals, belonging to different care provider organizations and requires immediate and ubiquitous access to patient-oriented information, supporting an integrated view on the care process [1]. Purpose To present a method for development of usable and work process-oriented information and communication technology (ICT) systems for integrated care. Theory and method Based on Human-computer Interaction Science and in particular Participatory Design [2], we present a new collaborative design method in the context of health information systems (HIS) development [3]. This method implies a thorough analysis of the entire interdisciplinary cooperative work and a transformation of the results into technical specifications, via user validated scenarios, prototypes and use cases, ultimately leading to the development of appropriate ICT for the variety of occurring work situations for different user groups, or professions, in integrated care. Results and conclusions Application of the method in homecare of the elderly resulted in an HIS that was well adapted to the intended user groups. Conducted in multi-disciplinary seminars, the method captured and validated user needs and system requirements for different professionals, work situations, and environments not only for current work; it also aimed to improve collaboration in future (ICT supported) work processes. A holistic view of the entire care process was obtained and supported through different views of the HIS for different user groups, resulting in improved work in the entire care process as well as for each collaborating profession [4].

  18. Use of Multimedia Technology in the Doctor-Patient Relationship for Obtaining Patient Informed Consent.

    PubMed

    Michalski, Andrzej; Stopa, Marcin; Miśkowiak, Bogdan

    2016-10-26

    Patient informed consent for surgery or for high-risk methods of treatment or diagnosis means that unlawful breach of the patient's personal interests is avoided and that the patient accepts the risk of surgery and bears its consequences. A particular aspect of the informed-consent process is patient awareness: the patient's knowledge of the condition and of the circumstances of the continued therapeutic procedure, including the offered and available methods of treatment and their possible complications. The rapid development of technologies and methods of treatment may cause communication problems between doctor and patient regarding the scope and method of patient education prior to surgery. The use of multimedia technology (e.g., videos of surgical procedures, computer animation, and graphics), in addition to the media already used in preoperative patient education, may be a factor in improving the quality of the informed-consent process. Studies conducted in clinical centers show that with the use of multimedia technology, patients remember more of the information presented. The use of new technology also makes it possible to reduce the difference in the amount of information assimilated by patients with different levels of education. The use of media is thus a way to improve the quality of preoperative patient education and, at the same time, a step towards further empowering patients in the healing process.

  19. Measuring the Return on Information Technology: A Knowledge-Based Approach for Revenue Allocation at the Process and Firm Level

    DTIC Science & Technology

    2005-07-01

    approach for measuring the return on Information Technology (IT) investments. A review of existing methods suggests the difficulty in adequately...measuring the returns of IT at various levels of analysis (e.g., firm or process level). To address this issue, this study aims to develop a method for...view (KBV), this paper proposes an analytic method for measuring the historical revenue and cost of IT investments by estimating the amount of

  20. Information-Processing Theory and Perspectives on Development: A Look at Concepts and Methods--The View of a Developmental Ethologist.

    ERIC Educational Resources Information Center

    Jesness, Bradley

    This paper examines concepts in information-processing theory which are likely to be relevant to development and characterizes the methods and data upon which the concepts are based. Among the concepts examined are those which have slight empirical grounds. Other concepts examined are those which seem to have empirical bases but which are…

  1. Teaching Information Systems Development via Process Variants

    ERIC Educational Resources Information Center

    Tan, Wee-Kek; Tan, Chuan-Hoo

    2010-01-01

    Acquiring the knowledge to assemble an integrated Information System (IS) development process that is tailored to the specific needs of a project has become increasingly important. It is therefore necessary for educators to impart to students this crucial skill. However, Situational Method Engineering (SME) is an inherently complex process that…

  2. Modeling Business Processes in Public Administration

    NASA Astrophysics Data System (ADS)

    Repa, Vaclav

    During more than 10 years of its existence, business process modeling has become a regular part of organization management practice. It is mostly regarded as a part of information system development, or even as a way to implement some supporting technology (for instance, a workflow system). Although I do not agree with such a reduction of the real meaning of a business process, it is necessary to admit that information technologies play an essential role in business processes (see [1] for more information). Consequently, an information system is inseparable from a business process itself, because it is a cornerstone of the general basic infrastructure of a business. This fact impacts all dimensions of business process management. One of these dimensions is methodology, which postulates that information systems development provide business process management with exact methods and tools for modeling business processes. The methodology underlying the approach presented in this paper also has its roots in information systems development methodology.

  3. Methods and apparatuses for information analysis on shared and distributed computing systems

    DOEpatents

    Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA

    2011-02-22

    Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
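
    In the spirit of the claim, here is a toy map-reduce rendering: each process computes local term statistics for its distinct document set, the locals are merged into a global set, and a major term set is drawn from it. The patent's partitioning of the global vocabulary across processes is not shown.

    ```python
    from collections import Counter
    from multiprocessing import Pool

    def local_term_stats(docs):
        """Per-process work: term statistics for one distinct document set."""
        c = Counter()
        for doc in docs:
            c.update(doc.lower().split())
        return c

    def global_term_stats(doc_sets, workers=2):
        """Distribute distinct document sets among processes, merge the
        local statistics, and extract a 'major term set'."""
        with Pool(workers) as pool:
            local_sets = pool.map(local_term_stats, doc_sets)
        global_stats = sum(local_sets, Counter())       # contribute locals
        major_terms = [t for t, _ in global_stats.most_common(10)]
        return global_stats, major_terms

    if __name__ == "__main__":
        sets = [["the quick brown fox", "the lazy dog"], ["quick quick fox"]]
        print(global_term_stats(sets)[1])
    ```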

  4. Mechanoluminescence assisting agile optimization of processing design on surgical epiphysis plates

    NASA Astrophysics Data System (ADS)

    Terasaki, Nao; Toyomasu, Takashi; Sonohata, Motoki

    2018-04-01

    We propose a novel method for agile optimization of processing design through the visualization of mechanoluminescence. To demonstrate the effect of the new method, epiphysis plates were processed to form dots (diameters: 1 and 1.5 mm) and the mechanical information was evaluated. As a result, the appearance of new strain concentrations was successfully visualized on the basis of mechanoluminescence, and the complex mechanical information was intuitively understood by the surgeons acting as designers. In addition, mechanoluminescence analysis clarified that the small dots do not have serious mechanical effects such as strength reduction. Such detailed mechanical information, evaluated on the basis of mechanoluminescence, was successfully applied to judging the validity of the processing design, which clearly proves the effectiveness of the new mechanoluminescence-assisted methodology for agile optimization of processing design.

  5. Method and apparatus for measuring micro structures, anisotropy and birefringence in polymers using laser scattered light

    DOEpatents

    Grek, Boris; Bartolick, Joseph; Kennedy, Alan D.

    2000-01-01

    A method and apparatus for measuring microstructures, anisotropy and birefringence in polymers using laser scattered light includes a laser which provides a beam that can be conditioned and is directed at a fiber or film, which causes the beam to scatter. Backscattered light is received and processed with detectors and beam splitters to obtain data. The data are directed to a computer where they are processed to obtain information about the fiber or film, such as its birefringence and diameter. This information provides a basis for modifications to the production process to enhance the process.

  6. Spatio-Temporal Process Simulation of Dam-Break Flood Based on SPH

    NASA Astrophysics Data System (ADS)

    Wang, H.; Ye, F.; Ouyang, S.; Li, Z.

    2018-04-01

    On the basis of introducing the SPH (Smoothed Particle Hydrodynamics) simulation method, this paper gives solutions to the key research problems: the spatial and temporal scales suited to GIS (Geographical Information System) applications, the boundary-condition equations combined with the underlying surface, and the kernel function and parameters applicable to dam-break flood simulation. On this basis, a calculation method for spatio-temporal process emulation of dam-break floods with elaborate particles is proposed, and the spatio-temporal process is dynamically simulated using GIS modelling and visualization. The results show that the method yields more information and reflects real situations more objectively.
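
    As background to the kernel-selection problem the abstract mentions, here is the standard 2-D cubic-spline smoothing kernel and the SPH density summation it enters into; the paper's calibrated kernel and parameters may differ.

    ```python
    import numpy as np

    def cubic_spline_kernel(r, h):
        """Standard 2-D cubic-spline SPH smoothing kernel W(r, h)."""
        q = r / h
        sigma = 10.0 / (7.0 * np.pi * h ** 2)     # 2-D normalisation
        w = np.where(q < 1, 1 - 1.5 * q**2 + 0.75 * q**3,
             np.where(q < 2, 0.25 * (2 - q) ** 3, 0.0))
        return sigma * w

    def density(positions, masses, h):
        """SPH density summation: rho_i = sum_j m_j W(|r_i - r_j|, h)."""
        diff = positions[:, None, :] - positions[None, :, :]
        r = np.linalg.norm(diff, axis=-1)
        return (masses[None, :] * cubic_spline_kernel(r, h)).sum(axis=1)

    pos = np.random.default_rng(0).uniform(0, 1, (50, 2))  # toy particle set
    print(density(pos, np.full(50, 0.02), h=0.1)[:3])
    ```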

  7. Automatic extraction of pavement markings on streets from point cloud data of mobile LiDAR

    NASA Astrophysics Data System (ADS)

    Gao, Yang; Zhong, Ruofei; Tang, Tao; Wang, Liuzhao; Liu, Xianlin

    2017-08-01

    Pavement markings provide an important foundation for keeping road users safe. Accurate and comprehensive information about pavement markings assists road regulators and is useful in developing driverless technology. Mobile light detection and ranging (LiDAR) systems offer new opportunities to collect and process accurate pavement marking information: they directly obtain the three-dimensional (3D) coordinates and laser-return intensity of objects in a fast and efficient way, and the RGB attributes of the data points can be obtained from the panoramic camera in the system. In this paper, we present a novel method to automatically extract pavement markings using multiple attributes of the laser scanning point cloud from mobile LiDAR data. The method uses the differential grayscale of the RGB color, the laser pulse reflection intensity, and the differential intensity to identify and extract pavement markings. We utilized point cloud density to remove noise and used morphological operations to eliminate errors. In the application, we tested our method on different road sections in Beijing, China, and Buffalo, NY, USA. The results indicate that both correctness (p) and completeness (r) were higher than 90%. The method can be applied to extract pavement markings from the huge point clouds produced by mobile LiDAR.
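
    The multi-attribute extraction can be caricatured in a few lines: combine a high-intensity test with a low colour-spread (near-white paint) test, then de-noise by local point density. The thresholds and radii below are illustrative assumptions, not the paper's calibrated values.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def extract_markings(points, intensity, rgb, i_thresh=0.7,
                         gray_thresh=30, radius=0.15, min_neighbors=5):
        """Candidate marking points: high laser-return intensity and
        near-achromatic colour; then density-based noise removal.

        points : (n, 3) xyz; intensity : (n,) in [0, 1]; rgb : (n, 3) 0..255
        """
        # low spread across R, G, B channels indicates white/gray paint
        achromatic = rgb.max(axis=1) - rgb.min(axis=1) < gray_thresh
        cand = points[(intensity > i_thresh) & achromatic]

        # keep candidates supported by enough neighbouring candidates
        tree = cKDTree(cand)
        counts = tree.query_ball_point(cand, r=radius, return_length=True)
        return cand[np.asarray(counts) >= min_neighbors]
    ```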

  8. Improved Discrete Approximation of Laplacian of Gaussian

    NASA Technical Reports Server (NTRS)

    Shuler, Robert L., Jr.

    2004-01-01

    An improved method of computing a discrete approximation of the Laplacian of a Gaussian convolution of an image has been devised. The primary advantage of the method is that, without substantially degrading the accuracy of the end result, it reduces the amount of information that must be processed and thus reduces the amount of circuitry needed to perform the Laplacian-of-Gaussian (LOG) operation. Some background information is necessary to place the method in context. The method is intended for application to the LOG part of a process of real-time digital filtering of digitized video data that represent brightnesses in pixels in a square array. The particular filtering process of interest is one that converts pixel brightnesses to binary form, thereby reducing the amount of information that must be handled in subsequent correlation processing (e.g., correlations between images in a stereoscopic pair for determining distances, or correlations between successive frames of the same image for detecting motions). The Laplacian is often included in the filtering process because it emphasizes edges and textures, while the Gaussian is often included because it smooths out noise that might not be consistent between left and right images or between successive frames of the same image.
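
    For context, here is a direct (unreduced) version of the operation: build a discrete LoG kernel, convolve, and keep one sign bit per pixel, which is the binary reduction the abstract describes feeding into correlation processing. The improved method's information-reducing shortcut itself is not reproduced here.

    ```python
    import numpy as np
    from scipy import ndimage

    def log_kernel(size=9, sigma=1.4):
        """Discrete Laplacian-of-Gaussian kernel, zero-sum normalised."""
        ax = np.arange(size) - size // 2
        x, y = np.meshgrid(ax, ax)
        r2 = x**2 + y**2
        k = (r2 - 2 * sigma**2) / sigma**4 * np.exp(-r2 / (2 * sigma**2))
        return k - k.mean()                  # enforce zero DC response

    def binarize_log(img, size=9, sigma=1.4):
        """LoG-filter an image, then keep only the sign bit per pixel."""
        response = ndimage.convolve(img.astype(float), log_kernel(size, sigma))
        return response > 0                  # 1 bit per pixel

    img = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(float)
    print(binarize_log(img).mean())          # fraction of positive LoG pixels
    ```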

  9. Joint Extraction of Entities and Relations Using Reinforcement Learning and Deep Learning.

    PubMed

    Feng, Yuntian; Zhang, Hongjun; Hao, Wenning; Chen, Gang

    2017-01-01

    We use both reinforcement learning and deep learning to simultaneously extract entities and relations from unstructured texts. For reinforcement learning, we model the task as a two-step decision process. Deep learning is used to automatically capture the most important information from unstructured texts, which represents the state in the decision process. By designing the reward function per step, our proposed method can pass the information of entity extraction to relation extraction and obtain feedback in order to extract entities and relations simultaneously. Firstly, we use a bidirectional LSTM to model the context information, which realizes preliminary entity extraction. On the basis of the extraction results, an attention-based method represents the sentences that include the target entity pair to generate the initial state in the decision process. Then we use a Tree-LSTM to represent relation mentions to generate the transition state in the decision process. Finally, we employ the Q-Learning algorithm to obtain the control policy π in the two-step decision process. Experiments on ACE2005 demonstrate that our method attains better performance than the state-of-the-art method and gets a 2.4% increase in recall-score.

  10. Joint Extraction of Entities and Relations Using Reinforcement Learning and Deep Learning

    PubMed Central

    Zhang, Hongjun; Chen, Gang

    2017-01-01

    We use both reinforcement learning and deep learning to simultaneously extract entities and relations from unstructured texts. For reinforcement learning, we model the task as a two-step decision process. Deep learning is used to automatically capture the most important information from unstructured texts, which represents the state in the decision process. By designing the reward function per step, our proposed method can pass the information of entity extraction to relation extraction and obtain feedback in order to extract entities and relations simultaneously. Firstly, we use a bidirectional LSTM to model the context information, which realizes preliminary entity extraction. On the basis of the extraction results, an attention-based method represents the sentences that include the target entity pair to generate the initial state in the decision process. Then we use a Tree-LSTM to represent relation mentions to generate the transition state in the decision process. Finally, we employ the Q-Learning algorithm to obtain the control policy π in the two-step decision process. Experiments on ACE2005 demonstrate that our method attains better performance than the state-of-the-art method and gets a 2.4% increase in recall-score. PMID:28894463

  11. MEASUREMENT OF INDOOR AIR EMISSIONS FROM DRY-PROCESS PHOTOCOPY MACHINES

    EPA Science Inventory

    The article provides background information on indoor air emissions from office equipment, with emphasis on dry-process photocopy machines. The test method is described in detail along with results of a study to evaluate the test method using four dry-process photocopy machines. ...

  12. Management Challenges Fiscal Year 2016

    DTIC Science & Technology

    2017-01-01

    ...classified information revealed "methods to our adversaries that could impact our...electronic data to perform operations and to process, maintain, and report essential information. ...that collects, processes, stores, disseminates, and manages critical information. It includes owned and leased communications and computing

  13. Shannon information entropy in heavy-ion collisions

    NASA Astrophysics Data System (ADS)

    Ma, Chun-Wang; Ma, Yu-Gang

    2018-03-01

    The general idea of information entropy provided by C. E. Shannon "hangs over everything we do" and can be applied to a great variety of problems once the connection between a distribution and the quantities of interest is found. Shannon information entropy quantifies the information of a quantity with its specific distribution, and information-entropy-based methods have been developed extensively in many scientific areas, including physics. The dynamical properties of the heavy-ion collision (HIC) process make nuclear matter and its evolution difficult and complex to study, and Shannon information entropy theory can provide new methods and observables for understanding the physical phenomena both theoretically and experimentally. To better understand HIC processes, the main characteristics of typical models, including quantum molecular dynamics models, thermodynamic models, and statistical models, are briefly introduced. Typical applications of Shannon information theory in HICs are collected, covering the chaotic behavior in the branching process of hadron collisions, the liquid-gas phase transition in HICs, and the isobaric difference scaling phenomenon for intermediate-mass fragments produced in HICs of neutron-rich systems. Even though the present applications in heavy-ion collision physics are still relatively simple, they shed light on the key questions being pursued. It is suggested to further develop information entropy methods in nuclear reaction models, as well as to develop new analysis methods to study the properties of nuclear matter in HICs, especially the evolution of the dynamical system.
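
    As a minimal illustration of the quantity underlying these methods, the sketch below computes the Shannon entropy of an observed distribution; the example counts are invented.

    ```python
    import numpy as np

    def shannon_entropy(counts):
        """Shannon entropy H = -sum(p * log2 p) of an observed distribution,
        e.g. a fragment yield distribution from a heavy-ion collision model."""
        p = np.asarray(counts, dtype=float)
        p = p[p > 0] / p.sum()          # normalize; drop empty bins
        return -(p * np.log2(p)).sum()

    # A peaked distribution carries less entropy than a flat one
    print(shannon_entropy([97, 1, 1, 1]))     # ~0.24 bits
    print(shannon_entropy([25, 25, 25, 25]))  # 2.0 bits
    ```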

  14. Dynamic Information and Library Processing.

    ERIC Educational Resources Information Center

    Salton, Gerard

    This book provides an introduction to automated information services: collection, analysis, classification, storage, retrieval, transmission, and dissemination. An introductory chapter is followed by an overview of mechanized processes for acquisitions, cataloging, and circulation. Automatic indexing and abstracting methods are covered, followed…

  15. Quality data collection and management technology of aerospace complex product assembly process

    NASA Astrophysics Data System (ADS)

    Weng, Gang; Liu, Jianhua; He, Yongxi; Zhuang, Cunbo

    2017-04-01

    Aiming at solving the problems of difficult management and poor traceability of discrete assembly process quality data, a data collection and management method is proposed which takes the assembly process and the BOM as its core. The method includes a data collection approach based on workflow technology, a data model based on the BOM, and quality traceability of the assembly process. Finally, an assembly process quality data management system is developed, and effective control and management of quality information for the complex product assembly process is realized.

  16. Coherent-state information concentration and purification in atomic memory

    NASA Astrophysics Data System (ADS)

    Herec, Jiří; Filip, Radim

    2006-12-01

    We propose a feasible method of coherent-state information concentration and purification utilizing quantum memory. The method allows us to optimally concentrate and purify information carried by many noisy copies of an unknown coherent state (randomly distributed in time) into a single copy. Nonclassical resources and operations can thus be saved when processing a single copy carrying the concentrated and purified information, compared with information processing over many noisy copies.

  17. Age and Visual Information Processing.

    ERIC Educational Resources Information Center

    Gummerman, Kent; And Others

    This paper reports on three studies concerned with aspects of human visual information processing. Study I was an effort to measure the duration of iconic storage using a partial report method in children ranging in age from 6 to 13 years. Study II was designed to detect age related changes in the rate of processing (perceptually encoding) letters…

  18. Application of fuzzy neural network technologies in management of transport and logistics processes in Arctic

    NASA Astrophysics Data System (ADS)

    Levchenko, N. G.; Glushkov, S. V.; Sobolevskaya, E. Yu; Orlov, A. P.

    2018-05-01

    A method of modeling the transport and logistics process using fuzzy neural network technologies is considered. Analysis of the implemented fuzzy neural network model of the information management system for transnational multimodal transportation showed the expediency of applying this method to the management of transport and logistics processes in Arctic and Subarctic conditions. The modular architecture of the model can be expanded by incorporating additional modules, since working conditions in the Arctic and Subarctic will present ever more realistic tasks. The architecture allows the information management system to be extended without affecting the system or the method itself. The model has a wide range of possible applications, including: analysis of the situation and behavior of interacting elements; dynamic monitoring and diagnostics of management processes; simulation of real events and processes; and prediction and prevention of critical situations.

  19. Methods for evaluating information in managing the enterprise on the basis of a hybrid three-tier system

    NASA Astrophysics Data System (ADS)

    Vasil'ev, V. A.; Dobrynina, N. V.

    2017-01-01

    The article presents data on the influence of information upon the functioning of complex systems in the process of ensuring their effective management. Ways and methods for evaluating multidimensional information that reduce time and resource costs and improve the validity of management decisions for the studied system are proposed.

  20. Display system for imaging scientific telemetric information

    NASA Technical Reports Server (NTRS)

    Zabiyakin, G. I.; Rykovanov, S. N.

    1979-01-01

    A system for imaging scientific telemetric information, based on the M-6000 minicomputer and the SIGD graphic display, is described. The system provides two-dimensional graphic display of telemetric information and interaction with the computer in the analysis and processing of telemetric parameters displayed on the screen. The running parameter information output method is presented. User capabilities in the analysis and processing of telemetric information imaged on the display screen, as well as the user language, are discussed and illustrated.

  1. Aligning observed and modelled behaviour based on workflow decomposition

    NASA Astrophysics Data System (ADS)

    Wang, Lu; Du, YuYue; Liu, Wei

    2017-09-01

    When business processes are mostly supported by information systems, the availability of event logs generated from these systems, as well as the requirement of appropriate process models, are increasing. Business processes can be discovered, monitored and enhanced by extracting process-related information. However, some events cannot be correctly identified because of the explosion of the amount of event logs. Therefore, a new process mining technique is proposed based on a workflow decomposition method in this paper. Petri nets (PNs) are used to describe business processes, and then conformance checking of event logs and process models is investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently, while an alignment approach based on a state equation method in PN theory enhances the performance of conformance checking. Both approaches are implemented in the process mining framework ProM. The correctness and effectiveness of the proposed methods are illustrated through experiments.
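
    The alignment approach relies on the Petri net state equation, M' = M0 + C·x, where C is the incidence matrix and x is the firing-count vector. A minimal numpy sketch with an assumed three-transition net:

    ```python
    import numpy as np

    # State equation of a Petri net: M' = M0 + C @ x. Alignment approaches
    # use this as a necessary condition when matching logged events to
    # model moves. The net below is an invented linear workflow.
    C = np.array([[-1,  0,  0],   # p0: consumed by t0
                  [ 1, -1,  0],   # p1: produced by t0, consumed by t1
                  [ 0,  1, -1],   # p2: produced by t1, consumed by t2
                  [ 0,  0,  1]])  # p3: produced by t2

    M0 = np.array([1, 0, 0, 0])
    x  = np.array([1, 1, 1])      # fire t0, t1, t2 once each

    M_final = M0 + C @ x
    print(M_final)                # [0 0 0 1]: the token reached the sink
    ```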

  2. Novel method of extracting motion from natural movies.

    PubMed

    Suzuki, Wataru; Ichinohe, Noritaka; Tani, Toshiki; Hayami, Taku; Miyakawa, Naohisa; Watanabe, Satoshi; Takeichi, Hiroshige

    2017-11-01

    The visual system in primates can be segregated into motion and shape pathways. Interaction occurs at multiple stages along these pathways. Processing of shape-from-motion and biological motion is considered to be a higher-order integration process involving motion and shape information. However, relatively limited types of stimuli have been used in previous studies on these integration processes. We propose a new algorithm to extract object motion information from natural movies and to move random dots in accordance with the information. The object motion information is extracted by estimating the dynamics of local normal vectors of the image intensity projected onto the x-y plane of the movie. An electrophysiological experiment on two adult common marmoset monkeys (Callithrix jacchus) showed that the natural and random dot movies generated with this new algorithm yielded comparable neural responses in the middle temporal visual area. In principle, this algorithm provided random dot motion stimuli containing shape information for arbitrary natural movies. This new method is expected to expand the neurophysiological and psychophysical experimental protocols to elucidate the integration processing of motion and shape information in biological systems. The novel algorithm proposed here was effective in extracting object motion information from natural movies and provided new motion stimuli to investigate higher-order motion information processing. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
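
    The exact normal-vector dynamics used by the authors are not spelled out in the abstract; as a generic stand-in, the classic normal-flow estimate below extracts motion along the intensity gradient from two frames.

    ```python
    import numpy as np

    def normal_flow(frame_prev, frame_next):
        """Classic normal-flow estimate v_n = -I_t * grad(I) / |grad(I)|^2.
        A generic stand-in for gradient-based motion extraction; the paper's
        own algorithm tracks the dynamics of local normal vectors."""
        I = frame_prev.astype(float)
        It = frame_next.astype(float) - I          # temporal derivative
        Iy, Ix = np.gradient(I)                    # spatial gradients
        mag2 = Ix**2 + Iy**2 + 1e-9                # avoid division by zero
        return -It * Ix / mag2, -It * Iy / mag2    # (vx, vy) along the normal
    ```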

  3. Methods of Organizational Information Security

    NASA Astrophysics Data System (ADS)

    Martins, José; Dos Santos, Henrique

    The principal objective of this article is to present a literature review of the methods used for information security at the organizational level. Some of the principal problems are identified, and a first group of relevant dimensions is presented for efficient management of information security. The study is based on a literature review of some of the more relevant published articles on this theme, on international reports, and on the principal standards for information security management. From these readings, we identified methods oriented toward risk management, certification standards, and good practices of information security. Some of the standards are oriented toward certification of the product or system, and others toward the business processes. There are also studies proposing frameworks that integrate different approaches, founded on standards focused on technologies and processes and taking into consideration the organizational and human environment of organizations. In our perspective, the biggest contribution to information security is the development of an information security method for an organization in a conflicting environment. Such a method should provide security of information against the possible dimensions of attack that threats could exploit through the vulnerabilities of organizational assets. It should support the concepts of "network centric warfare", "information superiority" and "information warfare" developed especially in the last decade, where information is seen simultaneously as a weapon and as a target.

  4. Junior high school students' cognitive process in solving the developed algebraic problems based on information processing taxonomy model

    NASA Astrophysics Data System (ADS)

    Purwoko; Saad, Noor Shah; Tajudin, Nor'ain Mohd

    2017-05-01

    This study aims to: i) develop problem-solving questions on the Linear Equations System of Two Variables (LESTV) based on the levels of the IPT Model; ii) explain the level of students' skill in information processing when solving LESTV problems; iii) explain students' skill in information processing in solving LESTV problems; and iv) explain students' cognitive processes in solving LESTV problems. The study involves three phases: i) development of LESTV problem questions based on the Tessmer Model; ii) a quantitative survey method analyzing students' skill level of information processing; and iii) a qualitative case study method analyzing students' cognitive processes. The population of the study was 545 eighth-grade students, represented by a sample of 170 students from five junior high schools in the Hilir Barat Zone, Palembang (Indonesia), chosen using cluster sampling. Fifteen of these students were drawn as a sample for the interview session, with saturated information obtained. The data were collected using the LESTV problem-solving test and the interview protocol. The quantitative data were analyzed using descriptive statistics, while the qualitative data were analyzed using content analysis. The findings indicated that students' cognitive process was just at the step of identifying external sources and carrying out algorithms in short-term memory fluently. Only 15.29% of students could retrieve type A information and 5.88% could retrieve type B information from long-term memory. The implication is that the developed LESTV problems validated the IPT Model in modelling students' assessment at different levels of the hierarchy.

  5. Use of Multimedia Technology in the Doctor-Patient Relationship for Obtaining Patient Informed Consent

    PubMed Central

    Michalski, Andrzej; Stopa, Marcin; Miśkowiak, Bogdan

    2016-01-01

    Patient informed consent for surgery or for high-risk methods of treatment or diagnosis means that unlawful breach of the patient’s personal interests is avoided and the patient accepts the risk of surgery and takes the brunt of it. Patient awareness – their knowledge of the condition and circumstances of continued therapeutic procedure, including offered and available methods of treatment and their possible complications – constitutes a particular aspect of the informed-consent process. The rapid development of technologies and methods of treatment may cause communication problems between the doctor and the patient regarding the scope and method of patient education prior to surgery. The use of multimedia technology (e.g., videos of surgical procedures, computer animation, and graphics), in addition to media used in preoperative patient education, may be a factor in improving the quality of the informed consent process. Studies conducted in clinical centers show that with use of multimedia technology, patients remember more of the information presented. The use of new technology also makes it possible to reduce the difference in the amount of information assimilated by patients with different levels of education. The use of media is a way to improve the quality of preoperative patient education and, at the same time, a step towards their further empowerment in the healing process. PMID:27780964

  6. Informational technologies in modern educational structure

    NASA Astrophysics Data System (ADS)

    Fedyanin, A. B.

    2017-01-01

    The article presents the structure of the complex of informational technologies applied in modern school education, describes the most important educational methods, and shows the results of their implementation. It presents the forms and methods of informational support for the educational process, examined with respect to different aspects of their use, taking into account the psychological features of students. A range of worrying facts and dangerous trends connected with the use and spread of informational technologies, which must be taken into account in the informatization of the educational process, is also indicated. The materials of the article are based on many years of experience in operating and developing the informational educational sphere at a secondary school specializing in physics and mathematics.

  7. Information security of power enterprises of North-Arctic region

    NASA Astrophysics Data System (ADS)

    Sushko, O. P.

    2018-05-01

    The role of information technologies in providing technological security for energy enterprises is a component of the economic security of the northern Arctic region as a whole. Applying instruments and methods of information protection modelling to the business processes of energy enterprises in the northern Arctic region (such as Arkhenergo and Komienergo), the authors analysed and identified the most frequent information security risks. With the analytic hierarchy process, based on weighting factor estimations, the information risks of the energy enterprises' technological processes were ranked. The economic estimation of information security within an energy enterprise considers weighting-factor-adjusted variables (risks). Investments in information security systems of energy enterprises in the northern Arctic region are related to the installation of necessary security elements; current operating expenses on business process protection systems become materialized economic damage.
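
    A minimal sketch of the analytic hierarchy process step, assuming an invented 3x3 pairwise comparison matrix of information risks: the principal eigenvector of the matrix, normalized, gives the weighting factors used for ranking.

    ```python
    import numpy as np

    # AHP sketch: entry [i, j] says how much more important risk i is
    # than risk j. The matrix values are illustrative, not from the study.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    weights = principal / principal.sum()
    print(weights)   # ~[0.65, 0.23, 0.12]: ranked information risks
    ```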

  8. 40 CFR 721.9540 - Polysulfide mixture.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... new information, and any information on methods for protecting against such risk, into the applicable... 721.9540 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES... substance is any manner or method of manufacture, import, or processing associated with any use of this...

  9. Basic Information for EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM)

    EPA Pesticide Factsheets

    Contains basic information on the role and origins of the Selected Analytical Methods including the formation of the Homeland Security Laboratory Capacity Work Group and the Environmental Evaluation Analytical Process Roadmap for Homeland Security Events

  10. Description of a method to support public health information management: organizational network analysis

    PubMed Central

    Merrill, Jacqueline; Bakken, Suzanne; Rockoff, Maxine; Gebbie, Kristine; Carley, Kathleen

    2007-01-01

    In this case study we describe a method that has potential to provide systematic support for public health information management. Public health agencies depend on specialized information that travels throughout an organization via communication networks among employees. Interactions that occur within these networks are poorly understood and are generally unmanaged. We applied organizational network analysis, a method for studying communication networks, to assess the method's utility in supporting decision making for public health managers, and to determine what links existed between information use and agency processes. Data on communication links among a health department's staff were obtained via a survey with a 93% response rate and analyzed using the Organizational Risk Analyzer (ORA) software. The findings described the structure of information flow in the department's communication networks. The analysis succeeded in providing insights into organizational processes, which informed public health managers' strategies to address problems and to take advantage of network strengths. PMID:17098480
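
    The study used the ORA tool; as an illustration of the same kind of organizational network analysis, the sketch below computes degree and betweenness centrality over a hypothetical staff communication network with networkx.

    ```python
    import networkx as nx

    # Hypothetical communication links among staff (who reports exchanging
    # information with whom); networkx is used here only to illustrate the
    # kind of metrics involved, not the study's actual tooling.
    links = [("epi_1", "nurse_3"), ("epi_1", "clerk_2"),
             ("nurse_3", "clerk_2"), ("clerk_2", "director")]
    G = nx.Graph(links)

    # Betweenness highlights staff who broker information flow;
    # degree shows who has the most direct communication partners.
    print(nx.betweenness_centrality(G))
    print(dict(G.degree()))
    ```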

  11. A system of automated processing of deep water hydrological information

    NASA Technical Reports Server (NTRS)

    Romantsov, V. A.; Dyubkin, I. A.; Klyukbin, L. N.

    1974-01-01

    An automated system for primary and scientific analysis of deep water hydrological information is presented. Primary processing of the data in this system is carried out on a drifting station, which also calculates the parameters of vertical stability of the sea layers, as well as their depths and altitudes. Methods of processing the raw data are described.

  12. An Ontology-Based Approach to Incorporate User-Generated Geo-Content Into Sdi

    NASA Astrophysics Data System (ADS)

    Deng, D.-P.; Lemmens, R.

    2011-08-01

    The Web is changing the way people share and communicate information because of the emergence of various Web technologies, which enable people to contribute information on the Web. User-Generated Geo-Content (UGGC) is a potential resource of geographic information. Due to its different production methods, UGGC often cannot fit into formal geographic information models; there is a semantic gap between UGGC and formal geographic information. To integrate UGGC into geographic information, this study conducts an ontology-based process to bridge this semantic gap. The ontology-based process includes five steps: Collection, Extraction, Formalization, Mapping, and Deployment. In addition, this study implements the process on Twitter messages relevant to the Japan earthquake disaster. Using this process, we extract disaster relief information from Twitter messages and develop a knowledge base for GeoSPARQL queries on disaster relief information.

  13. Encoding techniques for complex information structures in connectionist systems

    NASA Technical Reports Server (NTRS)

    Barnden, John; Srinivas, Kankanahalli

    1990-01-01

    Two general information encoding techniques called relative position encoding and pattern similarity association are presented. They are claimed to be a convenient basis for the connectionist implementation of complex, short-term information processing of the sort needed in common-sense reasoning, semantic/pragmatic interpretation of natural language utterances, and other types of high-level cognitive processing. The relationships of the techniques to other connectionist information-structuring methods, and also to methods used in computers, are discussed in detail. The rich inter-relationships of these other connectionist and computer methods are also clarified. The particular, simple forms that the relative position encoding and pattern similarity association techniques take in the authors' own connectionist system, called Conposit, are discussed in order to clarify some issues and to provide evidence that the techniques are indeed useful in practice.

  14. The Healthcare Improvement Scotland evidence note rapid review process: providing timely, reliable evidence to inform imperative decisions on healthcare.

    PubMed

    McIntosh, Heather M; Calvert, Julie; Macpherson, Karen J; Thompson, Lorna

    2016-06-01

    Rapid review has become widely adopted by health technology assessment agencies in response to demand for evidence-based information to support imperative decisions. Concern about the credibility of rapid reviews and the reliability of their findings has prompted a call for wider publication of their methods. In publishing this overview of the accredited rapid review process developed by Healthcare Improvement Scotland, we aim to raise awareness of our methods and advance the discourse on best practice. Healthcare Improvement Scotland produces rapid reviews called evidence notes using a process that has achieved external accreditation through the National Institute for Health and Care Excellence. Key components include a structured approach to topic selection, initial scoping, considered stakeholder involvement, streamlined systematic review, internal quality assurance, external peer review and updating. The process was introduced in 2010 and continues to be refined over time in response to user feedback and operational experience. Decision-makers value the responsiveness of the process and perceive it as being a credible source of unbiased evidence-based information supporting advice for NHSScotland. Many agencies undertaking rapid reviews are striving to balance efficiency with methodological rigour. We agree that there is a need for methodological guidance and that it should be informed by better understanding of current approaches and the consequences of different approaches to streamlining systematic review methods. Greater transparency in the reporting of rapid review methods is essential to enable that to happen.

  15. Inferring the nature of anthropogenic threats from long-term abundance records.

    PubMed

    Shoemaker, Kevin T; Akçakaya, H Resit

    2015-02-01

    Diagnosing the processes that threaten species persistence is critical for recovery planning and risk forecasting. Dominant threats are typically inferred by experts on the basis of a patchwork of informal methods. Transparent, quantitative diagnostic tools would contribute much-needed consistency, objectivity, and rigor to the process of diagnosing anthropogenic threats. Long-term census records, available for an increasingly large and diverse set of taxa, may exhibit characteristic signatures of specific threatening processes and thereby provide information for threat diagnosis. We developed a flexible Bayesian framework for diagnosing threats on the basis of long-term census records and diverse ancillary sources of information. We tested this framework with simulated data from artificial populations subjected to varying degrees of exploitation and habitat loss and several real-world abundance time series for which threatening processes are relatively well understood: bluefin tuna (Thunnus maccoyii) and Atlantic cod (Gadus morhua) (exploitation) and Red Grouse (Lagopus lagopus scotica) and Eurasian Skylark (Alauda arvensis) (habitat loss). Our method correctly identified the process driving population decline for over 90% of time series simulated under moderate to severe threat scenarios. Successful identification of threats approached 100% for severe exploitation and habitat loss scenarios. Our method identified threats less successfully when threatening processes were weak and when populations were simultaneously affected by multiple threats. Our method selected the presumed true threat model for all real-world case studies, although results were somewhat ambiguous in the case of the Eurasian Skylark. In the latter case, incorporation of an ancillary source of information (records of land-use change) increased the weight assigned to the presumed true model from 70% to 92%, illustrating the value of the proposed framework in bringing diverse sources of information into a common rigorous framework. Ultimately, our framework may greatly assist conservation organizations in documenting threatening processes and planning species recovery. © 2014 Society for Conservation Biology.
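
    A minimal sketch of the model-comparison logic in such a Bayesian framework, assuming invented log-likelihoods for three candidate threat models: posterior model weights follow from Bayes' rule with a uniform prior.

    ```python
    import numpy as np

    # Given the likelihood of the observed abundance series under each
    # candidate threat model, posterior model weights follow from Bayes'
    # rule. The log-likelihood values below are illustrative only.
    log_lik = {"exploitation": -142.3, "habitat_loss": -150.9, "none": -158.1}
    prior = {m: 1 / len(log_lik) for m in log_lik}    # uniform model prior

    log_post = {m: np.log(prior[m]) + log_lik[m] for m in log_lik}
    shift = max(log_post.values())                    # numerical stability
    weights = {m: np.exp(lp - shift) for m, lp in log_post.items()}
    total = sum(weights.values())
    weights = {m: w / total for m, w in weights.items()}
    print(weights)   # mass concentrates on the best-supported threat model
    ```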

  16. Information integration and diagnosis analysis of equipment status and production quality for machining process

    NASA Astrophysics Data System (ADS)

    Zan, Tao; Wang, Min; Hu, Jianzhong

    2010-12-01

    Machining status monitoring with multiple sensors can acquire and analyze machining process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal fluctuations through statistical methods. In this paper, by comparing the advantages and disadvantages of the two methods, the necessity and feasibility of their integration and fusion are discussed. An approach is then brought forward that integrates multi-sensor status monitoring and statistical process control based on artificial intelligence, internet, and database techniques. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and acoustic emission (AE) signal information from the wheel dressing process, the cause of machining quality fluctuation was identified. The experimental results indicate that the approach is suitable for status monitoring and analysis of the machining process.
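
    As an illustration of the statistical process control side, the sketch below computes Shewhart individuals-chart limits (mean plus or minus three sigma) from an in-control history and flags out-of-limit values; the data are invented.

    ```python
    import numpy as np

    def shewhart_limits(samples):
        """Individuals control chart limits (mean +/- 3 sigma), the kind of
        statistical quality control rule used to separate abnormal from
        normal fluctuations in monitored machining data."""
        x = np.asarray(samples, dtype=float)
        mu, sigma = x.mean(), x.std(ddof=1)
        return mu - 3 * sigma, mu + 3 * sigma

    # Flag measurements outside the control limits
    data = [10.1, 10.0, 9.9, 10.2, 10.1, 12.8]
    lo, hi = shewhart_limits(data[:-1])              # limits from history
    print([v for v in data if not lo <= v <= hi])    # -> [12.8]
    ```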

  17. Text Genres in Information Organization

    ERIC Educational Resources Information Center

    Nahotko, Marek

    2016-01-01

    Introduction: Text genres used by so-called information organizers in the processes of information organization in information systems were explored in this research. Method: The research employed text genre socio-functional analysis. Five genre groups in information organization were distinguished. Every genre group used in information…

  18. Low-complexity video encoding method for wireless image transmission in capsule endoscope.

    PubMed

    Takizawa, Kenichi; Hamaguchi, Kiyoshi

    2010-01-01

    This paper presents a low-complexity video encoding method applicable to wireless image transmission in capsule endoscopes. The encoding method is based on Wyner-Ziv theory, in which correlated information available at the receiver is exploited as side information for decoding. Complex processes in video encoding, such as motion vector estimation, are thereby moved to the receiver side, which has a larger-capacity battery. As a result, the encoding process reduces to decimating the channel-coded original data. We provide a performance evaluation for a low-density parity-check (LDPC) coding method in the AWGN channel.

  19. Image object recognition based on the Zernike moment and neural networks

    NASA Astrophysics Data System (ADS)

    Wan, Jianwei; Wang, Ling; Huang, Fukan; Zhou, Liangzhu

    1998-03-01

    This paper first gives a comprehensive discussion of the concept of artificial neural networks, their research methods, and their relation to information processing. On the basis of this discussion, we expound the mathematical similarity of artificial neural networks and information processing. The paper then presents a new method of image recognition based on invariant features and neural networks, using the image Zernike transform. The method is not only invariant to rotation, shift, and scale of the image object, but also has good fault tolerance and robustness. It is also compared with a statistical classifier and an invariant-moments recognition method.
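
    A hedged sketch of the descriptor side, using the mahotas library's Zernike moment implementation (an assumption; the paper does not name a library): the moment magnitudes barely change when a simple binary shape is rotated.

    ```python
    import numpy as np
    import mahotas

    # Zernike moment magnitudes are rotation-invariant shape descriptors;
    # paired with a neural network classifier they give the kind of
    # rotation/shift/scale-tolerant recognition the record describes.
    im = np.zeros((128, 128), dtype=np.uint8)
    im[40:90, 50:80] = 1                       # a simple binary shape

    feats = mahotas.features.zernike_moments(im, radius=60, degree=8)
    rotated = np.rot90(im)
    feats_rot = mahotas.features.zernike_moments(rotated, radius=60, degree=8)
    print(np.abs(feats - feats_rot).max())     # small: nearly invariant
    ```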

  20. Informational approach to the analysis of acoustic signals

    NASA Astrophysics Data System (ADS)

    Senkevich, Yuriy; Dyuk, Vyacheslav; Mishchenko, Mikhail; Solodchuk, Alexandra

    2017-10-01

    An informational approach to the processing of non-stationary signals is illustrated by the example of linguistic processing of the acoustic signals of a seismic event. A method for converting an acoustic signal into an information message by identifying repetitive self-similar patterns is described. Definitions of the event selection indicators in the symbolic recording of the acoustic signal are given. Results of processing an acoustic signal with a computer program implementing the linguistic data processing are shown, and the advantages and disadvantages of the software algorithms are indicated.
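
    A minimal sketch, under assumed details, of the symbolization idea: quantize the signal into a small alphabet and count recurring substrings as a crude stand-in for identifying repetitive self-similar patterns.

    ```python
    import numpy as np
    from collections import Counter

    def symbolize(signal, n_symbols=4):
        """Quantize a signal into a string over a small alphabet so that
        repeated self-similar patterns become repeated substrings."""
        x = np.asarray(signal, dtype=float)
        edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
        return "".join(chr(ord("a") + int(np.searchsorted(edges, v))) for v in x)

    def repeated_patterns(text, length=4):
        """Count recurring fixed-length substrings in the symbolic record."""
        counts = Counter(text[i:i + length] for i in range(len(text) - length + 1))
        return {pat: n for pat, n in counts.items() if n > 1}

    sig = np.sin(np.linspace(0, 20, 200)) + 0.1 * np.random.randn(200)
    print(repeated_patterns(symbolize(sig)))
    ```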

  1. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2000-01-01

    Hospital information systems have to support quality improvement objectives. The design issues of health care information systems can be classified into three categories: 1) time-oriented and event-labelled storage of patient data; 2) contextual support of decision-making; and 3) capabilities for modular upgrading. The elicitation of requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualize clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the field of blood transfusion. An object-oriented data model of a process has been defined in order to identify its main components: activity, sub-process, resources, constraints, guidelines, parameters and indicators. Although some aspects of activity, such as "where", "what else", and "why", are poorly represented by the data model alone, this method of requirement elicitation fits the dynamics of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for this approach to be generalised within the organisation, for the processes to be interrelated, and for their characteristics to be shared.
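
    A hedged sketch of the process data model's main components as Python dataclasses; the field names follow the abstract, while everything else (types, the transfusion example) is assumed for illustration.

    ```python
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Activity:
        """One activity in the process, with the components the record names."""
        name: str
        resources: List[str] = field(default_factory=list)
        constraints: List[str] = field(default_factory=list)
        guidelines: List[str] = field(default_factory=list)
        parameters: List[str] = field(default_factory=list)
        indicators: List[str] = field(default_factory=list)

    @dataclass
    class Process:
        name: str
        activities: List[Activity] = field(default_factory=list)
        sub_processes: List["Process"] = field(default_factory=list)

    # Hypothetical fragment of a blood transfusion process model
    transfusion = Process("blood_transfusion", activities=[
        Activity("issue_blood_unit",
                 resources=["blood_bank_system"],
                 indicators=["traceability_complete"]),
    ])
    ```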

  2. Invariance algorithms for processing NDE signals

    NASA Astrophysics Data System (ADS)

    Mandayam, Shreekanth; Udpa, Lalita; Udpa, Satish S.; Lord, William

    1996-11-01

    Signals that are obtained in a variety of nondestructive evaluation (NDE) processes capture information not only about the characteristics of the flaw, but also reflect variations in the specimen's material properties. Such signal changes may be viewed as anomalies that could obscure defect related information. An example of this situation occurs during in-line inspection of gas transmission pipelines. The magnetic flux leakage (MFL) method is used to conduct noninvasive measurements of the integrity of the pipe-wall. The MFL signals contain information both about the permeability of the pipe-wall and the dimensions of the flaw. Similar operational effects can be found in other NDE processes. This paper presents algorithms to render NDE signals invariant to selected test parameters, while retaining defect related information. Wavelet transform based neural network techniques are employed to develop the invariance algorithms. The invariance transformation is shown to be a necessary pre-processing step for subsequent defect characterization and visualization schemes. Results demonstrating the successful application of the method are presented.

  3. A Weld Position Recognition Method Based on Directional and Structured Light Information Fusion in Multi-Layer/Multi-Pass Welding.

    PubMed

    Zeng, Jinle; Chang, Baohua; Du, Dong; Wang, Li; Chang, Shuhe; Peng, Guodong; Wang, Wenzhu

    2018-01-05

    Multi-layer/multi-pass welding (MLMPW) technology is widely used in the energy industry to join thick components. During automatic welding using robots or other actuators, it is very important to recognize the actual weld pass position using visual methods, which can then be used not only to perform reasonable path planning for actuators, but also to correct any deviations between the welding torch and the weld pass position in real time. However, due to the small geometrical differences between adjacent weld passes, existing weld position recognition technologies such as structured light methods are not suitable for weld position detection in MLMPW. This paper proposes a novel method for weld position detection, which fuses various kinds of information in MLMPW. First, a synchronous acquisition method is developed to obtain various kinds of visual information when directional light and structured light sources are on, respectively. Then, interferences are eliminated by fusing adjacent images. Finally, the information from directional and structured light images is fused to obtain the 3D positions of the weld passes. Experiment results show that each process can be done in 30 ms and the deviation is less than 0.6 mm. The proposed method can be used for automatic path planning and seam tracking in the robotic MLMPW process as well as electron beam freeform fabrication process.

  4. Technology and Microcomputers for an Information Centre/Special Library.

    ERIC Educational Resources Information Center

    Daehn, Ralph M.

    1984-01-01

    Discusses use of microcomputer hardware and software, telecommunications methods, and advanced library methods to create a specialized information center's database of literature relating to farm machinery and food processing. Systems and services (electronic messaging, serials control, database creation, cataloging, collections, circulation,…

  5. Research on Technology Innovation Management in Big Data Environment

    NASA Astrophysics Data System (ADS)

    Ma, Yanhong

    2018-02-01

    With the continuous development and progress of the information age, the demand for information is growing, and the processing and analysis of information data are moving toward ever larger scale. The increasing volume of information data places higher demands on processing technology. The explosive growth of information data in current society has prompted the advent of the era of big data. At present, producing and processing various kinds of information and data carries more value and significance in people's lives. How to use big data technology to process and analyze information data quickly, so as to improve the level of big data management, is an important stage in promoting the current development of information and data processing technology in our country. To some extent, innovative research on the management methods of information technology in the era of big data can enhance our overall strength and put China in an invincible position in the development of the big data era.

  6. Signal processing methods for MFE plasma diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Candy, J.V.; Casper, T.; Kane, R.

    1985-02-01

    The application of various signal processing methods to extract energy storage information from plasma diamagnetism sensors during physics experiments on the Tandem Mirror Experiment-Upgrade (TMX-U) is discussed. We show how these processing techniques can be used to decrease the uncertainty in the corresponding sensor measurements. The algorithms suggested are implemented using SIG, an interactive signal processing package developed at LLNL.

  7. A defocus-information-free autostereoscopic three-dimensional (3D) digital reconstruction method using direct extraction of disparity information (DEDI)

    NASA Astrophysics Data System (ADS)

    Li, Da; Cheung, Chifai; Zhao, Xing; Ren, Mingjun; Zhang, Juan; Zhou, Liqiu

    2016-10-01

    Autostereoscopy-based three-dimensional (3D) digital reconstruction has been widely applied in the fields of medical science, entertainment, design, industrial manufacture, precision measurement and many other areas. The 3D digital model of the target can be reconstructed from the series of two-dimensional (2D) information acquired by the autostereoscopic system, which consists of multiple lenses and can provide information about the target from multiple angles. This paper presents a generalized and precise autostereoscopic 3D digital reconstruction method based on Direct Extraction of Disparity Information (DEDI), which can be applied to any autostereoscopic system and provides accurate 3D reconstruction results through an error elimination process based on statistical analysis. The feasibility of the DEDI method has been successfully verified through a series of optical 3D digital reconstruction experiments on different autostereoscopic systems; the method is highly efficient in performing direct full 3D digital model construction based on a tomography-like operation upon every depth plane, with the exclusion of defocused information. With the absolutely focused information processed by the DEDI method, the 3D digital model of the target can be directly and precisely formed along the axial direction with the depth information.

  8. Method and system for knowledge discovery using non-linear statistical analysis and a 1st and 2nd tier computer program

    DOEpatents

    Hively, Lee M [Philadelphia, TN

    2011-07-12

    The invention relates to a method and apparatus for simultaneously processing different sources of test data into informational data and then processing different categories of informational data into knowledge-based data. The knowledge-based data can then be communicated between nodes in a system of multiple computers according to rules for a type of complex, hierarchical computer system modeled on a human brain.

  9. Postprocessing for character recognition using pattern features and linguistic information

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Takatoshi; Okamoto, Masayosi; Horii, Hiroshi

    1993-04-01

    We propose a new method of post-processing for character recognition using pattern features and linguistic information. The method corrects errors in the recognition of handwritten Japanese sentences containing kanji characters, and is characterized by two stages of character recognition. Improving the character recognition rate for Japanese is made difficult by the large number of characters and the existence of characters with similar patterns; it is therefore not practical for a character recognition system to recognize all characters in detail. First, this post-processing method generates a candidate character table by recognizing the simplest features of characters. It then selects words corresponding to the characters by referring to word and grammar dictionaries before selecting suitable words. If the correct character is included in the candidate character table, this process can correct an error; if the character is not included, it cannot. The method can, however, presume a character that does not appear in the candidate character table by using linguistic information (the word and grammar dictionaries), and can then verify the presumed character by character recognition using complex features. When this method is applied to an online character recognition system, the accuracy of character recognition improves from 93.5% to 94.7%. This proved to be the case when it was used on editorials from a Japanese newspaper (Asahi Shimbun).
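
    A minimal sketch of the first, linguistic stage, with an invented two-word lexicon: candidate characters per position are combined and checked against the word dictionary; failure would trigger the detailed second-stage recognition.

    ```python
    from itertools import product

    # Invented two-word lexicon standing in for the word dictionary.
    WORDS = {"日本", "日記"}

    def correct(candidates):
        """candidates: one list of candidate characters per text position,
        produced by the coarse (simple-feature) recognition stage."""
        for chars in product(*candidates):
            word = "".join(chars)
            if word in WORDS:
                return word      # linguistic information resolves the ambiguity
        return None              # would trigger detailed re-recognition

    # '日' was confused with '目' and '木'; the lexicon recovers the word.
    print(correct([["目", "日", "木"], ["本", "体"]]))   # -> 日本
    ```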

  10. Method for modeling social care processes for national information exchange.

    PubMed

    Miettinen, Aki; Mykkänen, Juha; Laaksonen, Maarit

    2012-01-01

    Finnish social services include 21 service commissions of social welfare, including Adoption counselling, Income support, Child welfare, Services for immigrants and Substance abuse care. This paper describes the method used for process modeling in the National Project for IT in Social Services in Finland (Tikesos). The process modeling in the project aimed to support common national target-state processes from the perspective of the national electronic archive, increased interoperability between systems, and electronic client documents. The process steps and other aspects of the method are presented. The method was developed, used and refined during the three years of process modeling in the national project.

  11. Risk Informed Assessment of Regulatory and Design Requirements for Future Nuclear Power Plants - Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ritterbusch, Stanley; Golay, Michael; Duran, Felicia

    2003-01-29

    Summary of methods proposed for risk-informing the design and regulation of future nuclear power plants. All elements of the historical design and regulation process are preserved, but the methods proposed for new plants use probabilistic risk assessment methods as the primary decision-making tool.

  12. Scaling of ratings: Concepts and methods

    Treesearch

    Thomas C. Brown; Terry C. Daniel

    1990-01-01

    Rating scales provide an efficient and widely used means of recording judgments. This paper reviews scaling issues within the context of a psychometric model of the rating process, describes several methods of scaling rating data, and compares the methods in terms of the assumptions they require about the rating process and the information they provide about the...

  13. A study on building data warehouse of hospital information system.

    PubMed

    Li, Ping; Wu, Tao; Chen, Mu; Zhou, Bin; Xu, Wei-guo

    2011-08-01

    Existing hospital information systems with simple statistical functions cannot meet current management needs. It is well known that hospital resources are distributed with private property rights among hospitals, as in the case of regional coordination of medical services. In this study, to integrate medical data effectively and make full use of them, we propose a data warehouse modeling method for the hospital information system. The method can also be employed for a distributed-hospital medical service system. To ensure that hospital information supports the diverse needs of health care, the framework of the hospital information system has three layers: a datacenter layer, a system-function layer, and a user-interface layer. This paper discusses the role of a data warehouse management system in handling hospital information, from the establishment of the data theme, to the design of a data model, to the establishment of the data warehouse. Online analytical processing tools assist user-friendly multidimensional analysis from a number of different angles to extract the required data and information. Use of the data warehouse improves online analytical processing and mitigates deficiencies in the decision support system. A hospital information system based on a data warehouse can effectively employ statistical analysis and data mining technology to handle massive quantities of historical data, and can summarize clinical and hospital information for decision making. This paper proposes the use of a data warehouse for a hospital information system, specifically a data warehouse organized around hospital information themes, covering theme determination, modeling, and so on. The processing of patient information is given as an example that demonstrates the usefulness of this method for hospital information management. Data warehouse technology is an evolving technology, and further research is required on decision support information extracted with data mining and decision-making technology.
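
    As an illustration of the OLAP-style multidimensional analysis mentioned, the sketch below pivots invented patient counts by department and month with pandas, mimicking one slice of a warehouse cube.

    ```python
    import pandas as pd

    # Illustrative fact records; a pivot on department x month mimics
    # slicing a data warehouse cube along two dimensions.
    df = pd.DataFrame({
        "department": ["surgery", "surgery", "medicine", "medicine"],
        "month":      ["2011-01", "2011-02", "2011-01", "2011-02"],
        "patients":   [120, 135, 210, 198],
    })

    cube = pd.pivot_table(df, values="patients", index="department",
                          columns="month", aggfunc="sum")
    print(cube)
    ```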

  14. Cognitive Processes Associated with Child Neglect

    ERIC Educational Resources Information Center

    Hildyard, Kathryn; Wolfe, David

    2007-01-01

    Objective: To compare neglectful and non-neglectful mothers on information processing tasks related to child emotions, behaviors, the caregiving relationship, and recall of child-related information. Method: A natural group design was used. Neglectful mothers (N = 34) were chosen from active, chronic caseloads; non-neglectful comparison mothers (N…

  15. 40 CFR 721.3085 - Brominated phthalate ester.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... information, and any information on methods for protecting against such risk, into a MSDS as described in... Section 721.3085 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES... substance is any manner or method of manufacture, import, or processing associated with any use of these...

  16. 40 CFR 721.3815 - Furan, 2-(ethoxymethyl)- tetrahydro-.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... new information, and any information on methods for protecting against such risk, into a Material...-. 721.3815 Section 721.3815 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC... substance is any manner or method of manufacture, import, or processing associated with any use of this...

  17. 40 CFR 721.4620 - Dialkylamino alkanoate metal salt.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... this new information, and any information on methods for protecting against such risk, into a Material....4620 Section 721.4620 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC... significant new use of this substance is any manner or method of manufacture, import, or processing associated...

  18. 40 CFR 721.4600 - Recovered metal hydroxide.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... this new information, and any information on methods for protecting against such risk, into an MSDS as... Section 721.4600 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES... new use of this substance is any manner or method of manufacture, import, or processing associated...

  19. 40 CFR 721.6060 - Alkylaryl substituted phosphite.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... this new information, and any information on methods for protecting against such risk, into an MSDS as....6060 Section 721.6060 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC... significant new use of this substance is any manner or method of manufacture, import, or processing associated...

  20. Long-term care information systems: an overview of the selection process.

    PubMed

    Nahm, Eun-Shim; Mills, Mary Etta; Feege, Barbara

    2006-06-01

    Under the current Medicare Prospective Payment System method and the ever-changing managed care environment, the long-term care information system is vital to providing quality care and to surviving in business. The system selection process should be an interdisciplinary effort involving all necessary stakeholders for the proposed system. The selection process can be modeled following the Systems Development Life Cycle: identifying problems, opportunities, and objectives; determining information requirements; analyzing system needs; designing the recommended system; and developing and documenting software.

  1. Information processing using a single dynamical node as complex system

    PubMed Central

    Appeltant, L.; Soriano, M.C.; Van der Sande, G.; Danckaert, J.; Massar, S.; Dambre, J.; Schrauwen, B.; Mirasso, C.R.; Fischer, I.

    2011-01-01

    Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can efficiently perform computation. Here we introduce a novel architecture that reduces the usually required large number of elements to a single nonlinear node with delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance in a speech recognition benchmark. Complementary numerical studies also show excellent performance for a time series prediction benchmark. These results prove that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing. PMID:21915110
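
    A minimal sketch of the delay-based reservoir idea, with simplified discrete-time dynamics and assumed parameters: a single tanh node is time-multiplexed over virtual nodes via a random input mask, and the collected states would feed a linear readout (not shown).

    ```python
    import numpy as np

    def delay_reservoir(inputs, n_virtual=50, alpha=0.5, beta=0.5, seed=0):
        """Single nonlinear node with delayed feedback: each input sample is
        multiplexed over `n_virtual` virtual nodes along the delay line."""
        rng = np.random.default_rng(seed)
        mask = rng.choice([-1.0, 1.0], size=n_virtual)
        x = np.zeros(n_virtual)          # states along the delay line
        states = []
        for u in inputs:
            for i in range(n_virtual):
                # one tanh node driven by the masked input plus the state
                # of the same virtual node one delay-loop earlier
                x[i] = np.tanh(alpha * x[i] + beta * mask[i] * u)
            states.append(x.copy())
        return np.array(states)          # features for a linear readout

    states = delay_reservoir(np.sin(np.linspace(0, 8 * np.pi, 200)))
    print(states.shape)                  # (200, 50)
    ```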

  2. AOD furnace splash soft-sensor in the smelting process based on improved BP neural network

    NASA Astrophysics Data System (ADS)

    Ma, Haitao; Wang, Shanshan; Wu, Libin; Yu, Ying

    2017-11-01

    Taking splash during the smelting process of argon-oxygen-refined low-carbon ferrochrome as the research object, and based on an analysis of the splash mechanism in the smelting process, this paper proposes a soft-sensing method using multi-sensor information fusion and BP neural network modeling. The vibration signal, the audio signal and the in-furnace flame image signal are used as the characteristic signals of splash; they are fused and modeled, and the splash signal is reconstructed, realizing soft measurement of splash in the smelting process. The simulation results show that the method can accurately forecast the splash type during smelting, providing a new measurement method for splash forecasting and more accurate information for splash control.
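
    A hedged sketch of feature-level fusion feeding a backpropagation network, using sklearn's MLPClassifier on synthetic stand-ins for the vibration, audio and flame-image features; the real system's features and labels are not given in the abstract.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    # Synthetic feature windows for each sensor channel (dimensions invented)
    rng = np.random.default_rng(0)
    vib = rng.normal(size=(200, 8))
    audio = rng.normal(size=(200, 6))
    flame = rng.normal(size=(200, 4))

    X = np.hstack([vib, audio, flame])       # feature-level fusion
    y = rng.integers(0, 3, size=200)         # dummy splash-type labels

    # BP network = multilayer perceptron trained by backpropagation
    model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500)
    model.fit(X, y)
    print(model.predict(X[:5]))
    ```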

  3. Image interpolation and denoising for division of focal plane sensors using Gaussian processes.

    PubMed

    Gilboa, Elad; Cunningham, John P; Nehorai, Arye; Gruev, Viktor

    2014-06-16

    Image interpolation and denoising are important techniques in image processing. These methods are inherent to digital image acquisition, as most digital cameras are composed of a 2D grid of heterogeneous imaging sensors. Current polarization imaging sensors employ four different pixelated polarization filters, commonly referred to as division-of-focal-plane polarization sensors. The sensors capture only partial information of the true scene, leading to a loss of spatial resolution as well as inaccuracy of the captured polarization information. Interpolation is a standard technique to recover the missing information and increase the accuracy of the captured polarization information. Here we focus specifically on Gaussian process regression as a way to perform statistical image interpolation, where estimates of sensor noise are used to improve the accuracy of the estimated pixel information. We further exploit the inherent grid structure of this data to create a fast exact algorithm that operates in O(N^(3/2)) (vs. the naive O(N^3)), thus making the Gaussian process method computationally tractable for image data. This modeling advance and the enabling computational advance combine to produce significant improvements over previously published interpolation methods for polarimeters, which is most pronounced in cases of low signal-to-noise ratio (SNR). We provide the comprehensive mathematical model as well as experimental results of the GP interpolation performance for division-of-focal-plane polarimeters.
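
    A 1D toy sketch of Gaussian process interpolation with a noise-aware fit, using sklearn; note this generic solver does not reproduce the paper's O(N^(3/2)) structured algorithm, and all values are illustrative.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    # Each pixel of a division-of-focal-plane sensor sees only one filter,
    # so the other orientations must be estimated at that location.
    x_obs = np.arange(0, 32, 2).reshape(-1, 1)       # pixels with this filter
    y_obs = np.sin(x_obs.ravel() / 5.0) + 0.05 * np.random.randn(len(x_obs))

    # alpha encodes the sensor noise estimate used to regularize the fit
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=3.0), alpha=0.05**2)
    gp.fit(x_obs, y_obs)

    x_all = np.arange(32).reshape(-1, 1)             # every pixel location
    y_mean, y_std = gp.predict(x_all, return_std=True)  # estimate + uncertainty
    ```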

  4. Using the theory of planned behavior to determine factors influencing processed foods consumption behavior

    PubMed Central

    Kim, Og Yeon; Shim, Soonmi

    2014-01-01

    BACKGROUND/OBJECTIVES The purpose of this study is to identify how the level of information affected intention, using the Theory of Planned Behavior. SUBJECTS/METHODS The study surveyed respondents in diverse community centers and shopping malls in Seoul, which yielded N = 209 datasets. To compare processed foods consumption behavior, we divided the sample into two groups based on the level of information about food additives (whether respondents felt that information on food additives was sufficient or not). We analyzed differences in attitudes toward food additives and toward purchasing processed foods, subjective norms, perceived behavioral control, and behavioral intentions toward processed foods between the sufficient-information group and the insufficient-information group. RESULTS The results confirmed that more than 78% of respondents thought information on food additives was insufficient. However, the group who felt information was sufficient had more positive attitudes about consuming processed foods and stronger behavioral intentions than the group who thought information was inadequate. This study found that people who consider themselves to have sufficient information on food additives tend to have more positive attitudes toward processed foods and stronger intentions to consume them. CONCLUSIONS This study suggests an increasing need for nutrition education on the appropriate use of processed foods. Designing useful nutrition education requires a good understanding of the factors which influence processed foods consumption. PMID:24944779

  5. The eyes have it: Using eye tracking to inform information processing strategies in multi-attributes choices.

    PubMed

    Ryan, Mandy; Krucien, Nicolas; Hermens, Frouke

    2018-04-01

    Although choice experiments (CEs) are widely applied in economics to study choice behaviour, understanding of how individuals process attribute information remains limited. We show how eye-tracking methods can provide insight into how decisions are made. Participants completed a CE, while their eye movements were recorded. Results show that although the information presented guided participants' decisions, there were also several processing biases at work. Evidence was found of (a) top-to-bottom, (b) left-to-right, and (c) first-to-last order biases. Experimental factors-whether attributes are defined as "best" or "worst," choice task complexity, and attribute ordering-also influence information processing. How individuals visually process attribute information was shown to be related to their choices. Implications for the design and analysis of CEs and future research are discussed. Copyright © 2017 John Wiley & Sons, Ltd.

  6. Returning to Roots: On Social Information Processing and Moral Development

    ERIC Educational Resources Information Center

    Dodge, Kenneth A.; Rabiner, David L.

    2004-01-01

    Social information processing theory has been posited as a description of how mental operations affect behavioral responding in social situations. Arsenio and Lemerise (this issue) proposed that consideration of concepts and methods from moral domain models could enhance this description. This paper agrees with their proposition, although it…

  7. The Relationship between Environmental Turbulence, Management Support, Organizational Collaboration, Information Technology Solution Realization, and Process Performance, in Healthcare Provider Organizations

    ERIC Educational Resources Information Center

    Muglia, Victor O.

    2010-01-01

    The Problem: The purpose of this study was to investigate relationships between environmental turbulence, management support, organizational collaboration, information technology solution realization, and process performance in healthcare provider organizations. Method: A descriptive/correlational study of Hospital medical services process…

  8. 40 CFR 98.176 - Data reporting requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., you must report the following information for each process: (1) The carbon content of each process input and output used to determine CO2 emissions. (2) Whether the carbon content was determined from information from the supplier or by laboratory analysis, and if by laboratory analysis, the method used. (3...

  9. An information transfer based novel framework for fault root cause tracing of complex electromechanical systems in the processing industry

    NASA Astrophysics Data System (ADS)

    Wang, Rongxi; Gao, Xu; Gao, Jianmin; Gao, Zhiyong; Kang, Jiani

    2018-02-01

    As one of the most important approaches for analyzing the mechanism of fault pervasion, fault root cause tracing is a powerful and useful tool for detecting the fundamental causes of faults so as to prevent any further propagation and amplification. Focused on the problems arising from the lack of systematic and comprehensive integration, an information transfer-based, data-driven framework for fault root cause tracing of complex electromechanical systems in the processing industry was proposed, taking into consideration the experience and qualitative analysis of conventional fault root cause tracing methods. Firstly, an improved symbolic transfer entropy method was presented to construct a directed-weighted information model for a specific complex electromechanical system based on the information flow. Secondly, considering the feedback mechanisms in complex electromechanical systems, a method for determining the threshold values of weights was developed to explore the patterns of fault propagation. Lastly, an iterative method was introduced to identify the fault development process. The fault root cause was traced by analyzing the changes in information transfer between the nodes along the fault propagation pathway. An actual fault root cause tracing application on a complex electromechanical system is used to verify the effectiveness of the proposed framework. A unique fault root cause is obtained regardless of the choice of the initial variable. Thus, the proposed framework can be flexibly and effectively used in fault root cause tracing for complex electromechanical systems in the processing industry, forming the foundation of system vulnerability analysis and condition prediction, as well as other engineering applications.
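
    The abstract does not spell out the authors' improved estimator; as a rough sketch of the underlying quantity, the following computes a plain symbolic transfer entropy T(Y→X) by mapping windows of each series to ordinal patterns and applying the plug-in estimator (the window order and test signals are assumptions for illustration):

    ```python
    import numpy as np
    from collections import Counter

    def symbolize(x, order=3):
        """Map each length-`order` window to its ordinal-pattern symbol."""
        windows = np.lib.stride_tricks.sliding_window_view(x, order)
        ranks = np.argsort(windows, axis=1)
        # encode each rank permutation as a single base-`order` integer
        return (ranks * (order ** np.arange(order))).sum(axis=1)

    def symbolic_transfer_entropy(x, y, order=3):
        """Estimate T_{Y->X} in bits from ordinal symbols (plug-in estimator)."""
        sx, sy = symbolize(x, order), symbolize(y, order)
        x_next, x_now, y_now = sx[1:], sx[:-1], sy[:-1]
        n = len(x_next)
        p_xxy = Counter(zip(x_next, x_now, y_now))
        p_xy = Counter(zip(x_now, y_now))
        p_xx = Counter(zip(x_next, x_now))
        p_x = Counter(x_now)
        te = 0.0
        for (xn, xc, yc), c in p_xxy.items():
            p_joint = c / n                        # p(x_{t+1}, x_t, y_t)
            cond_full = c / p_xy[(xc, yc)]         # p(x_{t+1} | x_t, y_t)
            cond_marg = p_xx[(xn, xc)] / p_x[xc]   # p(x_{t+1} | x_t)
            te += p_joint * np.log2(cond_full / cond_marg)
        return te

    rng = np.random.default_rng(1)
    y = rng.normal(size=5000)
    x = np.roll(y, 1) + 0.5 * rng.normal(size=5000)  # x driven by lagged y
    # T(Y->X) should exceed T(X->Y), pointing at y as the upstream variable
    print(symbolic_transfer_entropy(x, y), symbolic_transfer_entropy(y, x))
    ```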

  10. SPECIAL ISSUE ON OPTICAL PROCESSING OF INFORMATION: Information transmission systems based on two-mode lasers with controlled emission frequencies

    NASA Astrophysics Data System (ADS)

    Naumov, N. V.; Petrovskii, V. N.; Protsenko, E. D.; Shananin, R. A.

    1995-10-01

    Various information transmission systems, based on two-mode lasers with controlled emission frequencies, are proposed. It is suggested that these systems can be implemented by modulation of the intermode spacing of a two-mode laser. An experimental investigation is reported of frequency control methods. It is shown that these methods should make it possible to construct information transmission systems with high transmission rates subject to weak nonlinear distortions of the information-carrying signal.

  11. Petri net-based method for the analysis of the dynamics of signal propagation in signaling pathways.

    PubMed

    Hardy, Simon; Robillard, Pierre N

    2008-01-15

    Cellular signaling networks are dynamic systems that propagate and process information and, ultimately, cause phenotypical responses. Understanding the circuitry of the information flow in cells is one of the keys to understanding complex cellular processes. The development of computational quantitative models is a promising avenue for attaining this goal. Not only does the analysis of simulation data based on the concentration variations of biological compounds yield information about systemic state changes, but it is also very helpful for obtaining information about the dynamics of signal propagation. This article introduces a new method for analyzing the dynamics of signal propagation in signaling pathways using Petri net theory. The method is demonstrated with the Ca(2+)/calmodulin-dependent protein kinase II (CaMKII) regulation network. The results constitute temporal information about signal propagation in the network, a simplified graphical representation of the network and of the signal propagation dynamics, and a characterization of some signaling routes as regulation motifs.
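
    The CaMKII model itself is beyond a short example, but the Petri net machinery the method builds on is compact: places hold tokens, and a transition fires when all of its input places are marked. A minimal sketch with an invented three-place signalling chain:

    ```python
    # Minimal Petri net simulator: a hypothetical signalling chain
    # receptor -> kinase -> response, with places connected by transitions.
    from dataclasses import dataclass, field

    @dataclass
    class PetriNet:
        marking: dict                                      # tokens per place
        transitions: list = field(default_factory=list)    # (inputs, outputs)

        def enabled(self, t):
            ins, _ = self.transitions[t]
            return all(self.marking[p] > 0 for p in ins)

        def fire(self, t):
            ins, outs = self.transitions[t]
            for p in ins:
                self.marking[p] -= 1
            for p in outs:
                self.marking[p] += 1

    net = PetriNet(
        marking={"receptor*": 1, "kinase": 0, "response": 0},
        transitions=[(["receptor*"], ["kinase"]),   # activation step
                     (["kinase"], ["response"])],   # downstream signal
    )

    step = 0
    while True:
        fireable = [t for t in range(len(net.transitions)) if net.enabled(t)]
        if not fireable:
            break
        net.fire(fireable[0])
        step += 1
        print(f"step {step}: {net.marking}")  # traces signal propagation in time
    ```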

  12. Information theoretic analysis of edge detection in visual communication

    NASA Astrophysics Data System (ADS)

    Jiang, Bo; Rahman, Zia-ur

    2010-08-01

    Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the artifacts introduced by the image gathering process. However, experiments show that the image gathering process profoundly impacts the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, where the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. In this paper, we perform an end-to-end information-theory-based system analysis to assess edge detection methods. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and the parameters, such as sampling and additive noise, that define the image gathering system. An edge detection algorithm is regarded as having high performance only if the information rate from the scene to the edge approaches the maximum possible. This goal can be achieved only by jointly optimizing all processes. People generally use subjective judgment to compare different edge detection methods; there has been no common tool for evaluating the performance of the different algorithms and for guiding the selection of the best algorithm for a given system or scene. Our information-theoretic assessment provides this new tool, which allows us to compare different edge detection operators in a common environment.
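
    A hedged sketch of the assessment idea (a generic plug-in estimator, not Huck et al.'s full channel model): treat the scene edge map and the detector output as random variables and compare their mutual information against the scene's edge entropy:

    ```python
    import numpy as np

    def mutual_information(a, b):
        """Plug-in mutual information (bits) between two binary edge maps."""
        joint = np.histogram2d(a.ravel(), b.ravel(), bins=2)[0]
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        mask = pxy > 0
        return (pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum()

    rng = np.random.default_rng(2)
    scene_edges = rng.random((128, 128)) < 0.1                        # "true" edges
    noisy_detection = scene_edges ^ (rng.random((128, 128)) < 0.05)   # detector output

    # Compare against the scene edge entropy H: the closer I(scene; detection)
    # is to H, the better the detector preserves scene information.
    p = scene_edges.mean()
    H = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
    print(f"I = {mutual_information(scene_edges, noisy_detection):.3f} bits, H = {H:.3f} bits")
    ```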

  13. Remote Sensing: A valuable tool in the Forest Service decision making process. [in Utah

    NASA Technical Reports Server (NTRS)

    Stanton, F. L.

    1975-01-01

    Forest Service studies on integrating remotely sensed data into existing information systems highlight a need to: (1) re-examine present methods of collecting and organizing data, (2) develop an integrated information system for rapidly processing and interpreting data, (3) apply existing technological tools in new ways, and (4) provide accurate and timely information for making the right management decisions. The Forest Service developed an integrated information system using remote sensors, microdensitometers, computer hardware and software, and interactive accessories. These efforts substantially reduce the time needed to collect and process data.

  14. 40 CFR 721.5450 - α-Olefin sulfonate, sodium salt.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... this new information, and any information on methods for protecting against such risk, into an MSDS as....5450 Section 721.5450 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC... significant new use of this substance is any manner or method of manufacture, import, or processing associated...

  15. Information Design Theories

    ERIC Educational Resources Information Center

    Pettersson, Rune

    2014-01-01

    Information design has practical and theoretical components. As an academic discipline we may view information design as a combined discipline, a practical theory, or as a theoretical practice. So far information design has incorporated facts, influences, methods, practices, principles, processes, strategies, and tools from a large number of…

  16. [Practice report: the process-based indicator dashboard. Visualising quality assurance results in standardised processes].

    PubMed

    Petzold, Thomas; Hertzschuch, Diana; Elchlep, Frank; Eberlein-Gonska, Maria

    2014-01-01

    Process management (PM) is a valuable method for the systematic analysis and structural optimisation of the quality and safety of clinical treatment. PM requires high motivation and willingness to implement changes from both employees and management. The definition of quality indicators is required to systematically measure the quality of the specified processes. One way to represent comparable quality results is the use of quality indicators from the external quality assurance in accordance with Sect. 137 SGB V, a method which the Federal Joint Committee (GBA) and the institutions commissioned by the GBA have employed and consistently enhanced for more than ten years. Information on the quality of inpatient treatment is available for 30 defined subjects throughout Germany. The combination of specified processes with quality indicators is beneficial for informing employees. A process-based indicator dashboard provides essential information about the treatment process, which can be used for process analysis. Through continuous review of these indicator results, values can be tracked and errors remedied quickly. If due consideration is given to these indicators, they can be used for benchmarking to identify potential process improvements. Copyright © 2014. Published by Elsevier GmbH.

  17. A methodology proposal for collaborative business process elaboration using a model-driven approach

    NASA Astrophysics Data System (ADS)

    Mu, Wenxin; Bénaben, Frédérick; Pingaud, Hervé

    2015-05-01

    Business process management (BPM) principles are commonly used to improve processes within an organisation. But they can equally be applied to supporting the design of an Information System (IS). In a collaborative situation involving several partners, this type of BPM approach may be useful to support the design of a Mediation Information System (MIS), which would ensure interoperability between the partners' ISs (which are assumed to be service oriented). To achieve this objective, the first main task is to build a collaborative business process cartography. The aim of this article is to present a method for bringing together collaborative information and elaborating collaborative business processes from the information gathered (by using a collaborative situation framework, an organisational model, an informational model, a functional model and a metamodel and by using model transformation rules).

  18. MANTECH project book

    NASA Astrophysics Data System (ADS)

    The effective integration of processes, systems, and procedures used in the production of aerospace systems using computer technology is managed by the Integration Technology Division (MTI). Under its auspices are the Information Management Branch, which is actively involved with information management, information sciences and integration, and the Implementation Branch, whose technology areas include computer integrated manufacturing, engineering design, operations research, and material handling and assembly. The Integration Technology Division combines design, manufacturing, and supportability functions within the same organization. The Processing and Fabrication Division manages programs to improve structural and nonstructural materials processing and fabrication. Within this division, the Metals Branch directs the manufacturing methods program for metals and metal matrix composites processing and fabrication. The Nonmetals Branch directs the manufacturing methods programs, which include all manufacturing processes for producing and utilizing propellants, plastics, resins, fibers, composites, fluid elastomers, ceramics, glasses, and coatings. The objective of the Industrial Base Analysis Division is to act as focal point for the USAF industrial base program for productivity, responsiveness, and preparedness planning.

  19. Extracting important information from Chinese Operation Notes with natural language processing methods.

    PubMed

    Wang, Hui; Zhang, Weide; Zeng, Qiang; Li, Zuofeng; Feng, Kaiyan; Liu, Lei

    2014-04-01

    Extracting information from unstructured clinical narratives is valuable for many clinical applications. Although natural language processing (NLP) methods have been studied extensively for electronic medical records (EMR), few studies have explored NLP for extracting information from Chinese clinical narratives. In this study, we report the development and evaluation of extracting tumor-related information from operation notes of hepatic carcinomas written in Chinese. Using 86 operation notes manually annotated by physicians as the training set, we explored both rule-based and supervised machine-learning approaches. Evaluated on 29 unseen operation notes, our best approach yielded 69.6% precision, 58.3% recall and a 63.5% F-score. Copyright © 2014 Elsevier Inc. All rights reserved.
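
    As a toy illustration of the rule-based side of such a pipeline, two regular expressions can pull tumor location and size out of a narrative sentence; the patterns and the example sentence are invented, not the paper's actual rules:

    ```python
    import re

    # Hypothetical rule-based extraction patterns; the paper's actual rules and
    # its machine-learning models for Chinese operation notes are not reproduced.
    note = "肝右叶可见肿瘤一枚，大小约3.5cm x 2.0cm。"  # "tumor in right hepatic lobe, ~3.5 x 2.0 cm"

    size = re.search(r"(\d+(?:\.\d+)?)\s*cm\s*[x×]\s*(\d+(?:\.\d+)?)\s*cm", note)
    location = re.search(r"肝[左右]叶", note)

    print("location:", location.group(0) if location else None)
    print("size:", size.groups() if size else None)
    ```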

  20. Rapid review programs to support health care and policy decision making: a descriptive analysis of processes and methods.

    PubMed

    Polisena, Julie; Garritty, Chantelle; Kamel, Chris; Stevens, Adrienne; Abou-Setta, Ahmed M

    2015-03-14

    Health care decision makers often need to make decisions in limited timeframes and cannot await the completion of a full evidence review. Rapid reviews (RRs), utilizing streamlined systematic review methods, are increasingly being used to synthesize the evidence with a shorter turnaround time. Our primary objective was to describe the processes and methods used internationally to produce RRs. In addition, we sought to understand the underlying themes associated with these programs. We contacted representatives of international RR programs from a broad range of health care settings to gather information about the methods and processes used to produce RRs. The responses were summarized narratively to understand the characteristics associated with their processes and methods. The summaries were compared and contrasted to highlight potential themes and trends related to the different RR programs. Twenty-nine international RR programs were included in our sample, with broad organizational representation from academia, government, research institutions, and not-for-profit organizations. Responses revealed that the main objectives for RRs were to inform decision making with regard to funding health care technologies, services and policy, and program development. Central themes that influenced the methods used by RR programs, and report type and dissemination, were the imposed turnaround time to complete a report, the resources available, the complexity and sensitivity of the research topics, and permission from the requestor. Our study confirmed that there is no standard approach to conducting RRs. Differences in processes and methods across programs may be the result of the novelty of RR methods versus other types of evidence syntheses, the customization of RRs for various decision makers, and each organization's definition of 'rapid', since it impacts both the timelines and the evidence synthesis methods. Future research should investigate the impact of current RR methods and reporting in supporting informed health care decision making, the effects of potential biases that may be introduced with streamlined methods, and the effectiveness of RR reporting guidelines on transparency.

  1. Engineering Review Information System

    NASA Technical Reports Server (NTRS)

    Grems, III, Edward G. (Inventor); Henze, James E. (Inventor); Bixby, Jonathan A. (Inventor); Roberts, Mark (Inventor); Mann, Thomas (Inventor)

    2015-01-01

    A disciplinal engineering review computer information system and method are provided: defining a database of disciplinal engineering review process entities for an enterprise engineering program, opening a computer-supported engineering item based upon the defined disciplinal engineering review process entities, managing a review of the opened engineering item according to the defined disciplinal engineering review process entities, and closing the opened engineering item according to the engineering item review.

  2. Information Filtering via Heterogeneous Diffusion in Online Bipartite Networks

    PubMed Central

    Zhang, Fu-Guo; Zeng, An

    2015-01-01

    The rapid expansion of the Internet brings us overwhelming online information, more than any individual can go through. Therefore, recommender systems were created to help people dig through this abundance of information. In networks composed of users and objects, recommender algorithms based on diffusion have been proven to be among the best performing methods. Previous works considered the diffusion process from user to object and from object to user to be equivalent. We show in this work that this is not the case, and we improve the quality of the recommendation by taking into account the asymmetrical nature of this process. We apply this idea to modify state-of-the-art recommendation methods. The simulation results show that the new methods can outperform the existing methods in both recommendation accuracy and diversity. Finally, this modification is checked to be able to improve the recommendation in a realistic case. PMID:26125631
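
    A minimal sketch of the underlying mass-diffusion scoring on a user-object bipartite network; the asymmetry the paper studies is mimicked here by an assumed exponent that weights the two diffusion steps differently (an illustration, not the authors' exact formulation):

    ```python
    import numpy as np

    def diffusion_scores(A, user, theta=0.5):
        """Two-step diffusion on a user-object bipartite adjacency matrix A
        (users x objects). theta tunes the asymmetry between the object->user
        and user->object steps (an assumed illustrative parameter)."""
        k_user = A.sum(axis=1)              # user degrees
        k_obj = A.sum(axis=0)               # object degrees
        f0 = A[user].astype(float)          # unit resource on collected objects
        # step 1: objects -> users, normalised by object degree (power theta)
        u = A @ (f0 / np.maximum(k_obj, 1) ** theta)
        # step 2: users -> objects, normalised by user degree (power 1-theta)
        f1 = A.T @ (u / np.maximum(k_user, 1) ** (1 - theta))
        f1[A[user] > 0] = -np.inf           # never re-recommend collected objects
        return f1

    A = np.array([[1, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 1, 1]])
    print(np.argsort(diffusion_scores(A, user=0))[::-1])  # ranked objects for user 0
    ```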

  3. Information Filtering via Heterogeneous Diffusion in Online Bipartite Networks.

    PubMed

    Zhang, Fu-Guo; Zeng, An

    2015-01-01

    The rapid expansion of the Internet brings us overwhelming online information, more than any individual can go through. Therefore, recommender systems were created to help people dig through this abundance of information. In networks composed of users and objects, recommender algorithms based on diffusion have been proven to be among the best performing methods. Previous works considered the diffusion process from user to object and from object to user to be equivalent. We show in this work that this is not the case, and we improve the quality of the recommendation by taking into account the asymmetrical nature of this process. We apply this idea to modify state-of-the-art recommendation methods. The simulation results show that the new methods can outperform the existing methods in both recommendation accuracy and diversity. Finally, this modification is checked to be able to improve the recommendation in a realistic case.

  4. Patent information retrieval: approaching a method and analysing nanotechnology patent collaborations.

    PubMed

    Ozcan, Sercan; Islam, Nazrul

    2017-01-01

    Many challenges still remain in the processing of explicit technological knowledge documents such as patents. Given the limitations and drawbacks of the existing approaches, this research sets out to develop an improved method for searching patent databases and extracting patent information, to increase the efficiency and reliability of the nanotechnology patent information retrieval process, and to empirically analyse patent collaboration. A tech-mining method was applied and the subsequent analysis was performed using Thomson Data Analyser software. The findings show that nations such as Korea and Japan are highly collaborative in sharing technological knowledge across academic and corporate organisations within their national boundaries, and that China presents, in some cases, a great illustration of effective patent collaboration and co-inventorship. This study also analyses key patent strengths by country, organisation and technology.

  5. Simplified Processing Method for Meter Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fowler, Kimberly M.; Colotelo, Alison H. A.; Downs, Janelle L.

    2015-11-01

    A simple, quick metered-data processing method that can be used for Army Metered Data Management System (MDMS) and Logistics Innovation Agency data, but may also be useful for other large data sets. It is intended for large data sets where the analyst has little information about the buildings.

  6. Efficient Research Design: Using Value-of-Information Analysis to Estimate the Optimal Mix of Top-down and Bottom-up Costing Approaches in an Economic Evaluation alongside a Clinical Trial.

    PubMed

    Wilson, Edward C F; Mugford, Miranda; Barton, Garry; Shepstone, Lee

    2016-04-01

    In designing economic evaluations alongside clinical trials, analysts are frequently faced with alternative methods of collecting the same data, the extremes being top-down ("gross costing") and bottom-up ("micro-costing") approaches. A priori, bottom-up approaches may be considered superior to top-down approaches but are also more expensive to collect and analyze. In this article, we use value-of-information analysis to estimate the efficient mix of observations on each method in a proposed clinical trial. By assigning a prior bivariate distribution to the 2 data collection processes, the predicted posterior (i.e., preposterior) mean and variance of the superior process can be calculated from proposed samples using either process. This is then used to calculate the preposterior mean and variance of incremental net benefit and hence the expected net gain of sampling. We apply this method to a previously collected data set to estimate the value of conducting a further trial and identifying the optimal mix of observations on drug costs at 2 levels: by individual item (process A) and by drug class (process B). We find that substituting a number of observations on process A for process B leads to a modest £35,000 increase in expected net gain of sampling. Drivers of the results are the correlation between the 2 processes and their relative cost. This method has potential use following a pilot study to inform efficient data collection approaches for a subsequent full-scale trial. It provides a formal quantitative approach to inform trialists whether it is efficient to collect resource use data on all patients in a trial or on a subset of patients only or to collect limited data on most and detailed data on a subset. © The Author(s) 2016.
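
    A hedged sketch of the preposterior machinery, assuming a bivariate normal prior over the two mean costs and known sampling variances (all numbers invented): a Kalman-style update shows how observations on the cheap process B shrink uncertainty about the superior process A:

    ```python
    import numpy as np

    # Illustrative priors (not the paper's numbers): mean cost per patient under
    # micro-costing (A) and gross-costing (B), with prior correlation rho.
    mu0 = np.array([100.0, 95.0])
    sd0 = np.array([20.0, 25.0])
    rho = 0.8
    Sigma0 = np.array([[sd0[0]**2, rho*sd0[0]*sd0[1]],
                       [rho*sd0[0]*sd0[1], sd0[1]**2]])

    def posterior_var_A(n_A, n_B, sigma_A=30.0, sigma_B=15.0):
        """Preposterior variance of the micro-costing mean after observing
        n_A patients on process A and n_B on process B (Kalman-style update)."""
        Sigma = Sigma0.copy()
        for n, sigma, h in ((n_A, sigma_A, np.array([1.0, 0.0])),
                            (n_B, sigma_B, np.array([0.0, 1.0]))):
            if n > 0:
                s = h @ Sigma @ h + sigma**2 / n   # predictive var of sample mean
                gain = (Sigma @ h) / s
                Sigma = Sigma - np.outer(gain, h @ Sigma)
        return Sigma[0, 0]

    # Because A and B are correlated, cheap observations on B also shrink our
    # uncertainty about A; the efficient design trades this off against cost.
    for n_A, n_B in [(100, 0), (50, 50), (0, 100)]:
        print(n_A, n_B, round(posterior_var_A(n_A, n_B), 2))
    ```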

  7. Use of fuzzy sets in modeling of GIS objects

    NASA Astrophysics Data System (ADS)

    Mironova, Yu N.

    2018-05-01

    The paper discusses modeling and methods of data visualization in geographic information systems. Information processing in geoinformatics is based on the use of models; geoinformation modeling is therefore a key link in the chain of geodata processing. Solving problems using geographic information systems often requires working with approximate or insufficiently reliable information about map features in the GIS database. Heterogeneous data of different origin and accuracy have some degree of uncertainty. In addition, not all information is precise: already during the initial measurements, poorly defined terms and attributes (e.g., "soil, well-drained") are used. Therefore, methods are needed for working with uncertain requirements, classes and boundaries. The author proposes representing spatial information using fuzzy sets. In terms of its characteristic function, a fuzzy set is a natural generalization of an ordinary set, obtained by rejecting the binary nature of this function and allowing it to take any value in the interval [0, 1].
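
    A minimal sketch of the generalization described here: an ordinary set's characteristic function returns only 0 or 1, while a fuzzy membership function may return any value in [0, 1]. The "well-drained soil" grading below is an invented illustration:

    ```python
    import numpy as np

    def crisp_well_drained(perm):
        """Ordinary set: soil either is or is not 'well-drained'."""
        return (perm >= 50.0).astype(float)

    def fuzzy_well_drained(perm, lo=20.0, hi=80.0):
        """Fuzzy set: membership ramps linearly from 0 to 1 between lo and hi
        (thresholds are invented for illustration)."""
        return np.clip((perm - lo) / (hi - lo), 0.0, 1.0)

    permeability = np.array([10.0, 35.0, 50.0, 65.0, 90.0])  # e.g. mm/h
    print(crisp_well_drained(permeability))   # [0. 0. 1. 1. 1.]
    print(fuzzy_well_drained(permeability))   # [0.   0.25 0.5  0.75 1.  ]
    ```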

  8. Fingerprint pattern restoration by digital image processing techniques.

    PubMed

    Wen, Che-Yen; Yu, Chiu-Chung

    2003-09-01

    Fingerprint evidence plays an important role in solving criminal cases. However, defective (lacking information needed for completeness) or contaminated (containing undesirable information) fingerprint patterns make the identifying and recognizing processes difficult. Unfortunately, this is the usual case. In the recognizing process (enhancement of patterns, or elimination of "false alarms", so that a fingerprint pattern can be searched in the Automated Fingerprint Identification System (AFIS)), chemical and physical techniques have been proposed to improve pattern legibility. In the identifying process, a fingerprint examiner can enhance contaminated (but not defective) fingerprint patterns under guidelines provided by the Scientific Working Group on Friction Ridge Analysis, Study and Technology (SWGFAST), the Scientific Working Group on Imaging Technology (SWGIT), and an AFIS working group within the National Institute of Justice. Recently, image processing techniques have been successfully applied in forensic science. For example, we have applied image enhancement methods to improve the legibility of digital images such as fingerprints and vehicle plate numbers. In this paper, we propose a novel digital image restoration technique based on the AM (amplitude modulation)-FM (frequency modulation) reaction-diffusion method to restore defective or contaminated fingerprint patterns. This method shows potential for application to fingerprint pattern enhancement in the recognizing process (but not the identifying process). Synthetic and real images are used to show the capability of the proposed method. The results of enhancing fingerprint patterns by the manual process and by our method are evaluated and compared.

  9. [Modeling the academic performance of medical students in basic sciences and pre-clinical courses: a longitudinal study].

    PubMed

    Zúñiga, Denisse; Mena, Beltrán; Oliva, Rose; Pedrals, Nuria; Padilla, Oslando; Bitran, Marcela

    2009-10-01

    The study of predictors of academic performance is relevant for medical education. Most studies of academic performance use global ratings as the outcome measure and do not evaluate the influence of the assessment methods. To model, by multivariate analysis, the academic performance of medical students, considering, besides academic and demographic variables, the methods used to assess students' learning and their preferred modes of information processing. Two hundred seventy-two students admitted to the medical school of the Pontificia Universidad Católica de Chile from 2000 to 2003. Six groups of variables were studied to model the students' performance in five basic science courses (Anatomy, Biology, Calculus, Chemistry and Physics) and two pre-clinical courses (Integrated Medical Clinic I and II). The assessment methods examined were multiple choice question tests, the Objective Structured Clinical Examination, and tutor appraisal. The results of the university admission tests (high school grades, mathematics and biology tests), the assessment methods used, the curricular year, and previous application to medical school were predictors of academic performance. The information processing modes influenced academic performance, but only in interaction with other variables: perception (abstract or concrete) interacted with the assessment methods, and information use (active or reflexive) with sex. The correlation between the real and predicted grades was 0.7. In addition to the academic results obtained prior to university entrance, the methods of assessment used in the university and the information processing modes influence the academic performance of medical students in basic science and preclinical courses.

  10. Textural Analysis and Substrate Classification in the Nearshore Region of Lake Superior Using High-Resolution Multibeam Bathymetry

    NASA Astrophysics Data System (ADS)

    Dennison, Andrew G.

    Classification of the seafloor substrate can be done with a variety of methods, including visual (dives, drop cameras), mechanical (cores, grab samples), and acoustic (statistical analysis of echosounder returns) approaches. Acoustic methods offer a more powerful and efficient means of collecting useful information about the bottom type. Due to the nature of an acoustic survey, larger areas can be sampled, and combining the collected data with visual and mechanical survey methods provides greater confidence in the classification of a mapped region. During a multibeam sonar survey, both bathymetric and backscatter data are collected. It is well documented that the statistical characteristics of a sonar backscatter mosaic depend on bottom type. While classifying the bottom type on the basis of backscatter alone can accurately predict and map bottom type, i.e. distinguish a muddy area from a rocky area, it lacks the ability to resolve and capture fine textural details, an important factor in many habitat mapping studies. Statistical processing of high-resolution multibeam data can capture the pertinent details about the bottom type that are rich in textural information. Further multivariate statistical processing can then isolate characteristic features and provide the basis for an accurate classification scheme. The development of a new classification method is described here, based upon the analysis of textural features in conjunction with ground truth sampling. The processing and classification results of two geologically distinct areas in the nearshore regions of Lake Superior, off the Lester River, MN and the Amnicon River, WI, are presented here, using the Minnesota Supercomputer Institute's Mesabi computing cluster for initial processing. Processed data are then calibrated using ground truth samples to conduct an accuracy assessment of the surveyed areas. From the analysis of high-resolution bathymetry data collected at both survey sites, it was possible to successfully calculate a series of measures that describe textural information about the lake floor. Further processing suggests that the calculated features also capture a significant amount of statistical information about the lake floor terrain. Two sources of error, an anomalous heave and a refraction error, significantly deteriorated the quality of the processed data and the resulting validation results. Ground truth samples used to validate the classification methods at both survey sites yielded accuracy values ranging from 5 to 30 percent at the Amnicon River and between 60 and 70 percent at the Lester River. The final results suggest that this new processing methodology adequately captures textural information about the lake floor and provides an acceptable classification in the absence of significant data quality issues.
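
    As a rough sketch of the "textural features" step (the study's actual feature set is not given in the abstract), a grey-level co-occurrence matrix contrast statistic distinguishes smooth from rough patches:

    ```python
    import numpy as np

    def glcm_contrast(patch, levels=8):
        """Contrast of the grey-level co-occurrence matrix for a horizontal
        pixel offset; one of many possible texture measures."""
        q = np.floor((patch - patch.min()) / (np.ptp(patch) + 1e-9)
                     * (levels - 1)).astype(int)        # quantise grey levels
        left, right = q[:, :-1].ravel(), q[:, 1:].ravel()
        m = np.zeros((levels, levels))
        np.add.at(m, (left, right), 1)                  # co-occurrence counts
        p = m / m.sum()
        i, j = np.indices(p.shape)
        return ((i - j) ** 2 * p).sum()

    rng = np.random.default_rng(3)
    smooth = np.cumsum(rng.normal(size=(32, 32)), axis=1)   # mud-like, smooth
    rough = rng.normal(size=(32, 32))                       # cobble-like, rough
    print(glcm_contrast(smooth), glcm_contrast(rough))      # rough >> smooth
    ```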

  11. Method Engineering: A Service-Oriented Approach

    NASA Astrophysics Data System (ADS)

    Cauvet, Corine

    In the past, a large variety of methods have been published ranging from very generic frameworks to methods for specific information systems. Method Engineering has emerged as a research discipline for designing, constructing and adapting methods for Information Systems development. Several approaches have been proposed as paradigms in method engineering. The meta modeling approach provides means for building methods by instantiation, the component-based approach aims at supporting the development of methods by using modularization constructs such as method fragments, method chunks and method components. This chapter presents an approach (SO2M) for method engineering based on the service paradigm. We consider services as autonomous computational entities that are self-describing, self-configuring and self-adapting. They can be described, published, discovered and dynamically composed for processing a consumer's demand (a developer's requirement). The method service concept is proposed to capture a development process fragment for achieving a goal. Goal orientation in service specification and the principle of service dynamic composition support method construction and method adaptation to different development contexts.

  12. Frequency Domain Analysis of Sensor Data for Event Classification in Real-Time Robot Assisted Deburring

    PubMed Central

    Pappachan, Bobby K; Caesarendra, Wahyu; Tjahjowidodo, Tegoeh; Wijaya, Tomi

    2017-01-01

    Process monitoring using indirect methods relies on the use of sensors. Using sensors to acquire vital process-related information also presents the problem of big data management and analysis. Due to uncertainty in the frequency of events occurring, a higher sampling rate is often used in real-time monitoring applications to increase the chances of capturing and understanding all possible events related to the process. Advanced signal processing methods are used to further decipher meaningful information from the acquired data. In this research work, the power spectral density (PSD) of sensor data acquired at sampling rates between 40 and 51.2 kHz was calculated, and the correlation between PSD and the completed number of cycles/passes is presented. Here, the progress in the number of cycles/passes is the event this research work intends to classify, and the algorithm used to compute the PSD is Welch's estimate method. A comparison between Welch's estimate method and statistical methods is also discussed. A clear correlation was observed using Welch's estimate to classify the number of cycles/passes. The paper also succeeds in distinguishing the vibration signal generated by the spindle from the vibration signal acquired during the finishing process. PMID:28556809
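
    A minimal sketch of the Welch estimate on a synthetic vibration signal, using SciPy; the 6 kHz spindle line and the noise level are invented, while the sampling rate matches the upper end quoted in the record:

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 51_200                              # Hz, upper sampling rate from the paper
    t = np.arange(0, 1.0, 1 / fs)
    rng = np.random.default_rng(4)
    # synthetic "vibration": a 6 kHz spindle line buried in broadband noise
    signal = 0.5 * np.sin(2 * np.pi * 6_000 * t) + rng.normal(size=t.size)

    # Welch's method: average periodograms of overlapping windowed segments,
    # trading frequency resolution for a lower-variance PSD estimate.
    freqs, psd = welch(signal, fs=fs, nperseg=4096)
    print(f"peak at {freqs[np.argmax(psd)]:.0f} Hz")   # ~6000 Hz
    ```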

  13. Uncertainty in Reference and Information Service

    ERIC Educational Resources Information Center

    VanScoy, Amy

    2015-01-01

    Introduction: Uncertainty is understood as an important component of the information seeking process, but it has not been explored as a component of reference and information service. Method: Interpretative phenomenological analysis was used to examine the practitioner perspective of reference and information service for eight academic research…

  14. Implementation of an anonymisation tool for clinical trials using a clinical trial processor integrated with an existing trial patient data information system.

    PubMed

    Aryanto, Kadek Y E; Broekema, André; Oudkerk, Matthijs; van Ooijen, Peter M A

    2012-01-01

    To present an adapted Clinical Trial Processor (CTP) test set-up for receiving, anonymising and saving Digital Imaging and Communications in Medicine (DICOM) data using external input from the original database of an existing clinical study information system to guide the anonymisation process. Two methods are presented for an adapted CTP test set-up. In the first method, images are pushed from the Picture Archiving and Communication System (PACS) using the DICOM protocol through a local network. In the second method, images are transferred through the internet using the HTTPS protocol. In total 25,000 images from 50 patients were moved from the PACS, anonymised and stored within roughly 2 h using the first method. In the second method, an average of 10 images per minute were transferred and processed over a residential connection. In both methods, no duplicated images were stored when previous images were retransferred. The anonymised images are stored in appropriate directories. The CTP can transfer and process DICOM images correctly in a very easy set-up providing a fast, secure and stable environment. The adapted CTP allows easy integration into an environment in which patient data are already included in an existing information system.

  15. Hybrid ontology for semantic information retrieval model using keyword matching indexing system.

    PubMed

    Uthayan, K R; Mala, G S Anandha

    2015-01-01

    Ontology is the process of growth and elucidation of concepts of an information domain common to a group of users. Incorporating ontology into information retrieval is a common method of improving the search for the relevant information users require. Matching keywords against a historical or information domain is significant in recent approaches for finding the best match for specific input queries. This research presents an improved querying mechanism for information retrieval which integrates ontology queries with keyword search. The ontology-based query is converted into a first-order predicate logic query, which is used for routing the query to the appropriate servers. Matching algorithms represent an active area of research in computer science and artificial intelligence. In text matching, it is more reliable to study the semantics of the model and query for the conditions of semantic matching. This research develops semantic matching between input queries and information in the ontology field. The contributed algorithm is a hybrid method based on matching extracted instances from the queries and the information field. The queries and information domain are focused on semantic matching, to discover the best match and to improve the retrieval process. In conclusion, the hybrid ontology in the semantic web is sufficient to retrieve the documents when compared to standard ontology.

  16. Hybrid Ontology for Semantic Information Retrieval Model Using Keyword Matching Indexing System

    PubMed Central

    Uthayan, K. R.; Anandha Mala, G. S.

    2015-01-01

    Ontology is the process of growth and elucidation of concepts of an information domain common to a group of users. Incorporating ontology into information retrieval is a common method of improving the search for the relevant information users require. Matching keywords against a historical or information domain is significant in recent approaches for finding the best match for specific input queries. This research presents an improved querying mechanism for information retrieval which integrates ontology queries with keyword search. The ontology-based query is converted into a first-order predicate logic query, which is used for routing the query to the appropriate servers. Matching algorithms represent an active area of research in computer science and artificial intelligence. In text matching, it is more reliable to study the semantics of the model and query for the conditions of semantic matching. This research develops semantic matching between input queries and information in the ontology field. The contributed algorithm is a hybrid method based on matching extracted instances from the queries and the information field. The queries and information domain are focused on semantic matching, to discover the best match and to improve the retrieval process. In conclusion, the hybrid ontology in the semantic web is sufficient to retrieve the documents when compared to standard ontology. PMID:25922851

  17. Ontology-Driven Information Integration

    NASA Technical Reports Server (NTRS)

    Tissot, Florence; Menzel, Chris

    2005-01-01

    Ontology-driven information integration (ODII) is a method of computerized, automated sharing of information among specialists who have expertise in different domains and who are members of subdivisions of a large, complex enterprise (e.g., an engineering project, a government agency, or a business). In ODII, one uses rigorous mathematical techniques to develop computational models of engineering and/or business information and processes. These models are then used to develop software tools that support the reliable processing and exchange of information among the subdivisions of this enterprise or between this enterprise and other enterprises.

  18. A review of the information-gathering process for the provision of medicines for self-medication via community pharmacies in developing countries.

    PubMed

    Brata, Cecilia; Gudka, Sajni; Schneider, Carl R; Everett, Alan; Fisher, Colleen; Clifford, Rhonda M

    2013-01-01

    Currently, no review has been completed regarding the information-gathering process for the provision of medicines for self-medication in community pharmacies in developing countries. To review the rate of information gathering and the types of information gathered when patients present for self-medication requests. Six databases were searched for studies that described the rate of information gathering and/or the types of information gathered in the provision of medicines for self-medication in community pharmacies in developing countries. The types of information reported were classified as: signs and symptoms, patient identity, action taken, medications, medical history, and others. Twenty-two studies met the inclusion criteria. Variations in the study populations, types of scenarios, research methods, and data reporting were observed. The reported rate of information gathering varied from 18% to 97%, depending on the research methods used. Information on signs and symptoms and patient identity was more frequently reported to be gathered compared with information on action taken, medications, and medical history. Evidence showed that the information-gathering process for the provision of medicines for self-medication via community pharmacies in developing countries is inconsistent. There is a need to determine the barriers to appropriate information-gathering practice as well as to develop strategies to implement effective information-gathering processes. It is also recommended that international and national pharmacy organizations, including pharmacy academics and pharmacy researchers, develop a consensus on the types of information that should be reported in the original studies. This will facilitate comparison across studies so that areas that need improvement can be identified. Copyright © 2013 Elsevier Inc. All rights reserved.

  19. Retrieving clear-air turbulence information from regular commercial aircraft using Mode-S and ADS-B broadcast

    NASA Astrophysics Data System (ADS)

    Kopeć, J. M.; Kwiatkowski, K.; de Haan, S.; Malinowski, S. P.

    2015-11-01

    Navigational information broadcast by commercial aircraft in the form of Mode-S and ADS-B messages can be considered a new and valid source of upper-air turbulence measurements. A set of three processing methods is proposed and analysed using a quality record of turbulence encounters made by a research aircraft. The proposed methods are based on processing either the vertical acceleration or the background wind into the eddy dissipation rate. All the necessary parameters are conveyed in the Mode-S/ADS-B messages. Comparison of the results of the processing against a reference eddy dissipation rate obtained using an on-board accelerometer indicates a significant potential of these methods. The advantages and limitations of the presented approaches are discussed.

  20. Laser apparatus and method for microscopic and spectroscopic analysis and processing of biological cells

    DOEpatents

    Gourley, Paul L.; Gourley, Mark F.

    1997-01-01

    An apparatus and method for microscopic and spectroscopic analysis and processing of biological cells. The apparatus comprises a laser having an analysis region within the laser cavity for containing one or more biological cells to be analyzed. The presence of a cell within the analysis region in superposition with an activated portion of a gain medium of the laser acts to encode information about the cell upon the laser beam, the cell information being recoverable by an analysis means that preferably includes an array photodetector such as a CCD camera and a spectrometer. The apparatus and method may be used to analyze biomedical cells including blood cells and the like, and may include processing means for manipulating, sorting, or eradicating cells after analysis thereof.

  1. Laser apparatus and method for microscopic and spectroscopic analysis and processing of biological cells

    DOEpatents

    Gourley, P.L.; Gourley, M.F.

    1997-03-04

    An apparatus and method are disclosed for microscopic and spectroscopic analysis and processing of biological cells. The apparatus comprises a laser having an analysis region within the laser cavity for containing one or more biological cells to be analyzed. The presence of a cell within the analysis region in superposition with an activated portion of a gain medium of the laser acts to encode information about the cell upon the laser beam, the cell information being recoverable by an analysis means that preferably includes an array photodetector such as a CCD camera and a spectrometer. The apparatus and method may be used to analyze biomedical cells including blood cells and the like, and may include processing means for manipulating, sorting, or eradicating cells after analysis. 20 figs.

  2. Kinetic Study of Adsorption Processes in Solution: An Undergraduate Physical Chemistry Experiment.

    ERIC Educational Resources Information Center

    Casado, Julio; And Others

    1985-01-01

    Background information, apparatus needed, procedures used, and results obtained are provided for a simple kinetic method for the monitoring of adsorption processes. The method, which involved adsorption of crystal violet onto activated carbon, is suitable for classroom and/or research purposes. (JN)

  3. Durable High-Density Data Storage

    NASA Technical Reports Server (NTRS)

    Lamartine, Bruce C.; Stutz, Roger A.

    1996-01-01

    The focused ion beam (FIB) micromilling process for data storage provides a new non-magnetic storage method for archiving large amounts of data. The process stores data on robust materials such as steel, silicon, and gold-coated silicon. The storage process was developed to ensure the long-term storage life of data. We estimate the useful life of data written on silicon or gold-coated silicon to be on the order of a few thousand years, without the need to rewrite the data every few years. The process uses an ion beam to carve material from the surface, much like stone cutters in ancient civilizations removed material from stone. The deeper the information is carved into the media, the longer the expected life of the information. The process can record information in three formats: (1) binary at densities of 23 Gbits/square inch, (2) alphanumeric at optical or non-optical density, and (3) graphical at optical and non-optical density. The formats can be mixed on the same media; thus, it is possible to record, in a human-viewable format, instructions that can be read using an optical microscope. These instructions provide guidance on reading the remaining higher-density information.

  4. An Instructional Merger: HyperCard and the Integrative Teaching Model.

    ERIC Educational Resources Information Center

    Massie, Carolyn M.; Volk, Larry G.

    Teaching methods have been developed and tested that encourage students to process information and refine their thinking skills. The information processing model is known as the Integrative Teaching Model. By combining the computer technology in the HyperCard application for data display and retrieval, instructional delivery of this teaching model…

  5. Interactivity, Information Processing, and Learning on the World Wide Web.

    ERIC Educational Resources Information Center

    Tremayne, Mark; Dunwoody, Sharon

    2001-01-01

    Examines the role of interactivity in the presentation of science news on the World Wide Web. Proposes and tests a model of interactive information processing that suggests that characteristics of users and Web sites influence interactivity, which influences knowledge acquisition. Describes use of a think-aloud method to study participants' mental…

  6. The study on serum and urine of renal interstitial fibrosis rats induced by unilateral ureteral obstruction based on metabonomics and network analysis methods.

    PubMed

    Xiang, Zheng; Sun, Hao; Cai, Xiaojun; Chen, Dahui

    2016-04-01

    Transmission of biological information is a biochemical process involving a multistep cascade from genes/proteins to metabolites. However, because most metabolites reflect the terminal information of the biochemical process, it is difficult to describe the transmission process of disease information in terms of the metabolomics strategy alone. In this paper, by incorporating network and metabolomics methods, an integrated approach was proposed to systematically investigate and explain the molecular mechanism of renal interstitial fibrosis. Through analysis of the network, the cascade transmission process of disease information, starting from genes/proteins and ending at metabolites, was putatively identified and uncovered. The results indicated that renal fibrosis involves the metabolic pathways of glycerophospholipid metabolism, biosynthesis of unsaturated fatty acids and arachidonic acid metabolism, riboflavin metabolism, tyrosine metabolism, and sphingolipid metabolism. These pathways involve kidney disease genes such as TGF-β1 and P2RX7. Our results showed that combining metabolomics and network analysis can provide new strategies and ideas for interpreting the pathogenesis of disease with full consideration of the "gene-protein-metabolite" cascade.

  7. The integration processing of the visual and auditory information in videos of real-world events: an ERP study.

    PubMed

    Liu, Baolin; Wang, Zhongning; Jin, Zhixing

    2009-09-11

    In real life, the human brain usually receives information through visual and auditory channels and processes the multisensory information, but studies on the integrated processing of dynamic visual and auditory information are relatively few. In this paper, we designed an experiment in which common-scenario, real-world videos with matched and mismatched actions (images) and sounds were presented as stimuli, aiming to study the integrated processing of synchronized visual and auditory information in videos of real-world events in the human brain through the use of event-related potential (ERP) methods. Experimental results showed that videos of mismatched actions (images) and sounds elicited a larger P400 compared to videos of matched actions (images) and sounds. We believe that the P400 waveform might be related to the cognitive integration processing of mismatched multisensory information in the human brain. The results also indicated that synchronized multisensory information streams interfere with each other, which influences the outcome of the cognitive integration processing.

  8. Real-time nondestructive monitoring of the gas tungsten arc welding (GTAW) process by combined airborne acoustic emission and non-contact ultrasonics

    NASA Astrophysics Data System (ADS)

    Zhang, Lu; Basantes-Defaz, Alexandra-Del-Carmen; Abbasi, Zeynab; Yuhas, Donald; Ozevin, Didem; Indacochea, Ernesto

    2018-03-01

    Welding is a key manufacturing process for many industries and may introduce defects into the welded parts, causing significant negative impacts and potentially ruining high-cost pieces. Therefore, a real-time process monitoring method is important to implement to avoid producing a low-quality weld. Due to the high surface temperature and possible contamination of the surface by contact transducers, the welding process should be monitored via non-contact transducers. In this paper, airborne acoustic emission (AE) transducers tuned at 60 kHz and non-contact ultrasonic testing (UT) transducers tuned at 500 kHz are implemented for real-time weld monitoring. AE is a passive nondestructive evaluation method that listens to the process noise and provides information about the uniformity of the manufacturing process. UT provides more quantitative information about weld defects. One of the most common weld defects, burn-through, is investigated. The influences of weld defects on AE signatures (time-driven data) and UT signals (received signal energy, change in peak frequency) are presented. The level of burn-through damage is defined by using a single method or combined AE/UT methods.

  9. A fuzzy-theory-based method for studying the effect of information transmission on nonlinear crowd dispersion dynamics

    NASA Astrophysics Data System (ADS)

    Fu, Libi; Song, Weiguo; Lo, Siuming

    2017-01-01

    Emergencies involved in mass events are related to a variety of factors and processes. An important factor is the transmission of information on danger, which influences nonlinear crowd dynamics during the process of crowd dispersion. Due to the considerable uncertainty in this process, there is an urgent need for a method to investigate this influence. In this paper, a novel fuzzy-theory-based method is presented to study crowd dynamics under the influence of information transmission. Fuzzy functions and rules are designed for the ambiguous description of human states. Reasonable inference is employed to decide the output values of decision making, such as pedestrian movement speeds and directions. Through simulation of four-way pedestrian situations, good crowd dispersion phenomena are achieved. Simulation results under different conditions demonstrate that information transmission cannot always induce successful crowd dispersion in all situations. This depends on whether decision strategies in response to information on danger are unified and effective, especially in dense crowds. The results also suggest that crowd dispersion is helped by an increase in drift strength at low density, and at high density by a larger percentage of pedestrians who choose one of the unoccupied Von Neumann neighbors furthest from the dangerous source as the drift direction. Compared with previous work, our comprehensive study improves the in-depth understanding of nonlinear crowd dynamics under the effect of information on danger.

  10. ON DEVELOPING TOOLS AND METHODS FOR ENVIRONMENTALLY BENIGN PROCESSES

    EPA Science Inventory

    Two types of tools are generally needed for designing processes and products that are cleaner from environmental impact perspectives. The first kind is called process tools. Process tools are based on information obtained from experimental investigations in chemistry, materials...

  11. Information Integration for Concurrent Engineering (IICE) IDEF3 Process Description Capture Method Report

    DTIC Science & Technology

    1992-05-01

    methodology, knowledge acquisition, requirements definition, information systems, information engineering, systems engineering... and knowledge resources. Like manpower, materials, and machines, information and knowledge assets are recognized as vital resources that can be... evolve towards an information-integrated enterprise. These technologies are designed to leverage information and knowledge resources as the key

  12. Modelling health care processes for eliciting user requirements: a way to link a quality paradigm and clinical information system design.

    PubMed

    Staccini, P; Joubert, M; Quaranta, J F; Fieschi, D; Fieschi, M

    2001-12-01

    Healthcare institutions are looking at ways to increase their efficiency by reducing costs while providing care services with a high level of safety. Thus, hospital information systems have to support quality improvement objectives. The elicitation of the requirements has to meet users' needs in relation to both the quality (efficacy, safety) and the monitoring of all health care activities (traceability). Information analysts need methods to conceptualise clinical information systems that provide actors with individual benefits and guide behavioural changes. A methodology is proposed to elicit and structure users' requirements using a process-oriented analysis, and it is applied to the blood transfusion process. An object-oriented data model of a process has been defined in order to organise the data dictionary. Although some aspects of activity, such as 'where', 'what else', and 'why' are poorly represented by the data model alone, this method of requirement elicitation fits the dynamic of data input for the process to be traced. A hierarchical representation of hospital activities has to be found for the processes to be interrelated, and for their characteristics to be shared, in order to avoid data redundancy and to fit the gathering of data with the provision of care.

  13. Potential capabilities for compression of information of certain data processing systems

    NASA Technical Reports Server (NTRS)

    Khodarev, Y. K.; Yevdokimov, V. P.; Pokras, V. M.

    1974-01-01

    This article undertakes to study a generalized block diagram of a data collection and processing system of a spacecraft in which a number of sensors or outputs of scientific instruments are cyclically interrogated by a commutator, methods of writing the supplementary information in a frame on the example of a certain hypothetical telemetry system, and the influence of statistics of number of active channels in a frame on frame compression factor. The separation of the data compression factor of the collection and processing system of spacecraft into two parts used in this work allows determination of the compression factor of an active frame depending not only on the statistics of activity of channels in the telemetry frame, but also on the method of introduction of the additional address and time information to each frame.

  14. Estimating costs and performance of systems for machine processing of remotely sensed data

    NASA Technical Reports Server (NTRS)

    Ballard, R. J.; Eastwood, L. F., Jr.

    1977-01-01

    This paper outlines a method for estimating computer processing times and costs incurred in producing information products from digital remotely sensed data. The method accounts for both computation and overhead, and may be applied to any serial computer. The method is applied to estimate the cost and computer time involved in producing Level II Land Use and Vegetative Cover Maps for a five-state midwestern region. The results show that the amount of data to be processed overloads some example computer systems, but that the processing is feasible on others.

  15. Task–Technology Fit of Video Telehealth for Nurses in an Outpatient Clinic Setting

    PubMed Central

    Finkelstein, Stanley M.

    2014-01-01

    Abstract Background: Incorporating telehealth into outpatient care delivery supports management of consumer health between clinic visits. Task–technology fit is a framework for understanding how technology helps and/or hinders a person during work processes. Evaluating the task–technology fit of video telehealth for personnel working in a pediatric outpatient clinic and providing care between clinic visits ensures the information provided matches the information needed to support work processes. Materials and Methods: The workflow of advanced practice registered nurse (APRN) care coordination provided via telephone and video telehealth was described and measured using a mixed-methods workflow analysis protocol that incorporated cognitive ethnography and time–motion study. Qualitative and quantitative results were merged and analyzed within the task–technology fit framework to determine the workflow fit of video telehealth for APRN care coordination. Results: Incorporating video telehealth into APRN care coordination workflow provided visual information unavailable during telephone interactions. Despite additional tasks and interactions needed to obtain the visual information, APRN workflow efficiency, as measured by time, was not significantly changed. Analyzed within the task–technology fit framework, the increased visual information afforded by video telehealth supported the assessment and diagnostic information needs of the APRN. Conclusions: Telehealth must provide the right information to the right clinician at the right time. Evaluating task–technology fit using a mixed-methods protocol ensured rigorous analysis of fit within work processes and identified workflows that benefit most from the technology. PMID:24841219

  16. A Qualitative Case Study Approach To Examine Information Resources Management. (Utilisation d'une Approche Qualitative par Methode de cas pour Etudier la Gestion des Ressources D'information).

    ERIC Educational Resources Information Center

    Bergeron, Pierrette

    1997-01-01

    Illustrates how a qualitative approach was used to study the complex and poorly defined concept of information resources management. Explains the general approach to data collection, its advantages and limitations, and the process used to analyze the data. Presents results, along with lessons learned through using method. (Author/AEF)

  17. Aligned and Unaligned Coherence: A New Diagnostic Tool

    NASA Technical Reports Server (NTRS)

    Miles, Jeffrey Hilton

    2006-01-01

    The study of combustion noise from turbofan engines has become important again as the noise from other sources like the fan and jet are reduced. A method has been developed to help identify combustion noise spectra using an aligned and unaligned coherence technique. When used with the well known three signal coherent power method and coherent power method it provides new information by separating tonal information from random process information. Examples are presented showing the underlying tonal structure which is buried under broadband noise and jet noise. The method is applied to data from a Pratt and Whitney PW4098 turbofan engine.

  18. Mutual information based feature selection for medical image retrieval

    NASA Astrophysics Data System (ADS)

    Zhi, Lijia; Zhang, Shaomin; Li, Yan

    2018-04-01

    In this paper, authors propose a mutual information based method for lung CT image retrieval. This method is designed to adapt to different datasets and different retrieval task. For practical applying consideration, this method avoids using a large amount of training data. Instead, with a well-designed training process and robust fundamental features and measurements, the method in this paper can get promising performance and maintain economic training computation. Experimental results show that the method has potential practical values for clinical routine application.

  19. Dimension Reduction of Multivariable Optical Emission Spectrometer Datasets for Industrial Plasma Processes

    PubMed Central

    Yang, Jie; McArdle, Conor; Daniels, Stephen

    2014-01-01

    A new data dimension-reduction method, called Internal Information Redundancy Reduction (IIRR), is proposed for application to Optical Emission Spectroscopy (OES) datasets obtained from industrial plasma processes. For example in a semiconductor manufacturing environment, real-time spectral emission data is potentially very useful for inferring information about critical process parameters such as wafer etch rates, however, the relationship between the spectral sensor data gathered over the duration of an etching process step and the target process output parameters is complex. OES sensor data has high dimensionality (fine wavelength resolution is required in spectral emission measurements in order to capture data on all chemical species involved in plasma reactions) and full spectrum samples are taken at frequent time points, so that dynamic process changes can be captured. To maximise the utility of the gathered dataset, it is essential that information redundancy is minimised, but with the important requirement that the resulting reduced dataset remains in a form that is amenable to direct interpretation of the physical process. To meet this requirement and to achieve a high reduction in dimension with little information loss, the IIRR method proposed in this paper operates directly in the original variable space, identifying peak wavelength emissions and the correlative relationships between them. A new statistic, Mean Determination Ratio (MDR), is proposed to quantify the information loss after dimension reduction and the effectiveness of IIRR is demonstrated using an actual semiconductor manufacturing dataset. As an example of the application of IIRR in process monitoring/control, we also show how etch rates can be accurately predicted from IIRR dimension-reduced spectral data. PMID:24451453

  20. Investigating habitat value to inform contaminant remediation options: approach

    Treesearch

    Rebecca A. Efroymson; Mark J. Peterson; Christopher J. Welsh; Daniel L. Druckenbrod; Michael G. Ryon; John G. Smith; William W. Hargrove; Neil R. Giffen; W. Kelly Roy; Harry D. Quarles

    2008-01-01

    Habitat valuation methods are most often developed and used to prioritize candidate lands for conservation. In this study the intent of habitat valuation was to inform the decision-making process for remediation of chemical contaminants on specific lands or surface water bodies. Methods were developed to summarize dimensions of habitat value for six representative...

  1. Electronic laboratory data quality and the value of a health information exchange to support public health reporting processes.

    PubMed

    Dixon, Brian E; McGowan, Julie J; Grannis, Shaun J

    2011-01-01

    There is increasing interest in leveraging electronic health data across disparate sources for a variety of uses. A fallacy often held by data consumers is that clinical data quality is homogeneous across sources. We examined one attribute of data quality, completeness, in the context of electronic laboratory reporting of notifiable disease information. We evaluated 7.5 million laboratory reports from clinical information systems for their completeness with respect to data needed for public health reporting processes. We also examined the impact of health information exchange (HIE) enhancement methods that attempt to improve completeness. The laboratory data were heterogeneous in their completeness. Fields identifying the patient and test results were usually complete. Fields containing patient demographics, patient contact information, and provider contact information were suboptimal. Data processed by the HIE were often more complete, suggesting that HIEs can support improvements to existing public health reporting processes.

  2. Medicare Part D Beneficiaries' Plan Switching Decisions and Information Processing.

    PubMed

    Han, Jayoung; Urmie, Julie

    2017-03-01

    Medicare Part D beneficiaries tend not to switch plans despite the government's efforts to engage beneficiaries in the plan switching process. Understanding current and alternative plan features is a necessary step to make informed plan switching decisions. This study explored beneficiaries' plan switching using a mixed-methods approach, with a focus on the concept of information processing. We found large variation in beneficiary comprehension of plan information among both switchers and nonswitchers. Knowledge about alternative plans was especially poor, with only about half of switchers and 2 in 10 nonswitchers being well informed about plans other than their current plan. We also found that helpers had a prominent role in plan decision making-nearly twice as many switchers as nonswitchers worked with helpers for their plan selection. Our study suggests that easier access to helpers as well as helpers' extensive involvement in the decision-making process promote informed plan switching decisions.

  3. Evidence for online processing during causal learning.

    PubMed

    Liu, Pei-Pei; Luhmann, Christian C

    2015-03-01

    Many models of learning describe both the end product of learning (e.g., causal judgments) and the cognitive mechanisms that unfold on a trial-by-trial basis. However, the methods employed in the literature typically provide only indirect evidence about the unfolding cognitive processes. Here, we utilized a simultaneous secondary task to measure cognitive processing during a straightforward causal-learning task. The results from three experiments demonstrated that covariation information is not subject to uniform cognitive processing. Instead, we observed systematic variation in the processing dedicated to individual pieces of covariation information. In particular, observations that are inconsistent with previously presented covariation information appear to elicit greater cognitive processing than do observations that are consistent with previously presented covariation information. In addition, the degree of cognitive processing appears to be driven by learning per se, rather than by nonlearning processes such as memory and attention. Overall, these findings suggest that monitoring learning processes at a finer level may provide useful psychological insights into the nature of learning.

  4. Evolutionary Computing Methods for Spectral Retrieval

    NASA Technical Reports Server (NTRS)

    Terrile, Richard; Fink, Wolfgang; Huntsberger, Terrance; Lee, Seugwon; Tisdale, Edwin; VonAllmen, Paul; Tinetti, Geivanna

    2009-01-01

    A methodology for processing spectral images to retrieve information on underlying physical, chemical, and/or biological phenomena is based on evolutionary and related computational methods implemented in software. In a typical case, the solution (the information that one seeks to retrieve) consists of parameters of a mathematical model that represents one or more of the phenomena of interest. The methodology was developed for the initial purpose of retrieving the desired information from spectral image data acquired by remote-sensing instruments aimed at planets (including the Earth). Examples of information desired in such applications include trace gas concentrations, temperature profiles, surface types, day/night fractions, cloud/aerosol fractions, seasons, and viewing angles. The methodology is also potentially useful for retrieving information on chemical and/or biological hazards in terrestrial settings. In this methodology, one utilizes an iterative process that minimizes a fitness function indicative of the degree of dissimilarity between observed and synthetic spectral and angular data. The evolutionary computing methods that lie at the heart of this process yield a population of solutions (sets of the desired parameters) within an accuracy represented by a fitness-function value specified by the user. The evolutionary computing methods (ECM) used in this methodology are Genetic Algorithms and Simulated Annealing, both of which are well-established optimization techniques and have also been described in previous NASA Tech Briefs articles. These are embedded in a conceptual framework, represented in the architecture of the implementing software, that enables automatic retrieval of spectral and angular data and analysis of the retrieved solutions for uniqueness.

  5. An Analysis of the Order Cycle at Coast Guard Supply Center Curtis Bay, Maryland. How to Measure Customer Service.

    DTIC Science & Technology

    1994-12-01

    Order Cycle ..... 20 2. Order Processing and the Information System .......... ................. 21 3. The Order Cycle at SCCB ...... ......... 21 v...order transmittal time, order processing time, order assembly time, stock availability, production time, and delivery time. CUSTOMER L7 2 I i u *r1...methods, inventory stocking policies, order processing procedures, transport modes, and scheduling methods [Ref. 15]. 20 2. Order Processing and the

  6. Organizing Space Shuttle parametric data for maintainability

    NASA Technical Reports Server (NTRS)

    Angier, R. C.

    1983-01-01

    A model of organization and management of Space Shuttle data is proposed. Shuttle avionics software is parametrically altered by a reconfiguration process for each flight. As the flight rate approaches an operational level, current methods of data management would become increasingly complex. An alternative method is introduced, using modularized standard data, and its implications for data collection, integration, validation, and reconfiguration processes are explored. Information modules are cataloged for later use, and may be combined in several levels for maintenance. For each flight, information modules can then be selected from the catalog at a high level. These concepts take advantage of the reusability of Space Shuttle information to reduce the cost of reconfiguration as flight experience increases.

  7. Project Method in Preparation of Future Preschool Teachers

    ERIC Educational Resources Information Center

    Anisimova, Ellina; Ibatullin, Rinat

    2018-01-01

    This article covers the issue of formation of information competence of future preschool teachers. Efficiency of using information technologies in educational process depends on the level of information competence of a teacher. A modern teacher has to use information technologies reasonably, that contribute to enriching of development of cognitive…

  8. Audio-based, unsupervised machine learning reveals cyclic changes in earthquake mechanisms in the Geysers geothermal field, California

    NASA Astrophysics Data System (ADS)

    Holtzman, B. K.; Paté, A.; Paisley, J.; Waldhauser, F.; Repetto, D.; Boschi, L.

    2017-12-01

    The earthquake process reflects complex interactions of stress, fracture and frictional properties. New machine learning methods reveal patterns in time-dependent spectral properties of seismic signals and enable identification of changes in faulting processes. Our methods are based closely on those developed for music information retrieval and voice recognition, using the spectrogram instead of the waveform directly. Unsupervised learning involves identification of patterns based on differences among signals without any additional information provided to the algorithm. Clustering of 46,000 earthquakes of $0.3

  9. Optical recording of information on paper by CO2 and YAG-lasers

    NASA Astrophysics Data System (ADS)

    Bayev, S. G.; Bessemltsev, V. P.; Koronkevich, D. V.; Tkachuk, Y. N.

    1984-09-01

    Methods for outputting information from computers that have the advantages of typographic printing processes, but are distinguished by the lack of an intermediate medium are investigated. Methods for recording graphic and half-tone images are investigated that are based on layers of ink deposited on the paper in advance, as well as fixing a temperature-sensitive dye on the paper by using a focused laser beam with radiation power density of .000001 w/sq.cm. to heat the surface. IR process lasers provide good efficiency and resolution.

  10. Autoplan: A self-processing network model for an extended blocks world planning environment

    NASA Technical Reports Server (NTRS)

    Dautrechy, C. Lynne; Reggia, James A.; Mcfadden, Frank

    1990-01-01

    Self-processing network models (neural/connectionist models, marker passing/message passing networks, etc.) are currently undergoing intense investigation for a variety of information processing applications. These models are potentially very powerful in that they support a large amount of explicit parallel processing, and they cleanly integrate high level and low level information processing. However they are currently limited by a lack of understanding of how to apply them effectively in many application areas. The formulation of self-processing network methods for dynamic, reactive planning is studied. The long-term goal is to formulate robust, computationally effective information processing methods for the distributed control of semiautonomous exploration systems, e.g., the Mars Rover. The current research effort is focusing on hierarchical plan generation, execution and revision through local operations in an extended blocks world environment. This scenario involves many challenging features that would be encountered in a real planning and control environment: multiple simultaneous goals, parallel as well as sequential action execution, action sequencing determined not only by goals and their interactions but also by limited resources (e.g., three tasks, two acting agents), need to interpret unanticipated events and react appropriately through replanning, etc.

  11. Three-dimensional image signals: processing methods

    NASA Astrophysics Data System (ADS)

    Schiopu, Paul; Manea, Adrian; Craciun, Anca-Ileana; Craciun, Alexandru

    2010-11-01

    Over the years extensive studies have been carried out to apply coherent optics methods in real-time processing, communications and transmission image. This is especially true when a large amount of information needs to be processed, e.g., in high-resolution imaging. The recent progress in data-processing networks and communication systems has considerably increased the capacity of information exchange. We describe the results of literature investigation research of processing methods for the signals of the three-dimensional images. All commercially available 3D technologies today are based on stereoscopic viewing. 3D technology was once the exclusive domain of skilled computer-graphics developers with high-end machines and software. The images capture from the advanced 3D digital camera can be displayed onto screen of the 3D digital viewer with/ without special glasses. For this is needed considerable processing power and memory to create and render the complex mix of colors, textures, and virtual lighting and perspective necessary to make figures appear three-dimensional. Also, using a standard digital camera and a technique called phase-shift interferometry we can capture "digital holograms." These are holograms that can be stored on computer and transmitted over conventional networks. We present some research methods to process "digital holograms" for the Internet transmission and results.

  12. Hyperspectral image processing methods

    USDA-ARS?s Scientific Manuscript database

    Hyperspectral image processing refers to the use of computer algorithms to extract, store and manipulate both spatial and spectral information contained in hyperspectral images across the visible and near-infrared portion of the electromagnetic spectrum. A typical hyperspectral image processing work...

  13. Design and Implementation of Hydrologic Process Knowledge-base Ontology: A case study for the Infiltration Process

    NASA Astrophysics Data System (ADS)

    Elag, M.; Goodall, J. L.

    2013-12-01

    Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using the Formal Concept Approach (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.

  14. Mapping Miles and Huberman's Within-Case and Cross-Case Analysis Methods onto the Literature Review Process

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Weinbaum, Rebecca K.

    2016-01-01

    Recently, several authors have attempted to make the literature review process more transparent by providing a step-by-step guide to conducting literature reviews. However, although these works are very informative, none of them delineate how to display information extracted from literature reviews in a reader-friendly and visually appealing…

  15. A flexible statistical model for alignment of label-free proteomics data – incorporating ion mobility and product ion information

    PubMed Central

    2013-01-01

    Background The goal of many proteomics experiments is to determine the abundance of proteins in biological samples, and the variation thereof in various physiological conditions. High-throughput quantitative proteomics, specifically label-free LC-MS/MS, allows rapid measurement of thousands of proteins, enabling large-scale studies of various biological systems. Prior to analyzing these information-rich datasets, raw data must undergo several computational processing steps. We present a method to address one of the essential steps in proteomics data processing - the matching of peptide measurements across samples. Results We describe a novel method for label-free proteomics data alignment with the ability to incorporate previously unused aspects of the data, particularly ion mobility drift times and product ion information. We compare the results of our alignment method to PEPPeR and OpenMS, and compare alignment accuracy achieved by different versions of our method utilizing various data characteristics. Our method results in increased match recall rates and similar or improved mismatch rates compared to PEPPeR and OpenMS feature-based alignment. We also show that the inclusion of drift time and product ion information results in higher recall rates and more confident matches, without increases in error rates. Conclusions Based on the results presented here, we argue that the incorporation of ion mobility drift time and product ion information are worthy pursuits. Alignment methods should be flexible enough to utilize all available data, particularly with recent advancements in experimental separation methods. PMID:24341404

  16. A flexible statistical model for alignment of label-free proteomics data--incorporating ion mobility and product ion information.

    PubMed

    Benjamin, Ashlee M; Thompson, J Will; Soderblom, Erik J; Geromanos, Scott J; Henao, Ricardo; Kraus, Virginia B; Moseley, M Arthur; Lucas, Joseph E

    2013-12-16

    The goal of many proteomics experiments is to determine the abundance of proteins in biological samples, and the variation thereof in various physiological conditions. High-throughput quantitative proteomics, specifically label-free LC-MS/MS, allows rapid measurement of thousands of proteins, enabling large-scale studies of various biological systems. Prior to analyzing these information-rich datasets, raw data must undergo several computational processing steps. We present a method to address one of the essential steps in proteomics data processing--the matching of peptide measurements across samples. We describe a novel method for label-free proteomics data alignment with the ability to incorporate previously unused aspects of the data, particularly ion mobility drift times and product ion information. We compare the results of our alignment method to PEPPeR and OpenMS, and compare alignment accuracy achieved by different versions of our method utilizing various data characteristics. Our method results in increased match recall rates and similar or improved mismatch rates compared to PEPPeR and OpenMS feature-based alignment. We also show that the inclusion of drift time and product ion information results in higher recall rates and more confident matches, without increases in error rates. Based on the results presented here, we argue that the incorporation of ion mobility drift time and product ion information are worthy pursuits. Alignment methods should be flexible enough to utilize all available data, particularly with recent advancements in experimental separation methods.

  17. Image acquisitions, processing and analysis in the process of obtaining characteristics of horse navicular bone

    NASA Astrophysics Data System (ADS)

    Zaborowicz, M.; Włodarek, J.; Przybylak, A.; Przybył, K.; Wojcieszak, D.; Czekała, W.; Ludwiczak, A.; Boniecki, P.; Koszela, K.; Przybył, J.; Skwarcz, J.

    2015-07-01

    The aim of this study was investigate the possibility of using methods of computer image analysis for the assessment and classification of morphological variability and the state of health of horse navicular bone. Assumption was that the classification based on information contained in the graphical form two-dimensional digital images of navicular bone and information of horse health. The first step in the research was define the classes of analyzed bones, and then using methods of computer image analysis for obtaining characteristics from these images. This characteristics were correlated with data concerning the animal, such as: side of hooves, number of navicular syndrome (scale 0-3), type, sex, age, weight, information about lace, information about heel. This paper shows the introduction to the study of use the neural image analysis in the diagnosis of navicular bone syndrome. Prepared method can provide an introduction to the study of non-invasive way to assess the condition of the horse navicular bone.

  18. High-resolution remotely sensed small target detection by imitating fly visual perception mechanism.

    PubMed

    Huang, Fengchen; Xu, Lizhong; Li, Min; Tang, Min

    2012-01-01

    The difficulty and limitation of small target detection methods for high-resolution remote sensing data have been a recent research hot spot. Inspired by the information capture and processing theory of fly visual system, this paper endeavors to construct a characterized model of information perception and make use of the advantages of fast and accurate small target detection under complex varied nature environment. The proposed model forms a theoretical basis of small target detection for high-resolution remote sensing data. After the comparison of prevailing simulation mechanism behind fly visual systems, we propose a fly-imitated visual system method of information processing for high-resolution remote sensing data. A small target detector and corresponding detection algorithm are designed by simulating the mechanism of information acquisition, compression, and fusion of fly visual system and the function of pool cell and the character of nonlinear self-adaption. Experiments verify the feasibility and rationality of the proposed small target detection model and fly-imitated visual perception method.

  19. Bayesian or Laplacien inference, entropy and information theory and information geometry in data and signal processing

    NASA Astrophysics Data System (ADS)

    Mohammad-Djafari, Ali

    2015-01-01

    The main object of this tutorial article is first to review the main inference tools using Bayesian approach, Entropy, Information theory and their corresponding geometries. This review is focused mainly on the ways these tools have been used in data, signal and image processing. After a short introduction of the different quantities related to the Bayes rule, the entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, Fisher information, we will study their use in different fields of data and signal processing such as: entropy in source separation, Fisher information in model order selection, different Maximum Entropy based methods in time series spectral estimation and finally, general linear inverse problems.

  20. Designing Class Methods from Dataflow Diagrams

    NASA Astrophysics Data System (ADS)

    Shoval, Peretz; Kabeli-Shani, Judith

    A method for designing the class methods of an information system is described. The method is part of FOOM - Functional and Object-Oriented Methodology. In the analysis phase of FOOM, two models defining the users' requirements are created: a conceptual data model - an initial class diagram; and a functional model - hierarchical OO-DFDs (object-oriented dataflow diagrams). Based on these models, a well-defined process of methods design is applied. First, the OO-DFDs are converted into transactions, i.e., system processes that supports user task. The components and the process logic of each transaction are described in detail, using pseudocode. Then, each transaction is decomposed, according to well-defined rules, into class methods of various types: basic methods, application-specific methods and main transaction (control) methods. Each method is attached to a proper class; messages between methods express the process logic of each transaction. The methods are defined using pseudocode or message charts.

  1. Toward theoretical understanding of the fertility preservation decision-making process: Examining information processing among young women with cancer

    PubMed Central

    Hershberger, Patricia E.; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer

    2014-01-01

    Background Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. Objective The purpose of this paper is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Methods Using a grounded theory approach, 27 women with cancer participated in individual, semi-structured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by five dimensions within the Contemplate phase of the decision-making process framework. Results In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Conclusion Better understanding of theoretical underpinnings surrounding women’s information processes can facilitate decision support and improve clinical care. PMID:24552086

  2. A quantum informational approach for dissecting chemical reactions

    NASA Astrophysics Data System (ADS)

    Duperrouzel, Corinne; Tecmer, Paweł; Boguslawski, Katharina; Barcza, Gergely; Legeza, Örs; Ayers, Paul W.

    2015-02-01

    We present a conceptionally different approach to dissect bond-formation processes in metal-driven catalysis using concepts from quantum information theory. Our method uses the entanglement and correlation among molecular orbitals to analyze changes in electronic structure that accompany chemical processes. As a proof-of-principle example, the evolution of nickel-ethene bond-formation is dissected, which allows us to monitor the interplay of back-bonding and π-donation along the reaction coordinate. Furthermore, the reaction pathway of nickel-ethene complexation is analyzed using quantum chemistry methods, revealing the presence of a transition state. Our study supports the crucial role of metal-to-ligand back-donation in the bond-forming process of nickel-ethene.

  3. Knowledge information management toolkit and method

    DOEpatents

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  4. Evaluation of soil erosion risk using Analytic Network Process and GIS: a case study from Spanish mountain olive plantations.

    PubMed

    Nekhay, Olexandr; Arriaza, Manuel; Boerboom, Luc

    2009-07-01

    The study presents an approach that combined objective information such as sampling or experimental data with subjective information such as expert opinions. This combined approach was based on the Analytic Network Process method. It was applied to evaluate soil erosion risk and overcomes one of the drawbacks of USLE/RUSLE soil erosion models, namely that they do not consider interactions among soil erosion factors. Another advantage of this method is that it can be used if there are insufficient experimental data. The lack of experimental data can be compensated for through the use of expert evaluations. As an example of the proposed approach, the risk of soil erosion was evaluated in olive groves in Southern Spain, showing the potential of the ANP method for modelling a complex physical process like soil erosion.

  5. Analysis of acoustic emission signals and monitoring of machining processes

    PubMed

    Govekar; Gradisek; Grabec

    2000-03-01

    Monitoring of a machining process on the basis of sensor signals requires a selection of informative inputs in order to reliably characterize and model the process. In this article, a system for selection of informative characteristics from signals of multiple sensors is presented. For signal analysis, methods of spectral analysis and methods of nonlinear time series analysis are used. With the aim of modeling relationships between signal characteristics and the corresponding process state, an adaptive empirical modeler is applied. The application of the system is demonstrated by characterization of different parameters defining the states of a turning machining process, such as: chip form, tool wear, and onset of chatter vibration. The results show that, in spite of the complexity of the turning process, the state of the process can be well characterized by just a few proper characteristics extracted from a representative sensor signal. The process characterization can be further improved by joining characteristics from multiple sensors and by application of chaotic characteristics.

  6. A conversation-based process tracing method for use with naturalistic decisions: an evaluation study.

    PubMed

    Williamson, J; Ranyard, R; Cuthbert, L

    2000-05-01

    This study is an evaluation of a process tracing method developed for naturalistic decisions, in this case a consumer choice task. The method is based on Huber et al.'s (1997) Active Information Search (AIS) technique, but develops it by providing spoken rather than written answers to respondents' questions, and by including think aloud instructions. The technique is used within a conversation-based situation, rather than the respondent thinking aloud 'into an empty space', as is conventionally the case in think aloud techniques. The method results in a concurrent verbal protocol as respondents make their decisions, and a retrospective report in the form of a post-decision summary. The method was found to be virtually non-reactive in relation to think aloud, although the variable of Preliminary Attribute Elicitation showed some evidence of reactivity. This was a methodological evaluation, and as such the data reported are essentially descriptive. Nevertheless, the data obtained indicate that the method is capable of producing information about decision processes which could have theoretical importance in terms of evaluating models of decision-making.

  7. Simplified LCA and matrix methods in identifying the environmental aspects of a product system.

    PubMed

    Hur, Tak; Lee, Jiyong; Ryu, Jiyeon; Kwon, Eunsun

    2005-05-01

    In order to effectively integrate environmental attributes into the product design and development processes, it is crucial to identify the significant environmental aspects related to a product system within a relatively short period of time. In this study, the usefulness of life cycle assessment (LCA) and a matrix method as tools for identifying the key environmental issues of a product system were examined. For this, a simplified LCA (SLCA) method that can be applied to Electrical and Electronic Equipment (EEE) was developed to efficiently identify their significant environmental aspects for eco-design, since a full scale LCA study is usually very detailed, expensive and time-consuming. The environmentally responsible product assessment (ERPA) method, which is one of the matrix methods, was also analyzed. Then, the usefulness of each method in eco-design processes was evaluated and compared using the case studies of the cellular phone and vacuum cleaner systems. It was found that the SLCA and the ERPA methods provided different information but they complemented each other to some extent. The SLCA method generated more information on the inherent environmental characteristics of a product system so that it might be useful for new design/eco-innovation when developing a completely new product or method where environmental considerations play a major role from the beginning. On the other hand, the ERPA method gave more information on the potential for improving a product so that it could be effectively used in eco-redesign which intends to alleviate environmental impacts of an existing product or process.

  8. Method and system for providing work machine multi-functional user interface

    DOEpatents

    Hoff, Brian D [Peoria, IL; Akasam, Sivaprasad [Peoria, IL; Baker, Thomas M [Peoria, IL

    2007-07-10

    A method is performed to provide a multi-functional user interface on a work machine for displaying suggested corrective action. The process includes receiving status information associated with the work machine and analyzing the status information to determine an abnormal condition. The process also includes displaying a warning message on the display device indicating the abnormal condition and determining one or more corrective actions to handle the abnormal condition. Further, the process includes determining an appropriate corrective action among the one or more corrective actions and displaying a recommendation message on the display device reflecting the appropriate corrective action. The process may also include displaying a list including the remaining one or more corrective actions on the display device to provide alternative actions to an operator.

  9. Suitability aero-geophysical methods for generating conceptual soil maps and their use in the modeling of process-related susceptibility maps

    NASA Astrophysics Data System (ADS)

    Tilch, Nils; Römer, Alexander; Jochum, Birgit; Schattauer, Ingrid

    2014-05-01

    In the past years, several times large-scale disasters occurred in Austria, which were characterized not only by flooding, but also by numerous shallow landslides and debris flows. Therefore, for the purpose of risk prevention, national and regional authorities also require more objective and realistic maps with information about spatially variable susceptibility of the geosphere for hazard-relevant gravitational mass movements. There are many and various proven methods and models (e.g. neural networks, logistic regression, heuristic methods) available to create such process-related (e.g. flat gravitational mass movements in soil) suszeptibility maps. But numerous national and international studies show a dependence of the suitability of a method on the quality of process data and parameter maps (f.e. Tilch & Schwarz 2011, Schwarz & Tilch 2011). In this case, it is important that also maps with detailed and process-oriented information on the process-relevant geosphere will be considered. One major disadvantage is that only occasionally area-wide process-relevant information exists. Similarly, in Austria often only soil maps for treeless areas are available. However, in almost all previous studies, randomly existing geological and geotechnical maps were used, which often have been specially adapted to the issues and objectives. This is one reason why very often conceptual soil maps must be derived from geological maps with only hard rock information, which often have a rather low quality. Based on these maps, for example, adjacent areas of different geological composition and process-relevant physical properties are razor sharp delineated, which in nature appears quite rarly. In order to obtain more realistic information about the spatial variability of the process-relevant geosphere (soil cover) and its physical properties, aerogeophysical measurements (electromagnetic, radiometric), carried out by helicopter, from different regions of Austria were interpreted. Previous studies show that, especially with radiometric measurements, the two-dimensional spatial variability of the nature of the process-relevant soil, close to the surface can be determined. In addition, the electromagnetic measurements are more important to obtain three-dimensional information of the deeper geological conditions and to improve the area-specific geological knowledge and understanding. The validation of these measurements is done with terrestrial geoelectrical measurements. So both aspects, radiometric and electromagnetic measurements, are important and subsequently, interpretation of the geophysical results can be used as the parameter maps in the modeling of more realistic susceptibility maps with respect to various processes. Within this presentation, results of geophysical measurements, the outcome and the derived parameter maps, as well as first process-oriented susceptibility maps in terms of gravitational soil mass movements will be presented. As an example results which were obtained with a heuristic method in an area in Vorarlberg (Western Austria) will be shown. References: Schwarz, L. & Tilch, N. (2011): Why are good process data so important for the modelling of landslide susceptibility maps?- EGU-Postersession "Landslide hazard and risk assessment, and landslide management" (NH 3.6), Vienna. [http://www.geologie.ac.at/fileadmin/user_upload/dokumente/pdf/poster/poster_2011_egu_schwarz_tilch_1.pdf] Tilch, N. & Schwarz, L. 
(2011): Spatial and scale-dependent variability in data quality and their influence on susceptibility maps for gravitational mass movements in soil, modelled by heuristic method.- EGU-Postersession "Landslide hazard and risk assessment, and landslide management" (NH 3.6); Vienna. [http://www.geologie.ac.at/fileadmin/user_upload/dokumente/pdf/poster/poster_2011_egu_tilch_schwarz.pdf

  10. Adolescent and Young Women's Contraceptive Decision-Making Processes: Choosing "The Best Method for Her".

    PubMed

    Melo, Juliana; Peters, Marissa; Teal, Stephanie; Guiahi, Maryam

    2015-08-01

    To evaluate influences on adolescent and young women's contraceptive decision-making processes. We conducted 21 individual interviews with women who presented to an adolescent-focused Title X family planning clinic seeking a new contraceptive method. Data were collected using a semi-structured interview guide, audio-taped and transcribed. Three researchers independently coded the transcripts using grounded theory; codes were organized into overarching themes and discrepancies were resolved. After identification of themes, we organized the conceptual framework of the decision-making process using the transtheoretical model of behavior change in which participants move through 4 stages: (1) contemplation, (2) preparation, (3) action, and (4) maintenance. When contemplating contraception, most of our participants were highly motivated to avoid pregnancy. During preparation, participants gathered information related to their contraceptive concerns. Participants cited peers as primary informants and healthcare providers as experts in the field. Participants integrated information received with their personal concerns about contraception initiation; the most common concerns were effectiveness, method duration, convenience, and side effects. When participants acted on choosing a contraceptive method they described how it fit their individual needs. They considered their contraceptive experiences unique and not necessarily applicable to others. During maintenance, they acted as informants for other peers, but most commonly expressed that each individual must choose "the best method for her." When adolescent and young women select a contraceptive method they balance the benefits and risks of available methods portrayed by peers and provider in the context of their personal concerns. Peer influence appeared to be greatest when participants shared contraceptive concerns and goals. Copyright © 2015 North American Society for Pediatric and Adolescent Gynecology. Published by Elsevier Inc. All rights reserved.

  11. Hidden messenger revealed in Hawking radiation: A resolution to the paradox of black hole information loss

    NASA Astrophysics Data System (ADS)

    Zhang, Baocheng; Cai, Qing-yu; You, Li; Zhan, Ming-sheng

    2009-05-01

    Using standard statistical method, we discover the existence of correlations among Hawking radiations (of tunneled particles) from a black hole. The information carried by such correlations is quantified by mutual information between sequential emissions. Through a careful counting of the entropy taken out by the emitted particles, we show that the black hole radiation as tunneling is an entropy conservation process. While information is leaked out through the radiation, the total entropy is conserved. Thus, we conclude the black hole evaporation process is unitary.

  12. Determining e-Portfolio Elements in Learning Process Using Fuzzy Delphi Analysis

    ERIC Educational Resources Information Center

    Mohamad, Syamsul Nor Azlan; Embi, Mohamad Amin; Nordin, Norazah

    2015-01-01

    The present article introduces the Fuzzy Delphi method results obtained in the study on determining e-Portfolio elements in learning process for art and design context. This method bases on qualified experts that assure the validity of the collected information. In particular, the confirmation of elements is based on experts' opinion and…

  13. Design of solar thermal dryers for 24-hour food drying processes (abstract)

    USDA-ARS?s Scientific Manuscript database

    Solar drying is a ubiquitous method that has been adopted for many years as a food preservation method. Most of the published articles in the literature provide insight on the performance of solar dryers in service but little information on the dryer construction material selection process or mater...

  14. Decision Makers' Allocation of Home-Care Therapy Services: A Process Map

    PubMed Central

    Poss, Jeff; Egan, Mary; Rappolt, Susan; Berg, Katherine

    2013-01-01

    ABSTRACT Purpose: To explore decision-making processes currently used in allocating occupational and physical therapy services in home care for complex long-stay clients in Ontario. Method: An exploratory study using key-informant interviews and client vignettes was conducted with home-care decision makers (case managers and directors) from four home-care regions in Ontario. The interview data were analyzed using the framework analysis method. Results: The decision-making process for allocating therapy services has four stages: intake, assessment, referral to service provider, and reassessment. There are variations in the management processes deployed at each stage. The major variation is in the process of determining the volume of therapy services across home-care regions, primarily as a result of financial constraints affecting the home-care programme. Government funding methods and methods of information sharing also significantly affect home-care therapy allocation. Conclusion: Financial constraints in home care are the primary contextual factor affecting allocation of therapy services across home-care regions. Given the inflation of health care costs, new models of funding and service delivery need to be developed to ensure that the right person receives the right care before deteriorating and requiring more costly long-term care. PMID:24403672

  15. Designing quantum information processing via structural physical approximation.

    PubMed

    Bae, Joonwoo

    2017-10-01

    In quantum information processing it may be possible to have efficient computation and secure communication beyond the limitations of classical systems. In a fundamental point of view, however, evolution of quantum systems by the laws of quantum mechanics is more restrictive than classical systems, identified to a specific form of dynamics, that is, unitary transformations and, consequently, positive and completely positive maps to subsystems. This also characterizes classes of disallowed transformations on quantum systems, among which positive but not completely maps are of particular interest as they characterize entangled states, a general resource in quantum information processing. Structural physical approximation offers a systematic way of approximating those non-physical maps, positive but not completely positive maps, with quantum channels. Since it has been proposed as a method of detecting entangled states, it has stimulated fundamental problems on classifications of positive maps and the structure of Hermitian operators and quantum states, as well as on quantum measurement such as quantum design in quantum information theory. It has developed efficient and feasible methods of directly detecting entangled states in practice, for which proof-of-principle experimental demonstrations have also been performed with photonic qubit states. Here, we present a comprehensive review on quantum information processing with structural physical approximations and the related progress. The review mainly focuses on properties of structural physical approximations and their applications toward practical information applications.

  16. Designing quantum information processing via structural physical approximation

    NASA Astrophysics Data System (ADS)

    Bae, Joonwoo

    2017-10-01

    In quantum information processing it may be possible to have efficient computation and secure communication beyond the limitations of classical systems. In a fundamental point of view, however, evolution of quantum systems by the laws of quantum mechanics is more restrictive than classical systems, identified to a specific form of dynamics, that is, unitary transformations and, consequently, positive and completely positive maps to subsystems. This also characterizes classes of disallowed transformations on quantum systems, among which positive but not completely maps are of particular interest as they characterize entangled states, a general resource in quantum information processing. Structural physical approximation offers a systematic way of approximating those non-physical maps, positive but not completely positive maps, with quantum channels. Since it has been proposed as a method of detecting entangled states, it has stimulated fundamental problems on classifications of positive maps and the structure of Hermitian operators and quantum states, as well as on quantum measurement such as quantum design in quantum information theory. It has developed efficient and feasible methods of directly detecting entangled states in practice, for which proof-of-principle experimental demonstrations have also been performed with photonic qubit states. Here, we present a comprehensive review on quantum information processing with structural physical approximations and the related progress. The review mainly focuses on properties of structural physical approximations and their applications toward practical information applications.

  17. P-Code-Enhanced Encryption-Mode Processing of GPS Signals

    NASA Technical Reports Server (NTRS)

    Young, Lawrence; Meehan, Thomas; Thomas, Jess B.

    2003-01-01

    A method of processing signals in a Global Positioning System (GPS) receiver has been invented to enable the receiver to recover some of the information that is otherwise lost when GPS signals are encrypted at the transmitters. The need for this method arises because, at the option of the military, the precision GPS code (P-code) is sometimes encrypted by a secret binary code, denoted the A-code. Authorized users can recover the full signal with knowledge of the A-code. However, even in the absence of that knowledge, one can track the encrypted signal by using an estimate of the A-code. The present invention is a method of making and using such an estimate. In comparison with prior such methods, this method makes it possible to recover more of the lost information and obtain greater accuracy.

  18. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  19. 40 CFR 721.4040 - Glycols, polyethylene-, 3-sulfo-2-hydroxypropyl-p-(1,1,3,3-tetra-methylbutyl)phenyl ether, sodium...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... information, and any information on methods for protecting against such risk, into an MSDS as described at... Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT SIGNIFICANT NEW USES OF... substance is any manner or method of manufacture, import, or processing associated with any use of this...

  20. 40 CFR 721.3152 - Ethanaminium, N-ethyl-2-hydroxy-N,N-bis(2-hydroxyethyl)-, diester with C12-18 fatty acids, ethyl...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... must incorporate this new information, and any information on methods for protecting against such risk....3152 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT... significant new use of this substance is any manner or method of manufacture, import, or processing associated...

  1. [Dissemination of medical information in Europe, the USA and Japan, 1850-1870: focusing on information concerning the hypodermic injection method].

    PubMed

    Tsukisawa, Miyoko

    2011-12-01

    Modern medicine was introduced in Japan in the second half of the nineteenth century. In order to investigate this historical process, this paper focuses on the dissemination of information about a new medical technology developed in the mid-nineteenth century, comparing access to medical information in Europe, the USA and Japan. The hypodermic injection method was introduced into the clinical field in Europe and the USA as a newly developed therapeutic method between the 1850s and 1870s. This study analyzed information on the medical assessments of this method by clinicians of the period. The crucial factor in accumulating this information was the development of a worldwide inter-medical communication circle with the aid of the medical journals. Information on the hypodermic injection method was introduced in Japan almost simultaneously with its introduction in Europe and the USA. However, because of geographical distance and the language barrier, Japanese clinicians lacked access to this worldwide communication circle, and they accepted the new method without adequate medical technology assessment.

  2. The architecture of the management system of complex steganographic information

    NASA Astrophysics Data System (ADS)

    Evsutin, O. O.; Meshcheryakov, R. V.; Kozlova, A. S.; Solovyev, T. M.

    2017-01-01

    The aim of the study is to create a wide-area information system that allows one to control the processes of generation, embedding, extraction, and detection of steganographic information. In this paper, the following problems are considered: the definition of the system scope and the development of its architecture. For the algorithmic support of the system, classical steganographic methods are used to embed information, while methods of mathematical statistics and computational intelligence are used to detect embedded information. The main result of the paper is the architecture of the management system for complex steganographic information. The suggested architecture utilizes cloud technology in order to provide service as a web service via the Internet, and is meant to process streams of multimedia data coming from many sources of different types. The information system, built in accordance with the proposed architecture, will be used in the following areas: hidden transfer of documents protected by medical secrecy in telemedicine systems; copyright protection of online content in public networks; and prevention of information leakage caused by insiders.
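
    As a concrete illustration of the classical embedding methods the system builds on, the sketch below implements least-significant-bit (LSB) embedding and extraction. It is a minimal example, not the system's actual algorithm; the cover image and message are invented.

        import numpy as np

        def embed(cover, bits):
            # Write each message bit into the least significant bit of a pixel.
            stego = cover.copy().ravel()
            for i, bit in enumerate(bits):
                stego[i] = (stego[i] & 0xFE) | bit
            return stego.reshape(cover.shape)

        def extract(stego, n):
            # Read the message back from the first n least significant bits.
            return [int(v & 1) for v in stego.ravel()[:n]]

        rng = np.random.default_rng(8)
        cover = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)
        message = [1, 0, 1, 1, 0, 0, 1, 0]
        assert extract(embed(cover, message), len(message)) == message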

  3. From the Research Laboratory to the Operating Company: How Information Travels.

    ERIC Educational Resources Information Center

    Coppin, Ann S.; Palmer, Linda L.

    1980-01-01

    Reviews transmission processes of Chevron Oil Field Research Company (COFRC) research results from laboratories to end-user operating companies worldwide. Information dissemination methods described included informal communication, intercompany meetings, visits by COFRC personnel to operating company offices, distribution of written reports,…

  4. Using the Work System Method with Freshman Information Systems Students

    ERIC Educational Resources Information Center

    Recker, Jan; Alter, Steven

    2012-01-01

    Recent surveys of information technology management professionals show that understanding business domains in terms of business productivity and cost reduction potential, knowledge of different vertical industry segments and their information requirements, understanding of business processes and client-facing skills are more critical for…

  5. Linear information retrieval method in X-ray grating-based phase contrast imaging and its interchangeability with tomographic reconstruction

    NASA Astrophysics Data System (ADS)

    Wu, Z.; Gao, K.; Wang, Z. L.; Shao, Q. G.; Hu, R. F.; Wei, C. X.; Zan, G. B.; Wali, F.; Luo, R. H.; Zhu, P. P.; Tian, Y. C.

    2017-06-01

    In X-ray grating-based phase contrast imaging, information retrieval is necessary for quantitative research, especially for phase tomography. However, numerous and repetitive processes have to be performed for tomographic reconstruction. In this paper, we report a novel information retrieval method, which enables retrieving phase and absorption information by means of a linear combination of two mutually conjugate images. Because multiplication distributes over addition and addition is commutative and associative, the information retrieval can be performed after tomographic reconstruction, thus simplifying the information retrieval procedure dramatically. The theoretical model of this method is established in both parallel beam geometry for the Talbot interferometer and fan beam geometry for the Talbot-Lau interferometer. Numerical experiments are also performed to confirm the feasibility and validity of the proposed method. In addition, we discuss its applicability in cone beam geometry and its advantages compared with other methods. Moreover, this method can also be employed in other differential phase contrast imaging methods, such as diffraction enhanced imaging, non-interferometric imaging, and edge illumination.
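
    The interchangeability rests on linearity alone, which a short numpy check makes concrete. This is an illustration rather than the paper's implementation: the matrix R is a hypothetical stand-in for any linear reconstruction operator (filtered back projection is linear), and a, b stand for the retrieval coefficients.

        import numpy as np

        rng = np.random.default_rng(0)
        I1 = rng.random(64)            # stand-ins for two mutually conjugate images
        I2 = rng.random(64)
        R = rng.random((64, 64))       # stand-in linear reconstruction operator
        a, b = 0.5, -0.5               # retrieval = linear combination a*I1 + b*I2

        # Retrieve first, then reconstruct ...
        retrieve_then_reconstruct = R @ (a * I1 + b * I2)
        # ... or reconstruct each conjugate image, then retrieve.
        reconstruct_then_retrieve = a * (R @ I1) + b * (R @ I2)

        # Linearity makes the two orders agree to machine precision.
        assert np.allclose(retrieve_then_reconstruct, reconstruct_then_retrieve)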

  6. Future electro-optical sensors and processing in urban operations

    NASA Astrophysics Data System (ADS)

    Grönwall, Christina; Schwering, Piet B.; Rantakokko, Jouni; Benoist, Koen W.; Kemp, Rob A. W.; Steinvall, Ove; Letalick, Dietmar; Björkert, Stefan

    2013-10-01

    In the Electro-Optical Sensors and processing in Urban Operations (ESUO) study we pave the way for the European Defence Agency (EDA) group of electro-optics experts (IAP03) toward a common understanding of the optimal distribution of processing functions between the different platforms. Combinations of local, distributed and centralized processing are proposed, so that processing functionality can be matched to the required power and the available communication data rates to obtain the desired reaction times. In the study, three priority scenarios were defined: camp protection, patrol and house search. For these scenarios, present-day and future sensors and signal processing technologies were studied. A method for analyzing information quality in single- and multi-sensor systems has been applied, and a method for estimating reaction times for the transmission of data through the chain of command has been proposed and used. These methods are documented and can be used to modify the scenarios or be applied to other ones. Present-day data processing is organized mainly locally; only very limited exchange of information with other platforms takes place, mainly at a high information level. The main issues that arose from the analysis of present-day systems and methodology are the slow reaction time, due to the limited field of view of present-day sensors, and the lack of robust automated processing. Efficient handover schemes between wide and narrow field-of-view sensors may, however, reduce the delay times. The main effort in the study was forecasting the signal processing of EO sensors in the next ten to twenty years. Distributed processing is proposed between hand-held and vehicle-based sensors, possibly accompanied by cloud processing on board several vehicles. Additionally, to perform sensor fusion on sensor data originating from different platforms and to make full use of UAV imagery, a combination of distributed and centralized processing is essential; sensor fusion of heterogeneous sensors plays a central role in future processing. The changes that these new technologies bring to future urban operations will be improved quality of information, shorter reaction times and lower operator load.

  7. Process Mining-Based Method of Designing and Optimizing the Layouts of Emergency Departments in Hospitals.

    PubMed

    Rismanchian, Farhood; Lee, Young Hoon

    2017-07-01

    This article proposes an approach to help designers analyze complex care processes and identify the optimal layout of an emergency department (ED) considering several objectives simultaneously. These objectives include minimizing the distances traveled by patients, maximizing design preferences, and minimizing relocation costs. Rising demand for healthcare services leads to increasing demand for new hospital buildings as well as for renovating existing ones. Operations management techniques have been successfully applied in both manufacturing and service industries to design more efficient layouts; however, the high complexity of healthcare processes makes it challenging to apply these techniques in healthcare environments. Process mining techniques were applied to address the problem of complexity and to enhance healthcare process analysis. Process-related information, such as information about the clinical pathways, was extracted from the information system of an ED. A goal programming approach was then employed to find a single layout that would simultaneously satisfy several objectives. The layout identified using the proposed method improved the distances traveled by noncritical and critical patients by 42.2% and 47.6%, respectively, and minimized the relocation costs. This study has shown that an efficient placement of the clinical units yields remarkable improvements in the distances traveled by patients.
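
    As a toy illustration of trading off several layout objectives against each other (a brute-force stand-in for the paper's goal programming model; every number and weight below is invented):

        import itertools
        import numpy as np

        # Hypothetical data: 3 clinical units assigned to 3 candidate locations.
        travel = np.array([[10, 4, 7],     # travel[u, loc]: patient travel distance
                           [ 6, 9, 3],     # if unit u is placed at location loc
                           [ 8, 2, 5]], dtype=float)
        preference = np.array([[1, 0, 2],  # design-preference penalty (lower is better)
                               [0, 2, 1],
                               [2, 1, 0]], dtype=float)
        current = (0, 1, 2)                # current assignment, for relocation costs
        reloc_cost = 5.0                   # assumed flat cost per relocated unit
        w_travel, w_pref, w_reloc = 1.0, 0.5, 0.2   # assumed objective weights

        def score(assign):
            travel_term = sum(travel[u, loc] for u, loc in enumerate(assign))
            pref_term = sum(preference[u, loc] for u, loc in enumerate(assign))
            reloc_term = reloc_cost * sum(a != c for a, c in zip(assign, current))
            return w_travel * travel_term + w_pref * pref_term + w_reloc * reloc_term

        best = min(itertools.permutations(range(3)), key=score)
        print("best layout:", best, "weighted score:", score(best))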

  8. Investigation of signal models and methods for evaluating structures of processing telecommunication information exchange systems under acoustic noise conditions

    NASA Astrophysics Data System (ADS)

    Kropotov, Y. A.; Belov, A. A.; Proskuryakov, A. Y.; Kolpakov, A. A.

    2018-05-01

    The paper considers models and methods for estimating signals during the transmission of information messages in telecommunication systems for audio exchange. One-dimensional probability distribution functions that can be used to separate useful signals from acoustic noise interference are presented. An approach to estimating the correlation and spectral functions of the parameters of acoustic signals is proposed, based on a parametric representation of the acoustic signals and of the noise components. The paper suggests an approach to improving the efficiency of interference cancellation and to extracting the necessary information when processing signals from telecommunications systems; here, the suppression of acoustic noise is based on methods of adaptive filtering and adaptive compensation. The work also describes models of echo signals and the structure of subscriber devices in operational command telecommunications systems.
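
    A standard instance of the adaptive filtering and compensation referred to above is least-mean-squares (LMS) noise cancellation, sketched below with invented signals, filter length and step size:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 5000
        speech = np.sin(2 * np.pi * 0.01 * np.arange(n))   # stand-in useful signal
        noise = rng.normal(size=n)                          # acoustic noise source
        # The microphone picks up speech plus a filtered copy of the noise.
        primary = speech + np.convolve(noise, [0.6, 0.3], mode="same")

        taps, mu = 8, 0.01          # filter length and LMS step size (assumed)
        w = np.zeros(taps)
        cleaned = np.zeros(n)
        for i in range(taps, n):
            x = noise[i - taps:i][::-1]    # reference window (noise source)
            e = primary[i] - w @ x         # error doubles as the cleaned sample
            w += 2 * mu * e * x            # LMS weight update
            cleaned[i] = e

        print("residual power:", np.mean((cleaned[taps:] - speech[taps:]) ** 2))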

  9. A multistage motion vector processing method for motion-compensated frame interpolation.

    PubMed

    Huang, Ai-Mei; Nguyen, Truong Q

    2008-05-01

    In this paper, a novel, low-complexity motion vector processing algorithm at the decoder is proposed for motion-compensated frame interpolation or frame rate up-conversion. We address the problems of broken edges and deformed structures in an interpolated frame by hierarchically refining motion vectors on different block sizes. Our method explicitly considers the reliability of each received motion vector and is capable of preserving structure information. This is achieved by analyzing the distribution of residual energies and effectively merging blocks that have unreliable motion vectors. The motion vector reliability information is also used as prior knowledge in motion vector refinement, using a constrained vector median filter to avoid choosing an equally unreliable vector. We also propose using chrominance information in our method. Experimental results show that the proposed scheme achieves better visual quality and is robust, even in video sequences with complex scenes and fast motion.
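
    A minimal sketch of the constrained vector median idea described above (the neighborhood, reliability flags and L2 distance are illustrative assumptions):

        import numpy as np

        def constrained_vector_median(vectors, reliable):
            # Among reliable candidates, pick the vector minimizing the summed
            # L2 distance to all neighborhood vectors (fall back to all vectors
            # if none are flagged reliable).
            vectors = np.asarray(vectors, dtype=float)
            candidates = vectors[reliable] if np.any(reliable) else vectors
            dists = np.linalg.norm(candidates[:, None, :] - vectors[None, :, :],
                                   axis=2).sum(axis=1)
            return candidates[np.argmin(dists)]

        # Hypothetical 3x3 neighborhood; the third vector is an outlier flagged
        # unreliable by its high residual energy.
        mvs = [(1, 0), (1, 1), (9, -7), (1, 0), (2, 1), (1, 1), (0, 0), (1, 0), (1, 1)]
        ok = np.array([True, True, False, True, True, True, True, True, True])
        print(constrained_vector_median(mvs, ok))   # close to the local motion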

  10. Uncertainty in Agricultural Impact Assessment

    NASA Technical Reports Server (NTRS)

    Wallach, Daniel; Mearns, Linda O.; Rivington, Michael; Antle, John M.; Ruane, Alexander C.

    2014-01-01

    This chapter considers issues concerning uncertainty associated with modeling and its use within agricultural impact assessments. Information about uncertainty is important for those who develop assessment methods, since that information indicates the need for, and the possibility of, improvement of the methods and databases. Such information also allows one to compare alternative methods. Information about the sources of uncertainties is an aid in prioritizing further work on the impact assessment method. Uncertainty information is also necessary for those who apply assessment methods, e.g., for projecting climate change impacts on agricultural production and for stakeholders who want to use the results as part of a decision-making process (e.g., for adaptation planning). For them, uncertainty information indicates the degree of confidence they can place in the simulated results. Quantification of uncertainty also provides stakeholders with an important guideline for making decisions that are robust across the known uncertainties. Thus, uncertainty information is important for any decision based on impact assessment. Ultimately, we are interested in knowledge about uncertainty so that information can be used to achieve positive outcomes from agricultural modeling and impact assessment.

  11. Case retrieval in medical databases by fusing heterogeneous information.

    PubMed

    Quellec, Gwénolé; Lamard, Mathieu; Cazuguel, Guy; Roux, Christian; Cochener, Béatrice

    2011-01-01

    A novel content-based heterogeneous information retrieval framework, particularly well suited to browsing medical databases and supporting new-generation computer-aided diagnosis (CADx) systems, is presented in this paper. It was designed to retrieve possibly incomplete documents, consisting of several images and semantic information, from a database; more complex data types, such as videos, can also be included in the framework. The proposed retrieval method relies on image processing, in order to characterize each individual image in a document by its digital content, and on information fusion. Once the available images in a query document are characterized, a degree of match between the query document and each reference document stored in the database is defined for each attribute (an image feature or a metadata item). A Bayesian network is used to recover missing information if need be. Finally, two novel information fusion methods are proposed to combine these degrees of match, in order to rank the reference documents by decreasing relevance for the query. In the first method, the degrees of match are fused by the Bayesian network itself. In the second method, they are fused by the Dezert-Smarandache theory: the second approach lets us model our confidence in each source of information (i.e., each attribute) and take it into account in the fusion process for better retrieval performance. The proposed methods were applied to two heterogeneous medical databases, a diabetic retinopathy database and a mammography screening database, for computer-aided diagnosis. Precisions at five of 0.809 ± 0.158 and 0.821 ± 0.177, respectively, were obtained for these two databases, which is very promising.
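
    The final ranking step can be illustrated with a far simpler stand-in for the paper's Bayesian-network and Dezert-Smarandache fusion: a confidence-weighted average of the degrees of match that skips missing attributes. All numbers below are invented.

        import numpy as np

        # Degrees of match (0..1) of a query against four reference documents,
        # for three attributes (e.g. an image feature and two metadata fields).
        match = np.array([[0.9, 0.6, 0.8],
                          [0.4, 0.7, 0.5],
                          [0.8, np.nan, 0.9],   # one document lacks an attribute
                          [0.2, 0.3, 0.1]])
        confidence = np.array([0.5, 0.2, 0.3])  # assumed trust in each attribute

        weights = np.where(np.isnan(match), 0.0, confidence)
        scores = np.nansum(match * confidence, axis=1) / weights.sum(axis=1)
        print("documents by decreasing relevance:", np.argsort(scores)[::-1])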

  12. Strengthening the Social Information-Processing Skills of Children: A Controlled Test of the "Let's Be Friends" Program in China

    ERIC Educational Resources Information Center

    Wu, Fan; Fraser, Mark W.; Guo, Shenyang; Day, Steven H.; Galinsky, Maeda J.

    2016-01-01

    Objective: The study had two objectives: (a) to adapt for Chinese children an intervention designed to strengthen the social information-processing (SIP) skills of children in the United States, and (b) to pilot test the adapted intervention in China. Methods: Adaptation of the "Making Choices" program involved reviewing Chinese…

  13. Effect of Warning Placement on the Information Processing of College Students Reading an OTC Drug Facts Panel

    ERIC Educational Resources Information Center

    Bhansali, Archita H.; Sangani, Darshan S.; Mhatre, Shivani K.; Sansgiry, Sujit S.

    2018-01-01

    Objective: To compare three over-the-counter (OTC) Drug Facts panel versions for information processing optimization among college students. Participants: University of Houston students (N = 210) participated in a cross-sectional survey from January to May 2010. Methods: A current FDA label was compared to two experimental labels developed using the…

  14. Using High Performance Computing to Examine the Processes of Neurogenesis Underlying Pattern Separation/Completion of Episodic Information.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aimone, James Bradley; Betty, Rita

    Sandia researchers developed novel methods and metrics for studying the computational function of neurogenesis, thus generating substantial impact to the neuroscience and neural computing communities. This work could benefit applications in machine learning and other analysis activities.

  15. A Coding System for Qualitative Studies of the Information-Seeking Process in Computer Science Research

    ERIC Educational Resources Information Center

    Moral, Cristian; de Antonio, Angelica; Ferre, Xavier; Lara, Graciela

    2015-01-01

    Introduction: In this article we propose a qualitative analysis tool--a coding system--that can support the formalisation of the information-seeking process in a specific field: research in computer science. Method: In order to elaborate the coding system, we have conducted a set of qualitative studies, more specifically a focus group and some…

  16. Parallel approach to incorporating face image information into dialogue processing

    NASA Astrophysics Data System (ADS)

    Ren, Fuji

    2000-10-01

    There are many kinds of so-called irregular expressions in natural dialogues. Even when the words of a conversation are the same, different meanings can be conveyed by the speaker's feelings or facial expression. To understand dialogues well, a flexible dialogue processing system must properly infer the speaker's view. However, it is difficult to obtain the meaning of the speaker's sentences in various scenes using traditional methods. In this paper, a new approach to dialogue processing that incorporates information from the speaker's face is presented. We first divide conversation statements into several simple tasks. Second, we process each simple task using an independent processor. Third, we employ the speaker's facial information to estimate the speaker's view and so resolve ambiguities in dialogues. The approach presented in this paper can work efficiently because the independent processors run in parallel, writing partial results to a shared memory, incorporating partial results at appropriate points, and complementing each other. A parallel algorithm and a method for employing the face information in dialogue machine translation are discussed, and some results are included in this paper.

  17. Welding, brazing, and soldering handbook

    NASA Technical Reports Server (NTRS)

    Kilgore, A. B.; Koehler, M. L.; Metzler, J. W.; Sturges, S. R.

    1969-01-01

    Handbook gives information on the selection and application of welding, brazing, and soldering techniques for joining various metals. Summary descriptions of processes, criteria for process selection, and advantages of different methods are given.

  18. Community sensitization and decision-making for trial participation: a mixed-methods study from The Gambia.

    PubMed

    Dierickx, Susan; O'Neill, Sarah; Gryseels, Charlotte; Immaculate Anyango, Edna; Bannister-Tyrrell, Melanie; Okebe, Joseph; Mwesigwa, Julia; Jaiteh, Fatou; Gerrets, René; Ravinetto, Raffaella; D'Alessandro, Umberto; Peeters Grietens, Koen

    2017-08-16

    Ensuring individual free and informed decision-making for research participation is challenging. It is thought that preliminarily informing communities through 'community sensitization' procedures may improve individual decision-making. This study set out to assess the relevance of community sensitization for individual decision-making in research participation in rural Gambia. This anthropological mixed-methods study triangulated qualitative methods and quantitative survey methods in the context of an observational study and a clinical trial on malaria carried out by the Medical Research Council Unit Gambia. Although 38.7% of the respondents were present during sensitization sessions, 91.1% of the respondents were inclined to participate in the trial when surveyed after the sensitization and prior to the informed consent process. This difference can be explained by the informal transmission of information within the community after the community sensitization, expectations such as the benefits of participation based on previous research experiences, and the positive reputation of the research institute. Commonly mentioned barriers to participation were blood sampling and the potential disapproval of the household head. Community sensitization is effective in providing first-hand, reliable information to communities as the information is cascaded to those who could not attend the sessions. However, further research is needed to assess how the informal spread of information further shapes people's expectations, how the process engages with existing social relations and hierarchies (e.g. local political power structures; permissions of heads of households) and how this influences or changes individual consent. © 2017 The Authors Developing World Bioethics Published by John Wiley & Sons Ltd.

  19. Using fuzzy fractal features of digital images for the material surface analysis

    NASA Astrophysics Data System (ADS)

    Privezentsev, D. G.; Zhiznyakov, A. L.; Astafiev, A. V.; Pugin, E. V.

    2018-01-01

    Edge detection is an important task in image processing, with many established approaches such as the Sobel and Canny operators. One promising direction in image processing is the use of fuzzy logic and fuzzy set theory, which can increase processing quality by representing information in fuzzy form. Most existing fuzzy image processing methods switch to fuzzy sets at very late stages, which leads to the loss of some useful information. In this paper, a novel method of edge detection based on fuzzy image representation and fuzzy pixels is proposed. With this approach, the image is converted to fuzzy form in the first step. Different approaches to this conversion are described, and several membership functions for fuzzy pixel description, together with requirements on their form, are given. A novel approach to edge detection based on the Sobel operator and fuzzy image representation is proposed. Experimental testing of the developed method was performed on remote sensing images.
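
    A minimal sketch of the fuzzify-first idea, using a sigmoid membership function and the Sobel operator (the membership function, its parameters and the threshold are illustrative assumptions, not the paper's choices):

        import numpy as np
        from scipy import ndimage

        def fuzzify(image, midpoint=0.5, steepness=10.0):
            # Map intensities to fuzzy membership degrees with a sigmoid.
            return 1.0 / (1.0 + np.exp(-steepness * (image - midpoint)))

        rng = np.random.default_rng(2)
        img = np.zeros((64, 64))
        img[:, 32:] = 1.0                        # a vertical step edge
        img += 0.05 * rng.normal(size=img.shape)

        fuzzy = fuzzify(img)                     # convert to fuzzy form first
        gx = ndimage.sobel(fuzzy, axis=1)        # Sobel gradients on memberships
        gy = ndimage.sobel(fuzzy, axis=0)
        edges = np.hypot(gx, gy) > 1.0           # ad hoc threshold
        print("edge pixels found:", int(edges.sum()))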

  20. Beyond mind-reading: multi-voxel pattern analysis of fMRI data.

    PubMed

    Norman, Kenneth A; Polyn, Sean M; Detre, Greg J; Haxby, James V

    2006-09-01

    A key challenge for cognitive neuroscience is determining how mental representations map onto patterns of neural activity. Recently, researchers have started to address this question by applying sophisticated pattern-classification algorithms to distributed (multi-voxel) patterns of functional MRI data, with the goal of decoding the information that is represented in the subject's brain at a particular point in time. This multi-voxel pattern analysis (MVPA) approach has led to several impressive feats of mind reading. More importantly, MVPA methods constitute a useful new tool for advancing our understanding of neural information processing. We review how researchers are using MVPA methods to characterize neural coding and information processing in domains ranging from visual perception to memory search.
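
    In its simplest form, MVPA is a cross-validated classifier applied to multi-voxel activity patterns. The sketch below runs such a decoding analysis on synthetic data standing in for fMRI patterns (trial counts, voxel counts and effect size are invented):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)
        X = rng.normal(size=(100, 50))     # 100 trials x 50 voxels
        y = np.repeat([0, 1], 50)          # two experimental conditions
        X[y == 1, :10] += 0.8              # condition effect in 10 voxels

        # Decode the condition from the distributed pattern, across 5 folds.
        scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
        print("decoding accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))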

  1. Effects of flow restoration on mussel growth in a Wild and Scenic North American River

    PubMed Central

    2013-01-01

    Background Freshwater mussels remain among the most imperiled species in North America due primarily to habitat loss or degradation. Understanding how mussels respond to habitat changes can improve conservation efforts. Mussels deposit rings in their shells from which age and growth information can be read and thus used to evaluate how mussels respond to changes in habitat. However, discrepancies between methodological approaches to obtaining life history information from growth rings have led to considerable uncertainty regarding the life history characteristics of many mussel species. In this study we compared two processing methods, internal and external ring examination, to obtain age and growth information for two populations of mussels in the St. Croix River, MN, and evaluated how mussel growth responded to changes in the operation of a hydroelectric dam. Results External ring counts consistently underestimated internal ring counts by 4 years. Despite this difference, internal and external growth patterns were consistent. In 2000, the hydroelectric dam switched from operating on a peaking schedule to run-of-the-river/partial peaking. Growth patterns between a site upstream and a site downstream of the dam were similar both before and after the change in operation. At the downstream site, however, older mussels had higher growth rates after the change in operation than same-sized mussels collected before the change. Conclusions Because growth patterns between internal and external processing methods were consistent, we suggest that external processing is an effective method to obtain growth information despite providing inaccurate age information. External processing is advantageous over internal processing due to its non-destructive nature. Applying this information to analyze the influence of the operation change in the hydroelectric dam, we suggest that changing to run-of-the-river/partial peaking operation has benefited the growth of older mussels below the dam. PMID:23452382

  2. Comparing Sensory Information Processing and Alexithymia between People with Substance Dependency and Normal

    PubMed Central

    Bashapoor, Sajjad; Hosseini-Kiasari, Seyyedeh Tayebeh; Daneshvar, Somayeh; Kazemi-Taskooh, Zeinab

    2015-01-01

    Background Sensory information processing and alexithymia are two important factors in determining behavioral reactions. Some studies explain the effect of the sensitivity of sensory processing and alexithymia on the tendency to substance abuse. Given that, the aim of the current study was to compare the styles of sensory information processing and alexithymia between substance-dependent people and normal ones. Methods The research method was cross-sectional, and the statistical population of the current study comprised all substance-dependent men present in substance-quitting camps of Masal, Iran, in October 2013 (n = 78). Thirty-six persons were selected randomly from this population by simple random sampling as the study group, and 36 persons were selected from the normal population in the same way as the comparison group. Both groups were evaluated using the Toronto alexithymia scale (TAS) and the adult sensory profile, and the multivariate analysis of variance (MANOVA) test was applied to analyze the data. Findings The results showed significant differences between the two groups in low registration (P < 0.020, F = 5.66), sensation seeking (P < 0.050, F = 1.92), and sensory avoidance (P < 0.008, F = 7.52) as components of sensory processing, and in difficulty describing emotions (P < 0.001, F = 15.01) and difficulty identifying emotions (P < 0.002, F = 10.54) as components of alexithymia. However, no significant differences were found between the two groups in the components of sensory sensitivity (P < 0.170, F = 1.92) and externally oriented thinking style (P < 0.060, F = 3.60). Conclusion These results showed that substance-dependent people process sensory information in a different way than normal people and show more alexithymia features than them. PMID:26885354

  3. Progress toward scalable tomography of quantum maps using twirling-based methods and information hierarchies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lopez, Cecilia C.; Theoretische Physik, Universitaet des Saarlandes, D-66041 Saarbruecken; Departament de Fisica, Universitat Autonoma de Barcelona, E-08193 Bellaterra

    2010-06-15

    We present in a unified manner the existing methods for scalable partial quantum process tomography. We focus on two main approaches: the one presented in Bendersky et al. [Phys. Rev. Lett. 100, 190403 (2008)] and the ones described, respectively, in Emerson et al. [Science 317, 1893 (2007)] and Lopez et al. [Phys. Rev. A 79, 042328 (2009)], which can be combined together. The methods share an essential feature: they are based on the idea that the tomography of a quantum map can be efficiently performed by studying certain properties of a twirling of such a map. From this perspective, in this paper we present extensions, improvements, and comparative analyses of the scalable methods for partial quantum process tomography. We also clarify the significance of the extracted information, and we introduce interesting and useful properties of the χ-matrix representation of quantum maps that can be used to establish a clearer path toward achieving full tomography of quantum processes in a scalable way.

  4. Scholarly Use of Information: Graduate Students' Information Seeking Behaviour

    ERIC Educational Resources Information Center

    George, Carole; Bright, Alice; Hurlbert, Terry; Linke, Erika C.; St. Clair, Gloriana; Stein, Joan

    2006-01-01

    Introduction: This study explored graduate students' information behaviour related to their process of inquiry and scholarly activities. Method: In depth, semi-structured interviews were conducted with one hundred graduate students representing all disciplines and departments from Carnegie Mellon University. Analysis: Working in pairs, we coded…

  5. User Testing of Consumer Medicine Information in Australia

    ERIC Educational Resources Information Center

    Jay, Eleanor; Aslani, Parisa; Raynor, D. K.

    2011-01-01

    Background: Consumer Medicine Information (CMI) forms an important basis for the dissemination of medicines information worldwide. Methods: This article presents an overview of the design and development of Australian CMI, and discusses "user-testing" as an iterative, formative process for CMI design. Findings: In Australia, legislation…

  6. Decoding the time-course of object recognition in the human brain: From visual features to categorical decisions.

    PubMed

    Contini, Erika W; Wardle, Susan G; Carlson, Thomas A

    2017-10-01

    Visual object recognition is a complex, dynamic process. Multivariate pattern analysis methods, such as decoding, have begun to reveal how the brain processes complex visual information. Recently, temporal decoding methods for EEG and MEG have offered the potential to evaluate the temporal dynamics of object recognition. Here we review the contribution of M/EEG time-series decoding methods to understanding visual object recognition in the human brain. Consistent with the current understanding of the visual processing hierarchy, low-level visual features dominate decodable object representations early in the time-course, with more abstract representations related to object category emerging later. A key finding is that the time-course of object processing is highly dynamic and rapidly evolving, with limited temporal generalisation of decodable information. Several studies have examined the emergence of object category structure, and we consider to what degree category decoding can be explained by sensitivity to low-level visual features. Finally, we evaluate recent work attempting to link human behaviour to the neural time-course of object processing. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Method and device for diagnosing and controlling combustion instabilities in internal combustion engines operating in or transitioning to homogeneous charge combustion ignition mode

    DOEpatents

    Wagner, Robert M [Knoxville, TN; Daw, Charles S [Knoxville, TN; Green, Johney B [Knoxville, TN; Edwards, Kevin D [Knoxville, TN

    2008-10-07

    This invention is a method of achieving stable, optimal mixtures of HCCI and SI in practical gasoline internal combustion engines comprising the steps of: characterizing the combustion process based on combustion process measurements, determining the ratio of conventional and HCCI combustion, determining the trajectory (sequence) of states for consecutive combustion processes, and determining subsequent combustion process modifications using said information to steer the engine combustion toward desired behavior.

  8. Reasoning with case histories of process knowledge for efficient process development

    NASA Technical Reports Server (NTRS)

    Bharwani, Seraj S.; Walls, Joe T.; Jackson, Michael E.

    1988-01-01

    The significance of compiling case histories of empirical process knowledge and the role of such histories in improving the efficiency of manufacturing process development are discussed in this paper. Methods of representing important investigations as cases and using the information from such cases to eliminate redundant empirical investigations in analogous process development situations are also discussed. A system is proposed that uses such methods to capture the problem-solving framework of the application domain. A conceptual design of the system is presented and discussed.

  9. Facilitating Behavior Change with Low-Literacy Patient Education Materials

    ERIC Educational Resources Information Center

    Seligman, Hilary K.; Wallace, Andrea S.; DeWalt, Darren A.; Schillinger, Dean; Arnold, Connie L.; Shilliday, Betsy Bryant; Delgadillo, Adriana; Bengal, Nikki; Davis, Terry C.

    2007-01-01

    Objective: To describe a process for developing low-literacy health education materials that increase knowledge and activate patients toward healthier behaviors. Methods: We developed a theoretically informed process for developing educational materials. This process included convening a multidisciplinary creative team, soliciting stakeholder…

  10. Process Security in Chemical Engineering Education

    ERIC Educational Resources Information Center

    Piluso, Cristina; Uygun, Korkut; Huang, Yinlun; Lou, Helen H.

    2005-01-01

    The threats of terrorism have greatly alerted the chemical process industries to assure plant security at all levels: infrastructure-improvement-focused physical security, information-protection-focused cyber security, and design-and-operation-improvement-focused process security. While developing effective plant security methods and technologies…

  11. Processing module operating methods, processing modules, and communications systems

    DOEpatents

    McCown, Steven Harvey; Derr, Kurt W.; Moore, Troy

    2014-09-09

    A processing module operating method includes using a processing module physically connected to a wireless communications device, requesting that the wireless communications device retrieve encrypted code from a web site and receiving the encrypted code from the wireless communications device. The wireless communications device is unable to decrypt the encrypted code. The method further includes using the processing module, decrypting the encrypted code, executing the decrypted code, and preventing the wireless communications device from accessing the decrypted code. Another processing module operating method includes using a processing module physically connected to a host device, executing an application within the processing module, allowing the application to exchange user interaction data communicated using a user interface of the host device with the host device, and allowing the application to use the host device as a communications device for exchanging information with a remote device distinct from the host device.
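
    A minimal sketch of the first flow (the symmetric cipher, key handling and code payload are illustrative assumptions; the patent does not specify them, the web retrieval is mocked, and the third-party cryptography package is required):

        from cryptography.fernet import Fernet

        # Key known only to the processing module.
        module_key = Fernet.generate_key()
        module_cipher = Fernet(module_key)

        # Stand-in for code staged on a web site, encrypted for the module.
        encrypted_code = module_cipher.encrypt(b"print('running inside the module')")

        def wireless_device_fetch():
            # The wireless device retrieves the encrypted code but cannot
            # decrypt it: it never sees module_key.
            return encrypted_code

        # Inside the processing module: decrypt and execute without ever
        # exposing the plaintext to the wireless device.
        code = module_cipher.decrypt(wireless_device_fetch())
        exec(code)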

  12. A decision method based on uncertainty reasoning of linguistic truth-valued concept lattice

    NASA Astrophysics Data System (ADS)

    Yang, Li; Xu, Yang

    2010-04-01

    Decision making with linguistic information is currently a research hotspot. This paper begins by establishing the theoretical basis for linguistic information processing, constructs the linguistic truth-valued concept lattice for a decision information system, and then utilises uncertainty reasoning to make the decision. That is, we first utilise the linguistic truth-valued lattice implication algebra to unify different kinds of linguistic expressions; second, we construct the linguistic truth-valued concept lattice and the decision concept lattice according to the concrete decision information system; and third, we establish the internal and external uncertainty reasoning methods and discuss their rationality. We apply these uncertainty reasoning methods to decision making and present some generation methods for decision rules. In the end, we give an application of this decision method by an example.

  13. HSQC-1,n-ADEQUATE: a new approach to long-range 13C-13C correlation by covariance processing.

    PubMed

    Martin, Gary E; Hilton, Bruce D; Willcott, M Robert; Blinov, Kirill A

    2011-10-01

    Long-range, two-dimensional heteronuclear shift correlation NMR methods play a pivotal role in the assembly of novel molecular structures. The well-established GHMBC method is a high-sensitivity mainstay technique, affording connectivity information via (n)J(CH) coupling pathways. Unfortunately, there is no simple way of determining the value of n and hence no way of differentiating two-bond from three- and occasionally four-bond correlations. Three-bond correlations, however, generally predominate. Recent work has shown that the unsymmetrical indirect covariance or generalized indirect covariance processing of multiplicity-edited GHSQC and 1,1-ADEQUATE spectra provides high-sensitivity access to a (13)C-(13)C connectivity map in the form of an HSQC-1,1-ADEQUATE spectrum. Covariance processing of these data allows the 1,1-ADEQUATE connectivity information to be exploited with the inherent sensitivity of the GHSQC spectrum rather than the intrinsically lower sensitivity of the 1,1-ADEQUATE spectrum itself. Data acquisition times and/or sample size can be substantially reduced when covariance processing is to be employed. In an extension of that work, 1,n-ADEQUATE spectra can likewise be subjected to covariance processing to afford high-sensitivity access to the equivalent of (4)J(CH) GHMBC connectivity information. The method is illustrated using strychnine as a model compound. Copyright © 2011 John Wiley & Sons, Ltd.
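
    The covariance step itself is a matrix product: two spectra that share the proton dimension are correlated through it, yielding a carbon-carbon map. The sketch below shows only this core operation on synthetic matrices (shapes and data are invented; real processing also involves artifact suppression omitted here):

        import numpy as np

        rng = np.random.default_rng(4)
        hsqc = rng.random((128, 64))      # multiplicity-edited GHSQC, 1H x 13C
        adequate = rng.random((128, 64))  # 1,n-ADEQUATE, 1H x 13C

        # Unsymmetrical indirect covariance: correlate through the shared
        # 1H dimension to obtain a 13C x 13C correlation map.
        carbon_map = hsqc.T @ adequate
        print(carbon_map.shape)           # (64, 64)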

  14. Sampling Operations on Big Data

    DTIC Science & Technology

    Gadepally, Vijay; Herr, Taylor; Johnson, Luke; Milechin, Lauren; Milosavljevic, Maja; Miller, Benjamin A.

    2015-11-29

    … categories. These include edge sampling methods, where edges are selected by a predetermined criterion; snowball sampling methods, where algorithms start … process and disseminate information for discovery and exploration under real-time constraints. Common signal processing operations such as sampling and …
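
    Sampling a stream under real-time, bounded-memory constraints is classically handled by reservoir sampling; the sketch below is illustrative and not necessarily one of the report's methods:

        import random

        def reservoir_sample(stream, k, seed=0):
            # Keep a uniform random sample of k items from a stream of unknown
            # length in O(k) memory -- a single pass over the data.
            rng = random.Random(seed)
            reservoir = []
            for i, item in enumerate(stream):
                if i < k:
                    reservoir.append(item)
                else:
                    j = rng.randint(0, i)   # replace with decreasing probability
                    if j < k:
                        reservoir[j] = item
            return reservoir

        print(reservoir_sample(range(10**6), 5))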

  15. 40 CFR 721.1750 - 1H-Benzotriazole, 5-(pen-tyl-oxy)- and 1H-ben-zo-tri-a-zole, 5-(pen-tyl-oxy)-, sodium and...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... this new information, and any information on methods for protecting against such risk, into a MSDS as... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL ACT... these substances is any manner or method of manufacture, import, or processing associated with any use...

  16. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation †

    PubMed Central

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-01-01

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities is targeted. PMID:28287448

  17. Real-Time Digital Signal Processing Based on FPGAs for Electronic Skin Implementation.

    PubMed

    Ibrahim, Ali; Gastaldo, Paolo; Chible, Hussein; Valle, Maurizio

    2017-03-10

    Enabling touch-sensing capability would help appliances understand interaction behaviors with their surroundings. Many recent studies are focusing on the development of electronic skin because of its necessity in various application domains, namely autonomous artificial intelligence (e.g., robots), biomedical instrumentation, and replacement prosthetic devices. An essential task of the electronic skin system is to locally process the tactile data and send structured information either to mimic human skin or to respond to the application demands. The electronic skin must be fabricated together with an embedded electronic system which has the role of acquiring the tactile data, processing, and extracting structured information. On the other hand, processing tactile data requires efficient methods to extract meaningful information from raw sensor data. Machine learning represents an effective method for data analysis in many domains: it has recently demonstrated its effectiveness in processing tactile sensor data. In this framework, this paper presents the implementation of digital signal processing based on FPGAs for tactile data processing. It provides the implementation of a tensorial kernel function for a machine learning approach. Implementation results are assessed by highlighting the FPGA resource utilization and power consumption. Results demonstrate the feasibility of the proposed implementation when real-time classification of input touch modalities is targeted.

  18. Music score watermarking by clef modifications

    NASA Astrophysics Data System (ADS)

    Schmucker, Martin; Yan, Hongning

    2003-06-01

    In this paper we present a new method for hiding data in music scores. In contrast to previously published algorithms, we investigate the possibility of embedding information in clefs. Using the clef as the information carrier has two advantages: first, a clef is present in each staff line, which guarantees a fixed capacity; second, the clef defines the reference system for musical symbols, so the music-carrying symbols, e.g. the notes and the rests, are not degraded by the manipulations. Watermarked music scores must be robust against greyscale-to-binary conversion. As a consequence, the information is embedded by modifying the distribution of black and white pixels in certain areas. We evaluate simple image processing mechanisms based on erosion and dilation for embedding the information. For retrieving the watermark, the b/w distribution is extracted from the given clef. To solve the synchronization problem, the watermarked clef is normalized in a pre-processing step; the normalization is based on moments. The areas used for watermarking are calculated by image segmentation techniques that consider the features of a clef. We analyze the capacity and robustness of the proposed method under different parameter settings. The proposed method can be combined with other music score watermarking methods to increase the capacity of existing watermarking techniques.
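
    The embedding principle, nudging the black/white pixel distribution of a region by erosion or dilation, can be sketched as follows (the glyph, the one-region simplification and the decision rule are illustrative assumptions; the paper additionally normalizes the clef by moments and segments it into areas):

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(5)
        clef = rng.random((40, 24)) < 0.3   # stand-in binary glyph (True = black)

        def embed_bit(region, bit):
            # Dilation raises the black-pixel density (bit 1);
            # erosion lowers it (bit 0).
            if bit:
                return ndimage.binary_dilation(region)
            return ndimage.binary_erosion(region)

        def read_bit(region, reference_density):
            # Decide the bit by comparing the region's density with a reference.
            return int(region.mean() > reference_density)

        ref = clef.mean()                    # density before embedding
        marked = embed_bit(clef, 1)
        print("embedded 1, read back:", read_bit(marked, ref))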

  19. Image Restoration Using Functional and Anatomical Information Fusion with Application to SPECT-MRI Images

    PubMed Central

    Benameur, S.; Mignotte, M.; Meunier, J.; Soucy, J. -P.

    2009-01-01

    Image restoration is usually viewed as an ill-posed problem in image processing, since there is no unique solution associated with it. The quality of the restored image closely depends on the constraints imposed on the characteristics of the solution. In this paper, we propose an original extension of the NAS-RIF restoration technique that uses information fusion as prior information, with application to SPECT medical imaging. That extension allows the restoration process to be constrained by efficiently incorporating, within the NAS-RIF method, a regularization term which stabilizes the inverse solution. Our restoration method is constrained by anatomical information extracted from a high-resolution anatomical modality such as magnetic resonance imaging (MRI). This structural anatomy-based regularization term uses the result of an unsupervised Markovian segmentation obtained after a preliminary registration step between the MRI and SPECT data volumes of each patient. The method was successfully tested on 30 pairs of brain MRI and SPECT acquisitions from different subjects and on Hoffman and Jaszczak SPECT phantoms. The experiments demonstrated that the method performs better, in terms of signal-to-noise ratio, than a classical supervised restoration approach using a Metz filter. PMID:19812704

  20. Modelling spatiotemporal change using multidimensional arrays

    NASA Astrophysics Data System (ADS)

    Lu, Meng; Appel, Marius; Pebesma, Edzer

    2017-04-01

    The large variety of remote sensors, model simulations, and in-situ records provides great opportunities to model environmental change. The massive amount of high-dimensional data calls for methods to integrate data from various sources and to analyse spatiotemporal and thematic information jointly. An array is a collection of elements ordered and indexed in arbitrary dimensions, which naturally represents spatiotemporal phenomena identified by their geographic locations and recording times. In addition, array regridding (e.g., resampling, down-/up-scaling), dimension reduction, and spatiotemporal statistical algorithms are readily applicable to arrays. However, the role of arrays in big geoscientific data analysis has not been systematically studied: How can arrays discretise continuous spatiotemporal phenomena? How can arrays facilitate the extraction of multidimensional information? How can arrays provide a clean, scalable and reproducible change modelling process that is communicable between mathematicians, computer scientists, Earth system scientists and stakeholders? This study focuses on detecting spatiotemporal change using satellite image time series. Current change detection methods using satellite image time series commonly analyse data in separate steps: 1) forming a vegetation index, 2) conducting time series analysis on each pixel, and 3) post-processing and mapping the time series analysis results; this neither considers spatiotemporal correlations nor uses much of the spectral information. Multidimensional information can be better extracted by jointly considering spatial, spectral, and temporal information. To approach this goal, we use principal component analysis to extract multispectral information and spatial autoregressive models to account for spatial correlation in residual-based time series structural change modelling. We also discuss the potential of multivariate non-parametric time series structural change methods, hierarchical modelling, and extreme event detection methods to model spatiotemporal change. We show how array operations can facilitate expressing these methods, and how the open-source array data management and analytics software SciDB and R can be used to scale the process and make it easily reproducible.

  1. Applying high resolution remote sensing image and DEM to falling boulder hazard assessment

    NASA Astrophysics Data System (ADS)

    Huang, Changqing; Shi, Wenzhong; Ng, K. C.

    2005-10-01

    Assessing boulder fall hazard generally requires obtaining information about the boulders. The conventional approach, extensive mapping and surveying fieldwork, is time-consuming, laborious and dangerous. This paper therefore proposes applying image processing technology to extract boulders and assess boulder fall hazard from high-resolution remote sensing images. The method can replace the conventional approach and extract boulder information with high accuracy, including boulder size, shape and height, and the slope and aspect of each boulder's position. This information is sufficient for assessing, preventing and mitigating boulder fall hazards.

  2. Distributed Processing of Projections of Large Datasets: A Preliminary Study

    USGS Publications Warehouse

    Maddox, Brian G.

    2004-01-01

    Modern information needs have resulted in very large amounts of data being used in geographic information systems. Problems arise, however, when trying to project these data with reasonable speed and accuracy. Current single-threaded methods can suffer from one of two problems: fast projection with poor accuracy, or accurate projection with long processing time. A possible solution may be to combine accurate interpolation methods with distributed processing algorithms to quickly and accurately convert digital geospatial data between coordinate systems. Modern technology has made it possible to construct systems, such as Beowulf clusters, at low cost and provide access to supercomputer-class technology. Combining these techniques may result in the ability to use large amounts of geographic data in time-critical situations.

  3. Power Distribution Analysis For Electrical Usage In Province Area Using Olap (Online Analytical Processing)

    NASA Astrophysics Data System (ADS)

    Samsinar, Riza; Suseno, Jatmiko Endro; Widodo, Catur Edi

    2018-02-01

    The distribution network is the part of the power grid closest to the customers of electric service providers such as PT. PLN. The dispatching center of a power grid company is also the data center of the power grid, where great amounts of operating information are gathered. The valuable information contained in these data means a lot for power grid operating management, and data warehousing with online analytical processing (OLAP) has been used to manage and analyse this large volume of data. The specific methods for the online analytics information system resulting from data warehouse processing with OLAP are chart and query reporting. The chart reporting consists of load distribution charts over time, load distribution charts by area, substation region charts and electric load usage charts. The results of the OLAP process show the development of the electric load distribution, provide analysis of electric power consumption load information, and offer an alternative way of presenting information related to peak load.

  4. A Method to Quantify Visual Information Processing in Children Using Eye Tracking

    PubMed Central

    Kooiker, Marlou J.G.; Pel, Johan J.M.; van der Steen-Kant, Sanny P.; van der Steen, Johannes

    2016-01-01

    Visual problems that occur early in life can have major impact on a child's development. Without verbal communication and only based on observational methods, it is difficult to make a quantitative assessment of a child's visual problems. This limits accurate diagnostics in children under the age of 4 years and in children with intellectual disabilities. Here we describe a quantitative method that overcomes these problems. The method uses a remote eye tracker and a four choice preferential looking paradigm to measure eye movement responses to different visual stimuli. The child sits without head support in front of a monitor with integrated infrared cameras. In one of four monitor quadrants a visual stimulus is presented. Each stimulus has a specific visual modality with respect to the background, e.g., form, motion, contrast or color. From the reflexive eye movement responses to these specific visual modalities, output parameters such as reaction times, fixation accuracy and fixation duration are calculated to quantify a child's viewing behavior. With this approach, the quality of visual information processing can be assessed without the use of communication. By comparing results with reference values obtained in typically developing children from 0-12 years, the method provides a characterization of visual information processing in visually impaired children. The quantitative information provided by this method can be advantageous for the field of clinical visual assessment and rehabilitation in multiple ways. The parameter values provide a good basis to: (i) characterize early visual capacities and consequently to enable early interventions; (ii) compare risk groups and follow visual development over time; and (iii), construct an individual visual profile for each child. PMID:27500922

  6. MS-Electronic Nose Performance Improvement Using GC Retention Times and 2-Way and 3-Way Data Processing Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burian, Cosmin; Llobet, Eduard; Vilanova, Xavier

    We have designed a challenging experimental sample set in the form of 20 solutions with a high degree of similarity in order to study whether the addition of chromatographic separation information improves the performance of regular MS based electronic noses. In order to make an initial study of the approach, two different chromatographic methods were used. By processing the data of these experiments with 2-way and 3-way algorithms, we have shown that the addition of chromatographic separation information improves the results compared to the 2-way analysis of mass spectra or total ion chromatogram treated separately. Our findings show that when the chromatographic peaks are resolved (longer measurement times), 2-way methods work better than 3-way methods, whereas in the case of a more challenging measurement (more coeluted chromatograms, much faster GC-MS measurements) 3-way methods work better.
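
    To make the 2-way/3-way distinction concrete, the sketch below contrasts PCA on unfolded data (2-way) with a PARAFAC decomposition (one common 3-way method; the abstract does not name its exact algorithms). It assumes the tensorly library and random stand-in data with hypothetical dimensions.

    ```python
    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import parafac  # 3-way (trilinear) model

    # Hypothetical GC-MS cube: 20 samples x retention-time points x m/z channels.
    X = np.random.rand(20, 120, 80)

    # 2-way view: unfold each sample into one long vector and run PCA (via SVD).
    X2 = X.reshape(20, -1)
    X2c = X2 - X2.mean(axis=0)
    U, S, Vt = np.linalg.svd(X2c, full_matrices=False)
    scores_2way = U[:, :3] * S[:3]                 # sample scores, 3 components

    # 3-way view: PARAFAC keeps the time and m/z modes separate, which is what
    # helps when peaks co-elute (faster GC-MS runs).
    cp = parafac(tl.tensor(X), rank=3)
    scores_3way = cp.factors[0]                    # sample-mode loadings
    print(scores_2way.shape, scores_3way.shape)
    ```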

  7. Reducing Interpolation Artifacts for Mutual Information Based Image Registration

    PubMed Central

    Soleimani, H.; Khosravifard, M.A.

    2011-01-01

    Medical image registration methods that use mutual information as a similarity measure have improved in recent decades. Mutual information is a basic concept of information theory which indicates the dependency of two random variables (or two images). In order to evaluate the mutual information of two images, their joint probability distribution is required. Several interpolation methods, such as Partial Volume (PV) and bilinear, are used to estimate the joint probability distribution. Both of these methods introduce artifacts in the mutual information function. The Partial Volume-Hanning window (PVH) and Generalized Partial Volume (GPV) methods were introduced to remove such artifacts. In this paper we show that the acceptable performance of these methods is not due to their kernel function; rather, it is due to the number of pixels incorporated in the interpolation. Since using more pixels requires a more complex and time-consuming interpolation process, we propose a new interpolation method that uses only four pixels (the same as PV and bilinear interpolation) and removes most of the artifacts. Experimental results on the registration of Computed Tomography (CT) images show the superiority of the proposed scheme. PMID:22606673
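
    For reference, mutual information itself can be estimated from the joint histogram of the two images; the sketch below uses plain histogram binning (the interpolation schemes discussed in the paper refine how samples are distributed into these bins). The bin count and test images are arbitrary.

    ```python
    import numpy as np

    def mutual_information(img1, img2, bins=64):
        """MI of two equally sized images from their joint histogram."""
        joint, _, _ = np.histogram2d(img1.ravel(), img2.ravel(), bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0                                  # avoid log(0)
        return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

    # Toy check: an image is maximally informative about itself.
    a = np.random.rand(128, 128)
    b = np.roll(a, 5, axis=0)                         # misaligned copy
    print(mutual_information(a, a), mutual_information(a, b))
    ```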

  8. Decentralized modal identification using sparse blind source separation

    NASA Astrophysics Data System (ADS)

    Sadhu, A.; Hazra, B.; Narasimhan, S.; Pandey, M. D.

    2011-12-01

    Popular ambient vibration-based system identification methods process information collected from a dense array of sensors centrally to yield the modal properties. In such methods, the need for a centralized processing unit capable of satisfying large memory and processing demands is unavoidable. With the advent of wireless smart sensor networks, it is now possible to process information locally at the sensor level instead. The information at the individual sensor level can then be concatenated to obtain the global structural characteristics. This paper proposes a novel decentralized algorithm, based on wavelet transforms, that infers global structural mode information from measurements obtained from a small group of sensors at a time. The focus of the paper is on algorithmic development; the actual hardware and software implementation is not pursued here. The problem of identification is cast within the framework of under-determined blind source separation, invoking transformations of the measurements to the time-frequency domain that result in a sparse representation. The partial mode shape coefficients so identified are then combined to yield complete modal information. The transformations are undertaken using the stationary wavelet packet transform (SWPT), yielding a sparse representation in the wavelet domain. Principal component analysis (PCA) is then performed on the resulting wavelet coefficients, yielding the partial mixing matrix coefficients from a few measurement channels at a time. This process is repeated using measurements obtained from multiple sensor groups, and the results so obtained from each group are concatenated to obtain the global modal characteristics of the structure.
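
    A much-reduced sketch of the transform-then-PCA step for one sensor group follows. It substitutes PyWavelets' stationary wavelet transform (pywt.swt) for the full stationary wavelet packet transform and uses an SVD for the PCA; the signals, mixing matrix and parameters are toy stand-ins.

    ```python
    import numpy as np
    import pywt  # PyWavelets; swt stands in for the paper's wavelet packet step

    def partial_mixing(group_signals, wavelet="db4", level=3):
        """Estimate partial mixing-matrix columns for one sensor group.
        group_signals: (n_sensors, n_samples) ambient vibration records."""
        coeffs = []
        for x in group_signals:
            swt = pywt.swt(x, wavelet, level=level)  # list of (cA, cD) pairs
            coeffs.append(np.hstack([cD for _, cD in swt]))  # sparse details
        Y = np.vstack(coeffs)                         # sensors x coefficients
        Yc = Y - Y.mean(axis=1, keepdims=True)
        U, S, _ = np.linalg.svd(Yc, full_matrices=False)  # PCA on coefficients
        return U[:, :2]                               # leading mixing columns

    # Two-sensor group observing two sinusoidal "modes" (toy stand-in).
    t = np.linspace(0, 4, 1024)                       # length divisible by 2**level
    s = np.vstack([np.sin(2 * np.pi * 3 * t), np.sin(2 * np.pi * 7 * t)])
    A = np.array([[1.0, 0.4], [0.5, 1.0]])            # true mixing (mode shapes)
    print(partial_mixing(A @ s))
    ```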

  9. Representing Information in Patient Reports Using Natural Language Processing and the Extensible Markup Language

    PubMed Central

    Friedman, Carol; Hripcsak, George; Shagina, Lyuda; Liu, Hongfang

    1999-01-01

    Objective: To design a document model that provides reliable and efficient access to clinical information in patient reports for a broad range of clinical applications, and to implement an automated method using natural language processing that maps textual reports to a form consistent with the model. Methods: A document model that encodes structured clinical information in patient reports while retaining the original contents was designed using the extensible markup language (XML), and a document type definition (DTD) was created. An existing natural language processor (NLP) was modified to generate output consistent with the model. Two hundred reports were processed using the modified NLP system, and the XML output that was generated was validated using an XML validating parser. Results: The modified NLP system successfully processed all 200 reports. The output of one report was invalid, and 199 reports were valid XML forms consistent with the DTD. Conclusions: Natural language processing can be used to automatically create an enriched document that contains a structured component whose elements are linked to portions of the original textual report. This integrated document model provides a representation where documents containing specific information can be accurately and efficiently retrieved by querying the structured components. If manual review of the documents is desired, the salient information in the original reports can also be identified and highlighted. Using an XML model of tagging provides an additional benefit in that software tools that manipulate XML documents are readily available. PMID:9925230
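
    The flavor of such an enriched document can be illustrated with the standard library's ElementTree; the element names, codes and character offsets below are hypothetical, not the authors' DTD.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical output in the spirit of the described model: structured
    # findings whose elements point back into the retained original text.
    report_text = "No acute infiltrate. Mild cardiomegaly is present."

    doc = ET.Element("report")
    ET.SubElement(doc, "text").text = report_text

    structured = ET.SubElement(doc, "structured")
    finding = ET.SubElement(structured, "finding",
                            code="cardiomegaly", certainty="present",
                            start="21", end="38")   # offsets of "Mild cardiomegaly"
    ET.SubElement(finding, "modifier", type="severity").text = "mild"

    print(ET.tostring(doc, encoding="unicode"))
    # A validating parser (e.g., lxml with a DTD) could then check the output
    # against the document type definition, as done for the 200 test reports.
    ```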

  10. A review on the relationship between food structure, processing, and bioavailability.

    PubMed

    Sensoy, Ilkay

    2014-01-01

    This review highlights the effects of processing and the food matrix on the bioaccessibility and bioavailability of functional components. The human digestive system is reviewed as an element in bioavailability. Methods for determining bioaccessibility and bioavailability are described. Information about the location of functional compounds in the tissue is presented to portray the matrix information. Research data on the effects of the food matrix and processing on bioaccessibility and bioavailability are summarized. Finally, trends in the development of functional component delivery systems are included.

  11. Evaluation of the application of ERTS-1 data to the regional land use planning process. [Northeast Wisconsin

    NASA Technical Reports Server (NTRS)

    Clapp, J. L. (Principal Investigator); Green, T., III; Hanson, G. F.; Kiefer, R. W.; Niemann, B. J., Jr.

    1974-01-01

    The author has identified the following significant results. Employing simple and economical extraction methods, ERTS can provide valuable data to planners at the state or regional level with a frequency never before possible. Interactive computer methods of working directly with ERTS digital information show much promise for providing land use information at a more specific level, since the data format and production rate of ERTS justify improved methods of analysis.

  12. Social Information Processing of Positive and Negative Hypothetical Events in Children with ADHD and Conduct Problems and Controls

    ERIC Educational Resources Information Center

    Andrade, Brendan F.; Waschbusch, Daniel A.; Doucet, Amelie; King, Sara; MacKinnon, Maura; McGrath, Patrick J.; Stewart, Sherry H.; Corkum, Penny

    2012-01-01

    Objective: This study examined social information processing (SIP) of events with varied outcomes in children with ADHD and conduct problems (CPs; defined as oppositional defiant disorder [ODD] or conduct disorder [CD]) and controls. Method: Participants were 64 children (46 boys, 18 girls) aged 6 to 12, including 39 with ADHD and 25 controls.…

  13. Usability Evaluation at the Point-of-Care: A Method to Identify User Information Needs in CPOE Applications

    PubMed Central

    Washburn, Jeff; Fiol, Guilherme Del; Rocha, Roberto A.

    2006-01-01

    Point of care usability evaluation may help identify information needs that occur during the process of providing care. We describe the process of using usability-specific recording software to record Computerized Physician Order Entry (CPOE) ordering sessions on admitted adult and pediatric patients at two urban tertiary hospitals in the Intermountain Healthcare system of hospitals. PMID:17238756

  14. Information Processing in Nursing Information Systems: An Evaluation Study from a Developing Country.

    PubMed

    Samadbeik, Mahnaz; Shahrokhi, Nafiseh; Saremian, Marzieh; Garavand, Ali; Birjandi, Mahdi

    2017-01-01

    In recent years, information technology has been introduced in the nursing departments of many hospitals to support their daily tasks. Nurses are the largest end-user group in Hospital Information Systems (HISs). This study was designed to evaluate data processing in the Nursing Information Systems (NISs) utilized in many university hospitals in Iran. This was a cross-sectional study. The population comprised all nurse managers and NIS users of the five training hospitals in Khorramabad city (N = 71). The nursing subset of the HIS-Monitor questionnaire was used to collect the data. Data were analyzed by the descriptive-analytical method and inductive content analysis. The results indicated that the nurses participating in the study did not make desirable use of paper-based (2.02) or computerized (2.34) information processing tools to perform nursing tasks. Moreover, the less work experience nurses had, the more they utilized computerized tools for processing patient discharge information. The "readability of patient information" and "repetitive and time-consuming documentation" were stated by the participating nurses as the most important expectation of and problem with the HIS, respectively. The nurses participating in the present study utilized paper-based and computerized information processing tools side by side to perform nursing practices. Therefore, it is recommended that the nursing process be redesigned to coincide with NIS implementation in health care centers.

  15. Quantum information processing with trapped ions

    NASA Astrophysics Data System (ADS)

    Gaebler, John

    2013-03-01

    Trapped ions are one promising architecture for scalable quantum information processing. Ion qubits are held in multizone traps created from segmented arrays of electrodes and transported between trap zones using time-varying electric potentials applied to the electrodes. Quantum information is stored in the ions' internal hyperfine states, and quantum gates to manipulate the internal states and create entanglement are performed with laser beams and microwaves. Recently we have made progress in speeding up the ion transport and cooling processes that were the limiting tasks for the operation speed in previous experiments. We are also exploring improved two-qubit gates and new methods for creating ion entanglement. This work was supported by IARPA, ARO contract No. EAO139840, ONR and the NIST Quantum Information Program.

  16. Supporting Active Patient and Health Care Collaboration: A Prototype for Future Health Care Information Systems.

    PubMed

    Åhlfeldt, Rose-Mharie; Persson, Anne; Rexhepi, Hanife; Wåhlander, Kalle

    2016-12-01

    This article presents and illustrates the main features of a proposed process-oriented approach for patient information distribution in future health care information systems, by using a prototype of a process support system. The development of the prototype was based on the Visuera method, which includes five defined steps. The results indicate that a visualized prototype is a suitable tool for illustrating both the opportunities and constraints of future ideas and solutions in e-Health. The main challenges for developing and implementing a fully functional process support system concern both technical and organizational/management aspects. © The Author(s) 2015.

  17. State-transfer simulation in integrated waveguide circuits

    NASA Astrophysics Data System (ADS)

    Latmiral, L.; Di Franco, C.; Mennea, P. L.; Kim, M. S.

    2015-08-01

    Spin-chain models have been widely studied in terms of quantum information processes, for instance for the faithful transmission of quantum states. Here, we investigate the limitations of mapping this process to an equivalent one through a bosonic chain. In particular, we keep in mind experimental implementations, which the progress in integrated waveguide circuits could make possible in the very near future. We consider the feasibility of exploiting the higher dimensionality of the Hilbert space of the chain elements for the transmission of a larger amount of information, and the effects of unwanted excitations during the process. Finally, we exploit the information-flux method to provide bounds to the transfer fidelity.

  18. Batch process fault detection and identification based on discriminant global preserving kernel slow feature analysis.

    PubMed

    Zhang, Hanyuan; Tian, Xuemin; Deng, Xiaogang; Cao, Yuping

    2018-05-16

    As an attractive nonlinear dynamic data analysis tool, global preserving kernel slow feature analysis (GKSFA) has achieved great success in extracting the high nonlinearity and inherently time-varying dynamics of batch processes. However, GKSFA is an unsupervised feature extraction method and lacks the ability to utilize batch process class label information, which may not offer the most effective means for dealing with batch process monitoring. To overcome this problem, we propose a novel batch process monitoring method based on a modified GKSFA, referred to as discriminant global preserving kernel slow feature analysis (DGKSFA), which closely integrates discriminant analysis and GKSFA. The proposed DGKSFA method can extract discriminant features of a batch process as well as preserve the global and local geometrical structure information of the observed data. For the purpose of fault detection, a monitoring statistic is constructed based on the distance between the optimal kernel feature vectors of test data and normal data. To tackle the challenging issue of nonlinear fault variable identification, a new nonlinear contribution plot method is also developed to help identify the fault variables after a fault is detected, derived from the idea of variable pseudo-sample trajectory projection in the DGKSFA nonlinear biplot. Simulation results conducted on a numerical nonlinear dynamic system and the benchmark fed-batch penicillin fermentation process demonstrate that the proposed process monitoring and fault diagnosis approach can effectively detect faults and distinguish fault variables from normal variables. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
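
    To ground the terminology: slow feature analysis looks for projections of the data that vary as slowly as possible over time. The sketch below is a minimal linear SFA (the paper's DGKSFA additionally uses kernels, global/local structure preservation and discriminant information); the data and dimensions are toy stand-ins.

    ```python
    import numpy as np

    def linear_sfa(X, n_features=2):
        """Minimal linear SFA: whiten X, then find directions whose temporal
        derivative has the smallest variance. X: (n_samples, n_vars)."""
        Xc = X - X.mean(axis=0)
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        Z = Xc @ Vt.T / S * np.sqrt(len(X))           # whitened signals
        dZ = np.diff(Z, axis=0)                       # temporal derivative
        eigval, eigvec = np.linalg.eigh(dZ.T @ dZ / len(dZ))
        W = eigvec[:, :n_features]                    # smallest eigenvalues = slowest
        return Z @ W

    # Toy batch trajectory: a slow trend hidden in faster, mixed dynamics.
    t = np.linspace(0, 2 * np.pi, 500)
    slow, fast = np.sin(t), np.sin(11 * t)
    X = np.column_stack([slow + 0.1 * fast, fast - 0.2 * slow])
    print(linear_sfa(X)[:3])
    ```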

  19. DESIGN MANUAL: PHOSPHORUS REMOVAL

    EPA Science Inventory

    This manual summarizes process design information for the best developed methods for removing phosphorus from wastewater. This manual discusses several proven phosphorus removal methods, including phosphorus removal obtainable through biological activity as well as chemical precip...

  20. A New Method for Conceptual Modelling of Information Systems

    NASA Astrophysics Data System (ADS)

    Gustas, Remigijus; Gustiene, Prima

    Service architecture is not necessarily bound to the technical aspects of information system development. It can be defined by using conceptual models that are independent of any implementation technology. Unfortunately, conventional information system analysis and design methods cover just a part of the modelling notations required for the engineering of service architectures. They do not provide effective support for maintaining semantic integrity between business processes and data. Service orientation is a paradigm that can be applied to the conceptual modelling of information systems. The concept of service is rather well understood in different domains, and it can be applied equally well to the conceptualization of organizational and technical information system components. This chapter concentrates on analysis of the differences between service-oriented modelling and object-oriented modelling. The service-oriented method is used for semantic integration of the static and dynamic aspects of an information system.

  1. Smith predictor-based multiple periodic disturbance compensation for long dead-time processes

    NASA Astrophysics Data System (ADS)

    Tan, Fang; Li, Han-Xiong; Shen, Ping

    2018-05-01

    Many disturbance rejection methods have been proposed for processes with dead-time, but these existing methods may not work well under multiple periodic disturbances. In this paper, a multiple periodic disturbance rejection method is proposed under the Smith predictor configuration for processes with long dead-time. One feedback loop is added to compensate for periodic disturbances while retaining the advantage of the Smith predictor. With information on the disturbance spectrum, the added feedback loop can remove multiple periodic disturbances effectively. Robust stability can be easily maintained, as shown through rigorous analysis. Finally, simulation examples demonstrate the effectiveness and robustness of the proposed method for processes with long dead-time.

  2. A sequential method for spline approximation with variable knots. [recursive piecewise polynomial signal processing

    NASA Technical Reports Server (NTRS)

    Mier Muth, A. M.; Willsky, A. S.

    1978-01-01

    In this paper we describe a method for approximating a waveform by a spline. The method is quite efficient, as the data are processed sequentially. The basis of the approach is to view the approximation problem as a question of estimation of a polynomial in noise, with the possibility of abrupt changes in the highest derivative. This allows us to bring several powerful statistical signal processing tools into play. We also present some initial results on the application of our technique to the processing of electrocardiograms, where the knot locations themselves may be some of the most important pieces of diagnostic information.
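
    The paper's method is sequential; as a hedged batch-mode analogue, SciPy's smoothing spline routine also places knots adaptively, adding them until the fit meets a residual tolerance. The waveform below is a toy signal with an abrupt change in the highest derivative, loosely in the spirit of the ECG application.

    ```python
    import numpy as np
    from scipy.interpolate import splrep, splev

    # Noisy waveform with an abrupt change in the highest derivative at t = 0.5.
    t = np.linspace(0, 1, 400)
    y = np.where(t < 0.5, t**2, t**2 + 4 * (t - 0.5) ** 2)
    y += np.random.normal(scale=0.01, size=t.shape)

    # splrep picks knot locations itself, adding knots until the residual
    # drops below the smoothing tolerance s (a batch analogue of detecting
    # where the polynomial model changes).
    tck = splrep(t, y, k=3, s=len(t) * 0.01**2)
    knots = tck[0]
    print(f"{len(knots)} knots chosen, interior knots near: {knots[4:-4][:5]}")
    fitted = splev(t, tck)
    print("max abs deviation from data:", np.abs(fitted - y).max())
    ```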

  3. Retrieval of land cover information under thin fog in Landsat TM image

    NASA Astrophysics Data System (ADS)

    Wei, Yuchun

    2008-04-01

    Thin fog, which often appears in remote sensing images of subtropical climate regions, results in low image quality and poor image mapping. It is therefore necessary to develop image processing methods to retrieve land cover information under thin fog. In this paper, a Landsat TM image near Taihu Lake, which lies in the subtropical climate zone of China, is used as an example, and a workflow and method for retrieving land cover information under thin fog are built based on ENVI software and a single TM image. The basic workflow covers three parts: 1) isolating the thin fog area in the image according to the spectral difference of different bands; 2) retrieving the visible-band information of different land cover types under thin fog from the near-infrared bands, according to the relationships between the near-infrared and visible bands of each land cover type in the fog-free area; 3) image post-processing. The results show that the method is simple and suitable, and can be used to improve the quality of TM image mapping effectively.
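
    Step 2 amounts to learning, per land cover class, a relation between a near-infrared band and a visible band in fog-free pixels and applying it under fog. The sketch below uses a simple linear fit on synthetic arrays; the band values, masks and linear model are illustrative assumptions.

    ```python
    import numpy as np

    def restore_visible(nir, vis, fog_mask, class_map, cls):
        """Predict a visible band under fog from the NIR band, using the
        NIR-visible relation learned on fog-free pixels of one class."""
        clear = (~fog_mask) & (class_map == cls)
        slope, intercept = np.polyfit(nir[clear], vis[clear], deg=1)
        restored = vis.copy()
        foggy = fog_mask & (class_map == cls)
        restored[foggy] = slope * nir[foggy] + intercept
        return restored

    # Toy 100x100 scene: one class, fog over the left half of the image.
    rng = np.random.default_rng(0)
    nir = rng.uniform(0.2, 0.6, (100, 100))
    vis = 0.8 * nir + 0.05 + rng.normal(0, 0.01, nir.shape)   # true relation
    fog = np.zeros_like(vis, dtype=bool); fog[:, :50] = True
    vis_foggy = np.where(fog, vis * 0.3 + 0.4, vis)           # fog flattens contrast
    cls_map = np.zeros(vis.shape, dtype=int)
    out = restore_visible(nir, vis_foggy, fog, cls_map, cls=0)
    print(np.abs(out - vis)[fog].mean())                      # restoration error
    ```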

  4. Measuring patterns in team interaction sequences using a discrete recurrence approach.

    PubMed

    Gorman, Jamie C; Cooke, Nancy J; Amazeen, Polemnia G; Fouse, Shannon

    2012-08-01

    Recurrence-based measures of communication determinism and pattern information are described and validated using previously collected team interaction data. Team coordination dynamics has revealed that "mixing" team membership can lead to flexible interaction processes, whereas keeping a team "intact" can lead to rigid interaction processes. We hypothesized that communication of intact teams would have greater determinism and higher pattern information compared to that of mixed teams. Determinism and pattern information were measured from three-person Uninhabited Air Vehicle team communication sequences over a series of 40-minute missions. Because team members communicated using push-to-talk buttons, communication sequences were automatically generated during each mission. The Composition x Mission determinism effect was significant. Intact teams' determinism increased over missions, whereas mixed teams' determinism did not change. Intact teams had significantly higher maximum pattern information than mixed teams. Results from these new communication analysis methods converge with content-based methods and support our hypotheses. Because they are not content based, and because they are automatic and fast, these new methods may be amenable to real-time communication pattern analysis.
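
    In recurrence terms, determinism is the share of recurrent points that fall on diagonal line structures; for a categorical speaker sequence this can be computed directly, as in the simplified sketch below (the speaker labels and minimum line length are illustrative; the published measures may differ in detail).

    ```python
    import numpy as np

    def determinism(seq, lmin=2):
        """%DET of a categorical sequence: share of recurrent points that sit
        on diagonal line structures of length >= lmin."""
        s = np.asarray(seq)
        R = s[:, None] == s[None, :]                  # recurrence matrix
        np.fill_diagonal(R, False)                    # ignore the main diagonal
        n = len(s)
        on_lines = 0
        for k in range(1, n):                         # each upper diagonal
            d = np.diagonal(R, offset=k)
            padded = np.concatenate([[0], d.astype(int), [0]])
            starts = np.where(np.diff(padded) == 1)[0]
            ends = np.where(np.diff(padded) == -1)[0]
            runs = ends - starts                      # diagonal line lengths
            on_lines += runs[runs >= lmin].sum()
        total = R[np.triu_indices(n, k=1)].sum()
        return on_lines / total if total else 0.0

    # Toy speaker-turn sequences (A, B, C = the three team roles).
    intact = list("ABCABCABCABC")                     # rigid, repeating pattern
    mixed = list("ABCCABACBBCA")                      # flexible interaction
    print(determinism(intact), determinism(mixed))
    ```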

  5. Clinical guideline representation in a CDS: a human information processing method.

    PubMed

    Kilsdonk, Ellen; Riezebos, Rinke; Kremer, Leontien; Peute, Linda; Jaspers, Monique

    2012-01-01

    The Dutch Childhood Oncology Group (DCOG) has developed evidence-based guidelines for screening childhood cancer survivors for possible late complications of treatment. These paper-based guidelines appeared not to suit clinicians' information retrieval strategies; it was thus decided to communicate the guidelines through a Computerized Decision Support (CDS) tool. To ensure high usability of this tool, an analysis of clinicians' cognitive strategies in retrieving information from the paper-based guidelines was used as a requirements elicitation method. An information processing model was developed through an analysis of think-aloud protocols and used as input for the design of the CDS user interface. Usability analysis of the user interface showed that the navigational structure of the CDS tool fitted well with the mental strategies clinicians employed in deciding on survivor screening protocols. Clinicians were more efficient and more complete in deciding on patient-tailored screening procedures when supported by the CDS tool than by the paper-based guideline booklet. The think-aloud method provided detailed insight into users' clinical work patterns that supported the design of a highly usable CDS system.

  6. Task-technology fit of video telehealth for nurses in an outpatient clinic setting.

    PubMed

    Cady, Rhonda G; Finkelstein, Stanley M

    2014-07-01

    Incorporating telehealth into outpatient care delivery supports management of consumer health between clinic visits. Task-technology fit is a framework for understanding how technology helps and/or hinders a person during work processes. Evaluating the task-technology fit of video telehealth for personnel working in a pediatric outpatient clinic and providing care between clinic visits ensures the information provided matches the information needed to support work processes. The workflow of advanced practice registered nurse (APRN) care coordination provided via telephone and video telehealth was described and measured using a mixed-methods workflow analysis protocol that incorporated cognitive ethnography and time-motion study. Qualitative and quantitative results were merged and analyzed within the task-technology fit framework to determine the workflow fit of video telehealth for APRN care coordination. Incorporating video telehealth into APRN care coordination workflow provided visual information unavailable during telephone interactions. Despite additional tasks and interactions needed to obtain the visual information, APRN workflow efficiency, as measured by time, was not significantly changed. Analyzed within the task-technology fit framework, the increased visual information afforded by video telehealth supported the assessment and diagnostic information needs of the APRN. Telehealth must provide the right information to the right clinician at the right time. Evaluating task-technology fit using a mixed-methods protocol ensured rigorous analysis of fit within work processes and identified workflows that benefit most from the technology.

  7. Supporting Reflective Activities in Information Seeking on the Web

    NASA Astrophysics Data System (ADS)

    Saito, Hitomi; Miwa, Kazuhisa

    Recently, many opportunities have emerged to use the Internet in daily life and classrooms. However, with the growth of the World Wide Web (Web), it is becoming increasingly difficult to find target information on the Internet. In this study, we explore a method for developing users' information seeking abilities on the Web and construct a search process feedback system supporting reflective activities in Web information seeking. Reflection is defined as a cognitive activity for monitoring, evaluating, and modifying one's thinking and processes. In the field of learning science, many researchers have investigated reflective activities that facilitate learners' problem solving and deep understanding. The characteristics of this system are: (1) to show learners' search processes on the Web, described based on a cognitive schema, and (2) to prompt learners to reflect on their search processes. We expect that users of this system can reflect on their search processes by receiving information about those processes from the system, and that this kind of reflective activity helps them deepen their understanding of information seeking. We conducted an experiment to investigate the effects of our system. The experimental results confirmed that (1) the system actually facilitated learners' reflective activities by providing process visualization and prompts, and (2) the learners who reflected on their search processes more actively understood their own search processes more deeply.

  8. [Application of regular expression in extracting key information from Chinese medicine literatures about re-evaluation of post-marketing surveillance].

    PubMed

    Wang, Zhifei; Xie, Yanming; Wang, Yongyan

    2011-10-01

    Computerized extraction of information from Chinese medicine literature is more convenient than hand searching: it can simplify the searching process and improve accuracy. Among the many computerized auto-extraction methods in increasing use, the regular expression is distinctive in how efficiently it can extract useful information for research. This article focuses on applying regular expressions to extract information from Chinese medicine literature. Two practical examples are reported, in which regular expressions extract the "case number" (non-terminology) and the "efficacy rate" (with subgroups for related-information identification), exploring how to extract information from Chinese medicine literature by means of this special research method.
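
    The two examples can be illustrated with Python's re module; the sentence and the exact patterns below are invented for illustration and would need tuning to the phrasings that actually occur in the literature.

    ```python
    import re

    text = ("共纳入患者128例，治疗组65例，对照组63例。"
            "治疗组总有效率为92.3%，对照组总有效率为76.2%。")

    # Case numbers: digits immediately followed by the measure word "例" (cases).
    case_numbers = re.findall(r"(\d+)例", text)

    # Efficacy rates: capture the subgroup name and the percentage after "总有效率".
    rate_pattern = re.compile(r"(治疗组|对照组)总有效率为(\d+(?:\.\d+)?)%")
    rates = rate_pattern.findall(text)

    print(case_numbers)   # ['128', '65', '63']
    print(rates)          # [('治疗组', '92.3'), ('对照组', '76.2')]
    ```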

  9. Post processing for offline Chinese handwritten character string recognition

    NASA Astrophysics Data System (ADS)

    Wang, YanWei; Ding, XiaoQing; Liu, ChangSong

    2012-01-01

    Offline Chinese handwritten character string recognition is one of the most important research fields in pattern recognition. Due to the free writing style, large variability in character shapes and differing geometric characteristics, Chinese handwritten character string recognition is a challenging problem. Among current methods, the over-segmentation and merging method, which integrates geometric information, character recognition information and contextual information, shows promising results. It is found experimentally that a large part of the errors are segmentation errors, and that they mainly occur around non-Chinese characters. A Chinese character string contains not only wide characters, namely Chinese characters, but also narrow characters such as digits and letters of the alphabet. The segmentation errors are mainly caused by a uniform geometric model imposed on all segmented candidate characters. To solve this problem, post-processing is employed to improve the recognition accuracy of narrow characters. On one hand, multiple geometric models are established for wide characters and narrow characters respectively; under these models narrow characters are less prone to being merged. On the other hand, the top-ranked recognition results of candidate paths are integrated to boost the final recognition of narrow characters. The post-processing method was evaluated on two datasets totalling 1405 handwritten address strings. Wide character recognition accuracy improved slightly, and narrow character recognition accuracy increased by 10.41% and 10.03% respectively. This indicates that the post-processing method is effective at improving the recognition accuracy of narrow characters.

  10. Watching diagnoses develop: Eye movements reveal symptom processing during diagnostic reasoning.

    PubMed

    Scholz, Agnes; Krems, Josef F; Jahn, Georg

    2017-10-01

    Finding a probable explanation for observed symptoms is a highly complex task that draws on information retrieval from memory. Recent research suggests that observed symptoms are interpreted in a way that maximizes coherence for a single likely explanation. This becomes particularly clear if symptom sequences support more than one explanation. However, there are no existing process data available that allow coherence maximization to be traced in ambiguous diagnostic situations, where critical information has to be retrieved from memory. In this experiment, we applied memory indexing, an eye-tracking method that affords rich time-course information concerning memory-based cognitive processing during higher order thinking, to reveal symptom processing and the preferred interpretation of symptom sequences. Participants first learned information about causes and symptoms presented in spatial frames. Gaze allocation to emptied spatial frames during symptom processing and during the diagnostic response reflected the subjective status of hypotheses held in memory and the preferred interpretation of ambiguous symptoms. Memory indexing traced how the diagnostic decision developed and revealed instances of hypothesis change and biases in symptom processing. Memory indexing thus provided direct online evidence for coherence maximization in processing ambiguous information.

  11. Pulse-echo probe of rock permeability near oil wells

    NASA Technical Reports Server (NTRS)

    Narasimhan, K. Y.; Parthasarathy, S. P.

    1978-01-01

    The processing method involves sequential insonifications of the borehole wall at a number of different frequencies. Return signals are normalized in amplitude, and the root-mean-square (rms) value of each signal is determined. These values can be processed to yield information on the size and number density of microfractures at various depths in the rock matrix, using averaging methods developed for the pulse-echo technique.
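
    The normalize-then-rms step is simple to state in code; the frequencies and decay constants below are invented stand-ins for real echo data.

    ```python
    import numpy as np

    def echo_rms_profile(echoes):
        """Amplitude-normalize each return signal, then take its rms value.
        echoes: (n_frequencies, n_samples) pulse-echo returns."""
        peak = np.abs(echoes).max(axis=1, keepdims=True)
        normalized = echoes / peak
        return np.sqrt((normalized ** 2).mean(axis=1))

    # Toy returns at three insonification frequencies with different decay.
    t = np.linspace(0, 1e-3, 2000)
    echoes = np.vstack([np.exp(-t / tau) * np.sin(2 * np.pi * f * t)
                        for f, tau in [(2e4, 3e-4), (5e4, 2e-4), (1e5, 1e-4)]])
    print(echo_rms_profile(echoes))   # rms vs frequency, input to the averaging step
    ```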

  12. Forest Service National Visitor Use Monitoring Process: Research Method Documentation

    Treesearch

    Donald B.K. English; Susan M. Kocis; Stanley J. Zarnoch; J. Ross Arnold

    2002-01-01

    In response to the need for improved information on recreational use of National Forest System lands, the authors have developed a nationwide, systematic monitoring process. This report documents the methods they used in estimating recreational use on an annual basis. The basic unit of measure is the exiting volume of visitors from a recreation site on a given day. Sites...

  13. Systems, methods and apparatus for implementation of formal specifications derived from informal requirements

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Rouff, Christopher A. (Inventor); Rash, James L. (Inventor); Erickson, John D. (Inventor); Gracinin, Denis (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments an informal specification is translated without human intervention into a formal specification. In some embodiments the formal specification is a process-based specification. In some embodiments, the formal specification is translated into a high-level computer programming language which is further compiled into a set of executable computer instructions.

  14. Non-rigid ultrasound image registration using generalized relaxation labeling process

    NASA Astrophysics Data System (ADS)

    Lee, Jong-Ha; Seong, Yeong Kyeong; Park, MoonHo; Woo, Kyoung-Gu; Ku, Jeonghun; Park, Hee-Jun

    2013-03-01

    This research proposes a novel non-rigid registration method for ultrasound images. The most predominant anatomical features in medical images are tissue boundaries, which appear as edges. In ultrasound images, however, other features can be identified as well due to the specular reflections that appear as bright lines superimposed on the ideal edge location. In this work, an image's local phase information (via the frequency domain) is used to find the ideal edge location. The generalized relaxation labeling process is then formulated to align the feature points extracted from the ideal edge location. In this work, the original relaxation labeling method was generalized by taking n compatibility coefficient values to improve non-rigid registration performance. This contextual information combined with a relaxation labeling process is used to search for a correspondence. Then the transformation is calculated by the thin plate spline (TPS) model. These two processes are iterated until the optimal correspondence and transformation are found. We have tested our proposed method and the state-of-the-art algorithms with synthetic data and bladder ultrasound images of in vivo human subjects. Experiments show that the proposed method improves registration performance significantly, as compared to other state-of-the-art non-rigid registration algorithms.

  15. Information computer program for laser therapy and laser puncture

    NASA Astrophysics Data System (ADS)

    Badovets, Nadegda N.; Medvedev, Andrei V.

    1995-03-01

    An informative computer program containing laser therapy and puncture methods has been developed. It was used successfully in connection with the compact Russian medical laser apparatus HELIOS-O1M in laser treatment and the education process.

  16. Exploiting salient semantic analysis for information retrieval

    NASA Astrophysics Data System (ADS)

    Luo, Jing; Meng, Bo; Quan, Changqin; Tu, Xinhui

    2016-11-01

    Recently, many Wikipedia-based methods have been proposed to improve the performance of different natural language processing (NLP) tasks, such as semantic relatedness computation, text classification and information retrieval. Among these methods, salient semantic analysis (SSA) has been proven to be an effective way to generate conceptual representations of words or documents. However, its feasibility and effectiveness in information retrieval are mostly unknown. In this paper, we study how to use SSA efficiently to improve information retrieval performance, and propose an SSA-based retrieval method under the language model framework. First, the SSA model is adopted to build conceptual representations for documents and queries. Then, these conceptual representations and the bag-of-words (BOW) representations are used in combination to estimate the language models of queries and documents. Experimental results on several standard Text REtrieval Conference (TREC) collections show that the proposed models consistently outperform the existing Wikipedia-based retrieval methods.
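
    One common way to combine two representations under the language model framework is a linear interpolation of their (smoothed) scores; the sketch below does this with Dirichlet-smoothed unigram models. The interpolation weight, concept labels and toy texts are assumptions, not the paper's exact estimator.

    ```python
    import math
    from collections import Counter

    def lm_score(query_terms, doc_terms, collection, mu=2000):
        """Dirichlet-smoothed unigram language-model log score."""
        tf, n = Counter(doc_terms), len(doc_terms)
        cf, cn = Counter(collection), len(collection)
        return sum(math.log((tf[t] + mu * cf[t] / cn) / (n + mu))
                   for t in query_terms if cf[t] > 0)

    def combined_score(q_words, d_words, q_conc, d_conc, coll_w, coll_c, lam=0.7):
        """Linear combination of the bag-of-words and concept-based LM scores,
        in the spirit of using BOW and SSA conceptual representations together."""
        return (lam * lm_score(q_words, d_words, coll_w)
                + (1 - lam) * lm_score(q_conc, d_conc, coll_c))

    doc = "the players train for the match".split()
    query = "football training".split()
    # Hypothetical concept labels produced by an SSA-like concept mapper.
    doc_conc, query_conc = ["sport", "exercise"], ["sport", "exercise"]
    print(combined_score(query, doc, query_conc, doc_conc,
                         coll_w=doc + query, coll_c=doc_conc + ["music"]))
    ```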

  17. Considering Information Up-to-Dateness to Increase the Accuracy of Therapy Decision Support Systems.

    PubMed

    Gaebel, Jan; Cypko, Mario A; Oeltze-Jafra, Steffen

    2017-01-01

    During the diagnostic process a lot of information is generated. All of this information is assessed when making a final diagnosis and planning the therapy. While some patient information is stable (e.g., gender), other information may become outdated (e.g., a tumor size derived from CT data). Quantifying this up-to-dateness of information and deriving consequences from it is difficult, and its implementation in clinical decision support systems in particular has not been studied. When information entities tend to become outdated, clinicians in practice intuitively reduce their impact when making decisions; a system's calculations should therefore reduce their impact as well. We propose a method of decreasing the certainty of information entities based on their up-to-dateness. The method is tested in a decision support system for TNM staging based on Bayesian networks. We compared the actual N-state in the records of 39 patients to the N-state calculated with and without decreased data certainty. The results under decreased certainty correlated better with the actual states (r=0.958, p=0.008). We conclude that up-to-dateness must be considered when processing clinical information, to enhance decision making and ensure greater patient safety.
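
    A minimal sketch of the idea, under the assumption of an exponential "half-life" decay of certainty toward indifference (the paper's exact weighting scheme is not specified here):

    ```python
    import math

    def decayed_certainty(certainty, age_days, half_life_days):
        """Shrink an information entity's certainty toward indifference (0.5)
        as it ages; half_life_days encodes how fast a finding goes stale."""
        w = math.exp(-math.log(2) * age_days / half_life_days)   # 1.0 = fresh
        return 0.5 + (certainty - 0.5) * w

    # Gender never goes stale; a CT-derived tumor size does (values invented).
    print(decayed_certainty(0.99, age_days=90, half_life_days=float("inf")))  # ~0.99
    print(decayed_certainty(0.99, age_days=90, half_life_days=60))            # ~0.67
    ```

    In a Bayesian network such a decayed certainty could enter as soft evidence instead of a hard finding, so stale observations pull on the calculated stage less strongly.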

  18. A review of decision support, risk communication and patient information tools for thrombolytic treatment in acute stroke: lessons for tool developers

    PubMed Central

    2013-01-01

    Background Tools to support clinical or patient decision-making in the treatment/management of a health condition are used in a range of clinical settings for numerous preference-sensitive healthcare decisions. Their impact in clinical practice is largely dependent on their quality across a range of domains. We critically analysed currently available tools to support decision making or patient understanding in the treatment of acute ischaemic stroke with intravenous thrombolysis, as an exemplar to provide clinicians/researchers with practical guidance on development, evaluation and implementation of such tools for other preference-sensitive treatment options/decisions in different clinical contexts. Methods Tools were identified from bibliographic databases, Internet searches and a survey of UK and North American stroke networks. Two reviewers critically analysed tools to establish: information on benefits/risks of thrombolysis included in tools, and the methods used to convey probabilistic information (verbal descriptors, numerical and graphical); adherence to guidance on presenting outcome probabilities (IPDASi probabilities items) and information content (Picker Institute Checklist); readability (Fog Index); and the extent that tools had comprehensive development processes. Results Nine tools of 26 identified included information on a full range of benefits/risks of thrombolysis. Verbal descriptors, frequencies and percentages were used to convey probabilistic information in 20, 19 and 18 tools respectively, whilst nine used graphical methods. Shortcomings in presentation of outcome probabilities (e.g. omitting outcomes without treatment) were identified. Patient information tools had an aggregate median Fog index score of 10. None of the tools had comprehensive development processes. Conclusions Tools to support decision making or patient understanding in the treatment of acute stroke with thrombolysis have been sub-optimally developed. Development of tools should utilise mixed methods and strategies to meaningfully involve clinicians, patients and their relatives in an iterative design process; include evidence-based methods to augment interpretability of textual and probabilistic information (e.g. graphical displays showing natural frequencies) on the full range of outcome states associated with available options; and address patients with different levels of health literacy. Implementation of tools will be enhanced when mechanisms are in place to periodically assess the relevance of tools and where necessary, update the mode of delivery, form and information content. PMID:23777368

  19. Making sense of health information technology implementation: A qualitative study protocol

    PubMed Central

    2010-01-01

    Background Implementing new practices, such as health information technology (HIT), is often difficult due to the disruption of the highly coordinated, interdependent processes (e.g., information exchange, communication, relationships) of providing care in hospitals. Thus, HIT implementation may occur slowly as staff members observe and make sense of unexpected disruptions in care. As a critical organizational function, sensemaking, defined as the social process of searching for answers and meaning which drive action, leads to unified understanding, learning, and effective problem solving -- strategies that studies have linked to successful change. Project teamwork is a change strategy increasingly used by hospitals that facilitates sensemaking by providing a formal mechanism for team members to share ideas, construct the meaning of events, and take next actions. Methods In this longitudinal case study, we aim to examine project teams' sensemaking and action as the team prepares to implement new information technology in a tertiary care hospital. Based on management and healthcare literature on HIT implementation and project teamwork, we chose sensemaking as an alternative to traditional models for understanding organizational change and teamwork. Our methods choices are derived from this conceptual framework. Data on project team interactions will be prospectively collected through direct observation and organizational document review. Through qualitative methods, we will identify sensemaking patterns and explore variation in sensemaking across teams. Participant demographics will be used to explore variation in sensemaking patterns. Discussion Outcomes of this research will be new knowledge about sensemaking patterns of project teams, such as: the antecedents and consequences of the ongoing, evolutionary, social process of implementing HIT; the internal and external factors that influence the project team, including team composition, team member interaction, and interaction between the project team and the larger organization; the ways in which internal and external factors influence project team processes; and the ways in which project team processes facilitate team task accomplishment. These findings will lead to new methods of implementing HIT in hospitals. PMID:21114860

  20. LSTM-CRF | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    LSTM-CRF uses Natural Language Processing methods for detecting Adverse Drug Events, Drugname, Indication and other medically relevant information in Electronic Health Records. It implements recurrent neural networks with several CRF-based inference methods.

  1. Informal Learning of Social Workers: A Method of Narrative Inquiry

    ERIC Educational Resources Information Center

    Gola, Giancarlo

    2009-01-01

    Purpose: The purpose of this paper is to investigate social workers' processes of informal learning, through their narration of their professional experience, in order to understand how social workers learn. Informal learning is any individual practice or activity that is able to produce continuous learning; it is often non-intentional and…

  2. The Information Impact: Ensuring New Product Winners.

    ERIC Educational Resources Information Center

    Trubkin, Loene

    Despite investment in new research tools and techniques, the product development success rate has not improved within the last 25 years. One way to increase the success rate is to have the right information at each stage of the process. Today, a relatively new method of gathering information--online access to electronic files called…

  3. The crack detection algorithm of pavement image based on edge information

    NASA Astrophysics Data System (ADS)

    Yang, Chunde; Geng, Mingyue

    2018-05-01

    Images of pavement cracks are affected by a large amount of complicated noise, such as uneven illumination and water stains, so detected cracks are often discontinuous and the main body information at the crack edges is easily lost. To solve this problem, a crack detection algorithm for pavement images based on edge information is proposed. Firstly, the image is pre-processed with a nonlinear gray-scale transform function and a reconstruction filter to enhance the linear characteristics of the crack. At the same time, an adaptive thresholding method is designed to coarsely extract the crack edges according to the gray-scale gradient features, producing a crack gradient information map. Secondly, candidate edge points are obtained from the gradient information, and the edges are detected by single-pixel percolation processing, improved by using the local differences between pixels in a fixed region. Finally, the complete crack is obtained by filling in the crack edges. Experimental results show that the proposed method can accurately detect pavement cracks and preserve edge information.
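
    The coarse, adaptive thresholding of the gradient can be sketched as below: the gradient magnitude is compared against a locally averaged version of itself, so the threshold follows the uneven illumination. The window size, factor k and synthetic image are illustrative assumptions, not the paper's parameters.

    ```python
    import numpy as np
    from scipy.ndimage import sobel, uniform_filter

    def coarse_crack_edges(gray, window=31, k=1.5):
        """Coarse crack-edge map: gradient magnitude thresholded against a
        local (windowed) mean, so the threshold adapts to uneven lighting."""
        gx, gy = sobel(gray, axis=1), sobel(gray, axis=0)
        grad = np.hypot(gx, gy)
        local_mean = uniform_filter(grad, size=window)
        return grad > k * local_mean                 # candidate edge points

    # Toy pavement patch: dark thin crack on an unevenly lit background.
    y, x = np.mgrid[0:128, 0:128]
    img = 0.6 + 0.2 * (x / 128.0)                    # illumination gradient
    img[np.abs(y - (0.5 * x + 10)) < 1] -= 0.35      # diagonal crack
    edges = coarse_crack_edges(img)
    print(edges.sum(), "candidate crack-edge pixels")
    ```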

  4. Study of the correlation parameters of the surface structure of disordered semiconductors by the two-dimensional DFA and average mutual information methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alpatov, A. V.; Vikhrov, S. P.; Rybina, N. V., E-mail: pgnv@mail.ru

    The processes of self-organization of the surface structure of hydrogenated amorphous silicon are studied by the methods of fluctuation analysis and average mutual information on the basis of atomic-force-microscopy images of the surface. It is found that all of the structures can be characterized by a correlation vector and represented as a superposition of harmonic components and noise. It is shown that, under variations in the technological parameters of the production of a-Si:H films, the correlation properties of their structure vary as well. As the substrate temperature is increased, the formation of structural irregularities becomes less efficient; in this case, the length of the correlation vector and the degree of structural ordering increase. It is shown that the procedure based on the method of fluctuation analysis in combination with the method of average mutual information provides a means for studying the self-organization processes in any structures on different length scales.

  5. Morphology-Induced Information Transfer in Bat Sonar

    NASA Astrophysics Data System (ADS)

    Reijniers, Jonas; Vanderelst, Dieter; Peremans, Herbert

    2010-10-01

    It has been argued that an important part of understanding bat echolocation comes down to understanding the morphology of the bat sound processing apparatus. In this Letter we present a method based on information theory that allows us to assess target localization performance of bat sonar, without a priori knowledge on the position, size, or shape of the reflecting target. We demonstrate this method using simulated directivity patterns of the frequency-modulated bat Micronycteris microtis. The results of this analysis indicate that the morphology of this bat’s sound processing apparatus has evolved to be a compromise between sensitivity and accuracy with the pinnae and the noseleaf playing different roles.

  6. Periodical capacity setting methods for make-to-order multi-machine production systems

    PubMed Central

    Altendorfer, Klaus; Hübl, Alexander; Jodlbauer, Herbert

    2014-01-01

    The paper presents different periodical capacity setting methods for make-to-order, multi-machine production systems with stochastic customer required lead times and stochastic processing times, aiming to improve service level and tardiness. These methods are developed as decision support for situations where capacity flexibility exists, such as a certain range of possible working hours per week. The methods differ in the amount of information used, although all are based on the cumulated capacity demand at each machine. In a simulation study, the methods' impact on service level and tardiness is compared to a constant provided capacity in a single-machine and a multi-machine setting. It is shown that the tested capacity setting methods can increase the service level and decrease average tardiness in comparison to a constant provided capacity. The methods using information on the processing time and customer required lead time distributions perform best. The results found in this paper can help practitioners make efficient use of their flexible capacity. PMID:27226649
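
    A toy stand-in for the comparison (not the paper's simulation model): let each period's provided capacity track the cumulated capacity demand within the feasible working-hour range, and compare a crude shortfall proxy against a constant-capacity benchmark. The distributions and bounds are invented.

    ```python
    import numpy as np

    def flexible_capacity(cum_demand, cap_min, cap_max):
        """Each period's provided capacity follows the cumulated capacity
        demand seen at the machine, within the feasible working-hour range."""
        return np.clip(cum_demand, cap_min, cap_max)

    rng = np.random.default_rng(1)
    demand = rng.gamma(shape=9, scale=4, size=52)      # weekly demand, hours
    flexible = flexible_capacity(demand, cap_min=30, cap_max=48)
    constant = np.full(52, demand.mean())              # benchmark: fixed capacity

    def total_shortfall(capacity):                     # crude tardiness proxy
        return np.maximum(demand - capacity, 0).sum()

    print(total_shortfall(flexible), total_shortfall(constant))
    ```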

  7. Cloud decision model for selecting sustainable energy crop based on linguistic intuitionistic information

    NASA Astrophysics Data System (ADS)

    Peng, Hong-Gang; Wang, Jian-Qiang

    2017-11-01

    In recent years, the sustainable energy crop has become an important energy development strategy topic in many countries. Selecting the most sustainable energy crop is a significant problem that must be addressed during any biofuel production process. The focus of this study is the development of an innovative multi-criteria decision-making (MCDM) method to handle sustainable energy crop selection problems. Given that various uncertain data are encountered in the evaluation of sustainable energy crops, linguistic intuitionistic fuzzy numbers (LIFNs) are introduced to present the information necessary to the evaluation process. Processing qualitative concepts requires the effective support of reliable tools, and a cloud model can be used to deal with linguistic intuitionistic information. First, LIFNs are converted and the novel concept of the linguistic intuitionistic cloud (LIC) is proposed. The operations, score function and similarity measurement of LICs are defined. Subsequently, the linguistic intuitionistic cloud density-prioritised weighted Heronian mean operator is developed, which serves as the basis for the construction of an applicable MCDM model for sustainable energy crop selection. Finally, an illustrative example is provided to demonstrate the proposed method, and its feasibility and validity are further verified by comparison with other existing methods.

  8. Separation of foreground and background from light field using gradient information.

    PubMed

    Lee, Jae Young; Park, Rae-Hong

    2017-02-01

    Studies of computer vision and machine vision applications using light field cameras have been increasing in recent years. However, the capabilities of the light field camera are not fully used in these applications. In this paper, we propose a method for the direct separation of foreground and background that uses gradient information and can be used in various applications such as pre-processing. From the optical phenomenon whereby the bundles of rays from the background are flipped, we derive that the disparity sign of the background in a captured three-dimensional scene is opposite to that of the foreground. Using a majority-weighted voting algorithm based on the gradient information, with the Lambertian assumption and the gradient constraint, the foreground and background can be separated at each pixel. As a pre-processing step, the proposed method can serve various applications such as occlusion and saliency detection, disparity estimation, and so on. Experimental results with the EPFL light field dataset and the Stanford Lytro light field dataset show that the proposed method achieves better performance in occlusion detection, and thus can be used effectively in pre-processing for saliency detection and disparity estimation.

  9. [Digitalization of radiological imaging information and consequences for patient care in the hospital].

    PubMed

    den Heeten, G J; Barneveld Binkhuysen, F H

    2001-08-25

    Determining the rate at which radiology must be digitalised has been a controversial issue for many years. Much radiological information is still obtained from the film-screen combination (X-rays), with all of its known inherent restrictions. The importance of imaging information in the healthcare process continues to increase for both radiologists and referring physicians, and the ongoing developments in information technology mean that it is possible to integrate imaging information and electronic patient files. The healthcare process can only become more effective and efficient when the appropriate information is in the right place at the right time, something that conventional methods, using photos that need to be physically moved, can scarcely satisfy. There is also a desire for integration with information obtained from nuclear medicine, pathology and endoscopy, and eventually of all stand-alone data systems relevant to individually oriented hospital healthcare. The transition from a conventional to a digital process is complex; it is accompanied by the transition from a data-oriented to a process-oriented system. Many years have already been invested in the integration of information systems and the development of digital systems within radiology, whose current performance is such that many hospitals are considering the digitalisation process or are already implementing parts of it.

  10. Semantic integration of differently asynchronous audio-visual information in videos of real-world events in cognitive processing: an ERP study.

    PubMed

    Liu, Baolin; Wu, Guangning; Wang, Zhongning; Ji, Xiang

    2011-07-01

    In the real world, some of the auditory and visual information received by the human brain is temporally asynchronous. How is such information integrated in cognitive processing in the brain? In this paper, we aimed to study the semantic integration of differently asynchronous audio-visual information in cognitive processing using the ERP (event-related potential) method. Subjects were presented with videos of real-world events in which the auditory and visual information were temporally asynchronous. When the critical action preceded the sound, sounds incongruous with the preceding critical actions elicited an N400 effect compared to the congruous condition. This result demonstrates that the semantic contextual integration indexed by the N400 also applies to cognitive processing of multisensory information. In addition, the N400 effect had an early latency when contrasted with other visually induced N400 studies, showing that cross-modal information is facilitated in time when contrasted with visual information in isolation. When the sound preceded the critical action, a larger late positive wave was observed under the incongruous condition compared to the congruous condition. The P600 might represent a reanalysis process, in which the mismatch between the critical action and the preceding sound is evaluated. This shows that environmental sound may affect the cognitive processing of a visual event. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  11. On vital aid: the why, what and how of validation

    PubMed Central

    Kleywegt, Gerard J.

    2009-01-01

    Limitations to the data and subjectivity in the structure-determination process may cause errors in macromolecular crystal structures. Appropriate validation techniques may be used to reveal problems in structures, ideally before they are analysed, published or deposited. Additionally, such techniques may be used a posteriori to assess the (relative) merits of a model by potential users. Weak validation methods and statistics assess how well a model reproduces the information that was used in its construction (i.e. experimental data and prior knowledge). Strong methods and statistics, on the other hand, test how well a model predicts data or information that were not used in the structure-determination process. These may be data that were excluded from the process on purpose, general knowledge about macromolecular structure, information about the biological role and biochemical activity of the molecule under study or its mutants or complexes and predictions that are based on the model and that can be tested experimentally. PMID:19171968

  12. Natural Resource Information System. Volume 2: System operating procedures and instructions

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A total computer software system description is provided for the prototype Natural Resource Information System designed to store, process, and display data of maximum usefulness to land management decision making. Program modules are described, as are the computer file design, file updating methods, digitizing process, and paper tape conversion to magnetic tape. Operating instructions for the system, data output, printed output, and graphic output are also discussed.

  13. Could a neuroscientist understand a microprocessor?

    DOE PAGES

    Jonas, Eric; Kording, Konrad Paul; Diedrichsen, Jorn

    2017-01-12

    There is a popular belief in neuroscience that we are primarily data limited, and that producing large, multimodal, and complex datasets will, with the help of advanced data analysis algorithms, lead to fundamental insights into the way the brain processes information. These datasets do not yet exist, and if they did we would have no way of evaluating whether or not the algorithmically-generated insights were sufficient or even correct. To address this, here we take a classical microprocessor as a model organism, and use our ability to perform arbitrary experiments on it to see if popular data analysis methods from neuroscience can elucidate the way it processes information. Microprocessors are among those artificial information processing systems that are both complex and that we understand at all levels, from the overall logical flow, via logical gates, to the dynamics of transistors. We show that the approaches reveal interesting structure in the data but do not meaningfully describe the hierarchy of information processing in the microprocessor. This suggests current analytic approaches in neuroscience may fall short of producing meaningful understanding of neural systems, regardless of the amount of data. Furthermore, we argue for scientists using complex non-linear dynamical systems with known ground truth, such as the microprocessor as a validation platform for time-series and structure discovery methods.
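
    The closing argument, that analysis methods should be validated on systems with known ground truth, is easy to demonstrate in miniature. The sketch below is a loose illustration rather than anything from the paper: it simulates a tiny linear dynamical system with known connectivity and applies a naive lagged-correlation "connectivity discovery" whose output can then be checked against the truth.

        import numpy as np

        # Ground-truth system: unit 0 drives unit 1; unit 2 is isolated.
        rng = np.random.default_rng(0)
        A = np.array([[0.9, 0.0, 0.0],
                      [0.8, 0.5, 0.0],
                      [0.0, 0.0, 0.7]])
        x, trace = np.zeros(3), []
        for _ in range(5000):
            x = A @ x + rng.normal(scale=0.1, size=3)
            trace.append(x.copy())
        trace = np.asarray(trace)

        # Naive discovery method: correlation between x_i(t) and x_j(t+1).
        est = np.corrcoef(np.vstack([trace[:-1].T, trace[1:].T]))[:3, 3:]
        print((A != 0).astype(int))      # ground-truth couplings
        print(np.round(est, 2))          # what the naive method recovers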

  14. Interactive access to LP DAAC satellite data archives through a combination of open-source and custom middleware web services

    USGS Publications Warehouse

    Davis, Brian N.; Werpy, Jason; Friesz, Aaron M.; Impecoven, Kevin; Quenzer, Robert; Maiersperger, Tom; Meyer, David J.

    2015-01-01

    Current methods of searching for and retrieving data from satellite land remote sensing archives do not allow for interactive information extraction. Instead, Earth science data users are required to download files over low-bandwidth networks to local workstations and process data before science questions can be addressed. New methods of extracting information from data archives need to become more interactive to meet user demands for deriving increasingly complex information from rapidly expanding archives. Moving the tools required for processing data to computer systems of data providers, and away from systems of the data consumer, can improve turnaround times for data processing workflows. The implementation of middleware services was used to provide interactive access to archive data. The goal of this middleware services development is to enable Earth science data users to access remote sensing archives for immediate answers to science questions instead of links to large volumes of data to download and process. Exposing data and metadata to web-based services enables machine-driven queries and data interaction. Also, product quality information can be integrated to enable additional filtering and sub-setting. Only the reduced content required to complete an analysis is then transferred to the user.

  15. Could a Neuroscientist Understand a Microprocessor?

    PubMed Central

    Kording, Konrad Paul

    2017-01-01

    There is a popular belief in neuroscience that we are primarily data limited, and that producing large, multimodal, and complex datasets will, with the help of advanced data analysis algorithms, lead to fundamental insights into the way the brain processes information. These datasets do not yet exist, and if they did we would have no way of evaluating whether or not the algorithmically-generated insights were sufficient or even correct. To address this, here we take a classical microprocessor as a model organism, and use our ability to perform arbitrary experiments on it to see if popular data analysis methods from neuroscience can elucidate the way it processes information. Microprocessors are among those artificial information processing systems that are both complex and that we understand at all levels, from the overall logical flow, via logical gates, to the dynamics of transistors. We show that the approaches reveal interesting structure in the data but do not meaningfully describe the hierarchy of information processing in the microprocessor. This suggests current analytic approaches in neuroscience may fall short of producing meaningful understanding of neural systems, regardless of the amount of data. Additionally, we argue for scientists using complex non-linear dynamical systems with known ground truth, such as the microprocessor as a validation platform for time-series and structure discovery methods. PMID:28081141

  16. Could a Neuroscientist Understand a Microprocessor?

    PubMed

    Jonas, Eric; Kording, Konrad Paul

    2017-01-01

    There is a popular belief in neuroscience that we are primarily data limited, and that producing large, multimodal, and complex datasets will, with the help of advanced data analysis algorithms, lead to fundamental insights into the way the brain processes information. These datasets do not yet exist, and if they did we would have no way of evaluating whether or not the algorithmically-generated insights were sufficient or even correct. To address this, here we take a classical microprocessor as a model organism, and use our ability to perform arbitrary experiments on it to see if popular data analysis methods from neuroscience can elucidate the way it processes information. Microprocessors are among those artificial information processing systems that are both complex and that we understand at all levels, from the overall logical flow, via logical gates, to the dynamics of transistors. We show that the approaches reveal interesting structure in the data but do not meaningfully describe the hierarchy of information processing in the microprocessor. This suggests current analytic approaches in neuroscience may fall short of producing meaningful understanding of neural systems, regardless of the amount of data. Additionally, we argue for scientists using complex non-linear dynamical systems with known ground truth, such as the microprocessor as a validation platform for time-series and structure discovery methods.

  17. Could a neuroscientist understand a microprocessor?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jonas, Eric; Kording, Konrad Paul; Diedrichsen, Jorn

    There is a popular belief in neuroscience that we are primarily data limited, and that producing large, multimodal, and complex datasets will, with the help of advanced data analysis algorithms, lead to fundamental insights into the way the brain processes information. These datasets do not yet exist, and if they did we would have no way of evaluating whether or not the algorithmically-generated insights were sufficient or even correct. To address this, here we take a classical microprocessor as a model organism, and use our ability to perform arbitrary experiments on it to see if popular data analysis methods from neuroscience can elucidate the way it processes information. Microprocessors are among those artificial information processing systems that are both complex and that we understand at all levels, from the overall logical flow, via logical gates, to the dynamics of transistors. We show that the approaches reveal interesting structure in the data but do not meaningfully describe the hierarchy of information processing in the microprocessor. This suggests current analytic approaches in neuroscience may fall short of producing meaningful understanding of neural systems, regardless of the amount of data. Furthermore, we argue for scientists using complex non-linear dynamical systems with known ground truth, such as the microprocessor as a validation platform for time-series and structure discovery methods.

  18. Natural Language Processing Methods and Systems for Biomedical Ontology Learning

    PubMed Central

    Liu, Kaihong; Hogan, William R.; Crowley, Rebecca S.

    2010-01-01

    While the biomedical informatics community widely acknowledges the utility of domain ontologies, there remain many barriers to their effective use. One important requirement of domain ontologies is that they must achieve a high degree of coverage of the domain concepts and concept relationships. However, the development of these ontologies is typically a manual, time-consuming, and often error-prone process. Limited resources result in missing concepts and relationships as well as difficulty in updating the ontology as knowledge changes. Methodologies developed in the fields of natural language processing, information extraction, information retrieval and machine learning provide techniques for automating the enrichment of an ontology from free-text documents. In this article, we review existing methodologies and developed systems, and discuss how existing methods can benefit the development of biomedical ontologies. PMID:20647054

  19. From IHE Audit Trails to XES Event Logs Facilitating Process Mining.

    PubMed

    Paster, Ferdinand; Helm, Emmanuel

    2015-01-01

    Recently, business intelligence approaches such as process mining have been applied to the healthcare domain. The goal of process mining is to gain knowledge of processes, verify compliance, and identify room for improvement by investigating recorded event data. Previous approaches focused on process discovery using event data from various specific systems. IHE, a globally recognized basis for healthcare information systems, defines in its ATNA profile how real-world events must be recorded in centralized event logs. The approach presented here shows how audit trails collected by means of ATNA can be transformed to enable process mining. Using the standardized audit trails makes these methods applicable to all IHE-based information systems.
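
    A minimal sketch of such a transformation, assuming the audit records have already been parsed into dictionaries; the field names and the grouping rule (one XES trace per patient) are illustrative simplifications, not the mapping defined by the authors.

        import xml.etree.ElementTree as ET

        # Hypothetical, already-parsed ATNA audit records (fields are illustrative).
        audit_records = [
            {"patient": "P001", "event": "Patient Record Read", "time": "2015-01-05T10:12:00"},
            {"patient": "P001", "event": "Document Submission", "time": "2015-01-05T10:20:00"},
        ]

        # One XES trace per patient, one XES event per audit record.
        log = ET.Element("log", {"xes.version": "1.0"})
        for pid in sorted({r["patient"] for r in audit_records}):
            trace = ET.SubElement(log, "trace")
            ET.SubElement(trace, "string", {"key": "concept:name", "value": pid})
            for r in (r for r in audit_records if r["patient"] == pid):
                event = ET.SubElement(trace, "event")
                ET.SubElement(event, "string", {"key": "concept:name", "value": r["event"]})
                ET.SubElement(event, "date", {"key": "time:timestamp", "value": r["time"]})

        ET.ElementTree(log).write("audit.xes", xml_declaration=True, encoding="utf-8")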

  20. Medication incident reporting in residential aged care facilities: Limitations and risks to residents’ safety

    PubMed Central

    2012-01-01

    Background Medication incident reporting (MIR) is a key safety-critical care process in residential aged care facilities (RACFs). Retrospective studies of medication incident reports in aged care have identified the inability of existing MIR processes to generate information that can be used to enhance residents’ safety. However, there is little existing research that investigates the limitations of the existing information exchange process that underpins MIR, despite the considerable resources that RACFs devote to the MIR process. The aim of this study was to undertake an in-depth exploration of the information exchange process involved in MIR and identify factors that inhibit the collection of meaningful information in RACFs. Methods The study was undertaken in three RACFs (part of a large non-profit organisation) in NSW, Australia. A total of 23 semi-structured interviews and 62 hours of observation sessions were conducted between May and July 2011. The qualitative data were iteratively analysed using a grounded theory approach. Results The findings highlight significant gaps in the design of the MIR artefacts as well as information exchange issues in MIR process execution. The results emphasised the need to: a) design MIR artefacts that facilitate identification of the root causes of medication incidents, b) integrate the MIR process within existing information systems to overcome key gaps in information exchange execution, and c) support the exchange of information that can facilitate a multi-disciplinary approach to medication incident management in RACFs. Conclusions This study highlights the advantages of viewing the MIR process holistically rather than as segregated tasks, as a means to identify gaps in information exchange that need to be addressed in practice to improve safety-critical processes. PMID:23122411

  1. Diffusion processes in tumors: A nuclear medicine approach

    NASA Astrophysics Data System (ADS)

    Amaya, Helman

    2016-07-01

    The number of counts used in nuclear medicine imaging techniques only provides physical information about the disintegration of the nuclei present in the radiotracer molecules taken up in a particular anatomical region; it is not true metabolic information. For this reason, a mathematical method was used to find a correlation between the number of counts and 18F-FDG mass concentration. This correlation allows a better interpretation of the results obtained in the study of diffusive processes in an agar phantom, and based on it, an image from the PETCETIX DICOM sample image set from the OsiriX-viewer software was processed. PET-CT gradient-magnitude and Laplacian images can show direct information on diffusive processes for radiopharmaceuticals that enter cells by simple diffusion. In the case of the radiopharmaceutical 18F-FDG, pharmacokinetic models must be included to interpret the gradient-magnitude and Laplacian count images correctly.

  2. A wavelet domain adaptive image watermarking method based on chaotic encryption

    NASA Astrophysics Data System (ADS)

    Wei, Fang; Liu, Jian; Cao, Hanqiang; Yang, Jun

    2009-10-01

    Digital watermarking, a specific branch of steganography usable in various applications, provides a novel way to solve security problems for multimedia information. In this paper, we propose a wavelet-domain adaptive image watermarking method that uses chaotic stream encryption and properties of human visual perception. The secret information, treated as a watermark, is hidden in a host image that can be publicly accessed, so transporting the secret information does not attract the attention of an illegal receiver. Experimental results show that the watermark is invisible and robust against common image processing operations.
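
    The overall scheme can be illustrated compactly. The following simplified sketch (assuming the PyWavelets package) encrypts watermark bits with a logistic-map keystream and adds them to the diagonal detail coefficients of a one-level Haar DWT; the wavelet choice, embedding strength, and map parameters are assumptions, and the visual-property adaptation described in the paper is omitted.

        import numpy as np
        import pywt  # PyWavelets

        def logistic_bits(n, x0=0.7, r=3.99):
            """Chaotic logistic-map keystream, thresholded to bits."""
            bits, x = [], x0
            for _ in range(n):
                x = r * x * (1 - x)
                bits.append(1 if x > 0.5 else 0)
            return np.array(bits, dtype=np.uint8)

        def embed(host, wm_bits, alpha=8.0, x0=0.7):
            """XOR the watermark with the chaotic stream, then add the result
            to the diagonal detail coefficients of a one-level Haar DWT."""
            cA, (cH, cV, cD) = pywt.dwt2(host.astype(float), "haar")
            enc = wm_bits ^ logistic_bits(wm_bits.size, x0)   # stream encryption
            flat = cD.ravel().copy()
            flat[:enc.size] += alpha * (2.0 * enc - 1.0)      # +/- alpha per bit
            return pywt.idwt2((cA, (cH, cV, flat.reshape(cD.shape))), "haar")

        rng = np.random.default_rng(1)
        host = rng.integers(0, 256, (64, 64))
        watermark = rng.integers(0, 2, 128).astype(np.uint8)
        marked = embed(host, watermark)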

  3. Quantum Approach to Informatics

    NASA Astrophysics Data System (ADS)

    Stenholm, Stig; Suominen, Kalle-Antti

    2005-08-01

    An essential overview of quantum information. Information, whether inscribed as a mark on a stone tablet or encoded as a magnetic domain on a hard drive, must be stored in a physical object and thus made subject to the laws of physics. Traditionally, information processing such as computation occurred in a framework governed by laws of classical physics. However, information can also be stored and processed using the states of matter described by non-classical quantum theory. Understanding this quantum information, a fundamentally different type of information, has been a major project of physicists and information theorists in recent years, and recent experimental research has started to yield promising results. Quantum Approach to Informatics fills the need for a concise introduction to this burgeoning new field, offering an intuitive approach for readers in both the physics and information science communities, as well as in related fields. Only a basic background in quantum theory is required, and the text keeps the focus on bringing this theory to bear on contemporary informatics. Instead of proofs and other highly formal structures, detailed examples present the material, making this a uniquely accessible introduction to quantum informatics. Topics covered include:
    * An introduction to quantum information and the qubit
    * Concepts and methods of quantum theory important for informatics
    * The application of information concepts to quantum physics
    * Quantum information processing and computing
    * Quantum gates
    * Error correction using quantum-based methods
    * Physical realizations of quantum computing circuits
    A helpful and economical resource for understanding this exciting new application of quantum theory to informatics, Quantum Approach to Informatics provides students and researchers in physics and information science, as well as other interested readers with some scientific background, with an essential overview of the field.

  4. Automated ancillary cancer history classification for mesothelioma patients from free-text clinical reports

    PubMed Central

    Wilson, Richard A.; Chapman, Wendy W.; DeFries, Shawn J.; Becich, Michael J.; Chapman, Brian E.

    2010-01-01

    Background: Clinical records are often unstructured, free-text documents that create information extraction challenges and costs. Healthcare delivery and research organizations, such as the National Mesothelioma Virtual Bank, require the aggregation of both structured and unstructured data types. Natural language processing offers techniques for automatically extracting information from unstructured, free-text documents. Methods: Five hundred and eight history and physical reports from mesothelioma patients were split into development (208) and test sets (300). A reference standard was developed and each report was annotated by experts with regard to the patient’s personal history of ancillary cancer and family history of any cancer. The Hx application was developed to process reports, extract relevant features, perform reference resolution and classify them with regard to cancer history. Two methods, Dynamic-Window and ConText, for extracting information were evaluated. Hx’s classification responses using each of the two methods were measured against the reference standard. The average Cohen’s weighted kappa served as the human benchmark in evaluating the system. Results: Hx had a high overall accuracy, with each method scoring 96.2%. F-measures using the Dynamic-Window and ConText methods were 91.8% and 91.6%, which were comparable to the human benchmark of 92.8%. For the personal history classification, Dynamic-Window scored highest with 89.2% and for the family history classification, ConText scored highest with 97.6%; both methods were comparable to the human benchmark of 88.3% and 97.2%, respectively. Conclusion: We evaluated an automated application’s performance in classifying a mesothelioma patient’s personal and family history of cancer from clinical reports. To do so, the Hx application must process reports, identify cancer concepts, distinguish the known mesothelioma from ancillary cancers, recognize negation, perform reference resolution and determine the experiencer. Results indicated that both information extraction methods tested were dependent on the domain-specific lexicon and negation extraction. We showed that the more general method, ConText, performed as well as our task-specific method. Although Dynamic-Window could be modified to retrieve other concepts, ConText is more robust and performs better on inconclusive concepts. Hx could greatly improve and expedite the process of extracting data from free-text, clinical records for a variety of research or healthcare delivery organizations. PMID:21031012

  5. Software Formal Inspections Guidebook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  6. Organisation of biotechnological information into knowledge.

    PubMed

    Boh, B

    1996-09-01

    The success of biotechnological research, development and marketing depends to a large extent on the international transfer of information and on the ability to organise biotechnology information into knowledge. To increase the efficiency of information-based approaches, an information strategy has been developed and consists of the following stages: definition of the problem, its structure and sub-problems; acquisition of data by targeted processing of computer-supported bibliographic, numeric, textual and graphic databases; analysis of data and building of specialized in-house information systems; information processing for structuring data into systems, recognition of trends and patterns of knowledge, particularly by information synthesis using the concept of information density; design of research hypotheses; testing hypotheses in the laboratory and/or pilot plant; repeated evaluation and optimization of hypotheses by information methods and testing them by further laboratory work. The information approaches are illustrated by examples from the university-industry joint projects in biotechnology, biochemistry and agriculture.

  7. Spectral simplicity of apparent complexity. II. Exact complexities and complexity spectra

    NASA Astrophysics Data System (ADS)

    Riechers, Paul M.; Crutchfield, James P.

    2018-03-01

    The meromorphic functional calculus developed in Part I overcomes the nondiagonalizability of linear operators that arises often in the temporal evolution of complex systems and is generic to the metadynamics of predicting their behavior. Using the resulting spectral decomposition, we derive closed-form expressions for correlation functions, finite-length Shannon entropy-rate approximates, asymptotic entropy rate, excess entropy, transient information, transient and asymptotic state uncertainties, and synchronization information of stochastic processes generated by finite-state hidden Markov models. This introduces analytical tractability to investigating information processing in discrete-event stochastic processes, symbolic dynamics, and chaotic dynamical systems. Comparisons reveal mathematical similarities between complexity measures originally thought to capture distinct informational and computational properties. We also introduce a new kind of spectral analysis via coronal spectrograms and the frequency-dependent spectra of past-future mutual information. We analyze a number of examples to illustrate the methods, emphasizing processes with multivariate dependencies beyond pairwise correlation. This includes spectral decomposition calculations for one representative example in full detail.
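
    As a concrete point of reference for the quantities involved, the sketch below computes one of them, the asymptotic entropy rate, in the elementary special case of a finite irreducible Markov chain; the meromorphic functional calculus of the paper generalizes far beyond this simple setting.

        import numpy as np

        def entropy_rate(T):
            """Entropy rate (bits per step) of an irreducible Markov chain with
            row-stochastic transition matrix T: h = -sum_i pi_i sum_j T_ij log2 T_ij."""
            vals, vecs = np.linalg.eig(T.T)
            pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
            pi /= pi.sum()                     # stationary distribution, pi T = pi
            logs = np.zeros_like(T)
            logs[T > 0] = np.log2(T[T > 0])    # treat 0 log 0 as 0
            return float(-(pi[:, None] * T * logs).sum())

        T = np.array([[0.9, 0.1],
                      [0.4, 0.6]])
        print(entropy_rate(T))                 # ~0.57 bits per step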

  8. Handbook of automated data collection methods for the National Transit Database

    DOT National Transportation Integrated Search

    2003-10-01

    In recent years, with the increasing sophistication and capabilities of information processing technologies, there has been a renewed interest on the part of transit systems to tap the rich information potential of the National Transit Database (NTD)...

  9. Eventogram: A Visual Representation of Main Events in Biomedical Signals.

    PubMed

    Elgendi, Mohamed

    2016-09-22

    Biomedical signals carry valuable physiological information, yet many researchers have difficulty interpreting and analyzing long-term, one-dimensional, quasi-periodic biomedical signals. Traditionally, biomedical signals are analyzed and visualized using periodogram, spectrogram, and wavelet methods. However, these methods do not offer an informative visualization of the main events within the processed signal. This paper provides an event-related framework to overcome the drawbacks of the traditional visualization methods and to describe the main events within a biomedical signal in terms of duration and morphology. Electrocardiogram and photoplethysmogram signals are used in the analysis to demonstrate the differences between the traditional visualization methods, and their performance is compared against the proposed method, referred to here as the "eventogram". The proposed method is based on two event-related moving averages that visualize the main time-domain events in the processed biomedical signal. The traditional visualization methods were unable to find dominant events in the processed signals, while the eventogram was able to visualize dominant events in terms of duration and morphology. Moreover, eventogram-based detection algorithms succeeded in detecting the main events in different biomedical signals with a sensitivity and positive predictivity above 95%. The output of the eventogram captured unique patterns and signatures of physiological events, which could be used to visualize and identify abnormal waveforms in any quasi-periodic signal.
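
    The two-moving-averages idea can be sketched briefly: a short "event" moving average is compared against a long "cycle" moving average plus an offset, and samples where it dominates form the blocks of interest. The window lengths and offset below are illustrative assumptions; the paper tunes such parameters per signal type.

        import numpy as np

        def two_moving_averages(signal, w_event=11, w_cycle=61, beta=0.05):
            """Boolean mask of candidate event blocks: short moving average
            above long moving average plus a signal-dependent offset."""
            def moving_avg(x, w):
                return np.convolve(x, np.ones(w) / w, mode="same")
            ma_event = moving_avg(signal, w_event)
            ma_cycle = moving_avg(signal, w_cycle)
            return ma_event > ma_cycle + beta * np.mean(signal)

        # Toy quasi-periodic signal: a noisy train of peaks.
        rng = np.random.default_rng(0)
        t = np.linspace(0, 10, 2000)
        pulses = np.maximum(0, np.sin(2 * np.pi * t)) ** 8 + 0.05 * rng.normal(size=t.size)
        blocks_of_interest = two_moving_averages(pulses)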

  10. Fault detection of Tennessee Eastman process based on topological features and SVM

    NASA Astrophysics Data System (ADS)

    Zhao, Huiyang; Hu, Yanzhu; Ai, Xinbo; Hu, Yu; Meng, Zhen

    2018-03-01

    Fault detection in industrial processes is a popular research topic. Although the distributed control system (DCS) has been introduced to monitor the state of industrial processes, it still cannot satisfy all the requirements for fault detection across industrial systems. In this paper, we propose a novel method for fault detection in industrial processes based on topological features and a support vector machine (SVM). The proposed method takes global information on the measured variables into account through a complex network model and uses an SVM to predict whether a system has generated faults. The method comprises four steps: network construction, network analysis, model training, and model testing. Finally, we apply the model to the Tennessee Eastman process (TEP). The results show that the method works well and can be a useful supplement for fault detection in industrial processes.
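
    A rough sketch of the four steps on invented data (the feature set is a small illustrative subset of what a topological analysis could extract, and the correlation threshold is an assumption):

        import numpy as np
        import networkx as nx
        from sklearn.svm import SVC

        def window_features(window, thresh=0.6):
            """One window of measurements -> correlation network -> features."""
            corr = np.abs(np.corrcoef(window.T))
            np.fill_diagonal(corr, 0.0)
            g = nx.from_numpy_array((corr > thresh).astype(int))
            degrees = [d for _, d in g.degree()]
            return [nx.density(g), nx.average_clustering(g), float(np.mean(degrees))]

        # Hypothetical windows: normal operation vs a fault coupling all variables.
        rng = np.random.default_rng(0)
        normal = [rng.normal(size=(200, 8)) for _ in range(30)]
        faulty = [0.5 * rng.normal(size=(200, 8)) + rng.normal(size=(200, 1))
                  for _ in range(30)]

        X = [window_features(w) for w in normal + faulty]   # steps 1-2
        y = [0] * len(normal) + [1] * len(faulty)
        clf = SVC(kernel="rbf").fit(X, y)                   # step 3: model training
        print(clf.predict([window_features(faulty[0])]))    # step 4: model testing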

  11. Cognitive task load in a naval ship control centre: from identification to prediction.

    PubMed

    Grootjen, M; Neerincx, M A; Veltman, J A

    Deployment of information and communication technology will lead to further automation of control centre tasks and an increasing amount of information to be processed. A method for establishing adequate levels of cognitive task load for the operators in such complex environments has been developed. It is based on a model distinguishing three load factors: time occupied, task-set switching, and level of information processing. Application of the method resulted in eight scenarios for eight extremes of task load (i.e. low and high values for each load factor). These scenarios were performed by 13 teams in a high-fidelity control centre simulator of the Royal Netherlands Navy. The results show that the method provides good prediction of the task load that will actually appear in the simulator. The model allowed identification of under- and overload situations showing negative effects on operator performance corresponding to controlled experiments in a less realistic task environment. Tools proposed to keep the operator at an optimum task load are (adaptive) task allocation and interface support.

  12. Method and apparatus for filtering visual documents

    NASA Technical Reports Server (NTRS)

    Rorvig, Mark E. (Inventor); Shelton, Robert O. (Inventor)

    1993-01-01

    A method and apparatus for producing an abstract or condensed version of a visual document is presented. The frames comprising the visual document are first sampled to reduce the number of frames required for processing. The frames are then subjected to a structural decomposition process that reduces all information in each frame to a set of values. These values are in turn normalized and further combined to produce only one information content value per frame. The information content values of these frames are then compared to a selected distribution cutoff point. This effectively selects those values at the tails of a normal distribution, thus filtering key frames from their surrounding frames. The value for each frame is then compared with the value from the previous frame, and the respective frame is finally stored only if the values are significantly different. The method filters or compresses a visual document with a reduction in digital storage on the ratio of up to 700 to 1 or more, depending on the content of the visual document being filtered.
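
    Once each frame has been collapsed to a single value, the filtering logic reduces to a few lines. The sketch below illustrates the tail-selection and successive-difference steps; the cutoffs are illustrative assumptions, and the structural decomposition that produces the per-frame values is omitted.

        import numpy as np

        def filter_frames(values, tail=0.05, min_diff=0.5):
            """Keep frames whose value falls in either tail of the distribution,
            then drop a frame if its value is too close to the last kept one."""
            values = np.asarray(values, dtype=float)
            lo, hi = np.quantile(values, [tail, 1.0 - tail])
            kept, last = [], None
            for i, v in enumerate(values):
                if lo < v < hi:
                    continue                      # not a tail (key) frame
                if last is None or abs(v - last) >= min_diff:
                    kept.append(i)                # significantly different: store
                    last = v
            return kept

        frame_values = np.random.default_rng(3).normal(size=500)
        key_frames = filter_frames(frame_values)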

  13. Optical correlation identification technology applied in underwater laser imaging target identification

    NASA Astrophysics Data System (ADS)

    Yao, Guang-tao; Zhang, Xiao-hui; Ge, Wei-long

    2012-01-01

    Underwater laser imaging is an effective method of detecting short-range targets and an important complement to sonar detection. With the development of underwater laser imaging and underwater vehicle technology, automatic underwater target identification has attracted increasing attention and remains a research difficulty in underwater optical imaging information processing. Today, automatic underwater target identification based on optical imaging is usually realized with digital circuits and software programming, which makes algorithm realization and control very flexible. However, optical imaging produces 2D or even 3D information, so the amount of data to process is large; purely digital electronic hardware therefore needs a long identification time and can hardly meet the demands of real-time identification. Parallel computer processing can improve identification speed, but at the cost of increased complexity, size, and power consumption. This paper applies optical correlation identification technology to automatic underwater target identification. Optical correlation identification exploits the Fourier-transform property of a Fourier lens, which transforms image information on nanosecond timescales, and optical spatial interconnection computing, which is parallel, fast, high-capacity, and high-resolution; combined with the flexibility of digital circuits for computation and control, this yields a hybrid optoelectronic identification mode. We derive the theoretical formulation of correlation identification, analyze its principle, and write a MATLAB simulation program. Using single-frame images obtained by underwater range-gated laser imaging, we identify and locate targets at different positions, effectively improving the speed and orientation efficiency of target identification and preliminarily validating the feasibility of the method.
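
    The digital counterpart of the optical computation is a matched-filter correlation evaluated through the Fourier transform, which is what a 4-f optical correlator performs in parallel. The generic sketch below illustrates the principle; it is not the authors' MATLAB program.

        import numpy as np

        def correlate(scene, template):
            """Correlation plane = inverse FFT of (scene spectrum x conjugate
            template spectrum); its peak locates the target in the scene."""
            S = np.fft.fft2(scene)
            H = np.conj(np.fft.fft2(template, s=scene.shape))
            plane = np.abs(np.fft.ifft2(S * H))
            return np.unravel_index(np.argmax(plane), plane.shape), plane

        rng = np.random.default_rng(4)
        scene = rng.random((128, 128))
        target = scene[40:56, 60:76]        # a 16x16 patch acting as the target
        (row, col), plane = correlate(scene, target)
        print(row, col)                     # peak at (40, 60): the target position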

  14. Bayesian networks and information theory for audio-visual perception modeling.

    PubMed

    Besson, Patricia; Richiardi, Jonas; Bourdin, Christophe; Bringoux, Lionel; Mestre, Daniel R; Vercher, Jean-Louis

    2010-09-01

    Thanks to their different senses, human observers acquire multiple information coming from their environment. Complex cross-modal interactions occur during this perceptual process. This article proposes a framework to analyze and model these interactions through a rigorous and systematic data-driven process. This requires considering the general relationships between the physical events or factors involved in the process, not only in quantitative terms, but also in terms of the influence of one factor on another. We use tools from information theory and probabilistic reasoning to derive relationships between the random variables of interest, where the central notion is that of conditional independence. Using mutual information analysis to guide the model elicitation process, a probabilistic causal model encoded as a Bayesian network is obtained. We exemplify the method by using data collected in an audio-visual localization task for human subjects, and we show that it yields a well-motivated model with good predictive ability. The model elicitation process offers new prospects for the investigation of the cognitive mechanisms of multisensory perception.
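
    As a toy illustration of letting mutual information guide model elicitation (the variables and data are invented stand-ins for discretized trial measurements, not the study's recordings):

        import numpy as np
        from sklearn.metrics import mutual_info_score

        rng = np.random.default_rng(8)
        audio_loc = rng.integers(0, 4, 500)                       # audio cue position
        visual_loc = (audio_loc + rng.integers(-1, 2, 500)) % 4   # correlated visual cue
        response = (visual_loc + rng.integers(0, 2, 500)) % 4     # subject's localization

        # High pairwise mutual information flags candidate edges for the
        # Bayesian network; near-zero values suggest conditional independence.
        for name, var in [("audio", audio_loc), ("visual", visual_loc)]:
            print(name, "-> response:", mutual_info_score(var, response))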

  15. Cybernetic Basis and System Practice of Remote Sensing and Spatial Information Science

    NASA Astrophysics Data System (ADS)

    Tan, X.; Jing, X.; Chen, R.; Ming, Z.; He, L.; Sun, Y.; Sun, X.; Yan, L.

    2017-09-01

    Cybernetics provides a new set of ideas and methods for the study of modern science, and it has been fully applied in many areas. However, few researchers have introduced cybernetics into the field of remote sensing. Based on the imaging process of a remote sensing system, this paper introduces cybernetics into remote sensing and establishes a space-time closed-loop control theory for its actual operation. This makes the processing of spatial information coherent and improves the comprehensive efficiency of spatial information from acquisition, processing, and transformation to application. We describe the application of cybernetics not only to remote sensing platform control, sensor control, and data processing control, but also to control of the whole remote sensing imaging process: information from the output is fed back to the input to control the efficient operation of the entire system. This combination of cybernetics and remote sensing science will raise remote sensing science to a higher level.

  16. Research on polarization imaging information parsing method

    NASA Astrophysics Data System (ADS)

    Yuan, Hongwu; Zhou, Pucheng; Wang, Xiaolong

    2016-11-01

    Polarization information parsing plays an important role in polarization imaging detection. This paper focuses on methods for parsing polarization information. First, the general process of polarization information parsing is given, mainly comprising polarization image preprocessing, calculation of multiple polarization parameters, polarization image fusion, and polarization image tracking. Research achievements for each stage are then presented. For polarization image preprocessing, a polarization image registration method based on maximum mutual information is designed; experiments show that this method improves registration precision and satisfies the needs of polarization information parsing. For the calculation of multiple polarization parameters, an omnidirectional polarization inversion model is built, from which a variety of polarization parameter images are obtained with clearly improved inversion precision. For polarization image fusion, an adaptive optimal fusion method for multiple polarization parameters is given using fuzzy integrals and sparse representation, and target detection in complex scenes is completed using a clustering image segmentation algorithm based on fractal characteristics. For polarization image tracking, a mean-shift tracking algorithm fused with auxiliary particle filtering on polarization image characteristics is put forward to achieve smooth tracking of moving targets. Finally, the polarization information parsing method is applied to the polarization imaging detection of typical targets such as camouflaged targets, fog, and latent fingerprints.
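
    For the parameter-calculation stage, the standard four-angle linear Stokes inversion gives a flavor of what such parameter images contain; this textbook formulation is a simpler stand-in for the omnidirectional inversion model mentioned above.

        import numpy as np

        def polarization_parameters(i0, i45, i90, i135):
            """Linear Stokes parameters and derived images from four intensity
            images taken through a polarizer at 0/45/90/135 degrees."""
            s0 = i0 + i90                      # total intensity
            s1 = i0 - i90
            s2 = i45 - i135
            dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)  # degree of linear polarization
            aop = 0.5 * np.arctan2(s2, s1)                        # angle of polarization (rad)
            return s0, s1, s2, dolp, aop

        rng = np.random.default_rng(5)
        i0, i45, i90, i135 = (rng.random((32, 32)) for _ in range(4))
        s0, s1, s2, dolp, aop = polarization_parameters(i0, i45, i90, i135)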

  17. Pyrochemical and Dry Processing Methods Program. A selected bibliography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDuffie, H.F.; Smith, D.H.; Owen, P.T.

    1979-03-01

    This selected bibliography with abstracts was compiled to provide information support to the Pyrochemical and Dry Processing Methods (PDPM) Program sponsored by DOE and administered by the Argonne National Laboratory. Objectives of the PDPM Program are to evaluate nonaqueous methods of reprocessing spent fuel as a route to the development of proliferation-resistant and diversion-resistant methods for widespread use in the nuclear industry. Emphasis was placed on the literature indexed in the ERDA--DOE Energy Data Base (EDB). The bibliography includes indexes to authors, subject descriptors, EDB subject categories, and titles.

  18. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    PubMed

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes must be weighed to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index and integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and combines engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, the paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.
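
    The information-entropy component can be made concrete with the standard entropy-weight calculation, sketched below on invented scores; the paper's full method also folds in reliability theory and the qualitative assessments, which are omitted here.

        import numpy as np

        def entropy_weights(scores):
            """Objective index weights by information entropy: indexes whose
            scores vary more across schemes carry more information and weight.
            scores: (n_schemes, n_indexes), larger-is-better, all positive."""
            p = scores / scores.sum(axis=0)            # normalize each index column
            k = 1.0 / np.log(scores.shape[0])
            e = -k * (p * np.log(p)).sum(axis=0)       # entropy per index
            d = 1.0 - e                                # degree of divergence
            return d / d.sum()

        # Four candidate schemes scored on cost, progress, quality, safety.
        scores = np.array([[0.7, 0.8, 0.9, 0.6],
                           [0.9, 0.6, 0.7, 0.8],
                           [0.6, 0.9, 0.8, 0.7],
                           [0.8, 0.7, 0.6, 0.9]])
        w = entropy_weights(scores)
        ranking = np.argsort(-(scores @ w))            # synthesis score, best first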

  19. Experiment research on inertia-aided adaptive electronic image stabilization of optical stable platform

    NASA Astrophysics Data System (ADS)

    Lu, Xiaodong; Wu, Tianze; Zhou, Jun; Zhao, Bin; Ma, Xiaoyuan; Tang, Xiucheng

    2016-03-01

    An electronic image stabilization method incorporating inertial information, which can compensate for the coupling interference caused by the pitch-yaw movement of an optical stabilized platform system, is proposed in this paper. First, the mechanisms of coning rotation and lever-arm translation of the line of sight (LOS) during stabilization on a moving carrier are analyzed, and a mathematical model describing the relationship between the LOS rotation angle and the platform attitude angle is derived. The image spin angle caused by coning rotation is then estimated using the inertial information. Furthermore, an adaptive block-matching method based on image edges and corner points is proposed to smooth the jitter created by the lever-arm translation; this method optimizes the matching process and strategies. Finally, hardware-in-the-loop simulation results verify the effectiveness and real-time performance of the proposed method.

  20. Multifractal analysis of information processing in hippocampal neural ensembles during working memory under Δ9-tetrahydrocannabinol administration

    PubMed Central

    Fetterhoff, Dustin; Opris, Ioan; Simpson, Sean L.; Deadwyler, Sam A.; Hampson, Robert E.; Kraft, Robert A.

    2014-01-01

    Background Multifractal analysis quantifies the time-scale-invariant properties in data by describing the structure of variability over time. By applying this analysis to hippocampal interspike interval sequences recorded during performance of a working memory task, a measure of long-range temporal correlations and multifractal dynamics can reveal single neuron correlates of information processing. New method Wavelet leaders-based multifractal analysis (WLMA) was applied to hippocampal interspike intervals recorded during a working memory task. WLMA can be used to identify neurons likely to exhibit information processing relevant to operation of brain–computer interfaces and nonlinear neuronal models. Results Neurons involved in memory processing (“Functional Cell Types” or FCTs) showed a greater degree of multifractal firing properties than neurons without task-relevant firing characteristics. In addition, previously unidentified FCTs were revealed because multifractal analysis suggested further functional classification. The cannabinoid-type 1 receptor partial agonist, tetrahydrocannabinol (THC), selectively reduced multifractal dynamics in FCT neurons compared to non-FCT neurons. Comparison with existing methods WLMA is an objective tool for quantifying the memory-correlated complexity represented by FCTs that reveals additional information compared to classification of FCTs using traditional z-scores to identify neuronal correlates of behavioral events. Conclusion z-Score-based FCT classification provides limited information about the dynamical range of neuronal activity characterized by WLMA. Increased complexity, as measured with multifractal analysis, may be a marker of functional involvement in memory processing. The level of multifractal attributes can be used to differentially emphasize neural signals to improve computational models and algorithms underlying brain–computer interfaces. PMID:25086297

  1. Privacy-Related Context Information for Ubiquitous Health

    PubMed Central

    Nykänen, Pirkko; Ruotsalainen, Pekka

    2014-01-01

    Background Ubiquitous health has been defined as a dynamic network of interconnected systems. A system is composed of one or more information systems, their stakeholders, and the environment. These systems offer health services to individuals and thus implement ubiquitous computing. Privacy is the key challenge for ubiquitous health because of autonomous processing, rich contextual metadata, lack of predefined trust among participants, and the business objectives. Additionally, regulations and policies of stakeholders may be unknown to the individual. Context-sensitive privacy policies are needed to regulate information processing. Objective Our goal was to analyze privacy-related context information and to define the corresponding components and their properties that support privacy management in ubiquitous health. These properties should describe the privacy issues of information processing. With components and their properties, individuals can define context-aware privacy policies and set their privacy preferences that can change in different information-processing situations. Methods Scenarios and user stories are used to analyze typical activities in ubiquitous health to identify main actors, goals, tasks, and stakeholders. Context arises from an activity and, therefore, we can determine different situations, services, and systems to identify properties for privacy-related context information in information-processing situations. Results Privacy-related context information components are situation, environment, individual, information technology system, service, and stakeholder. Combining our analyses and previously identified characteristics of ubiquitous health, more detailed properties for the components are defined. Properties define explicitly what context information for different components is needed to create context-aware privacy policies that can control, limit, and constrain information processing. With properties, we can define, for example, how data can be processed or how components are regulated or in what kind of environment data can be processed. Conclusions This study added to the vision of ubiquitous health by analyzing information processing from the viewpoint of an individual’s privacy. We learned that health and wellness-related activities may happen in several environments and situations with multiple stakeholders, services, and systems. We have provided new knowledge regarding privacy-related context information and corresponding components by analyzing typical activities in ubiquitous health. With the identified components and their properties, individuals can define their personal preferences on information processing based on situational information, and privacy services can capture privacy-related context of the information-processing situation. PMID:25100084

  2. Evaluation of Alternative Conceptual Models Using Interdisciplinary Information: An Application in Shallow Groundwater Recharge and Discharge

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Bajcsy, P.; Valocchi, A. J.; Kim, C.; Wang, J.

    2007-12-01

    Natural systems are complex; thus extensive data are needed for their characterization. However, data acquisition is expensive, so we develop models using sparse, uncertain information. When all uncertainties in the system are considered, the number of alternative conceptual models is large. Traditionally, the development of a conceptual model has relied on subjective professional judgment. Good judgment is based on experience in coordinating and understanding auxiliary information which is correlated with the model but difficult to quantify in the mathematical model. For example, groundwater recharge and discharge (R&D) processes are known to relate to multiple information sources such as soil type, river and lake location, irrigation patterns and land use. Although hydrologists have been trying to understand and model the interaction between each of these information sources and R&D processes, it is extremely difficult to quantify their correlations using a universal approach because of the complexity of the processes and their spatiotemporal distribution and uncertainty. There is currently no single method capable of estimating R&D rates and patterns for all practical applications. Chamberlin (1890) recommended the use of "multiple working hypotheses" (alternative conceptual models) for rapid advancement in understanding of applied and theoretical problems. Therefore, cross-analyzing R&D rates and patterns from various estimation methods and related field information will likely be superior to using only a single estimation method. We have developed the Pattern Recognition Utility (PRU) to help GIS users recognize spatial patterns from noisy 2D images. This GIS plug-in utility has been applied to help hydrogeologists establish alternative R&D conceptual models more efficiently than conventional methods allow. The PRU uses numerical methods and image processing algorithms to estimate and visualize shallow R&D patterns and rates, and can provide a fast initial estimate prior to planning labor-intensive and time-consuming field R&D measurements. Furthermore, the Spatial Pattern 2 Learn (SP2L) tool was developed to cross-analyze results from the PRU with ancillary field information, such as land coverage, soil type, topographic maps and previous estimates. The learning process of SP2L cross-examines each initially recognized R&D pattern with the ancillary spatial dataset, and then calculates a quantifiable reliability index for each R&D map using a supervised machine learning technique called a decision tree. This Java-based software package is capable of generating alternative R&D maps if the user decides to apply certain conditions recognized by the learning process. The reliability indices from SP2L will improve the traditionally subjective approach to initiating conceptual models by providing objectively quantifiable conceptual bases for further probabilistic and uncertainty analyses. Both the PRU and SP2L have been designed to be user-friendly and universal utilities for pattern recognition and learning to improve model predictions from sparse measurements by computer-assisted integration of spatially dense geospatial image data and machine learning of model dependencies.
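
    A toy sketch of the SP2L idea: cross-examine a recognized R&D pattern against ancillary attributes with a decision tree and read the agreement as a reliability index. The attributes, labels, and index definition below are invented stand-ins, not the package's actual interface.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        # Hypothetical per-cell table: ancillary attributes plus the initially
        # recognized R&D label (1 = recharge, 0 = discharge).
        rng = np.random.default_rng(7)
        ancillary = np.column_stack([
            rng.integers(0, 5, 1000),     # land-cover class
            rng.integers(0, 3, 1000),     # soil class
            rng.random(1000),             # normalized distance to nearest river
        ])
        rnd_label = (ancillary[:, 2] < 0.4).astype(int)

        tree = DecisionTreeClassifier(max_depth=4).fit(ancillary, rnd_label)
        # Agreement between the learned rules and the recognized pattern serves
        # here as a per-map reliability index for ranking alternative models.
        reliability_index = tree.score(ancillary, rnd_label)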

  3. Results of research on development of an intellectual information system of bankruptcy risk assessment of the enterprise

    NASA Astrophysics Data System (ADS)

    Telipenko, E.; Chernysheva, T.; Zakharova, A.; Dumchev, A.

    2015-10-01

    The article presents research results on the development of a knowledge base for an intellectual information system for bankruptcy risk assessment of an enterprise. The analysis of the knowledge base development process is described; the main stages, some problems, and their solutions are given. The article introduces a connectionist model for bankruptcy risk assessment based on the analysis of industrial enterprises' financial accounts. The basis for this connectionist model is a three-layer perceptron trained with the error backpropagation algorithm. The knowledge base for the intellectual information system consists of processed information and the processing method represented as the connectionist model. The article presents the structure of the intellectual information system, the knowledge base, and the information processing algorithm for neural network training. Mean values of 10 indexes are given for industrial enterprises; with their help it is possible to carry out a financial analysis of industrial enterprises and correctly identify the current situation for well-timed managerial decisions. Results are given for neural network testing on data from both bankrupt and financially strong enterprises that were not included in the training and test sets.
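
    A minimal scikit-learn sketch of the described architecture: a three-layer perceptron (one hidden layer) trained by error backpropagation on 10 financial indexes. The data, hidden-layer width, and preprocessing are assumptions, not the authors' configuration.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.preprocessing import StandardScaler

        # Hypothetical training table: 10 financial indexes per enterprise,
        # label 1 = bankrupt, 0 = financially strong.
        rng = np.random.default_rng(6)
        X = rng.normal(size=(200, 10))
        y = (X[:, :3].sum(axis=1) + 0.5 * rng.normal(size=200) < 0).astype(int)

        scaler = StandardScaler().fit(X)
        clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
        clf.fit(scaler.transform(X), y)
        risk = clf.predict_proba(scaler.transform(X[:5]))[:, 1]   # bankruptcy risk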

  4. Cognition and Health Literacy in Older Adults’ Recall of Self-Care Information

    PubMed Central

    Madison, Anna; Gao, Xuefei; Graumlich, James F.; Conner-Garcia, Thembi; Murray, Michael D.; Stine-Morrow, Elizabeth A. L.; Morrow, Daniel G.

    2017-01-01

    Purpose of the Study: Health literacy is associated with health outcomes presumably because it influences the understanding of information needed for self-care. However, little is known about the language comprehension mechanisms that underpin health literacy. Design and Methods: We explored the relationship between a commonly used measure of health literacy (Short Test of Functional Health Literacy in Adults [STOFHLA]) and comprehension of health information among 145 older adults. Results: Results showed that performance on the STOFHLA was associated with recall of health information. Consistent with the Process-Knowledge Model of Health Literacy, mediation analysis showed that both processing capacity and knowledge mediated the association between health literacy and recall of health information. In addition, knowledge moderated the effects of processing capacity limits, such that processing capacity was less likely to be associated with recall for older adults with higher levels of knowledge. Implications: These findings suggest that knowledge contributes to health literacy and can compensate for deficits in processing capacity to support comprehension of health information among older adults. The implications of these findings for improving patient education materials for older adults with inadequate health literacy are discussed. PMID:26209450

  5. Uncovering the problem-solving process: cued retrospective reporting versus concurrent and retrospective reporting.

    PubMed

    van Gog, Tamara; Paas, Fred; van Merriënboer, Jeroen J G; Witte, Puk

    2005-12-01

    This study investigated the amounts of problem-solving process information ("action," "why," "how," and "metacognitive") elicited by means of concurrent, retrospective, and cued retrospective reporting. In a within-participants design, 26 participants completed electrical circuit troubleshooting tasks under different reporting conditions. The method of cued retrospective reporting used the original computer-based task and a superimposed record of the participant's eye fixations and mouse-keyboard operations as a cue for retrospection. Cued retrospective reporting (with the exception of why information) and concurrent reporting (with the exception of metacognitive information) resulted in a higher number of codes on the different types of information than did retrospective reporting.

  6. Reconstruction method for data protection in telemedicine systems

    NASA Astrophysics Data System (ADS)

    Buldakova, T. I.; Suyatinov, S. I.

    2015-03-01

    In this report, an approach to protecting transmitted data by creating paired symmetric keys for the sensor and the receiver is offered. Since biosignals are unique to each person, suitable processing of them yields the information needed to create cryptographic keys. The processing is based on reconstructing a mathematical model that generates time series diagnostically equivalent to the initial biosignals. Information about the model is transmitted to the receiver, where the physiological time series are restored using the reconstructed model. Thus, the information about the structure and parameters of the biosystem model obtained in the reconstruction process can be used not only for diagnostics of the biosystem but also for protecting transmitted data in telemedicine complexes.
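
    A schematic sketch of the key-generation idea: fit a simple generating model to the biosignal and hash its quantized parameters into a symmetric key, so both ends derive the same key from the same signal. The autoregressive model and quantization step below are stand-ins for the reconstruction procedure described in the report.

        import hashlib
        import numpy as np

        def ar_coefficients(signal, order=6):
            """Least-squares autoregressive fit: a stand-in for the reconstructed
            model whose parameters both ends can derive from the biosignal."""
            X = np.column_stack([signal[i:len(signal) - order + i] for i in range(order)])
            y = signal[order:]
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            return coef

        def derive_key(signal, order=6, decimals=3):
            """Hash quantized model parameters into a 256-bit symmetric key;
            quantization makes the bytes robust to small numeric noise."""
            coef = np.round(ar_coefficients(signal, order), decimals)
            return hashlib.sha256(coef.tobytes()).digest()

        t = np.linspace(0, 4 * np.pi, 1024)
        biosignal = np.sin(t) + 0.3 * np.sin(3.1 * t)   # toy stand-in for a biosignal
        key = derive_key(biosignal)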

  7. Fuzzy simulation in concurrent engineering

    NASA Technical Reports Server (NTRS)

    Kraslawski, A.; Nystrom, L.

    1992-01-01

    Concurrent engineering is becoming a very important practice in manufacturing. A problem in concurrent engineering is the uncertainty associated with the values of the input variables and operating conditions. The problem discussed in this paper concerns the simulation of processes where the raw materials and the operational parameters possess fuzzy characteristics. The fuzzy input information is processed using the vertex method and the commercial simulation packages POLYMATH and GEMS. Examples are presented to illustrate the usefulness of the method in the simulation of chemical engineering processes.

  8. Methods and Apparatus for Autonomous Robotic Control

    NASA Technical Reports Server (NTRS)

    Gorshechnikov, Anatoly (Inventor); Livitz, Gennady (Inventor); Versace, Massimiliano (Inventor); Palma, Jesse (Inventor)

    2017-01-01

    Sensory processing of visual, auditory, and other sensor information (e.g., visual imagery, LIDAR, RADAR) is conventionally based on "stovepiped," or isolated processing, with little interactions between modules. Biological systems, on the other hand, fuse multi-sensory information to identify nearby objects of interest more quickly, more efficiently, and with higher signal-to-noise ratios. Similarly, examples of the OpenSense technology disclosed herein use neurally inspired processing to identify and locate objects in a robot's environment. This enables the robot to navigate its environment more quickly and with lower computational and power requirements.

  9. Increasing patient safety and efficiency in transfusion therapy using formal process definitions.

    PubMed

    Henneman, Elizabeth A; Avrunin, George S; Clarke, Lori A; Osterweil, Leon J; Andrzejewski, Chester; Merrigan, Karen; Cobleigh, Rachel; Frederick, Kimberly; Katz-Bassett, Ethan; Henneman, Philip L

    2007-01-01

    The administration of blood products is a common, resource-intensive, and potentially problem-prone area that may place patients at elevated risk in the clinical setting. Much of the emphasis in transfusion safety has been targeted toward quality control measures in laboratory settings where blood products are prepared for administration, as well as toward automation of certain laboratory processes. In contrast, the process of transfusing blood in the clinical setting (i.e., at the point of care) has remained essentially unchanged over the past several decades. Many of the currently available methods for improving the quality and safety of blood transfusions in the clinical setting rely on informal process descriptions, such as flow charts and medical algorithms, to describe medical processes. These informal descriptions, although useful in presenting an overview of standard processes, can be ambiguous or incomplete; for example, they often describe only the standard process and leave out how to handle possible failures or exceptions. One alternative is to use formal process definitions, which can serve as the basis for a variety of analyses because they offer precision in representing all possible ways a process can be carried out, in both standard and exceptional situations. Formal process definitions have not previously been used to describe and improve medical processes, nor has their use to prospectively identify potential errors in the transfusion process been reported. The purpose of this article is to introduce the concept of formally defining processes and to describe how formal definitions of blood transfusion processes can be used to detect and correct transfusion process errors in ways not currently possible using existing quality improvement methods.
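
    As a toy illustration of the contrast drawn above, the following sketch (our construction, not the authors' process-definition notation) encodes one verification step of a hypothetical transfusion process so that the exceptional path is explicit rather than omitted, as it typically would be in a flow chart:

```python
class TransfusionError(Exception):
    """Raised when a safety check in the formally defined process fails."""

def verify_identity(patient, unit):
    # The exceptional situation is made explicit: a patient/unit
    # mismatch aborts the process instead of being left undefined.
    if patient["id"] != unit["patient_id"]:
        raise TransfusionError("patient/unit identity mismatch")

def transfuse(patient, unit):
    try:
        verify_identity(patient, unit)
        # ... administration and monitoring steps would follow here ...
        return "completed"
    except TransfusionError as err:
        return f"aborted: {err}"

print(transfuse({"id": "P1"}, {"patient_id": "P2"}))  # aborted: ... mismatch
```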

  10. How are information seeking, scanning, and processing related to beliefs about the roles of genetics and behavior in cancer causation?

    PubMed Central

    Waters, Erika A.; Wheeler, Courtney; Hamilton, Jada G.

    2016-01-01

    Background Understanding that cancer is caused by both genetic and behavioral risk factors is an important component of genomic literacy. However, a considerable percentage of people in the U.S. do not endorse such multifactorial beliefs. Methods Using nationally representative cross-sectional data from the U.S. Health Information National Trends Survey (N=2,529), we examined how information seeking, information scanning, and key information processing characteristics were associated with endorsing a multifactorial model of cancer causation. Results Multifactorial beliefs about cancer were more common among respondents who engaged in cancer information scanning (p=.001), were motivated to process health information (p=.005), and who reported a family history of cancer (p=.0002). Respondents who reported having previous negative information seeking experiences had lower odds of endorsing multifactorial beliefs (p=.01). Multifactorial beliefs were not associated with cancer information seeking, trusting cancer information obtained from the Internet, trusting cancer information from a physician, self-efficacy for obtaining cancer information, numeracy, or being aware of direct-to-consumer genetic testing (ps>.05). Conclusion Gaining additional understanding of how people access, process, and use health information will be critical for the continued development and dissemination of effective health communication interventions and for the further translation of genomics research to public health and clinical practice. PMID:27661291

  11. Ethnographic process evaluation in primary care: explaining the complexity of implementation.

    PubMed

    Bunce, Arwen E; Gold, Rachel; Davis, James V; McMullen, Carmit K; Jaworski, Victoria; Mercer, MaryBeth; Nelson, Christine

    2014-12-05

    The recent growth of implementation research in care delivery systems has led to a renewed interest in methodological approaches that deliver not only intervention outcome data but also deep understanding of the complex dynamics underlying the implementation process. We suggest that an ethnographic approach to process evaluation, when informed by and integrated with quantitative data, can provide this nuanced insight into intervention outcomes. The specific methods used in such ethnographic process evaluations are rarely presented in detail; our objective is to stimulate a conversation around the successes and challenges of specific data collection methods in health care settings. We use the example of a translational clinical trial among 11 community clinics in Portland, OR that are implementing an evidence-based, health-information technology (HIT)-based intervention focused on patients with diabetes. Our ethnographic process evaluation employed weekly diaries by clinic-based study employees, observation, informal and formal interviews, document review, surveys, and group discussions to identify barriers and facilitators to implementation success, provide insight into the quantitative study outcomes, and uncover lessons potentially transferable to other implementation projects. These methods captured the depth and breadth of factors contributing to intervention uptake, while minimizing disruption to clinic work and supporting mid-stream shifts in implementation strategies. A major challenge is the amount of dedicated researcher time required. The deep understanding of the 'how' and 'why' behind intervention outcomes that can be gained through an ethnographic approach improves the credibility and transferability of study findings. We encourage others to share their own experiences with ethnography in implementation evaluation and health services research, and to consider adapting the methods and tools described here for their own research.

  12. Electron-driven processes in polyatomic molecules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKoy, Vincent

    2017-03-20

    This project developed and applied scalable computational methods to obtain information about low-energy electron collisions with larger polyatomic molecules. Such collisions are important in modeling radiation damage to living systems, in spark ignition and combustion, and in plasma processing of materials. The focus of the project was to develop efficient methods that could be used to obtain both fundamental scientific insights and data of practical value to applications.

  13. Stokes and anti-Stokes characteristics of anaerobic and aerobic bacteria at excitation of fluorescence by low-intensity red light: I. Research on anaerobic bacteria

    NASA Astrophysics Data System (ADS)

    Masychev, Victor I.; Alexandrov, Michail T.

    2000-04-01

    Fluorescence diagnostics methods are usually applied to biopsy or to the photodynamic therapy of tumors. This research presents information on a modified method of fluorescence diagnostics of inflammatory diseases. Anaerobic microorganisms are often the cause of these pathological processes; these microorganisms also accompany dysbiotic processes in the intestines.

  14. Conceptual design of industrial process displays.

    PubMed

    Pedersen, C R; Lind, M

    1999-11-01

    Today, process displays used in industry are often designed on the basis of piping and instrumentation diagrams, without any method of ensuring that the needs of the operators are fulfilled; a method for a systematic approach to the design of process displays is therefore needed. This paper discusses aspects of process display design taking into account both the designer's and the operator's points of view. Three aspects are emphasized: the operator tasks, the display content, and the display form. The distinction between these three aspects is the basis for proposing an outline of a display design method that matches the industrial practice of modular plant design and satisfies the need for reusability of display design solutions. The main considerations in display design in industry are to specify the operator's activities in detail, to extract the information the operators need from the plant design specification and documentation, and finally to present this information. The form of the display is selected from existing standardized display elements such as trend curves, mimic diagrams, and ecological interfaces. Inventing new display elements requires further knowledge: knowledge about the basic visual means of presenting information and about how humans perceive and interpret these means and their combinations. This knowledge is required for the systematic selection of graphical items for a given display content. The industrial part of the method is first illustrated by a simple example from a plant with batch processes. The method is then applied to develop a supervisory display for a condenser system in a nuclear power plant. The differences between the continuous plant domain of power production and the batch processes of the example are analysed, and broad categories of display types are proposed. The problems involved in the specification and invention of a supervisory display are analysed and conclusions are drawn from them. It is concluded that the proposed design method provides a framework for the progress of the display design and is useful in pinpointing the actual problems: it reduced the number of existing displays that could fulfil the requirements of the supervision task, and it provided a framework for dealing with the problems involved in inventing new displays based on structured analysis. However, a systematic approach to display invention still needs consideration.

  15. Elaboration and formalization of current scientific knowledge of risks and preventive measures illustrated by colorectal cancer.

    PubMed

    Giorgi, R; Gouvernet, J; Dufour, J; Degoulet, P; Laugier, R; Quilichini, F; Fieschi, M

    2001-01-01

    To present the method used to elaborate and formalize current scientific knowledge so as to provide physicians with tools, available on the Internet, that enable them to evaluate individual patient risk and give personalized recommendations for prevention or early screening. The approach suggested in this article is in line with medical procedures based on levels of evidence (evidence-based medicine). A cyclical process for developing recommendations allows us to quickly incorporate current scientific information; at each phase, the analysis is reevaluated by experts in the field collaborating on the project. The information is formalized through the use of levels of evidence and grades of recommendation. The GLIF model is used to implement the recommendations as clinical practice guidelines. Incorporating the most current scientific evidence in a cyclical process involves several steps: critical analysis according to the evidence-based medicine method; identification of predictive factors; setting up risk levels; identification of prevention measures; and elaboration of personalized recommendations. The information technology implementation of the clinical practice guideline enables physicians to quickly obtain personalized information for their patients. Cases of colorectal cancer prevention illustrate our approach. Integration of current scientific knowledge is an important process: the delay between the moment new information arrives and the moment the practitioner applies it is thus reduced.

  16. Research on pre-processing of QR Code

    NASA Astrophysics Data System (ADS)

    Sun, Haixing; Xia, Haojie; Dong, Ning

    2013-10-01

    QR codes encode many kinds of information thanks to their advantages: large storage capacity, high reliability, high-speed omnidirectional reading, small printing size, and highly efficient representation of Chinese characters. In order to obtain a clearer binarized image from a complex background and improve the recognition rate of QR codes, this paper investigates pre-processing methods for QR (Quick Response) codes and presents algorithms and results of image pre-processing for QR code recognition. The conventional approach is improved by modifying Sauvola's adaptive thresholding method. Additionally, QR code extraction that adapts to different image sizes and a flexible image-correction approach are introduced, improving the efficiency and accuracy of QR code image processing.
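
    For orientation, Sauvola's adaptive thresholding, the method the authors start from, is available in scikit-image; the snippet below is an illustrative sketch with an assumed input file and window parameters, not the paper's modified algorithm.

```python
import numpy as np
from skimage import color, io
from skimage.filters import threshold_sauvola

image = color.rgb2gray(io.imread("qr_photo.png"))   # hypothetical RGB input
thresh = threshold_sauvola(image, window_size=25, k=0.2)  # local threshold map
binary = image > thresh                             # True = light background
io.imsave("qr_binary.png", (binary * 255).astype(np.uint8))
```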

  17. Methods and apparatus of analyzing electrical power grid data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hafen, Ryan P.; Critchlow, Terence J.; Gibson, Tara D.

    Apparatus and methods of processing large-scale data regarding an electrical power grid are described. According to one aspect, a method of processing large-scale data regarding an electrical power grid includes accessing a large-scale data set comprising information regarding an electrical power grid; processing data of the large-scale data set to identify a filter which is configured to remove erroneous data from the large-scale data set; using the filter, removing erroneous data from the large-scale data set; and after the removing, processing data of the large-scale data set to identify an event detector which is configured to identify events of interest in the large-scale data set.
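
    A generic sketch of the filter-then-detect pattern described in the claim, using invented thresholds on phasor frequency data; the patented method derives its filter and event detector from the data rather than fixing them a priori.

```python
import pandas as pd

def clean_and_detect(df, lo=59.0, hi=61.0, roc_limit=0.05):
    """df has columns timestamp and freq (Hz); all thresholds are invented."""
    clean = df[df["freq"].between(lo, hi)].copy()   # drop implausible samples
    clean["roc"] = clean["freq"].diff().abs()       # rate of change per sample
    clean["event"] = clean["roc"] > roc_limit       # flag candidate events
    return clean

data = pd.DataFrame({"timestamp": range(5),
                     "freq": [60.00, 60.01, 72.50, 59.93, 60.02]})
print(clean_and_detect(data))
```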

  18. A Vision-Aided 3D Path Teaching Method before Narrow Butt Joint Welding

    PubMed Central

    Zeng, Jinle; Chang, Baohua; Du, Dong; Peng, Guodong; Chang, Shuhe; Hong, Yuxiang; Wang, Li; Shan, Jiguo

    2017-01-01

    For better welding quality, accurate path teaching for actuators must be achieved before welding. Due to machining errors, assembly errors, deformations, etc., the actual groove position may differ from the predetermined path. Therefore, it is important to recognize the actual groove position using machine vision methods and to perform an accurate path-teaching process. However, during the teaching process for a narrow butt joint, existing machine vision methods may fail because of poor adaptability, low resolution, and lack of 3D information. This paper proposes a 3D path-teaching method for narrow butt joint welding. The method obtains two kinds of visual information nearly simultaneously: the 2D pixel coordinates of the groove under uniform lighting, and 3D point-cloud data of the workpiece surface under cross-line laser lighting. The 3D position and pose between the welding torch and the groove can be calculated after information fusion. The image resolution can reach 12.5 μm. Experiments were carried out at an actuator speed of 2300 mm/min and a groove width of less than 0.1 mm. The results show that this method is suitable for groove recognition before narrow butt joint welding and can be applied in path-teaching fields for 3D complex components. PMID:28492481

  19. A Vision-Aided 3D Path Teaching Method before Narrow Butt Joint Welding.

    PubMed

    Zeng, Jinle; Chang, Baohua; Du, Dong; Peng, Guodong; Chang, Shuhe; Hong, Yuxiang; Wang, Li; Shan, Jiguo

    2017-05-11

    For better welding quality, accurate path teaching for actuators must be achieved before welding. Due to machining errors, assembly errors, deformations, etc., the actual groove position may differ from the predetermined path. Therefore, it is important to recognize the actual groove position using machine vision methods and to perform an accurate path-teaching process. However, during the teaching process for a narrow butt joint, existing machine vision methods may fail because of poor adaptability, low resolution, and lack of 3D information. This paper proposes a 3D path-teaching method for narrow butt joint welding. The method obtains two kinds of visual information nearly simultaneously: the 2D pixel coordinates of the groove under uniform lighting, and 3D point-cloud data of the workpiece surface under cross-line laser lighting. The 3D position and pose between the welding torch and the groove can be calculated after information fusion. The image resolution can reach 12.5 μm. Experiments were carried out at an actuator speed of 2300 mm/min and a groove width of less than 0.1 mm. The results show that this method is suitable for groove recognition before narrow butt joint welding and can be applied in path-teaching fields for 3D complex components.

  20. A Novel Estimator for the Rate of Information Transfer by Continuous Signals

    PubMed Central

    Takalo, Jouni; Ignatova, Irina; Weckström, Matti; Vähäsöyrinki, Mikko

    2011-01-01

    The information transfer rate provides an objective and rigorous way to quantify how much information is being transmitted through a communications channel whose input and output consist of time-varying signals. However, current estimators of information content in continuous signals are typically based on assumptions about the system's linearity and signal statistics, or they require prohibitive amounts of data. Here we present a novel information rate estimator without these limitations that is also optimized for computational efficiency. We validate the method with a simulated Gaussian information channel and demonstrate its performance with two example applications. Information transfer between the input and output signals of a nonlinear system is analyzed using a sensory receptor neuron as the model system. Then, a climate data set is analyzed to demonstrate that the method can be applied to a system based on two outputs generated by interrelated random processes. These analyses also demonstrate that the new method offers consistent performance in situations where classical methods fail. In addition to these examples, the method is applicable to a wide range of continuous time series commonly observed in the natural sciences, economics and engineering. PMID:21494562
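
    For context, the classical linear estimate that such methods aim to improve upon integrates the Shannon capacity formula over frequency; a minimal sketch, assuming one-sided power spectral densities estimated elsewhere and invented values.

```python
import numpy as np

def linear_info_rate(signal_psd, noise_psd, df):
    """Shannon rate in bits/s from one-sided power spectral densities
    sampled at frequency resolution df (Hz): R = sum(log2(1 + SNR)) * df."""
    return float(np.sum(np.log2(1.0 + signal_psd / noise_psd)) * df)

sig = np.array([8.0, 4.0, 2.0, 1.0])    # invented signal PSD values
noise = np.ones_like(sig)               # flat noise floor
print(linear_info_rate(sig, noise, df=1.0), "bits/s")
```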

  1. Study on Mosaic and Uniform Color Method of Satellite Image Fusion in Large Area

    NASA Astrophysics Data System (ADS)

    Liu, S.; Li, H.; Wang, X.; Guo, L.; Wang, R.

    2018-04-01

    Due to improved satellite radiometric resolution, the color differences among multi-temporal satellite remote sensing images, and the large volume of satellite image data, completing the mosaic and color-balancing process for satellite images has always been an important problem in image processing. First, using the bundle uniform color method, the least-squares mosaic method of GXL, and the dodging function, a uniform transition of color and brightness can be achieved across large-area, multi-temporal satellite images. Second, Color Mapping software is used to convert 16-bit mosaic images to 8-bit mosaic images, applying the uniform color method with low-resolution reference images. Finally, qualitative and quantitative analytical methods are used to evaluate the satellite imagery after mosaicking and color balancing. The tests show that the correlation between mosaic images before and after coloring is higher than 95%, that the image information entropy increases, and that texture features are enhanced, as verified by quantitative indexes such as the correlation coefficient and information entropy. Satellite image mosaicking and color processing over a large area has thus been successfully implemented.
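
    One of the quantitative indexes mentioned, image information entropy, is straightforward to compute; a short sketch for 8-bit imagery (our illustration of the index, not the authors' evaluation code).

```python
import numpy as np

def image_entropy(img_u8):
    """Shannon entropy (bits/pixel) of the gray-level histogram of a
    2-D uint8 image."""
    hist = np.bincount(img_u8.ravel(), minlength=256).astype(float)
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

img = (np.random.default_rng(0).random((64, 64)) * 255).astype(np.uint8)
print(round(image_entropy(img), 3))
```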

  2. Health Information System in Primary Health Care: The Challenges and Barriers from Local Providers’ Perspective of an Area in Iran

    PubMed Central

    Yazdi-Feyzabadi, Vahid; Emami, Mozhgan; Mehrolhassani, Mohammad Hossein

    2015-01-01

    Background: The health information system (HIS) is utilized for collecting, processing, storing, and transferring the information required for planning and decision-making at different levels of the health sector in order to provide quality services. In this study, with a view to achieving a high-quality HIS, primary health care (PHC) providers’ perspectives on current challenges and barriers were investigated. Methods: This study was carried out with a qualitative approach using semi-structured, audiotaped focus group discussions (FGDs). One FGD was conducted with 13 Behvarz and health technicians as front-line workers, and the other with 16 personnel including physicians, statisticians, and health professionals working in health centers of the PHC network in KUMS. The discussions were transcribed and then analyzed using the framework analysis method. Results: The identified organizational challenges were categorized into two groups: HIS structure and the current model of PHC in urban areas. Furthermore, the structural challenges were classified into HIS management structure (information system resources, including human resources, supplies, and organizational rules) and the information process. Conclusions: The HIS works effectively and efficiently when there is consistency and integrity among the human, supply, and process aspects. Hence, multifaceted interventions, including strengthening the organizational culture of using information in decisions, eliminating infrastructural obstacles, appointing qualified staff, and investing more in service delivery in urban areas, are the most fundamental requirements for a high-quality HIS in PHC. PMID:26236444

  3. Multi-Connection Pattern Analysis: Decoding the representational content of neural communication.

    PubMed

    Li, Yuanning; Richardson, Robert Mark; Ghuman, Avniel Singh

    2017-11-15

    The lack of multivariate methods for decoding the representational content of interregional neural communication has made it difficult to know what information is represented in distributed brain circuit interactions. Here we present Multi-Connection Pattern Analysis (MCPA), which works by learning mappings between the activity patterns of the populations as a factor of the information being processed. These maps are used to predict the activity from one neural population based on the activity from the other population. Successful MCPA-based decoding indicates the involvement of distributed computational processing and provides a framework for probing the representational structure of the interaction. Simulations demonstrate the efficacy of MCPA in realistic circumstances. In addition, we demonstrate that MCPA can be applied to different signal modalities to evaluate a variety of hypotheses associated with information coding in neural communications. We apply MCPA to fMRI and human intracranial electrophysiological data to provide a proof-of-concept of the utility of this method for decoding individual natural images and faces in functional connectivity data. We further use a MCPA-based representational similarity analysis to illustrate how MCPA may be used to test computational models of information transfer among regions of the visual processing stream. Thus, MCPA can be used to assess the information represented in the coupled activity of interacting neural circuits and probe the underlying principles of information transformation between regions. Copyright © 2017 Elsevier Inc. All rights reserved.
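
    A toy sketch of the core MCPA idea under strong simplifying assumptions (linear maps, simulated Gaussian activity, two conditions): fit one map per condition from region A to region B, then decode a held-out trial by asking which condition's map predicts region B best. This is our paraphrase for illustration, not the published implementation.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_trials, dim = 100, 20
true_maps = {c: rng.normal(size=(dim, dim)) for c in ("faces", "houses")}

def simulate(cond):
    """Simulated paired activity: region B responds to region A through a
    condition-specific linear map plus noise."""
    A = rng.normal(size=(n_trials, dim))
    B = A @ true_maps[cond] + 0.1 * rng.normal(size=(n_trials, dim))
    return A, B

models = {c: Ridge(alpha=1.0).fit(*simulate(c)) for c in true_maps}

A_test, B_test = simulate("faces")               # held-out trials
mse = {c: float(np.mean((m.predict(A_test) - B_test) ** 2))
       for c, m in models.items()}
print("decoded condition:", min(mse, key=mse.get))
```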

  4. Alternatives Assessment Frameworks: Research Needs for the Informed Substitution of Hazardous Chemicals

    PubMed Central

    Jacobs, Molly M.; Malloy, Timothy F.; Tickner, Joel A.; Edwards, Sally

    2015-01-01

    Background Given increasing pressures for hazardous chemical replacement, there is growing interest in alternatives assessment to avoid substituting a toxic chemical with another of equal or greater concern. Alternatives assessment is a process for identifying, comparing, and selecting safer alternatives to chemicals of concern (including those used in materials, processes, or technologies) on the basis of their hazards, performance, and economic viability. Objectives The purposes of this substantive review of alternatives assessment frameworks are to identify consistencies and differences in methods and to outline needs for research and collaboration to advance science policy practice. Methods This review compares methods used in six core components of these frameworks: hazard assessment, exposure characterization, life-cycle impacts, technical feasibility evaluation, economic feasibility assessment, and decision making. Alternatives assessment frameworks published from 1990 to 2014 were included. Results Twenty frameworks were reviewed. The frameworks were consistent in terms of general process steps, but some differences were identified in the end points addressed. Methodological gaps were identified in the exposure characterization, life-cycle assessment, and decision–analysis components. Methods for addressing data gaps remain an issue. Discussion Greater consistency in methods and evaluation metrics is needed but with sufficient flexibility to allow the process to be adapted to different decision contexts. Conclusion Although alternatives assessment is becoming an important science policy field, there is a need for increased cross-disciplinary collaboration to refine methodologies in support of the informed substitution and design of safer chemicals, materials, and products. Case studies can provide concrete lessons to improve alternatives assessment. Citation Jacobs MM, Malloy TF, Tickner JA, Edwards S. 2016. Alternatives assessment frameworks: research needs for the informed substitution of hazardous chemicals. Environ Health Perspect 124:265–280; http://dx.doi.org/10.1289/ehp.1409581 PMID:26339778

  5. Diffusion processes of fragmentary information on scale-free networks

    NASA Astrophysics Data System (ADS)

    Li, Xun; Cao, Lang

    2016-05-01

    Compartmental models of diffusion over contact networks have proven representative of real-life propagation phenomena among interacting individuals. However, there is a broad class of collective spreading mechanisms that departs from compartmental representations, including those for diffusive objects that can fragment and need not be transmitted as a whole. Here, we consider a continuous-state susceptible-infected-susceptible (SIS) model as an ideal limit-case of diffusion processes of fragmentary information on networks, where individuals possess fractions of the information content and update them by selectively exchanging messages with partners in the vicinity. Specifically, we incorporate local information, such as neighbors' node degrees and carried contents, into the individual partner choice, and examine the roles of a variety of such strategies in the information diffusion process, both qualitatively and quantitatively. Our method provides an effective and flexible route for modulating continuous-state diffusion dynamics on networks and has potential in a wide array of practical applications.
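
    A minimal simulation sketch in this spirit (our construction, not the paper's exact dynamics): each node holds a content fraction, and at each step a random node exchanges content with a neighbour chosen with degree-biased probability.

```python
import random
import networkx as nx

def diffuse(G, steps=5000, alpha=1.0, seed=1):
    """Each node i carries a content fraction x[i]; at every step a random
    node exchanges content with one neighbour chosen with probability
    proportional to degree**alpha, and both keep the average."""
    random.seed(seed)
    nodes = list(G)
    x = dict.fromkeys(nodes, 0.0)
    x[nodes[0]] = 1.0                       # a single initially informed node
    for _ in range(steps):
        i = random.choice(nodes)
        nbrs = list(G[i])
        w = [G.degree(j) ** alpha for j in nbrs]   # degree-biased partner choice
        j = random.choices(nbrs, weights=w)[0]
        x[i] = x[j] = 0.5 * (x[i] + x[j])   # content-conserving exchange
    return x

G = nx.barabasi_albert_graph(200, 3, seed=7)
x = diffuse(G)
print(max(x.values()), min(x.values()))     # spread of content fractions
```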

  6. Information filtering via preferential diffusion.

    PubMed

    Lü, Linyuan; Liu, Weiping

    2011-06-01

    Recommender systems have shown great potential in addressing the information overload problem, namely helping users in finding interesting and relevant objects within a huge information space. Some physical dynamics, including the heat conduction process and mass or energy diffusion on networks, have recently found applications in personalized recommendation. Most of the previous studies focus overwhelmingly on recommendation accuracy as the only important factor, while overlooking the significance of diversity and novelty that indeed provide the vitality of the system. In this paper, we propose a recommendation algorithm based on the preferential diffusion process on a user-object bipartite network. Numerical analyses on two benchmark data sets, MovieLens and Netflix, indicate that our method outperforms the state-of-the-art methods. Specifically, it can not only provide more accurate recommendations, but also generate more diverse and novel recommendations by accurately recommending unpopular objects.
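
    A compact sketch of mass diffusion on a user-object bipartite network with a preference exponent that redirects resource toward low-degree (unpopular) objects, which is the flavor of mechanism described; epsilon = 0 recovers standard mass diffusion, and the value below is an assumption, not the paper's tuned optimum.

```python
import numpy as np

def recommend(A, user, eps=-0.8):
    """A: users x objects 0/1 matrix; returns scores for the user's unseen objects."""
    ko = np.maximum(A.sum(axis=0), 1)             # object degrees (clipped)
    f = A[user].astype(float)                     # initial resource on collected objects
    on_users = A @ (f / ko)                       # step 1: objects -> users
    W = A * ko ** eps                             # step 2 weights, biased to low-degree objects
    W = W / np.maximum(W.sum(axis=1, keepdims=True), 1e-12)
    scores = on_users @ W                         # step 2: users -> objects
    scores[A[user] > 0] = 0.0                     # don't re-recommend collected items
    return scores

A = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1]])
print(recommend(A, user=0))                       # scores for objects 2 and 3
```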

  7. Information filtering via preferential diffusion

    NASA Astrophysics Data System (ADS)

    Lü, Linyuan; Liu, Weiping

    2011-06-01

    Recommender systems have shown great potential in addressing the information overload problem, namely helping users in finding interesting and relevant objects within a huge information space. Some physical dynamics, including the heat conduction process and mass or energy diffusion on networks, have recently found applications in personalized recommendation. Most of the previous studies focus overwhelmingly on recommendation accuracy as the only important factor, while overlooking the significance of diversity and novelty that indeed provide the vitality of the system. In this paper, we propose a recommendation algorithm based on the preferential diffusion process on a user-object bipartite network. Numerical analyses on two benchmark data sets, MovieLens and Netflix, indicate that our method outperforms the state-of-the-art methods. Specifically, it can not only provide more accurate recommendations, but also generate more diverse and novel recommendations by accurately recommending unpopular objects.

  8. Developing Emotion-Based Case Formulations: A Research-Informed Method.

    PubMed

    Pascual-Leone, Antonio; Kramer, Ueli

    2017-01-01

    New research-informed methods for case conceptualization that cut across traditional therapy approaches are increasingly popular. This paper presents a trans-theoretical approach to case formulation based on research observations of emotion. The sequential model of emotional processing (Pascual-Leone & Greenberg, 2007) is a process research model that provides concrete markers for therapists to observe the emerging emotional development of their clients. We illustrate how this model can be used by clinicians to track change; it provides a 'clinical map' by which therapists may orient themselves in-session and plan treatment interventions. Emotional processing offers a trans-theoretical framework for therapists who wish to conduct emotion-based case formulations. First, we present criteria for why this research model translates well into practice. Second, two contrasting case studies are presented to demonstrate the method. The model bridges research with practice by using client emotion as an axis of integration. Key Practitioner Message Process research on emotion can offer a template for therapists to make case formulations while using a range of treatment approaches. The sequential model of emotional processing provides a 'process map' of concrete markers for therapists to (1) observe the emerging emotional development of their clients, and (2) help therapists develop a treatment plan. Copyright © 2016 John Wiley & Sons, Ltd.

  9. Multi-criteria evaluation methods in the production scheduling

    NASA Astrophysics Data System (ADS)

    Kalinowski, K.; Krenczyk, D.; Paprocka, I.; Kempa, W.; Grabowik, C.

    2016-08-01

    The paper presents a discussion on the practical application of different methods of multi-criteria evaluation in the scheduling process in manufacturing systems. Two main groups of methods are specified: methods based on a distance function (using a metacriterion) and methods that create a Pareto set of possible solutions. The basic criteria used for scheduling are also described. The overall procedure of the evaluation process in production scheduling is presented; it takes into account the actions in the whole scheduling process and the participation of a human decision maker (HDM). The specified HDM decisions relate to creating and editing the set of evaluation criteria, selecting the multi-criteria evaluation method, interacting in the search process, using informal criteria, and making final changes in the schedule before implementation. Depending on the need, process scheduling may be completely or partially automated: full automation is possible with a metacriterion-based objective function, whereas if a Pareto set is generated, the final decision has to be made by the HDM.
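
    The Pareto branch of the evaluation reduces to a non-dominance filter; a minimal sketch with invented schedules and criteria (makespan, mean tardiness, cost, all minimized).

```python
def pareto_set(schedules):
    """schedules: list of (name, criteria) with all criteria minimized;
    returns the non-dominated subset."""
    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [(n, c) for n, c in schedules
            if not any(dominates(c2, c) for _, c2 in schedules)]

# Invented candidates: (makespan, mean tardiness, cost)
candidates = [("S1", (40, 3, 100)), ("S2", (35, 5, 120)), ("S3", (42, 4, 110))]
print(pareto_set(candidates))    # S3 is dominated by S1 and drops out
```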

  10. Effectiveness of Visual Methods in Information Procedures for Stem Cell Recipients and Donors

    PubMed Central

    Sarıtürk, Çağla; Gereklioğlu, Çiğdem; Korur, Aslı; Asma, Süheyl; Yeral, Mahmut; Solmaz, Soner; Büyükkurt, Nurhilal; Tepebaşı, Songül; Kozanoğlu, İlknur; Boğa, Can; Özdoğu, Hakan

    2017-01-01

    Objective: Obtaining informed consent from hematopoietic stem cell recipients and donors is a critical step in the transplantation process. Anxiety may affect their understanding of the provided information. However, use of audiovisual methods may facilitate understanding. In this prospective randomized study, we investigated the effectiveness of using an audiovisual method of providing information to patients and donors in combination with the standard model. Materials and Methods: A 10-min informational animation was prepared for this purpose. In total, 82 participants were randomly assigned to two groups: group 1 received the additional audiovisual information and group 2 received standard information. A 20-item questionnaire was administered to participants at the end of the informational session. Results: A reliability test and factor analysis showed that the questionnaire was reliable and valid. For all participants, the mean overall satisfaction score was 184.8±19.8 (maximum possible score of 200). However, for satisfaction with information about written informed consent, group 1 scored significantly higher than group 2 (p=0.039). Satisfaction level was not affected by age, education level, or differences between the physicians conducting the informative session. Conclusion: This study shows that using audiovisual tools may contribute to a better understanding of the informed consent procedure and potential risks of stem cell transplantation. PMID:27476890

  11. Information Technology and Academic Productivity.

    ERIC Educational Resources Information Center

    Massy, William F.; Zemsky, Robert

    1996-01-01

    Enumerates the challenges of adopting information technology (IT)-based teaching and learning strategies in higher education. Concerns addressed include whether IT should supplant rather than augment traditional teaching methods, the financing of IT acquisition, change of teaching and learning processes to increase productivity per person, and…

  12. ANALYTICAL METHOD CHECKLIST FOR VOLATILE ORGANIC COMPOUNDS BY GC/MS (HANDOUT)

    EPA Science Inventory

    The Land Remediation and Pollution Control Division (LRPCD) QA Manager strives to assist LRPCD researchers in developing functional planning documents for their research projects. As part of the planning process, several pieces of information are needed, including information re...

  13. Improved patch-based learning for image deblurring

    NASA Astrophysics Data System (ADS)

    Dong, Bo; Jiang, Zhiguo; Zhang, Haopeng

    2015-05-01

    Most recent image deblurring methods use only the valid information found in the input image as the clue for restoring the blurred regions. These methods usually suffer from insufficient prior information and relatively poor adaptiveness. Patch-based methods use not only the valid information of the input image itself but also prior information from sample images, which improves adaptiveness. However, the cost function of such methods is quite time-consuming to optimize, and they may also produce ringing artifacts. In this paper, we propose an improved non-blind deblurring algorithm based on learning patch likelihoods. On the one hand, we consider the effect of Gaussian mixture model components with different weights and normalize the weight values, which optimizes the cost function and reduces running time. On the other hand, a post-processing method is proposed to suppress the ringing artifacts produced by the traditional patch-based method. Extensive experiments are performed, and the results verify that our method effectively reduces execution time, suppresses ringing artifacts, and preserves the quality of the deblurred image.
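
    An illustrative fragment of the patch-prior ingredient: scoring patches under a Gaussian mixture model, here with scikit-learn and random stand-in data; the paper's weighting and normalization details are not reproduced.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
train_patches = rng.normal(size=(2000, 64))   # stand-in for 8x8 clean patches
gmm = GaussianMixture(n_components=5, covariance_type="full",
                      random_state=0).fit(train_patches)

test_patches = rng.normal(size=(10, 64))
log_lik = gmm.score_samples(test_patches)     # per-patch log-likelihood prior
print(log_lik.round(1))
```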

  14. Storage and computationally efficient permutations of factorized covariance and square-root information arrays

    NASA Technical Reports Server (NTRS)

    Muellerschoen, R. J.

    1988-01-01

    A unified method to permute vector-stored upper-triangular-diagonal (UD) factorized covariance arrays and vector-stored upper-triangular square-root information arrays is presented. The method involves cyclic permutation of the rows and columns of the arrays and retriangularization with fast (slow) Givens rotations (reflections). Minimal computation is performed, and only a one-dimensional scratch array is required. To make the method efficient for large arrays on a virtual-memory machine, computations are arranged so as to avoid expensive page faults. This method is potentially important for processing large volumes of radio metric data in the Deep Space Network.
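
    A dense-matrix sketch of the core operation on the square-root information array: permute the columns (states) and restore triangularity with an orthogonal factorization; the vector storage, fast Givens arithmetic, and paging optimizations of the report are omitted.

```python
import numpy as np

def permute_and_retriangularize(R, order):
    """Reorder the states (columns) of an upper-triangular square-root
    information array and restore triangularity via QR; R'^T R' equals the
    correspondingly permuted information matrix R^T R."""
    _, R_new = np.linalg.qr(R[:, order])
    return R_new

rng = np.random.default_rng(0)
R = np.triu(rng.normal(size=(4, 4)))
order = [1, 2, 3, 0]                          # cyclic permutation of states
R2 = permute_and_retriangularize(R, order)
print(np.allclose((R.T @ R)[np.ix_(order, order)], R2.T @ R2))  # True
```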

  15. Chemical computing with reaction-diffusion processes.

    PubMed

    Gorecki, J; Gizynski, K; Guzowski, J; Gorecka, J N; Garstecki, P; Gruenert, G; Dittrich, P

    2015-07-28

    Chemical reactions are responsible for information processing in living organisms. It is believed that the basic features of biological computing activity are reflected by a reaction-diffusion medium. We illustrate the ideas of chemical information processing considering the Belousov-Zhabotinsky (BZ) reaction and its photosensitive variant. The computational universality of information processing is demonstrated. For different methods of information coding, constructions of the simplest signal processing devices are described. The function performed by a particular device is determined by the geometrical structure of oscillatory (or excitable) and non-excitable regions of the medium. In a living organism, the brain is created as a self-grown structure of interacting nonlinear elements and reaches its functionality as the result of learning. We discuss whether such a strategy can be adopted for the generation of chemical information processing devices. Recent studies have shown that lipid-covered droplets containing a solution of the reagents of the BZ reaction can be transported by flowing oil. Therefore, structures of droplets can form spontaneously under specific non-equilibrium conditions, for example when forced by flows in a microfluidic reactor. We describe how to introduce information into a droplet structure, track the information flow inside it, and optimize the medium's evolution to achieve maximum reliability. Applications of droplet structures to classification tasks are discussed. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  16. Students' Reflective Essays as Insights into Student Centred-Pedagogies within the Undergraduate Research Methods Curriculum

    ERIC Educational Resources Information Center

    Hosein, Anesa; Rao, Namrata

    2017-01-01

    In higher education, despite the emphasis on student-centred pedagogical approaches, undergraduate research methods pedagogy remains surprisingly teacher-directed. Consequently, it may lead to research methods students assuming that becoming a researcher involves gathering information rather than it being a continuous developmental process. To…

  17. A fuzzy MCDM approach for evaluating school performance based on linguistic information

    NASA Astrophysics Data System (ADS)

    Musani, Suhaina; Jemain, Abdul Aziz

    2013-11-01

    Decision making is the process of finding the best option among the feasible alternatives. The process should consider a variety of criteria, but this study focuses only on academic achievement. The data used are the percentages of candidates who obtained the Malaysian Certificate of Education (SPM) in Melaka, based on school academic achievement for each subject. The 57 secondary schools in Melaka listed by the Ministry of Education are involved in this study, so the school ranking can be performed using MCDM (multi-criteria decision making) methods. The objective of this study is to develop a rational method for evaluating school performance based on linguistic information. Since the information on academic achievement is provided in linguistic form, it may be incomplete or uncertain; to overcome this, the information can be represented as fuzzy numbers, since fuzzy sets capture the uncertainty in human perceptions. In this research, VIKOR (multi-criteria optimization and compromise solution) is used as the MCDM tool for the school ranking process in a fuzzy environment. Results showed that fuzzy set theory can overcome the limitations of MCDM when uncertainty exists in the data.
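
    For the ranking mechanics, a crisp VIKOR sketch is shown below with invented pass-rate data; in the study the inputs are fuzzy numbers, which would additionally require fuzzy arithmetic or defuzzification.

```python
import numpy as np

def vikor_ranking(F, w, v=0.5):
    """F: alternatives x criteria matrix (benefit criteria), w: weights.
    Returns alternative indices sorted from best to worst by Q."""
    f_best, f_worst = F.max(axis=0), F.min(axis=0)
    d = w * (f_best - F) / (f_best - f_worst)   # weighted normalized regrets
    S, R = d.sum(axis=1), d.max(axis=1)         # group utility and max regret
    Q = (v * (S - S.min()) / (S.max() - S.min())
         + (1 - v) * (R - R.min()) / (R.max() - R.min()))
    return np.argsort(Q)

scores = np.array([[82.0, 74.0],   # invented SPM pass rates per subject
                   [90.0, 61.0],
                   [75.0, 80.0]])
weights = np.array([0.6, 0.4])
print(vikor_ranking(scores, weights))   # best school first
```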

  18. Informed consent recall and comprehension in orthodontics: traditional vs improved readability and processability methods.

    PubMed

    Kang, Edith Y; Fields, Henry W; Kiyak, Asuman; Beck, F Michael; Firestone, Allen R

    2009-10-01

    Low general and health literacy in the United States means informed consent documents are not well understood by most adults. Methods to improve recall and comprehension of informed consent have not been tested in orthodontics. The purposes of this study were to evaluate (1) recall and comprehension among patients and parents by using the American Association of Orthodontists' (AAO) informed consent form and new forms incorporating improved readability and processability; (2) the association between reading ability, anxiety, and sociodemographic variables and recall and comprehension; and (3) how various domains (treatment, risk, and responsibility) of information are affected by the forms. Three treatment groups (30 patient-parent pairs in each) received an orthodontic case presentation and either the AAO form, an improved readability form (MIC), or an improved readability and processability (pairing audio and visual cues) form (MIC + SS). Structured interviews were transcribed and coded to evaluate recall and comprehension. Significant relationships among patient-related variables and recall and comprehension explained little of the variance. The MIC + SS form significantly improved patient recall and parent recall and comprehension. Recall was better than comprehension, and parents performed better than patients. The MIC + SS form significantly improved patient treatment comprehension and risk recall and parent treatment recall and comprehension. Patients and parents both overestimated their understanding of the materials. Improving the readability of consent materials made little difference, but combining improved readability and processability benefited both patients' recall and parents' recall and comprehension compared with the AAO form.

  19. Ultrasonic power measurement system based on acousto-optic interaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    He, Liping; Zhu, Fulong, E-mail: zhufulong@hust.edu.cn; Duan, Ke

    Ultrasonic waves are widely used, with applications including the medical, military, and chemical fields. However, there are currently no effective methods for ultrasonic power measurement. Previously, ultrasonic power measurement has been reliant on mechanical methods such as hydrophones and radiation force balances. This paper deals with ultrasonic power measurement based on an unconventional method: acousto-optic interaction. Compared with mechanical methods, the optical method has a greater ability to resist interference and also has reduced environmental requirements. Therefore, this paper begins with an experimental determination of the acoustic power in water contained in a glass tank using a set of optical devices. Because the light intensity of the diffraction image generated by acousto-optic interaction contains the required ultrasonic power information, specific software was written to extract the light intensity information from the image through a combination of filtering, binarization, contour extraction, and other image processing operations. The power value can then be obtained rapidly by processing the diffraction image using a computer. The results of this work show that the optical method offers advantages that include accuracy, speed, and a noncontact measurement method.

  20. Ultrasonic power measurement system based on acousto-optic interaction.

    PubMed

    He, Liping; Zhu, Fulong; Chen, Yanming; Duan, Ke; Lin, Xinxin; Pan, Yongjun; Tao, Jiaquan

    2016-05-01

    Ultrasonic waves are widely used, with applications including the medical, military, and chemical fields. However, there are currently no effective methods for ultrasonic power measurement. Previously, ultrasonic power measurement has been reliant on mechanical methods such as hydrophones and radiation force balances. This paper deals with ultrasonic power measurement based on an unconventional method: acousto-optic interaction. Compared with mechanical methods, the optical method has a greater ability to resist interference and also has reduced environmental requirements. Therefore, this paper begins with an experimental determination of the acoustic power in water contained in a glass tank using a set of optical devices. Because the light intensity of the diffraction image generated by acousto-optic interaction contains the required ultrasonic power information, specific software was written to extract the light intensity information from the image through a combination of filtering, binarization, contour extraction, and other image processing operations. The power value can then be obtained rapidly by processing the diffraction image using a computer. The results of this work show that the optical method offers advantages that include accuracy, speed, and a noncontact measurement method.

  1. Ultrasonic power measurement system based on acousto-optic interaction

    NASA Astrophysics Data System (ADS)

    He, Liping; Zhu, Fulong; Chen, Yanming; Duan, Ke; Lin, Xinxin; Pan, Yongjun; Tao, Jiaquan

    2016-05-01

    Ultrasonic waves are widely used, with applications including the medical, military, and chemical fields. However, there are currently no effective methods for ultrasonic power measurement. Previously, ultrasonic power measurement has been reliant on mechanical methods such as hydrophones and radiation force balances. This paper deals with ultrasonic power measurement based on an unconventional method: acousto-optic interaction. Compared with mechanical methods, the optical method has a greater ability to resist interference and also has reduced environmental requirements. Therefore, this paper begins with an experimental determination of the acoustic power in water contained in a glass tank using a set of optical devices. Because the light intensity of the diffraction image generated by acousto-optic interaction contains the required ultrasonic power information, specific software was written to extract the light intensity information from the image through a combination of filtering, binarization, contour extraction, and other image processing operations. The power value can then be obtained rapidly by processing the diffraction image using a computer. The results of this work show that the optical method offers advantages that include accuracy, speed, and a noncontact measurement method.
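
    A plausible reconstruction (not the authors' code) of the image pipeline named in the abstract: filtering, binarization, contour extraction, then integration of the intensity inside each diffraction order; the input file name is hypothetical.

```python
import cv2
import numpy as np

img = cv2.imread("diffraction.png", cv2.IMREAD_GRAYSCALE)   # hypothetical file
blur = cv2.GaussianBlur(img, (5, 5), 0)                     # filtering
_, mask = cv2.threshold(blur, 0, 255,
                        cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # binarization
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)       # contour extraction
intensities = []
for c in contours:
    spot = np.zeros_like(mask)
    cv2.drawContours(spot, [c], -1, 255, thickness=-1)        # fill one spot
    intensities.append(float(img[spot > 0].sum()))            # integrated intensity
print(sorted(intensities, reverse=True))   # brightest diffraction orders first
```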

  2. How does information influence hope in family members of traumatic coma patients in intensive care unit?

    PubMed

    Verhaeghe, Sofie T L; van Zuuren, Florence J; Defloor, Tom; Duijnstee, Mia S H; Grypdonck, Mieke H F

    2007-08-01

    To assess the interplay between hope and the information provided by health care professionals. Earlier research showed that hope is crucial for relatives of traumatic coma patients; it has also been reported that the need for information is extremely important for relatives of critically ill patients. A qualitative approach according to the grounded theory method, with constant comparison, was used. We held 24 in-depth interviews with 22 family members of 16 patients with traumatic coma. Data processing and data analysis took place in a cyclic process in which the induction of themes alternated with confrontation with new material. Family members of traumatic coma patients want information that is as accurate as possible, provided by doctors and nurses in an understandable manner and leaving room for hope. At first, family members can do no more than passively absorb the information they receive. After some time, they actively start working with the information and learn what to build their hope on. In this way, concrete hope evolves and appears to be strongly shaped by information. Information that is more positive than warranted is not appreciated at all: it leads to false hope and, once its real nature becomes apparent, to increased distress and loss of trust in the professionals. The process of hope is crucial in coping with traumatic coma, and information can facilitate this process. If professionals, especially nurses, keep in mind the process that family members go through in handling information, they can not only facilitate this process but also help family members establish realistic hope.

  3. Negotiating a Systems Development Method

    NASA Astrophysics Data System (ADS)

    Karlsson, Fredrik; Hedström, Karin

    Systems development methods (or methods) are often applied in tailored versions to fit the actual situation. Method tailoring is, in most of the existing literature, viewed as either (a) a highly rational process with the method engineer as the driver, where the project members are passive information providers, or (b) an unstructured process where the systems developer makes individual choices, a selection process without any driver. The purpose of this chapter is to illustrate that important design decisions during method tailoring are made by project members through negotiation. The study has been carried out using the perspective of actor-network theory. Our narratives depict method tailoring as more complex than (a) and (b): they show that the driver role rotates between the project members and that design decisions are based on influences from several project members. However, these design decisions are not consensus decisions.

  4. Research on distributed optical fiber sensing data processing method based on LabVIEW

    NASA Astrophysics Data System (ADS)

    Li, Zhonghu; Yang, Meifang; Wang, Luling; Wang, Jinming; Yan, Junhong; Zuo, Jing

    2018-01-01

    The pipeline leak detection and leak location problems have received extensive attention in industry. In this paper, a distributed optical fiber sensing system is designed for a heat-supply pipeline, and the data processing method for distributed optical fiber sensing based on LabVIEW is studied in detail. The hardware system includes a laser, the sensing optical fiber, a wavelength division multiplexer, a photoelectric detector, a data acquisition card, and a computer. The software system, developed in LabVIEW, applies a wavelet denoising method to the temperature information, which improves the SNR. By extracting characteristic values from the fiber temperature information, the system realizes temperature measurement, leak location, and the storage and querying of measurement signals. Compared with the traditional negative-pressure-wave or acoustic-signal methods, the distributed optical fiber temperature measurement system can measure several temperatures in one measurement and locate the leak point accurately. It has broad application prospects.
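
    The wavelet-denoising step might look like the following PyWavelets sketch; the wavelet family, decomposition level, and threshold rule are assumptions, not details from the paper.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Soft-threshold the detail coefficients with the universal threshold
    and reconstruct; all tuning choices here are assumptions."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate
    thr = sigma * np.sqrt(2.0 * np.log(signal.size))        # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)

t = np.linspace(0.0, 1.0, 1024)
trace = 25.0 + 3.0 * np.exp(-((t - 0.5) / 0.05) ** 2)       # simulated hot spot
noisy = trace + 0.4 * np.random.default_rng(0).normal(size=t.size)
print(wavelet_denoise(noisy).shape)
```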

  5. The assessment of cognitive errors using an observer-rated method.

    PubMed

    Drapeau, Martin

    2014-01-01

    Cognitive Errors (CEs) are a key construct in cognitive behavioral therapy (CBT). Integral to CBT is that individuals with depression process information in an overly negative or biased way, and that this bias is reflected in specific depressotypic CEs which are distinct from normal information processing. Despite the importance of this construct in CBT theory, practice, and research, few methods are available to researchers and clinicians to reliably identify CEs as they occur. In this paper, the author presents a rating system, the Cognitive Error Rating Scale, which can be used by trained observers to identify and assess the cognitive errors of patients or research participants in vivo, i.e., as they are used or reported by the patients or participants. The method is described, including some of the more important rating conventions to be considered when using the method. This paper also describes the 15 cognitive errors assessed, and the different summary scores, including valence of the CEs, that can be derived from the method.

  6. A trust-based recommendation method using network diffusion processes

    NASA Astrophysics Data System (ADS)

    Chen, Ling-Jiao; Gao, Jian

    2018-09-01

    A variety of rating-based recommendation methods have been extensively studied, including the well-known collaborative filtering approaches and some network diffusion-based methods; however, social trust relations are not sufficiently considered when making recommendations. In this paper, we contribute to the literature by proposing a trust-based recommendation method, named CosRA+T, which integrates information about trust relations into the resource-redistribution process. Specifically, a tunable parameter is used to scale the resources received by trusted users before the redistribution back to the objects. Interestingly, we find an optimal scaling parameter for the proposed CosRA+T method to achieve its best recommendation accuracy, and the optimal value seems to be universal under several evaluation metrics across different datasets. Moreover, results of extensive experiments on two real-world rating datasets with trust relations, Epinions and FriendFeed, suggest that CosRA+T yields a remarkable improvement in overall accuracy, diversity, and novelty. Our work takes a step towards designing better recommendation algorithms by employing multiple resources of social network information.
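
    A schematic fragment of the trust-scaling step as we read it (the function name and scaling rule are our assumptions): after the first diffusion step places resource on users, resource held by users trusted by the target user is scaled by a tunable parameter before redistribution back to objects.

```python
import numpy as np

def scale_by_trust(user_resource, trusted, lam=1.5):
    """user_resource: resource per user after the first diffusion step;
    trusted: boolean mask of users the target user trusts; lam: tunable
    scaling parameter applied before redistribution back to objects."""
    r = user_resource.copy()
    r[trusted] *= lam
    return r

r = np.array([0.20, 0.50, 0.30])
print(scale_by_trust(r, np.array([False, True, False])))
```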

  7. Universal strategies for the DNA-encoding of libraries of small molecules using the chemical ligation of oligonucleotide tags

    PubMed Central

    Litovchick, Alexander; Clark, Matthew A; Keefe, Anthony D

    2014-01-01

    The affinity-mediated selection of large libraries of DNA-encoded small molecules is increasingly being used to initiate drug discovery programs. We present universal methods for the encoding of such libraries using the chemical ligation of oligonucleotides. These methods may be used to record the chemical history of individual library members during combinatorial synthesis processes. We demonstrate three different chemical ligation methods as examples of information recording processes (writing) for such libraries and two different cDNA-generation methods as examples of information retrieval processes (reading) from such libraries. The example writing methods include uncatalyzed and Cu(I)-catalyzed alkyne-azide cycloadditions and a novel photochemical thymidine-psoralen cycloaddition. The first reading method “relay primer-dependent bypass” utilizes a relay primer that hybridizes across a chemical ligation junction embedded in a fixed-sequence and is extended at its 3′-terminus prior to ligation to adjacent oligonucleotides. The second reading method “repeat-dependent bypass” utilizes chemical ligation junctions that are flanked by repeated sequences. The upstream repeat is copied prior to a rearrangement event during which the 3′-terminus of the cDNA hybridizes to the downstream repeat and polymerization continues. In principle these reading methods may be used with any ligation chemistry and offer universal strategies for the encoding (writing) and interpretation (reading) of DNA-encoded chemical libraries. PMID:25483841

  8. Sensing roller for in-process thickness measurement

    DOEpatents

    Novak, James L.

    1996-01-01

    An apparatus and method for processing materials by sensing roller, in which the sensing roller has a plurality of conductive rings (electrodes) separated by rings of dielectric material. Sensing capacitances or impedances between the electrodes provides information on thicknesses of the materials being processed, location of wires therein, and other like characteristics of the materials.

  9. 78 FR 67188 - Agency Information Collection Activities: Submitted for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-08

    [Flattened table excerpt: items under 1206.259(c)(1)(ii) and (c)(2)(iii) require submitting arm's-length and non-arm's-length transportation contracts, production agreements, operating agreements, power plant contracts, and related documents under the AUDIT PROCESS (see Note).]

  10. 78 FR 9732 - Agency Information Collection Activities: Submitted for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    [Fragmentary table excerpt: the notice lists similar audit-process reporting requirements — arm's-length transmission contracts, production and operating agreements, power plant contracts, and arm's-length washing contracts.]

  11. Computer-Based Enhancements for the Improvement of Learning.

    ERIC Educational Resources Information Center

    Tennyson, Robert D.

    The third of four symposium papers argues that, if instructional methods are to improve learning, they must have two aspects: a direct trace to a specific learning process, and empirical support that demonstrates their significance. Focusing on the tracing process, the paper presents an information processing model of learning that can be used by…

  12. The Transition Assessment Process and IDEIA 2004

    ERIC Educational Resources Information Center

    Sitlington, Patricia L.; Clark, Gary M.

    2007-01-01

    This article will first provide an overview of the transition assessment process in terms of the requirements of the Individuals with Disabilities Education Improvement Act of 2004 and the basic tenets of the process. The second section will provide an overview of the methods of gathering assessment information on the student and on the living,…

  13. Comparison of tissue equalization, and premium view post-processing methods in full field digital mammography.

    PubMed

    Chen, Baoying; Wang, Wei; Huang, Jin; Zhao, Ming; Cui, Guangbin; Xu, Jing; Guo, Wei; Du, Pang; Li, Pei; Yu, Jun

    2010-10-01

    To retrospectively evaluate the diagnostic abilities of two post-processing methods provided by the GE Senographe DS system, tissue equalization (TE) and premium view (PV), in full field digital mammography (FFDM). In accordance with the ethical standards of the World Medical Association, this study was approved by the regional ethics committee and signed informed patient consents were obtained. We retrospectively reviewed digital mammograms from 101 women (mean age, 47 years; range, 23-81 years) in the TE and PV modes, respectively. Three radiologists, fully blinded to the post-processing methods, all patient clinical information and histologic results, read the images using objective image interpretation criteria for diagnostic information end points such as lesion border delineation, definition of disease extent, and visualization of internal and surrounding morphologic features of the lesions. Overall diagnostic impression in terms of lesion conspicuity, detectability and diagnostic confidence was also assessed. Between-group comparisons were performed with the Wilcoxon signed rank test. Readers 1, 2, and 3 judged the overall impression of PV significantly better in 29, 27, and 24 patients, respectively, compared with TE in 12, 13, and 11 patients (p<0.05). Significantly (p<0.05) better impressions of PV were also demonstrated for the diagnostic information end points. Importantly, PV proved more sensitive than TE in detecting malignant lesions in dense breasts, rather than benign lesions or malignancy in non-dense breasts (p<0.01). Compared with TE, PV provides markedly better diagnostic information in FFDM, particularly for patients with malignancy in dense breasts. Copyright © 2009 Elsevier Ireland Ltd. All rights reserved.

  14. PREFACE: I International Scientific School Methods of Digital Image Processing in Optics and Photonics

    NASA Astrophysics Data System (ADS)

    Gurov, I. P.; Kozlov, S. A.

    2014-09-01

    The first international scientific school "Methods of Digital Image Processing in Optics and Photonics" was held with a view to developing cooperation between world-class experts, young scientists, students and post-graduate students, and to exchanging information on the current status and directions of research in the field of digital image processing in optics and photonics. The International Scientific School was managed by: Saint Petersburg National Research University of Information Technologies, Mechanics and Optics (ITMO University) - Saint Petersburg (Russia); Chernyshevsky Saratov State University - Saratov (Russia); National Research Nuclear University "MEPhI" (NRNU MEPhI) - Moscow (Russia). The school was held with the participation of the local chapters of the Optical Society of America (OSA), the Society of Photo-Optical Instrumentation Engineers (SPIE) and the IEEE Photonics Society. Further details, including topics, committees and conference photos, are available in the PDF.

  15. How can knowledge discovery methods uncover spatio-temporal patterns in environmental data?

    NASA Astrophysics Data System (ADS)

    Wachowicz, Monica

    2000-04-01

    This paper proposes the integration of KDD, GVis and STDB as a long-term strategy, which will allow users to apply knowledge discovery methods for uncovering spatio-temporal patterns in environmental data. The main goal is to combine innovative techniques and associated tools for exploring very large environmental data sets in order to arrive at valid, novel, potentially useful, and ultimately understandable spatio-temporal patterns. The GeoInsight approach is described using the principles and key developments in the research domains of KDD, GVis, and STDB. The GeoInsight approach aims at the integration of these research domains in order to provide tools for performing information retrieval, exploration, analysis, and visualization. The result is a knowledge-based design, which involves visual thinking (perceptual-cognitive process) and automated information processing (computer-analytical process).

  16. Computer image processing in marine resource exploration

    NASA Technical Reports Server (NTRS)

    Paluzzi, P. R.; Normark, W. R.; Hess, G. R.; Hess, H. D.; Cruickshank, M. J.

    1976-01-01

    Pictographic data or imagery is commonly used in marine exploration. Pre-existing image processing techniques (software) similar to those used on imagery obtained from unmanned planetary exploration were used to improve marine photography and side-scan sonar imagery. Features and details not visible by conventional photo processing methods were enhanced by filtering and noise removal on selected deep-sea photographs. Information gained near the periphery of photographs allows improved interpretation and facilitates construction of bottom mosaics where overlapping frames are available. Similar processing techniques were applied to side-scan sonar imagery, including corrections for slant range distortion, and along-track scale changes. The use of digital data processing and storage techniques greatly extends the quantity of information that can be handled, stored, and processed.

  17. Method and Apparatus for Non-Invasive Measurement of Changes in Intracranial Pressure

    NASA Technical Reports Server (NTRS)

    Yost, William T. (Inventor); Cantrell, John H., Jr. (Inventor)

    2004-01-01

    A method and apparatus for measuring intracranial pressure. In one embodiment, the method comprises the steps of: generating an information signal that comprises components (e.g., pulsatile changes and slow changes) related to both intracranial pressure and blood pressure; generating a reference signal comprising pulsatile components that are solely related to blood pressure; processing the information and reference signals to determine the pulsatile components of the information signal that have generally the same phase as the pulsatile components of the reference signal; and removing those components from the information signal so as to provide a data signal in which substantially all of the components are related to intracranial pressure.
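
    One simple way to realize this kind of in-phase component removal is a least-squares projection of the information signal onto the reference signal; the sketch below illustrates that idea under stated assumptions, and does not reproduce the patent's actual processing chain.

```python
import numpy as np

def remove_inphase_components(info, ref):
    # Remove from `info` the component that is in phase with `ref`
    # by subtracting its least-squares projection onto the reference.
    # What remains approximates the ICP-related part of the signal.
    ref = ref - ref.mean()
    info_centered = info - info.mean()
    gain = np.dot(info_centered, ref) / np.dot(ref, ref)
    return info_centered - gain * ref
```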

  18. Using the EZ-Diffusion Model to Score a Single-Category Implicit Association Test of Physical Activity

    PubMed Central

    Rebar, Amanda L.; Ram, Nilam; Conroy, David E.

    2014-01-01

    Objective The Single-Category Implicit Association Test (SC-IAT) has been used as a method for assessing automatic evaluations of physical activity, but measurement artifact or consciously-held attitudes could be confounding the outcome scores of these measures. The objective of these two studies was to address these measurement concerns by testing the validity of a novel SC-IAT scoring technique. Design Study 1 was a cross-sectional study, and study 2 was a prospective study. Method In study 1, undergraduate students (N = 104) completed SC-IATs for physical activity, flowers, and sedentary behavior. In study 2, undergraduate students (N = 91) completed a SC-IAT for physical activity, self-reported affective and instrumental attitudes toward physical activity, physical activity intentions, and wore an accelerometer for two weeks. The EZ-diffusion model was used to decompose the SC-IAT into three process component scores including the information processing efficiency score. Results In study 1, a series of structural equation model comparisons revealed that the information processing score did not share variability across distinct SC-IATs, suggesting it does not represent systematic measurement artifact. In study 2, the information processing efficiency score was shown to be unrelated to self-reported affective and instrumental attitudes toward physical activity, and positively related to physical activity behavior, above and beyond the traditional D-score of the SC-IAT. Conclusions The information processing efficiency score is a valid measure of automatic evaluations of physical activity. PMID:25484621
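
    The EZ-diffusion model itself (Wagenmakers, van der Maas and Grasman, 2007) recovers drift rate, boundary separation and non-decision time in closed form from response accuracy and the mean and variance of correct response times; the drift rate is the component interpreted above as information processing efficiency. A minimal sketch with the conventional scaling s = 0.1 follows (the edge correction for perfect accuracy is one common convention):

```python
import numpy as np

def ez_diffusion(pc, rt_var, rt_mean, n_trials, s=0.1):
    # Closed-form EZ-diffusion estimates (Wagenmakers et al., 2007).
    # pc: proportion correct (must differ from 0.5); rt_var / rt_mean:
    # variance and mean of correct response times in seconds.
    if pc == 1.0:
        pc = 1.0 - 1.0 / (2 * n_trials)      # standard edge correction
    L = np.log(pc / (1.0 - pc))              # logit of accuracy
    x = L * (L * pc**2 - L * pc + pc - 0.5) / rt_var
    v = np.sign(pc - 0.5) * s * x**0.25      # drift rate ("efficiency")
    a = s**2 * L / v                         # boundary separation
    y = -v * a / s**2
    mdt = (a / (2 * v)) * (1 - np.exp(y)) / (1 + np.exp(y))
    return v, a, rt_mean - mdt               # drift, boundary, non-decision
```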

  19. User-centered requirements engineering in health information systems: a study in the hemophilia field.

    PubMed

    Teixeira, Leonor; Ferreira, Carlos; Santos, Beatriz Sousa

    2012-06-01

    The use of sophisticated information and communication technologies (ICTs) in the health care domain is a way to improve the quality of services. However, there are also hazards associated with the introduction of ICTs in this domain and a great number of projects have failed due to the lack of systematic consideration of human and other non-technology issues throughout the design or implementation process, particularly in the requirements engineering process. This paper presents the methodological approach followed in the design process of a web-based information system (WbIS) for managing the clinical information in hemophilia care, which integrates the values and practices of user-centered design (UCD) activities into the principles of software engineering, particularly in the phase of requirements engineering (RE). This process followed a paradigm that combines a grounded theory for data collection with an evolutionary design based on constant development and refinement of the generic domain model using three well-known methodological approaches: (a) object-oriented system analysis; (b) task analysis; and, (c) prototyping, in a triangulation work. This approach seems to be a good solution for the requirements engineering process in this particular case of the health care domain, since the inherent weaknesses of individual methods are reduced, and emergent requirements are easier to elicit. Moreover, the requirements triangulation matrix gives the opportunity to look across the results of all used methods and decide what requirements are critical for the system success. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  20. Methods of Labor Economy Increasing in Educational Organization

    ERIC Educational Resources Information Center

    Dorozhkin, Evgenij M.; Krotov, Yakov E.; Tkacheva, Oksana N.; Kruchkov, Konstantin V.; Korotaev, Ivan S.

    2016-01-01

    The urgency of the problem under investigation stems from the growing demand for information technology infrastructure development under the current operating conditions of educational institutions, including from the point of view of forming the information-educational environment. The article offers an organizational and economic model of constructing processes for…

  1. PRE-QAPP AGREEMENT (PQA) AND ANALYTICAL METHOD CHECKLISTS (AMCS): TOOLS FOR PLANNING RESEARCH PROJECTS

    EPA Science Inventory

    The Land Remediation and Pollution Control Division (LRPCD) QA Manager strives to assist LRPCD researchers in developing functional planning documents for their research projects. As part of the planning process, several pieces of information are needed, including information re...

  2. Integration of an OWL-DL knowledge base with an EHR prototype and providing customized information.

    PubMed

    Jing, Xia; Kay, Stephen; Marley, Tom; Hardiker, Nicholas R

    2014-09-01

    When clinicians use electronic health record (EHR) systems, their ability to obtain general knowledge is often an important contribution to their ability to make more informed decisions. In this paper we describe a method by which an external, formal representation of clinical and molecular genetic knowledge can be integrated into an EHR such that customized knowledge can be delivered to clinicians in a context-appropriate manner. Web Ontology Language-Description Logic (OWL-DL) is a formal knowledge representation language that is widely used for creating, organizing and managing knowledge through the use of explicit definitions, consistent structure and a computer-processable format, particularly in biomedical fields. In this paper we describe: 1) integration of an OWL-DL knowledge base with a standards-based EHR prototype, 2) presentation of customized information from the knowledge base via the EHR interface, and 3) lessons learned in the process. The integration was achieved through a combination of manual and automatic methods. Our method has advantages for scaling up to and maintaining knowledge bases of any size, with the goal of assisting clinicians and other EHR users in making better informed health care decisions.
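
    As a sketch of the kind of integration described, the snippet below loads an OWL-DL knowledge base and retrieves classes relevant to the current clinical context with the owlready2 library. The ontology file and class names are hypothetical, and the paper's own integration combined manual and automatic methods rather than this exact code.

```python
from owlready2 import get_ontology

# Load an OWL-DL knowledge base (hypothetical file and IRI).
onto = get_ontology("file://clinical_genetics.owl").load()

# Look up concepts matching the clinical context of the current EHR
# entry, e.g. every class whose IRI mentions "Hemophilia".
matches = onto.search(iri="*Hemophilia*")

for cls in matches:
    # Surface the label and annotation text to the EHR interface layer.
    print(cls.name, getattr(cls, "comment", []))
```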

  3. Cognitive and behavioural assessment of people with traumatic brain injury in the work place: occupational therapists' perceptions.

    PubMed

    Bootes, Kylie; Chapparo, Christine J

    2002-01-01

    Cognitive and behavioural impairments, in the absence of severe physical disability, are commonly related to poor return to work outcomes for people with traumatic brain injury (TBI). Along with other health professionals, occupational therapists make judgements about cognitive and behavioural dimensions of work capacity of clients with TBI during the return to work process. Unlike many physical functional capacity evaluations, there is no standard method that therapists use to assess the ability of people with TBI to perform cognitive operations required for work. Little is known about what information occupational therapists use in their assessment of cognitive and behavioural aspects of client performance within the work place. This study employed qualitative research methods to determine what information is utilised by 20 therapists who assess the work capacity of people with TBI in the workplace. Results indicated that the process of making judgements about cognitive and behavioural competence within the work place is a multifaceted process. Therapists triangulate client information from multiple sources and types of data to produce an accurate view of client work capacity. Central to this process is the relationship between the client, the job and the work environment.

  4. 1D Seismic reflection technique to increase depth information in surface seismic investigations

    NASA Astrophysics Data System (ADS)

    Camilletti, Stefano; Fiera, Francesco; Umberto Pacini, Lando; Perini, Massimiliano; Prosperi, Andrea

    2017-04-01

    1D seismic methods, such as MASW, Re.Mi. and HVSR, have been extensively used in engineering investigations, bedrock research, Vs profiling and, to some extent, hydrologic applications during the past 20 years. Recent advances in equipment, sound sources and computer interpretation techniques make 1D seismic methods highly effective in shallow subsoil modeling. Classical 1D seismic surveys allow economical collection of subsurface data; however, they fail to return accurate information for depths greater than 50 meters. Using a particular acquisition technique, it is possible to collect data that can be quickly processed with the reflection technique in order to obtain more accurate velocity information at depth. Furthermore, the data processing returns a narrow stratigraphic section, alongside the 1D velocity model, in which lithological boundaries are represented. This work shows how to collect a single-CMP record to determine: (1) depth of bedrock; (2) gravel layers in clayey domains; (3) an accurate Vs profile. Seismic traces were processed by means of new software developed in collaboration with SARA electronics instruments S.r.l., Perugia, Italy. This software has the great advantage of being usable directly in the field, reducing the time elapsing between acquisition and processing.

  5. Mapping care processes within a hospital: a web-based proposal merging enterprise modelling and ISO normative principles.

    PubMed

    Staccini, Pascal; Joubert, Michel; Quaranta, Jean-François; Fieschi, Marius

    2003-01-01

    Today, the economic and regulatory environment is pressuring hospitals and healthcare professionals to account for their results and methods of care delivery. The evaluation of the quality and safety of care, the traceability of the acts performed, and the evaluation of practices are some of the reasons underpinning current interest in clinical and hospital information systems. The structured collection of users' needs and system requirements is fundamental when installing such systems. This stage takes time, is generally misconstrued by caregivers, and is of limited efficacy for analysis. We used a modelling technique designed for manufacturing processes (SADT: Structured Analysis and Design Technique), enhanced the initial activity model of this method, and programmed a web-based tool in an object-oriented environment. This tool makes it possible to extract the data dictionary from the description of a given process and to locate documents (procedures, recommendations, instructions). Aimed at structuring needs and storing information provided by the teams directly involved in the workings of an institution (or at least part of it), the process mapping approach has an important contribution to make to the analysis of clinical information systems.

  6. Photogrammetric mapping for cadastral land information systems

    NASA Astrophysics Data System (ADS)

    Muzakidis, Panagiotis D.

    The creation of a "clean" digital database is a most important and complex task, upon which the usefulness of a Parcel-Based Land Information System depends. Capturing data by photogrammetric methods for cadastral purposes necessitates the transformation of data into a computer-compatible form. Such input requires the encoding, editing and structuring of data. The research is carried out in two phases: the first is concerned with defining the data modelling schemes and the classification of basic data for a parcel-based land information system, together with the photogrammetric methods to be adopted to collect these data; the second deals with data editing and data structuring processes in order to produce "clean" information relevant to such a system. Implementation of the proposed system, at both the data collection stage and within the data processing stage itself, demands a number of flexible criteria to be defined within the methodology. Development of these criteria will include consideration of the cadastral characteristics peculiar to Greece.

  7. Health level 7 development framework for medication administration.

    PubMed

    Kim, Hwa Sun; Cho, Hune

    2009-01-01

    We propose the creation of a standard data model for medication administration activities through the development of a clinical document architecture using the Health Level 7 Development Framework process based on an object-oriented analysis and the development method of Health Level 7 Version 3. Medication administration is the most common activity performed by clinical professionals in healthcare settings. A standardized information model and structured hospital information system are necessary to achieve evidence-based clinical activities. A virtual scenario is used to demonstrate the proposed method of administering medication. We used the Health Level 7 Development Framework and other tools to create the clinical document architecture, which allowed us to illustrate each step of the Health Level 7 Development Framework in the administration of medication. We generated an information model of the medication administration process as one clinical activity. It should become a fundamental conceptual model for understanding international-standard methodology by healthcare professionals and nursing practitioners with the objective of modeling healthcare information systems.

  8. A method of classification for multisource data in remote sensing based on interval-valued probabilities

    NASA Technical Reports Server (NTRS)

    Kim, Hakil; Swain, Philip H.

    1990-01-01

    An axiomatic approach to interval-valued (IV) probabilities is presented, in which an IV probability is defined by a pair of set-theoretic functions that satisfy pre-specified axioms. On the basis of this approach, the representation of statistical evidence and the combination of multiple bodies of evidence are emphasized. Although IV probabilities provide an innovative means for the representation and combination of evidential information, they make the decision process rather complicated and entail more intelligent strategies for making decisions. The development of decision rules over IV probabilities is discussed from the viewpoint of statistical pattern recognition. The proposed method, the so-called evidential reasoning method, is applied to the ground-cover classification of a multisource data set consisting of Multispectral Scanner (MSS) data, Synthetic Aperture Radar (SAR) data, and digital terrain data such as elevation, slope, and aspect. By treating the data sources separately, the method is able to capture both parametric and nonparametric information and to combine them. The method is then applied to two separate cases of classifying multiband data obtained by a single sensor. In each case a set of multiple sources is obtained by dividing the dimensionally huge data into smaller, more manageable pieces based on global statistical correlation information. By this divide-and-combine process, the method is able to utilize more features than the conventional maximum likelihood method.
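
    For intuition, the snippet below implements the classical Dempster rule for combining two bodies of evidence over a frame of discernment — the point-valued special case that interval-valued approaches generalize. It is illustrative only and not the paper's IV-probability calculus.

```python
from itertools import product

def dempster_combine(m1, m2):
    # m1, m2: dicts mapping frozenset hypotheses -> basic probability mass.
    combined, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc          # mass assigned to the empty set
    # Normalize by the non-conflicting mass (Dempster's rule).
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

# Two sensors' evidence over ground-cover classes {forest, water, urban}:
m_mss = {frozenset({"forest"}): 0.6, frozenset({"forest", "water"}): 0.4}
m_sar = {frozenset({"forest"}): 0.5, frozenset({"water", "urban"}): 0.5}
print(dempster_combine(m_mss, m_sar))    # forest ~0.714, water ~0.286
```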

  9. Beyond informed consent.

    PubMed Central

    Bhutta, Zulfiqar A.

    2004-01-01

    Although a relatively recent phenomenon, the role of informed consent in human research is central to its ethical regulation and conduct. However, guidelines often recommend procedures for obtaining informed consent (usually written consent) that are difficult to implement in developing countries. This paper reviews the guidelines for obtaining informed consent and also discusses prevailing views on current controversies, ambiguities and problems with these guidelines and suggests potential solutions. The emphasis in most externally sponsored research projects in developing countries is on laborious documentation of several mechanical aspects of the research process rather than on assuring true comprehension and voluntary participation. The onus for the oversight of this process is often left to overworked and ill-equipped local ethics review committees. Current guidelines and processes for obtaining informed consent should be reviewed with the specific aim of developing culturally appropriate methods of sharing information about the research project and obtaining and documenting consent that is truly informed. Further research is needed to examine the validity and user friendliness of innovations in information sharing procedures for obtaining consent in different cultural settings. PMID:15643799

  10. Renewed roles for librarians in problem-based learning in the medical curriculum.

    PubMed

    Mi, Misa

    2011-01-01

    Problem-based learning (PBL) is a teaching-learning process or method of instruction that is widely used in medical education curricula. Librarians play important roles as facilitators for PBL as well as guides for information resources. Involvement in PBL activities presents unique opportunities to incorporate library resources and instruction into the medical curriculum. This article reviews the problem-based learning method within the conceptual framework of the learning theory of constructivism. It describes how a medical librarian at a U.S. medical school used emerging technologies to facilitate PBL small group case discussions, guide students to quality information resources, and enhance the learning environment for the PBL process.

  11. Trinary Encoder, Decoder, Multiplexer and Demultiplexer Using Savart Plate and Spatial Light Modulator

    NASA Astrophysics Data System (ADS)

    Ghosh, Amal K.; Singha Roy, Souradip; Mandal, Sudipta; Basuray, Amitabha

    Optoelectronic processors have already been developed, exploiting the strong potential of optics in information and data processing. Encoders, decoders, multiplexers and demultiplexers are among the most important components in modern system design and in communications. We have implemented these using trinary logic gates with signed magnitude, defined as the Modified Trinary Number (MTN) system. The Spatial Light Modulator (SLM)-based optoelectronic circuit is suitable for high-speed data processing and communication using the photon as carrier. We also present here a possible method of implementing the same using light, with the photon as the carrier of information. The importance of the method is that all the basic gates needed may be fabricated from the same basic building block.
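
    For orientation, signed-magnitude trinary logic works over the digit set {-1, 0, +1}. A minimal numeric sketch of encoding an integer into balanced-ternary digits — an assumption standing in for the paper's optical MTN encoding — is:

```python
def to_balanced_ternary(n):
    # Encode an integer as balanced-ternary digits in {-1, 0, +1},
    # least significant digit first.
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:
            r = -1            # digit 2 is re-expressed as -1 plus a carry
        digits.append(r)
        n = (n - r) // 3
    return digits

assert to_balanced_ternary(5) == [-1, -1, 1]   # 5 = 9 - 3 - 1
```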

  12. Research on the method of precise alignment technology of atmospheric laser communication

    NASA Astrophysics Data System (ADS)

    Chen, Wen-jian; Gao, Wei; Duan, Yuan-yuan; Ma, Shi-wei; Chen, Jian

    2016-10-01

    Atmospheric laser communication uses laser light as the carrier for transmitting voice, data and image information through the atmosphere. Because of its high reliability, strong anti-interference ability and ease of installation, it has great potential and room for development in the communications field. In the process of establishing communication, the capture, pointing and tracking of the communication signal is the key technology. This paper introduces a method of locating the signal spot during atmospheric laser communication, in which the analog signals are added and subtracted directly and the result normalized to obtain the target azimuth information, which drives the servo system to achieve precise alignment and tracking.
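
    A common realization of such sum-and-difference normalization is a four-quadrant detector whose segment intensities yield power-independent error signals; the sketch below is an illustrative assumption about the scheme, not the paper's exact circuit.

```python
def quadrant_errors(a, b, c, d):
    # a, b, c, d: intensities on the four quadrants of the detector
    # (a = upper-left, b = upper-right, c = lower-left, d = lower-right).
    total = a + b + c + d                 # normalization keeps the error
    ex = ((b + d) - (a + c)) / total      # signals independent of power
    ey = ((a + b) - (c + d)) / total
    return ex, ey                         # azimuth / elevation error signals

# Spot displaced toward the upper-right: positive ex and ey.
print(quadrant_errors(0.2, 0.4, 0.1, 0.3))   # (0.4, 0.2)
```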

  13. A color fusion method of infrared and low-light-level images based on visual perception

    NASA Astrophysics Data System (ADS)

    Han, Jing; Yan, Minmin; Zhang, Yi; Bai, Lianfa

    2014-11-01

    Color fusion images can be obtained through the fusion of infrared and low-light-level images, and they contain the information of both. Fusion images can help observers to understand multichannel images comprehensively. However, simple fusion may lose target information, because targets are inconspicuous in long-distance infrared and low-light-level images; and if target extraction is adopted blindly, the perception of scene information is seriously affected. To solve this problem, a new fusion method based on visual perception is proposed in this paper. The extraction of visual targets ("what" information) and a parallel processing mechanism are applied to traditional color fusion methods. The infrared and low-light-level color fusion images are achieved based on efficient learning of typical targets. Experimental results show the effectiveness of the proposed method: the fusion images achieved by our algorithm not only improve the detection rate of targets, but also retain the rich natural information of the scenes.

  14. Smart Information System for Gachon University Gil Hospital

    PubMed Central

    Jung, Eun Young; Jeong, Byung Hui; Moon, Byung Chan; Kang, Hyung Wook; Tchah, Hann; Han, Gi Seong; Cheng, Woo Sung; Lee, Young Ho

    2012-01-01

    Objectives In this research, the hospital information system of Gachon University Gil hospital is introduced and a future strategy for hospital information systems is proposed. Methods This research describes the development of the hospital information system at Gachon University Gil hospital, including the enterprise resource planning (ERP) system, a medical service process improvement system, and the personal health record (PHR) system. Results The medical service process and work efficiency were improved through the medical service process improvement system, the most widely used hospital information system at Gachon University Gil hospital, which includes an emergency medical service system, an online evaluation system and a round support system. Conclusions Gachon University Gil hospital developed medical service improvement systems to increase the work efficiency of medical teams and optimized the systems to provide high-quality medical services for patients and their families. The PHR-based personalized health care solution is under development and will provide higher-quality medical service for more patients in the future. PMID:22509476

  15. Developments at the Advanced Design Technologies Testbed

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    2003-01-01

    A report presents background and historical information, as of August 1998, on the Advanced Design Technologies Testbed (ADTT) at Ames Research Center. The ADTT is characterized as an activity initiated to facilitate improvements in aerospace design processes; provide a proving ground for product-development methods and computational software and hardware; develop bridging methods, software, and hardware that can facilitate integrated solutions to design problems; and disseminate lessons learned to the aerospace and information technology communities.

  16. [Technologies for Complex Intelligent Clinical Data Analysis].

    PubMed

    Baranov, A A; Namazova-Baranova, L S; Smirnov, I V; Devyatkin, D A; Shelmanov, A O; Vishneva, E A; Antonova, E V; Smirnov, V I

    2016-01-01

    The paper presents a system for the intelligent analysis of clinical information. The authors describe methods implemented in the system for clinical information retrieval, intelligent diagnostics of chronic diseases, ranking the importance of patients' features, and detection of hidden dependencies between features. Results of the experimental evaluation of these methods are also presented. Healthcare facilities generate a large flow of both structured and unstructured data containing important information about patients. Test results are usually retained as structured data, but some data are retained in the form of natural-language texts (medical history, the results of physical examination, and the results of other examinations, such as ultrasound, ECG or X-ray studies). Many tasks arising in clinical practice can be automated by applying methods for intelligent analysis of the accumulated structured and unstructured data, leading to improvement in healthcare quality. The goal of the work was the creation of a complex system for intelligent data analysis in a multi-disciplinary pediatric center. The authors propose methods for information extraction from clinical texts in Russian, carried out on the basis of deep linguistic analysis. The methods retrieve terms for diseases, symptoms, areas of the body and drugs, and can recognize additional attributes such as "negation" (the disease is absent), "no patient" (the disease refers to the patient's family member rather than to the patient), "severity of illness", "disease course", and "body region to which the disease refers". The authors use a set of hand-crafted templates and various techniques based on machine learning to retrieve information using a medical thesaurus. The extracted information is used to solve the problem of automatic diagnosis of chronic diseases. A machine learning method for classification of patients with similar nosology and a method for determining the most informative patient features are also proposed. The authors processed anonymized health records from the pediatric center to evaluate the proposed methods; the results show the applicability of the information extracted from the texts for solving practical problems. The records of patients with allergic, glomerular and rheumatic diseases were used for the experimental assessment of the automatic diagnosis method. The authors also determined the most appropriate machine learning methods for classification of patients for each group of diseases, as well as the most informative disease signs. It was found that using additional information extracted from clinical texts, together with structured data, helps to improve the quality of diagnosis of chronic diseases. Pattern combinations of disease signs were also obtained. The proposed methods have been implemented in the intelligent data processing system of a multidisciplinary pediatric center, and the experimental results show the ability of the system to improve the quality of pediatric healthcare.

  17. A Method for the Study of Human Factors in Aircraft Operations

    NASA Technical Reports Server (NTRS)

    Barnhart, W.; Billings, C.; Cooper, G.; Gilstrap, R.; Lauber, J.; Orlady, H.; Puskas, B.; Stephens, W.

    1975-01-01

    A method for the study of human factors in the aviation environment is described. A conceptual framework is provided within which pilot and other human errors in aircraft operations may be studied with the intent of finding out how, and why, they occurred. An information processing model of human behavior serves as the basis for the acquisition and interpretation of information relating to occurrences which involve human error. A systematic method of collecting such data is presented and discussed. The classification of the data is outlined.

  18. Light Weight MP3 Watermarking Method for Mobile Terminals

    NASA Astrophysics Data System (ADS)

    Takagi, Koichi; Sakazawa, Shigeyuki; Takishima, Yasuhiro

    This paper proposes a novel MP3 watermarking method which is applicable to mobile terminals with limited computational resources. Considering that in most cases the embedded information is copyright information or metadata, which should be extracted before playing back the audio contents, the watermark detection process should execute at high speed. However, when conventional methods are used on a mobile terminal, it takes a considerable amount of time to detect a digital watermark. This paper focuses on scalefactor manipulation to enable high-speed watermark embedding/detection for MP3 audio, and also proposes a manipulation method that adaptively minimizes audio quality degradation. Evaluation tests showed that the proposed method is capable of embedding 3 bits/frame of information without degrading audio quality and of detecting it at very high speed. Finally, this paper describes application examples for authentication with a digital signature.
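
    As a toy illustration of scalefactor-based embedding — assuming a simple parity rule rather than the paper's adaptive, quality-preserving scheme — one bit per frame could be written and read as follows:

```python
def embed_bits(scalefactors, bits):
    # One scalefactor per frame: force its parity to encode the bit.
    out = list(scalefactors)
    for i, bit in enumerate(bits):
        if out[i] % 2 != bit:
            out[i] += 1          # minimal change to flip the parity
    return out

def extract_bits(scalefactors, n):
    # Detection is a single parity read per frame, hence very fast.
    return [sf % 2 for sf in scalefactors[:n]]

marked = embed_bits([41, 38, 50, 47], [1, 0, 1, 1])
assert extract_bits(marked, 4) == [1, 0, 1, 1]
```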

  19. MAVEN-SA: Model-Based Automated Visualization for Enhanced Situation Awareness

    DTIC Science & Technology

    2005-11-01

    [Fragmentary report excerpt: the recoverable text notes that informal "how-to" design methods become systematized and codified as arts evolve (e.g., the development and refinement of color theory), that visual-language theory suggests how humans process graphical information when drawing inferences to support decision making, and that many types of software for student learning have been informed by theories of learning.]

  20. Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity.

    PubMed

    Lizier, Joseph T; Heinzle, Jakob; Horstmann, Annette; Haynes, John-Dylan; Prokopenko, Mikhail

    2011-02-01

    The human brain undertakes highly sophisticated information processing facilitated by the interaction between its sub-regions. We present a novel method for interregional connectivity analysis, using multivariate extensions to the mutual information and transfer entropy. The method allows us to identify the underlying directed information structure between brain regions, and how that structure changes according to behavioral conditions. This method is distinguished in using asymmetric, multivariate, information-theoretical analysis, which captures not only directional and non-linear relationships, but also collective interactions. Importantly, the method is able to estimate multivariate information measures with only relatively little data. We demonstrate the method to analyze functional magnetic resonance imaging time series to establish the directed information structure between brain regions involved in a visuo-motor tracking task. Importantly, this results in a tiered structure, with known movement planning regions driving visual and motor control regions. Also, we examine the changes in this structure as the difficulty of the tracking task is increased. We find that task difficulty modulates the coupling strength between regions of a cortical network involved in movement planning and between motor cortex and the cerebellum which is involved in the fine-tuning of motor control. It is likely these methods will find utility in identifying interregional structure (and experimentally induced changes in this structure) in other cognitive tasks and data modalities.
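
    For concreteness, a minimal bivariate, discretized transfer entropy estimator is sketched below; the paper's method is multivariate and tailored to fMRI time series, so this shows only the core quantity, not the full analysis.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    # Estimate TE from x to y (in bits) by discretizing both series
    # and counting (y_next, y_past, x_past) triples.
    xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    triples = Counter(zip(yd[1:], yd[:-1], xd[:-1]))
    pairs_yx = Counter(zip(yd[:-1], xd[:-1]))
    pairs_yy = Counter(zip(yd[1:], yd[:-1]))
    singles = Counter(yd[:-1])
    n = len(yd) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n                               # p(y1, y0, x0)
        p_cond_full = c / pairs_yx[(y0, x0)]          # p(y1 | y0, x0)
        p_cond_self = pairs_yy[(y1, y0)] / singles[y0]  # p(y1 | y0)
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te
```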

  1. Directional dual-tree complex wavelet packet transforms for processing quadrature signals.

    PubMed

    Serbes, Gorkem; Gulcur, Halil Ozcan; Aydin, Nizamettin

    2016-03-01

    Quadrature signals containing in-phase and quadrature-phase components are used in many signal processing applications in every field of science and engineering. Specifically, Doppler ultrasound systems used to evaluate cardiovascular disorders noninvasively also result in quadrature format signals. In order to obtain directional blood flow information, the quadrature outputs have to be preprocessed using methods such as asymmetrical and symmetrical phasing filter techniques. These resultant directional signals can be employed in order to detect asymptomatic embolic signals caused by small emboli, which are indicators of a possible future stroke, in the cerebral circulation. Various transform-based methods such as Fourier and wavelet were frequently used in processing embolic signals. However, most of the times, the Fourier and discrete wavelet transforms are not appropriate for the analysis of embolic signals due to their non-stationary time-frequency behavior. Alternatively, discrete wavelet packet transform can perform an adaptive decomposition of the time-frequency axis. In this study, directional discrete wavelet packet transforms, which have the ability to map directional information while processing quadrature signals and have less computational complexity than the existing wavelet packet-based methods, are introduced. The performances of proposed methods are examined in detail by using single-frequency, synthetic narrow-band, and embolic quadrature signals.
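
    The directional preprocessing mentioned above — separating forward and reverse flow from the in-phase/quadrature pair — can be sketched as a frequency-domain phasing filter. This illustrates the standard preprocessing step, not the proposed dual-tree wavelet packet transforms:

```python
import numpy as np

def separate_directions(i_ch, q_ch):
    # Complex quadrature signal: positive frequencies correspond to flow
    # toward the probe, negative frequencies to flow away from it.
    s = np.asarray(i_ch) + 1j * np.asarray(q_ch)
    spectrum = np.fft.fft(s)
    freqs = np.fft.fftfreq(len(s))
    forward = np.fft.ifft(np.where(freqs > 0, spectrum, 0))
    reverse = np.fft.ifft(np.where(freqs < 0, spectrum, 0))
    return forward, reverse
```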

  2. Business Process-Based Resource Importance Determination

    NASA Astrophysics Data System (ADS)

    Fenz, Stefan; Ekelhart, Andreas; Neubauer, Thomas

    Information security risk management (ISRM) heavily depends on realistic impact values representing the resources’ importance in the overall organizational context. Although a variety of ISRM approaches have been proposed, well-founded methods are still missing that answer the following question: how can business processes be used to determine resources’ importance in the overall organizational context? We answer this question by measuring the actual importance level of resources based on business processes. This paper therefore presents our novel business process-based resource importance determination method, which provides ISRM with an efficient and powerful tool for deriving realistic resource importance figures solely from existing business processes. The conducted evaluation has shown that the calculation results of the developed method comply with the results gained in traditional workshop-based assessments.
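
    A minimal sketch of the underlying idea — deriving a resource's importance from the criticality of the business processes that depend on it — might look as follows; the aggregation rule (taking the maximum) is an assumption, not the paper's exact formula.

```python
def resource_importance(processes):
    # processes: list of (criticality, resources_used) pairs, where
    # criticality is the process's importance to the organization.
    importance = {}
    for criticality, resources in processes:
        for r in resources:
            # A resource inherits the criticality of the most critical
            # business process that depends on it.
            importance[r] = max(importance.get(r, 0.0), criticality)
    return importance

procs = [(0.9, ["ERP server", "LDAP"]), (0.4, ["print server", "LDAP"])]
print(resource_importance(procs))   # LDAP inherits 0.9
```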

  3. Using formative research to lay the foundation for community level HIV prevention efforts: an example from the AIDS Community Demonstration Projects.

    PubMed Central

    Higgins, D L; O'Reilly, K; Tashima, N; Crain, C; Beeker, C; Goldbaum, G; Elifson, C S; Galavotti, C; Guenther-Grey, C

    1996-01-01

    The AIDS Community Demonstration Projects provided community-level HIV prevention interventions to historically hard-to-reach groups at high risk for HIV infection. The projects operated under a common research protocol which encompassed formative research, intervention delivery, process evaluation, and outcome evaluation. A formative research process specifically focusing on intervention development was devised to assist project staff in identifying, prioritizing, accessing, and understanding the intervention target groups. This process was central to the creation of interventions that were acceptable and unique to the target populations. Intended to be rapid, the process took 6 months to complete. Drawn from the disciplines of anthropology, community psychology, sociology, and public health, the formative research process followed distinct steps which included (a) defining the populations at high-risk for HIV; (b) gathering information about these populations through interviews with persons who were outside of, but who had contact with, the target groups (such as staff from the health department and alcohol and drug treatment facilities, as well as persons who interacted in an informal manner with the target groups, such as clerks in neighborhood grocery stores and bartenders); (c) interviewing people with access to the target populations (gatekeepers), and conducting observations in areas where these high-risk groups were reported to gather (from previous interviews); (d) interviewing members of these groups at high risk for HIV infection or transmission; and (e) systematically integrating information throughout the process. Semistructured interview schedules were used for all data collection in this process. This standardized systematic method yielded valuable information about the focal groups in each demonstration project site. The method, if adopted by others, would assist community intervention specialists in developing interventions that are culturally appropriate and meaningful to their respective target populations. PMID:8862154

  4. Methods and Systems for Advanced Spaceport Information Management

    NASA Technical Reports Server (NTRS)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  5. Methods and systems for advanced spaceport information management

    NASA Technical Reports Server (NTRS)

    Ely, Donald W. (Inventor); Fussell, Ronald M. (Inventor); Halpin, Paul C. (Inventor); Blackwell-Thompson, Charlie (Inventor); Meier, Gary M. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  6. Baselining the New GSFC Information Systems Center: The Foundation for Verifiable Software Process Improvement

    NASA Technical Reports Server (NTRS)

    Parra, A.; Schultz, D.; Boger, J.; Condon, S.; Webby, R.; Morisio, M.; Yakimovich, D.; Carver, J.; Stark, M.; Basili, V.

    1999-01-01

    This paper describes a study performed at the Information System Center (ISC) in NASA Goddard Space Flight Center. The ISC was set up in 1998 as a core competence center in information technology. The study aims at characterizing people, processes and products of the new center, to provide a basis for proposing improvement actions and comparing the center before and after these actions have been performed. The paper presents the ISC, goals and methods of the study, results and suggestions for improvement, through the branch-level portion of this baselining effort.

  7. [Medical data warehousing as a generator of system component for decision support in health care].

    PubMed

    Catibusić, Sulejman; Hadzagić-Catibusić, Feriha; Zubcević, Smail

    2004-01-01

    The growth in the role of data warehousing as strategic information for decision makers is significant. Many health institutions have data warehouse implementations in development or even in production. This article was written with the intention of improving the general understanding of data warehousing requirements from the point of view of end-users as well as of the information system. For that reason, this document describes the advantages of and arguments for implementation, techniques and methods of data warehousing, data warehouse foundations, and the exploration of information as the final product of the data warehousing process.

  8. Real-Time Nonlinear Optical Information Processing.

    DTIC Science & Technology

    1979-06-01

    operations are presented. One approach realizes the halftone method of nonlinear optical processing in real time by replacing the conventional photographic recording medium with a real-time image transducer. In the second approach, halftoning is eliminated and the real-time device is used directly.

  9. A new EEG synchronization strength analysis method: S-estimator based normalized weighted-permutation mutual information.

    PubMed

    Cui, Dong; Pu, Weiting; Liu, Jing; Bian, Zhijie; Li, Qiuli; Wang, Lei; Gu, Guanghua

    2016-10-01

    Synchronization is an important mechanism for understanding information processing in normal or abnormal brains. In this paper, we propose a new method called normalized weighted-permutation mutual information (NWPMI) for two-variable signal synchronization analysis, and combine NWPMI with the S-estimator measure to generate a new method named S-estimator based normalized weighted-permutation mutual information (SNWPMI) for analyzing multi-channel electroencephalographic (EEG) synchronization strength. The performance of the NWPMI, including the effects of time delay, embedding dimension, coupling coefficients, signal-to-noise ratios (SNRs) and data length, is evaluated using a coupled Henon mapping model. The results show that the NWPMI is superior in describing synchronization compared with the normalized permutation mutual information (NPMI). Furthermore, the proposed SNWPMI method is applied to analyze scalp EEG data from 26 amnestic mild cognitive impairment (aMCI) subjects and 20 age-matched controls with normal cognitive function, all of whom suffer from type 2 diabetes mellitus (T2DM). The proposed NWPMI and SNWPMI methods are suggested to be effective indices for estimating synchronization strength. Copyright © 2016 Elsevier Ltd. All rights reserved.
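
    A simplified, unweighted sketch of ordinal-pattern mutual information between two channels is given below; the weighting and the S-estimator aggregation of the proposed SNWPMI are omitted.

```python
import numpy as np
from collections import Counter

def ordinal_symbols(x, m=3, tau=1):
    # Map each length-m embedding vector to its permutation pattern.
    x = np.asarray(x)
    idx = np.arange(0, (m - 1) * tau + 1, tau)
    windows = [x[i + idx] for i in range(len(x) - idx[-1])]
    return [tuple(np.argsort(w)) for w in windows]

def permutation_mutual_information(x, y, m=3, tau=1):
    # Unweighted permutation mutual information (bits) of two signals.
    sx, sy = ordinal_symbols(x, m, tau), ordinal_symbols(y, m, tau)
    n = min(len(sx), len(sy))
    pxy = Counter(zip(sx[:n], sy[:n]))
    px, py = Counter(sx[:n]), Counter(sy[:n])
    return sum((c / n) * np.log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())
```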

  10. Display methods of electronic patient record screens: patient privacy concerns.

    PubMed

    Niimi, Yukari; Ota, Katsumasa

    2013-01-01

    To provide adequate care, medical professionals have to collect not only medical information but also information that may be related to private aspects of the patient's life. With patients' increasing awareness of information privacy, healthcare providers have to pay attention to patients' right to privacy. This study aimed to clarify the requirements for the display method of electronic patient record (EPR) screens in consideration of both patients' information privacy concerns and health professionals' information needs. For this purpose, semi-structured group interviews were conducted with 78 medical professionals. They pointed out that partial concealment of information to meet patients' requests for privacy could create challenges in (1) safety in healthcare, (2) information sharing, (3) collaboration, (4) hospital management, and (5) communication. They believed that EPRs should (1) meet the requirements of the therapeutic process, (2) have restricted access, (3) provide convenient access to necessary information, and (4) facilitate interprofessional collaboration. This study provides direction for the development of display methods that balance the sharing of vital information and the protection of patient privacy.

  11. Health technology assessment process of a cardiovascular medical device in four different settings.

    PubMed

    Olry de Labry Lima, Antonio; Espín Balbino, Jaime; Lemgruber, Alexandre; Caro Martínez, Araceli; García-Mochón, Leticia; Martín Ruiz, Eva; Lessa, Fernanda

    2017-10-01

    Health technology assessment (HTA) is a tool to help the decision-making process. The aim is to describe the methods and processes used in reimbursement decision-making for drug-eluting stents (DES) in four different settings. DES was selected as the technology under study according to criteria agreed by a working group. A survey of key informants was designed. DES was evaluated following well-structured HTA processes. Nonetheless, scope for improvement was observed in the data considered for the final decision, the transparency and inclusiveness of the process, and the methods employed. The study thus describes the HTA processes applied to a well-known medical device.

  12. Content-Based Discovery for Web Map Service using Support Vector Machine and User Relevance Feedback

    PubMed Central

    Cheng, Xiaoqiang; Qi, Kunlun; Zheng, Jie; You, Lan; Wu, Huayi

    2016-01-01

    Many discovery methods for geographic information services have been proposed, including approaches for finding and matching geographic information services, methods for constructing geographic information service classification schemes, and automatic geographic information discovery. Overall, the efficiency of geographic information discovery keeps improving. There are, however, still two problems in Web Map Service (WMS) discovery that must be solved. Mismatches between the graphic contents of a WMS and the semantic descriptions in the metadata make discovery difficult for human users, and end-users and computers comprehend WMSs differently, creating semantic gaps in human-computer interactions. To address these problems, we propose an improved query process for WMSs based on the graphic contents of WMS layers, combining a Support Vector Machine (SVM) and user relevance feedback. Our experiments demonstrate that the proposed method can improve the accuracy and efficiency of WMS discovery. PMID:27861505

  13. Content-Based Discovery for Web Map Service using Support Vector Machine and User Relevance Feedback.

    PubMed

    Hu, Kai; Gui, Zhipeng; Cheng, Xiaoqiang; Qi, Kunlun; Zheng, Jie; You, Lan; Wu, Huayi

    2016-01-01

    Many discovery methods for geographic information services have been proposed, including approaches for finding and matching geographic information services, methods for constructing geographic information service classification schemes, and automatic geographic information discovery. Overall, the efficiency of geographic information discovery keeps improving. There are, however, still two problems in Web Map Service (WMS) discovery that must be solved. Mismatches between the graphic contents of a WMS and the semantic descriptions in the metadata make discovery difficult for human users, and end-users and computers comprehend WMSs differently, creating semantic gaps in human-computer interactions. To address these problems, we propose an improved query process for WMSs based on the graphic contents of WMS layers, combining a Support Vector Machine (SVM) and user relevance feedback. Our experiments demonstrate that the proposed method can improve the accuracy and efficiency of WMS discovery.
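
    The two records above describe the same approach. A schematic of its core loop — an SVM classifier over layer-image content features, refined by user relevance feedback — could look like the following, where the feature extraction and data shapes are illustrative assumptions:

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical content features (e.g., color histograms) of WMS layers.
X_train = np.random.rand(200, 64)
y_train = np.random.randint(0, 2, 200)     # 1 = relevant to the query topic

clf = SVC(kernel="rbf", probability=True).fit(X_train, y_train)

def refine_with_feedback(clf, X_train, y_train, X_fb, y_fb):
    # Fold the user's relevance judgments back into the training set
    # and retrain, so the next ranking reflects the feedback.
    X_new = np.vstack([X_train, X_fb])
    y_new = np.concatenate([y_train, y_fb])
    return clf.fit(X_new, y_new), X_new, y_new

X_candidates = np.random.rand(50, 64)
scores = clf.predict_proba(X_candidates)[:, 1]   # rank layers by relevance
```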

  14. A Framework for Integrating Environmental Justice in Regulatory Analysis

    PubMed Central

    Nweke, Onyemaechi C.

    2011-01-01

    With increased interest in integrating environmental justice into the process for developing environmental regulations in the United States, analysts and decision makers are confronted with the question of what methods and data can be used to assess disproportionate environmental health impacts. However, as a first step to identifying data and methods, it is important that analysts understand what information on equity impacts is needed for decision making. Such knowledge originates from clearly stated equity objectives and the reflection of those objectives throughout the analytical activities that characterize Regulatory Impact Analysis (RIA), a process that is traditionally used to inform decision making. The framework proposed in this paper advocates structuring analyses to explicitly provide pre-defined output on equity impacts. Specifically, the proposed framework emphasizes: (a) defining equity objectives for the proposed regulatory action at the onset of the regulatory process, (b) identifying specific and related sub-objectives for key analytical steps in the RIA process, and (c) developing explicit analytical/research questions to assure that stated sub-objectives and objectives are met. In proposing this framework, it is envisioned that information on equity impacts informs decision-making in regulatory development, and that this is achieved through a systematic and consistent approach that assures linkages between stated equity objectives, regulatory analyses, selection of policy options, and the design of compliance and enforcement activities. PMID:21776235

  15. Patterns-Based IS Change Management in SMEs

    NASA Astrophysics Data System (ADS)

    Makna, Janis; Kirikova, Marite

    The majority of information systems change management guidelines and standards are either too abstract or too bureaucratic to be easily applicable in small enterprises. This chapter proposes an approach, a method, and a prototype designed especially for information systems change management in small and medium enterprises. The approach is based on proven patterns of changes in the set of information system elements. The set of elements was obtained by theoretical analysis of information systems and business process definitions and enterprise architectures. The patterns were derived from a number of information systems theories and tested in 48 information systems change management projects. The prototype presents and helps to handle three basic change patterns, which help to anticipate the overall scope of changes related to particular elementary changes in an enterprise information system. Use of the prototype requires only basic knowledge of organizational business processes and information management.

  16. Adding Semantics and OPM Ontology for the Provenance of Multi-sensor Merged Climate Data Records. Now What About Reproducibility?

    NASA Astrophysics Data System (ADS)

    Hua, H.; Wilson, B. D.; Manipon, G.; Pan, L.; Fetzer, E.

    2011-12-01

    Multi-decadal climate data records are critical to studying climate variability and change, and often require merging data from multiple instruments, such as those from NASA's A-Train, whose measurements cover a wide range of atmospheric conditions and phenomena. Multi-decadal climate data records of water vapor measurements from sensors on A-Train, operational weather, and other satellites are being assembled from existing data sources, or produced with well-established methods published in the peer-reviewed literature. However, the immense volume and inhomogeneity of the data often require an "exploratory computing" approach to product generation, in which data are processed in a variety of ways, with varying algorithms, parameters, and code changes, until an acceptable intermediate product is generated; this is repeated until a desirable final merged product can be generated. The production legacy is typically lost due to the complexity of the processing steps tried along the way: the product information associated with source data, processing methods, parameters used, intermediate outputs, and associated materials is often hidden in each of the trials and scattered throughout the processing system(s). We will discuss methods to help users better capture and explore the production legacy of the data, metadata, ancillary files, code, and computing environment changes used during the production of these merged, multi-sensor data products. By leveraging existing semantic and provenance tools, we can capture sufficient information to enable users to track, perform faceted searches over, and visualize the provenance of the products and their processing lineage. We will explore whether sufficient provenance information can be captured to enable scientific reproducibility of these climate data records.
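
    As a hedged sketch of the provenance-capture step, the lineage of one merge trial can be recorded as a W3C PROV document (PROV is the successor of OPM) with the Python `prov` package; the entity and activity names below are invented for illustration:

        from prov.model import ProvDocument

        doc = ProvDocument()
        doc.add_namespace("ex", "http://example.org/climate#")

        src_a = doc.entity("ex:airs_watervapor_granule")   # input data source
        src_b = doc.entity("ex:amsre_watervapor_granule")  # input data source
        code = doc.entity("ex:merge_script_rev42")         # exact code revision
        merge = doc.activity("ex:merge_trial_7")           # one processing trial
        out = doc.entity("ex:merged_watervapor_trial_7")   # intermediate product

        doc.used(merge, src_a)
        doc.used(merge, src_b)
        doc.used(merge, code)
        doc.wasGeneratedBy(out, merge)

        print(doc.get_provn())    # human-readable PROV-N serialization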

  17. Uncovering the essential links in online commercial networks

    NASA Astrophysics Data System (ADS)

    Zeng, Wei; Fang, Meiling; Shao, Junming; Shang, Mingsheng

    2016-09-01

    Recommender systems are designed to effectively support individuals' decision-making process on various web sites. Such a system can be naturally represented by a user-object bipartite network, where a link indicates that a user has collected an object. Recently, research on the information backbone, a sub-network with fewer nodes and links that nevertheless carries most of the relevant information, has attracted researchers' interest. With the backbone, a system can generate satisfactory recommendations while saving considerable computing resources. In this paper, we propose an enhanced topology-aware method to extract the information backbone of the bipartite network, based mainly on information about neighboring users and objects. Our backbone extraction method enables recommender systems to achieve more than 90% of the accuracy of the top-L recommendation while consuming only 20% of the links. The experimental results show that our method outperforms alternative backbone extraction methods. Moreover, the structure of the information backbone is studied in detail. Finally, we highlight that the information backbone is one of the most important properties of the bipartite network, with which one can significantly improve the efficiency of a recommender system.
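
    A toy illustration of backbone extraction on a user-object bipartite network: score every link with a simple neighborhood statistic (here the inverse product of the endpoint degrees, a common proxy for link importance; the paper's actual topology-aware score is more elaborate) and keep the top 20% of links:

        from collections import defaultdict

        links = [("u1", "o1"), ("u1", "o2"), ("u2", "o1"), ("u2", "o3"),
                 ("u3", "o2"), ("u3", "o3"), ("u3", "o1"), ("u4", "o3")]

        deg = defaultdict(int)                 # degree of every user and object
        for u, o in links:
            deg[u] += 1
            deg[o] += 1

        scored = sorted(links, reverse=True,
                        key=lambda e: 1.0 / (deg[e[0]] * deg[e[1]]))
        keep = max(1, int(0.2 * len(links)))   # retain roughly 20% of the links
        backbone = scored[:keep]
        print(backbone)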

  18. Candidate Causes. Sediments. In: Causal Analysis, Diagnosis Decision Information System, USEPA Website

    EPA Science Inventory

    CADDIS is an online application that helps scientists and engineers in the Regions, States, and Tribes find, access, organize, use, and share information to conduct causal evaluations in aquatic systems. It is based on the USEPA stressor identification process, a formal method fo...

  19. Structure and Randomness of Continuous-Time, Discrete-Event Processes

    NASA Astrophysics Data System (ADS)

    Marzen, Sarah E.; Crutchfield, James P.

    2017-10-01

    Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ε-machines of hidden semi-Markov processes) and new information-theoretic methods to stochastic processes.
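
    For orientation, in the discrete-time setting the two quantities read (a standard simplification of the continuous-time case treated in the paper):

        h_\mu = \lim_{n \to \infty} H[X_n \mid X_1, \ldots, X_{n-1}],
        \qquad
        C_\mu = H[\mathcal{S}] = -\sum_{\sigma \in \mathcal{S}} \Pr(\sigma) \log_2 \Pr(\sigma),

    where the X_i are observed symbols and \mathcal{S} is the set of causal states of the process's ε-machine; the paper extends both quantities to continuous time.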

  20. Analysis of cutting force signals by wavelet packet transform for surface roughness monitoring in CNC turning

    NASA Astrophysics Data System (ADS)

    García Plaza, E.; Núñez López, P. J.

    2018-01-01

    On-line monitoring of surface finish in machining processes has proven to be a substantial advancement over traditional post-process quality control techniques by reducing inspection times and costs and by avoiding the manufacture of defective products. This study applied techniques for processing cutting force signals based on the wavelet packet transform (WPT) method for the monitoring of surface finish in computer numerical control (CNC) turning operations. The behaviour of 40 mother wavelets was analysed using three techniques: global packet analysis (G-WPT), and the application of two packet reduction criteria: maximum energy (E-WPT) and maximum entropy (SE-WPT). The optimum signal decomposition level (Lj) was determined to eliminate noise and to obtain information correlated to surface finish. The results obtained with the G-WPT method provided an in-depth analysis of cutting force signals, and frequency ranges and signal characteristics were correlated to surface finish with excellent results in the accuracy and reliability of the predictive models. The radial and tangential cutting force components at low frequency provided most of the information for the monitoring of surface finish. The E-WPT and SE-WPT packet reduction criteria substantially reduced signal processing time, but at the expense of discarding packets with relevant information, which impoverished the results. The G-WPT method was observed to be an ideal procedure for processing cutting force signals applied to the real-time monitoring of surface finish, and was estimated to be highly accurate and reliable at a low analytical-computational cost.
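
    A hedged sketch of the packet-ranking idea (E-WPT/SE-WPT) with PyWavelets: decompose a simulated cutting-force signal into terminal wavelet packets, then rank packets by energy and compute a global packet entropy. The signal, wavelet, and level are illustrative, not the paper's settings:

        import numpy as np
        import pywt

        rng = np.random.default_rng(1)
        t = np.linspace(0, 1, 2048)
        force = np.sin(2 * np.pi * 50 * t) + 0.3 * rng.normal(size=t.size)

        wp = pywt.WaveletPacket(data=force, wavelet="db4",
                                mode="symmetric", maxlevel=4)
        nodes = wp.get_level(4, order="natural")    # 16 terminal packets

        energies = np.array([np.sum(n.data ** 2) for n in nodes])
        p = energies / energies.sum()
        entropy = -np.sum(p * np.log2(p + 1e-12))   # packet entropy

        print(nodes[int(np.argmax(energies))].path, entropy)  # E-WPT pick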

  1. Research on Quantum Authentication Methods for the Secure Access Control Among Three Elements of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Dong, Yumin; Xiao, Shufen; Ma, Hongyang; Chen, Libo

    2016-12-01

    Cloud computing and big data have become the driving engines of current information technology (IT) as a result of its rapid development. However, security protection has become increasingly important for cloud computing and big data, and it is a problem that must be solved for cloud computing to develop further. The theft of identity authentication information remains a serious threat to the security of cloud computing: attackers who intrude into cloud computing services through stolen identity authentication information threaten the security of data from multiple perspectives. Therefore, this study proposes a model for cloud computing protection and management based on quantum authentication, introduces the principle of quantum authentication, and derives the quantum authentication process. In theory, quantum authentication technology can be applied in cloud computing for security protection. Because quantum states cannot be cloned, the approach is more secure and reliable than classical methods.

  2. Method of developing all-optical trinary JK, D-type, and T-type flip-flops using semiconductor optical amplifiers.

    PubMed

    Garai, Sisir Kumar

    2012-04-10

    To meet the demand of very fast and agile optical networks, the optical processors in a network system should have a very fast execution rate, large information handling, and large information storage capacities. Multivalued logic operations and multistate optical flip-flops are the basic building blocks for such fast running optical computing and data processing systems. In the past two decades, many methods of implementing all-optical flip-flops have been proposed. Most of these suffer from speed limitations because of the low switching response of active devices. The frequency encoding technique has been used because of its many advantages. It can preserve its identity throughout data communication irrespective of loss of light energy due to reflection, refraction, attenuation, etc. The action of polarization-rotation-based very fast switching of semiconductor optical amplifiers increases processing speed. At the same time, tristate optical flip-flops increase information handling capacity.

  3. Quantifying quantum coherence with quantum Fisher information.

    PubMed

    Feng, X N; Wei, L F

    2017-11-14

    Quantum coherence is one of the oldest and most important concepts in quantum mechanics, and it is now regarded as a necessary resource for quantum information processing and quantum metrology. However, the question of how to quantify quantum coherence has received attention only recently (see, e.g., Baumgratz et al., PRL 113, 140401 (2014)). In this paper we verify that the well-known quantum Fisher information (QFI) can be utilized to quantify quantum coherence, as it satisfies monotonicity under typical incoherent operations and convexity under mixing of quantum states. Unlike most purely axiomatic methods, quantifying quantum coherence by the QFI is experimentally testable, as bounds on the QFI are practically measurable. The validity of our proposal is demonstrated with the typical phase-damping and depolarizing evolution processes of a generic single-qubit state, and also by comparison with previously proposed quantification methods.
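
    For reference, for a state ρ = Σ_i λ_i |i⟩⟨i| and a phase generator H, the QFI has the standard closed form

        F_Q(\rho, H) = 2 \sum_{\lambda_i + \lambda_j > 0}
        \frac{(\lambda_i - \lambda_j)^2}{\lambda_i + \lambda_j}\,
        |\langle i | H | j \rangle|^2,

    which is the quantity the proposed coherence measure builds on (the specific choice of incoherent basis is the paper's).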

  4. A simple method to identify areas of environmental risk due to manure application.

    PubMed

    Flores, Héctor; Arumí, José Luis; Rivera, Diego; Lagos, L Octavio

    2012-06-01

    The management of swine manure is becoming an important environmental issue in Chile. One option for the final disposal of manure is to use it as a biofertilizer, but this practice could impact the surrounding environment. To assess the potential environmental impacts of the use of swine manure as a biofertilizer, we propose a method to identify zones of environmental risk through indices. The method considers two processes: nutrient runoff and solute leaching, and uses available information about soils, crops and management practices (irrigation, fertilization, and rotation). We applied the method to qualitatively assess the environmental risk associated with the use of swine manure as a biofertilizer in an 8,000-pig farm located in Central Chile. Results showed that the farm has a moderate environmental risk, but some specific locations have high environmental risks, especially those associated with impacts on areas surrounding water resources. This information could assist the definition of better farm-level management practices, as well as the preservation of riparian vegetation acting as buffer strips. The main advantage of our approach is that it combines qualitative and quantitative information, including particular situations or field features based on expert knowledge. The method is flexible, simple, and can be easily extended or adapted to other processes.

  5. 42 CFR 82.30 - How will NIOSH inform the public of any plans to change scientific elements underlying the dose...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... HEALTH AND HUMAN SERVICES OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES METHODS FOR... change scientific elements underlying the dose reconstruction process to maintain methods reasonably... methods reasonably current with scientific progress? Periodically, NIOSH will publish a notice in the...

  6. Treatment Methods for Kidney Failure: Peritoneal Dialysis

    MedlinePlus

  7. Applicability and Limitations of Reliability Allocation Methods

    NASA Technical Reports Server (NTRS)

    Cruz, Jose A.

    2016-01-01

    The reliability allocation process may be described as the process of assigning reliability requirements to individual components within a system to attain the specified system reliability. For large systems, allocation is often performed at different stages of system design, beginning at the conceptual stage; as the design develops and more information about components and the operating environment becomes available, different allocation methods can be considered. Reliability allocation methods are usually divided into two categories: weighting factors and optimal reliability allocation. When properly applied, these methods can produce reasonable approximations. Reliability allocation techniques have limitations and implied assumptions that need to be understood by system engineers; applying them without understanding their limitations and assumptions can produce unrealistic results. This report addresses weighting factors and optimal reliability allocation techniques, and identifies the applicability and limitations of each.
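
    For a series system, the weighting-factor idea reduces to apportioning a system reliability target R_s among components as R_i = R_s^{w_i} with Σ w_i = 1, so the component targets multiply back to R_s. A minimal sketch with invented weights:

        R_s = 0.95                    # system reliability target (series system)
        weights = {"pump": 0.5, "valve": 0.3, "sensor": 0.2}   # sum to 1

        alloc = {name: R_s ** w for name, w in weights.items()}

        product = 1.0
        for r in alloc.values():
            product *= r
        assert abs(product - R_s) < 1e-12   # targets multiply back to R_s
        print(alloc)   # a larger weight yields a looser (lower) target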

  8. Certification for civil flight decks and the human-computer interface

    NASA Technical Reports Server (NTRS)

    Mcclumpha, Andrew J.; Rudisill, Marianne

    1994-01-01

    This paper addresses human factors aspects of civil flight deck certification, with emphasis on the pilot's interface with automation. In particular, three questions are asked that relate to the certification process: (1) are the methods, data, and guidelines available from human factors adequate to address the problems of certifying as safe and error tolerant the complex automated systems of modern civil transport aircraft; (2) do aircraft manufacturers effectively apply human factors information during the aircraft flight deck design process; and (3) do regulatory authorities effectively apply human factors information during the aircraft certification process?

  9. RN, CIO: an executive informatics career.

    PubMed

    Staggers, Nancy; Lasome, Caterina E M

    2005-01-01

    The Chief Information Officer (CIO) position is a viable new career track for clinical informaticists. Nurses, especially informatics nurses, are uniquely positioned for the CIO role because of their operational knowledge of clinical processes, communication skills, systems thinking abilities, and knowledge about information structures and processes. This article describes essential knowledge and skills for the CIO executive position. Competencies not typical to nurses can be learned and developed, particularly strategic visioning and organizational finesse. This article concludes by describing career development steps toward the CIO position: leadership and management; healthcare operations; organizational finesse; and informatics knowledge, processes, methods, and structures.

  10. Visual information processing II; Proceedings of the Meeting, Orlando, FL, Apr. 14-16, 1993

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O. (Editor); Juday, Richard D. (Editor)

    1993-01-01

    Various papers on visual information processing are presented. Individual topics addressed include: aliasing as noise, satellite image processing using a Hamming neural network, an edge-detection method using visual perception, adaptive vector median filters, design of a reading test for low vision, image warping, spatial transformation architectures, an automatic image-enhancement method, redundancy reduction in image coding, lossless gray-scale image compression by predictive GDF, information efficiency in visual communication, optimizing JPEG quantization matrices for different applications, use of forward error correction to maintain image fidelity, and the effect of Peano scanning on image compression. Also discussed are: computer vision for autonomous robotics in space, an optical processor for zero-crossing edge detection, fractal-based image edge detection, simulation of the neon spreading effect by bandpass filtering, the wavelet transform (WT) on parallel SIMD architectures, nonseparable 2D wavelet image representation, adaptive image halftoning based on the WT, wavelet analysis of global warming, use of the WT for signal detection, perfect-reconstruction two-channel rational filter banks, N-wavelet coding for pattern classification, simulation of images of natural objects, and number-theoretic coding for iconic systems.

  11. Information theoretic methods for image processing algorithm optimization

    NASA Astrophysics Data System (ADS)

    Prokushkin, Sergey F.; Galil, Erez

    2015-01-01

    Modern image processing pipelines (e.g., those used in digital cameras) are full of advanced, highly adaptive filters that often have a large number of tunable parameters (sometimes > 100). This makes the calibration procedure for these filters very complex, and optimal results are barely achievable by manual calibration; an automated approach is thus a must. We discuss an information-theory-based metric for evaluating an algorithm's adaptive characteristics (an "adaptivity criterion"), using noise reduction algorithms as an example. The method allows finding an "orthogonal decomposition" of the filter parameter space into "filter adaptivity" and "filter strength" directions. This metric can be used as a cost function in automatic filter optimization. Since it measures physical "information restoration" rather than perceived image quality, it helps reduce the set of filter parameters to a smaller subset that is easier for a human operator to tune to achieve better subjective image quality. With appropriate adjustments, the criterion can be used to assess the whole imaging system (sensor plus post-processing).

  12. Research on Design Information Management System for Leather Goods

    NASA Astrophysics Data System (ADS)

    Lu, Lei; Peng, Wen-li

    The idea of setting up a design information management system for leather goods was put forward to solve the problems existing in the current information management of leather goods. The working principles of the design information management system for leather goods were analyzed in detail: first, the approach for acquiring design information for leather goods was introduced; second, the methods for processing the design information; and third, the management of design information in the database. Finally, the application of the system was discussed, taking shoe products as an example.

  13. Double ionization in R-matrix theory using a two-electron outer region

    NASA Astrophysics Data System (ADS)

    Wragg, Jack; Parker, J. S.; van der Hart, H. W.

    2015-08-01

    We have developed a two-electron outer region for use within R-matrix theory to describe double ionization processes. The capability of this method is demonstrated for single-photon double ionization of He in the photon energy region between 80 and 180 eV. The cross sections are in agreement with established data. The extended R-matrix with time dependence method also provides information on higher-order processes, as demonstrated by the identification of signatures for sequential double ionization processes involving an intermediate He+ state with n = 2.

  14. Increased attention but more efficient disengagement: neuroscientific evidence for defensive processing of threatening health information.

    PubMed

    Kessels, Loes T E; Ruiter, Robert A C; Jansma, Bernadette M

    2010-07-01

    Previous studies indicate that people respond defensively to threatening health information, especially when the information challenges self-relevant goals. The authors investigated whether reduced acceptance of self-relevant health risk information is already visible in early attention processes, that is, in attention disengagement. In a randomized, controlled trial with 29 smoking and nonsmoking students, a variant of Posner's cueing task was used in combination with the high-temporal-resolution method of event-related brain potentials (ERPs); the outcome measures were reaction times and the P300 ERP. Smokers showed lower P300 amplitudes in response to high- as opposed to low-threat invalid trials when moving their attention to a target in the opposite visual field, indicating more efficient attention disengagement. Furthermore, both smokers and nonsmokers showed increased P300 amplitudes in response to high- as opposed to low-threat valid trials, indicating threat-induced attention capture. Reaction time measures did not mirror the ERP data, indicating that ERP measures can be highly informative for assessing low-level attention biases in health communication. The findings provide the first neuroscientific support for the hypothesis that threatening health information causes more efficient disengagement among those for whom the health threat is self-relevant. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  15. Compatibility of Qualitative and Quantitative Methods: Studying Child Sexual Abuse in America.

    ERIC Educational Resources Information Center

    Phelan, Patricia

    1987-01-01

    Illustrates how the combined use of qualitative and quantitative methods was necessary to obtain a clearer understanding of the process of incest in American society. Argues that the exclusive use of one methodology would have obscured important information. (FMW)

  16. Automatic Query Formulations in Information Retrieval.

    ERIC Educational Resources Information Center

    Salton, G.; And Others

    1983-01-01

    Introduces methods designed to reduce the role of search intermediaries by generating Boolean search formulations automatically, using term frequency considerations, from natural language statements provided by system patrons. Experimental results are supplied and methods are described for applying the automatic query formulation process in practice.…
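
    A simplified illustration of the idea: keep the most frequent non-stopword terms of a patron's natural-language statement and join them into a Boolean formulation. The stopword list and thresholds are invented for the sketch:

        import re
        from collections import Counter

        request = ("I need papers about automatic query formulation and "
                   "query expansion for Boolean retrieval systems")
        stop = {"i", "need", "about", "and", "for", "the", "of", "a"}

        terms = [t for t in re.findall(r"[a-z]+", request.lower())
                 if t not in stop]
        top = [t for t, _ in Counter(terms).most_common(4)]

        # Most frequent term is required; the rest are alternatives.
        query = f"({top[0]}) AND ({' OR '.join(top[1:])})"
        print(query)    # e.g. (query) AND (papers OR automatic OR formulation)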

  17. Sensing roller for in-process thickness measurement

    DOEpatents

    Novak, J.L.

    1996-07-16

    An apparatus and method are disclosed for processing materials with a sensing roller, in which the roller has a plurality of conductive rings (electrodes) separated by rings of dielectric material. Sensing the capacitances or impedances between the electrodes provides information on the thickness of the materials being processed, the location of wires therein, and other like characteristics of the materials. 6 figs.
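
    In the idealized parallel-plate picture (a simplification of the actual roller geometry), the measured capacitance maps to thickness via

        C = \varepsilon A / d, \qquad d = \varepsilon A / C,

    with ε the permittivity of the material, A the effective electrode area, and d the material thickness; a thinner web between the electrodes thus reads as a larger capacitance.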

  18. A Computational Method for Enabling Teaching-Learning Process in Huge Online Courses and Communities

    ERIC Educational Resources Information Center

    Mora, Higinio; Ferrández, Antonio; Gil, David; Peral, Jesús

    2017-01-01

    Massive Open Online Courses and e-learning represent the future of the teaching-learning processes through the development of Information and Communication Technologies. They are the response to the new education needs of society. However, this future also presents many challenges such as the processing of online forums when a huge number of…

  19. Design requirements for operational earth resources ground data processing

    NASA Technical Reports Server (NTRS)

    Baldwin, C. J.; Bradford, L. H.; Burnett, E. S.; Hutson, D. E.; Kinsler, B. A.; Kugle, D. R.; Webber, D. S.

    1972-01-01

    Realistic tradeoff data and evaluation techniques were studied that permit conceptual design of operational earth resources ground processing systems. Methodology for determining user requirements that utilize the limited information available from users is presented along with definitions of sensor capabilities projected into the shuttle/station era. A tentative method is presented for synthesizing candidate ground processing concepts.

  20. Municipal Wastewater Processes. Instructor Guide. Working for Clean Water: An Information Program for Advisory Groups.

    ERIC Educational Resources Information Center

    Stoltzfus, Lorna

    Described is a one-hour overview of the unit processes which comprise a municipal wastewater treatment system. Topics covered in this instructor's guide include types of pollutants encountered, treatment methods, and procedures by which wastewater treatment processes are selected. A slide-tape program is available to supplement this component of…

  1. Model of Higher GIS Education

    ERIC Educational Resources Information Center

    Jakab, Imrich; Ševcík, Michal; Grežo, Henrich

    2017-01-01

    The methods of geospatial data processing are being continually innovated, and universities that are focused on educating experts in Environmental Science should reflect this reality with an elaborate and purpose-built modernization of the education process, education content, as well as learning conditions. Geographic Information Systems (GIS)…

  2. The Deceptively Simple N170 Reflects Network Information Processing Mechanisms Involving Visual Feature Coding and Transfer Across Hemispheres

    PubMed Central

    Ince, Robin A. A.; Jaworska, Katarzyna; Gross, Joachim; Panzeri, Stefano; van Rijsbergen, Nicola J.; Rousselet, Guillaume A.; Schyns, Philippe G.

    2016-01-01

    A key to understanding visual cognition is to determine “where”, “when”, and “how” brain responses reflect the processing of the specific visual features that modulate categorization behavior—the “what”. The N170 is the earliest Event-Related Potential (ERP) that preferentially responds to faces. Here, we demonstrate that a paradigmatic shift is necessary to interpret the N170 as the product of an information processing network that dynamically codes and transfers face features across hemispheres, rather than as a local stimulus-driven event. Reverse-correlation methods coupled with information-theoretic analyses revealed that visibility of the eyes influences face detection behavior. The N170 initially reflects coding of the behaviorally relevant eye contralateral to the sensor, followed by a causal communication of the other eye from the other hemisphere. These findings demonstrate that the deceptively simple N170 ERP hides a complex network information processing mechanism involving initial coding and subsequent cross-hemispheric transfer of visual features. PMID:27550865

  3. Diffusion processes in tumors: A nuclear medicine approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amaya, Helman, E-mail: haamayae@unal.edu.co

    The number of counts used in nuclear medicine imaging techniques only provides physical information about the disintegration of the nuclei present in the radiotracer molecules taken up in a particular anatomical region; it is not truly metabolic information. For this reason, a mathematical method was used to find a correlation between the number of counts and 18F-FDG mass concentration. This correlation allows a better interpretation of the results obtained in the study of diffusive processes in an agar phantom and, based on it, an image from the PETCETIX DICOM sample image set from the OsiriX-viewer software was processed. PET-CT gradient magnitude and Laplacian images can show direct information on diffusive processes for radiopharmaceuticals that enter cells by simple diffusion. In the case of the radiopharmaceutical 18F-FDG, pharmacokinetic models must be included to interpret the gradient magnitude and Laplacian of counts images correctly.
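
    The image-processing step itself is standard; a brief sketch with SciPy on a stand-in counts image (the smoothing scale is an assumption):

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(2)
        img = rng.poisson(50.0, size=(128, 128)).astype(float)  # counts image

        grad_mag = ndimage.gaussian_gradient_magnitude(img, sigma=2.0)
        laplacian = ndimage.gaussian_laplace(img, sigma=2.0)

        print(grad_mag.mean(), laplacian.mean())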

  4. Arithmetic of five-part of leukocytes based on image process

    NASA Astrophysics Data System (ADS)

    Li, Yian; Wang, Guoyou; Liu, Jianguo

    2007-12-01

    This paper applies computer image processing and pattern recognition methods to the problem of automatic classification and counting of leukocytes (white blood cells) in peripheral blood. A new five-part leukocyte differential algorithm based on image processing and pattern recognition is presented, which realizes automatic classification of leukocytes. The first aim is to detect the leukocytes; a major requirement of the whole system is then to classify them into five classes. Based on a visual saliency mechanism, the algorithm processes images in sequence, segments the leukocytes, and extracts their features. Using prior knowledge of cell and image shape information, it first segments the probable shape of each leukocyte with a new Chamfer-based method and then extracts detailed features, which greatly reduces both the misjudgment rate and the computation required. The algorithm also has a learning function. A new measurement of nucleus shape that provides more accurate information is also presented. The algorithm has great application value in clinical blood testing.
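
    A rough sketch of the Chamfer idea used in the shape step: the distance transform of an edge map scores how well a candidate contour fits (a low mean distance means a good fit). The image, contour, and sizes are invented:

        import numpy as np
        from scipy import ndimage

        edges = np.zeros((64, 64), dtype=bool)     # stand-in edge map
        edges[20, 10:50] = True                    # a horizontal edge

        # Distance from every pixel to the nearest detected edge pixel.
        dist = ndimage.distance_transform_edt(~edges)

        # Candidate contour: (row, col) points of a template at some pose;
        # here a line slightly offset from the true edge.
        contour = [(22, c) for c in range(10, 50)]
        score = np.mean([dist[r, c] for r, c in contour])
        print(score)    # chamfer score: lower means a better fit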

  5. A decentralized square root information filter/smoother

    NASA Technical Reports Server (NTRS)

    Bierman, G. J.; Belzer, M. R.

    1985-01-01

    A number of developments have recently led to considerable interest in the decentralization of linear least squares estimators. The developments are partly related to the impending emergence of VLSI technology, the realization of parallel processing, and the need for algorithmic ways to speed the solution of dynamically decoupled, high-dimensional estimation problems. A new method is presented for combining Square Root Information Filter (SRIF) estimates obtained from independent data sets. The new method involves an orthogonal transformation, and an information matrix filter 'homework' problem discussed by Schweppe (1973) is generalized. The employed SRIF orthogonal transformation methodology has been described by Bierman (1977).
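
    The standard way to merge two square-root information arrays is to stack them and re-triangularize with an orthogonal transformation (a QR factorization); a numpy sketch with invented two-state arrays:

        import numpy as np

        # Two independent SRIF data equations z_k = R_k x + v_k.
        R1 = np.array([[2.0, 0.5], [0.0, 1.5]]); z1 = np.array([1.0, 0.3])
        R2 = np.array([[1.0, 0.2], [0.0, 2.0]]); z2 = np.array([0.8, 1.1])

        stacked = np.vstack([np.column_stack([R1, z1]),
                             np.column_stack([R2, z2])])
        _, r = np.linalg.qr(stacked)        # the orthogonal transformation
        R, z = r[:2, :2], r[:2, 2]          # combined square-root info array
        x_hat = np.linalg.solve(R, z)       # combined least-squares estimate
        print(x_hat)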

  6. Information Security Scheme Based on Computational Temporal Ghost Imaging.

    PubMed

    Jiang, Shan; Wang, Yurong; Long, Tao; Meng, Xiangfeng; Yang, Xiulun; Shu, Rong; Sun, Baoqing

    2017-08-09

    An information security scheme based on computational temporal ghost imaging is proposed. A sequence of independent 2D random binary patterns is used as the encryption key and multiplied with the 1D data stream; the ciphertext is obtained by summing the weighted encryption key. Decryption is realized by a correlation measurement between the encrypted information and the encryption key. Due to the intrinsic high-level randomness of the key, the security of this method is strongly guaranteed. The feasibility of the method and its robustness against both occlusion and additive noise attacks are discussed through simulation.
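
    The protocol as described is easy to simulate numerically; a toy sketch (the sizes, and the 0.25 normalization, which is the variance of an unbiased binary pattern, follow from the assumptions noted in the comments):

        import numpy as np

        rng = np.random.default_rng(7)
        n, m = 64, 4096                         # data length, pixels per pattern
        data = rng.uniform(size=n)              # 1D plaintext stream
        key = rng.integers(0, 2, size=(n, m))   # independent binary patterns

        cipher = data @ key                     # sum of data-weighted patterns

        # Correlation decryption: <C*P_i> - <C><P_i> ≈ data[i] * Var(P),
        # and Var(P) = 0.25 for equiprobable 0/1 patterns.
        rec = (key * cipher).mean(axis=1) - key.mean(axis=1) * cipher.mean()
        rec /= 0.25
        print(np.abs(rec - data).max())         # small residual from finite m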

  7. Medical Image Analysis by Cognitive Information Systems - a Review.

    PubMed

    Ogiela, Lidia; Takizawa, Makoto

    2016-10-01

    This publication presents a review of medical image analysis systems. The paradigms of cognitive information systems are illustrated with examples of medical image analysis systems, and the semantic processes involved are presented as they apply to different types of medical images. Cognitive information systems are defined on the basis of methods for the semantic analysis and interpretation of information (here, medical images), applied to the cognitive meaning of the medical images contained in the analyzed data sets. Semantic analysis is proposed for analyzing the meaning of data; meaning is carried by information, for example by medical images. Medical image analysis is presented and discussed as applied to various types of medical images showing selected human organs with different pathologies, analyzed using different classes of cognitive information systems. Cognitive information systems dedicated to medical image analysis are also defined for decision-support tasks, which is important, for example, in diagnostic and therapeutic processes and in the selection of semantic aspects/features from the analyzed data sets; those features allow the creation of a new way of analysis.

  8. Multivariate fault isolation of batch processes via variable selection in partial least squares discriminant analysis.

    PubMed

    Yan, Zhengbing; Kuang, Te-Hui; Yao, Yuan

    2017-09-01

    In recent years, multivariate statistical monitoring of batch processes has become a popular research topic, wherein multivariate fault isolation is an important step aimed at identifying the faulty variables that contribute most to a detected process abnormality. Although contribution plots are commonly used for statistical fault isolation, such methods suffer from the smearing effect between correlated variables; in batch process monitoring in particular, the high autocorrelations and cross-correlations in variable trajectories make the smearing effect unavoidable. To address this problem, a variable selection-based fault isolation method is proposed, which transforms the fault isolation problem into a variable selection problem in partial least squares discriminant analysis and solves it by calculating a sparse partial least squares model. Unlike traditional methods, the proposed method emphasizes the relative importance of each process variable, information that may help process engineers conduct root-cause diagnosis. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
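
    A simplified stand-in for the approach (hard-thresholding the weights of an ordinary PLS discriminant model instead of fitting a true sparse PLS, which is what the paper actually does):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(3)
        X_normal = rng.normal(size=(60, 10))
        X_fault = rng.normal(size=(60, 10))
        X_fault[:, [2, 7]] += 2.0        # variables 2 and 7 carry the fault

        X = np.vstack([X_normal, X_fault])
        y = np.r_[np.zeros(60), np.ones(60)]    # class labels: normal vs fault

        pls = PLSRegression(n_components=2).fit(X, y)
        w = np.abs(pls.x_weights_[:, 0])        # first-component weights
        faulty = np.where(w > w.mean() + w.std())[0]
        print(faulty)                           # typically prints [2 7]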

  9. [An Extraction and Recognition Method of the Distributed Optical Fiber Vibration Signal Based on EMD-AWPP and HOSA-SVM Algorithm].

    PubMed

    Zhang, Yanjun; Liu, Wen-zhe; Fu, Xing-hu; Bi, Wei-hong

    2016-02-01

    Given that traditional signal processing methods cannot effectively distinguish different vibration intrusion signals, a feature extraction and recognition method for vibration information is proposed based on EMD-AWPP and HOSA-SVM, for high-precision signal recognition in distributed fiber-optic intrusion detection systems. When dealing with different types of vibration, the method first utilizes an adaptive wavelet processing algorithm based on empirical mode decomposition to reduce the influence of abnormal values in the sensing signal and improve the accuracy of signal feature extraction; not only is the low-frequency part of the signal decomposed, but the details in the high-frequency part are also better handled through time-frequency localization. Second, it uses the bispectrum and bicoherence spectrum to accurately extract feature vectors for the different types of intrusion vibration. Finally, based on a BPNN reference model, an SVM whose recognition parameters are tuned by particle swarm optimization distinguishes the signals of different intrusion vibrations, which endows the identification model with stronger adaptivity and self-learning ability and overcomes shortcomings such as easily falling into local optima. Simulation results showed that this new method can effectively extract the feature vector of the sensing information, eliminate the influence of random noise, and reduce the effect of outliers for different types of intrusion source. The predicted categories agree with the output categories, and the vibration identification accuracy reaches above 95%, better than the BPNN recognition algorithm, effectively improving the accuracy of the information analysis.
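
    A hedged sketch of the higher-order-statistics stage only: estimate a diagonal slice of the bispectrum per signal and classify with an SVM. The EMD-based preprocessing (EMD-AWPP) and the particle swarm tuning of the paper are omitted, and the two signal classes are synthetic:

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        def diag_bispectrum(x, nseg=16):
            """Average X(f)X(f)X*(2f) over segments: the B(f, f) slice."""
            segs = np.array_split(x, nseg)
            n = min(len(s) for s in segs)
            k = np.arange(n // 2)
            acc = np.zeros(n // 2, dtype=complex)
            for s in segs:
                f = np.fft.fft(s[:n])
                acc += f[k] * f[k] * np.conj(f[(2 * k) % n])
            return np.abs(acc) / nseg

        rng = np.random.default_rng(5)
        t = np.linspace(0, 1, 4096)
        coupled = np.sin(2*np.pi*40*t) + 0.5*np.sin(2*np.pi*80*t)  # class 0

        X = [diag_bispectrum(coupled + 0.2*rng.normal(size=t.size))
             for _ in range(20)]
        X += [diag_bispectrum(rng.normal(size=t.size)) for _ in range(20)]
        y = [0]*20 + [1]*20

        Xtr, Xte, ytr, yte = train_test_split(np.array(X), np.array(y),
                                              test_size=0.25, random_state=0)
        clf = SVC(kernel="rbf").fit(Xtr, ytr)
        print(clf.score(Xte, yte))     # held-out accuracy on the toy classes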

  10. Bridge Condition Assessment Using D Numbers

    PubMed Central

    Hu, Yong

    2014-01-01

    Bridge condition assessment is a complex problem influenced by many factors, and an uncertain environment increases its complexity further. Due to the uncertainty in the assessment process, one of the key problems is the representation of assessment results. Although many methods exist that can deal with uncertain information, they all have deficiencies of one kind or another. In this paper, a new representation of uncertain information, called D numbers, is presented; it extends Dempster-Shafer theory. Using D numbers, a new method is developed for bridge condition assessment. Compared to existing methods, the proposed method is simpler and more effective. An illustrative case is given to show the effectiveness of the new method. PMID:24696639

  11. Using a Multimedia Presentation to Enhance Informed Consent in a Pediatric Emergency Department.

    PubMed

    Spencer, Sandra P; Stoner, Michael J; Kelleher, Kelly; Cohen, Daniel M

    2015-08-01

    Informed consent is an ethical process for ensuring patient autonomy. Multimedia presentations (MMPs) often aid the informed consent process for research studies. Thus, it follows that MMPs would improve informed consent in clinical settings. The aim of this study was to determine if an MMP for the informed consent process for ketamine sedation improves parental satisfaction and comprehension as compared with standard practice. This 2-phase study compared 2 methods of informed consent for ketamine sedation of pediatric patients. Phase 1 was a randomized, prospective study that compared the standard verbal consent to an MMP. Phase 2 implemented the MMP into daily work flow to validate the previous year's results. Parents completed a survey evaluating their satisfaction of the informed consent process and assessing their knowledge of ketamine sedation. Primary outcome measures were parental overall satisfaction with the informed consent process and knowledge of ketamine sedation. One hundred eighty-four families from a free-standing, urban, tertiary pediatric emergency department with over 85,000 annual visits were enrolled. Different demographics were not associated with a preference for the MMP or improved scores on the content quiz. Intervention families were more likely "to feel involved in the decision to use ketamine" and to understand that "they had the right to refuse the ketamine" as compared with control families. The intervention group scored significantly higher overall on the content section than the control group. Implementation and intervention families responded similarly to all survey sections. Multimedia presentation improves parental understanding of ketamine sedation, whereas parental satisfaction with the informed consent process remains unchanged. Use of MMP in the emergency department for informed consent shows potential for both patients and providers.

  12. A multiscale forecasting method for power plant fleet management

    NASA Astrophysics Data System (ADS)

    Chen, Hongmei

    In recent years the electric power industry has been challenged by a high level of uncertainty and volatility brought on by deregulation and globalization. A power producer must minimize life cycle cost while meeting stringent safety and regulatory requirements and fulfilling customer demand for high reliability. To achieve true system excellence, a more sophisticated system-level decision-making process, supported by a more accurate forecasting system, has therefore been created to manage diverse and often widely dispersed generation units as a single, easily scaled and deployed fleet and to fully utilize a power producer's critical assets. The process takes into account the time horizon of each major decision action taken in a power plant and develops methods for sharing information between them; these decisions are highly interrelated, and no optimal operation can be achieved without sharing information across the overall process. The process includes a forecasting system to provide information for planning under uncertainty. A new forecasting method is proposed that utilizes a synergy of several modeling techniques, properly combined at the different time scales of the forecasting objects. It can not only take advantage of abundant historical data but also account for the impact of pertinent driving forces from the external business environment, achieving more accurate forecasts. Block bootstrap is then utilized to measure the bias in the estimate of the expected life cycle cost that will actually be needed to drive the business of a power plant in the long run. Finally, scenario analysis is used to provide a composite picture of future developments for decision making and strategic planning. The decision-making process is applied to a typical power producer chosen to represent challenging customer demand during high-demand periods. The process enhances system excellence by providing more accurate market information, evaluating the impact of the external business environment, and considering cross-scale interactions between decision actions. Along with this process, system operation strategies, maintenance schedules, and capacity expansion plans that guide the operation of the power plant are optimally identified, and total life cycle costs are estimated.
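
    The bias-estimation step mentioned above is a standard moving-block bootstrap; a minimal sketch on an invented monthly cost series (block size and counts are illustrative):

        import numpy as np

        rng = np.random.default_rng(11)
        costs = np.cumsum(rng.normal(1.0, 0.3, size=240))  # toy monthly series

        def block_bootstrap_means(x, block=12, reps=2000):
            """Resample contiguous blocks to respect serial correlation."""
            n = len(x)
            out = np.empty(reps)
            for r in range(reps):
                starts = rng.integers(0, n - block + 1, size=n // block)
                sample = np.concatenate([x[s:s + block] for s in starts])
                out[r] = sample.mean()
            return out

        boot = block_bootstrap_means(costs)
        bias = boot.mean() - costs.mean()   # bootstrap estimate of the bias
        print(round(bias, 4))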

  13. Method and Apparatus for Processing UDP Data Packets

    NASA Technical Reports Server (NTRS)

    Murphy, Brandon M. (Inventor)

    2017-01-01

    A method and apparatus for processing a plurality of data packets. A data packet is received. A determination is made as to whether a portion of the data packet follows a selected digital recorder standard protocol based on a header of the data packet. Raw data in the data packet is converted into human-readable information in response to a determination that the portion of the data packet follows the selected digital recorder standard protocol.
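
    A sketch of that flow with an invented header layout (the patent's "selected digital recorder standard protocol" is not specified here, so the magic number, version, and payload format below are hypothetical):

        import socket
        import struct

        MAGIC, VERSION = 0xCAFE, 1          # hypothetical protocol constants

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("0.0.0.0", 5005))

        packet, addr = sock.recvfrom(65535)
        magic, version, count = struct.unpack_from("!HHI", packet, 0)
        if magic == MAGIC and version == VERSION:
            # Raw payload: `count` big-endian 32-bit readings after the header.
            readings = struct.unpack_from(f"!{count}I", packet, 8)
            print(f"{addr}: {readings}")    # converted, human-readable form
        else:
            print("header does not match the recorder protocol; packet skipped")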

  14. Wireless device monitoring systems and monitoring devices, and associated methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCown, Steven H; Derr, Kurt W; Rohde, Kenneth W

    Wireless device monitoring systems and monitoring devices include a communications module for receiving wireless communications of a wireless device. Processing circuitry is coupled with the communications module and configured to process the wireless communications to determine whether the wireless device is authorized or unauthorized to be present at the monitored area based on identification information of the wireless device. Methods of monitoring for the presence and identity of wireless devices are also provided.

  15. Development and evaluation of climatologically-downscaled AFWA AGRMET precipitation products over the continental U.S.

    NASA Astrophysics Data System (ADS)

    Garcia, M.; Peters-Lidard, C. D.; Eylander, J. B.; Daly, C.; Gibson, W.; Tian, Y.; Zeng, J.; Kato, H.

    2008-05-01

    Collaborations between the Air Force Weather Agency (AFWA), the Hydrological Sciences Branch at NASA-GSFC, and the PRISM Group at Oregon State University have led to improvements in the processing of meteorological forcing inputs for the NASA-GSFC Land Information System (LIS; Kumar et al. 2006), a sophisticated framework for LSM operation and model coupling experiments. Efforts at AFWA toward the production of surface hydrometeorological products are currently in transition from the legacy Agricultural Meteorology modeling system (AGRMET) to use of the LIS framework and procedures. Recent enhancements to meteorological input processing for application to land surface models in LIS include the assimilation of climate-based information for the spatial interpolation and downscaling of precipitation fields. Climatological information included in the LIS-based downscaling procedure for North America is provided by a monthly high-resolution PRISM (Daly et al. 1994, 2002; Daly 2006) dataset based on a 30-year analysis period. The combination of these sources and methods attempts to address the strengths and weaknesses of available legacy products, objective interpolation methods, and the PRISM knowledge-based methodology. All of these efforts are oriented on an operational need for timely estimation of spatial precipitation fields at adequate spatial resolution for customer dissemination and near-real-time simulations in regions of interest. This work focuses on value added to the AGRMET precipitation product by the inclusion of high-quality climatological information on a monthly time scale. The AGRMET method uses microwave-based satellite precipitation estimates from various polar-orbiting platforms (NOAA POES and DMSP), infrared-based estimates from geostationary platforms (GOES, METEOSAT, etc.), related cloud analysis products, and surface gauge observations in a complex and hierarchical blending process. Results from processing of the legacy AGRMET precipitation products over the U.S. using LIS-based methods for downscaling, both with and without climatological factors, are evaluated against high-resolution monthly analyses using the PRISM knowledge-based method (Daly et al. 2002) over a 4-year period. It is demonstrated that the incorporation of climatological information in a downscaling procedure can significantly enhance the accuracy, and potential utility, of AFWA precipitation products for customer applications, especially over mountainous terrain as in the western U.S.
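
    The core of climatology-aided downscaling can be shown in a toy example: spread each coarse-cell total over a fine grid in proportion to a high-resolution monthly climatology, so the coarse-cell amount is conserved (grids and values invented):

        import numpy as np

        coarse_value = 30.0                  # one coarse cell, mm/month
        clim = np.array([[4.0, 2.0],         # high-res monthly climatology
                         [1.0, 1.0]])        # wet corner, drier elsewhere

        weights = clim / clim.sum()          # climatological distribution
        fine = coarse_value * weights        # downscaled field
        assert np.isclose(fine.sum(), coarse_value)   # mass is conserved
        print(fine)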

  16. Resolving Complex Research Data Management Issues in Biomedical Laboratories: Qualitative Study of an Industry-Academia Collaboration

    PubMed Central

    Myneni, Sahiti; Patel, Vimla L.; Bova, G. Steven; Wang, Jian; Ackerman, Christopher F.; Berlinicke, Cynthia A.; Chen, Steve H.; Lindvall, Mikael; Zack, Donald J.

    2016-01-01

    This paper describes a distributed collaborative effort between industry and academia to systematize data management in an academic biomedical laboratory. The heterogeneous and voluminous nature of the research data created in biomedical laboratories makes information management difficult and research unproductive. One such collaborative effort was evaluated over a period of four years using data collection methods including ethnographic observations, semi-structured interviews, web-based surveys, progress reports, conference call summaries, and face-to-face group discussions. Data were analyzed using qualitative methods to 1) characterize specific problems faced by biomedical researchers under traditional information management practices, 2) identify intervention areas for introducing a new research information management system, Labmatrix, and 3) evaluate and delineate the general collaboration (intervention) characteristics that can optimize the outcomes of an implementation process in biomedical laboratories. The results emphasize the importance of end-user perseverance, human-centric interoperability evaluation, and demonstration of return on the investment of effort and time by laboratory members and industry personnel for the success of the implementation process. In addition, there is an intrinsic learning component associated with the implementation of an information management system. The technology transfer experience in a complex environment such as a biomedical laboratory can be eased by information systems that support human and cognitive interoperability; such informatics features can also contribute to successful collaboration and, hopefully, to scientific productivity. PMID:26652980

  17. Application of Ensemble Detection and Analysis to Modeling Uncertainty in Non Stationary Process

    NASA Technical Reports Server (NTRS)

    Racette, Paul

    2010-01-01

    Characterization of non stationary and nonlinear processes is a challenge in many engineering and scientific disciplines. Climate change modeling and projection, retrieving information from Doppler measurements of hydrometeors, and modeling calibration architectures and algorithms in microwave radiometers are example applications that can benefit from improvements in the modeling and analysis of non stationary processes. Analyses of measured signals have traditionally been limited to a single measurement series. Ensemble Detection is a technique whereby mixing calibrated noise produces an ensemble measurement set. The collection of ensemble data sets enables new methods for analyzing random signals and offers powerful new approaches to studying and analyzing non stationary processes. Derived information contained in the dynamic stochastic moments of a process will enable many novel applications.

  18. A Simplified Method of Eliciting Information from Novices.

    ERIC Educational Resources Information Center

    Brandt, D. Scott; Uden, Lorna

    2002-01-01

    Discusses the use of applied cognitive task analysis (ACTA) to interview novices and gain insight into their cognitive skills and processes. Focuses particularly on novice Internet searchers at the University of Staffordshire (United Kingdom) and reviews attempts to modify ACTA, which is intended to gather information from experts as part of…

  19. Informal Writing Assessment Linked to Instruction: A Continuous Process for Teachers, Students, and Parents

    ERIC Educational Resources Information Center

    Romeo, Lynn

    2008-01-01

    This article presents a comprehensive model of daily, classroom informal writing assessment that is constantly linked to instruction and the characteristics of proficient writers. Methods for promoting teacher, student, and parent collaboration and their roles in dialoguing, conferencing, and reflection are discussed. Strategies for including…

  20. 76 FR 69653 - Abamectin (avermectin); Pesticide Tolerances

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-09

    ... are available in Pesticide Analytical Manual II (PAM II) for citrus and processed fractions (Method I... (ID) number EPA-HQ-OPP-2010-0619. All documents in the docket are listed in the docket index available... available, e.g., Confidential Business Information (CBI) or other information whose disclosure is restricted...
