Information Processing Techniques Program. Volume II. Communications- Adaptive Internetting
1977-09-30
LABORATORY INFORMATION PROCESSING TECHNIQUES PROGRAM VOLUME II: COMMUNICATIONS-ADAPTIVE INTERNETTING I SEMIANNUAL TECHNICAL SUMMARY REPORT TO THE...MASSACHUSETTS ABSTRACT This report describes work performed on the Communications-Adaptive Internetting program sponsored by the Information ... information processing techniques network speech terminal communications-adaptive internetting 04 links digital voice communications time-varying
2001-04-01
part of the following report: TITLE: New Information Processing Techniques for Military Systems [les Nouvelles techniques de traitement de l’information...rapidly developing information increasing amount of time is needed for gathering and technology has until now not yet resulted in a substantial...Information Processing Techniques for Military Systems", held in Istanbul, Turkey, 9-11 October 2000, and published in RTO MP-049. 23-2 organisations. The
NASA Astrophysics Data System (ADS)
Zan, Tao; Wang, Min; Hu, Jianzhong
2010-12-01
Machining status monitoring by multiple sensors can acquire and analyze machining process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal ones through statistical methods. In this paper, by comparing the advantages and disadvantages of the two methods, the necessity and feasibility of their integration and fusion are discussed. An approach is then presented that integrates multi-sensor status monitoring and statistical process control based on artificial intelligence, internet, and database techniques. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and AE signal information of the wheel dressing process, the cause of machining quality fluctuation was identified. The experimental results indicate that the approach is suitable for status monitoring and analysis of the machining process.
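The statistical-process-control side of such a fusion can be sketched with a classic Shewhart-style control chart: estimate limits from in-control data, then flag readings that fall outside them. A minimal illustration (the readings and the 3-sigma rule are assumptions for the sketch, not the MoniSysOnline implementation):

```python
import statistics

def control_limits(baseline, k=3.0):
    """Center line and +/- k-sigma control limits estimated from in-control data."""
    mu = statistics.fmean(baseline)
    sigma = statistics.stdev(baseline)
    return mu - k * sigma, mu, mu + k * sigma

def flag_abnormal(samples, lcl, ucl):
    """Indices of samples falling outside the control limits."""
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]

# Hypothetical in-control roughness readings from a grinding process:
baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.05, 9.95, 10.1]
lcl, center, ucl = control_limits(baseline)
# A sudden shift (e.g. wheel wear) drives a reading outside the limits:
readings = [10.0, 10.1, 9.9, 12.5, 10.05]
print(flag_abnormal(readings, lcl, ucl))  # [3]
```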
Knowledge Discovery and Data Mining: An Overview
NASA Technical Reports Server (NTRS)
Fayyad, U.
1995-01-01
The process of knowledge discovery and data mining is the process of information extraction from very large databases. Its importance is described along with several techniques and considerations for selecting the most appropriate technique for extracting information from a particular data set.
NASA Technical Reports Server (NTRS)
Wiswell, E. R.; Cooper, G. R. (Principal Investigator)
1978-01-01
The author has identified the following significant results. The concept of average mutual information in the received spectral random process about the spectral scene was developed. Techniques amenable to implementation on a digital computer were also developed to make the required average mutual information calculations. These techniques required identification of models for the spectral response process of scenes. Stochastic modeling techniques were adapted for use. These techniques were demonstrated on empirical data from wheat and vegetation scenes.
A vector scanning processing technique for pulsed laser velocimetry
NASA Technical Reports Server (NTRS)
Wernet, Mark P.; Edwards, Robert V.
1989-01-01
Pulsed laser sheet velocimetry yields nonintrusive measurements of two-dimensional velocity vectors across an extended planar region of a flow. Current processing techniques offer high precision (1 pct) velocity estimates, but can require several hours of processing time on specialized array processors. Under some circumstances, a simple, fast, less accurate (approx. 5 pct), data reduction technique which also gives unambiguous velocity vector information is acceptable. A direct space domain processing technique was examined. The direct space domain processing technique was found to be far superior to any other techniques known, in achieving the objectives listed above. It employs a new data coding and reduction technique, where the particle time history information is used directly. Further, it has no 180 deg directional ambiguity. A complex convection vortex flow was recorded and completely processed in under 2 minutes on an 80386 based PC, producing a 2-D velocity vector map of the flow field. Hence, using this new space domain vector scanning (VS) technique, pulsed laser velocimetry data can be reduced quickly and reasonably accurately, without specialized array processing hardware.
NASA Astrophysics Data System (ADS)
Wang, Juan; Wang, Jian; Li, Lijuan; Zhou, Kun
2014-08-01
In order to address information fusion, process integration, and collaborative design and manufacturing for ultra-precision optical elements within life-cycle management, this paper presents a digital management platform based on product data and business processes, adopting modern manufacturing, information, and management techniques. The architecture and system integration of the digital management platform are discussed in this paper. The platform can realize information sharing and interaction for information flow, control flow, and value stream from user needs to offline operation across the life cycle, and it can also enhance the process control, collaborative research, and service capability of ultra-precision optical elements.
Analysis of Hospital Processes with Process Mining Techniques.
Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises
2015-01-01
Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed for redefining functions in the system and proposing a proper flow of information. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application for making clinical and administrative decisions for the management of hospital activities.
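The core of process discovery from event logs can be sketched as counting directly-follows relations between activities, the first step behind miners that build a directly-follows graph. A minimal illustration (the hospital activity names are hypothetical, not taken from the HIS described):

```python
from collections import Counter

def directly_follows(event_log):
    """Count directly-follows relations (a -> b) over all traces.

    event_log: list of traces, each a list of activity names in order."""
    df = Counter()
    for trace in event_log:
        for a, b in zip(trace, trace[1:]):
            df[(a, b)] += 1
    return df

# Hypothetical patient traces extracted from an event log:
log = [
    ["admit", "triage", "exam", "discharge"],
    ["admit", "triage", "lab", "exam", "discharge"],
    ["admit", "exam", "discharge"],
]
dfg = directly_follows(log)
print(dfg[("admit", "triage")])  # 2
print(dfg[("exam", "discharge")])  # 3
```

The resulting counts are the edge weights of a directly-follows graph, from which control-flow models are derived.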
CMMI(Registered) for Acquisition, Version 1.3. CMMI-ACQ, V1.3
2010-11-01
and Software Engineering – System Life Cycle Processes [ISO 2008b] ISO/IEC 27001:2005 Information technology – Security techniques – Information...International Organization for Standardization and International Electrotechnical Commission. ISO/IEC 27001 Information Technology – Security Techniques...International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) body of standards. CMMs focus on improving processes
ERIC Educational Resources Information Center
Sager, Naomi
This investigation matches the emerging techniques in computerized natural language processing against emerging needs for such techniques in the information field to evaluate and extend such techniques for future applications and to establish a basis and direction for further research toward these goals. An overview describes developments in the…
Silicon photonics for neuromorphic information processing
NASA Astrophysics Data System (ADS)
Bienstman, Peter; Dambre, Joni; Katumba, Andrew; Freiberger, Matthias; Laporte, Floris; Lugnan, Alessio
2018-02-01
We present our latest results on silicon photonics neuromorphic information processing based, among others, on techniques like reservoir computing. We will discuss aspects like scalability, novel architectures for enhanced power efficiency, and all-optical readout. Additionally, we will touch upon new machine learning techniques to operate these integrated readouts. Finally, we will show how these systems can be used for high-speed, low-power information processing for applications like the recognition of biological cells.
An overview of selected information storage and retrieval issues in computerized document processing
NASA Technical Reports Server (NTRS)
Dominick, Wayne D. (Editor); Ihebuzor, Valentine U.
1984-01-01
The rapid development of computerized information storage and retrieval techniques has introduced the possibility of extending the word processing concept to document processing. A major advantage of computerized document processing is relief from the tedious tasks of manual editing and composition usually encountered by traditional publishers, owing to the immense speed and storage capacity of computers. Furthermore, computerized document processing provides an author with centralized control, the lack of which is a handicap of the traditional publishing operation. A survey of some computerized document processing techniques is presented with emphasis on related information storage and retrieval issues. String matching algorithms are considered central to document information storage and retrieval and are also discussed.
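A standard example of the string matching algorithms the survey treats as central is Knuth-Morris-Pratt, which scans the text once using a precomputed failure function. A minimal sketch (a generic algorithm, not one specific to the surveyed systems):

```python
def kmp_search(text, pattern):
    """Return start indices of all occurrences of pattern in text (KMP)."""
    if not pattern:
        return []
    # Failure function: length of the longest proper prefix that is also a suffix.
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the text, reusing partial matches instead of backtracking.
    hits, k = [], 0
    for i, c in enumerate(text):
        while k and c != pattern[k]:
            k = fail[k - 1]
        if c == pattern[k]:
            k += 1
        if k == len(pattern):
            hits.append(i - k + 1)
            k = fail[k - 1]
    return hits

print(kmp_search("ababcababcab", "abcab"))  # [2, 7]
```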
The application of artificial intelligence techniques to large distributed networks
NASA Technical Reports Server (NTRS)
Dubyah, R.; Smith, T. R.; Star, J. L.
1985-01-01
Data accessibility and information transfer efforts, including the land resources information system pilot, are structured as large computer information networks. The goals of these pilot efforts include making data easier to find and use, reducing processing costs, and minimizing incompatibility between data sources. Artificial intelligence (AI) techniques have been suggested to achieve these goals. The applicability of certain AI techniques is explored in the context of distributed problem-solving systems and the pilot land data system (PLDS). The topics discussed include: PLDS and its data processing requirements, expert systems and PLDS, distributed problem-solving systems, AI problem-solving paradigms, query processing, and distributed databases.
Discovery of Information Diffusion Process in Social Networks
NASA Astrophysics Data System (ADS)
Kim, Kwanho; Jung, Jae-Yoon; Park, Jonghun
Information diffusion analysis in social networks is of significance since it enables us to deeply understand dynamic social interactions among users. In this paper, we introduce approaches to discovering the information diffusion process in social networks based on process mining. Process mining techniques are applied from three perspectives: social network analysis, process discovery and community recognition. We then present experimental results using real-life social network data. The proposed techniques are expected to serve as new analytical tools in online social media such as blogs and wikis for company marketers, politicians, news reporters and online writers.
Localized analysis of paint-coat drying using dynamic speckle interferometry
NASA Astrophysics Data System (ADS)
Sierra-Sosa, Daniel; Tebaldi, Myrian; Grumel, Eduardo; Rabal, Hector; Elmaghraby, Adel
2018-07-01
Paint coating is part of several industrial processes, including the automotive industry, architectural coatings, machinery, and appliances. Paint coatings must comply with high quality standards, and for this reason evaluation techniques for them are in constant development. One important factor in the paint-coating process is drying, as it influences the quality of the final result. In this work we present an assessment technique based on optical dynamic speckle interferometry, which allows temporal evaluation of activity during the paint-coating drying process, providing localized information on drying. This localized information is relevant for assessing drying homogeneity, optimal drying, and quality control. The technique relies on the definition of a new temporal history of the speckle patterns to obtain local activity; this information is then clustered to provide a convenient indicator of the different drying process stages. The experimental results presented were validated using gravimetric drying curves.
A Fifteen-Year Forecast of Information-Processing Technology. Final Report.
ERIC Educational Resources Information Center
Bernstein, George B.
This study developed a variation of the DELPHI approach, a polling technique for systematically soliciting opinions from experts, to produce a technological forecast of developments in the information-processing industry. SEER (System for Event Evaluation and Review) combines the more desirable elements of existing techniques: (1) intuitive…
Encoding techniques for complex information structures in connectionist systems
NASA Technical Reports Server (NTRS)
Barnden, John; Srinivas, Kankanahalli
1990-01-01
Two general information encoding techniques called relative position encoding and pattern similarity association are presented. They are claimed to be a convenient basis for the connectionist implementation of complex, short term information processing of the sort needed in common sense reasoning, semantic/pragmatic interpretation of natural language utterances, and other types of high level cognitive processing. The relationships of the techniques to other connectionist information-structuring methods, and also to methods used in computers, are discussed in detail. The rich inter-relationships of these other connectionist and computer methods are also clarified. The particular, simple forms that the relative position encoding and pattern similarity association techniques take in the authors' own connectionist system, called Conposit, are discussed in order to clarify some issues and to provide evidence that the techniques are indeed useful in practice.
NASA Astrophysics Data System (ADS)
Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.
2016-05-01
Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.
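One simple detection technique in this spirit treats per-interval mention counts as a noisy sensor stream and flags bursts against a trailing baseline. A minimal z-score sketch (the window size, threshold, and counts are illustrative assumptions, not the Apollo implementation):

```python
import statistics

def detect_events(counts, window=6, z_thresh=3.0):
    """Flag time bins whose mention count spikes above the trailing-window mean."""
    events = []
    for t in range(window, len(counts)):
        hist = counts[t - window:t]
        mu = statistics.fmean(hist)
        sigma = statistics.stdev(hist) or 1.0  # guard against a constant window
        if (counts[t] - mu) / sigma > z_thresh:
            events.append(t)
    return events

# Hourly mention counts of a keyword; the burst at hour 8 is a candidate event.
counts = [4, 5, 3, 6, 4, 5, 4, 5, 40, 6]
print(detect_events(counts))  # [8]
```

Veracity analysis would then operate on the flagged bins, e.g. by weighing the independence of the sources reporting them.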
Abstracts of Research. July 1974-June 1975.
ERIC Educational Resources Information Center
Ohio State Univ., Columbus. Computer and Information Science Research Center.
Abstracts of research papers in computer and information science are given for 68 papers in the areas of information storage and retrieval; human information processing; information analysis; linguistic analysis; artificial intelligence; information processes in physical, biological, and social systems; mathematical techniques; systems…
Computer image processing in marine resource exploration
NASA Technical Reports Server (NTRS)
Paluzzi, P. R.; Normark, W. R.; Hess, G. R.; Hess, H. D.; Cruickshank, M. J.
1976-01-01
Pictographic data or imagery is commonly used in marine exploration. Pre-existing image processing techniques (software) similar to those used on imagery obtained from unmanned planetary exploration were used to improve marine photography and side-scan sonar imagery. Features and details not visible by conventional photo processing methods were enhanced by filtering and noise removal on selected deep-sea photographs. Information gained near the periphery of photographs allows improved interpretation and facilitates construction of bottom mosaics where overlapping frames are available. Similar processing techniques were applied to side-scan sonar imagery, including corrections for slant range distortion, and along-track scale changes. The use of digital data processing and storage techniques greatly extends the quantity of information that can be handled, stored, and processed.
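A representative noise-removal step of the kind described is median filtering, which suppresses impulse noise (e.g. transmission dropouts) in a scan line while preserving edges. A minimal 1-D sketch (a generic filter, not the actual software used by the authors):

```python
import statistics

def median_filter(row, k=3):
    """1-D median filter over a scan line, with edge replication at the borders."""
    half = k // 2
    padded = row[:1] * half + row + row[-1:] * half
    return [statistics.median(padded[i:i + k]) for i in range(len(row))]

# A scan line with two dropout spikes (impulse noise):
line = [10, 11, 10, 255, 11, 10, 0, 10, 11]
print(median_filter(line))  # [10, 10, 11, 11, 11, 10, 10, 10, 11]
```

The same idea applied per row/column (or over 2-D neighborhoods) removes speckle from photographs and sonar imagery without blurring boundaries the way a mean filter would.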
NASA Astrophysics Data System (ADS)
Okawa, Tsutomu; Kaminishi, Tsukasa; Kojima, Yoshiyuki; Hirabayashi, Syuichi; Koizumi, Hisao
Business process modeling (BPM) is gaining attention as a means of analyzing and improving business processes. BPM analyzes the current business process as an AS-IS model and solves problems to improve the current business; moreover, it aims to create a business process that produces value as a TO-BE model. However, research on techniques that seamlessly connect the business process improvements obtained by BPM to the implementation of information systems is rarely reported. If the business model obtained by BPM is converted into UML and implementation is carried out using UML techniques, an improvement in the efficiency of information system implementation can be expected. In this paper, we describe a system development method that converts the process model obtained by BPM into UML; the method is evaluated by modeling a prototype of a parts procurement system. In the evaluation, it is compared with the case where the system is implemented by the conventional UML technique without going via BPM.
ERIC Educational Resources Information Center
Wright, Phillip C.; Geroy, Gary D.
Exploring existing methodologies to determine whether they can be adapted or adopted to support strategic goal setting, this paper focuses on information gathering techniques as they relate to the human resource development professional's input into strategic planning processes. The information gathering techniques are all qualitative methods and…
ERIC Educational Resources Information Center
Chowdhury, Gobinda G.
2003-01-01
Discusses issues related to natural language processing, including theoretical developments; natural language understanding; tools and techniques; natural language text processing systems; abstracting; information extraction; information retrieval; interfaces; software; Internet, Web, and digital library applications; machine translation for…
Modeling of ETL-Processes and Processed Information in Clinical Data Warehousing.
Tute, Erik; Steiner, Jochen
2018-01-01
Literature describes a big potential for the reuse of clinical patient data, and a clinical data warehouse (CDWH) is a means to that end. The objective was to support the management and maintenance of the processes that extract, transform and load (ETL) data into CDWHs, and to ease the reuse of metadata between regular IT management, the CDWH and secondary data users, by providing a modeling approach. An expert survey and a literature review were conducted to find requirements and existing modeling techniques. An ETL modeling technique was developed by extending existing modeling techniques, and evaluated by exemplarily modeling an existing ETL process and by a second expert survey. Nine experts participated in the first survey. The literature review yielded 15 included publications, and six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated. Seven experts participated in the evaluation. The developed approach can help in the management and maintenance of ETL processes and could serve as an interface between regular IT management, the CDWH and secondary data users.
Operator Performance Measures for Assessing Voice Communication Effectiveness
1989-07-01
performance and workload assessment techniques have been based. Broadbent (1958) described a limited capacity filter model of human information...INFORMATION PROCESSING 3.1.1. Auditory Attention 3.1.2. Auditory Memory 3.2. MODELS OF INFORMATION PROCESSING 3.2.1. Capacity Theories...Learning • Attention • Language Specialization • Decision Making • Problem Solving • Auditory Information Processing • Models of Processing • Operator
Partial information decomposition as a spatiotemporal filter.
Flecker, Benjamin; Alford, Wesley; Beggs, John M; Williams, Paul L; Beer, Randall D
2011-09-01
Understanding the mechanisms of distributed computation in cellular automata requires techniques for characterizing the emergent structures that underlie information processing in such systems. Recently, techniques from information theory have been brought to bear on this problem. Building on this work, we utilize the new technique of partial information decomposition to show that previous information-theoretic measures can confound distinct sources of information. We then propose a new set of filters and demonstrate that they more cleanly separate out the background domains, particles, and collisions that are typically associated with information storage, transfer, and modification in cellular automata.
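The redundancy measure at the heart of partial information decomposition, Williams and Beer's I_min, can be computed directly from a joint distribution: for each target value, take the minimum of the specific information that each source provides about it. A minimal sketch for two sources (the XOR example is a standard illustration of purely synergistic information, not data drawn from the paper's cellular automata):

```python
import math
from collections import defaultdict

def i_min(joint):
    """Williams-Beer redundant information I_min(S; A, B) in bits.

    joint: dict mapping (a, b, s) -> probability."""
    p_s, p_a, p_b = defaultdict(float), defaultdict(float), defaultdict(float)
    p_as, p_bs = defaultdict(float), defaultdict(float)  # p(a,s), p(b,s)
    for (a, b, s), p in joint.items():
        p_s[s] += p; p_a[a] += p; p_b[b] += p
        p_as[(a, s)] += p; p_bs[(b, s)] += p

    def spec(s, p_xs, p_x):
        # Specific information the source provides about target value s:
        # I(S=s; X) = sum_x p(x|s) * log2( p(x,s) / (p(x) p(s)) ).
        total = 0.0
        for (x, s2), p in p_xs.items():
            if s2 == s and p > 0:
                total += (p / p_s[s]) * math.log2(p / (p_x[x] * p_s[s]))
        return total

    return sum(p_s[s] * min(spec(s, p_as, p_a), spec(s, p_bs, p_b)) for s in p_s)

# XOR target: all information about S is synergistic, so redundancy is 0.
xor = {(a, b, a ^ b): 0.25 for a in (0, 1) for b in (0, 1)}
print(round(i_min(xor), 6))  # 0.0
```

By contrast, if both sources are identical copies of the target, I_min equals the full 1 bit, which is what distinguishes redundant from synergistic structure in the filters the paper proposes.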
Outram, Victoria; Lalander, Carl-Axel; Lee, Jonathan G M; Davis, E Timothy; Harvey, Adam P
2016-11-01
The productivity of the acetone-butanol-ethanol (ABE) fermentation can be significantly increased by applying various in situ product recovery (ISPR) techniques. There are numerous technically viable processes, but it is not clear which is the most economically viable in practice, and there is little available information about the energy requirements and economics of ISPR for the ABE fermentation. This work compares various ISPR techniques based on UniSim process simulations of the ABE fermentation. The simulations provide information on process energy and separation efficiency, which feeds into an economic assessment. Perstraction was the only technique to reduce the energy demand below that of a batch process, by approximately 5%. Perstraction also had the highest profit increase over a batch process, at 175%. However, perstraction is an immature technology and would need significant development before being integrated into an industrial process.
[Advance in interferogram data processing technique].
Jing, Juan-Juan; Xiangli, Bin; Lü, Qun-Bo; Huang, Min; Zhou, Jin-Song
2011-04-01
Fourier transform spectrometry is a novel information-acquisition technology that integrates imaging and spectroscopy, but the data the instrument acquires are the interference data of the target: an intermediate product that cannot be used directly, so data processing must be applied before the interferometric data can be used successfully. In the present paper, data processing techniques are divided into two classes: general-purpose and special-type. First, advances in universal interferometric data processing techniques are introduced; then the special-type interferometric data extraction methods and processing techniques are illustrated according to the classification of Fourier transform spectroscopy. Finally, trends in interferogram data processing techniques are discussed.
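The universal step in interferogram processing is recovering the spectrum from the interference data by Fourier transformation. A minimal sketch using a naive DFT on a synthetic two-line interferogram (illustrative only; real pipelines add apodization, phase correction, and calibration):

```python
import cmath
import math

def dft_magnitude(signal):
    """Magnitude of the discrete Fourier transform (naive O(N^2) DFT)."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

# Synthetic interferogram: two monochromatic lines at wavenumber bins 5 and 12.
n = 64
interferogram = [math.cos(2 * math.pi * 5 * t / n)
                 + 0.5 * math.cos(2 * math.pi * 12 * t / n)
                 for t in range(n)]
spectrum = dft_magnitude(interferogram)
# The two strongest bins in the first half of the spectrum are the two lines:
peaks = sorted(sorted(range(n // 2), key=lambda k: spectrum[k], reverse=True)[:2])
print(peaks)  # [5, 12]
```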
Nature of the optical information recorded in speckles
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.
1998-09-01
The process of encoding displacement information in electronic holographic interferometry is reviewed. Procedures to extend the applicability of this technique to large deformations are given. The proposed techniques are applied, and results from these experiments are compared with results obtained by other means. The similarity between the two sets of results illustrates the validity of the new techniques.
Evaluation of Ultrasonic Fiber Structure Extraction Technique Using Autopsy Specimens of Liver
NASA Astrophysics Data System (ADS)
Yamaguchi, Tadashi; Hirai, Kazuki; Yamada, Hiroyuki; Ebara, Masaaki; Hachiya, Hiroyuki
2005-06-01
It is very important to diagnose liver cirrhosis noninvasively and correctly. In our previous studies, we proposed a processing technique to detect changes in liver tissue in vivo. In this paper, we evaluate the relationship between liver disease and echo information using autopsy specimens of human liver in vitro. In vitro experiments make it possible to verify the function of a processing parameter clearly and to compare the processing result with the actual liver tissue structure. Using our processing technique, information that does not obey a Rayleigh distribution was extracted from the echo signal of the autopsy liver specimens, depending on changes in a particular processing parameter. The fiber tissue structure of the same specimen was extracted from a number of histological images of stained tissue. We constructed 3D structures from the information extracted from the echo signal and from the fiber structure of the stained tissue, and compared the two. This comparison makes it possible to evaluate the relationship between the echo-signal information that does not obey a Rayleigh distribution and the fibrosis structure.
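Testing whether echo-envelope samples obey a Rayleigh distribution can be sketched with a Kolmogorov-Smirnov distance against a fitted Rayleigh CDF: fully developed speckle gives a Rayleigh envelope, while structured tissue adds a coherent component that pushes the envelope away from it. A minimal illustration on synthetic data (the test and parameters are assumptions for the sketch, not the authors' extraction procedure):

```python
import math
import random

def rayleigh_ks(samples):
    """KS distance between sample amplitudes and a fitted Rayleigh CDF."""
    # Maximum-likelihood scale: sigma^2 = mean(x^2) / 2.
    sigma2 = sum(x * x for x in samples) / (2 * len(samples))
    xs = sorted(samples)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        cdf = 1.0 - math.exp(-x * x / (2.0 * sigma2))
        d = max(d, abs((i + 1) / n - cdf), abs(i / n - cdf))
    return d

random.seed(0)
# Fully developed speckle: envelope of a circular Gaussian field is Rayleigh.
rayleigh = [math.hypot(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(2000)]
# A structured (e.g. fibrotic) region adds a coherent component -> Rician, not Rayleigh.
rician = [math.hypot(random.gauss(3, 1), random.gauss(0, 1)) for _ in range(2000)]
print(rayleigh_ks(rayleigh) < rayleigh_ks(rician))  # True
```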
ERIC Educational Resources Information Center
Xu, Xiaodong; Jiang, Xiaoming; Zhou, Xiaolin
2013-01-01
There have been a number of behavioral and neural studies on the processing of syntactic gender and number agreement information, marked by different morpho-syntactic features during sentence comprehension. By using the event-related potential (ERP) technique, the present study investigated whether the processing of semantic gender information and…
NASA Astrophysics Data System (ADS)
Demigha, Souâd.
2016-03-01
This paper presents a case-based reasoning tool for breast cancer knowledge management, intended to improve breast cancer screening. To develop the tool, we combine concepts and techniques of both case-based reasoning (CBR) and data mining (DM). Physicians and radiologists ground their diagnoses in their expertise (past experience) based on clinical cases. Case-based reasoning is the process of solving new problems based on the solutions of similar past problems, structured as cases; CBR is well suited to medical use. On the other hand, existing traditional hospital information systems (HIS), radiological information systems (RIS) and picture archiving and communication systems (PACS) do not allow medical information to be managed efficiently because of its complexity and heterogeneity. Data mining is the process of extracting information from a data set and transforming it into an understandable structure for further use. Combining CBR with data mining techniques will facilitate the diagnosis and decision-making of medical experts.
Automated synthesis of image processing procedures using AI planning techniques
NASA Technical Reports Server (NTRS)
Chien, Steve; Mortensen, Helen
1994-01-01
This paper describes the Multimission VICAR (Video Image Communication and Retrieval) Planner (MVP) (Chien 1994) system, which uses artificial intelligence planning techniques (Iwasaki & Friedland, 1985; Pemberthy & Weld, 1992; Stefik, 1981) to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing subprograms) in response to image processing requests made to the JPL Multimission Image Processing Laboratory (MIPL). The MVP system allows the user to specify the image processing requirements in terms of the various types of correction required. Given this information, MVP derives the unspecified required processing steps and determines appropriate image processing programs and parameters to achieve the specified image processing goals. The result is an executable image processing program that can then be run to fill the processing request.
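The planning paradigm MVP builds on can be illustrated with a tiny STRIPS-style breadth-first planner that chains operators whose preconditions are satisfied until the goal holds. A minimal sketch (the operator names are hypothetical stand-ins, not actual VICAR processing steps):

```python
from collections import deque

def plan(state, goal, operators):
    """Breadth-first STRIPS-style planner over set-valued states.

    operators: list of (name, preconditions, additions, deletions),
    each condition given as a frozenset of facts."""
    start = frozenset(state)
    frontier, seen = deque([(start, [])]), {start}
    while frontier:
        cur, steps = frontier.popleft()
        if goal <= cur:
            return steps
        for name, pre, add, delete in operators:
            if pre <= cur:
                nxt = (cur - delete) | add
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, steps + [name]))
    return None  # goal unreachable

# Hypothetical correction operators in the spirit of MVP's image processing steps:
ops = [
    ("radiometric-correct", frozenset({"raw"}), frozenset({"radiometric"}), frozenset()),
    ("geometric-correct", frozenset({"radiometric"}), frozenset({"geometric"}), frozenset()),
    ("enhance", frozenset({"geometric"}), frozenset({"enhanced"}), frozenset()),
]
print(plan({"raw"}, frozenset({"enhanced"}), ops))
# ['radiometric-correct', 'geometric-correct', 'enhance']
```

The returned operator sequence corresponds to the executable processing script that a system like MVP would emit.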
Usage of information safety requirements in improving tube bending process
NASA Astrophysics Data System (ADS)
Livshitz, I. I.; Kunakov, E.; Lontsikh, P. A.
2018-05-01
This article is devoted to improving the analysis of a technological process when information security requirements are implemented. The aim of this research is to analyze the increase in competitiveness of aircraft industry enterprises due to information technology implementation, using the tube bending process as an example. The article analyzes current tube bending types and techniques. In addition, an analysis of potential risks in the tube bending process is carried out from the standpoint of information security.
Advanced technology development for image gathering, coding, and processing
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.
1990-01-01
Three overlapping areas of research activities are presented: (1) Information theory and optimal filtering are extended to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing. (2) Focal-plane processing techniques and technology are developed to combine effectively image gathering with coding. The emphasis is on low-level vision processing akin to the retinal processing in human vision. (3) A breadboard adaptive image-coding system is being assembled. This system will be used to develop and evaluate a number of advanced image-coding technologies and techniques as well as research the concept of adaptive image coding.
Protocol Analysis: A Methodology for Exploring the Information Processing of Gifted Students.
ERIC Educational Resources Information Center
Anderson, Margaret A.
1986-01-01
Protocol analysis techniques, in which subjects are taught to think aloud, can provide information on the mental operations used by gifted learners. Concerns over the use of such data are described and new directions for the technique are proposed. (CL)
Jacques Ellul and Democracy's "Vital Information" Premise. Journalism Monographs No. 45.
ERIC Educational Resources Information Center
Christians, Clifford G.
In the course of elaborating "la technique," Jacques Ellul stoutly contradicts the democratic assumption that citizens can have sufficient information to participate knowledgeably in the governing process. "La technique" converts message systems into propagandization networks and erects an inflexible boundary which democracy…
1D Seismic reflection technique to increase depth information in surface seismic investigations
NASA Astrophysics Data System (ADS)
Camilletti, Stefano; Fiera, Francesco; Umberto Pacini, Lando; Perini, Massimiliano; Prosperi, Andrea
2017-04-01
1D seismic methods, such as MASW, Re.Mi. and HVSR, have been used extensively over the past 20 years in engineering investigations, bedrock research, Vs profiling and, to some extent, hydrologic applications. Recent advances in equipment, sound sources and computer interpretation techniques make 1D seismic methods highly effective in shallow subsoil modeling. Classical 1D seismic surveys allow economical collection of subsurface data; however, they fail to return accurate information for depths greater than 50 meters. Using a particular acquisition technique, it is possible to collect data that can be quickly processed with the reflection technique in order to obtain more accurate velocity information at depth. Furthermore, data processing returns a narrow stratigraphic section, alongside the 1D velocity model, where lithological boundaries are represented. This work will show how to collect a single CMP to determine: (1) depth of bedrock; (2) gravel layers in clayey domains; (3) an accurate Vs profile. Seismic traces were processed by means of new software developed in collaboration with the SARA electronics instruments S.r.l company, Perugia, Italy. This software has the great advantage that it can be used directly in the field, reducing the time elapsing between acquisition and processing.
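The depth information recovered from such reflection picks follows from the basic relation depth = velocity x two-way time / 2, accumulated layer by layer. A minimal sketch (the picks and interval velocities are illustrative assumptions, not values from the work):

```python
def depths_from_picks(picks):
    """Cumulative reflector depths from (interval two-way time [s], interval velocity [m/s])."""
    z, depths = 0.0, []
    for twt, v in picks:
        z += v * twt / 2.0  # one-way distance travelled within this interval
        depths.append(z)
    return depths

# Hypothetical single-CMP picks: clay over gravel over bedrock.
print(depths_from_picks([(0.040, 1600.0), (0.030, 2200.0), (0.020, 3000.0)]))
# [32.0, 65.0, 95.0]
```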
Reshaping the Enterprise through an Information Architecture and Process Reengineering.
ERIC Educational Resources Information Center
Laudato, Nicholas C.; DeSantis, Dennis J.
1995-01-01
The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…
30 CFR 280.51 - What types of geophysical data and information must I submit to MMS?
Code of Federal Regulations, 2010 CFR
2010-07-01
..., shallow and deep subbottom profiles, bathymetry, sidescan sonar, gravity and magnetic surveys, and special... and of a quality suitable for processing; (c) Processed geophysical information derived from seismic... interpretive evaluation, reflecting state-of-the-art processing techniques; and (d) Other geophysical data...
Capturing and Modeling Domain Knowledge Using Natural Language Processing Techniques
2005-06-01
Intelligence Artificielle, France, May 2001, p. 109-118 [Barrière, 2001] -----. "Investigating the Causal Relation in Informative Texts". Terminology, 7:2...out of the flood of information, the military have to create new ways of processing sensor and intelligence information, and of providing the results to commanders who must take timely operational
NASA Technical Reports Server (NTRS)
1978-01-01
The discipline programs of the Space and Terrestrial (S&T) Applications Program are described and examples of research areas of current interest are given. Applications of space techniques to improve conditions on earth are summarized. Discipline programs discussed include: resource observations; environmental observations; communications; materials processing in space; and applications systems/information systems. Format information on the submission of unsolicited proposals for research related to the S&T Applications Program is given.
Rismanchian, Farhood; Lee, Young Hoon
2017-07-01
This article proposes an approach to help designers analyze complex care processes and identify the optimal layout of an emergency department (ED) considering several objectives simultaneously. These objectives include minimizing the distances traveled by patients, maximizing design preferences, and minimizing the relocation costs. Rising demand for healthcare services leads to increasing demand for new hospital buildings as well as renovating existing ones. Operations management techniques have been successfully applied in both manufacturing and service industries to design more efficient layouts. However, the high complexity of healthcare processes makes it challenging to apply these techniques in healthcare environments. Process mining techniques were applied to address the problem of complexity and to enhance healthcare process analysis. Process-related information, such as information about the clinical pathways, was extracted from the information system of an ED. A goal programming approach was then employed to find a single layout that would simultaneously satisfy several objectives. The layout identified using the proposed method reduced the distances traveled by noncritical and critical patients by 42.2% and 47.6%, respectively, and minimized the relocation costs. This study has shown that an efficient placement of the clinical units yields remarkable improvements in the distances traveled by patients.
Terminology model discovery using natural language processing and visualization techniques.
Zhou, Li; Tao, Ying; Cimino, James J; Chen, Elizabeth S; Liu, Hongfang; Lussier, Yves A; Hripcsak, George; Friedman, Carol
2006-12-01
Medical terminologies are important for unambiguous encoding and exchange of clinical information. The traditional manual method of developing terminology models is time-consuming and limited in the number of phrases that a human developer can examine. In this paper, we present an automated method for developing medical terminology models based on natural language processing (NLP) and information visualization techniques. Surgical pathology reports were selected as the testing corpus for developing a pathology procedure terminology model. The use of a general NLP processor for the medical domain, MedLEE, provides an automated method for acquiring semantic structures from a free text corpus and sheds light on a new high-throughput method of medical terminology model development. The use of an information visualization technique supports the summarization and visualization of the large quantity of semantic structures generated from medical documents. We believe that a general method based on NLP and information visualization will facilitate the modeling of medical terminologies.
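The pipeline the abstract describes (NLP-based phrase acquisition, then aggregation into a terminology model) can be illustrated with a deliberately crude stand-in for the NLP step. The head-noun list, the bigram heuristic, and the sample reports below are all hypothetical; MedLEE itself produces far richer semantic structures.

```python
import re
from collections import Counter, defaultdict

def candidate_terms(text):
    """Very crude stand-in for an NLP processor: lowercase word bigrams
    whose second word is a procedure-like head noun (assumed list)."""
    words = re.findall(r"[a-z]+", text.lower())
    heads = {"biopsy", "excision", "resection", "aspiration"}
    return [f"{a} {b}" for a, b in zip(words, words[1:]) if b in heads]

def terminology_model(reports):
    """Group modifier+head phrases under their head noun, with frequencies,
    mimicking the aggregation step of terminology-model discovery."""
    model = defaultdict(Counter)
    for report in reports:
        for phrase in candidate_terms(report):
            modifier, head = phrase.split()
            model[head][modifier] += 1
    return model

# Hypothetical pathology-report snippets.
reports = ["Needle biopsy of the left breast.",
           "Skin excision, right arm.",
           "Repeat needle biopsy performed."]
print(dict(terminology_model(reports)))
```

The frequency counts attached to each head noun are what a visualization layer would then summarize for a human modeler.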
An Evaluation of Understandability of Patient Journey Models in Mental Health.
Percival, Jennifer; McGregor, Carolyn
2016-07-28
There is a significant trend toward implementing health information technology to reduce administrative costs and improve patient care. Unfortunately, little awareness exists of the challenges of integrating information systems with existing clinical practice. The systematic integration of clinical processes with information system and health information technology can benefit the patients, staff, and the delivery of care. This paper presents a comparison of the degree of understandability of patient journey models. In particular, the authors demonstrate the value of a relatively new patient journey modeling technique called the Patient Journey Modeling Architecture (PaJMa) when compared with traditional manufacturing based process modeling tools. The paper also presents results from a small pilot case study that compared the usability of 5 modeling approaches in a mental health care environment. Five business process modeling techniques were used to represent a selected patient journey. A mix of both qualitative and quantitative methods was used to evaluate these models. Techniques included a focus group and survey to measure usability of the various models. The preliminary evaluation of the usability of the 5 modeling techniques has shown increased staff understanding of the representation of their processes and activities when presented with the models. Improved individual role identification throughout the models was also observed. The extended version of the PaJMa methodology provided the most clarity of information flows for clinicians. The extended version of PaJMa provided a significant improvement in the ease of interpretation for clinicians and increased the engagement with the modeling process. The use of color and its effectiveness in distinguishing the representation of roles was a key feature of the framework not present in other modeling approaches. 
Future research should focus on extending the pilot case study to a more diversified group of clinicians and health care support workers.
Fowler, J C; Hilsenroth, M J; Handler, L
2000-08-01
In this article, we describe Martin Mayman's approach to early childhood memories as a projective technique, beginning with his scientific interest in learning theory, coupled with his interest in ego psychology and object relations theory. We describe Mayman's contributions to the use of the early memories technique to inform the psychotherapy process, tying assessment closely to psychotherapy and making assessment more useful in treatment. We then present a representative sample of research studies that demonstrate the reliability and validity of early memories, followed by case examples in which the early memories informed the therapy process, including issues of transference and countertransference.
MRI and unilateral NMR study of reindeer skin tanning processes.
Zhu, Lizheng; Del Federico, Eleonora; Ilott, Andrew J; Klokkernes, Torunn; Kehlet, Cindie; Jerschow, Alexej
2015-04-07
The study of arctic or subarctic indigenous skin clothing material, known for its design and ability to keep the body warm, provides information about the tanning materials and techniques. The study also provides clues about the culture that created it, since tanning processes are often specific to certain indigenous groups. Untreated skin samples and samples treated with willow (Salix sp) bark extract and cod liver oil are compared in this study using both MRI and unilateral NMR techniques. The two types of samples show different proton spatial distributions and different relaxation times, which may also provide information about the tanning technique and aging behavior.
A vector scanning processing technique for pulsed laser velocimetry
NASA Technical Reports Server (NTRS)
Wernet, Mark P.; Edwards, Robert V.
1989-01-01
Pulsed-laser-sheet velocimetry yields two-dimensional velocity vectors across an extended planar region of a flow. Current processing techniques offer high-precision (1-percent) velocity estimates, but can require hours of processing time on specialized array processors. Sometimes, however, a less accurate (about 5 percent) data-reduction technique which also gives unambiguous velocity vector information is acceptable. Here, a direct space-domain processing technique is described and shown to be far superior to previous methods in achieving these objectives. It uses a novel data coding and reduction technique and has no 180-deg directional ambiguity. A complex convection vortex flow was recorded and completely processed in under 2 min on an 80386-based PC, producing a two-dimensional velocity-vector map of the flowfield. Pulsed-laser velocimetry data can thus be reduced quickly and reasonably accurately, without specialized array processing hardware.
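The displacement-estimation idea underlying pulsed-laser velocimetry can be sketched in one dimension: find the shift that best aligns two successive exposures. This is a minimal correlation-based sketch of the general principle, not the paper's vector-scanning space-domain method; the signals below are synthetic.

```python
def best_shift(frame_a, frame_b, max_shift):
    """Return the shift of frame_b relative to frame_a that maximizes
    their overlap correlation (the basis of correlation-based velocimetry)."""
    def corr(shift):
        pairs = [(frame_a[i], frame_b[i + shift])
                 for i in range(len(frame_a))
                 if 0 <= i + shift < len(frame_b)]
        return sum(a * b for a, b in pairs)
    return max(range(-max_shift, max_shift + 1), key=corr)

# A particle "image" (intensity spike) displaced by 3 samples between pulses.
a = [0, 0, 1, 5, 1, 0, 0, 0, 0, 0]
b = [0, 0, 0, 0, 0, 1, 5, 1, 0, 0]
print(best_shift(a, b, max_shift=4))  # 3
```

Dividing the recovered shift by the inter-pulse time gives the velocity estimate for that interrogation region.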
Critical incident technique: a user's guide for nurse researchers.
Schluter, Jessica; Seaton, Philippa; Chaboyer, Wendy
2008-01-01
This paper is a description of the development and processes of the critical incident technique and its applicability to nursing research, using a recently-conducted study of the Australian nursing workforce as an exemplar. Issues are raised for consideration prior to the technique being put into practice. Since 1954, the critical incident technique has been used to study people's activities in a variety of professions. This five-step technique can be modified for specific settings and research questions. The fruitfulness of a study using the technique relies on gaining three important pieces of information. First, participants' complete and rich descriptions of the situation or event to be explored; secondly, the specific actions of the person/s involved in the event to aid understanding of why certain decisions were made; thirdly, the outcome of the event, to ascertain the effectiveness of the behaviour. As in other qualitative methodologies, an inductive analysis process can be used with the critical incident technique. Rich contextual information can be obtained using this technique. It generates information and uncovers tacit knowledge through assisting participants to describe their thought processes and actions during the event. Use of probing questions that determine how participants take part in certain events, or act in the ways they do, greatly enhances the outcome. A full interpretation of the event can only occur when all its aspects are provided. The critical incident technique is a practical method that allows researchers to understand complexities of the nursing role and function, and the interactions between nurses and other clinicians.
TQM (Total Quality Management) SPARC (Special Process Action Review Committees) Handbook
1989-08-01
This document describes the techniques used to support and guide the Special Process Action Review Committees for accomplishing their goals for Total Quality Management (TQM). It includes concepts and definitions, checklists, sample formats, and assessment criteria. Keywords: Continuous process improvement; Logistics information; Process analysis; Quality control; Quality assurance; Total Quality Management; Statistical processes; Management planning and control; Management training; Management information systems.
OPERATIONS RESEARCH IN THE DESIGN OF MANAGEMENT INFORMATION SYSTEMS
management information systems is concerned with the identification and detailed specification of the information and data processing...of advanced data processing techniques in management information systems today, the close coordination of operations research and data systems activities has become a practical necessity for the modern business firm.... information systems in which mathematical models are employed as the basis for analysis and systems design. Operations research provides a
ERIC Educational Resources Information Center
Alfonseca, Enrique; Rodriguez, Pilar; Perez, Diana
2007-01-01
This work describes a framework that combines techniques from Adaptive Hypermedia and Natural Language processing in order to create, in a fully automated way, on-line information systems from linear texts in electronic format, such as textbooks. The process is divided into two steps: an "off-line" processing step, which analyses the source text,…
Employee empowerment through team building and use of process control methods.
Willems, S
1998-02-01
The article examines the use of statistical process control and performance improvement techniques in employee empowerment. The focus is how these techniques provide employees with information to improve their productivity and become involved in the decision-making process. Findings suggest that at one Mississippi hospital employee improvement has had a positive effect on employee productivity, morale, and quality of work.
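The statistical process control idea the article applies, flagging observations that fall outside mean ± 3σ limits computed from a stable baseline, can be sketched briefly. The productivity figures below are hypothetical, invented for illustration.

```python
import statistics

def control_limits(samples, sigma_level=3):
    """Shewhart-style control limits: mean ± k·(sample standard deviation)."""
    mean = statistics.fmean(samples)
    sd = statistics.stdev(samples)
    return mean - sigma_level * sd, mean + sigma_level * sd

def out_of_control(samples, lcl, ucl):
    """Indices of points falling outside the control limits."""
    return [i for i, x in enumerate(samples) if not lcl <= x <= ucl]

# Hypothetical daily productivity scores for one employee group.
baseline = [52, 49, 51, 50, 53, 48, 50, 51, 49, 52]
lcl, ucl = control_limits(baseline)
print(out_of_control(baseline + [70], lcl, ucl))  # → [10]: the appended 70 is flagged
```

Points inside the limits are treated as normal fluctuation; only the flagged points warrant investigation, which is how such charts give employees actionable information about their own process.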
Working on the Boundaries: Philosophies and Practices of the Design Process
NASA Technical Reports Server (NTRS)
Ryan, R.; Blair, J.; Townsend, J.; Verderaime, V.
1996-01-01
While systems engineering process is a program formal management technique and contractually binding, the design process is the informal practice of achieving the design project requirements throughout all design phases of the systems engineering process. The design process and organization are systems and component dependent. Informal reviews include technical information meetings and concurrent engineering sessions, and formal technical discipline reviews are conducted through the systems engineering process. This paper discusses and references major philosophical principles in the design process, identifies its role in interacting systems and disciplines analyses and integrations, and illustrates the process application in experienced aerostructural designs.
Data Entities and Information System Matrix for Integrated Agriculture Information System (IAIS)
NASA Astrophysics Data System (ADS)
Budi Santoso, Halim; Delima, Rosa
2018-03-01
Integrated Agriculture Information System is a system developed to process data, information, and knowledge in the agriculture sector. The Integrated Agriculture Information System brings valuable information to farmers: (1) fertilizer prices; (2) agricultural techniques and practices; (3) pest management; (4) cultivation; (5) irrigation; (6) post-harvest processing; (7) innovation in agricultural processing. The Integrated Agriculture Information System contains 9 subsystems. To bring integrated information to users and stakeholders, it needs an integrated database approach. Thus, the researchers describe the data entities and their matrix relating to the subsystems in the Integrated Agriculture Information System (IAIS). As a result, there are 47 data entities in the single, integrated database.
Recent advances in nuclear magnetic resonance quantum information processing.
Criger, Ben; Passante, Gina; Park, Daniel; Laflamme, Raymond
2012-10-13
Quantum information processors have the potential to drastically change the way we communicate and process information. Nuclear magnetic resonance (NMR) has been one of the first experimental implementations of quantum information processing (QIP) and continues to be an excellent testbed to develop new QIP techniques. We review the recent progress made in NMR QIP, focusing on decoupling, pulse engineering and indirect nuclear control. These advances have enhanced the capabilities of NMR QIP, and have useful applications in both traditional NMR and other QIP architectures.
The effects of solar incidence angle over digital processing of LANDSAT data
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Novo, E. M. L. M.
1983-01-01
A technique to extract the topography modulation component from digital data is described. The enhancement process is based on the fact that the pixel contains two types of information: (1) reflectance variation due to the target; (2) reflectance variation due to the topography. In order to enhance the signal variation due to topography, the technique recommends the extraction from original LANDSAT data of the component resulting from target reflectance. Considering that the role of topographic modulation over the pixel information will vary with solar incidence angle, the results of this technique of digital processing will differ from one season to another, mainly in highly dissected topography. In this context, the effects of solar incidence angle over the topographic modulation technique were evaluated. Two sets of MSS/LANDSAT data, with solar elevation angles varying from 22 to 41 deg were selected to implement the digital processing at the Image-100 System. A secondary watershed (Rio Bocaina) draining into Rio Paraiba do Sul (Sao Paulo State) was selected as a test site. The results showed that the technique used was more appropriate to MSS data acquired under higher Sun elevation angles. Topographic modulation components applied to low Sun elevation angles lessens rather than enhances topography.
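One standard way to separate target reflectance from topographic shading, related in spirit to the technique the abstract evaluates, is band ratioing: because the illumination factor multiplies every band at a pixel, the ratio of two bands cancels it. This is a generic sketch on synthetic values, not the Image-100 procedure used in the study.

```python
def band_ratio(band_a, band_b, eps=1e-6):
    """Per-pixel ratio of two spectral bands; the multiplicative
    illumination term cos(i) cancels, suppressing topographic shading."""
    return [[a / (b + eps) for a, b in zip(ra, rb)]
            for ra, rb in zip(band_a, band_b)]

# Synthetic 2x2 scene: radiance = reflectance * illumination. The same
# target (reflectance 0.4 vs 0.2 in two bands) under shading 1.0 and 0.5.
illum = [[1.0, 0.5], [1.0, 0.5]]
band1 = [[0.4 * s for s in row] for row in illum]
band2 = [[0.2 * s for s in row] for row in illum]
print(band_ratio(band1, band2))  # ≈ 2.0 everywhere despite the shading
```

The cancellation degrades at very low sun elevations, where shadowed pixels approach the sensor noise floor, which is consistent with the abstract's finding that the technique works better at higher sun angles.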
ERIC Educational Resources Information Center
Bailey, Anthony
2013-01-01
The nominal group technique (NGT) is a structured process to gather information from a group. The technique was first described in 1975 and has since become a widely-used standard to facilitate working groups. The NGT is effective for generating large numbers of creative new ideas and for group priority setting. This paper describes the process of…
Merlyn J. Paulson
1979-01-01
This paper outlines a project level process (V.I.S.) which utilizes very accurate and flexible computer algorithms in combination with contemporary site analysis and design techniques for visual evaluation, design and management. The process provides logical direction and connecting bridges through problem identification, information collection and verification, visual...
2001-04-01
divided into a number of sessions. SESSION I – INFORMATION PROCESSING SYSTEMS AND TECHNIQUES (I) The first two sessions were...establishing parameters and laying the theoretical/mathematical foundations of future information systems. This is a field in which the...transfer of data and intelligence is increasingly felt. This symposium will cover a broad operational domain that is of great
Electronic-Power-Transformer Design Guide
NASA Technical Reports Server (NTRS)
Schwarze, G. E.; Lagadinos, J. C.; Ahearn, J. F.
1983-01-01
Compilation of information on design procedures, electrical properties, and fabrication. Guide provides information on design procedures; magnetic and insulating material electrical properties; impregnating, encapsulating and processing techniques.
White-Light Optical Information Processing and Holography.
1982-05-03
artifact noise. However, the deblurring spatial filter that we used was a narrow spectral band centered at 5154 Å green light. To compensate for the scaling...Processing, White-Light Holography, Image Processing, Optical Signal Processing, Image Subtraction, Image Deblurring. ...optical processing technique, we had shown that the incoherent source technique provides better image quality and very low coherent artifact noise
Refining the 'cucumber' technique for laryngeal biopsy.
Robertson, S; Cooper, L; McPhaden, A; MacKenzie, K
2011-06-01
To refine the case selection process for the 'cucumber' mounting system for laryngeal biopsies. We conducted a retrospective audit of cucumber technique specimens taken between January 2002 and December 2008. We analysed the clinical indications for biopsy and the pathological diagnosis, for each specimen, in order to inform our case selection process. The cucumber technique was used for 125 laryngeal specimens. Sixty specimens were taken for diagnostic sampling, 46 were taken during endoscopic laser resection, and 19 for overtly benign pathology. The cucumber technique was most useful for the interpretation of margins in endoscopic laser resection specimens. The cucumber technique is most useful for endoscopic resection cases in which tumour, dysplasia or suspicious lesions have been excised. Detailed information on resection margins is invaluable during multidisciplinary team discussions on patient management. Detailed photography of mounted specimens enables both laryngologist and pathologist to orientate and interpret specimens accurately.
NASA Technical Reports Server (NTRS)
Eckel, J. S.; Crabtree, M. S.
1984-01-01
Analytical and subjective techniques that are sensitive to the information transmission and processing requirements of individual communications-related tasks are used to assess workload imposed on the aircrew by A-10 communications requirements for civilian transport category aircraft. Communications-related tasks are defined to consist of the verbal exchanges between crews and controllers. Three workload estimating techniques are proposed. The first, an information theoretic analysis, is used to calculate bit values for perceptual, manual, and verbal demands in each communication task. The second, a paired-comparisons technique, obtains subjective estimates of the information processing and memory requirements for specific messages. By combining the results of the first two techniques, a hybrid analytical scale is created. The third, a subjective rank ordering of sequences of communications tasks, provides an overall scaling of communications workload. Recommendations for future research include an examination of communications-induced workload among the air crew and the development of simulation scenarios.
Digital image processing for information extraction.
NASA Technical Reports Server (NTRS)
Billingsley, F. C.
1973-01-01
The modern digital computer has made practical image processing techniques for handling nonlinear operations in both the geometrical and the intensity domains, various types of nonuniform noise cleanup, and the numerical analysis of pictures. An initial requirement is that a number of anomalies caused by the camera (e.g., geometric distortion, MTF roll-off, vignetting, and nonuniform intensity response) must be taken into account or removed to avoid their interference with the information extraction process. Examples illustrating these operations are discussed along with computer techniques used to emphasize details, perform analyses, classify materials by multivariate analysis, detect temporal differences, and aid in human interpretation of photos.
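The camera-anomaly removal the abstract lists first, correcting nonuniform intensity response such as vignetting, is commonly done by flat-field division: each pixel is divided by the camera's normalized response measured on a uniform target. This is a minimal sketch on a synthetic 2x2 image, not the specific pipeline of the paper.

```python
def flat_field_correct(image, flat):
    """Divide each pixel by the normalized camera response (flat field)
    to undo vignetting and nonuniform intensity response."""
    flat_mean = sum(map(sum, flat)) / (len(flat) * len(flat[0]))
    return [[px / (f / flat_mean) for px, f in zip(ri, rf)]
            for ri, rf in zip(image, flat)]

# Synthetic example: a uniform scene of brightness 100 seen through a
# camera whose response falls off toward the right-hand column.
response = [[1.0, 0.5], [1.0, 0.5]]
image = [[100 * r for r in row] for row in response]
print(flat_field_correct(image, response))  # ~75.0 everywhere: shading removed
```

After correction every pixel carries the same value (the scene brightness scaled by the mean response), so subsequent information extraction no longer confuses camera falloff with scene content.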
Particle sizing in rocket motor studies utilizing hologram image processing
NASA Technical Reports Server (NTRS)
Netzer, David; Powers, John
1987-01-01
A technique of obtaining particle size information from holograms of combustion products is described. The holograms are obtained with a pulsed ruby laser through windows in a combustion chamber. The reconstruction is done with a krypton laser with the real image being viewed through a microscope. The particle size information is measured with a Quantimet 720 image processing system which can discriminate various features and perform measurements of the portions of interest in the image. Various problems that arise in the technique are discussed, especially those that are a consequence of the speckle due to the diffuse illumination used in the recording process.
ERIC Educational Resources Information Center
Trevathan, Jarrod; Myers, Trina
2013-01-01
Process-Oriented Guided Inquiry Learning (POGIL) is a technique used to teach in large lectures and tutorials. It invokes interaction, team building, learning and interest through highly structured group work. Currently, POGIL has only been implemented in traditional classroom settings where all participants are physically present. However,…
Information Extraction Using Controlled English to Support Knowledge-Sharing and Decision-Making
2012-06-01
or language variants. CE-based information extraction will greatly facilitate the processes in the cognitive and social domains that enable forces...processor is run to turn the atomic CE into a more "stylistically felicitous" CE, using techniques such as aggregating all information about an entity
Microscale bioprocess optimisation.
Micheletti, Martina; Lye, Gary J
2006-12-01
Microscale processing techniques offer the potential to speed up the delivery of new drugs to the market, reducing development costs and increasing patient benefit. These techniques have application across both the chemical and biopharmaceutical sectors. The approach involves the study of individual bioprocess operations at the microlitre scale using either microwell or microfluidic formats. In both cases the aim is to generate quantitative bioprocess information early on, so as to inform bioprocess design and speed translation to the manufacturing scale. Automation can enhance experimental throughput and will facilitate the parallel evaluation of competing biocatalyst and process options.
Methods for Improving Information from ’Undesigned’ Human Factors Experiments.
Human factors engineering, Information processing, Regression analysis, Experimental design, Least squares method, Analysis of variance, Correlation techniques, Matrices (Mathematics), Multiple disciplines, Mathematical prediction
NASA Astrophysics Data System (ADS)
Wiesmann, William P.; Pranger, L. Alex; Bogucki, Mary S.
1998-05-01
Remote monitoring of physiologic data from individual high-risk workers distributed over time and space is a considerable challenge. This is often due to an inadequate capability to accurately integrate large amounts of data into usable information in real time. In this report, we have used the vertical and horizontal organization of the 'fireground' as a framework to design a distributed network of sensors. In this system, sensor output is linked through a hierarchical object-oriented programming process to accurately interpret physiological data, incorporate these data into a synchronous model, and relay processed data, trends and predictions to members of the fire incident command structure. There are several unique aspects to this approach. The first is a process that accounts for variability in vital parameter values by including an adaptive network in each data process. This information is used by the model in an iterative process to baseline a 'normal' physiologic response to a given stress for each individual and to detect deviations that indicate dysfunction or a significant insult. The second unique capability of the system orders the information for each user, including the subject, local company officers, medical personnel and the incident commanders. Information can be retrieved and used for training exercises and after-action analysis. Finally, this system can easily be adapted to existing communication and processing links, while incorporating the best parts of current models through the use of object-oriented programming techniques. These modern software techniques are well suited to handling multiple data processes independently over time in a distributed network.
Digital Image Processing Overview For Helmet Mounted Displays
NASA Astrophysics Data System (ADS)
Parise, Michael J.
1989-09-01
Digital image processing provides a means to manipulate an image and presents a user with a variety of display formats that are not available in the analog image processing environment. When performed in real time and presented on a Helmet Mounted Display, system capability and flexibility are greatly enhanced. The information content of a display can be increased by the addition of real time insets and static windows from secondary sensor sources, near real time 3-D imaging from a single sensor can be achieved, graphical information can be added, and enhancement techniques can be employed. Such increased functionality is generating a considerable amount of interest in the military and commercial markets. This paper discusses some of these image processing techniques and their applications.
Using object-oriented analysis techniques to support system testing
NASA Astrophysics Data System (ADS)
Zucconi, Lin
1990-03-01
Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.
Separating Item and Order Information through Process Dissociation
ERIC Educational Resources Information Center
Nairne, James S.; Kelley, Matthew R.
2004-01-01
In the present paper, we develop and apply a technique, based on the logic of process dissociation, for obtaining numerical estimates of item and order information. Certain variables, such as phonological similarity, are widely believed to produce dissociative effects on item and order retention. However, such beliefs rest on the questionable…
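The process-dissociation logic the paper builds on yields numerical estimates from performance under inclusion and exclusion instructions. The standard estimating equations can be sketched directly; the 0.80/0.20 proportions below are hypothetical, and the paper's specific mapping onto item versus order retention goes beyond this sketch.

```python
def process_dissociation(p_inclusion, p_exclusion):
    """Standard process-dissociation estimates of the controlled (C) and
    automatic (A) components, from the equations
        inclusion = C + (1 - C) * A
        exclusion = (1 - C) * A
    """
    c = p_inclusion - p_exclusion
    a = p_exclusion / (1 - c) if c < 1 else 0.0
    return c, a

# Hypothetical proportions correct under the two instruction conditions.
c, a = process_dissociation(0.80, 0.20)
print(round(c, 3), round(a, 3))  # 0.6 0.5
```

Because the two equations share the same parameters, subtracting exclusion from inclusion isolates C, and the remainder identifies A; the paper applies the same algebraic logic to separate item and order contributions.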
NLPIR: A Theoretical Framework for Applying Natural Language Processing to Information Retrieval.
ERIC Educational Resources Information Center
Zhou, Lina; Zhang, Dongsong
2003-01-01
Proposes a theoretical framework called NLPIR that integrates natural language processing (NLP) into information retrieval (IR) based on the assumption that there exists representation distance between queries and documents. Discusses problems in traditional keyword-based IR, including relevance, and describes some existing NLP techniques.…
How Students Learn: Information Processing, Intellectual Development and Confrontation
ERIC Educational Resources Information Center
Entwistle, Noel
1975-01-01
A model derived from information processing theory is described, which helps to explain the complex verbal learning of students and suggests implications for lecturing techniques. Other factors affecting learning, which are not covered by the model, are discussed in relationship to it: student's intellectual development and effects of individual…
Fingerprint pattern restoration by digital image processing techniques.
Wen, Che-Yen; Yu, Chiu-Chung
2003-09-01
Fingerprint evidence plays an important role in solving criminal problems. However, defective (lacking information needed for completeness) or contaminated (undesirable information included) fingerprint patterns make the identifying and recognizing processes difficult. Unfortunately, this is the usual case. In the recognizing process (enhancement of patterns, or elimination of "false alarms" so that a fingerprint pattern can be searched in the Automated Fingerprint Identification System (AFIS)), chemical and physical techniques have been proposed to improve pattern legibility. In the identifying process, a fingerprint examiner can enhance contaminated (but not defective) fingerprint patterns under guidelines provided by the Scientific Working Group on Friction Ridge Analysis, Study and Technology (SWGFAST), the Scientific Working Group on Imaging Technology (SWGIT), and an AFIS working group within the National Institute of Justice. Recently, image processing techniques have been successfully applied in forensic science. For example, we have applied image enhancement methods to improve the legibility of digital images such as fingerprints and vehicle plate numbers. In this paper, we propose a novel digital image restoration technique based on the AM (amplitude modulation)-FM (frequency modulation) reaction-diffusion method to restore defective or contaminated fingerprint patterns. This method shows its potential application to fingerprint pattern enhancement in the recognizing process (but not the identifying process). Synthetic and real images are used to show the capability of the proposed method. The results of enhancing fingerprint patterns by the manual process and by our method are evaluated and compared.
ERIC Educational Resources Information Center
Rimoldi, Horacio J. A.
The study of problem solving is conducted through analysis of the process that leads to the final answer. The type of information obtained by studying the process is compared with the information obtained by studying the final answer alone. The experimental technique used makes it possible to identify the sequence of questions (tactics) that subjects ask…
14 CFR 17.13 - Dispute resolution process for protests.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Resolution (ADR) techniques to resolve the protest, pursuant to subpart D of this part, or they will proceed....31(c), informal ADR techniques may be utilized simultaneously with ongoing adjudication. (e) The...
14 CFR 17.13 - Dispute resolution process for protests.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Resolution (ADR) techniques to resolve the protest, pursuant to subpart D of this part, or they will proceed....31(c), informal ADR techniques may be utilized simultaneously with ongoing adjudication. (e) The...
Mandarin Chinese Tone Identification in Cochlear Implants: Predictions from Acoustic Models
Morton, Kenneth D.; Torrione, Peter A.; Throckmorton, Chandra S.; Collins, Leslie M.
2015-01-01
It has been established that current cochlear implants do not supply adequate spectral information for perception of tonal languages. Comprehension of a tonal language, such as Mandarin Chinese, requires recognition of lexical tones. New strategies of cochlear stimulation such as variable stimulation rate and current steering may provide the means of delivering more spectral information and thus may provide the auditory fine structure required for tone recognition. Several cochlear implant signal processing strategies are examined in this study, the continuous interleaved sampling (CIS) algorithm, the frequency amplitude modulation encoding (FAME) algorithm, and the multiple carrier frequency algorithm (MCFA). These strategies provide different types and amounts of spectral information. Pattern recognition techniques can be applied to data from Mandarin Chinese tone recognition tasks using acoustic models as a means of testing the abilities of these algorithms to transmit the changes in fundamental frequency indicative of the four lexical tones. The ability of processed Mandarin Chinese tones to be correctly classified may predict trends in the effectiveness of different signal processing algorithms in cochlear implants. The proposed techniques can predict trends in performance of the signal processing techniques in quiet conditions but fail to do so in noise. PMID:18706497
Review of chart recognition in document images
NASA Astrophysics Data System (ADS)
Liu, Yan; Lu, Xiaoqing; Qin, Yeyang; Tang, Zhi; Xu, Jianbo
2013-01-01
As an effective way of transmitting information, charts are widely used to represent scientific statistics in books, research papers, newspapers, etc. Though textual information is still the major source of data, there has been an increasing trend of introducing graphs, pictures, and figures into the information pool. Text recognition for documents has been accomplished using optical character recognition (OCR) software. Chart recognition techniques, a necessary supplement to OCR for document images, remain an unsolved problem due to the great subjectiveness and variety of chart styles. This paper reviews the development of chart recognition techniques over the past decades and presents the focuses of current research. The whole process of chart recognition is presented systematically, comprising three main parts: chart segmentation, chart classification, and chart interpretation. In each part, the latest research work is introduced. Finally, the paper concludes with a summary and promising directions for future research.
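The three-stage pipeline the review describes (segmentation, classification, interpretation) can be sketched as a skeleton. The stage logic below is a toy rule set over labeled regions standing in for an image, not any surveyed system:

```python
def segment(document):
    """Chart segmentation: isolate chart regions from the rest of the page."""
    return [r for r in document if r.get("kind") == "chart"]

def classify(region):
    """Chart classification: decide the chart style from crude mark counts."""
    if region.get("bars", 0) > 0:
        return "bar"
    if region.get("wedges", 0) > 0:
        return "pie"
    return "line"

def interpret(region, style):
    """Chart interpretation: map graphic marks back to data values."""
    return {"style": style, "series": region.get("values", [])}

page = [{"kind": "text"}, {"kind": "chart", "bars": 4, "values": [3, 1, 4, 1]}]
results = [interpret(r, classify(r)) for r in segment(page)]
print(results)  # [{'style': 'bar', 'series': [3, 1, 4, 1]}]
```

In a real system each stage would operate on pixels and detected graphical primitives; the staged structure is the point being illustrated.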
Jitter model and signal processing techniques for pulse width modulation optical recording
NASA Technical Reports Server (NTRS)
Liu, Max M.-K.
1991-01-01
A jitter model and signal processing techniques are discussed for data recovery in Pulse Width Modulation (PWM) optical recording. In PWM, information is stored by modulating the sizes of sequential marks alternating in magnetic polarization or in material structure. Jitter, defined as the deviation from the original mark size in the time domain, will cause detection errors if it is excessively large. A new approach is taken in data recovery by first using a high-speed counter clock to convert time marks to amplitude marks; signal processing techniques are then used to minimize jitter according to the jitter model. The signal processing techniques include motor speed and intersymbol interference equalization, differential and additive detection, and differential and additive modulation.
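The time-to-amplitude conversion step can be sketched as follows: a fast counter clock samples each mark so that its duration becomes an integer count, which downstream equalization and detection treat as an amplitude. The counter rate and bit-cell duration here are invented parameters, not the paper's values.

```python
COUNTER_HZ = 400e6   # assumed counter-clock rate
CELL_S = 25e-9       # assumed channel bit-cell duration

def marks_to_counts(edge_times_s):
    """Convert mark edge times (seconds) into per-mark counter counts."""
    return [round((t1 - t0) * COUNTER_HZ)
            for t0, t1 in zip(edge_times_s, edge_times_s[1:])]

def counts_to_symbols(counts):
    """Decide how many bit cells each mark spans (nearest-cell detection)."""
    ticks_per_cell = COUNTER_HZ * CELL_S   # 10 ticks per cell here
    return [round(c / ticks_per_cell) for c in counts]

edges = [0.0, 52e-9, 131e-9]       # marks of ~2 and ~3 cells, with jitter
counts = marks_to_counts(edges)    # [21, 32]
print(counts_to_symbols(counts))   # [2, 3]
```

The jitter (52 ns instead of an ideal 50 ns, 79 ns instead of 75 ns) survives the conversion as amplitude error, which is what the equalization and detection stages then work to minimize.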
Higher resolution satellite remote sensing and the impact on image mapping
Watkins, Allen H.; Thormodsgard, June M.
1987-01-01
Recent advances in spatial, spectral, and temporal resolution of civil land remote sensing satellite data are presenting new opportunities for image mapping applications. The U.S. Geological Survey's experimental satellite image mapping program is evolving toward larger scale image map products with increased information content as a result of improved image processing techniques and increased resolution. Thematic mapper data are being used to produce experimental image maps at 1:100,000 scale that meet established U.S. and European map accuracy standards. Availability of high quality, cloud-free, 30-meter ground resolution multispectral data from the Landsat thematic mapper sensor, along with 10-meter ground resolution panchromatic and 20-meter ground resolution multispectral data from the recently launched French SPOT satellite, present new cartographic and image processing challenges. The need to fully exploit these higher resolution data increases the complexity of processing the images into large-scale image maps. The removal of radiometric artifacts and noise prior to geometric correction can be accomplished by using a variety of image processing filters and transforms. Sensor modeling and image restoration techniques allow maximum retention of spatial and radiometric information. An optimum combination of spectral information and spatial resolution can be obtained by merging different sensor types. These processing techniques are discussed and examples are presented.
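One common technique for the sensor-merging step described (combining higher-resolution panchromatic with lower-resolution multispectral data) is the Brovey transform. This is a generic sketch, not the USGS production procedure, and it assumes the bands are already co-registered and resampled to the panchromatic grid:

```python
def brovey_merge(r, g, b, pan):
    """Brovey transform: scale each multispectral band by pan / (r+g+b),
    pixel by pixel, so spatial detail comes from the pan band while the
    band ratios (spectral information) are preserved."""
    merged = []
    for ri, gi, bi, pi in zip(r, g, b, pan):
        total = ri + gi + bi or 1  # avoid divide-by-zero on dark pixels
        merged.append((ri * pi / total, gi * pi / total, bi * pi / total))
    return merged

# Single-pixel example with invented digital numbers:
out = brovey_merge([30], [60], [90], [120])
print(out)  # [(20.0, 40.0, 60.0)]
```

Note that the output ratios 20:40:60 match the input 30:60:90, while the overall intensity is set by the pan value.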
The Neuroscience of Dance and the Dance of Neuroscience: Defining a Path of Inquiry
ERIC Educational Resources Information Center
Dale, J. Alexander; Hyatt, Janyce; Hollerman, Jeff
2007-01-01
The neural processes of a person comprehending or creating music have intrigued neuroscientists and prompted them to examine the processing of information and emotion with some of the most recent and sophisticated techniques in the brain sciences (see, for example, Zatorre and his colleagues' work). These techniques and the excitement of studying…
Lin, Yen-Ko; Chen, Chao-Wen; Lee, Wei-Che; Lin, Tsung-Ying; Kuo, Liang-Chi; Lin, Chia-Ju; Shi, Leiyu; Tien, Yin-Chun; Cheng, Yuan-Chia
2017-11-29
Ensuring adequate informed consent for surgery in a trauma setting is challenging. We developed and pilot tested an educational video containing information regarding the informed consent process for surgery in trauma patients and a knowledge measure instrument and evaluated whether the audiovisual presentation improved the patients' knowledge regarding their procedure and aftercare and their satisfaction with the informed consent process. A modified Delphi technique in which a panel of experts participated in successive rounds of shared scoring of items to forecast outcomes was applied to reach a consensus among the experts. The resulting consensus was used to develop the video content and questions for measuring the understanding of the informed consent for debridement surgery in limb trauma patients. The expert panel included experienced patients. The participants in this pilot study were enrolled as a convenience sample of adult trauma patients scheduled to receive surgery. The modified Delphi technique comprised three rounds over a 4-month period. The items given higher scores by the experts in several categories were chosen for the subsequent rounds until consensus was reached. The experts reached a consensus on each item after the three-round process. The final knowledge measure comprising 10 questions was developed and validated. Thirty eligible trauma patients presenting to the Emergency Department (ED) were approached and completed the questionnaires in this pilot study. The participants exhibited significantly higher mean knowledge and satisfaction scores after watching the educational video than before watching the video. Our process is promising for developing procedure-specific informed consent and audiovisual aids in medical and surgical specialties. The educational video was developed using a scientific method that integrated the opinions of different stakeholders, particularly patients. 
This video is a useful tool for improving the knowledge and satisfaction of trauma patients in the ED. The modified Delphi technique is an effective method for collecting experts' opinions and reaching a consensus on the content of educational materials for informed consent. Institutions should prioritize patient-centered health care and develop a structured informed consent process to improve the quality of care. The ClinicalTrials.gov Identifier is NCT01338480 . The date of registration was April 18, 2011 (retrospectively registered).
41 CFR 102-118.35 - What definitions apply to this part?
Code of Federal Regulations, 2010 CFR
2010-07-01
... published formats and codes as authorized by the applicable Federal Information Processing Standards... techniques for carrying out transportation transactions using electronic transmissions of the information...
78 FR 47784 - Notice of Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
... Standards and Technology (NIST) Federal Information Processing Standard (FIPS) 201: Personal Identity...), address, employment history, biometric identifiers (e.g. fingerprints), signature, digital photograph... collection techniques or the use of other forms of information technology. Comments submitted in response to...
An Evaluation of Understandability of Patient Journey Models in Mental Health
2016-01-01
Background: There is a significant trend toward implementing health information technology to reduce administrative costs and improve patient care. Unfortunately, little awareness exists of the challenges of integrating information systems with existing clinical practice. The systematic integration of clinical processes with information systems and health information technology can benefit patients, staff, and the delivery of care. Objectives: This paper presents a comparison of the degree of understandability of patient journey models. In particular, the authors demonstrate the value of a relatively new patient journey modeling technique called the Patient Journey Modeling Architecture (PaJMa) when compared with traditional manufacturing-based process modeling tools. The paper also presents results from a small pilot case study that compared the usability of 5 modeling approaches in a mental health care environment. Method: Five business process modeling techniques were used to represent a selected patient journey. A mix of both qualitative and quantitative methods was used to evaluate these models. Techniques included a focus group and survey to measure usability of the various models. Results: The preliminary evaluation of the usability of the 5 modeling techniques showed increased staff understanding of the representation of their processes and activities when presented with the models. Improved individual role identification throughout the models was also observed. The extended version of the PaJMa methodology provided the most clarity of information flows for clinicians. Conclusions: The extended version of PaJMa provided a significant improvement in the ease of interpretation for clinicians and increased engagement with the modeling process. The use of color and its effectiveness in distinguishing the representation of roles was a key feature of the framework not present in other modeling approaches.
Future research should focus on extending the pilot case study to a more diversified group of clinicians and health care support workers. PMID:27471006
Diagnostic techniques in deflagration and detonation studies.
Proud, William G; Williamson, David M; Field, John E; Walley, Stephen M
2015-12-01
Advances in experimental, high-speed techniques can be used to explore the processes occurring within energetic materials. This review describes techniques used to study a wide range of processes: hot-spot formation, ignition thresholds, deflagration, sensitivity and finally the detonation process. As this is a wide field, the focus will be on small-scale experiments and quantitative studies. It is important that such studies are linked to predictive models, which inform the experimental design process. The range of stimuli includes thermal ignition, drop-weight, Hopkinson bar and plate impact studies. Studies made with inert simulants are also included as these are important in differentiating between reactive response and purely mechanical behaviour.
NASA Technical Reports Server (NTRS)
Cornish, C. R.
1983-01-01
Following reception and analog-to-digital (A/D) conversion, atmospheric radar backscatter echoes need to be processed so as to obtain the desired information about atmospheric processes and to eliminate or minimize contaminating contributions from other sources. Various signal processing techniques have been implemented at mesosphere-stratosphere-troposphere (MST) radar facilities to estimate parameters of interest from received spectra. Such estimation techniques need to be both accurate and sufficiently efficient to be within the capabilities of the particular data-processing system. The various techniques used to parameterize the spectra of received signals are reviewed herein. Noise estimation, electromagnetic interference, data smoothing, correlation, and the Doppler effect are among the specific points addressed.
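A minimal sketch of the spectral parameterization step such reviews survey: subtract an estimated noise floor, then take the zeroth moment (signal power) and first moment (mean Doppler velocity) of the residual spectrum. The toy spectrum and noise level below are invented, and real systems estimate the noise floor from the data (e.g., with objective methods) rather than assuming it:

```python
def spectral_moments(velocities, power, noise):
    """Return signal power and mean Doppler velocity after noise removal."""
    s = [max(p - noise, 0.0) for p in power]          # noise-subtracted spectrum
    p0 = sum(s)                                        # 0th moment: signal power
    v_mean = sum(v * si for v, si in zip(velocities, s)) / p0   # 1st moment
    return p0, v_mean

vels = [-2.0, -1.0, 0.0, 1.0, 2.0]   # Doppler velocity bins (m/s)
spec = [1.0, 1.0, 5.0, 9.0, 1.0]     # toy power spectrum, noise floor = 1
p0, v = spectral_moments(vels, spec, noise=1.0)
print(p0, v)
```

A second moment (spectral width) is computed the same way from the squared deviations about `v_mean`.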
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pharhizgar, K.D.; Lunce, S.E.
1994-12-31
Development of knowledge-based technological acquisition techniques and customers' information profiles are known as assimilative integrated discovery systems (AIDS) in modern organizations. These systems have access, through processing, to both deep and broad domains of information in modern societies. Through these systems organizations and individuals can predict future trend probabilities and events concerning their customers. AIDSs are new techniques which produce new information which informants can use without the help of the knowledge sources because of the existence of highly sophisticated computerized networks. This paper has analyzed the danger and side effects of misuse of information through illegal, unethical and immoral access to the database in an integrated and assimilative information system as described above. Cognivistic mapping, pragmatistic informational design gathering, and holistic classifiable and distributive techniques are potentially abusive systems whose outputs can be easily misused by businesses when researching the firm's customers.
NASA Astrophysics Data System (ADS)
Altunbek, Mine; Kelestemur, Seda; Culha, Mustafa
2015-12-01
Surface-enhanced Raman scattering (SERS) continues to strive to gather molecular-level information from dynamic biological systems. It is our ongoing effort to utilize the technique for understanding of the biomolecular processes in living systems such as eukaryotic and prokaryotic cells. In this study, the technique is first investigated to identify cell death mechanisms in 2D and 3D in vitro cell culture models, which is a very important process in tissue engineering and pharmaceutical applications. Second, in situ monitoring of biofilm formation is investigated to understand how microorganisms respond to environmental stimuli; the inferred information can be used to interfere with biofilm formation and fight against their pathogenic activity.
Science Teachers' Information Processing Behaviours in Nepal: A Reflective Comparative Study
ERIC Educational Resources Information Center
Acharya, Kamal Prasad
2017-01-01
This study examines the investigation of the information processing behaviours of secondary level science teachers. It is based on the data collected from 50 secondary level school science teachers working in Kathmandy valley. The simple random sampling and the Cognitive Style Inventory have been used respectively as the technique and tool to…
John H. Schomaker; David W. Lime
1988-01-01
The "nominal group" process is a proven technique to systematically arrive at a consensus about critical information needs in recreation planning and management. Using this process, 41 managers who attended a 1983 conference on river management identified 114 specific information needs grouped under 11 general questions. Clearly, some concerns of...
NASA Astrophysics Data System (ADS)
Larger, Laurent; Baylón-Fuentes, Antonio; Martinenghi, Romain; Udaltsov, Vladimir S.; Chembo, Yanne K.; Jacquot, Maxime
2017-01-01
Reservoir computing, originally referred to as an echo state network or a liquid state machine, is a brain-inspired paradigm for processing temporal information. It involves learning a "read-out" interpretation for nonlinear transients developed by high-dimensional dynamics when the latter is excited by the information signal to be processed. This novel computational paradigm is derived from recurrent neural network and machine learning techniques. It has recently been implemented in photonic hardware for a dynamical system, which opens the path to ultrafast brain-inspired computing. We report on a novel implementation involving an electro-optic phase-delay dynamics designed with off-the-shelf optoelectronic telecom devices, thus providing the targeted wide bandwidth. Computational efficiency is demonstrated experimentally with speech-recognition tasks. State-of-the-art speed performances reach one million words per second, with very low word error rate. In addition to record processing speed, our investigations have revealed computing-efficiency improvements through yet-unexplored temporal-information-processing techniques, such as simultaneous multisample injection and pitched sampling at the read-out compared to information "write-in".
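The reservoir-computing principle described (a fixed nonlinear dynamical system expands the input into a high-dimensional transient; only a linear read-out is trained) can be sketched with a small software echo state network. The reservoir size, stability scaling, and delta-rule read-out training below are illustrative choices, not the authors' optoelectronic implementation:

```python
import math
import random

random.seed(0)
N = 30                                         # reservoir size (assumed)
W_in = [random.uniform(-0.5, 0.5) for _ in range(N)]
W = [[random.uniform(-1, 1) for _ in range(N)] for _ in range(N)]
# Crude stability scaling: bound the infinity norm below 1 so the
# reservoir has fading memory (echo state property).
scale = 0.9 / max(sum(abs(w) for w in row) for row in W)
W = [[w * scale for w in row] for row in W]

def run_reservoir(inputs):
    """Drive the fixed reservoir; collect its state after each sample."""
    x = [0.0] * N
    states = []
    for u in inputs:
        x = [math.tanh(W_in[i] * u + sum(W[i][j] * x[j] for j in range(N)))
             for i in range(N)]
        states.append(x)
    return states

def train_readout(states, targets, epochs=50, lr=0.02):
    """Only the linear read-out is trained (simple delta rule here)."""
    w = [0.0] * N
    for _ in range(epochs):
        for x, t in zip(states, targets):
            y = sum(wi * xi for wi, xi in zip(w, x))
            w = [wi + lr * (t - y) * xi for wi, xi in zip(w, x)]
    return w

# Toy temporal task: recall the previous input sample from the current state.
seq = [random.choice([0.0, 1.0]) for _ in range(200)]
states = run_reservoir(seq)
targets = [0.0] + seq[:-1]
w = train_readout(states[10:], targets[10:])   # skip the initial transient
```

The hardware version replaces the simulated recurrence with a physical phase-delay dynamic, but the division of labor (fixed dynamics, trained linear read-out) is the same.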
Mathematics and Information Retrieval.
ERIC Educational Resources Information Center
Salton, Gerald
1979-01-01
Examines the main mathematical approaches to information retrieval, including both algebraic and probabilistic models, and describes difficulties which impede formalization of information retrieval processes. A number of developments are covered where new theoretical understandings have directly led to improved retrieval techniques and operations.…
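Salton's algebraic approach is typified by the vector space model: documents and queries become term-weight vectors compared by cosine similarity. A minimal term-frequency version (no idf weighting, invented three-document corpus) as a sketch:

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    num = sum(a[t] * b.get(t, 0) for t in a)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

docs = ["information retrieval models",
        "probabilistic models of retrieval",
        "cooking with garlic"]
vecs = [Counter(d.split()) for d in docs]
query = Counter("retrieval models".split())

# Rank documents by decreasing similarity to the query.
ranked = sorted(range(len(docs)), key=lambda i: cosine(query, vecs[i]),
                reverse=True)
print(ranked[0])  # index of the best-matching document
```

The probabilistic models the article also covers replace the cosine score with an estimate of the probability of relevance, but operate on the same term-vector representation.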
Public Participation Guide: Tools to Inform the Public
Tools to inform the public include techniques that you can use to provide members of the public with the information they need to understand the project, the decision process, and also to provide feedback on how public input influenced the decision.
Kaczmarek, Magdalena C.; Steffens, Melanie C.
2017-01-01
Recent studies demonstrated that the sequential induction of contrasting negative and positive emotions can be used as a social influence technique. The original field experiments found that whenever a sudden change in the emotional dynamic occurs – from negative to positive or vice versa – an increase in compliant behavior and an impairment in cognitive functioning can be observed. The goal of the present experiments was a conceptual replication and extension of the results in a more controlled and counterbalanced fashion. To this aim a novel emotion induction technique was created using an outcome related expectancy violation to induce and change emotions. In a first experiment, the influence of contrasting emotions (vs. only one emotion) on compliance, message processing and information recall was assessed among 80 undergraduate students. We were able to show that a positive, then negative experience, and vice versa, led to losses in processing efficacy, not only leaving individuals momentarily vulnerable to social influence attempts, but also impairing information recall. We replicated this pattern of findings in a second experiment (N = 41). The implications of this innovative induction technique and its findings for theory and future research on the emerging field on contrasting emotions as social-influence techniques are discussed. PMID:28270788
NASA Astrophysics Data System (ADS)
Imms, Ryan; Hu, Sijung; Azorin-Peris, Vicente; Trico, Michaël.; Summers, Ron
2014-03-01
Non-contact imaging photoplethysmography (PPG) is a recent development in the field of physiological data acquisition, currently undergoing a large amount of research to characterize and define the range of its capabilities. Contact-based PPG techniques have been broadly used in clinical scenarios for a number of years to obtain direct information about the degree of oxygen saturation for patients. With the advent of imaging techniques, there is strong potential to enable access to additional information such as multi-dimensional blood perfusion and saturation mapping. The further development of effective opto-physiological monitoring techniques is dependent upon novel modelling techniques coupled with improved sensor design and effective signal processing methodologies. The biometric signal and imaging processing platform (bSIPP) provides a comprehensive set of features for extraction and analysis of recorded iPPG data, enabling direct comparison with other biomedical diagnostic tools such as ECG and EEG. Additionally, utilizing information about the nature of tissue structure has enabled the generation of an engineering model describing the behaviour of light during its travel through the biological tissue. This enables the estimation of the relative oxygen saturation and blood perfusion in different layers of the tissue to be calculated, which has the potential to be a useful diagnostic tool.
Modern Techniques in Acoustical Signal and Image Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Candy, J V
2002-04-04
Acoustical signal processing problems can lead to some complex and intricate techniques to extract the desired information from noisy, sometimes inadequate, measurements. The challenge is to formulate a meaningful strategy that is aimed at performing the required processing even in the face of uncertainties. This strategy can be as simple as a transformation of the measured data to another domain for analysis, or as complex as embedding a full-scale propagation model into the processor. The aims of both approaches are the same: to extract the desired information and reject the extraneous, that is, to develop a signal processing scheme that achieves this goal. In this paper, we briefly discuss this underlying philosophy from a "bottom-up" approach, enabling the problem to dictate the solution rather than vice versa.
Clinical report writing: Process and perspective
NASA Technical Reports Server (NTRS)
Ewald, H. R.
1981-01-01
Clinical report writing in psychology and psychiatry is addressed. Audience/use analysis and the basic procedures of information gathering, diagnosis, and prognosis are described. Two interlinking processes are involved: the process of creation and the process of communication. Techniques for good report writing are presented.
Bibliography of articles and reports on mineral-separation techniques, processes, and applications
NASA Technical Reports Server (NTRS)
Harmon, R. S.
1971-01-01
A bibliography of published articles and reports on mineral-separation techniques, processes, and applications is presented along with an author and subject index. This information is intended for use in the mineral-separation facility of the Lunar Receiving Laboratory at the NASA Manned Spacecraft Center and as an aid and reference to persons involved or interested in mineral separation.
ERIC Educational Resources Information Center
Pozzi, Francesca; Ceregini, Andrea; Ferlino, Lucia; Persico, Donatella
2016-01-01
The Peer Review (PR) is a very popular technique to support socio-constructivist and connectivist learning processes, online or face-to-face, at all educational levels, in both formal and informal contexts. The idea behind this technique is that sharing views and opinions with others by discussing with peers and receiving and providing formative…
NASA Technical Reports Server (NTRS)
Sowers, J.; Mehrotra, R.; Sethi, I. K.
1989-01-01
A method for extracting road boundaries using the monochrome image of a visual road scene is presented. Statistical information about the intensity levels present in the image, along with some geometrical constraints concerning the road, forms the basis of this approach. Results and advantages of this technique compared to others are discussed. The major advantages of this technique are its ability to process the image in only one pass, to limit the area searched in the image using only knowledge of the road geometry and previous boundary information, and to adjust dynamically for inconsistencies in the located boundary information, all of which increases its efficacy.
Higgs, Gary
2006-04-01
Despite recent U.K. Government commitments to encourage public participation in environmental decision making, those exercises conducted to date have been largely confined to 'traditional' modes of participation such as the dissemination of information and the encouragement of feedback on proposals through, for example, questionnaires or surveys. It is the premise of this paper that participative IT-based approaches, combining geographical information systems (GIS) and multi-criteria evaluation techniques to involve the public in the decision-making process, have the potential to build consensus and reduce disputes and conflicts such as those arising from the siting of different types of waste facilities. The potential of these techniques is documented through a review of the existing literature in order to highlight the opportunities and challenges facing decision makers in increasing the involvement of the public at different stages of the waste facility management process. It is concluded that there are important lessons to be learned by researchers, consultants, managers and decision makers if barriers hindering the wider use of such techniques are to be overcome.
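The combined GIS and multi-criteria evaluation approach reviewed often reduces, per candidate site, to a weighted linear combination of normalized criterion scores. The criteria, weights, and sites below are invented for illustration; in a real GIS the scores would be raster layers evaluated per cell:

```python
def suitability(scores, weights):
    """Weighted linear combination of criterion scores normalized to [0, 1]."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(weights[c] * scores[c] for c in weights)

# Hypothetical stakeholder weights and candidate waste-facility sites:
weights = {"distance_to_housing": 0.5, "road_access": 0.3, "land_cost": 0.2}
sites = {
    "A": {"distance_to_housing": 0.9, "road_access": 0.4, "land_cost": 0.8},
    "B": {"distance_to_housing": 0.3, "road_access": 0.9, "land_cost": 0.9},
}
ranked = sorted(sites, key=lambda s: suitability(sites[s], weights),
                reverse=True)
print(ranked)  # most suitable site first
```

Participatory use of such a model typically means eliciting the weights from the public or stakeholder groups and showing how the ranking changes with them.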
Zhi, Wei; Ge, Zheng; He, Zhen; Zhang, Husen
2014-11-01
Microbial fuel cells (MFCs) employ microorganisms to recover electric energy from organic matter. However, fundamental knowledge of electrochemically active bacteria is still required to maximize MFC power output for practical applications. This review presents microbiological and electrochemical techniques to help researchers choose the appropriate methods for MFC studies. Pre-genomic and genomic techniques such as 16S rRNA-based phylogeny and metagenomics have provided important information on the structure and genetic potential of electrode-colonizing microbial communities. Post-genomic techniques such as metatranscriptomics allow functional characterization of electrode biofilm communities by quantifying gene expression levels. Isotope-assisted phylogenetic analysis can further link taxonomic information to microbial metabolisms. A combination of electrochemical, phylogenetic, metagenomic, and post-metagenomic techniques offers opportunities for a better understanding of the extracellular electron transfer process, which in turn can lead to process optimization for power output. Copyright © 2014 Elsevier Ltd. All rights reserved.
Text mining and its potential applications in systems biology.
Ananiadou, Sophia; Kell, Douglas B; Tsujii, Jun-ichi
2006-12-01
With biomedical literature increasing at a rate of several thousand papers per week, it is impossible to keep abreast of all developments; therefore, automated means to manage the information overload are required. Text mining techniques, which involve the processes of information retrieval, information extraction and data mining, provide a means of solving this. By adding meaning to text, these techniques produce a more structured analysis of textual knowledge than simple word searches, and can provide powerful tools for the production and analysis of systems biology models.
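The three stages named in the abstract (information retrieval, information extraction, data mining) can be sketched end to end. The regex "entity" rule and the three-document corpus are toy stand-ins for real NLP components, which would use trained recognizers rather than capitalization:

```python
import re
from collections import Counter
from itertools import combinations

corpus = [
    "Protein Kinase A phosphorylates CREB in neurons.",
    "CREB activates transcription with CBP.",
    "The weather was mild in Manchester.",
]

def retrieve(query, docs):
    """Information retrieval: keep documents mentioning the query term."""
    return [d for d in docs if query.lower() in d.lower()]

def extract(doc):
    """Information extraction: pull candidate entity names (toy rule:
    capitalized tokens of two or more letters)."""
    return re.findall(r"\b[A-Z][A-Za-z]+\b", doc)

def mine(entity_lists):
    """Data mining: count entity co-occurrences within documents."""
    pairs = Counter()
    for ents in entity_lists:
        pairs.update(combinations(sorted(set(ents)), 2))
    return pairs

hits = retrieve("CREB", corpus)
pairs = mine([extract(d) for d in hits])
print(pairs.most_common(1))
```

The structured output (co-occurring entity pairs with counts) is the kind of material a systems biology model can consume, in contrast to a flat list of keyword hits.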
NASA Technical Reports Server (NTRS)
Silverman, B.
1979-01-01
Newly developed, thermally stable nonmetallic polymers were examined to develop processes and techniques for producing cabin interior parts by compression molding, injection molding, or thermoforming. Efforts were directed toward developing molding techniques for the new polymers so as to economically produce usable nonmetallic molded parts. Data on the flame-resistant characteristics of the materials were generated from pilot plant batches. Preliminary information on the molding characteristics of the various thermoplastic materials was obtained by producing actual parts.
Information processing for aerospace structural health monitoring
NASA Astrophysics Data System (ADS)
Lichtenwalner, Peter F.; White, Edward V.; Baumann, Erwin W.
1998-06-01
Structural health monitoring (SHM) technology provides a means to significantly reduce the life cycle cost of aerospace vehicles by eliminating unnecessary inspections, minimizing inspection complexity, and providing accurate diagnostics and prognostics to support vehicle life extension. In order to accomplish this, a comprehensive SHM system will need to acquire data from a wide variety of diverse sensors, including strain gages, accelerometers, acoustic emission sensors, crack growth gages, corrosion sensors, and piezoelectric transducers. Significant amounts of computer processing will then be required to convert this raw sensor data into meaningful information that indicates both the current structural integrity (diagnostics) and what is needed for planning and managing the future health of the structure in a cost-effective manner (prognostics). This paper provides a description of the key types of information processing technologies required in an effective SHM system. These include artificial intelligence techniques such as neural networks, expert systems, and fuzzy logic for nonlinear modeling, pattern recognition, and complex decision making; signal processing techniques such as Fourier and wavelet transforms for spectral analysis and feature extraction; statistical algorithms for optimal detection, estimation, prediction, and fusion; and a wide variety of other algorithms for data analysis and visualization. The intent of this paper is to provide an overview of the role of information processing for SHM, discuss various technologies which can contribute to accomplishing this role, and present some example applications of information processing for SHM implemented at the Boeing Company.
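The Fourier-based feature extraction mentioned above can be sketched in a few lines (a toy naive DFT over a simulated sensor trace; a production SHM system would use FFTs and wavelets over streaming data, and the signal here is invented for illustration):

```python
import cmath
import math

def dft_magnitudes(signal):
    # Naive O(n^2) discrete Fourier transform; returns one magnitude per frequency bin.
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

def dominant_frequency_bin(signal):
    # Spectral feature: the strongest bin below the Nyquist frequency. A shift in
    # this feature between inspections could flag a change in structural response.
    mags = dft_magnitudes(signal)
    half = mags[1:len(mags) // 2]
    return 1 + half.index(max(half))

# Simulated accelerometer trace with a structural mode at 3 cycles per record.
trace = [math.sin(2 * math.pi * 3 * t / 64) for t in range(64)]
```

Features like this one would then feed the statistical detection and fusion algorithms the paper describes.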
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-13
... Rights Discrimination Complaint); Comment Request AGENCY: Veterans Health Administration, Department of... solicits comments on information needed to process a claimant's civil rights discrimination complaint... techniques or the use of other forms of information technology. Title: Civil Rights Discrimination Complaint...
77 FR 1093 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-09
..., including the use of automated collection techniques or other forms of information technology. 1. Title and... obtains information from an applicant about their marital history, work history, military service... application process obtains information from an applicant about their marital history, work history, benefits...
Objective fitting of hemoglobin dynamics in traumatic bruises based on temperature depth profiling
NASA Astrophysics Data System (ADS)
Vidovič, Luka; Milanič, Matija; Majaron, Boris
2014-02-01
Pulsed photothermal radiometry (PPTR) allows noninvasive measurement of laser-induced temperature depth profiles. The obtained profiles provide information on the depth distribution of absorbing chromophores, such as melanin and hemoglobin. We apply this technique to objectively characterize the mass diffusion and decomposition rate of extravasated hemoglobin during bruise healing. In the present study, we introduce objective fitting of PPTR data obtained over the course of the healing process. By applying Monte Carlo simulation of laser energy deposition and simulation of the corresponding PPTR signal, quantitative analysis of the underlying healing processes is possible. Objective fitting enables a direct comparison between the simulated and experimental PPTR signals. In this manner, we avoid reconstruction of laser-induced depth profiles and thus the inherent loss of information in that process. This approach enables us to determine the value of hemoglobin mass diffusivity, which is controversial in the existing literature. Such information will be a valuable addition to existing bruise age determination techniques.
Optical analysis of crystal growth
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Passeur, Andrea; Harper, Sabrina
1994-01-01
Processing and data reduction of holographic images from Spacelab presents some interesting challenges in determining the effects of microgravity on crystal growth processes. Evaluation of several processing techniques, including the Computerized Holographic Image Processing System and the image processing software ITEX150, will provide fundamental information for holographic analysis of the space flight data.
Welding, brazing, and soldering handbook
NASA Technical Reports Server (NTRS)
Kilgore, A. B.; Koehler, M. L.; Metzler, J. W.; Sturges, S. R.
1969-01-01
Handbook gives information on the selection and application of welding, brazing, and soldering techniques for joining various metals. Summary descriptions of processes, criteria for process selection, and advantages of different methods are given.
Quantitative Aspects of Single Molecule Microscopy
Ober, Raimund J.; Tahmasbi, Amir; Ram, Sripad; Lin, Zhiping; Ward, E. Sally
2015-01-01
Single molecule microscopy is a relatively new optical microscopy technique that allows the detection of individual molecules such as proteins in a cellular context. This technique has generated significant interest among biologists, biophysicists and biochemists, as it holds the promise to provide novel insights into subcellular processes and structures that otherwise cannot be gained through traditional experimental approaches. Single molecule experiments place stringent demands on experimental and algorithmic tools due to the low signal levels and the presence of significant extraneous noise sources. Consequently, this has necessitated the use of advanced statistical signal and image processing techniques for the design and analysis of single molecule experiments. In this tutorial paper, we provide an overview of single molecule microscopy from early works to current applications and challenges. Specific emphasis will be on the quantitative aspects of this imaging modality, in particular single molecule localization and resolvability, which will be discussed from an information theoretic perspective. We review the stochastic framework for image formation, different types of estimation techniques and expressions for the Fisher information matrix. We also discuss several open problems in the field that demand highly non-trivial signal processing algorithms. PMID:26167102
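The localization and resolvability results this tutorial reviews rest on the Cramér–Rao inequality; in the form commonly quoted in the single-molecule localization literature (assuming an idealized imaging model with expected photon count N), the bound reads:

```latex
\operatorname{Var}(\hat{x}) \;\ge\; \left[\mathbf{I}(x)\right]^{-1},
\qquad
\delta_x \;\ge\; \frac{\lambda}{2\pi\, n_a \sqrt{N}}
```

where \(\mathbf{I}(x)\) is the Fisher information matrix, \(\lambda\) the emission wavelength, and \(n_a\) the numerical aperture of the objective.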
Comparing Noun Phrasing Techniques for Use with Medical Digital Library Tools.
ERIC Educational Resources Information Center
Tolle, Kristin M.; Chen, Hsinchun
2000-01-01
Describes a study that investigated the use of a natural language processing technique called noun phrasing to determine whether it is a viable technique for medical information retrieval. Evaluates four noun phrase generation tools for their ability to isolate noun phrases from medical journal abstracts, focusing on precision and recall.…
An Electronic Engineering Curriculum Design Based on Concept-Mapping Techniques
ERIC Educational Resources Information Center
Toral, S. L.; Martinez-Torres, M. R.; Barrero, F.; Gallardo, S.; Duran, M. J.
2007-01-01
Curriculum design is a concern in European Universities as they face the forthcoming European Higher Education Area (EHEA). This process can be eased by the use of scientific tools such as Concept-Mapping Techniques (CMT) that extract and organize the most relevant information from experts' experience using statistical techniques, and helps a…
Opportunities to Create Active Learning Techniques in the Classroom
ERIC Educational Resources Information Center
Camacho, Danielle J.; Legare, Jill M.
2015-01-01
The purpose of this article is to contribute to the growing body of research that focuses on active learning techniques. Active learning techniques require students to consider a given set of information, analyze, process, and prepare to restate what has been learned--all strategies are confirmed to improve higher order thinking skills. Active…
Document Examination: Applications of Image Processing Systems.
Kopainsky, B
1989-12-01
Dealing with images is a familiar business for an expert in questioned documents: microscopic, photographic, infrared, and other optical techniques generate images containing the information he or she is looking for. A recent method for extracting most of this information is digital image processing, ranging from simple contrast and contour enhancement to the advanced restoration of blurred texts. When combined with a sophisticated physical imaging system, an image processing system has proven to be a powerful and fast tool for routine non-destructive scanning of suspect documents. This article reviews frequent applications, comprising techniques to increase legibility, two-dimensional spectroscopy (ink discrimination, alterations, erased entries, etc.), comparison techniques (stamps, typescript letters, photo substitution), and densitometry. Computerized comparison of handwriting is not included. Copyright © 1989 Central Police University.
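The "simple contrast enhancement" end of that range can be sketched in a few lines (a toy 1-D linear stretch over invented pixel values; real document systems operate on full 2-D scans and add contour filtering):

```python
def contrast_stretch(pixels, lo=0, hi=255):
    # Linearly rescale gray levels so the darkest input maps to lo, the brightest to hi.
    pmin, pmax = min(pixels), max(pixels)
    if pmax == pmin:
        return [lo] * len(pixels)
    scale = (hi - lo) / (pmax - pmin)
    return [round(lo + (p - pmin) * scale) for p in pixels]

faint_scan = [118, 120, 125, 130, 122]  # one row of a low-contrast scan
stretched = contrast_stretch(faint_scan)
```

After the stretch, faint ink strokes occupy the full gray range and become legible.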
Idbeaa, Tarik; Abdul Samad, Salina; Husain, Hafizah
2016-01-01
This paper presents a novel secure and robust steganographic technique in the compressed video domain, namely embedding-based byte differencing (EBBD). Unlike most current video steganographic techniques, which take into account only the intra frames for data embedding, the proposed EBBD technique aims to hide information in both intra and inter frames. The information is embedded into a compressed video by simultaneously manipulating the quantized AC coefficients (AC-QTCs) of the luminance components of the frames during the MPEG-2 encoding process. Later, during the decoding process, the embedded information can be detected and extracted completely. Furthermore, EBBD addresses two security concepts: data encryption and data concealing. During the embedding process, the secret data is encrypted using the simplified data encryption standard (S-DES) algorithm to provide better security to the implemented system. The security of the method lies in selecting candidate AC-QTCs within each non-overlapping 8 × 8 sub-block using a pseudorandom key. The basic performance of this steganographic technique was verified through experiments on various existing MPEG-2 encoded videos over a wide range of embedded payload rates. Overall, the experimental results verify the excellent performance of the proposed EBBD, with a better trade-off between imperceptibility and payload compared with previous techniques, while at the same time ensuring a minimal bitrate increase and negligible degradation of PSNR values. PMID:26963093
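The general idea of hiding key-selected bits in quantized AC coefficients can be sketched as follows. This is not the actual byte-differencing algorithm: parity embedding stands in for EBBD's coefficient manipulation, the S-DES encryption step is omitted, and all values are invented for illustration.

```python
import random

def embed_bits(ac_coeffs, bits, key):
    # Hide one bit per coefficient by forcing its parity; coefficient positions
    # within the sub-block are chosen with a shared pseudorandom key.
    coeffs = list(ac_coeffs)
    rng = random.Random(key)
    positions = rng.sample(range(len(coeffs)), len(bits))
    for pos, bit in zip(positions, bits):
        if coeffs[pos] % 2 != bit:
            coeffs[pos] += 1 if coeffs[pos] >= 0 else -1
    return coeffs

def extract_bits(coeffs, n_bits, key):
    # The receiver regenerates the same pseudorandom positions from the shared key.
    rng = random.Random(key)
    positions = rng.sample(range(len(coeffs)), n_bits)
    return [coeffs[pos] % 2 for pos in positions]

coeffs = [3, -5, 0, 7, 2, -1, 4, 6]  # quantized AC coefficients of one sub-block
stego = embed_bits(coeffs, [1, 0, 1, 1], key=99)
```

Without the key, an attacker cannot tell which coefficients carry payload.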
Abstracts of Research, July 1973 through June 1974.
ERIC Educational Resources Information Center
Ohio State Univ., Columbus. Computer and Information Science Research Center.
Abstracts of research papers in the fields of computer and information science are given; 72 papers are abstracted in the areas of information storage and retrieval, information processing, linguistic analysis, artificial intelligence, mathematical techniques, systems programing, and computer networks. In addition, the Ohio State University…
Professional Learning Networks Designed for Teacher Learning
ERIC Educational Resources Information Center
Trust, Torrey
2012-01-01
In the information age, students must learn to navigate and evaluate an expanding network of information. Highly effective teachers model this process of information analysis and knowledge acquisition by continually learning through collaboration, professional development, and studying pedagogical techniques and best practices. Many teachers have…
Applications of mass spectrometry techniques to autoclave curing of materials
NASA Technical Reports Server (NTRS)
Smith, A. C.
1983-01-01
Mass spectrometer analysis of gases evolved from polymer materials during a cure cycle can provide a wealth of information useful for studying cure properties and procedures. In this paper data is presented for two materials to support the feasibility of using mass spectrometer gas analysis techniques to enhance the knowledge of autoclave curing of composite materials and provide additional information for process control evaluation. It is expected that this technique will also be useful in working out the details involved in determining the proper cure cycle for new or experimental materials.
NDE of ceramics and ceramic composites
NASA Technical Reports Server (NTRS)
Vary, Alex; Klima, Stanley J.
1991-01-01
Although nondestructive evaluation (NDE) techniques for ceramics are fairly well developed, they are in many cases difficult to apply for high-probability detection of the minute flaws that can cause failure in monolithic ceramics. Conventional NDE techniques are available for monolithic and fiber-reinforced ceramic matrix composites, but the more exact quantitative techniques that are needed are still being investigated and developed. Needs range from flaw detection below the 100-micron level in monolithic ceramics to global imaging of fiber architecture and matrix densification anomalies in ceramic composites. NDE techniques that will ultimately be applicable to production and quality control of ceramic structures are still emerging from the lab. Needs differ depending on the processing stage, fabrication method, and nature of the finished product. NDE techniques are being developed in concert with materials processing research, where they can provide feedback to processing development and quality improvement. NDE techniques also serve as research tools for materials characterization and for understanding failure processes, e.g., during thermomechanical testing.
Ontology-Driven Information Integration
NASA Technical Reports Server (NTRS)
Tissot, Florence; Menzel, Chris
2005-01-01
Ontology-driven information integration (ODII) is a method of computerized, automated sharing of information among specialists who have expertise in different domains and who are members of subdivisions of a large, complex enterprise (e.g., an engineering project, a government agency, or a business). In ODII, one uses rigorous mathematical techniques to develop computational models of engineering and/or business information and processes. These models are then used to develop software tools that support the reliable processing and exchange of information among the subdivisions of this enterprise or between this enterprise and other enterprises.
A secure and robust information hiding technique for covert communication
NASA Astrophysics Data System (ADS)
Parah, S. A.; Sheikh, J. A.; Hafiz, A. M.; Bhat, G. M.
2015-08-01
The unprecedented advancement of multimedia and the growth of the internet have made it possible to reproduce and distribute digital media easier and faster. This has given birth to information security issues, especially when the information pertains to national security, e-banking transactions, etc. The disguised form of encrypted data makes an adversary suspicious and increases the chance of attack. Information hiding overcomes this inherent problem of cryptographic systems and is emerging as an effective means of securing sensitive data transmitted over insecure channels. In this paper, a secure and robust information hiding technique referred to as Intermediate Significant Bit Plane Embedding (ISBPE) is presented. The data to be embedded is scrambled, and embedding is carried out using the concepts of a Pseudorandom Address Vector (PAV) and a Complementary Address Vector (CAV) to enhance the security of the embedded data. The proposed ISBPE technique is fully immune to the Least Significant Bit (LSB) removal/replacement attack. Experimental investigations reveal that the proposed technique is more robust to various image processing attacks, such as JPEG compression, Additive White Gaussian Noise (AWGN), and low-pass filtering, than conventional LSB techniques. These advantages make ISBPE a good candidate for covert communication.
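The core idea of moving the payload off the least significant plane, so that LSB removal cannot destroy it, can be sketched as follows (a simplification: the PAV/CAV pseudorandom addressing and the data scrambling of ISBPE are omitted, and all pixel values are invented):

```python
def embed_plane(pixels, bits, plane=4):
    # Write one payload bit into the chosen bit plane of each pixel (plane 0 = LSB).
    mask = 1 << plane
    out = [(p & ~mask) | (b << plane) for p, b in zip(pixels, bits)]
    return out + pixels[len(bits):]

def extract_plane(pixels, n_bits, plane=4):
    # Read the payload back out of the same bit plane.
    return [(p >> plane) & 1 for p in pixels[:n_bits]]

pixels = [200, 13, 97, 54, 160]          # cover image fragment
stego = embed_plane(pixels, [1, 0, 1, 0])
attacked = [p & ~1 for p in stego]       # zero every LSB: the classic removal attack
```

Because the payload sits in an intermediate plane, the LSB attack leaves it intact, at the cost of slightly larger pixel distortion than LSB embedding.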
An automatic optimum kernel-size selection technique for edge enhancement
Chavez, Pat S.; Bauer, Brian P.
1982-01-01
Edge enhancement is a technique that can be considered, to a first order, a correction for the modulation transfer function of an imaging system. Digital imaging systems sample a continuous function at discrete intervals, so high-frequency information cannot be recorded at the same precision as lower-frequency data. Because of this, fine detail or edge information in digital images is lost. Spatial filtering techniques can be used to enhance the fine detail information that does exist in the digital image, but the filter size depends on the type of area being processed. A technique has been developed by the authors that uses the horizontal first difference to automatically select the optimum kernel size for enhancing the edges contained in the image.
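A sketch of the idea in Python, with hypothetical thresholds (the paper's actual mapping from first-difference statistics to kernel size is not reproduced here):

```python
def mean_horizontal_first_difference(image):
    # Average absolute difference between horizontally adjacent pixels:
    # high values indicate busy, high-frequency areas; low values, smooth ones.
    total = count = 0
    for row in image:
        for a, b in zip(row, row[1:]):
            total += abs(a - b)
            count += 1
    return total / count

def select_kernel_size(image, thresholds=((20, 3), (8, 7))):
    # Hypothetical mapping: busy areas get a small filter kernel, smooth areas
    # a larger one, so the enhancement matches the local frequency content.
    d = mean_horizontal_first_difference(image)
    for limit, size in thresholds:
        if d >= limit:
            return size
    return 11
```

The selected size would then drive the spatial edge-enhancement filter itself.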
Camouflage target detection via hyperspectral imaging plus information divergence measurement
NASA Astrophysics Data System (ADS)
Chen, Yuheng; Chen, Xinhua; Zhou, Jiankang; Ji, Yiqun; Shen, Weimin
2016-01-01
Target detection is one of the most important applications in remote sensing. Nowadays, accurate camouflage target discrimination often relies on spectral imaging, owing to its high-resolution spectral/spatial information acquisition ability as well as the wealth of available data processing methods. In this paper, hyperspectral imaging together with the spectral information divergence measure is used to solve the camouflage target detection problem. A self-developed visible-band hyperspectral imaging device is used to collect data cubes of an experimental scene, and spectral information divergences are then computed to discriminate camouflaged targets and anomalies. Full-band information divergences are measured to evaluate target detection performance visually and quantitatively. Information divergence measurement is shown to be a low-cost and effective tool for the target detection task and can be further developed for other target detection applications beyond spectral imaging.
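Spectral information divergence treats each pixel spectrum as a probability distribution and sums the two relative entropies. A minimal sketch, assuming strictly positive band values (the spectra below are invented):

```python
import math

def spectral_information_divergence(x, y):
    # Normalize each spectrum to a probability distribution, then sum the
    # relative entropies D(p||q) + D(q||p): the symmetric divergence used
    # to compare a pixel spectrum with a reference signature.
    p = [v / sum(x) for v in x]
    q = [v / sum(y) for v in y]
    d_pq = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    d_qp = sum(qi * math.log(qi / pi) for pi, qi in zip(p, q))
    return d_pq + d_qp

reference = [0.2, 0.5, 0.3, 0.8]  # signature of the target material
candidate = [0.8, 0.1, 0.1, 0.2]  # pixel spectrum under test
```

A pixel whose divergence from the reference exceeds a threshold is flagged as anomalous, exposing camouflage that matches the background in color but not in spectrum.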
NASA Technical Reports Server (NTRS)
Corey, Stephen; Carnahan, Richard S., Jr.
1990-01-01
A continuing effort to apply rapid prototyping and Artificial Intelligence techniques to problems associated with projected Space Station-era information management systems is examined. In particular, timely updating of the various databases and knowledge structures within the proposed intelligent information management system (IIMS) is critical to support decision making processes. Because of the significantly large amounts of data entering the IIMS on a daily basis, information updates will need to be automatically performed with some systems requiring that data be incorporated and made available to users within a few hours. Meeting these demands depends first, on the design and implementation of information structures that are easily modified and expanded, and second, on the incorporation of intelligent automated update techniques that will allow meaningful information relationships to be established. Potential techniques are studied for developing such an automated update capability and IIMS update requirements are examined in light of results obtained from the IIMS prototyping effort.
Data fusion for delivering advanced traveler information services
DOT National Transportation Integrated Search
2003-05-01
Many transportation professionals have suggested that improved ATIS data fusion techniques and processing will improve the overall quality, timeliness, and usefulness of traveler information. The purpose of this study was four fold. First, conduct a ...
ERIC Educational Resources Information Center
Ramsey-Klee, Diane M.; Richman, Vivian
The purpose of this research is to develop content analytic techniques capable of extracting the differentiating information in narrative performance evaluations for enlisted personnel in order to aid in the process of selecting personnel for advancement, duty assignment, training, or quality retention. Four tasks were performed. The first task…
Applications of optical sensing for laser cutting and drilling.
Fox, Mahlen D T; French, Paul; Peters, Chris; Hand, Duncan P; Jones, Julian D C
2002-08-20
Any reliable automated production system must include process control and monitoring techniques. Two laser processing techniques potentially lending themselves to automation are percussion drilling and cutting. For drilling we investigate the performance of a modification of a nonintrusive optical focus control system we previously developed for laser welding, which exploits the chromatic aberrations of the processing optics to determine focal error. We further developed this focus control system for closed-loop control of laser cutting. We show that an extension of the technique can detect deterioration in cut quality, and we describe practical trials carried out on different materials using both oxygen and nitrogen assist gas. We base our techniques on monitoring the light generated by the process, captured nonintrusively by the effector optics and processed remotely from the workpiece. We describe the relationship between the temporal and the chromatic modulation of the detected light and process quality and show how the information can be used as the basis of a process control system.
A Survey of Stemming Algorithms in Information Retrieval
ERIC Educational Resources Information Center
Moral, Cristian; de Antonio, Angélica; Imbert, Ricardo; Ramírez, Jaime
2014-01-01
Background: During the last fifty years, improved information retrieval techniques have become necessary because of the huge amount of information people have available, which continues to increase rapidly due to the use of new technologies and the Internet. Stemming is one of the processes that can improve information retrieval in terms of…
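A toy suffix-stripping stemmer illustrates the conflation idea behind the algorithms such a survey covers (this is far cruder than the Porter family; the suffix list and minimum-stem length are invented):

```python
# Longest suffixes first, so "ational" is tried before "al"-style endings.
SUFFIXES = ("ational", "ization", "ness", "ing", "ed", "es", "s")

def naive_stem(word):
    # Strip the first matching suffix, keeping at least a 3-letter stem, so that
    # inflected variants of a word collapse to one index term.
    for suffix in SUFFIXES:
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[:-len(suffix)]
    return word
```

Conflating "processing", "processes", and "process" to one stem is what lets a query match documents that use any of the variants.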
Honda, Masayuki; Matsumoto, Takehiro
2017-01-01
Several kinds of event log data produced in daily clinical activities have yet to be used for the secure and efficient improvement of hospital activities. Data warehouse systems in hospital information systems, used for the analysis of structured data such as diseases, lab tests, and medications, have also shown efficient outcomes. This article focuses on two kinds of essential functions: process mining using log data, and non-structured data analysis via Natural Language Processing.
Applicability and Limitations of Reliability Allocation Methods
NASA Technical Reports Server (NTRS)
Cruz, Jose A.
2016-01-01
The reliability allocation process may be described as the process of assigning reliability requirements to individual components within a system to attain the specified system reliability. For large systems, the allocation process is often performed at different stages of system design, typically beginning at the conceptual stage. As the system design develops and more information about components and the operating environment becomes available, different allocation methods can be considered. Reliability allocation methods are usually divided into two categories: weighting factors and optimal reliability allocation. When properly applied, these methods can produce reasonable approximations. Reliability allocation techniques have limitations and implied assumptions that need to be understood by system engineers; applying them without understanding those limitations and assumptions can produce unrealistic results. This report addresses weighting factors and optimal reliability allocation techniques, and identifies the applicability and limitations of each.
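A minimal sketch of the weighting-factor approach, assuming a series system with exponentially distributed lifetimes (the weights here are invented): the system failure rate is split among components in proportion to their weights, so the allocated component reliabilities multiply back to the system target.

```python
import math

def allocate_reliability(weights, system_reliability, mission_time=1.0):
    # Convert the system reliability target to a failure rate, apportion that
    # rate by weight, then convert each share back to a component reliability.
    system_rate = -math.log(system_reliability) / mission_time
    total = sum(weights)
    rates = [system_rate * w / total for w in weights]
    return [math.exp(-r * mission_time) for r in rates]
```

A heavily weighted component absorbs more of the failure-rate budget and therefore receives a lower (easier) reliability requirement.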
Optimization of Visual Information Presentation for Visual Prosthesis.
Guo, Fei; Yang, Yuan; Gao, Yong
2018-01-01
Visual prostheses applying electrical stimulation to restore visual function for the blind have promising prospects. However, due to the low resolution, limited visual field, and low dynamic range of the visual perception, a huge loss of information occurs when presenting daily scenes. The ability to recognize objects in real-life scenarios is severely restricted for prosthetic users. To overcome these limitations, optimizing the visual information in the simulated prosthetic vision has been the focus of research. This paper proposes two image processing strategies based on a salient object detection technique. The two processing strategies enable prosthetic implants to focus on the object of interest and suppress background clutter. Psychophysical experiments show that techniques such as foreground zooming with background clutter removal and foreground edge detection with background reduction have positive impacts on the task of object recognition in simulated prosthetic vision. By using edge detection and zooming techniques, the two processing strategies significantly improve the recognition accuracy of objects. We conclude that a visual prosthesis using our proposed strategy can assist the blind in improving their ability to recognize objects. The results will provide effective solutions for the further development of visual prostheses.
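The foreground-edge strategy can be sketched as a gradient threshold followed by coarse pooling that mimics the low phosphene resolution (a toy version with invented values; the paper's pipeline also performs salient-object detection and background suppression first):

```python
def edge_map(image, threshold=40):
    # Mark a pixel as an edge when its horizontal or vertical gradient is large.
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            gx = abs(image[y][x + 1] - image[y][x])
            gy = abs(image[y + 1][x] - image[y][x])
            out[y][x] = 1 if max(gx, gy) >= threshold else 0
    return out

def phosphene_grid(binary, block=2):
    # Pool edge pixels into a coarse grid, mimicking the implant's low resolution.
    h, w = len(binary), len(binary[0])
    return [[max(binary[y + dy][x + dx] for dy in range(block) for dx in range(block))
             for x in range(0, w - block + 1, block)]
            for y in range(0, h - block + 1, block)]

step = [[0, 0, 200, 200]] * 4  # tiny scene with one vertical intensity edge
```

Reducing the scene to its object contours spends the few available phosphenes on shape information rather than background texture.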
2014-03-27
fidelity. This pairing is accomplished through the use of a space mapping technique, which is a process where the design space of a lower fidelity model...is aligned with a higher fidelity model. The intent of applying space mapping techniques to the field of surrogate construction is to leverage the
Johnson, M M
1990-03-01
This study explored the use of process tracing techniques in examining the decision-making processes of older and younger adults. Thirty-six college-age and thirty-six retirement-age participants decided which one of six cars they would purchase on the basis of computer-accessed data. They provided information search protocols. Results indicate that total time to reach a decision did not differ according to age. However, retirement-age participants used less information, spent more time viewing, and re-viewed fewer bits of information than college-age participants. Information search patterns differed markedly between age groups. Patterns of retirement-age adults indicated their use of noncompensatory decision rules which, according to decision-making literature (Payne, 1976), reduce cognitive processing demands. The patterns of the college-age adults indicated their use of compensatory decision rules, which have higher processing demands.
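The contrast between the two rule families can be sketched directly (the car attributes, weights, and cutoffs below are invented for illustration):

```python
def weighted_additive(options, weights):
    # Compensatory rule: a weakness on one attribute can be offset by strengths
    # on others, so every attribute of every option must be examined.
    return max(options, key=lambda opt: sum(w * opt[1][a] for a, w in weights.items()))

def elimination_by_aspects(options, cutoffs):
    # Noncompensatory rule: any option failing a single cutoff is dropped outright,
    # which reduces how much information must be viewed and processed.
    remaining = list(options)
    for attribute, cutoff in cutoffs.items():
        survivors = [opt for opt in remaining if opt[1][attribute] >= cutoff]
        if survivors:
            remaining = survivors
    return remaining[0]

cars = [
    ("A", {"price": 9, "safety": 2}),
    ("B", {"price": 5, "safety": 8}),
]
```

The noncompensatory rule can settle on car A after checking a single attribute, which matches the sparser search patterns the study observed in retirement-age participants.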
NASA Technical Reports Server (NTRS)
Isaac, Bryan J.
1994-01-01
Electrochemical Impedance Spectroscopy (EIS) is a valuable tool for investigating the chemical and physical processes occurring at electrode surfaces. It offers information about electron transfer at interfaces, kinetics of reactions, and diffusion characteristics of the bulk phase between the electrodes. For battery cells, this technique offers another advantage in that it can be done without taking the battery apart. This non-destructive analysis technique can thus be used to gain a better understanding of the processes occurring within a battery cell. This also raises the possibility of improvements in battery design and identification or prediction of battery characteristics useful in industry and aerospace applications. EIS as a technique is powerful and capable of yielding significant information about the cell, but it also requires that the many parameters under investigation can be resolved. This implies an understanding of the processes occurring in a battery cell. Many battery types were surveyed in this work, but the main emphasis was on nickel/metal hydride batteries.
Retinal Information Processing for Minimum Laser Lesion Detection and Cumulative Damage
1992-09-17
MYRON L. WOLBARSHT. ...possible beneficial visual function of the small retinal image movements. B. Visual System Models. Prior models of visual system information processing have...against standard secondary sources whose calibrations can be traced to the National Bureau of Standards. B. Electrophysiological Techniques. Extracellular
In situ spectroradiometric quantification of ERTS data. [Prescott and Phoenix, Arizona
NASA Technical Reports Server (NTRS)
Yost, E. F. (Principal Investigator)
1975-01-01
The author has identified the following significant results. Analyses of ERTS-1 photographic data were made to quantitatively relate ground reflectance measurements to photometric characteristics of the images. Digital image processing of photographic data resulted in a nomograph to correct for atmospheric effects over arid terrain. Optimum processing techniques to derive maximum geologic information from desert areas were established. Additive color techniques to provide quantitative measurements of surface water between different orbits were developed and accepted as the standard flood-mapping techniques for ERTS.
Assessment of the performance of electrode arrays using image processing technique
NASA Astrophysics Data System (ADS)
Usman, N.; Khiruddin, A.; Nawawi, Mohd
2017-08-01
Interpreting inverted resistivity sections is time-consuming, tedious, and requires other sources of information to be geologically relevant. An image processing technique was used to perform post-inversion processing, which makes geophysical data interpretation easier. The inverted data sets were imported into PCI Geomatica 9.0.1 for further processing. The data sets were clipped and merged to match the coordinates of the three layers and permit pixel-to-pixel analysis. The dipole-dipole array is more sensitive to resistivity variation with depth than the Wenner-Schlumberger and pole-dipole arrays. Image processing serves as a good post-inversion tool in geophysical data processing.
A Rapid Information Dissemination System--A Follow-Up Report.
ERIC Educational Resources Information Center
Miner, Lynn E.; Niederjohn, Russel J.
1980-01-01
A rapid information dissemination system at Marquette University which uses an audio-based technique for quickly transmitting time-dependent information to research faculty is described. The system uses a tape recorder, a special purpose speech processing system, and a telephone auto-answer recorder. Present uses and proposed future modifications…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-26
... Act (PRA) Officer, Office of Information Technology (OIT), TSA-11, Transportation Security..., electronic, mechanical, or other technological collection techniques or other forms of information technology... criminal history records check (CHRC). As part of the CHRC process, the individual must provide identifying...
Digital image processing for photo-reconnaissance applications
NASA Technical Reports Server (NTRS)
Billingsley, F. C.
1972-01-01
Digital image-processing techniques developed for processing pictures from NASA space vehicles are analyzed in terms of enhancement, quantitative restoration, and information extraction. Digital filtering and the action of a high-frequency filter in the real and Fourier domains are discussed, along with color and brightness.
ERIC Educational Resources Information Center
Booker, Queen Esther
2009-01-01
An approach used to tackle the problem of helping online students find the classes they want and need is a filtering technique called "social information filtering," a general approach to personalized information filtering. Social information filtering essentially automates the process of "word-of-mouth" recommendations: items are recommended to a…
Photoelectrochemical information storage using an azobenzene derivative
NASA Astrophysics Data System (ADS)
Liu, Z. F.; Hashimoto, K.; Fujishima, A.
1990-10-01
HIGH-DENSITY information storage is becoming an increasingly important technological objective. The 'heat-mode' storage techniques (in which only the thermal energy of laser light is used in the recording process, and hence information is usually stored as a physical change of the storage media) that are used in current optical memories are limited by the diffraction properties of light [1], and the alternative 'photon-mode' (in which information is stored as a photon-induced chemical change of the storage media) has attracted attention recently for high-density storage. The most promising candidates for realizing this mode seem to be photochromism and photochemical hole burning; but these have some intrinsic drawbacks [1,2]. Here we present a novel 'photon-mode' technique that uses the photoelectrochemical properties of a Langmuir-Blodgett film of an azobenzene derivative. The system can be interconverted photochemically or electrochemically between three chemical states, and this three-state system is shown to provide a potential storage process that allows for ultra-high storage density, multi-function memory and non-destructive information readout.
Mapping invasive weeds and their control with spatial information technologies
USDA-ARS?s Scientific Manuscript database
We discuss applications of airborne multispectral digital imaging systems, imaging processing techniques, global positioning systems (GPS), and geographic information systems (GIS) for mapping the invasive weeds giant salvinia (Salvinia molesta) and Brazilian pepper (Schinus terebinthifolius) and fo...
Improving travel information products via robust estimation techniques : final report, March 2009.
DOT National Transportation Integrated Search
2009-03-01
Traffic-monitoring systems, such as those using loop detectors, are prone to coverage gaps, arising from sensor noise, processing errors and : transmission problems. Such gaps adversely affect the accuracy of Advanced Traveler Information Systems. Th...
NASA Astrophysics Data System (ADS)
Kuntoro, Hadiyan Yusuf; Hudaya, Akhmad Zidni; Dinaryanto, Okto; Majid, Akmal Irfan; Deendarlianto
2016-06-01
Due to the importance of two-phase flow research for industrial safety analysis, many researchers have developed various methods and techniques to study two-phase flow phenomena in industrial settings, such as the chemical, petroleum, and nuclear industries. One developing method is the image processing technique, which is widely used in two-phase flow research because of its non-intrusive capability to process large amounts of visualization data containing many complexities. Moreover, this technique can capture direct visual information about the flow that is difficult to capture by other methods and techniques. The main objective of this paper is to present an improved algorithm of the image processing technique, building on a preceding algorithm, for stratified flow cases. The present algorithm can measure the film thickness (hL) of stratified flow as well as the geometrical properties of the interfacial waves with lower processing time and random-access memory (RAM) usage than the preceding algorithm. The measurement results are also aimed at developing a high-quality database of stratified flow, which is scanty. In the present work, the measurement results showed satisfactory agreement with previous works.
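The film-thickness measurement described above can be sketched, in heavily simplified form, as a threshold-and-scan over each pixel column of a grayscale frame. This is a generic illustration of the idea, not the authors' algorithm; the frame layout, threshold value, and function names are assumptions:

```python
def film_thickness_per_column(frame, threshold=128):
    """For each pixel column, count dark (liquid) pixels upward from the
    bottom row until the first bright (gas) pixel: a crude film-thickness
    profile hL(x) in pixels."""
    heights = []
    n_rows = len(frame)
    for col in range(len(frame[0])):
        h = 0
        for row in range(n_rows - 1, -1, -1):  # scan upward from the pipe bottom
            if frame[row][col] < threshold:    # liquid phase appears dark
                h += 1
            else:
                break                          # reached the gas phase
        heights.append(h)
    return heights

# Synthetic 6x4 frame: bottom two rows "liquid" (dark), the rest "gas" (bright).
frame = [[255] * 4 for _ in range(4)] + [[0] * 4 for _ in range(2)]
profile = film_thickness_per_column(frame)  # [2, 2, 2, 2]
```

A real implementation would first calibrate pixels to millimeters and clean the binary image; tracking the column-wise profile over successive frames is what yields the interfacial-wave geometry the abstract mentions.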
Quantitative optical metrology with CMOS cameras
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.
2004-08-01
Recent advances in laser technology, optical sensing, and computer processing of data have led to the development of advanced quantitative optical metrology techniques for high-accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, full-field-of-view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence selection and applicability of an optical technique include the sensitivity, accuracy, and precision required for a particular application. In this paper, sensitivity, accuracy, and precision characteristics of quantitative optical metrology techniques, and specifically of optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.
NASA Technical Reports Server (NTRS)
1984-01-01
Topics discussed at the symposium include hardware, geographic information system (GIS) implementation, processing remotely sensed data, spatial data structures, and NASA programs in remote sensing information systems. Attention is also given to GIS applications, advanced techniques, artificial intelligence, graphics, spatial navigation, and classification. Papers are included on the design of computer software for geographic image processing, concepts for a global resource information system, algorithm development for spatial operators, and an application of expert systems technology to remotely sensed image analysis.
The web-surfing bariatric patient: the role of the internet in the decision-making process.
Paolino, Luca; Genser, Laurent; Fritsch, Sylvie; De' Angelis, Nicola; Azoulay, Daniel; Lazzati, Andrea
2015-04-01
Health-related information on the Internet is constantly increasing, but its quality and accountability are difficult to assess. Patients browse the Net to get more information, but the impact of the Internet on their decisions about surgical techniques, referral centers, or surgeon choice are still not clear. This study aimed to describe the role of the Internet in the decision-making process of obese patients seeking bariatric surgery. Two hundred and twelve candidates for bariatric surgery were asked to answer a questionnaire evaluating their access to the Internet, the usefulness and trustworthiness of Internet-retrieved information, the verification of the information, and the role of the information in the decision-making process. Two hundred and twelve patients answered the questionnaire. Of these, 95.1% had access to the Internet and 77.8% reported having researched about bariatric surgery. Their main interests were the surgical techniques (81.4%) and other patients' experiences (72.3%). The favorite Web sites were those affiliated to public hospitals or edited by other patients. The accountability of the e-information was mainly evaluated by discussion with the general practitioner (GP) (83.0%) or family members and friends (46.8%). One patient in four decided to undergo bariatric surgery mainly based on e-information, while discussion about treatment options with the GP and the hospital reputation were taken into account in 77.8 and 51.7% of cases, respectively. Most patients seeking bariatric surgery search for health information online. E-information seems to have an important role in the decision-making process of patients who are candidates for bariatric surgery.
1985-11-01
User Interface that consists of a set of callable execution-time routines available to an application program for form processing. IISS Function Screen...provisions for test consist of the normal testing techniques that are accomplished during the construction process. They consist of design and code...application presents a form to the user which must be filled in with information for processing by that application. The application then
NASA Technical Reports Server (NTRS)
Landgrebe, D.
1974-01-01
A broad study is described to evaluate a set of machine analysis and processing techniques applied to ERTS-1 data. Based on the analysis results in urban land use analysis and soil association mapping, together with previously reported results in general earth surface feature identification and crop species classification, a profile of the general applicability of this procedure is beginning to emerge. In the hands of a user who knows the information needed from the data and is familiar with the region to be analyzed, these methods appear capable of generating significantly useful information. When supported by preprocessing techniques such as geometric correction and temporal registration, final products readily usable by user agencies appear possible. In parallel with application, further research holds much potential for developing these techniques, both toward higher performance and for new situations not yet studied.
Industrial application of semantic process mining
NASA Astrophysics Data System (ADS)
Espen Ingvaldsen, Jon; Atle Gulla, Jon
2012-05-01
Process mining relates to the extraction of non-trivial and useful information from information system event logs. It is a new research discipline that has evolved significantly since the early work on idealistic process logs. Over the last years, process mining prototypes have incorporated elements from semantics and data mining and targeted visualisation techniques that are more user-friendly to business experts and process owners. In this article, we present a framework for evaluating different aspects of enterprise process flows and address practical challenges of state-of-the-art industrial process mining. We also explore the inherent strengths of the technology for more efficient process optimisation.
A Novel Catalyst Deposition Technique for the Growth of Carbon Nanotubes
NASA Technical Reports Server (NTRS)
Delzeit, Lance; Cassell, A.; Stevens, R.; Nguyen, C.; Meyyappan, M.; DeVincenzi, Donald L. (Technical Monitor)
2001-01-01
This viewgraph presentation provides information on the development of a technique at NASA's Ames Research Center by which carbon nanotubes (NT) can be grown. The project had several goals which included: 1) scaleability, 2) ability to control single wall nanotube (SWNT) and multiwall nanotube (MWNT) formation, 3) ability to control the density of nanotubes as they grow, 4) ability to apply standard masking techniques for NT patterning. Information regarding the growth technique includes its use of a catalyst deposition process. SWNTs of varying thicknesses can be grown by changing the catalyst composition. Demonstrations are given of various methods of masking including the use of transmission electron microscopic (TEM) grids.
Natural Inspired Intelligent Visual Computing and Its Application to Viticulture.
Ang, Li Minn; Seng, Kah Phooi; Ge, Feng Lu
2017-05-23
This paper presents an investigation of nature-inspired intelligent computing and its application to visual information processing systems for viticulture. The paper has three contributions: (1) a review of visual information processing applications for viticulture; (2) the development of nature-inspired computing algorithms based on artificial immune system (AIS) techniques for grape berry detection; and (3) the application of the developed algorithms to real-world grape berry images captured in natural conditions from vineyards in Australia. The AIS algorithms in (2) were developed from a nature-inspired clonal selection algorithm (CSA) which is able to detect the arcs in the berry images with precision, based on a fitness model. The detected arcs are then extended to perform multiple-arc and ring detection for the berry detection application. The performance of the developed algorithms was compared with traditional image processing algorithms like the circular Hough transform (CHT) and other well-known circle detection methods. The proposed AIS approach gave an F-score of 0.71, compared with F-scores of 0.28 and 0.30 for the CHT and a parameter-free circle detection technique (RPCD), respectively.
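The circular Hough transform used as a baseline in this work can be illustrated with a minimal fixed-radius variant: each edge point votes for every candidate center lying one radius away from it, and true circle centers accumulate the most votes. This toy sketch (synthetic points, assumed names) omits the radius search and gradient information of practical implementations:

```python
import math
from collections import Counter

def hough_circle_centers(edge_points, radius, n_angles=90):
    """Fixed-radius circular Hough transform: every edge point votes for
    all integer-pixel centers lying `radius` away; true centers win."""
    votes = Counter()
    for (x, y) in edge_points:
        for k in range(n_angles):
            theta = 2 * math.pi * k / n_angles
            cx = round(x - radius * math.cos(theta))
            cy = round(y - radius * math.sin(theta))
            votes[(cx, cy)] += 1
    return votes

# Synthetic "berry": edge points sampled on a circle of radius 10 about (30, 30).
pts = [(30 + round(10 * math.cos(t * math.pi / 8)),
        30 + round(10 * math.sin(t * math.pi / 8))) for t in range(16)]
best_center, _ = hough_circle_centers(pts, radius=10).most_common(1)[0]
```

Because every edge point must vote over all angles (and, in full CHT, over all candidate radii), the accumulator grows quickly; that computational cost is one reason alternatives such as the AIS arc detector above are explored.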
The Use of Nominal Group Technique: Case Study in Vietnam
ERIC Educational Resources Information Center
Dang, Vi Hoang
2015-01-01
The Nominal Group Technique (NGT) is a structured process to gather information from a group. The technique was first described in early 1970s and has since become a widely-used standard to facilitate working groups. The NGT is effective for generating large numbers of creative new ideas and for group priority setting. This article reports on a…
Bayır, Şafak
2016-01-01
With advances in the computer field, methods and techniques in automatic image processing and analysis provide the opportunity to automatically detect change and degeneration in retinal images. Localization of the optic disc is extremely important for determining hard exudate lesions or neovascularization, the later phase of diabetic retinopathy, in computer-aided eye disease diagnosis systems. Whereas optic disc detection is a fairly easy process in normal retinal images, detecting this region in retinal images affected by diabetic retinopathy may be difficult. Sometimes optic disc information and hard exudate information may appear the same in terms of machine learning. We present a novel approach for efficient and accurate localization of the optic disc in retinal images containing noise and other lesions. This approach comprises five main steps: image processing, keypoint extraction, texture analysis, visual dictionary, and classifier techniques. We tested our proposed technique on 3 public datasets and obtained quantitative results. Experimental results show that an average optic disc detection accuracy of 94.38%, 95.00%, and 90.00% is achieved, respectively, on the following public datasets: DIARETDB1, DRIVE, and ROC. PMID:27110272
NASA Astrophysics Data System (ADS)
Grossman, Barry G.; Gonzalez, Frank S.; Blatt, Joel H.; Hooker, Jeffery A.
1992-03-01
The development of efficient high speed techniques to recognize, locate, and quantify damage is vitally important for successful automated inspection systems such as ones used for the inspection of undersea pipelines. Two critical problems must be solved to achieve these goals: the reduction of nonuseful information present in the video image and automatic recognition and quantification of extent and location of damage. Artificial neural network processed moire profilometry appears to be a promising technique to accomplish this. Real time video moire techniques have been developed which clearly distinguish damaged and undamaged areas on structures, thus reducing the amount of extraneous information input into an inspection system. Artificial neural networks have demonstrated advantages for image processing, since they can learn the desired response to a given input and are inherently fast when implemented in hardware due to their parallel computing architecture. Video moire images of pipes with dents of different depths were used to train a neural network, with the desired output being the location and severity of the damage. The system was then successfully tested with a second series of moire images. The techniques employed and the results obtained are discussed.
Raisutis, Renaldas; Samaitis, Vykintas
2017-01-01
This work proposes a novel hybrid signal processing technique to extract information on disbond-type defects from a single B-scan in the process of non-destructive testing (NDT) of glass fiber reinforced plastic (GFRP) material using ultrasonic guided waves (GW). The selected GFRP sample was a segment of a wind turbine blade with an aerodynamic shape. Two disbond-type defects having diameters of 15 mm and 25 mm were artificially constructed on its trailing edge. The experiment was performed using the low-frequency ultrasonic system developed at the Ultrasound Institute of Kaunas University of Technology, and only one side of the sample was accessed. A special configuration of the transmitting and receiving transducers fixed on a movable panel with a separation distance of 50 mm was proposed for recording the ultrasonic guided wave signals at each one-millimeter step along the scanning distance up to 500 mm. Finally, the hybrid signal processing technique, comprising the valuable features of the three most promising signal processing techniques (cross-correlation, wavelet transform, and Hilbert–Huang transform), was applied to the received signals for the extraction of defect information from a single B-scan image. The wavelet transform and cross-correlation techniques were combined to extract the approximate size and location of the defects and measurements of time delays. Thereafter, the Hilbert–Huang transform was applied to the wavelet-transformed signal to compare the variation of instantaneous frequencies and instantaneous amplitudes of the defect-free and defective signals. PMID:29232845
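Of the three techniques combined here, cross-correlation for time-delay estimation is the simplest to sketch: slide one signal past the other and take the lag that maximizes their inner product. The code below is a generic illustration with synthetic signals, not the authors' implementation:

```python
def xcorr_delay(ref, sig):
    """Estimate the delay (in samples) of `sig` relative to `ref` by
    locating the peak of their cross-correlation over non-negative lags."""
    n = len(ref)
    best_lag, best_val = 0, float("-inf")
    for lag in range(n):
        # Inner product of ref with sig shifted left by `lag` samples.
        v = sum(ref[i] * sig[i + lag] for i in range(n - lag))
        if v > best_val:
            best_lag, best_val = lag, v
    return best_lag

pulse = [0.0, 1.0, 2.0, 1.0, 0.0]
ref = pulse + [0.0] * 15
sig = [0.0] * 7 + pulse + [0.0] * 8   # same pulse, delayed by 7 samples
delay = xcorr_delay(ref, sig)          # 7
```

In guided-wave NDT, such time delays translate to propagation-path differences; a defect that lengthens or reroutes the wave path shows up as a lag shift along the B-scan.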
Spectroscopic analysis technique for arc-welding process control
NASA Astrophysics Data System (ADS)
Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel
2005-09-01
The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. Particularly, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementations. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real-time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG-welding using fiber-optic capture of light and a low-cost CCD-based spectrometer show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
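The sub-pixel peak localization step can be illustrated with the common three-point parabolic interpolation. Note that the paper's LPO algorithm is a different method; this is only a stand-in for the general idea of refining a peak center below pixel resolution on a sampled spectrum:

```python
def subpixel_peak(intensity, i):
    """Refine the position of a local maximum at sample index i by
    fitting a parabola through (i-1, i, i+1); returns a fractional index."""
    y0, y1, y2 = intensity[i - 1], intensity[i], intensity[i + 1]
    denom = y0 - 2 * y1 + y2
    if denom == 0:          # flat triple: no refinement possible
        return float(i)
    return i + 0.5 * (y0 - y2) / denom

# Samples of an emission line whose true maximum lies between pixels 2 and 3.
line = [0.1, 0.4, 0.9, 1.0, 0.5, 0.1]
center = subpixel_peak(line, 3)
```

Converting the fractional index to wavelength via the spectrometer's dispersion gives the peak's central wavelength, which is what allows automatic identification of each atomic species.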
The Strategic Thinking Process: Efficient Mobilization of Human Resources for System Definition
Covvey, H. D.
1987-01-01
This paper describes the application of several group management techniques to the creation of needs specifications and information systems strategic plans in health care institutions. The overall process is called the “Strategic Thinking Process”. It is a formal methodology that can reduce the time and cost of creating key documents essential for the successful implementation of health care information systems.
Radar transponder apparatus and signal processing technique
Axline, Jr., Robert M.; Sloan, George R.; Spalding, Richard E.
1996-01-01
An active, phase-coded, time-grating transponder and a synthetic-aperture radar (SAR) and signal processor means, in combination, allow the recognition and location of the transponder (tag) in the SAR image and allow communication of information messages from the transponder to the SAR. The SAR is an illuminating radar having special processing modifications in an image-formation processor to receive an echo from a remote transponder, after the transponder receives and retransmits the SAR illuminations, and to enhance the transponder's echo relative to surrounding ground clutter by recognizing special modulations in the phase-shifted transponder retransmissions. The remote radio-frequency tag also transmits information to the SAR through a single antenna that also serves to receive the SAR illuminations. Unique tag-modulation and SAR signal processing techniques, in combination, allow the detection and precise geographical location of the tag through the reduction of interfering signals from ground clutter, and allow environmental and status information from said tag to be communicated to said SAR.
Radar transponder apparatus and signal processing technique
Axline, R.M. Jr.; Sloan, G.R.; Spalding, R.E.
1996-01-23
An active, phase-coded, time-grating transponder and a synthetic-aperture radar (SAR) and signal processor means, in combination, allow the recognition and location of the transponder (tag) in the SAR image and allow communication of information messages from the transponder to the SAR. The SAR is an illuminating radar having special processing modifications in an image-formation processor to receive an echo from a remote transponder, after the transponder receives and retransmits the SAR illuminations, and to enhance the transponder's echo relative to surrounding ground clutter by recognizing special modulations in the phase-shifted transponder retransmissions. The remote radio-frequency tag also transmits information to the SAR through a single antenna that also serves to receive the SAR illuminations. Unique tag-modulation and SAR signal processing techniques, in combination, allow the detection and precise geographical location of the tag through the reduction of interfering signals from ground clutter, and allow environmental and status information from said tag to be communicated to said SAR. 4 figs.
MT+, integrating magnetotellurics to determine earth structure, physical state, and processes
Bedrosian, P.A.
2007-01-01
As one of the few deep-earth imaging techniques, magnetotellurics provides information on both the structure and physical state of the crust and upper mantle. Magnetotellurics is sensitive to electrical conductivity, which varies within the earth by many orders of magnitude and is modified by a range of earth processes. As with all geophysical techniques, magnetotellurics has a non-unique inverse problem and has limitations in resolution and sensitivity. As such, an integrated approach, either via the joint interpretation of independent geophysical models, or through the simultaneous inversion of independent data sets is valuable, and at times essential to an accurate interpretation. Magnetotelluric data and models are increasingly integrated with geological, geophysical and geochemical information. This review considers recent studies that illustrate the ways in which such information is combined, from qualitative comparisons to statistical correlation studies to multi-property inversions. Also emphasized are the range of problems addressed by these integrated approaches, and their value in elucidating earth structure, physical state, and processes. © Springer Science+Business Media B.V. 2007.
Process simulation for advanced composites production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Allendorf, M.D.; Ferko, S.M.; Griffiths, S.
1997-04-01
The objective of this project is to improve the efficiency and lower the cost of chemical vapor deposition (CVD) processes used to manufacture advanced ceramics by providing the physical and chemical understanding necessary to optimize and control these processes. Project deliverables include: numerical process models; databases of thermodynamic and kinetic information related to the deposition process; and process sensors and software algorithms that can be used for process control. Target manufacturing techniques include CVD fiber coating technologies (used to deposit interfacial coatings on continuous fiber ceramic preforms), chemical vapor infiltration, thin-film deposition processes used in the glass industry, and coating techniques used to deposit wear-, abrasion-, and corrosion-resistant coatings for use in the pulp and paper, metals processing, and aluminum industries.
The Hidden Technology: Dictation Systems.
ERIC Educational Resources Information Center
Barton, Kathy; And Others
This booklet provides business and office teachers with background information, supporting materials, recruiting techniques, and a suggested unit plan that integrates the concepts related to dictation systems into information processing curricula. An "Introduction" (Donna Everett) discusses the need for dictation skills. "Need for Dictation…
Perceiving Patterns of Reference Service: A Survey
ERIC Educational Resources Information Center
Blakely, Florence
1971-01-01
Reference librarians must, if they hope to survive, retool in preparation for becoming the interface between the patron and computer-based information systems. This involves sharpening the interview technique and understanding where to plug into the information flow process. (4 references) (Author)
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-30
... technological collection techniques or other forms of information technology, e.g., permitting electronic... form I-751 and an estimated time burden per response of 1.17 hours for the biometric processing. (6) An...
White-Light Optical Information Processing and Holography.
1985-07-29
An advantage of this technique is that the processing system does not need to carry its own light source. It is very suitable for spaceborne and satellite application. We...developed a technique for generating a spatial-frequency color-coded speech spectrogram with a white-light optical system. This system not only offers a low...that the annoying moire fringes can be eliminated. In short, we have once again demonstrated the versatility of the white-light processing system; a
Effects of foveal information processing
NASA Technical Reports Server (NTRS)
Harris, R. L., Sr.
1984-01-01
The scanning behavior of pilots must be understood so that cockpit displays can be assembled which will provide the most information accurately and quickly to the pilot. The results of seven years of collecting and analyzing pilot scanning data are summarized. The data indicate that pilot scanning behavior: (1) is subconscious; (2) is situation dependent; and (3) can be disrupted if pilots are forced to make conscious decisions. Testing techniques and scanning analysis techniques have been developed that are sensitive to pilot workload.
Tuberculosis diagnosis support analysis for precarious health information systems.
Orjuela-Cañón, Alvaro David; Camargo Mendoza, Jorge Eliécer; Awad García, Carlos Enrique; Vergara Vela, Erika Paola
2018-04-01
Pulmonary tuberculosis is a world emergency for the World Health Organization. Techniques and new diagnostic tools are important in the battle against this bacterial infection. There have been many advances in all those fields, but in developing countries such as Colombia, where resources and infrastructure are limited, new, fast, and less expensive strategies are increasingly needed. Artificial neural networks are computational intelligence techniques that can be applied to this kind of problem and offer additional support in the tuberculosis diagnosis process, providing a tool to help medical staff make decisions about the management of subjects under suspicion of tuberculosis. A database extracted from 105 subjects with precarious information, all under suspicion of pulmonary tuberculosis, was used in this study. Data on sex, age, diabetes, homelessness, AIDS status, and a variable encoding clinical knowledge from the medical personnel were used. Models based on artificial neural networks were used, exploring supervised learning to detect the disease. Unsupervised learning was used to create three risk groups based on the available information. The results obtained are comparable with traditional techniques for the detection of tuberculosis, with advantages such as speed and low implementation cost. A sensitivity of 97% and a specificity of 71% were achieved. The techniques used yielded valuable information that can be useful for physicians who treat the disease in decision-making processes, especially with limited infrastructure and data. Copyright © 2018 Elsevier B.V. All rights reserved.
Technique and cue selection for graphical presentation of generic hyperdimensional data
NASA Astrophysics Data System (ADS)
Howard, Lee M.; Burton, Robert P.
2013-12-01
Several presentation techniques have been created for visualization of data with more than three variables. Packages have been written, each of which implements a subset of these techniques. However, these packages generally fail to provide all the features needed by the user during the visualization process. Further, packages generally limit support for presentation techniques to a few techniques. A new package called Petrichor accommodates all necessary and useful features together in one system. Any presentation technique may be added easily through an extensible plugin system. Features are supported by a user interface that allows easy interaction with data. Annotations allow users to mark up visualizations and share information with others. By providing a hyperdimensional graphics package that easily accommodates presentation techniques and includes a complete set of features, including those that are rarely or never supported elsewhere, the user is provided with a tool that facilitates improved interaction with multivariate data to extract and disseminate information.
Simulation Techniques in Training College Administrators.
ERIC Educational Resources Information Center
Fincher, Cameron
Traditional methods of recruitment and selection in academic administration have not placed an emphasis on formal training or preparation but have relied heavily on informal notions of experiential learning. Simulation as a device for representing complex processes in a manageable form, gaming as an organizing technique for training and…
Technique for experimental determination of radiation interchange factors in solar wavelengths
NASA Technical Reports Server (NTRS)
Bobco, R. P.; Nolte, L. J.; Wensley, J. R.
1971-01-01
Process obtains solar heating data which support analytical design. Process yields quantitative information on local solar exposure of models which are geometrically and reflectively similar to prototypes under study. Models are tested in a shirtsleeve environment.
Syllabus for a Course in File Management. Curriculum for the Information Sciences, Report No. 9.
ERIC Educational Resources Information Center
Carroll, John M.
The course treats the organization and structure of files including relationships between information representation and processing techniques, transformations between storage media, and the referencing of information as related to the structure of its representation. The intent of the course is fourfold: (a) To teach the underlying principles of…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-26
... Project- Based Section 8 Contracts AGENCY: Office of the Chief Information Officer, HUD. ACTION: Notice... through the use of appropriate automated collection techniques or other forms of information technology, e... Section 8 project-based assistance contracts are renewed. The Section 8 contract renewal process is an...
Tagline: Information Extraction for Semi-Structured Text Elements in Medical Progress Notes
ERIC Educational Resources Information Center
Finch, Dezon Kile
2012-01-01
Text analysis has become an important research activity in the Department of Veterans Affairs (VA). Statistical text mining and natural language processing have been shown to be very effective for extracting useful information from medical documents. However, neither of these techniques is effective at extracting the information stored in…
The Information Impact: Ensuring New Product Winners.
ERIC Educational Resources Information Center
Trubkin, Loene
Despite investment in new research tools and techniques, the product development success rate has not improved within the last 25 years. One way to increase the success rate is to have the right information at each stage of the process. Today, a relatively new method of gathering information--online access to electronic files called…
Comparative performance evaluation of transform coding in image pre-processing
NASA Astrophysics Data System (ADS)
Menon, Vignesh V.; NB, Harikrishnan; Narayanan, Gayathri; CK, Niveditha
2017-07-01
We are in the midst of a communication transformation which drives both the development and the dissemination of pioneering communication systems with ever-increasing fidelity and resolution. Research in image processing techniques has been driven by a growing demand for faster and easier encoding, storage, and transmission of visual information. In this paper, the researchers highlight several techniques that can be used at the transmitter end to ease the transmission and reconstruction of images. The researchers investigate the performance of different image transform coding schemes used in pre-processing, comparing their effectiveness, the necessary and sufficient conditions, their properties, and their implementation complexity. Motivated by prior advancements in image processing techniques, the researchers compare the performance of several contemporary image pre-processing frameworks: Compressed Sensing, Singular Value Decomposition, and the Integer Wavelet Transform. The paper exposes the potential of the Integer Wavelet Transform to be an efficient pre-processing scheme.
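As a rough illustration of one of the pre-processing frameworks compared in this abstract, the sketch below uses Singular Value Decomposition to form a low-rank image approximation (the Eckart-Young optimal truncation). The synthetic image, the chosen rank, and the error check are illustrative assumptions, not the paper's experimental setup.

```python
# Sketch: low-rank image approximation via truncated SVD.
import numpy as np

def svd_compress(image: np.ndarray, rank: int) -> np.ndarray:
    """Return the best rank-`rank` approximation of a 2-D image (Eckart-Young)."""
    U, s, Vt = np.linalg.svd(image, full_matrices=False)
    return U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank, :]

rng = np.random.default_rng(0)
# Synthetic 64x64 "image": smooth low-rank structure plus mild noise.
base = np.outer(np.sin(np.linspace(0, 3, 64)), np.cos(np.linspace(0, 2, 64)))
image = base + 0.01 * rng.standard_normal((64, 64))

approx = svd_compress(image, rank=8)
err = np.linalg.norm(image - approx) / np.linalg.norm(image)
print(f"relative reconstruction error at rank 8: {err:.4f}")
```

Only the leading singular triplets need to be stored or transmitted, which is the sense in which SVD acts as a compressive pre-processing step.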
Swarm Intelligence for Optimizing Hybridized Smoothing Filter in Image Edge Enhancement
NASA Astrophysics Data System (ADS)
Rao, B. Tirumala; Dehuri, S.; Dileep, M.; Vindhya, A.
In this modern era, image transmission and processing play a major role. It would be impossible to retrieve information from satellite and medical images without the help of image processing techniques. Edge enhancement is an image processing step that enhances the edge contrast of an image or video in an attempt to improve its acutance. Edges are representations of the discontinuities of image intensity functions. For processing these discontinuities in an image, a good edge enhancement technique is essential. The proposed work presents a new idea for edge enhancement using hybridized smoothing filters and introduces a promising technique for obtaining the best hybrid filter using swarm algorithms (Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO), and Ant Colony Optimization (ACO)) to search for an optimal sequence of filters from among a set of rather simple, representative image processing filters. This paper deals with the analysis of the swarm intelligence techniques through the combination of hybrid filters generated by these algorithms for image edge enhancement.
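A minimal sketch of the idea, assuming a simplified setting: a standard PSO searches over the mixing weights of a hybrid of two simple smoothing filters (uniform and median), with fitness measured as mean squared error against a clean reference image. The filters, image, and PSO constants are illustrative assumptions, not the paper's exact algorithm.

```python
# Sketch: PSO over mixing weights of a hybridized smoothing filter.
import numpy as np
from scipy.ndimage import uniform_filter, median_filter

rng = np.random.default_rng(0)
clean = np.zeros((32, 32)); clean[8:24, 8:24] = 1.0      # square with sharp edges
noisy = clean + 0.3 * rng.standard_normal(clean.shape)

u, m = uniform_filter(noisy, 3), median_filter(noisy, 3)  # candidate base filters

def fitness(w):
    a = np.clip(w, 0.0, 1.0)
    hybrid = a[0] * u + a[1] * m + (1 - a[0] - a[1]) * noisy
    return np.mean((hybrid - clean) ** 2)                 # lower is better

# Standard PSO with inertia plus cognitive/social terms over the 2-D weight space.
n_particles, iters = 20, 40
pos = rng.uniform(0, 1, (n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)]
for _ in range(iters):
    r1, r2 = rng.uniform(size=(2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([fitness(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)]
print(f"best weights (uniform, median): {np.clip(gbest, 0, 1)}, MSE: {fitness(gbest):.4f}")
```

ABC and ACO variants would replace only the position-update rule; the hybrid-filter encoding and fitness function stay the same.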
Radiant thinking and the use of the mind map in nurse practitioner education.
Spencer, Julie R; Anderson, Kelley M; Ellis, Kathryn K
2013-05-01
The concept of radiant thinking, which led to the concept of mind mapping, promotes all aspects of the brain working in synergy, with thought beginning from a central point. The mind map, which is a graphical technique to improve creative thinking and knowledge attainment, utilizes colors, images, codes, and dimensions to amplify and enhance key ideas. This technique augments the visualization of relationships and links between concepts, which aids in information acquisition, data retention, and overall comprehension. Faculty can promote students' use of the technique for brainstorming, organizing ideas, taking notes, learning collaboratively, presenting, and studying. These applications can be used in problem-based learning, developing plans of care, health promotion activities, synthesizing disease processes, and forming differential diagnoses. Mind mapping is a creative way for students to engage in a unique method of learning that can expand memory recall and help create a new environment for processing information. Copyright 2013, SLACK Incorporated.
Estevez, Claudio; Kailas, Aravind
2012-01-01
Millimeter-wave technology shows high potential for future wireless personal area networks, reaching transmissions of over 1 Gbps using simple modulation techniques. Current specifications consider dividing the spectrum into easily separable spectrum ranges. These low requirements open a research area in time and space multiplexing techniques for millimeter waves. In this work a process-stacking multiplexing access algorithm is designed for single-channel operation. The concept is intuitive, but its implementation is not trivial. The key to stacking single-channel events is to operate while simultaneously obtaining and handling a-posteriori time-frame information of scheduled events. This information is used to shift a global time pointer that the wireless access point manages and uses to synchronize all serviced nodes. The performance of the proposed multiplexing access technique is lower bounded by the performance of legacy TDMA and can significantly improve the effective throughput. The work is validated by simulation results.
Analysis of a municipal wastewater treatment plant using a neural network-based pattern analysis
Hong, Y.-S.T.; Rosen, Michael R.; Bhamidimarri, R.
2003-01-01
This paper addresses the problem of how to capture the complex relationships that exist between process variables and to diagnose the dynamic behaviour of a municipal wastewater treatment plant (WTP). Due to the complex biological reaction mechanisms and the highly time-varying, multivariable nature of a real WTP, diagnosis is still difficult in practice. Intelligent techniques that can analyse multi-dimensional process data using sophisticated visualisation can be useful for analysing and diagnosing the activated-sludge WTP. In this paper, the Kohonen Self-Organising Feature Map (KSOFM) neural network is applied to analyse the multi-dimensional process data and to diagnose the inter-relationships of the process variables in a real activated-sludge WTP. By using component planes, detailed local relationships between the process variables, e.g., responses of the process variables under different operating conditions, as well as global information, are discovered. The operating conditions and the inter-relationships among the process variables in the WTP have been diagnosed and extracted from the information obtained by clustering analysis of the maps. It is concluded that the KSOFM technique provides an effective analysis and diagnosis tool to understand the system behaviour and to extract the knowledge contained in the multi-dimensional data of a large-scale WTP. © 2003 Elsevier Science Ltd. All rights reserved.
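A minimal Kohonen self-organising feature map of the kind the abstract applies can be written in a few dozen lines of NumPy. The grid size, decay schedules, and synthetic multivariate "process" data below are illustrative assumptions, not the paper's plant data.

```python
# Sketch: minimal Kohonen Self-Organising Feature Map (KSOFM) in NumPy.
import numpy as np

def train_som(data, grid=(6, 6), epochs=20, lr0=0.5, sigma0=2.0, seed=0):
    rng = np.random.default_rng(seed)
    n_nodes = grid[0] * grid[1]
    # Node coordinates on the 2-D map, used by the neighbourhood function.
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], float)
    weights = rng.standard_normal((n_nodes, data.shape[1]))
    n_steps, t = epochs * len(data), 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            lr = lr0 * (1 - t / n_steps)               # decaying learning rate
            sigma = sigma0 * (1 - t / n_steps) + 0.5   # shrinking neighbourhood
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2 * sigma ** 2))         # Gaussian neighbourhood
            weights += lr * h[:, None] * (x - weights)
            t += 1
    return weights

def quantization_error(data, weights):
    d = np.sqrt(((data[:, None, :] - weights[None, :, :]) ** 2).sum(-1))
    return d.min(axis=1).mean()

rng = np.random.default_rng(1)
# Three synthetic operating conditions in a 5-variable "process".
data = np.vstack([rng.normal(m, 0.3, (50, 5)) for m in (-1.0, 0.0, 1.0)])
w = train_som(data)
print(f"quantization error: {quantization_error(data, w):.3f}")
```

Each column of the trained weight matrix, reshaped onto the 6x6 grid, is one "component plane" of the kind the paper inspects for inter-relationships between variables.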
NASA Technical Reports Server (NTRS)
Hasler, A. F.; Desjardins, M.; Shenk, W. E.
1979-01-01
Simultaneous Geosynchronous Operational Environmental Satellite (GOES) 1 km resolution visible image pairs can provide quantitative three dimensional measurements of clouds. These data have great potential for severe storms research and as a basic parameter measurement source for other areas of meteorology (e.g. climate). These stereo cloud height measurements are not subject to the errors and ambiguities caused by unknown cloud emissivity and temperature profiles that are associated with infrared techniques. This effort describes the display and measurement of stereo data using digital processing techniques.
Hershberger, Patricia E; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer
2013-01-01
Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. The purpose of this article is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Using a grounded theory approach, 27 women with cancer participated in individual, semistructured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by 5 dimensions within the Contemplate phase of the decision-making process framework. In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Better understanding of theoretical underpinnings surrounding women's information processes can facilitate decision support and improve clinical care.
Task-Driven Dynamic Text Summarization
ERIC Educational Resources Information Center
Workman, Terri Elizabeth
2011-01-01
The objective of this work is to examine the efficacy of natural language processing (NLP) in summarizing bibliographic text for multiple purposes. Researchers have noted the accelerating growth of bibliographic databases. Information seekers using traditional information retrieval techniques when searching large bibliographic databases are often…
INcreasing Security and Protection through Infrastructure REsilience: The INSPIRE Project
NASA Astrophysics Data System (ADS)
D'Antonio, Salvatore; Romano, Luigi; Khelil, Abdelmajid; Suri, Neeraj
The INSPIRE project aims at enhancing the European potential in the field of security by ensuring the protection of critical information infrastructures through (a) the identification of their vulnerabilities and (b) the development of innovative techniques for securing networked process control systems. To increase the resilience of such systems INSPIRE will develop traffic engineering algorithms, diagnostic processes and self-reconfigurable architectures along with recovery techniques. Hence, the core idea of the INSPIRE project is to protect critical information infrastructures by appropriately configuring, managing, and securing the communication network which interconnects the distributed control systems. A working prototype will be implemented as a final demonstrator of selected scenarios. Controls/Communication Experts will support project partners in the validation and demonstration activities. INSPIRE will also contribute to the standardization process in order to foster multi-operator interoperability and coordinated strategies for securing lifeline systems.
NASA Astrophysics Data System (ADS)
Maciel, Thiago O.; Vianna, Reinaldo O.; Sarthour, Roberto S.; Oliveira, Ivan S.
2015-11-01
We reconstruct the time-dependent quantum map corresponding to the relaxation process of a two-spin system in liquid-state NMR at room temperature. By means of quantum tomography techniques that handle informationally incomplete data, we show how to properly post-process and normalize the measurement data for the simulation of quantum information processing, overcoming the unknown number of molecules (Nj) prepared in a non-equilibrium magnetization state by an initial sequence of radiofrequency pulses. From the reconstructed quantum map, we infer both the longitudinal (T1) and transversal (T2) relaxation times, and introduce the J-coupling relaxation times (T1J, T2J), which are relevant for quantum information processing simulations. We show that the map associated with the relaxation process cannot be assumed to be approximately unital and trace-preserving for times greater than T2J.
Carbone, Elena T; Campbell, Marci K; Honess-Morreale, Lauren
2002-05-01
The effectiveness of dietary surveys and educational messages is dependent in part on how well the target audience's information processing needs and abilities are addressed. Use of pilot testing is helpful; however, problems with wording and language are often not revealed. Cognitive interview techniques offer 1 approach to assist dietitians in understanding how audiences process information. With this method, respondents are led through a survey or message and asked to paraphrase items; discuss thoughts, feelings, and ideas that come to mind; and suggest alternative wording. As part of a US Department of Agriculture-funded nutrition education project, 23 cognitive interviews were conducted among technical community college students in North Carolina. Interview findings informed the development of tailored computer messages and survey questions. Better understanding of respondents' cognitive processes significantly improved the language and approach used in this intervention. Interview data indicated 4 problem areas: vague or ineffective instructions, confusing questions and response options, variable interpretation of terms, and misinterpretation of dietary recommendations. Interviews also provided insight into the meaning of diet-related stages of change. These findings concur with previous research suggesting that cognitive interview techniques are a valuable tool in the formative evaluation and development of nutrition surveys and materials.
Challenges of microtome‐based serial block‐face scanning electron microscopy in neuroscience
WANNER, A. A.; KIRSCHMANN, M. A.
2015-01-01
Summary Serial block‐face scanning electron microscopy (SBEM) is becoming increasingly popular for a wide range of applications in many disciplines from biology to material sciences. This review focuses on applications for circuit reconstruction in neuroscience, which is one of the major driving forces advancing SBEM. Neuronal circuit reconstruction poses exceptional challenges to volume EM in terms of resolution, field of view, acquisition time and sample preparation. Mapping the connections between neurons in the brain is crucial for understanding information flow and information processing in the brain. However, information on the connectivity between hundreds or even thousands of neurons densely packed in neuronal microcircuits is still largely missing. Volume EM techniques such as serial section TEM, automated tape‐collecting ultramicrotome, focused ion‐beam scanning electron microscopy and SBEM (microtome serial block‐face scanning electron microscopy) are the techniques that provide sufficient resolution to resolve ultrastructural details such as synapses and a sufficient field of view for dense reconstruction of neuronal circuits. While volume EM techniques are advancing, they are generating large data sets on the terabyte scale that require new image processing workflows and analysis tools. In this review, we present the recent advances in SBEM for circuit reconstruction in neuroscience and an overview of existing image processing and analysis pipelines. PMID:25907464
Plasma-edge studies using carbon resistance probes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wampler, W.R.
1984-01-01
Characterization of erosion and hydrogen-recycling processes occurring at the edge of magnetically confined plasmas requires knowledge of the energy and flux of hydrogen isotopes incident on the materials. A new plasma-edge probe technique, the carbon resistance probe, has been developed to obtain this information. This technique uti
Remote sensing. [land use mapping
NASA Technical Reports Server (NTRS)
Jinich, A.
1979-01-01
Various imaging techniques are outlined for use in mapping, land use, and land management in Mexico. Among the techniques discussed are pattern recognition and photographic processing. The utilization of information from remote sensing devices on satellites is studied. Multispectral band scanners are examined, and software, hardware, and other program requirements are surveyed.
Role of Discrepant Questioning Leading to Model Element Modification
ERIC Educational Resources Information Center
Rea-Ramirez, Mary Anne; Nunez-Oviedo, Maria Cecilia; Clement, John
2009-01-01
Discrepant questioning is a teaching technique that can help students "unlearn" misconceptions and process science ideas for deep understanding. Discrepant questioning is a technique in which teachers question students in a way that requires them to examine their ideas or models, without giving information prematurely to the student or passing…
Image interpolation and denoising for division of focal plane sensors using Gaussian processes.
Gilboa, Elad; Cunningham, John P; Nehorai, Arye; Gruev, Viktor
2014-06-16
Image interpolation and denoising are important techniques in image processing. These methods are inherent to digital image acquisition, as most digital cameras are composed of a 2D grid of heterogeneous imaging sensors. Current polarization imagers employ four different pixelated polarization filters, commonly referred to as division of focal plane polarization sensors. The sensors capture only partial information about the true scene, leading to a loss of spatial resolution as well as inaccuracy of the captured polarization information. Interpolation is a standard technique to recover the missing information and increase the accuracy of the captured polarization information. Here we focus specifically on Gaussian process regression as a way to perform statistical image interpolation, where estimates of sensor noise are used to improve the accuracy of the estimated pixel information. We further exploit the inherent grid structure of this data to create a fast exact algorithm that operates in O(N^(3/2)) (vs. the naive O(N³)), thus making the Gaussian process method computationally tractable for image data. This modeling advance and the enabling computational advance combine to produce significant improvements over previously published interpolation methods for polarimeters, which are most pronounced in cases of low signal-to-noise ratio (SNR). We provide a comprehensive mathematical model as well as experimental results of the GP interpolation performance for division of focal plane polarimeters.
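The core idea, interpolating missing pixels with a Gaussian process whose noise term is estimated from the data, can be sketched with scikit-learn's generic GaussianProcessRegressor. Note this is the naive O(N³) solver, not the authors' fast grid-exploiting O(N^(3/2)) algorithm; the scene, kernel, and sampling mask are illustrative assumptions.

```python
# Sketch: GP interpolation of a sparsely sampled image channel.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
n = 16
yy, xx = np.mgrid[0:n, 0:n]
coords = np.column_stack([xx.ravel(), yy.ravel()]).astype(float)
# Smooth synthetic "scene" plus sensor noise.
scene = np.sin(xx / 3.0) * np.cos(yy / 4.0)
noisy = scene + 0.05 * rng.standard_normal(scene.shape)

# Mimic a division-of-focal-plane sensor: each pixel records only one of
# four filter channels, so 3/4 of this channel's pixels are missing.
mask = (xx % 2 == 0) & (yy % 2 == 0)
X_obs = coords[mask.ravel()]
y_obs = noisy[mask]

# The WhiteKernel term lets the GP estimate sensor noise, which is what
# improves the interpolated values at unobserved pixels.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=3.0) + WhiteKernel(0.05**2))
gp.fit(X_obs, y_obs)
interp = gp.predict(coords).reshape(n, n)
rmse = np.sqrt(np.mean((interp - scene) ** 2))
print(f"interpolation RMSE vs. true scene: {rmse:.3f}")
```

The grid structure the paper exploits is visible here: the observed pixels form a regular lattice, which is what admits the fast exact factorization in the full method.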
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.; Fales, Carl L.
1990-01-01
Researchers are concerned with the end-to-end performance of image gathering, coding, and processing. The applications range from high-resolution television to vision-based robotics, wherever the resolution, efficiency and robustness of visual information acquisition and processing are critical. For the presentation at this workshop, it is convenient to divide research activities into the following two overlapping areas: The first is the development of focal-plane processing techniques and technology to effectively combine image gathering with coding, with an emphasis on low-level vision processing akin to the retinal processing in human vision. The approach includes the familiar Laplacian pyramid, the new intensity-dependent spatial summation, and parallel sensing/processing networks. Three-dimensional image gathering is attained by combining laser ranging with sensor-array imaging. The second is the rigorous extension of information theory and optimal filtering to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing.
Exploring the Role of Usability in the Software Process: A Study of Irish Software SMEs
NASA Astrophysics Data System (ADS)
O'Connor, Rory V.
This paper explores the software processes and usability techniques used by Small and Medium Enterprises (SMEs) that develop web applications. The significance of this research is that it looks at development processes used by SMEs in order to assess to what degree usability is integrated into the process. This study seeks to gain an understanding of the level of awareness of usability within SMEs today and their commitment to usability in practice. The motivation for this research is to explore the current development processes used by SMEs in developing web applications and to understand how usability is represented in those processes. The background for this research is provided by the growth of the web application industry beyond informational web sites to more sophisticated applications delivering a broad range of functionality. This paper presents an analysis of the practices of several Irish SMEs that develop web applications through a series of case studies. The focus is on SMEs that develop web applications as Management Information Systems, not e-commerce sites, informational sites, online communities, or web portals. This study gathered data about the usability techniques practiced by these companies and their awareness of usability in the context of the software process in those SMEs. The contribution of this study is to further the understanding of the current role of usability within the software development processes of SMEs that develop web applications.
Model-checking techniques based on cumulative residuals.
Lin, D Y; Wei, L J; Ying, Z
2002-03-01
Residuals have long been used for graphical and numerical examinations of the adequacy of regression models. Conventional residual analysis based on the plots of raw residuals or their smoothed curves is highly subjective, whereas most numerical goodness-of-fit tests provide little information about the nature of model misspecification. In this paper, we develop objective and informative model-checking techniques by taking the cumulative sums of residuals over certain coordinates (e.g., covariates or fitted values) or by considering some related aggregates of residuals, such as moving sums and moving averages. For a variety of statistical models and data structures, including generalized linear models with independent or dependent observations, the distributions of these stochastic processes under the assumed model can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be easily generated by computer simulation. Each observed process can then be compared, both graphically and numerically, with a number of realizations from the Gaussian process. Such comparisons enable one to assess objectively whether a trend seen in a residual plot reflects model misspecification or natural variation. The proposed techniques are particularly useful in checking the functional form of a covariate and the link function. Illustrations with several medical studies are provided.
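A stripped-down version of this cumulative-residual check can be sketched as follows: fit a working model, form the cumulative sum of residuals over a covariate, then compare its supremum against realizations generated by randomly perturbing the residuals. The linear model and the sign-flip (multiplier-style) resampling are simplified illustrative assumptions relative to the paper's full construction.

```python
# Sketch: model check via cumulative sums of residuals over a covariate.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = np.sort(rng.uniform(0, 1, n))
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, n)    # data truly linear in x

# Fit the working model y = a + b*x by least squares.
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

# Observed process: cumulative sum of residuals ordered by the covariate x.
W_obs = np.cumsum(resid) / np.sqrt(n)
sup_obs = np.abs(W_obs).max()

# Null realizations: flip residual signs at random and recompute the
# cumulative-sum process; its supremum approximates the null distribution.
sup_null = np.array([
    np.abs(np.cumsum(resid * rng.choice([-1.0, 1.0], n)) / np.sqrt(n)).max()
    for _ in range(500)
])
p_value = (sup_null >= sup_obs).mean()
print(f"supremum statistic: {sup_obs:.3f}, simulated p-value: {p_value:.3f}")
```

A large supremum relative to the simulated realizations (small p-value) would flag misspecification of the functional form of x; refitting with a deliberately wrong model (e.g., omitting x) makes the statistic blow up.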
Stock, Ann-Kathrin; Mückschel, Moritz; Beste, Christian
2017-01-01
Recent research has drawn interest to the effects of binge drinking on response selection. However, choosing an appropriate response is a complex endeavor that usually requires us to process and integrate several streams of information. One of them is proprioceptive information about the position of limbs. To date, however, it has remained elusive how binge drinking affects the processing of proprioceptive information during response selection and control in healthy individuals. We investigated this question using neurophysiological (EEG) techniques in a response selection task in which we manipulated proprioceptive information. The results show a reversal of alcohol-induced effects on response control due to changes in proprioceptive information processing. The most likely explanation for this finding is that proprioceptive information does not seem to be properly integrated into response selection processes during acute alcohol intoxication as found in binge drinking. The neurophysiological data suggest that processes related to the preparation and execution of the motor response, but not upstream processes related to conflict monitoring and spatial attentional orienting, underlie these binge-drinking-dependent modulations. Taken together, the results show that even high doses of alcohol have very specific effects within the cascade of neurophysiological processes underlying response control and the integration of proprioceptive information during this process. © 2015 Society for the Study of Addiction.
Ultra-processed family foods in Australia: nutrition claims, health claims and marketing techniques.
Pulker, Claire Elizabeth; Scott, Jane Anne; Pollard, Christina Mary
2018-01-01
To objectively evaluate voluntary nutrition and health claims and marketing techniques present on packaging of high-market-share ultra-processed foods (UPF) in Australia for their potential impact on public health. Cross-sectional. Packaging information from five high-market-share food manufacturers and one retailer were obtained from supermarket and manufacturers' websites. Ingredients lists for 215 UPF were examined for presence of added sugar. Packaging information was categorised using a taxonomy of nutrition and health information which included nutrition and health claims and five common food marketing techniques. Compliance of statements and claims with the Australia New Zealand Food Standards Code and with Health Star Ratings (HSR) were assessed for all products. Almost all UPF (95 %) contained added sugars described in thirty-four different ways; 55 % of UPF displayed a HSR; 56 % had nutrition claims (18 % were compliant with regulations); 25 % had health claims (79 % were compliant); and 97 % employed common food marketing techniques. Packaging of 47 % of UPF was designed to appeal to children. UPF carried a mean of 1·5 health and nutrition claims (range 0-10) and 2·6 marketing techniques (range 0-5), and 45 % had HSR≤3·0/5·0. Most UPF packaging featured nutrition and health statements or claims despite the high prevalence of added sugars and moderate HSR. The degree of inappropriate or inaccurate statements and claims present is concerning, particularly on packaging designed to appeal to children. Public policies to assist parents to select healthy family foods should address the quality and accuracy of information provided on UPF packaging.
Proton magnetic resonance spectroscopy imaging in the study of human brain cancer.
Martínez-Bisbal, M C; Celda, B
2009-12-01
Magnetic resonance spectroscopic imaging (MRSI) is a non-invasive imaging technique that provides metabolic information on brain tumors. This biochemical information can be processed and presented as density maps of several metabolites, among them N-acetylaspartate (a marker of neuronal viability), choline (a marker of membrane turnover), creatine (related to the energy state of the cells), myo-inositol (exclusively found in astrocytes), and lipids and lactate (observed in necrosis and other pathological processes), all of which provide relevant information in the context of brain tumors. Thus, this technique is a multiparametric molecular imaging method that can complement the magnetic resonance imaging (MRI) study, enabling the detection of biochemical patterns of different features and aspects of brain tumors. In this article, the role of MRSI as a molecular imaging technique to provide biochemical information on human brain tumors is reviewed. The most frequent questions and situations in the study of human brain tumors in clinical settings are considered: the distinction of neoplastic lesions from non-neoplastic ones, tumor type identification, the study of heterogeneity and infiltration of normal-appearing white matter, and therapy follow-up with detection of side effects. The great amount of data in MRSI acquisition compared to single-voxel techniques requires the use of automated quantification methods, but the possibility of obtaining a self-reference in non-affected areas allows different strategies for data handling and interpretation, as presented in the literature. The combination of MRSI with other physiological MRI techniques and positron emission tomography is also included in this review.
Factory approach can streamline patient accounting.
Rands, J; Muench, M
1991-08-01
Although they may seem fundamentally different, similarities exist between operations of factories and healthcare organizations' business offices. As a result, a patient accounting approach based on manufacturing firms' management techniques may help smooth healthcare business processes. Receivables performance management incorporates the Japanese techniques of "just-in-time" and total quality management to reduce unbilled accounts and information backlog and accelerate payment. A preliminary diagnostic assessment of a patient accounting process helps identify bottlenecks and set priorities for work flow.
1988-09-01
could use the assistance of a microcomputer-based management information system. However, adequate system design and development requires an in-depth...understanding of the Equipment Management Section and the environment in which it functions were asked and answered. Then, a management information system was...designed, developed, and tested. The management information system is called the Equipment Management Information System (EMIS).
Wang, Degeng
2008-01-01
Discrepancy between the abundance of cognate protein and RNA molecules is frequently observed. A theoretical understanding of this discrepancy remains elusive, and it is frequently attributed in the literature to surprises and/or technical difficulties. Protein and RNA represent different steps of the multi-stepped cellular genetic information flow process, in which they are dynamically produced and degraded. This paper explores a comparison with a similar process in computers: the multi-step information flow from the storage level to the execution level. Functional similarities can be found in almost every facet of the retrieval process. Firstly, a common architecture is shared, as the ribonome (RNA space) and the proteome (protein space) are functionally similar to the computer primary memory and the computer cache memory, respectively. Secondly, the retrieval process functions, in both systems, to support the operation of dynamic networks: biochemical regulatory networks in cells and, in computers, the virtual networks (of CPU instructions) that the CPU travels through while executing programs. Moreover, many regulatory techniques are implemented in computers at each step of the information retrieval process, with the goal of optimizing system performance. Cellular counterparts can easily be identified for these regulatory techniques. In other words, this comparative study attempted to utilize theoretical insight from computer system design principles as a catalyst to sketch an integrative view of the gene expression process, that is, how it functions to ensure efficient operation of the overall cellular regulatory network. In the context of this bird's-eye view, the discrepancy between protein and RNA abundance becomes a logical observation one would expect. It was suggested that this discrepancy, when interpreted in the context of system operation, serves as a potential source of information for deciphering the regulatory logic underlying biochemical network operation.
PMID:18757239
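The paper's memory-hierarchy analogy, with the proteome acting as a capacity-limited "cache" of the ribonome in which proteins are dynamically produced and degraded, can be loosely illustrated with the computing concept it borrows: a least-recently-used (LRU) cache. This is a toy sketch of the computer-side analogy only, not the author's biological model; the class name, capacity, and hit/miss labels are illustrative assumptions.

```python
from collections import OrderedDict

class Proteome:
    """Toy 'cache' of translated proteins, by loose analogy with the
    memory-hierarchy view of gene expression. Illustrative only."""

    def __init__(self, capacity):
        self.capacity = capacity          # limited translation/folding capacity
        self.proteins = OrderedDict()     # gene -> abundance, LRU-ordered

    def express(self, gene):
        """Retrieve a protein; 'translate' on a miss, and degrade the
        least-recently-used protein when capacity is exceeded."""
        if gene in self.proteins:
            self.proteins.move_to_end(gene)    # cache hit: protein reused
            return "hit"
        self.proteins[gene] = 1                # miss: translate from RNA
        if len(self.proteins) > self.capacity:
            self.proteins.popitem(last=False)  # degrade the LRU protein
        return "miss"
```

Expressing genes A, B, A, C with capacity 2 yields miss, miss, hit, miss, and B is degraded: abundance at the protein level tracks recent demand, not the RNA-level repertoire, which is the discrepancy the paper discusses.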
Wang, Degeng
2008-12-01
Discrepancy between the abundance of cognate protein and RNA molecules is frequently observed. A theoretical understanding of this discrepancy remains elusive, and it is frequently described as surprises and/or technical difficulties in the literature. Protein and RNA represent different steps of the multi-stepped cellular genetic information flow process, in which they are dynamically produced and degraded. This paper explores a comparison with a similar process in computers-multi-step information flow from storage level to the execution level. Functional similarities can be found in almost every facet of the retrieval process. Firstly, common architecture is shared, as the ribonome (RNA space) and the proteome (protein space) are functionally similar to the computer primary memory and the computer cache memory, respectively. Secondly, the retrieval process functions, in both systems, to support the operation of dynamic networks-biochemical regulatory networks in cells and, in computers, the virtual networks (of CPU instructions) that the CPU travels through while executing computer programs. Moreover, many regulatory techniques are implemented in computers at each step of the information retrieval process, with a goal of optimizing system performance. Cellular counterparts can be easily identified for these regulatory techniques. In other words, this comparative study attempted to utilize theoretical insight from computer system design principles as catalysis to sketch an integrative view of the gene expression process, that is, how it functions to ensure efficient operation of the overall cellular regulatory network. In context of this bird's-eye view, discrepancy between protein and RNA abundance became a logical observation one would expect. It was suggested that this discrepancy, when interpreted in the context of system operation, serves as a potential source of information to decipher regulatory logics underneath biochemical network operation.
NASA Astrophysics Data System (ADS)
Laforest, Martin
Quantum information processing has been the subject of countless discoveries since the early 1990s. It is believed to be the way of the future for computation: using quantum systems permits one to perform computation exponentially faster than on a regular classical computer. Unfortunately, quantum systems that are not isolated do not behave well: they tend to lose their quantum nature due to the presence of the environment. If key information is known about the noise present in the system, methods such as quantum error correction have been developed to reduce the errors introduced by the environment during a given quantum computation. In order to harness the quantum world and implement the theoretical ideas of quantum information processing and quantum error correction, it is imperative to understand and quantify the noise present in the quantum processor and to benchmark the quality of the control over the qubits. The usual techniques for estimating the noise or the control are based on quantum process tomography (QPT), which, unfortunately, demands an exponential amount of resources. This thesis presents work towards the characterization of noisy processes in an efficient manner. The protocols are developed in a purely abstract setting with no system-dependent variables. To circumvent the exponential nature of quantum process tomography, three different efficient protocols are proposed and experimentally verified. The first protocol uses the idea of quantum error correction to extract relevant parameters about a given noise model, namely the correlation between the dephasing of two qubits. Following that is a protocol using randomization and symmetrization to extract the probability that a given number of qubits are simultaneously corrupted in a quantum memory, regardless of the specifics of the error and of which qubits are affected.
Finally, a last protocol, still using randomization ideas, is developed to estimate the average fidelity per computational gate for single- and multi-qubit systems. Even though liquid-state NMR is argued to be unsuitable for scalable quantum information processing, it remains the best test-bed system on which to experimentally implement, verify and develop protocols aimed at increasing the control over general quantum information processors. For this reason, all the protocols described in this thesis have been implemented in liquid-state NMR, which then led to further development of control and analysis techniques.
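As a rough illustration of how an average gate fidelity can be extracted from randomized-sequence data, the sketch below fits the standard randomized-benchmarking survival model P(m) = a·p^m + b and converts the decay constant p to an average fidelity F = p + (1 − p)/d. The model, the fixed constants a = b = 1/2, the single-qubit dimension d = 2, and the log-linear fit are assumptions for illustration, not the thesis's exact protocol.

```python
import math

def rb_decay(m, p, a=0.5, b=0.5):
    # Standard randomized-benchmarking survival model: P(m) = a * p**m + b
    return a * p ** m + b

def estimate_fidelity(lengths, survivals, a=0.5, b=0.5, d=2):
    """Fit the decay constant p by log-linear regression of (P - b)/a = p**m,
    then convert to average gate fidelity F = p + (1 - p)/d."""
    xs = list(lengths)
    ys = [math.log((s - b) / a) for s in survivals]  # log(p**m) = m * log(p)
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    p = math.exp(slope)
    return p + (1 - p) / d

# Noiseless synthetic data with true decay constant p = 0.98
lengths = [1, 2, 4, 8, 16]
survivals = [rb_decay(m, 0.98) for m in lengths]
```

On this synthetic data the fit recovers p = 0.98 and hence F = 0.99; real experiments would average many random sequences per length before fitting.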
Williamson, J; Ranyard, R; Cuthbert, L
2000-05-01
This study is an evaluation of a process tracing method developed for naturalistic decisions, in this case a consumer choice task. The method is based on Huber et al.'s (1997) Active Information Search (AIS) technique, but develops it by providing spoken rather than written answers to respondents' questions, and by including think aloud instructions. The technique is used within a conversation-based situation, rather than the respondent thinking aloud 'into an empty space', as is conventionally the case in think aloud techniques. The method results in a concurrent verbal protocol as respondents make their decisions, and a retrospective report in the form of a post-decision summary. The method was found to be virtually non-reactive in relation to think aloud, although the variable of Preliminary Attribute Elicitation showed some evidence of reactivity. This was a methodological evaluation, and as such the data reported are essentially descriptive. Nevertheless, the data obtained indicate that the method is capable of producing information about decision processes which could have theoretical importance in terms of evaluating models of decision-making.
A Changing Information Environment Challenges Public Administrations.
ERIC Educational Resources Information Center
Otten, Klaus W.
1989-01-01
Describes ways in which information handling techniques will eventually be used in public administration, focusing on technologies that automate routine administrative processes and support decision making. The need to develop a long range concept for continued full employment of administrative staff is discussed. (two references) (CLB)
Speech Recognition for A Digital Video Library.
ERIC Educational Resources Information Center
Witbrock, Michael J.; Hauptmann, Alexander G.
1998-01-01
Production of the meta-data supporting the Informedia Digital Video Library interface is automated using techniques derived from artificial intelligence research. Speech recognition and natural-language processing, information retrieval, and image analysis are applied to produce an interface that helps users locate information and navigate more…
3D Tracking Based Augmented Reality for Cultural Heritage Data Management
NASA Astrophysics Data System (ADS)
Battini, C.; Landi, G.
2015-02-01
The development of contactless documentation techniques is allowing researchers to collect high volumes of three-dimensional data in a short time but with high levels of accuracy. The digitalisation of cultural heritage opens up the possibility of using image processing and analysis, and computer graphics techniques, to preserve this heritage for future generations, augmenting it with additional information or with new possibilities for its enjoyment and use. The collection of precise datasets about cultural heritage status is crucial for its interpretation and conservation, and during restoration processes. The application of digital-imaging solutions for feature extraction, image data-analysis techniques, and three-dimensional reconstruction of ancient artworks allows the creation of multidimensional models that can incorporate information coming from heterogeneous data sets, research results and historical sources. Real objects can be scanned and reconstructed virtually, with high levels of data accuracy and resolution. Real-time visualisation software and hardware is rapidly evolving, and complex three-dimensional models can be interactively visualised and explored in applications developed for mobile devices. This paper will show how a 3D reconstruction of an object, with multiple layers of information, can be stored and visualised through a mobile application that allows interaction with a physical object for its study and analysis, using 3D Tracking based Augmented Reality techniques.
Methodological development of topographic correction in 2D/3D ToF-SIMS images using AFM images
NASA Astrophysics Data System (ADS)
Jung, Seokwon; Lee, Nodo; Choi, Myungshin; Lee, Jungmin; Cho, Eunkyunng; Joo, Minho
2018-02-01
Time-of-flight secondary-ion mass spectrometry (ToF-SIMS) is an emerging technique that provides chemical information directly from the surface of electronic materials, e.g. OLEDs and solar cells. It is a very versatile and highly sensitive mass spectrometric technique that provides surface molecular information together with its lateral distribution as a two-dimensional (2D) molecular image. Extending the usefulness of ToF-SIMS, a 3D molecular image can be generated by acquiring multiple 2D images in a stack. These ToF-SIMS imaging techniques provide insight into the complex structures of unknown composition in electronic materials. However, one drawback of ToF-SIMS is that it cannot represent topographical information in 2D and 3D mapping images. To overcome this technical limitation, topographic information from an ex-situ technique such as atomic force microscopy (AFM) has been combined with the chemical information from SIMS, providing both chemical and physical information in one image. The key to combining the two images obtained from the ToF-SIMS and AFM techniques is the image-processing algorithm, which performs resizing and alignment by comparing specific pixel information in each image. In this work, we present the methodological development of a semiautomatic alignment and 3D structure interpolation system for the combination of 2D/3D images obtained by ToF-SIMS and AFM measurements, which provides useful analytical information in a single representation.
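The resize-and-align step the abstract describes can be sketched minimally: resample one map so the two share pixel dimensions, then search integer offsets for the best overlap. The nearest-neighbour resampling and the exhaustive sum-of-squared-differences search below are assumptions for illustration; the paper's actual algorithm is not specified in the abstract.

```python
def resize_nn(img, out_h, out_w):
    """Nearest-neighbour resize so two maps share pixel dimensions."""
    in_h, in_w = len(img), len(img[0])
    return [[img[r * in_h // out_h][c * in_w // out_w]
             for c in range(out_w)] for r in range(out_h)]

def best_offset(ref, mov, max_shift=2):
    """Exhaustive search for the integer (dy, dx) minimising the mean
    squared difference over the overlapping region of the two maps."""
    h, w = len(ref), len(ref[0])
    best, best_ssd = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            ssd = count = 0
            for r in range(h):
                for c in range(w):
                    rr, cc = r + dy, c + dx
                    if 0 <= rr < h and 0 <= cc < w:
                        ssd += (ref[r][c] - mov[rr][cc]) ** 2
                        count += 1
            ssd /= count  # normalise so small overlaps aren't favoured
            if ssd < best_ssd:
                best_ssd, best = ssd, (dy, dx)
    return best

# Synthetic example: the 'AFM' map is the 'SIMS' map shifted down by one pixel.
ref = [[0] * 5 for _ in range(5)]; ref[2][2] = 9
mov = [[0] * 5 for _ in range(5)]; mov[3][2] = 9
```

Here best_offset(ref, mov) recovers the (1, 0) displacement; a production pipeline would add sub-pixel interpolation, which the paper's 3D interpolation stage addresses.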
NASA Astrophysics Data System (ADS)
McFadden, D.; Tavakkoli, A.; Regenbrecht, J.; Wilson, B.
2017-12-01
Virtual Reality (VR) and Augmented Reality (AR) applications have recently seen impressive growth, thanks to the advent of commercial Head Mounted Displays (HMDs). This new visualization era has opened the possibility of presenting researchers from multiple disciplines with data visualization techniques not possible via traditional 2D screens. In a purely VR environment researchers are presented with the visual data in a virtual environment, whereas in a purely AR application a virtual object is projected into the real world with which researchers can interact. There are several limitations to purely VR or AR applications in the context of remote planetary exploration. For example, in a purely VR environment, the contents of the planet surface (e.g. rocks, terrain, or other features) must be created off-line from a multitude of images using image processing techniques to generate the 3D mesh data that will populate the virtual surface of the planet. This process usually takes a tremendous amount of computational resources and cannot be delivered in real time. As an alternative, video frames may be superimposed on the virtual environment to save processing time. However, such rendered video frames will lack 3D visual information, i.e. depth information. In this paper, we present a technique that utilizes a remotely situated robot's stereoscopic cameras to provide a live visual feed from the real world into the virtual environment in which planetary scientists are immersed. Moreover, the proposed technique blends the virtual environment with the real world in such a way as to preserve both the depth and visual information from the real world while allowing for the sensation of immersion when the entire sequence is viewed via an HMD such as the Oculus Rift. The figure shows the virtual environment with an overlay of the real-world stereoscopic video being presented in real time into the virtual environment.
Notice the preservation of the object's shape, shadows, and depth information. The distortions shown in the image are due to the rendering of the stereoscopic data into a 2D image for the purposes of taking screenshots.
NASA Technical Reports Server (NTRS)
Dickinson, William B.
1995-01-01
An Earth Sciences Data and Information System (ESDIS) Project Management Plan (PMP) is prepared. An ESDIS Project Systems Engineering Management Plan (SEMP) consistent with the developed PMP is also prepared. ESDIS and related EOS program requirements developments, management and analysis processes are evaluated. Opportunities to improve the effectiveness of these processes and program/project responsiveness to requirements are identified. Overall ESDIS cost estimation processes are evaluated, and recommendations to improve cost estimating and modeling techniques are developed. ESDIS schedules and scheduling tools are evaluated. Risk assessment, risk mitigation strategies and approaches, and use of risk information in management decision-making are addressed.
Code of Federal Regulations, 2010 CFR
2010-07-01
... being used will be based on information available to the Administrator, which may include, but is not... techniques, or the control system and process monitoring equipment during a malfunction in a manner... the process and control system monitoring equipment, and shall include a standardized checklist to...
Code of Federal Regulations, 2011 CFR
2011-07-01
... being used will be based on information available to the Administrator, which may include, but is not... techniques, or the control system and process monitoring equipment during a malfunction in a manner... the process and control system monitoring equipment, and shall include a standardized checklist to...
A Program in Semiconductor Processing.
ERIC Educational Resources Information Center
McConica, Carol M.
1984-01-01
A graduate program at Colorado State University which focuses on integrated circuit processing is described. The program utilizes courses from several departments while allowing students to apply chemical engineering techniques to an integrated circuit fabrication research topic. Information on employment of chemical engineers by electronics…
NASA Astrophysics Data System (ADS)
García Plaza, E.; Núñez López, P. J.
2018-01-01
On-line monitoring of surface finish in machining processes has proven to be a substantial advancement over traditional post-process quality control techniques by reducing inspection times and costs and by avoiding the manufacture of defective products. This study applied techniques for processing cutting force signals based on the wavelet packet transform (WPT) method for the monitoring of surface finish in computer numerical control (CNC) turning operations. The behaviour of 40 mother wavelets was analysed using three techniques: global packet analysis (G-WPT), and the application of two packet reduction criteria: maximum energy (E-WPT) and maximum entropy (SE-WPT). The optimum signal decomposition level (Lj) was determined to eliminate noise and to obtain information correlated to surface finish. The results obtained with the G-WPT method provided an in-depth analysis of cutting force signals, and frequency ranges and signal characteristics were correlated to surface finish with excellent results in the accuracy and reliability of the predictive models. The radial and tangential cutting force components at low frequency provided most of the information for the monitoring of surface finish. The E-WPT and SE-WPT packet reduction criteria substantially reduced signal processing time, but at the expense of discarding packets with relevant information, which impoverished the results. The G-WPT method was observed to be an ideal procedure for processing cutting force signals applied to the real-time monitoring of surface finish, and was estimated to be highly accurate and reliable at a low analytical-computational cost.
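The G-WPT versus E-WPT/SE-WPT distinction can be sketched with the simplest mother wavelet. The study compared 40 mother wavelets applied to cutting-force signals; the Haar-only decomposition, the toy signal, and the packet-selection calls below are an illustrative reduction, not the paper's implementation.

```python
import math

SQRT2 = math.sqrt(2.0)

def haar_split(x):
    """One orthonormal Haar analysis step: approximation and detail halves."""
    approx = [(x[i] + x[i + 1]) / SQRT2 for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / SQRT2 for i in range(0, len(x), 2)]
    return approx, detail

def wpt(x, level):
    """Full wavelet packet tree: 2**level packets at decomposition level Lj."""
    packets = [list(x)]
    for _ in range(level):
        nxt = []
        for p in packets:
            a, d = haar_split(p)
            nxt.extend([a, d])
        packets = nxt
    return packets

def energy(p):
    return sum(v * v for v in p)

def shannon_entropy(p):
    e = energy(p)
    if e == 0.0:
        return 0.0
    return -sum((v * v / e) * math.log(v * v / e) for v in p if v != 0.0)

signal = [float(i % 8) for i in range(64)]  # toy stand-in for a force signal
packets = wpt(signal, 3)                    # G-WPT: analyse all 8 packets
e_wpt = max(packets, key=energy)            # E-WPT: keep the max-energy packet
se_wpt = max(packets, key=shannon_entropy)  # SE-WPT: keep the max-entropy packet
```

Because the Haar basis is orthonormal, the packet energies sum to the signal energy, which is what makes energy a meaningful packet-reduction criterion; the reduction criteria discard packets, trading information for speed exactly as the study reports.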
"Glitch Logic" and Applications to Computing and Information Security
NASA Technical Reports Server (NTRS)
Stoica, Adrian; Katkoori, Srinivas
2009-01-01
This paper introduces a new method of information processing in digital systems, and discusses its potential benefits to computing and information security. The new method exploits glitches caused by delays in logic circuits for carrying and processing information. Glitch processing is hidden from conventional logic analyses and undetectable by traditional reverse engineering techniques. It enables the creation of new logic design methods that allow for an additional controllable "glitch logic" processing layer embedded into conventional synchronous digital circuits as a hidden/covert information flow channel. The combination of synchronous logic with specific glitch logic design acting as an additional computing channel reduces the number of equivalent logic designs resulting from synthesis, thus implicitly reducing the possibility of modification and/or tampering with the design. The hidden information channel produced by the glitch logic can be used: 1) for covert computing/communication, 2) to prevent reverse engineering, tampering, and alteration of design, and 3) to act as a channel for information infiltration/exfiltration and propagation of viruses/spyware/Trojan horses.
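The underlying mechanism, a transient pulse produced when two logic paths with unequal delays reconverge, can be shown with a toy discrete-time simulation. The delays, timings, and gate choice below are arbitrary assumptions for illustration, not the paper's designs: an XOR of a signal with its own delayed copy pulses high exactly while the two paths disagree, a pulse that is invisible to an analysis that only samples at clock edges.

```python
def delay(signal, d):
    """Model a wire/buffer delay of d time steps (initial value held)."""
    return [signal[0]] * d + signal[:len(signal) - d]

def xor_gate(a, b):
    """Ideal XOR evaluated at each time step."""
    return [x ^ y for x, y in zip(a, b)]

# A step input 0 -> 1 at t = 4, with a 2-step delay imbalance between the
# two XOR inputs: the output glitches high for exactly 2 time steps.
t_steps = 12
a = [0] * 4 + [1] * (t_steps - 4)
y = xor_gate(a, delay(a, 2))
```

In a steady state y is always 0, so a clock-edge (synchronous) view of this circuit sees a constant; the information rides entirely in the transient, which is the covert-channel property the paper exploits.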
Assessing clutter reduction in parallel coordinates using image processing techniques
NASA Astrophysics Data System (ADS)
Alhamaydh, Heba; Alzoubi, Hussein; Almasaeid, Hisham
2018-01-01
Information visualization has emerged as an important research field for multidimensional data and correlation analysis in recent years. Parallel coordinates (PCs) are one of the popular techniques for visualizing high-dimensional data. A problem with the PC technique is that it suffers from crowding: clutter that hides important data and obfuscates the information. Earlier research has been conducted to reduce clutter without loss of data content. We introduce the use of image processing techniques as an approach for assessing the performance of clutter reduction techniques in PCs. We use histogram analysis as our first measure, where the mean feature of the color histograms of the possible alternative orderings of coordinates for the PC images is calculated and compared. The second measure is the contrast feature extracted from the texture of PC images based on gray-level co-occurrence matrices. The results show that the best PC image is the one that has the minimal mean value of the color histogram feature and the maximal contrast value of the texture feature. In addition to its simplicity, the proposed assessment method has the advantage of objectively assessing alternative orderings of PC visualization.
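The two measures can be sketched directly on a small grey-level image: the mean of the grey-level histogram, and a contrast feature from a grey-level co-occurrence matrix (GLCM). This is a minimal grey-scale rendition under assumed parameters (4 grey levels, a horizontal-neighbour offset only); the paper works on colour histograms of rendered PC images.

```python
def histogram_mean(img, levels=4):
    """Mean of the grey-level histogram of an image (list of rows)."""
    hist = [0] * levels
    for row in img:
        for v in row:
            hist[v] += 1
    total = sum(hist)
    return sum(g * h for g, h in enumerate(hist)) / total

def glcm_contrast(img, levels=4):
    """Contrast from a horizontal-neighbour grey-level co-occurrence matrix:
    sum over (i, j) of P(i, j) * (i - j)**2."""
    glcm = [[0] * levels for _ in range(levels)]
    pairs = 0
    for row in img:
        for a, b in zip(row, row[1:]):
            glcm[a][b] += 1
            pairs += 1
    return sum(glcm[i][j] / pairs * (i - j) ** 2
               for i in range(levels) for j in range(levels))
```

A flat image has zero contrast, while an image alternating extreme grey levels maximises it; by the paper's criterion, the ordering whose image minimises the histogram mean and maximises the contrast is the least cluttered.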
2004-06-01
Information Systems, Faculty of ICT, International Islamic University, Malaysia. Several techniques for evaluating a groupware...inspection-based techniques couldn't be carried out in other parts of Pakistan where the IT industry has mushroomed in the past few years. Nevertheless...there are no set standards for using any particular technique. Evaluating a groupware interface is an evolving process and requires more investigation
Mohamad, Nur Royhaila; Marzuki, Nur Haziqah Che; Buang, Nor Aziah; Huyop, Fahrul; Wahab, Roswanira Abdul
2015-01-01
The current demands of sustainable green methodologies have increased the use of enzymatic technology in industrial processes. Employment of enzymes as biocatalysts offers the benefits of mild reaction conditions, biodegradability and catalytic efficiency. The harsh conditions of industrial processes, however, increase the propensity for enzyme destabilization, shortening their industrial lifespan. Consequently, the technology of enzyme immobilization provides an effective means of circumventing these concerns by enhancing enzyme catalytic properties, simplifying downstream processing, and improving operational stability. There are several techniques used to immobilize enzymes onto supports, ranging from reversible physical adsorption and ionic linkages to irreversible stable covalent bonds. Such techniques produce immobilized enzymes of varying stability due to changes in the surface microenvironment and the degree of multipoint attachment. Hence, it is mandatory to obtain information about the structure of the enzyme protein following interaction with the support surface, as well as about interactions of the enzymes with other proteins. Characterization technologies at the nanoscale level for studying enzymes immobilized on surfaces are crucial to obtain valuable qualitative and quantitative information, including morphological visualization of the immobilized enzymes. These technologies are pertinent to assessing the efficacy of an immobilization technique and to the development of future enzyme immobilization strategies. PMID:26019635
Optical Processing Techniques For Pseudorandom Sequence Prediction
NASA Astrophysics Data System (ADS)
Gustafson, Steven C.
1983-11-01
Pseudorandom sequences are series of apparently random numbers generated, for example, by linear or nonlinear feedback shift registers. An important application of these sequences is in spread spectrum communication systems, in which, for example, the transmitted carrier phase is digitally modulated rapidly and pseudorandomly and in which the information to be transmitted is incorporated as a slow modulation in the pseudorandom sequence. In this case the transmitted information can be extracted only by a receiver that uses for demodulation the same pseudorandom sequence used by the transmitter, and thus this type of communication system has a very high immunity to third-party interference. However, if a third party can predict in real time the probable future course of the transmitted pseudorandom sequence given past samples of this sequence, then interference immunity can be significantly reduced. In this application effective pseudorandom sequence prediction techniques should be (1) applicable in real time to rapid (e.g., megahertz) sequence generation rates, (2) applicable to both linear and nonlinear pseudorandom sequence generation processes, and (3) applicable to error-prone past sequence samples of limited number and continuity. Certain optical processing techniques that may meet these requirements are discussed in this paper. In particular, techniques based on incoherent optical processors that perform general linear transforms or (more specifically) matrix-vector multiplications are considered. Computer simulation examples are presented which indicate that significant prediction accuracy can be obtained using these transforms for simple pseudorandom sequences. However, the useful prediction of more complex pseudorandom sequences will probably require the application of more sophisticated optical processing techniques.
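For the linear case, prediction from past samples is a classical result: the Berlekamp-Massey algorithm recovers the feedback taps of a linear feedback shift register (LFSR) from 2L consecutive error-free output bits, after which the entire future sequence follows. The digital (non-optical) sketch below illustrates that baseline; the paper's optical matrix-vector processors address the harder real-time, noisy, and nonlinear settings.

```python
def lfsr(taps, seed, n):
    """Generate n bits of the linear recurrence s[i] = XOR of s[i - t] over taps."""
    s = list(seed)
    while len(s) < n:
        bit = 0
        for t in taps:
            bit ^= s[-t]
        s.append(bit)
    return s[:n]

def berlekamp_massey(s):
    """Shortest LFSR (connection polynomial c, length L) generating bits s, over GF(2)."""
    n = len(s)
    c = [1] + [0] * n
    b = [1] + [0] * n
    L, m = 0, -1
    for i in range(n):
        d = s[i]
        for j in range(1, L + 1):
            d ^= c[j] & s[i - j]
        if d:                         # discrepancy: update the register
            t = c[:]
            shift = i - m
            for j in range(n + 1 - shift):
                c[j + shift] ^= b[j]
            if 2 * L <= i:
                L, m, b = i + 1 - L, i, t
    return c[:L + 1], L

def predict_next(s, c, L):
    """Next bit implied by the recovered recurrence."""
    bit = 0
    for j in range(1, L + 1):
        bit ^= c[j] & s[-j]
    return bit

# Recover a 4-stage LFSR (s[i] = s[i-1] XOR s[i-4]) from just 8 bits,
# then predict the rest of the sequence exactly.
seq = lfsr([1, 4], [0, 0, 0, 1], 30)
conn, length = berlekamp_massey(seq[:8])
```

This is why requirement (2) in the abstract matters: linear generators fall to 2L observed bits, so practical spread-spectrum security rests on nonlinear generation, where prediction is the genuinely hard problem.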
Discrepant Questioning as a Tool To Build Complex Mental Models of Respiration.
ERIC Educational Resources Information Center
Rea-Ramirez, Mary Anne; Nunez-Oviedo, Maria C.
Discrepant questioning is a teaching technique that can help students "unlearn" misconceptions and process science ideas for deep understanding. Discrepant questioning is a technique in which teachers question students in a way that requires them to examine their ideas or models, without giving information prematurely to the student or passing…
State-of-the-art of optics in China reviewed
NASA Astrophysics Data System (ADS)
Wang, Daheng; Wo, Xinneng
1985-06-01
The state-of-the-art of optics and applied optics in China is reviewed. Developments in lasers, infrared and opto-electronic techniques, optical metrology, high-speed photography, holography and information processing, nonlinear optics, optical fiber communications and optical techniques are described. Further development of optics and applied optics in China are proposed.
Usefulness of Simultaneous EEG-NIRS Recording in Language Studies
ERIC Educational Resources Information Center
Wallois, F.; Mahmoudzadeh, M.; Patil, A.; Grebe, R.
2012-01-01
One of the most challenging tasks in neuroscience language studies is investigating the brain's ability to integrate and process information. This task can only be successfully addressed by applying various assessment techniques integrated into a multimodal approach. Each of these techniques has its advantages and disadvantages, but helps to…
An Empirical Test of the Nominal Group Technique in State Solar Energy Planning.
ERIC Educational Resources Information Center
Stephenson, Blair Y.; And Others
1982-01-01
Investigated use of the Nominal Group Technique (NGT) as an informational input mechanism into the formulation of a Solar Energy Plan. Data collected from a questionnaire indicated that the NGT was rated as being a highly effective mechanism providing input into the solar energy planning process. (Author/RC)
How Students Learn: Improving Teaching Techniques for Business Discipline Courses
ERIC Educational Resources Information Center
Cluskey, Bob; Elbeck, Matt; Hill, Kathy L.; Strupeck, Dave
2011-01-01
The focus of this paper is to familiarize business discipline faculty with cognitive psychology theories of how students learn together with teaching techniques to assist and improve student learning. Student learning can be defined as the outcome from the retrieval (free recall) of desired information. Student learning occurs in two processes.…
Signal processing methods for MFE plasma diagnostics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Candy, J.V.; Casper, T.; Kane, R.
1985-02-01
The application of various signal processing methods to extract energy-storage information from plasma diamagnetism sensors during physics experiments on the Tandem Mirror Experiment-Upgrade (TMX-U) is discussed. We show how these processing techniques can be used to decrease the uncertainty in the corresponding sensor measurements. The algorithms suggested are implemented using SIG, an interactive signal processing package developed at LLNL.
Localization Using Visual Odometry and a Single Downward-Pointing Camera
NASA Technical Reports Server (NTRS)
Swank, Aaron J.
2012-01-01
Stereo imaging is a technique commonly employed for vision-based navigation. For such applications, two images are acquired from different vantage points and then compared using transformations to extract depth information. The technique is commonly used in robotics for obstacle avoidance or for Simultaneous Localization And Mapping (SLAM). Yet the process requires a number of image processing steps and therefore tends to be CPU-intensive, which limits the real-time data rate and its use in power-limited applications. Evaluated here is a technique where a monocular camera is used for vision-based odometry. In this work, an optical flow technique with feature recognition is performed to generate odometry measurements. The visual odometry sensor measurements are intended to be used as control inputs or measurements in a sensor fusion algorithm using low-cost MEMS-based inertial sensors to provide improved localization information. Presented here are visual odometry results which demonstrate the challenges associated with using ground-pointing cameras for visual odometry. The focus is on rover-based robotic applications for localization within GPS-denied environments.
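The essence of monocular visual odometry, estimating a per-frame image translation and integrating it into a position, can be sketched in a much-simplified form. The projection-profile matching below is an assumption for illustration (the actual sensor uses optical flow with feature recognition), but it shows the accumulate-frame-shifts structure of the pipeline.

```python
def shift_1d(a, b, max_shift=2):
    """Integer shift s minimising the mean squared error between profile a
    and profile b displaced by s."""
    best, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        err = n = 0
        for i in range(len(a)):
            j = i + s
            if 0 <= j < len(b):
                err += (a[i] - b[j]) ** 2
                n += 1
        err /= n
        if err < best_err:
            best_err, best = err, s
    return best

def frame_shift(f0, f1, max_shift=2):
    """Per-frame (dy, dx) estimated from row-sum and column-sum projections."""
    rows0 = [sum(r) for r in f0]
    rows1 = [sum(r) for r in f1]
    cols0 = [sum(r[c] for r in f0) for c in range(len(f0[0]))]
    cols1 = [sum(r[c] for r in f1) for c in range(len(f1[0]))]
    return shift_1d(rows0, rows1, max_shift), shift_1d(cols0, cols1, max_shift)

def integrate_odometry(frames):
    """Visual odometry: accumulate frame-to-frame shifts into a position."""
    y = x = 0
    for f0, f1 in zip(frames, frames[1:]):
        dy, dx = frame_shift(f0, f1)
        y += dy
        x += dx
    return y, x

def spike_frame(h, w, r, c):
    """Synthetic ground image with a single bright feature at (r, c)."""
    f = [[0] * w for _ in range(h)]
    f[r][c] = 9
    return f

# A feature drifting (1,1) then (1,2) across three frames integrates to (2,3).
frames = [spike_frame(6, 6, 1, 1), spike_frame(6, 6, 2, 2), spike_frame(6, 6, 3, 4)]
```

Because each step's error adds to the total, integrated estimates drift over time, which is one of the ground-pointing-camera challenges the report demonstrates and a reason to fuse the odometry with inertial sensors.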
Staccini, Pascal; Joubert, Michel; Quaranta, Jean-François; Fieschi, Marius
2003-01-01
Today, the economic and regulatory environment is pressuring hospitals and healthcare professionals to account for their results and methods of care delivery. The evaluation of the quality and safety of care, the traceability of the acts performed and the evaluation of practices are some of the reasons underpinning current interest in clinical and hospital information systems. The structured collection of users' needs and system requirements is fundamental when installing such systems. This stage takes time, is generally misunderstood by caregivers, and is of limited efficacy for analysis. We used a modelling technique designed for manufacturing processes (SADT: Structured Analysis and Design Technique). We enhanced the initial activity model of this method and programmed a web-based tool in an object-oriented environment. This tool makes it possible to extract the data dictionary from the description of a given process and to locate documents (procedures, recommendations, instructions). Aimed at structuring needs and storing information provided by the teams directly involved regarding the workings of an institution (or at least part of it), the process-mapping approach has an important contribution to make to the analysis of clinical information systems.
NASA Technical Reports Server (NTRS)
Anderson, B. H.; Putt, C. W.; Giamati, C. C.
1981-01-01
Color coding techniques used in the processing of remote sensing imagery were adapted and applied to the fluid dynamics problems associated with turbofan mixer nozzles. The computer generated color graphics were found to be useful in reconstructing the measured flow field from low resolution experimental data to give more physical meaning to this information and in scanning and interpreting the large volume of computer generated data from the three dimensional viscous computer code used in the analysis.
Applied in situ product recovery in ABE fermentation
Lalander, Carl‐Axel; Lee, Jonathan G. M.; Davies, E. Timothy; Harvey, Adam P.
2017-01-01
The production of biobutanol is hindered by the product's toxicity to the bacteria, which limits the productivity of the process. In situ product recovery of butanol can improve the productivity by removing the source of inhibition. This paper reviews in situ product recovery techniques applied to the acetone butanol ethanol fermentation in a stirred tank reactor. Methods of in situ recovery include gas stripping, vacuum fermentation, pervaporation, liquid–liquid extraction, perstraction, and adsorption, all of which have been investigated for the acetone, butanol, and ethanol fermentation. All techniques have shown an improvement in substrate utilization, yield, productivity, or some combination of these. Different fermentation modes favored different techniques. For batch processing, gas stripping and pervaporation were most favorable, but in fed-batch fermentations gas stripping and adsorption were most promising. During continuous processing, perstraction appeared to offer the best improvement. The use of hybrid techniques can increase the final product concentration beyond that of single-stage techniques. Therefore, the selection of an in situ product recovery technique would require comparable information on the energy demand and economics of the process. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:563–579, 2017 PMID:28188696
48 CFR 15.506 - Postaward debriefing of offerors.
Code of Federal Regulations, 2010 CFR
2010-10-01
...) The contracting officer should normally chair any debriefing session held. Individuals who conducted...) Privileged or confidential manufacturing processes and techniques; (3) Commercial and financial information...
Knowledge Discovery and Data Mining in Iran's Climatic Researches
NASA Astrophysics Data System (ADS)
Karimi, Mostafa
2013-04-01
Advances in measurement technology and data collection have made databases ever larger, and large databases require powerful analysis tools. The iterative process of acquiring knowledge from processed data takes place, in one form or another, in every scientific field, but when data volumes grow large, traditional methods can no longer cope. In recent years the use of databases has expanded across the sciences, and in particular the use of atmospheric databases in climatology. In addition, the growing volume of data generated by climate models poses a challenge for the extraction of hidden patterns and knowledge. The approach that has emerged in recent years combines the knowledge discovery in databases (KDD) process with data mining techniques, drawing on concepts from machine learning, artificial intelligence, and expert systems. Data mining is the analytic process of mining massive volumes of data; its ultimate goal is access to information and, finally, knowledge. Climatology is a science that works with large and varied data, and the goal of climate data mining is to derive information from massive atmospheric and non-atmospheric data. Knowledge discovery, in effect, performs these activities in a logical, predetermined, and largely automatic process. The goal of this research is to survey the use of knowledge discovery and data mining techniques in Iranian climate research, through content (descriptive) analysis classified by method and by topic. The results show that Iranian climate research most often applies clustering, chiefly k-means and Ward's method, and that precipitation and atmospheric circulation patterns are the topics most frequently addressed.
Although several studies of geographic and climatic problems have applied statistical techniques such as clustering and pattern extraction, given the distinction between statistics and data mining it cannot be said that Iranian climate studies have genuinely used data mining and knowledge discovery techniques. It is therefore necessary to apply the KDD approach and data mining techniques in climatic studies, particularly in interpreting climate modeling results.
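Since k-means is the clustering technique the survey found most widely applied, a self-contained one-dimensional sketch may help fix ideas; the precipitation-like values and the simple initialization are invented for illustration:

```python
# Minimal 1-D k-means, e.g. for grouping stations by annual precipitation.
# Initialization picks k roughly evenly spaced values from the sorted data.

def kmeans_1d(values, k, iters=20):
    centroids = sorted(values)[::max(1, len(values) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda j: abs(v - centroids[j]))
            clusters[nearest].append(v)
        # Recompute each centroid as its cluster mean; keep it if empty.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters
```

Ward's method, the other technique named above, differs in being hierarchical (agglomerative, merging the pair of clusters that least increases within-cluster variance) rather than iterative partitioning.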
Techniques and potential capabilities of multi-resolutional information (knowledge) processing
NASA Technical Reports Server (NTRS)
Meystel, A.
1989-01-01
A concept of nested hierarchical (multi-resolutional, pyramidal) information (knowledge) processing is introduced for a variety of systems including data and/or knowledge bases, vision, control, and manufacturing systems, industrial automated robots, and (self-programmed) autonomous intelligent machines. A set of practical recommendations is presented using a case study of a multiresolutional object representation. It is demonstrated here that any intelligent module transforms (sometimes irreversibly) the knowledge it deals with, and this transformation affects the subsequent computation processes, e.g., those of decision and control. Several types of knowledge transformation are reviewed. Conditions are analyzed whose satisfaction is required for the organization and processing of redundant information (knowledge) in multi-resolutional systems; providing a definite degree of redundancy is one of these conditions.
Recovering the fine structures in solar images
NASA Technical Reports Server (NTRS)
Karovska, Margarita; Habbal, S. R.; Golub, L.; Deluca, E.; Hudson, Hugh S.
1994-01-01
Several examples demonstrate the capability of the blind iterative deconvolution (BID) technique to recover the real point spread function when only limited a priori information about its characteristics is available. To demonstrate the potential of image post-processing for probing the fine scale and temporal variability of the solar atmosphere, the BID technique is applied to different samples of solar observations from space. The BID technique was originally proposed for correcting the effects of atmospheric turbulence on optical images. The processed images provide a detailed view of the spatial structure of the solar atmosphere at different heights in regions with different large-scale magnetic field structures.
Recent Advances in Techniques for Hyperspectral Image Processing
NASA Technical Reports Server (NTRS)
Plaza, Antonio; Benediktsson, Jon Atli; Boardman, Joseph W.; Brazile, Jason; Bruzzone, Lorenzo; Camps-Valls, Gustavo; Chanussot, Jocelyn; Fauvel, Mathieu; Gamba, Paolo; Gualtieri, Anthony;
2009-01-01
Imaging spectroscopy, also known as hyperspectral imaging, has been transformed in less than 30 years from being a sparse research tool into a commodity product available to a broad user community. Currently, there is a need for standardized data processing techniques able to take into account the special properties of hyperspectral data. In this paper, we provide a seminal view on recent advances in techniques for hyperspectral image processing. Our main focus is on the design of techniques able to deal with the high-dimensional nature of the data and to integrate the spatial and spectral information. Performance of the discussed techniques is evaluated in different analysis scenarios. To satisfy time-critical constraints in specific applications, we also develop efficient parallel implementations of some of the discussed algorithms. Combined, these parts provide an excellent snapshot of the state of the art in those areas, and offer a thoughtful perspective on future potentials and emerging challenges in the design of robust hyperspectral imaging algorithms.
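As one concrete instance of a technique that works directly in the spectral dimension, here is a sketch of the classical spectral angle mapper (SAM), which classifies a pixel by the angle between its spectrum and each reference spectrum; the reference labels and band values are invented:

```python
# Spectral angle mapper: a pixel spectrum is assigned to the reference
# spectrum with which it subtends the smallest angle. Being scale-
# invariant, the angle is insensitive to overall illumination changes.
import math

def spectral_angle(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def classify(pixel, references):
    """Return the label of the reference with the smallest spectral angle."""
    return min(references, key=lambda lab: spectral_angle(pixel, references[lab]))
```

A pixel that is a scaled copy of a reference (e.g. the same material under brighter illumination) subtends angle zero with it, which is precisely why SAM is a common baseline.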
Hershberger, Patricia E.; Finnegan, Lorna; Altfeld, Susan; Lake, Sara; Hirshfeld-Cytron, Jennifer
2014-01-01
Background Young women with cancer now face the complex decision about whether to undergo fertility preservation. Yet little is known about how these women process information involved in making this decision. Objective The purpose of this paper is to expand theoretical understanding of the decision-making process by examining aspects of information processing among young women diagnosed with cancer. Methods Using a grounded theory approach, 27 women with cancer participated in individual, semi-structured interviews. Data were coded and analyzed using constant-comparison techniques that were guided by five dimensions within the Contemplate phase of the decision-making process framework. Results In the first dimension, young women acquired information primarily from clinicians and Internet sources. Experiential information, often obtained from peers, occurred in the second dimension. Preferences and values were constructed in the third dimension as women acquired factual, moral, and ethical information. Women desired tailored, personalized information that was specific to their situation in the fourth dimension; however, women struggled with communicating these needs to clinicians. In the fifth dimension, women offered detailed descriptions of clinician behaviors that enhance or impede decisional debriefing. Conclusion Better understanding of theoretical underpinnings surrounding women’s information processes can facilitate decision support and improve clinical care. PMID:24552086
Signature extension: An approach to operational multispectral surveys
NASA Technical Reports Server (NTRS)
Nalepka, R. F.; Morgenstern, J. P.
1973-01-01
Two data processing techniques were suggested as applicable to the large-area survey problem. One approach was to use unsupervised classification (clustering) techniques. Investigation showed that, because clustering does nothing to reduce signal variability, this method would be very time-consuming and possibly inaccurate as well; unsupervised classification techniques are therefore not, by themselves, a solution to the large-area survey problem. The other method investigated was the use of signature extension techniques, which function by normalizing the data to some reference condition so that signatures from an isolated area can be used to process large quantities of data. In this manner, ground information requirements and computer training are minimized. Several signature extension techniques were tested; the best of these allowed signatures to be extended between data sets collected four days and 80 miles apart with an average accuracy of better than 90%.
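The normalization idea behind signature extension can be sketched as a per-channel affine match of a new scene's statistics to the reference scene, so that signatures trained on the reference still apply. This gain/offset form is our illustrative assumption, not necessarily the report's own method:

```python
# Match a scene's mean and standard deviation to a reference scene so
# that classifier signatures trained on the reference remain usable.
# Values are toy single-channel radiances.

def mean_std(xs):
    m = sum(xs) / len(xs)
    s = (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
    return m, s

def normalize_to_reference(scene, reference):
    """Affine-map scene values so their mean/std match the reference."""
    ms, ss = mean_std(scene)
    mr, sr = mean_std(reference)
    gain = sr / ss
    return [(x - ms) * gain + mr for x in scene]
```

In a multispectral setting the same mapping would be applied channel by channel, which is what allows signatures from one date or location to be extended to another.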
Functional neuroanatomy of the rhinophore of Archidoris pseudoargus
NASA Astrophysics Data System (ADS)
Wertz, Adrian; Rössler, Wolfgang; Obermayer, Malu; Bickmeyer, Ulf
2007-06-01
For sea slugs, chemosensory information represents an important sensory modality, because optical and acoustical information are limited. In the present study, we focussed on the neuroanatomy of the rhinophores and processing of olfactory stimuli in the rhinophore ganglion of Archidoris pseudoargus, belonging to the order of Nudibranchia in the subclass of Opisthobranchia. Histological techniques, fluorescent markers, and immunohistochemistry were used to analyse neuroanatomical features of the rhinophore. A large ganglion and a prominent central lymphatic channel are surrounded by longitudinal muscles. Many serotonin-immunoreactive (IR) processes were found around the centre and between the ganglion and the highly folded lobes of the rhinophore, but serotonin-IR cell bodies were absent inside the rhinophore. In contrast to the conditions recently found in Aplysia punctata, we found no evidence for the presence of olfactory glomeruli within the rhinophore. Using calcium-imaging techniques with Fura II as a calcium indicator, we found differential calcium responses in various regions within the ganglion to stimulation of the rhinophore with different amino acids. The lack of glomeruli in the rhinophores induces functional questions about processing of chemical information in the rhinophore.
NASA Technical Reports Server (NTRS)
Saveker, D. R. (Editor)
1973-01-01
The preliminary design of a satellite plus computer earth resources information system is proposed for potential uses in fire prevention and control in the wildland fire community. Suggested are satellite characteristics, sensor characteristics, discrimination algorithms, data communication techniques, data processing requirements, display characteristics, and costs in achieving the integrated wildland fire information system.
Evaluation of thresholding techniques for segmenting scaffold images in tissue engineering
NASA Astrophysics Data System (ADS)
Rajagopalan, Srinivasan; Yaszemski, Michael J.; Robb, Richard A.
2004-05-01
Tissue engineering attempts to address the ever widening gap between the demand and supply of organ and tissue transplants using natural and biomimetic scaffolds. The regeneration of specific tissues aided by synthetic materials is dependent on the structural and morphometric properties of the scaffold. These properties can be derived non-destructively using quantitative analysis of high resolution microCT scans of scaffolds. Thresholding of the scanned images into polymeric and porous phases is central to the outcome of the subsequent structural and morphometric analysis. Visual thresholding of scaffolds produced using stochastic processes is inaccurate. Depending on the algorithmic assumptions made, automatic thresholding might also be inaccurate. Hence there is a need to analyze the performance of different techniques and propose alternate ones, if needed. This paper provides a quantitative comparison of different thresholding techniques for segmenting scaffold images. The thresholding algorithms examined include those that exploit spatial information, locally adaptive characteristics, histogram entropy information, histogram shape information, and clustering of gray-level information. The performance of different techniques was evaluated using established criteria, including misclassification error, edge mismatch, relative foreground error, and region non-uniformity. Algorithms that exploit local image characteristics seem to perform much better than those using global information.
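One of the evaluation criteria named above, misclassification error, is simple enough to sketch end to end; the pixel values, labels, and thresholds below are invented for illustration:

```python
# Misclassification error: the fraction of pixels whose binarized label
# disagrees with a manually produced ground-truth segmentation. Images
# are flat lists of values / 0-1 labels for simplicity.

def threshold(image, t):
    """Global thresholding: 1 for the polymeric phase, 0 for pore space."""
    return [1 if v >= t else 0 for v in image]

def misclassification_error(ground_truth, segmented):
    wrong = sum(1 for g, s in zip(ground_truth, segmented) if g != s)
    return wrong / len(ground_truth)
```

The other criteria (edge mismatch, relative foreground error, region non-uniformity) are computed similarly from the same pair of label images, each penalizing a different kind of segmentation failure.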
The Computer "Discredit Bureau": An Extension of a Community Information Utility.
ERIC Educational Resources Information Center
Carroll, John M.
The "Discredit" Bureau borrows some of the computerized information-processing techniques adopted by credit-reporting agencies and uses them in the interest of consumers to help them press complaints against suppliers and prospective employers. This is an additional service currently being incorporated into those already afforded by a…
Change detection from remotely sensed images: From pixel-based to object-based approaches
NASA Astrophysics Data System (ADS)
Hussain, Masroor; Chen, Dongmei; Cheng, Angela; Wei, Hui; Stanley, David
2013-06-01
The appetite for up-to-date information about earth's surface is ever increasing, as such information provides a base for a large number of applications, including local, regional and global resources monitoring, land-cover and land-use change monitoring, and environmental studies. The data from remote sensing satellites provide opportunities to acquire information about land at varying resolutions and have been widely used for change detection studies. A large number of change detection methodologies and techniques, utilizing remotely sensed data, have been developed, and newer techniques are still emerging. This paper begins with a discussion of the traditional pixel-based and (mostly) statistics-oriented change detection techniques, which focus mainly on the spectral values and mostly ignore the spatial context. This is followed by a review of object-based change detection techniques. Finally, there is a brief discussion of spatial data mining techniques in image processing and change detection from remote sensing data. The merits and issues of different techniques are compared. The importance of the exponential increase in image data volume and multiple sensors, and the associated challenges for the development of change detection techniques, are highlighted. With the wide use of very-high-resolution (VHR) remotely sensed images, object-based methods and data mining techniques may have more potential in change detection.
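The simplest pixel-based method the review starts from, image differencing, can be sketched in a few lines; a real pipeline must first co-register and radiometrically normalize the two images, and the digital numbers here are toy values:

```python
# Pixel-based change detection by image differencing: mark a pixel as
# changed when the absolute difference between acquisition dates
# exceeds a global threshold. Spatial context is ignored -- exactly the
# limitation that motivates object-based approaches.

def change_map(before, after, threshold):
    """1 where the absolute radiometric difference exceeds the threshold."""
    return [1 if abs(b - a) > threshold else 0
            for b, a in zip(before, after)]

def changed_fraction(cmap):
    return sum(cmap) / len(cmap)
```

Choosing the threshold is itself a research topic; too low and sensor noise registers as change, too high and subtle land-cover change is missed.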
NASA Astrophysics Data System (ADS)
Samsinar, Riza; Suseno, Jatmiko Endro; Widodo, Catur Edi
2018-02-01
The distribution network is the part of the power grid closest to the customers of electric service providers such as PT PLN. The dispatching center of a power grid company is also its data center, where a great amount of operating information is gathered, and the valuable information contained in these data means a lot for grid operating management. Data warehousing and online analytical processing (OLAP) techniques have been used to manage and analyse this large volume of data. The online analytics information system built on data warehouse processing with OLAP delivers chart and query reporting. The chart reporting consists of load distribution charts over repeated time intervals, distribution charts by area, substation region charts, and electric load usage charts. The results of the OLAP process show the development of electric load distribution, support analysis of electric power consumption loads, and offer an alternative way of presenting information on peak load.
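The chart reports described above are roll-ups of one fact table along the time and area dimensions; a hedged sketch of that OLAP-style aggregation (the column names and load values are invented):

```python
# Roll-up of a toy load fact table along two dimensions, the core
# operation behind OLAP chart reporting on distribution-network data.
from collections import defaultdict

records = [  # (area, hour, load_kw) -- invented fact rows
    ("north", 18, 120.0), ("north", 19, 140.0),
    ("south", 18, 90.0),  ("south", 19, 95.0),
]

def roll_up(rows, key_index):
    """Sum the load measure grouped by one dimension column."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key_index]] += row[2]
    return dict(totals)

load_by_area = roll_up(records, 0)  # slice along the area dimension
load_by_hour = roll_up(records, 1)  # slice along the time dimension
```

In a real data warehouse the same roll-up would be a GROUP BY over the fact table, with the dimension tables supplying the area and time hierarchies.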
Design requirements for operational earth resources ground data processing
NASA Technical Reports Server (NTRS)
Baldwin, C. J.; Bradford, L. H.; Burnett, E. S.; Hutson, D. E.; Kinsler, B. A.; Kugle, D. R.; Webber, D. S.
1972-01-01
Realistic tradeoff data and evaluation techniques were studied that permit conceptual design of operational earth resources ground processing systems. Methodology for determining user requirements that utilize the limited information available from users is presented along with definitions of sensor capabilities projected into the shuttle/station era. A tentative method is presented for synthesizing candidate ground processing concepts.
Psychodrama: A Creative Approach for Addressing Parallel Process in Group Supervision
ERIC Educational Resources Information Center
Hinkle, Michelle Gimenez
2008-01-01
This article provides a model for using psychodrama to address issues of parallel process during group supervision. Information on how to utilize the specific concepts and techniques of psychodrama in relation to group supervision is discussed. A case vignette of the model is provided.
MPLP and the Catalog Record as a Finding Aid
ERIC Educational Resources Information Center
Bowen Maier, Shannon
2011-01-01
The cataloging of otherwise unprocessed collections is an innovative minimal processing technique with important implications for reference service. This article mines the existing literature for how institutions engaged in minimal processing view reference, the strengths and weaknesses of catalog records as finding aids, and information about…
Composing, Songwriting, and Producing: Informing Popular Music Pedagogy
ERIC Educational Resources Information Center
Tobias, Evan S.
2013-01-01
In forwarding comprehensive popular music pedagogies, music educators might acknowledge and address expanded notions of composition in popular music that include processes of recording, engineering, mixing, and producing along with the technologies, techniques, and ways of being musical that encompass these processes. This article advances a…
Process Mining Online Assessment Data
ERIC Educational Resources Information Center
Pechenizkiy, Mykola; Trcka, Nikola; Vasilyeva, Ekaterina; van der Aalst, Wil; De Bra, Paul
2009-01-01
Traditional data mining techniques have been extensively applied to find interesting patterns, build descriptive and predictive models from large volumes of data accumulated through the use of different information systems. The results of data mining can be used for getting a better understanding of the underlying educational processes, for…
Channelling information flows from observation to decision; or how to increase certainty
NASA Astrophysics Data System (ADS)
Weijs, S. V.
2015-12-01
To make adequate decisions in an uncertain world, information needs to reach the decision problem, enabling the full consequences of each possible decision to be overseen. On its way from the physical world to a decision problem, information is transferred through the physical processes that influence the sensor, then through processes within the sensor, and onward through wires or electromagnetic waves. For the last decade, most information has become digitized at some point, and from the moment of digitization it can in principle be transferred losslessly. Information about the physical world is often also stored, sometimes in compressed form, as physical laws, concepts, or models of specific hydrological systems. It is important to note, however, that all information about a physical system eventually has to originate from observation (although inevitably coloured by some prior assumptions). This colouring makes the compression lossy, but it is effectively the only way to exploit similarities in time and space that enable predictions while measuring only a few macro-states of a complex hydrological system. Adding physical process knowledge to a hydrological model can thus be seen as a convenient way to transfer information from observations made at a different time or place, to make predictions about another situation, assuming the same dynamics are at work. The key challenge in achieving more certainty in hydrological prediction can therefore be formulated as a challenge to tap and channel information flows from the environment. For tapping more information flows, new measurement techniques, large-scale campaigns, historical data sets, and large-sample hydrology and regionalization efforts can bring progress. For channelling the information flows with minimum loss, model calibration and model formulation techniques should be critically investigated. Some experience from research in a Swiss high alpine catchment is used as an illustration.
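In Shannon's terms, the information a single observation can transfer toward the decision problem is bounded by the entropy of the discretized observable; a small sketch of that bound (the counts are invented):

```python
# Entropy (in bits) of a discretized observable: the upper bound on the
# information one observation of it can contribute downstream. Lossy
# compression or a coarse sensor can only reduce what reaches the
# decision problem.
import math

def entropy_bits(counts):
    """Entropy of the empirical distribution given per-bin counts."""
    total = sum(counts)
    probs = [c / total for c in counts if c]
    return -sum(p * math.log2(p) for p in probs)
```

A gauge whose readings fall in four equally likely bins carries two bits per observation; one that always reads the same value carries none, however often it is sampled.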
NASA Astrophysics Data System (ADS)
Bykovskii, Yurii A.; Eloev, E. N.; Kukharenko, K. L.; Panin, A. M.; Solodovnikov, N. P.; Torgashin, A. N.; Arestova, E. L.
1995-10-01
An acousto-optical system for input, display, and coherent-optical processing of information was implemented experimentally. The information transmission capacity, the structure of the information fluxes, and the efficiency of spaceborne telemetric systems were taken into account. The number of equivalent frequency-resolved channels corresponded to the structure of a telemetric frame of a two-step switch. The number of intensity levels of laser radiation corresponded to the scale of changes in the parameters. Use was made of the technology of a liquid optical contact between a wedge-shaped piezoelectric transducer made of lithium niobate and an anisotropic light-and-sound guide made of paratellurite with asymmetric scattering geometry. The simplest technique for optical filtering of multiparameter signals was analysed.
Transmission ultrasonography. [time delay spectrometry for soft tissue transmission imaging
NASA Technical Reports Server (NTRS)
Heyser, R. C.; Le Croissette, D. H.
1973-01-01
Review of the results of the application of an advanced signal-processing technique, called time delay spectrometry, in obtaining soft tissue transmission images by transmission ultrasonography, both in vivo and in vitro. The presented results include amplitude ultrasound pictures and phase ultrasound pictures obtained by this technique. While amplitude ultrasonographs of tissue are closely analogous to X-ray pictures in that differential absorption is imaged, phase ultrasonographs represent an entirely new source of information based on differential time of propagation. Thus, a new source of information is made available for detailed analysis.
NASA Technical Reports Server (NTRS)
Barkstrom, B. R.
1983-01-01
The measurement of the earth's radiation budget has been chosen to illustrate the technique of objective system design. The measurement process is an approximately linear transformation of the original field of radiant exitances, so that linear statistical techniques may be employed. The combination of variability, measurement strategy, and error propagation is presently made with the help of information theory, as suggested by Kondratyev et al. (1975) and Peckham (1974). Covariance matrices furnish the quantitative statement of field variability.
Fujimori, Kiyoshi; Lee, Hans; Sloey, Christopher; Ricci, Margaret S; Wen, Zai-Qing; Phillips, Joseph; Nashed-Samuel, Yasser
2016-01-01
Certain types of glass vials used as primary containers for liquid formulations of biopharmaceutical drug products have been observed with delamination that produced small glass-like flakes, termed lamellae, under certain storage conditions. The cause of this delamination is in part related to glass surface defects, which render the vials susceptible to flaking; lamellae are formed during the high-temperature melting and annealing used for vial fabrication and shaping. The current European Pharmacopoeia method to assess glass vial quality uses acid titration of vial extract pools to determine hydrolytic resistance, or alkalinity. Four alternative techniques with improved throughput, convenience, and/or comprehensiveness were examined by subjecting seven lots of vials to analysis by all techniques. The first three new techniques, conductivity, flame photometry, and inductively coupled plasma mass spectrometry (ICP-MS), measured the same sample pools as acid titration. All three showed good correlation with alkalinity: conductivity (R² = 0.9951), flame photometry sodium (R² = 0.9895), and several elements by ICP-MS [sodium (R² = 0.9869), boron (R² = 0.9796), silicon (R² = 0.9426), total (R² = 0.9639)]. The fourth technique processed the vials under conditions that promote delamination, termed accelerated lamellae formation, and then inspected those vials visually for lamellae. The visual inspection results, excluding the lot with a different processing condition, correlated well with alkalinity (R² = 0.9474). Because vial processing differences affect alkalinity measurements and delamination propensity differently, the ratio of the silicon and sodium measurements from ICP-MS was the most informative measure of overall vial quality and of vial propensity for lamellae formation.
The other techniques of conductivity, flame photometry, and accelerated lamellae formation may still be suitable for routine screening of vial lots produced under consistent processes. Recently, delamination producing small glass-like flakes, termed lamellae, has been observed under certain storage conditions in glass vials commonly used as primary containers for pharmaceutical drug products. The main cause of these lamellae is the quality of the glass itself, which is related to the manufacturing process. The current European Pharmacopoeia method to assess glass vial quality uses acid titration of vial extract pools to determine hydrolytic resistance, or alkalinity. As alternatives to the European Pharmacopoeia method, four other techniques were assessed. Three of them, conductivity, flame photometry, and ICP-MS, measured the same vial extract pools as acid titration and demonstrated good correlation with alkalinity. The fourth, accelerated lamellae formation, processed the vials under conditions that promote delamination, after which the vials were inspected visually for lamellae; it also showed good correlation with alkalinity. Of the four new techniques, ICP-MS was the most informative for assessing overall vial quality even with processing differences between vial lots. The other three techniques remain suitable for routine screening of vial lots produced under consistent processes. © PDA, Inc. 2016.
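The correlation values (R²) quoted above are coefficients of determination between each technique's readings and the titrated alkalinity; the statistic itself is short enough to sketch, with invented data points:

```python
# Coefficient of determination R^2 for a simple linear relationship
# between paired measurements, e.g. (alkalinity, conductivity) per lot.
# Equivalent to the squared Pearson correlation of the two series.

def r_squared(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)
```

An R² near 1, as reported for conductivity and sodium, means the alternative reading is an almost perfect linear proxy for the titration result.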
Earth Observation Services (Image Processing Software)
NASA Technical Reports Server (NTRS)
1992-01-01
San Diego State University and Environmental Systems Research Institute, with other agencies, have applied satellite imaging and image processing techniques to geographic information systems (GIS) updating. The resulting images display land use and are used by a regional planning agency for applications like mapping vegetation distribution and preserving wildlife habitats. The EOCAP program provides government co-funding to encourage private investment in, and to broaden the use of NASA-developed technology for analyzing information about Earth and ocean resources.
Parallel Visualization Co-Processing of Overnight CFD Propulsion Applications
NASA Technical Reports Server (NTRS)
Edwards, David E.; Haimes, Robert
1999-01-01
An interactive visualization system pV3 is being developed for the investigation of advanced computational methodologies employing visualization and parallel processing for the extraction of information contained in large-scale transient engineering simulations. Visual techniques for extracting information from the data in terms of cutting planes, iso-surfaces, particle tracing and vector fields are included in this system. This paper discusses improvements to the pV3 system developed under NASA's Affordable High Performance Computing project.
NASA Technical Reports Server (NTRS)
Baron, S.; Levison, W. H.
1977-01-01
Application of the optimal control model of the human operator to problems in display analysis is discussed. Those aspects of the model pertaining to the operator-display interface and to operator information processing are reviewed and discussed. The techniques are then applied to the analysis of advanced display/control systems for a Terminal Configured Vehicle. Model results are compared with those obtained in a large, fixed-base simulation.
Reduce Fluid Experiment System: Flight data from the IML-1 Mission
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Harper, Sabrina
1995-01-01
Processing and data reduction of holographic images from the International Microgravity Laboratory 1 (IML-1) presents some interesting challenges in determining the effects of microgravity on crystal growth processes. Use of several processing techniques, including the Computerized Holographic Image Processing System and the Software Development Package (SDP-151) will provide fundamental information for holographic and schlieren analysis of the space flight data.
Vecchio, Riccardo; Lisanti, Maria Tiziana; Caracciolo, Francesco; Cembalo, Luigi; Gambuti, Angelita; Moio, Luigi; Siani, Tiziana; Marotta, Giuseppe; Nazzaro, Concetta; Piombino, Paola
2018-05-28
The present research analyses, by combining sensory and experimental economics techniques, the extent to which the production process, and information about it, may affect consumer preferences. Sparkling wines produced by the Champenoise and Charmat methods were the object of the study. A quantitative descriptive sensory analysis with a trained panel, and non-hypothetical auctions combined with hedonic ratings involving young wine consumers (N=100) under different information scenarios (Blind, Info, and Info Taste), were performed. Findings show that the production process affects both the sensory profile of sparkling wines and consumer expectations. In particular, the hedonic ratings revealed that when tasting the products, both with no information on the production process (Blind) and with such information (Info Taste), the consumers preferred the Charmat wines. On the contrary, when detailed information on the production methods was given without tasting (Info), consumers preferred the two Champenoise wines. It can be concluded that sensory and non-sensory attributes of sparkling wines affect consumers' preferences. Specifically, the study suggests that production process information strongly shapes liking expectations, while not affecting informed liking. This article is protected by copyright. All rights reserved.
Granular computing with multiple granular layers for brain big data processing.
Wang, Guoyin; Xu, Ji
2014-12-01
Big data is the term for a collection of datasets so huge and complex that it becomes difficult to be processed using on-hand theoretical models and technique tools. Brain big data is one of the most typical, important big data collected using powerful equipments of functional magnetic resonance imaging, multichannel electroencephalography, magnetoencephalography, Positron emission tomography, near infrared spectroscopic imaging, as well as other various devices. Granular computing with multiple granular layers, referred to as multi-granular computing (MGrC) for short hereafter, is an emerging computing paradigm of information processing, which simulates the multi-granular intelligent thinking model of human brain. It concerns the processing of complex information entities called information granules, which arise in the process of data abstraction and derivation of information and even knowledge from data. This paper analyzes three basic mechanisms of MGrC, namely granularity optimization, granularity conversion, and multi-granularity joint computation, and discusses the potential of introducing MGrC into intelligent processing of brain big data.
High-end clinical domain information systems for effective healthcare delivery.
Mangalampalli, Ashish; Rama, Chakravarthy; Muthiyalian, Raja; Jain, Ajeet K
2007-01-01
The Electronic Health Record (EHR) provides doctors with a quick, reliable, secure, real-time and user-friendly source of all relevant patient data. The latest information system technologies, such as Clinical Data Warehouses (CDW), Clinical Decision-Support (CDS) systems and data-mining techniques (Online Analytical Processing (OLAP) and Online Transactional Processing (OLTP)), are used to maintain and utilise patient data intelligently, based on the users' requirements. Moreover, clinical trial reports for new drug approvals are now being submitted electronically for faster and easier processing. Also, information systems are used in educating patients about the latest developments in medical science through the internet and specially configured kiosks in hospitals and clinics.
ERIC Educational Resources Information Center
Pike, Pamela D.; Carter, Rebecca
2010-01-01
The purpose of this study was to compare the effect of cognitive chunking techniques among first-semester group-piano music majors. The ability to group discrete pieces of information into larger, more meaningful chunks is essential for efficient cognitive processing. Since reading keyboard music and playing the piano is a cognitively complex…
The Taped Monologue as Narrative Technique for Reflective Practice
ERIC Educational Resources Information Center
Ford, Keith
2016-01-01
In this article, I describe how an EFL teacher engaged in a process of reflective practice. As she looked back on her teaching career, she explored the critical incidents, principles, and practices that have informed her present teaching identity. I focus on how a taped monologue narrative technique was used, and on the rationale, practice, and…
Precise orbit determination for NASA's earth observing system using GPS (Global Positioning System)
NASA Technical Reports Server (NTRS)
Williams, B. G.
1988-01-01
An application of a precision orbit determination technique for NASA's Earth Observing System (EOS) using the Global Positioning System (GPS) is described. This technique allows the geometric information from measurements of GPS carrier phase and P-code pseudo-range to be exploited while minimizing requirements for precision dynamical modeling. The method combines geometric and dynamic information to determine the spacecraft trajectory; the weight on the dynamic information is controlled by adjusting fictitious spacecraft accelerations in three dimensions which are treated as first order exponentially time correlated stochastic processes. By varying the time correlation and uncertainty of the stochastic accelerations, the technique can range from purely geometric to purely dynamic. Performance estimates for this technique as applied to the orbit geometry planned for the EOS platforms indicate that decimeter accuracies for EOS orbit position may be obtainable. The sensitivity of the predicted orbit uncertainties to model errors for station locations, nongravitational platform accelerations, and Earth gravity is also presented.
Robot Guidance Using Machine Vision Techniques in Industrial Environments: A Comparative Review.
Pérez, Luis; Rodríguez, Íñigo; Rodríguez, Nuria; Usamentiaga, Rubén; García, Daniel F
2016-03-05
In the factory of the future, most of the operations will be done by autonomous robots that need visual feedback to move around the working space avoiding obstacles, to work collaboratively with humans, to identify and locate the working parts, to complete the information provided by other sensors to improve their positioning accuracy, etc. Different vision techniques, such as photogrammetry, stereo vision, structured light, time of flight and laser triangulation, among others, are widely used for inspection and quality control processes in the industry and now for robot guidance. Choosing which type of vision system to use is highly dependent on the parts that need to be located or measured. Thus, in this paper a comparative review of different machine vision techniques for robot guidance is presented. This work analyzes accuracy, range and weight of the sensors, safety, processing time and environmental influences. Researchers and developers can take it as a background information for their future works.
Localisation of epileptic foci using novel imaging modalities
De Ciantis, Alessio; Lemieux, Louis
2013-01-01
Purpose of review This review examines recent reports on the use of advanced techniques to map the regions and networks involved during focal epileptic seizure generation in humans. Recent findings A number of imaging techniques are capable of providing new localizing information on the ictal processes and epileptogenic zone. Evaluating the clinical utility of these findings has been mainly performed through post-hoc comparison with the findings of invasive EEG and ictal single-photon emission computed tomography, using postsurgical seizure reduction as the main outcome measure. Added value has been demonstrated in MRI-negative cases. Improved understanding of the human ictiogenic processes and the focus vs. network hypothesis is likely to result from the application of multimodal techniques that combine electrophysiological, semiological, and whole-brain coverage of brain activity changes. Summary On the basis of recent research in the field of neuroimaging, several novel imaging modalities have been improved and developed to provide information about the localization of epileptic foci. PMID:23823464
Robot Guidance Using Machine Vision Techniques in Industrial Environments: A Comparative Review
Pérez, Luis; Rodríguez, Íñigo; Rodríguez, Nuria; Usamentiaga, Rubén; García, Daniel F.
2016-01-01
In the factory of the future, most of the operations will be done by autonomous robots that need visual feedback to move around the working space avoiding obstacles, to work collaboratively with humans, to identify and locate the working parts, to complete the information provided by other sensors to improve their positioning accuracy, etc. Different vision techniques, such as photogrammetry, stereo vision, structured light, time of flight and laser triangulation, among others, are widely used for inspection and quality control processes in the industry and now for robot guidance. Choosing which type of vision system to use is highly dependent on the parts that need to be located or measured. Thus, in this paper a comparative review of different machine vision techniques for robot guidance is presented. This work analyzes accuracy, range and weight of the sensors, safety, processing time and environmental influences. Researchers and developers can take it as a background information for their future works. PMID:26959030
Volcano alert level systems: managing the challenges of effective volcanic crisis communication
NASA Astrophysics Data System (ADS)
Fearnley, C. J.; Beaven, S.
2018-05-01
Over the last four decades, volcano observatories have adopted a number of different communication strategies for the dissemination of information on changes in volcanic behaviour and potential hazards to a wide range of user groups. These commonly include a standardised volcano alert level system (VALS), used in conjunction with other uni-valent communication techniques (such as information statements, reports and maps) and multi-directional techniques (such as meetings and telephone calls). This research, based on interviews and observation conducted 2007-2009 at the five US Geological Survey (USGS) volcano observatories, and including some of the key users of the VALS, argues for the importance of understanding how communicating volcanic hazard information takes place as an everyday social practice, focusing on the challenges of working across the boundaries between the scientific and decision-making communities. It is now widely accepted that the effective use, value and deployment of information across science-policy interfaces of this kind depend on three criteria: the scientific credibility of the information, its relevance to the needs of stakeholders and the legitimacy of both the information and the processes that produced it. Translation and two-way communication are required to ensure that all involved understand what information is credible and relevant. Findings indicate that whilst VALS play a role in raising awareness of an unfolding situation, supplementary communication techniques are crucial in facilitating situational understanding of that situation, and the uncertainties inherent to its scientific assessment, as well as in facilitating specific responses. 
In consequence, `best practice' recommendations eschew further standardisation, and focus on the in situ cultivation of dialogue between scientists and stakeholders as a means of ensuring that information, and the processes through which it is produced are perceived to be legitimate by all involved.
Image processing and analysis using neural networks for optometry area
NASA Astrophysics Data System (ADS)
Netto, Antonio V.; Ferreira de Oliveira, Maria C.
2002-11-01
In this work we describe the framework of a functional system for processing and analyzing images of the human eye acquired by the Hartmann-Shack technique (HS), in order to extract information to formulate a diagnosis of eye refractive errors (astigmatism, hypermetropia and myopia). The analysis is to be carried out using an Artificial Intelligence system based on Neural Nets, Fuzzy Logic and Classifier Combination. The major goal is to establish the basis of a new technology to effectively measure ocular refractive errors that is based on methods alternative those adopted in current patented systems. Moreover, analysis of images acquired with the Hartmann-Shack technique may enable the extraction of additional information on the health of an eye under exam from the same image used to detect refraction errors.
Activity Summaries as a Classroom Assessment Tool.
ERIC Educational Resources Information Center
McGee, Steven; Kirby, Jennifer; Croft, Steven K.
This study explored the usefulness of a classroom assessment technique called the activity summary template. It is proposed that the activity summary template enables students to process and organize information learning during an investigation. This process will in turn help students to achieve greater learning outcomes. The activity summary…
Information Processing Abilities and Reading.
ERIC Educational Resources Information Center
Samuels, S. Jay
1987-01-01
A major focus in reading difficulty is lack of automaticity in decoding, which overloads the attentional system, leads to the use of small, meaningless visual processing units such as the individual letter, places heavy demands on short-term memory, and interferes with comprehension. Techniques for diagnosis and remediation are noted. (Author/JW)
Displays, memories, and signal processing: A compilation
NASA Technical Reports Server (NTRS)
1975-01-01
Articles on electronics systems and techniques were presented. The first section is on displays and other electro-optical systems; the second section is devoted to signal processing. The third section presented several new memory devices for digital equipment, including articles on holographic memories. The latest patent information available is also given.
ERIC Educational Resources Information Center
Alfred, Richard L.; Hummel, Mary L.
Postsecondary instructional dynamics is a complex process in which inputs (student characteristics and expectations, resources, and faculty characteristics and preparation) are converted through the educational process (instruction strategies, models, and techniques as well as supportive services) into outputs (outcomes and benefits of instruction…
Diffusion processes in tumors: A nuclear medicine approach
NASA Astrophysics Data System (ADS)
Amaya, Helman
2016-07-01
The number of counts used in nuclear medicine imaging techniques, only provides physical information about the desintegration of the nucleus present in the the radiotracer molecules that were uptaken in a particular anatomical region, but that information is not a real metabolic information. For this reason a mathematical method was used to find a correlation between number of counts and 18F-FDG mass concentration. This correlation allows a better interpretation of the results obtained in the study of diffusive processes in an agar phantom, and based on it, an image from the PETCETIX DICOM sample image set from OsiriX-viewer software was processed. PET-CT gradient magnitude and Laplacian images could show direct information on diffusive processes for radiopharmaceuticals that enter into the cells by simple diffusion. In the case of the radiopharmaceutical 18F-FDG is necessary to include pharmacokinetic models, to make a correct interpretation of the gradient magnitude and Laplacian of counts images.
[Medical data warehousing as a generator of system component for decision support in health care].
Catibusić, Sulejman; Hadzagić-Catibusić, Feriha; Zubcević, Smail
2004-01-01
Growth in role of data warehousing as strategic information for decision makers is significant. Many health institutions have data warehouse implementations in process of development or even in production. This article was made with intention to improve general understanding of data warehousing requirements form the point of view of end-users, and information system as well. For that reason, in this document advantages and arguments for implementation, techniques and methods of data warehousing, data warehouse foundation and exploration of information as final product of data warehousing process have been described.
TOF-SIMS imaging technique with information entropy
NASA Astrophysics Data System (ADS)
Aoyagi, Satoka; Kawashima, Y.; Kudo, Masahiro
2005-05-01
Time-of-flight secondary ion mass spectrometry (TOF-SIMS) is capable of chemical imaging of proteins on insulated samples in principal. However, selection of specific peaks related to a particular protein, which are necessary for chemical imaging, out of numerous candidates had been difficult without an appropriate spectrum analysis technique. Therefore multivariate analysis techniques, such as principal component analysis (PCA), and analysis with mutual information defined by information theory, have been applied to interpret SIMS spectra of protein samples. In this study mutual information was applied to select specific peaks related to proteins in order to obtain chemical images. Proteins on insulated materials were measured with TOF-SIMS and then SIMS spectra were analyzed by means of the analysis method based on the comparison using mutual information. Chemical mapping of each protein was obtained using specific peaks related to each protein selected based on values of mutual information. The results of TOF-SIMS images of proteins on the materials provide some useful information on properties of protein adsorption, optimality of immobilization processes and reaction between proteins. Thus chemical images of proteins by TOF-SIMS contribute to understand interactions between material surfaces and proteins and to develop sophisticated biomaterials.
Extraction of Data from a Hospital Information System to Perform Process Mining.
Neira, Ricardo Alfredo Quintano; de Vries, Gert-Jan; Caffarel, Jennifer; Stretton, Erin
2017-01-01
The aim of this work is to share our experience in relevant data extraction from a hospital information system in preparation for a research study using process mining techniques. The steps performed were: research definition, mapping the normative processes, identification of tables and fields names of the database, and extraction of data. We then offer lessons learned during data extraction phase. Any errors made in the extraction phase will propagate and have implications on subsequent analyses. Thus, it is essential to take the time needed and devote sufficient attention to detail to perform all activities with the goal of ensuring high quality of the extracted data. We hope this work will be informative for other researchers to plan and execute extraction of data for process mining research studies.
QPA-CLIPS: A language and representation for process control
NASA Technical Reports Server (NTRS)
Freund, Thomas G.
1994-01-01
QPA-CLIPS is an extension of CLIPS oriented towards process control applications. Its constructs define a dependency network of process actions driven by sensor information. The language consists of three basic constructs: TASK, SENSOR, and FILTER. TASK's define the dependency network describing alternative state transitions for a process. SENSOR's and FILTER's define sensor information sources used to activate state transitions within the network. Deftemplate's define these constructs and their run-time environment is an interpreter knowledge base, performing pattern matching on sensor information and so activating TASK's in the dependency network. The pattern matching technique is based on the repeatable occurrence of a sensor data pattern. QPA-CIPS has been successfully tested on a SPARCStation providing supervisory control to an Allen-Bradley PLC 5 controller driving molding equipment.
A framework for software fault tolerance in real-time systems
NASA Technical Reports Server (NTRS)
Anderson, T.; Knight, J. C.
1983-01-01
A classification scheme for errors and a technique for the provision of software fault tolerance in cyclic real-time systems is presented. The technique requires that the process structure of a system be represented by a synchronization graph which is used by an executive as a specification of the relative times at which they will communicate during execution. Communication between concurrent processes is severely limited and may only take place between processes engaged in an exchange. A history of error occurrences is maintained by an error handler. When an error is detected, the error handler classifies it using the error history information and then initiates appropriate recovery action.
ERIC Educational Resources Information Center
Landmesser, John Andrew
2014-01-01
Information technology (IT) investment decision makers are required to process large volumes of complex data. An existing body of knowledge relevant to IT portfolio management (PfM), decision analysis, visual comprehension of large volumes of information, and IT investment decision making suggest Multi-Criteria Decision Making (MCDM) and…
User Oriented Techniques to Support Interaction and Decision Making with Large Educational Databases
ERIC Educational Resources Information Center
Hartley, Roger; Almuhaidib, Saud M. Y.
2007-01-01
Information Technology is developing rapidly and providing policy/decision makers with large amounts of information that require processing and analysis. Decision support systems (DSS) aim to provide tools that not only help such analyses, but enable the decision maker to experiment and simulate the effects of different policies and selection…
Waste Management, Treatment, and Disposal for the Food Processing Industry. Special Circular 113.
ERIC Educational Resources Information Center
Wooding, N. Henry
This publication contains information relating to waste prevention, treatment and disposal, and waste product utilization. Its primary purpose is to provide information that will help the food industry executive recognize waste problems and make wise management decisions. The discussion of the methods, techniques, and the state-of-the-art is…
Ubiquitous Learning Website: Scaffold Learners by Mobile Devices with Information-Aware Techniques
ERIC Educational Resources Information Center
Chen, G. D.; Chang, C. K.; Wang, C. Y.
2008-01-01
The portability and immediate communication properties of mobile devices influence the learning processes in interacting with peers, accessing resources and transferring data. For example, the short message and browsing functions in a cell phone provide users with timely and adaptive information access. Although many studies of mobile learning…
ERIC Educational Resources Information Center
Guzzetta, Francesco; Conti, Guido; Mercuri, Eugenio
2011-01-01
Increasing attention has been devoted to the maturation of sensory processing in the first year of life. While the development of cortical visual function has been thoroughly studied, much less information is available on auditory processing and its early disorders. The aim of this paper is to provide an overview of the assessment techniques for…
Cleanliness inspection tool for RSRM bond surfaces
NASA Technical Reports Server (NTRS)
Mattes, Robert A.
1995-01-01
Using optically stimulated electron emission (OSEE), Thiokol has monitored bond surfaces in process for contamination on the Redesigned Solid Rocket Motor (RSRM). This technique provides process control information to help assure bond surface quality and repeatability prior to bonding. This paper will describe OSEE theory of operation and the instrumentation implemented at Thiokol Corporation since 1987. Data from process hardware will be presented.
Cloud-based adaptive exon prediction for DNA analysis.
Putluri, Srinivasareddy; Zia Ur Rahman, Md; Fathima, Shaik Yasmeen
2018-02-01
Cloud computing offers significant research and economic benefits to healthcare organisations. Cloud services provide a safe place for storing and managing large amounts of such sensitive data. Under conventional flow of gene information, gene sequence laboratories send out raw and inferred information via Internet to several sequence libraries. DNA sequencing storage costs will be minimised by use of cloud service. In this study, the authors put forward a novel genomic informatics system using Amazon Cloud Services, where genomic sequence information is stored and accessed for processing. True identification of exon regions in a DNA sequence is a key task in bioinformatics, which helps in disease identification and design drugs. Three base periodicity property of exons forms the basis of all exon identification techniques. Adaptive signal processing techniques found to be promising in comparison with several other methods. Several adaptive exon predictors (AEPs) are developed using variable normalised least mean square and its maximum normalised variants to reduce computational complexity. Finally, performance evaluation of various AEPs is done based on measures such as sensitivity, specificity and precision using various standard genomic datasets taken from National Center for Biotechnology Information genomic sequence database.
Quantum technology and cryptology for information security
NASA Astrophysics Data System (ADS)
Naqvi, Syed; Riguidel, Michel
2007-04-01
Cryptology and information security are set to play a more prominent role in the near future. In this regard, quantum communication and cryptography offer new opportunities to tackle ICT security. Quantum Information Processing and Communication (QIPC) is a scientific field where new conceptual foundations and techniques are being developed. They promise to play an important role in the future of information Security. It is therefore essential to have a cross-fertilizing development between quantum technology and cryptology in order to address the security challenges of the emerging quantum era. In this article, we discuss the impact of quantum technology on the current as well as future crypto-techniques. We then analyse the assumptions on which quantum computers may operate. Then we present our vision for the distribution of security attributes using a novel form of trust based on Heisenberg's uncertainty; and, building highly secure quantum networks based on the clear transmission of single photons and/or bundles of photons able to withstand unauthorized reading as a result of secure protocols based on the observations of quantum mechanics. We argue how quantum cryptographic systems need to be developed that can take advantage of the laws of physics to provide long-term security based on solid assumptions. This requires a structured integration effort to deploy quantum technologies within the existing security infrastructure. Finally, we conclude that classical cryptographic techniques need to be redesigned and upgraded in view of the growing threat of cryptanalytic attacks posed by quantum information processing devices leading to the development of post-quantum cryptography.
Image information content and patient exposure.
Motz, J W; Danos, M
1978-01-01
Presently, patient exposure and x-ray tube kilovoltage are determined by image visibility requirements on x-ray film. With the employment of image-processing techniques, image visibility may be manipulated and the exposure may be determined only by the desired information content, i.e., by the required degree of tissue-density descrimination and spatial resolution. This work gives quantitative relationships between the image information content and the patient exposure, give estimates of the minimum exposures required for the detection of image signals associated with particular radiological exams. Also, for subject thickness larger than approximately 5 cm, the results show that the maximum information content may be obtained at a single kilovoltage and filtration with the simultaneous employment of image-enhancement and antiscatter techniques. This optimization may be used either to reduce the patient exposure or to increase the retrieved information.
NASA Technical Reports Server (NTRS)
Britt, C. L., Jr.
1975-01-01
The development of an RF Multilateration system to provide accurate position and velocity measurements during the approach and landing phase of Vertical Takeoff Aircraft operation is discussed. The system uses an angle-modulated ranging signal to provide both range and range rate measurements between an aircraft transponder and multiple ground stations. Range and range rate measurements are converted to coordinate measurements and the coordinate and coordinate rate information is transmitted by an integral data link to the aircraft. Data processing techniques are analyzed to show advantages and disadvantages. Error analyses are provided to permit a comparison of the various techniques.
Application of Remote Sensing Techniques for Appraising Changes in Wildlife Habitat
NASA Technical Reports Server (NTRS)
Nelson, H. K.; Klett, A. T.; Johnston, J. E.
1971-01-01
An attempt was made to investigate the potential of airborne, multispectral, line scanner data acquisition and computer-implemented automatic recognition techniques for providing useful information about waterfowl breeding habitat in North Dakota. The spectral characteristics of the components of a landscape containing waterfowl habitat can be detected with airborne scanners. By analyzing these spectral characteristics it is possible to identify and map the landscape components through analog and digital processing methods. At the present stage of development multispectral remote sensing techniques are not ready for operational application to surveys of migratory bird habitat and other such resources. Further developments are needed to: (1) increase accuracy; (2) decrease retrieval and processing time; and (3) reduce costs.
Congestion estimation technique in the optical network unit registration process.
Kim, Geunyong; Yoo, Hark; Lee, Dongsoo; Kim, Youngsun; Lim, Hyuk
2016-07-01
We present a congestion estimation technique (CET) to estimate the optical network unit (ONU) registration success ratio for the ONU registration process in passive optical networks. An optical line terminal (OLT) estimates the number of collided ONUs via the proposed scheme during the serial number state. The OLT can obtain congestion level among ONUs to be registered such that this information may be exploited to change the size of a quiet window to decrease the collision probability. We verified the efficiency of the proposed method through simulation and experimental results.
NASA Astrophysics Data System (ADS)
Okawa, Tsutomu; Kaminishi, Tsukasa; Hirabayashi, Syuichi; Suzuki, Ryo; Mitsui, Hiroyasu; Koizumi, Hisao
The business in the enterprise is closely related with the information system to such an extent that the business activities are difficult without the information system. The system design technique that considers the business process well, and that enables a quick system development is requested. In addition, the demand for the development cost is also severe than before. To cope with the current situation, the modeling technology named BPM(Business Process Management/Modeling)is drawing attention and becoming important as a key technology. BPM is a technology to model business activities as business processes and visualize them to improve the business efficiency. However, a general methodology to develop the information system using the analysis result of BPM doesn't exist, and a few development cases are reported. This paper proposes an information system development method combining business process modeling with executable modeling. In this paper we describe a guideline to support consistency of development and development efficiency and the framework enabling to develop the information system from model. We have prototyped the information system with the proposed method and our experience has shown that the methodology is valuable.
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
2002-06-01
Recent technological trends based on miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale, have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe opto-electronic techniques for measuring, with sub-micrometer accuracy, shape and changes in states of deformation of MEMS strictures. With the described opto-electronic techniques, it is possible to characterize MEMS components using the display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as high-spatial and high-digital resolution images, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for analysis and optimization of MEMS devices. Capabilities of opto- electronic techniques are illustrated with representative applications demonstrating their applicability to provide indispensable quantitative information for the effective development and optimization of MEMS devices.
Compressed-domain video indexing techniques using DCT and motion vector information in MPEG video
NASA Astrophysics Data System (ADS)
Kobla, Vikrant; Doermann, David S.; Lin, King-Ip; Faloutsos, Christos
1997-01-01
Development of various multimedia applications hinges on the availability of fast and efficient storage, browsing, indexing, and retrieval techniques. Given that video is typically stored efficiently in a compressed format, if we can analyze the compressed representation directly, we can avoid the costly overhead of decompressing and operating at the pixel level. Compressed-domain parsing of video has been presented in earlier work, where a video clip is divided into shots, subshots, and scenes. In this paper, we describe key frame selection, feature extraction, and indexing and retrieval techniques that are directly applicable to MPEG compressed video. We develop a frame-type-independent representation of the various types of frames present in an MPEG video in which all frames can be considered equivalent. Features are derived from the available DCT, macroblock, and motion vector information and mapped to a low-dimensional space where they can be accessed with standard database techniques. The spatial information is used as the primary index, while the temporal information is used to enhance the robustness of the system during the retrieval process. The techniques presented enable fast archiving, indexing, and retrieval of video. Our operational prototype typically takes a fraction of a second to retrieve similar video scenes from our database, with over 95% success.
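The frame-type-independent feature idea can be sketched in a hedged, illustrative form: the DC coefficients of 8x8 DCT blocks (which encode block averages, available without full decompression) are pooled into a short vector usable as a database index. The pooling grid, the function name, and the scaling below are assumptions for illustration, not the paper's exact feature scheme.

```python
# Sketch: derive a compact feature vector from the DC coefficients of
# 8x8 DCT blocks, as a stand-in for compressed-domain feature extraction.
# Block layout and scaling here are illustrative assumptions.

def dc_feature_vector(dct_blocks, grid_w, grid_h, bins=4):
    """dct_blocks: row-major list of 8x8 blocks over a grid_w x grid_h grid,
    each block a list of lists; returns a bins*bins vector of mean DC values."""
    # The DC term (block[0][0]) is 8x the block's mean intensity in the
    # JPEG/MPEG DCT convention.
    dc = [b[0][0] / 8.0 for b in dct_blocks]
    feat = [0.0] * (bins * bins)
    cnt = [0] * (bins * bins)
    for i, v in enumerate(dc):
        bx = (i % grid_w) * bins // grid_w   # coarse spatial bin, x
        by = (i // grid_w) * bins // grid_h  # coarse spatial bin, y
        k = by * bins + bx
        feat[k] += v
        cnt[k] += 1
    return [f / c if c else 0.0 for f, c in zip(feat, cnt)]
```

Such a low-dimensional vector can then be indexed with standard multidimensional database structures.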
Applied in situ product recovery in ABE fermentation.
Outram, Victoria; Lalander, Carl-Axel; Lee, Jonathan G M; Davies, E Timothy; Harvey, Adam P
2017-05-01
The production of biobutanol is hindered by the product's toxicity to the bacteria, which limits the productivity of the process. In situ product recovery of butanol can improve productivity by removing the source of inhibition. This paper reviews in situ product recovery techniques applied to the acetone-butanol-ethanol (ABE) fermentation in a stirred tank reactor. Methods of in situ recovery include gas stripping, vacuum fermentation, pervaporation, liquid-liquid extraction, perstraction, and adsorption, all of which have been investigated for the ABE fermentation. All techniques have shown an improvement in substrate utilization, yield, productivity, or a combination of these. Different fermentation modes favored different techniques. For batch processing, gas stripping and pervaporation were most favorable, but in fed-batch fermentations, gas stripping and adsorption were most promising. During continuous processing, perstraction appeared to offer the best improvement. The use of hybrid techniques can increase the final product concentration beyond that of single-stage techniques. Therefore, the selection of an in situ product recovery technique would require comparable information on the energy demand and economics of the process. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:563-579, 2017. © 2017 The Authors. Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers.
Artificial Intelligence and Information Management
NASA Astrophysics Data System (ADS)
Fukumura, Teruo
After reviewing the recent popularization of information transmission and processing technologies, which are supported by the progress of electronics, the authors describe how the introduction of opto-electronics into information technology has created the possibility of applying artificial intelligence (AI) techniques to the mechanization of information management. It is pointed out that although AI deals with problems in the mental world, its basic methodology relies upon verification by evidence, so experiments on computers become indispensable for the study of AI. The authors also note that since computers operate by program, the basic intelligence with which AI is concerned is that expressed by languages. As a result, the main tool of AI is logical proof, which involves an intrinsic limitation. To answer the question "Why do you employ AI in your problem solving?", one must have ill-structured problems and intend to conduct deep studies on thinking and inference, and on memory and knowledge representation. Finally, the authors discuss the application of AI techniques to information management: the possibilities of expert systems and query processing, and the necessity of a document knowledge base are stated.
Automation and hypermedia technology applications
NASA Technical Reports Server (NTRS)
Jupin, Joseph H.; Ng, Edward W.; James, Mark L.
1993-01-01
This paper presents a progress report on HyLite (Hypermedia Library technology): a research and development activity to produce a versatile system as part of NASA's technology thrusts in automation, information sciences, and communications. HyLite can be used as a system or tool to facilitate the creation and maintenance of large distributed electronic libraries. The contents of such a library may be software components, hardware parts or designs, scientific data sets or databases, configuration management information, etc. The proliferation of computer use has made the diversity and quantity of information too large for any single user to sort, process, and utilize effectively. In response to this information deluge, we have created HyLite to enable the user to organize relevant information more efficiently for presentation, retrieval, and readability. To accomplish this end, we have incorporated various AI techniques into the HyLite hypermedia engine. These techniques include intelligent searching tools for the libraries, intelligent retrieval, and navigational assistance based on user histories. HyLite itself is based on an earlier project, the Encyclopedia of Software Components (ESC), which used hypermedia to facilitate and encourage software reuse.
Michael A. Fosberg
1987-01-01
Future improvements in the meteorological forecasts used in fire management will come from improvements in three areas: observational systems, forecast techniques, and postprocessing of forecasts and better integration of this information into the fire management process.
A Survey and Analysis of Frameworks and Framework Issues for Information Fusion Applications
NASA Astrophysics Data System (ADS)
Llinas, James
This paper was stimulated by the proposed project for the Santander Bank-sponsored "Chairs of Excellence" program in Spain, of which the author is a recipient. That project involves research on characterizing a robust, problem-domain-agnostic framework in which Information Fusion (IF) processes of all descriptions, including artificial intelligence processes and techniques, could be developed. The paper describes the IF process and its requirements, a literature survey on IF frameworks, and a new proposed framework that will be implemented and evaluated at Universidad Carlos III de Madrid, Colmenarejo Campus.
Image storage in coumarin-based copolymer thin films by photoinduced dimerization.
Gindre, Denis; Iliopoulos, Konstantinos; Krupka, Oksana; Champigny, Emilie; Morille, Yohann; Sallé, Marc
2013-11-15
We report a technique to encode grayscale digital images in thin films composed of copolymers containing coumarins. A nonlinear microscopy setup was implemented and two nonlinear optical processes were used to store and read information. A third-order process (two-photon absorption) was used to photoinduce a controlled dimer-to-monomer ratio within a defined tiny volume in the material, which corresponds to each recorded bit of data. Moreover, a second-order process (second-harmonic generation) was used to read the stored information, which has been found to be highly dependent upon the monomer-to-dimer ratio.
Deep learning with convolutional neural network in radiology.
Yasaka, Koichiro; Akai, Hiroyuki; Kunimatsu, Akira; Kiryu, Shigeru; Abe, Osamu
2018-04-01
Deep learning with a convolutional neural network (CNN) is attracting attention for its high performance in image recognition. Images themselves can be utilized in the learning process with this technique, and feature extraction in advance of the learning process is not required: important features can be learned automatically. Thanks to developments in hardware and software, in addition to deep learning techniques themselves, applications of this technique to radiological images for predicting clinically useful information, such as the detection and evaluation of lesions, are beginning to be investigated. This article illustrates basic technical knowledge regarding deep learning with CNNs along the actual course (collecting data, implementing CNNs, and the training and testing phases). Pitfalls regarding this technique and how to manage them are also illustrated. We also describe some advanced topics of deep learning, results of recent clinical studies, and future directions for the clinical application of deep learning techniques.
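As a minimal illustration of the core operation a CNN layer performs (not the clinical models discussed above), a single 2D convolution can be written directly. The edge-detecting kernel below is a hand-chosen illustrative filter; a CNN would instead learn its kernels from training data.

```python
# Minimal 2D convolution (valid mode, stride 1) -- the core operation a CNN
# layer applies. Real frameworks add padding, stride, channels, nonlinearity,
# and learned weights.

def conv2d(image, kernel):
    H, W = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(H - kh + 1):
        row = []
        for j in range(W - kw + 1):
            s = 0.0
            for di in range(kh):
                for dj in range(kw):
                    s += image[i + di][j + dj] * kernel[di][dj]
            row.append(s)
        out.append(row)
    return out

# A vertical-edge kernel responds where pixel values change horizontally.
edge_kernel = [[1, 0, -1],
               [1, 0, -1],
               [1, 0, -1]]
```

Applied to an image with a vertical boundary, the output feature map is strongly negative (or positive) along the edge and near zero in flat regions.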
Farzandipour, Mehrdad; Meidani, Zahra; Riazi, Hossein; Sadeqi Jabali, Monireh
2016-12-01
Considering the integral role of understanding users' requirements in information system success, this research aimed to determine the functional requirements of nursing information systems through a national survey. The Delphi technique was applied to conduct this study in three phases: a focus group method, a modified Delphi technique, and the classic Delphi technique. A cross-sectional study was conducted to evaluate the proposed requirements within 15 general hospitals in Iran. Forty-three of the 76 approved requirements were clinical, and 33 were administrative. Nurses' mean agreement scores for clinical requirements were higher than those for administrative requirements; the minimum and maximum means of clinical requirements were 3.3 and 3.88, respectively, while the minimum and maximum means of administrative requirements were 3.1 and 3.47. Research findings indicated that information system requirements that support nurses in tasks including direct care, medicine prescription, patient treatment management, and patient safety have been the target of special attention. As nurses' requirements deal directly with patient outcomes and patient safety, nursing information system requirements should address not only automation but also nurses' tasks and work processes based on work analysis.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in U.S. undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
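A Monte Carlo baseline of the kind described can be sketched as follows; the driver names, weights, and uncertainty ranges are invented for illustration and are not the ACSI model's actual structure.

```python
import random

# Toy Monte Carlo: propagate uncertainty in satisfaction drivers to a
# predicted index score with a central estimate and a 90% interval.
# Driver names, weights, and ranges are illustrative assumptions only.

def simulate_index(n_trials=10000, seed=42):
    rng = random.Random(seed)
    weights = {"quality": 0.5, "value": 0.3, "expectations": 0.2}
    runs = []
    for _ in range(n_trials):
        # Sample each driver on a 0-100 scale from an assumed range.
        drivers = {k: rng.uniform(60, 90) for k in weights}
        score = sum(weights[k] * drivers[k] for k in weights)
        runs.append(score)
    runs.sort()
    mean = sum(runs) / n_trials
    p05 = runs[int(0.05 * n_trials)]  # 5th percentile
    p95 = runs[int(0.95 * n_trials)]  # 95th percentile
    return mean, p05, p95
```

The percentile band plays the role of a statistical control range for the predicted outcome; sensitivity analysis would vary one driver's range at a time and observe the effect on the interval.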
NASA Astrophysics Data System (ADS)
Zlotnik, Sergio
2017-04-01
Information provided by visualisation environments can be greatly increased if the data shown are combined with some relevant physical processes and the user is allowed to interact with those processes. This is particularly interesting in VR environments, where the user has a deep interplay with the data. For example, a geological seismic line in a 3D "cave" shows information on the geological structure of the subsoil. The available information could be enhanced with the thermal state of the region under study, with water-flow patterns in porous rocks, or with rock displacements under some stress conditions. The information added by the physical processes is usually the output of some numerical technique applied to solve a partial differential equation (PDE) that describes the underlying physics. Many techniques are available to obtain numerical solutions of PDEs (e.g., finite elements, finite volumes, finite differences). However, all these traditional techniques require very large computational resources (particularly in 3D), making them useless in a real-time visualization environment such as VR, because the time required to compute a solution is measured in minutes or even hours. We present here a novel alternative for the resolution of PDE-based problems that is able to provide 3D solutions for a very large family of problems in real time. That is, the solution is evaluated in a thousandth of a second, making the solver ideal to be embedded into VR environments. Based on model order reduction ideas, the proposed technique divides the computational work into a computationally intensive "offline" phase, which is run only once, and an "online" phase that allows real-time evaluation of any solution within a family of problems. Preliminary examples of real-time solutions of complex PDE-based problems will be presented, including thermal problems, flow problems, wave problems, and some simple coupled problems.
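The offline/online cost split can be illustrated with a deliberately simplified stand-in: instead of a true reduced-basis projection, the sketch precomputes expensive solutions at sampled parameter values offline and interpolates between them online. The toy 1D "solver" and its linear parameter dependence are assumptions for illustration only, not the paper's method.

```python
import bisect

# Illustrative offline/online split (not a full reduced-order model):
# offline, solve an expensive model at sampled parameter values once;
# online, answer any query by cheap interpolation between stored solutions.

def expensive_solve(mu, n=200):
    # Stand-in for a costly PDE solve: a steady 1D profile scaled by mu.
    return [mu * (x / n) * (1 - x / n) for x in range(n + 1)]

def offline(mus):
    """Run once: store solutions ("snapshots") at sorted parameter samples."""
    mus = sorted(mus)
    return mus, [expensive_solve(m) for m in mus]

def online(mu, mus, snapshots):
    """Millisecond-scale query: linear interpolation between snapshots."""
    i = bisect.bisect_left(mus, mu)
    if i == 0:
        return snapshots[0]
    if i == len(mus):
        return snapshots[-1]
    t = (mu - mus[i - 1]) / (mus[i] - mus[i - 1])
    return [(1 - t) * a + t * b for a, b in zip(snapshots[i - 1], snapshots[i])]
```

The online phase touches only stored vectors, which is what makes real-time evaluation inside a VR loop plausible; a genuine reduced-basis method replaces the interpolation with a projection of the PDE onto a low-dimensional basis.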
Multisource data fusion for documenting archaeological sites
NASA Astrophysics Data System (ADS)
Knyaz, Vladimir; Chibunichev, Alexander; Zhuravlev, Denis
2017-10-01
The quality of archaeological site documentation is of great importance for preserving and investigating cultural heritage. Progress in developing new techniques and systems for data acquisition and processing creates an excellent basis for achieving a new quality of archaeological site documentation and visualization. Archaeological data have some specific features which have to be taken into account during acquisition, processing, and management. First of all, it is necessary to gather information about findings as completely as possible, with no loss of information and no damage to artifacts. Remote sensing technologies are the most adequate and powerful means of satisfying this requirement. An approach to archaeological data acquisition and fusion based on remote sensing is proposed. It combines a set of photogrammetric techniques for obtaining geometrical and visual information at different scales and levels of detail with a pipeline for documenting, structuring, fusing, and analyzing archaeological data. The proposed approach is applied to the documentation of the Bosporus archaeological expedition of the Russian State Historical Museum.
Qubit Manipulations Techniques for Trapped-Ion Quantum Information Processing
NASA Astrophysics Data System (ADS)
Gaebler, John; Tan, Ting; Lin, Yiheng; Bowler, Ryan; Jost, John; Meier, Adam; Knill, Emanuel; Leibfried, Dietrich; Wineland, David; Ion Storage Team
2013-05-01
We report recent results on qubit manipulation techniques for trapped ions toward scalable quantum information processing (QIP). We demonstrate a platform-independent benchmarking protocol for evaluating the performance of Clifford gates, which form a basis for fault-tolerant QIP. We report a demonstration of an entangling gate scheme proposed by Bermudez et al. [Phys. Rev. A 85, 040302 (2012)] and achieve a fidelity of 0.974(4). This scheme takes advantage of dynamical decoupling, which protects the qubit against dephasing errors. It can be applied directly to magnetic-field-insensitive states and provides a number of simplifications in experimental implementation compared to some other entangling gates with trapped ions. We also report preliminary results on dissipative creation of entanglement with trapped ions. Creation of an entangled pair does not require discrete logic gates and thus could reduce the level of quantum-coherent control needed for large-scale QIP. Supported by IARPA, ARO contract No. EAO139840, ONR, and the NIST Quantum Information Program.
Mathematical models utilized in the retrieval of displacement information encoded in fringe patterns
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Lamberti, Luciano
2016-02-01
All techniques that measure displacements, whether in the range of visible optics or any other form of field method, require the presence of a carrier signal. A carrier signal is a waveform modulated (modified) by an input: the deformation of the medium. A carrier is tagged to the medium under analysis and deforms with it. The waveform must be known in both the unmodulated and the modulated conditions. There are two basic mathematical models that can be utilized to decode the information contained in the carrier: phase modulation and frequency modulation, which are closely connected. Basic problems connected to the detection and recovery of displacement information that are common to all optical techniques are analyzed in this paper, focusing on the general theory common to all the methods independently of the type of signal utilized. The aspects discussed are those that have practical impact on the process of data gathering and data processing.
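As one concrete example of decoding a phase-modulated carrier, the standard four-step phase-shifting formula recovers the phase from four intensity samples shifted by quarter periods. The signal model below is the usual textbook form for interferometric fringes, given here as illustration rather than as this paper's specific method.

```python
import math

# Four-step phase-shifting demodulation. With recorded intensities
#   I_k = A + B * cos(phi + (k-1) * pi/2),   k = 1..4,
# the differences give I4 - I2 = 2B sin(phi) and I1 - I3 = 2B cos(phi),
# so the modulating phase is phi = atan2(I4 - I2, I1 - I3), independent
# of the background A and the fringe contrast B.

def demodulate(I1, I2, I3, I4):
    return math.atan2(I4 - I2, I1 - I3)

def intensities(A, B, phi):
    """Synthesize the four phase-stepped intensity samples at one point."""
    return [A + B * math.cos(phi + k * math.pi / 2) for k in range(4)]
```

Applying `demodulate` pixel by pixel across four fringe images yields a wrapped phase map; unwrapping then converts it into a continuous displacement field.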
ERIC Educational Resources Information Center
Mangina, Eleni; Kilbride, John
2008-01-01
The research presented in this paper is an examination of the applicability of IUI techniques in an online e-learning environment. In particular, we make use of user modeling techniques, information retrieval and extraction mechanisms, and collaborative filtering methods. The domains of e-learning, web-based training and instruction and intelligent…
NASA Astrophysics Data System (ADS)
Modica, A.; Alberghina, M. F.; Brai, M.; Bruno, M.; Di Bella, M.; Fontana, D.; Tranchina, L.
2017-06-01
In the early period, even though professional photographers worked with similar techniques and products, their artistic and commercial aims determined different choices and led them to follow different, often personal, recipes. For this reason, identification of the techniques through the date and name of the photographer, or through visual features such as colour, tonality, and the surface of the image layer, often needs further investigation to be proved. Chemical characterization, carried out in a non- or micro-destructive way, can be crucial in providing useful information about the original composition, degradation processes, and realization technique, in obtaining an indirect dating of the photograph, and/or in choosing the most appropriate conservation treatment. In our case, X-ray fluorescence (XRF) analysis was used to confirm the chemical composition of eleven historical photographs dated between the end of the 19th century and the beginning of the 20th, shot in Palermo (Sicily) by a renowned photographer of the time and pasted on their original cardboards. The elemental identification, obtained with a nondestructive approach, provided important information to distinguish among different photographic techniques in terms of the distribution and characterization of chemical element markers on the photographic surface.
NASA Astrophysics Data System (ADS)
Arunachalam, M. S.; Puli, Anil; Anuradha, B.
2016-07-01
In the present work, an online retrieval technique for the continuous extraction of convective cloud optical information and reflectivity (MAX(Z) in dBZ) for time series data production from the Doppler Weather Radar (DWR) located at the Indian Meteorological Department, Chennai, has been developed in MATLAB. Reflectivity measurements for different locations within the DWR range, a circular disc area of 250 km radius, can be retrieved using this technique. It gives both the time series reflectivity of a point location and Range Time Intensity (RTI) maps of reflectivity for the corresponding location. The Graphical User Interface (GUI) developed for the cloud reflectivity is user friendly; it also provides convective cloud optical information such as cloud base height (CBH), cloud top height (CTH), and cloud optical depth (COD). This technique is also applicable to retrieving other DWR products such as Plan Position Indicator (Z, in dBZ), Plan Position Indicator (Z, in dBZ) Close Range, Volume Velocity Processing (V, in knots), Plan Position Indicator (V, in m/s), Surface Rainfall Intensity (SRI, mm/hr), and Precipitation Accumulation (PAC) 24 hrs at 0300 UTC.
Keywords: Reflectivity, cloud top height, cloud base height, cloud optical depth
Long term pavement performance computed parameter : frost penetration
DOT National Transportation Integrated Search
2008-11-01
As the pavement design process moves toward mechanistic-empirical techniques, knowledge of seasonal changes in pavement structural characteristics becomes critical. Specifically, frost penetration information is necessary for determining the effect o...
Combined X-ray CT and mass spectrometry for biomedical imaging applications
NASA Astrophysics Data System (ADS)
Schioppa, E., Jr.; Ellis, S.; Bruinen, A. L.; Visser, J.; Heeren, R. M. A.; Uher, J.; Koffeman, E.
2014-04-01
Imaging technologies play a key role in many branches of science, especially in biology and medicine. They provide an invaluable insight into both internal structure and processes within a broad range of samples. There are many techniques that allow one to obtain images of an object. Different techniques are based on the analysis of a particular sample property by means of a dedicated imaging system, and as such, each imaging modality provides the researcher with different information. The use of multimodal imaging (imaging with several different techniques) can provide additional and complementary information that is not available when employing a single imaging technique alone. In this study, we present for the first time a multimodal imaging technique in which X-ray computerized tomography (CT) is combined with mass spectrometry imaging (MSI). While X-ray CT provides 3-dimensional information on the internal structure of the sample based on X-ray absorption coefficients, MSI of thin sections acquired from the same sample allows the spatial distribution of many elements/molecules, each distinguished by its unique mass-to-charge ratio (m/z), to be determined within a single measurement and with a spatial resolution as low as 1 μm or even less. The aim of this work is to demonstrate how molecular information from MSI can be spatially correlated with 3D structural information acquired from X-ray CT. In these experiments, frozen samples are imaged in an X-ray CT setup using Medipix-based detectors equipped with a CO2-cooled sample holder. Single projections are pre-processed before tomographic reconstruction using a signal-to-thickness calibration. In the second step, the object is sliced into thin sections (circa 20 μm) that are then imaged using both matrix-assisted laser desorption/ionization (MALDI) and secondary ion (SIMS) mass spectrometry, in which the spatial distribution of specific molecules within the sample is determined.
The combination of two vastly different imaging approaches provides complementary information (i.e., anatomical and molecular distributions) that allows the correlation of distinct structural features with specific molecules distributions leading to unique insights in disease development.
Arc-Welding Spectroscopic Monitoring based on Feature Selection and Neural Networks.
Garcia-Allende, P Beatriz; Mirapeix, Jesus; Conde, Olga M; Cobo, Adolfo; Lopez-Higuera, Jose M
2008-10-21
A new spectral processing technique designed for application to the on-line detection and classification of arc-welding defects is presented in this paper. A noninvasive fiber sensor embedded within a TIG torch collects the plasma radiation originating during the welding process. The spectral information is then processed in two consecutive stages. A compression algorithm is first applied to the data, allowing real-time analysis. The selected spectral bands are then used to feed a classification algorithm, which is demonstrated to provide efficient weld defect detection and classification. The results obtained with the proposed technique are compared to a similar processing scheme presented in previous works, showing an improvement in the performance of the monitoring system.
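The two-stage idea (compress each spectrum to a few informative bands, then classify from those bands) can be sketched as follows. Variance-based band selection and a nearest-centroid classifier are simplified stand-ins for the paper's compression and neural-network stages, and the class labels are invented for illustration.

```python
# Sketch of a two-stage spectral pipeline: (1) keep only the most variable
# spectral bands, (2) classify a new spectrum from those bands alone.

def select_bands(spectra, n_bands):
    """Rank bands by variance across training spectra; keep the top n_bands."""
    n, dim = len(spectra), len(spectra[0])
    means = [sum(s[j] for s in spectra) / n for j in range(dim)]
    var = [sum((s[j] - means[j]) ** 2 for s in spectra) / n for j in range(dim)]
    return sorted(range(dim), key=lambda j: -var[j])[:n_bands]

def centroids(spectra, labels, bands):
    """Mean compressed spectrum per class."""
    groups = {}
    for s, y in zip(spectra, labels):
        groups.setdefault(y, []).append([s[j] for j in bands])
    return {y: [sum(col) / len(v) for col in zip(*v)] for y, v in groups.items()}

def classify(spectrum, bands, cents):
    """Assign the class whose centroid is nearest in the compressed space."""
    x = [spectrum[j] for j in bands]
    return min(cents, key=lambda y: sum((a - b) ** 2 for a, b in zip(x, cents[y])))
```

In the real system the compression stage must preserve the plasma emission lines that discriminate sound welds from defective ones, and the classifier is trained rather than centroid-based; the sketch only shows how compression and classification chain together.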
A Holistic Approach to Networked Information Systems Design and Analysis
2016-04-15
attain quite substantial savings. 11. Optimal algorithms for energy harvesting in wireless networks. We use a Markov-decision-process (MDP) based...approach to obtain optimal policies for transmissions. The key advantage of our approach is that it holistically considers information and energy in a...Coding technique to minimize delays and the number of transmissions in Wireless Systems. As we approach an era of ubiquitous computing with information
INFORMATION STORAGE AND RETRIEVAL, A STATE-OF-THE-ART REPORT
The objective of the study was to compile relevant background and interpretive material and prepare a state-of-the-art report which would put the...to-person communications. Section III presents basic IS and R concepts and techniques. It traces the history of traditional librarianship through...the process of communication between the originators and users of information. Section V categorizes the information system operations required to
The Budget Process in Schools of Nursing: A Primer for the Novice Administrator.
ERIC Educational Resources Information Center
Starck, Patricia L.; Bailes, Barbara
1996-01-01
This primer on budgets for nursing schools includes the budgetary process; budgeting techniques; and information about various types of budgets, such as the open-ended budget, incremental budget, quota budget, and alternate-level budget. Questions about budget structure, revenue sources, and budget management and evaluation are answered. (JOW)
The Impact of Integrating Visuals in an Elementary Creative Writing Process.
ERIC Educational Resources Information Center
Bailey, Margaret; And Others
Most children's books are filled with pictures, yet when schools design curricula to teach writing, they often ignore the role of visual images in the writing process. Historically, methods for teaching writing have focused on text. Even relatively recent techniques like brainstorming and story webbing still focus on verbal information. In some…
Event Related Brain Potentials and Cognitive Processing: Implications for Navy Training.
ERIC Educational Resources Information Center
Lewis, Gregory W.; And Others
The cognitive styles, aptitudes, and abilities of 50 right-handed subjects were measured through a battery of paper-and-pencil tests to determine the feasibility of using event related brain potentials (ERPs) in the development of adaptive training techniques keyed to the information processing styles of individual students. Visual, auditory, and…
Using deliberative techniques to engage the community in policy development.
Gregory, Judy; Hartz-Karp, Janette; Watson, Rebecca
2008-07-16
This paper examines work in deliberative approaches to community engagement used in Western Australia by the Department of Planning and Infrastructure and other planning and infrastructure agencies between 2001 and 2005, and considers whether the techniques could be applied to the development of health policy in Australia. Deliberative processes were used in WA to address specific planning and infrastructure problems. Using deliberative techniques, community participants contributed to joint decision making and policy development. Outcomes from deliberative processes were seriously considered by the Minister and used to influence policy decisions. In many cases, the recommendations generated through deliberative processes were fully adopted by the Minister. The experiences in WA demonstrate that deliberative engagement processes can be successfully implemented by government and can be used to guide policy. The techniques can be adapted to suit the context and issues experienced by a portfolio, and the skills required to conduct deliberative processes can be fostered amongst the portfolio's staff. Health policy makers may be able to learn from the experiences in WA, and adopt approaches to community engagement that allow for informed deliberation and debate in the community about the future of Australia's health system.
Akbarzadeh, Rosa; Yousefi, Azizeh-Mitra
2014-08-01
Tissue engineering makes use of 3D scaffolds to sustain three-dimensional growth of cells and guide new tissue formation. To meet the multiple requirements for regeneration of biological tissues and organs, a wide range of scaffold fabrication techniques have been developed, aiming to produce porous constructs with the desired pore size range and pore morphology. Among the different scaffold fabrication techniques, the thermally induced phase separation (TIPS) method has been widely used in recent years because of its potential to produce highly porous scaffolds with interconnected pore morphology. The scaffold architecture can be closely controlled by adjusting the process parameters, including polymer type and concentration, solvent composition, quenching temperature and time, coarsening process, and incorporation of inorganic particles. The objective of this review is to provide information pertaining to the effect of these parameters on the architecture and properties of the scaffolds fabricated by the TIPS technique. © 2014 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Zubarev, A. E.; Nadezhdina, I. E.; Brusnikin, E. S.; Karachevtseva, I. P.; Oberst, J.
2016-09-01
A new technique has been developed for the generation of coordinate control point networks based on photogrammetric processing of heterogeneous planetary images (obtained at different times and scales, with different illumination, or with oblique views). The technique is verified by processing the heterogeneous information obtained by remote sensing of Ganymede by the Voyager-1, Voyager-2, and Galileo spacecraft. Using this technique, the first 3D control point network for Ganymede is formed: the error of the altitude coordinates obtained as a result of adjustment is less than 5 km. The new control point network makes it possible to obtain basic geodesic parameters of the body (axis sizes) and to estimate forced librations. On the basis of the control point network, digital terrain models (DTMs) with different resolutions are generated and used for mapping the surface of Ganymede at different levels of detail (Zubarev et al., 2015b).
NASA Astrophysics Data System (ADS)
Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.
2016-01-01
Image processing is widely used in geo-processing and analysis; to extend its use, we introduce an alternative technique for the characterization of rock samples. The technique that we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results for the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window-resizing analysis technique is employed. The results of the morphology and surface roughness analysis show micron- and nano-scale information which is characteristic of each rock type and its history. These could be used for mineral identification and for studies of rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.
Principal components colour display of ERTS imagery
NASA Technical Reports Server (NTRS)
Taylor, M. M.
1974-01-01
In the technique presented, colours are not derived from single bands, but rather from independent linear combinations of the bands. Using a simple model of the processing done by the visual system, three informationally independent linear combinations of the four ERTS bands are mapped onto the three visual colour dimensions of brightness, redness-greenness and blueness-yellowness. The technique permits user-specific transformations which enhance particular features, but this is not usually needed, since a single transformation provides a picture which conveys much of the information implicit in the ERTS data. Examples of experimental vector images with matched individual band images are shown.
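As a sketch of the idea, a stand-in transform can be built from ordinary principal components: three informationally independent linear combinations of the four bands are stretched into three display channels. The ERTS-specific coefficients and the brightness/red-green/blue-yellow mapping of the paper are not reproduced here; `principal_component_colour` is an illustrative name.

```python
import numpy as np

def principal_component_colour(bands):
    """Map a 4-band image (H, W, 4) to three decorrelated display channels.

    Stand-in for the paper's transform: the first three principal
    components play the roles of the three visual colour dimensions.
    """
    h, w, n = bands.shape
    X = bands.reshape(-1, n).astype(float)
    X -= X.mean(axis=0)
    # Eigenvectors of the band covariance give independent linear combinations.
    cov = np.cov(X, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]          # largest-variance components first
    pcs = X @ vecs[:, order[:3]]            # three independent channels
    # Stretch each channel independently into the displayable 0..255 range.
    lo, hi = pcs.min(axis=0), pcs.max(axis=0)
    rgb = (pcs - lo) / np.where(hi - lo == 0, 1, hi - lo) * 255
    return rgb.reshape(h, w, 3).astype(np.uint8)
```

A user-specific transformation would simply replace the eigenvector matrix with hand-chosen band weights.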
The Successive Contributions of Computers to Education: A Survey.
ERIC Educational Resources Information Center
Lelouche, Ruddy
1998-01-01
Shows how education has successively benefited from traditional information processing through programmed instruction and computer-assisted instruction (CAI), artificial intelligence, intelligent CAI, intelligent tutoring systems, and hypermedia techniques. Contains 29 references. (DDR)
Platform for intraoperative analysis of video streams
NASA Astrophysics Data System (ADS)
Clements, Logan; Galloway, Robert L., Jr.
2004-05-01
Interactive, image-guided surgery (IIGS) has proven to increase the specificity of a variety of surgical procedures. However, current IIGS systems do not compensate for changes that occur intraoperatively and are not reflected in preoperative tomograms. Endoscopes and intraoperative ultrasound, used in minimally invasive surgery, provide real-time (RT) information in a surgical setting. Combining the information from RT imaging modalities with traditional IIGS techniques will further increase surgical specificity by providing enhanced anatomical information. In order to merge these techniques and obtain quantitative data from RT imaging modalities, a platform was developed to allow both the display and processing of video streams in RT. Using a Bandit-II CV frame grabber board (Coreco Imaging, St. Laurent, Quebec) and the associated library API, a dynamic link library was created in Microsoft Visual C++ 6.0 such that the platform could be incorporated into the IIGS system developed at Vanderbilt University. Performance characterization, using two relatively inexpensive host computers, has shown the platform capable of performing simple image processing operations on frames captured from a CCD camera and displaying the processed video data at near RT rates both independent of and while running the IIGS system.
Coding for Efficient Image Transmission
NASA Technical Reports Server (NTRS)
Rice, R. F.; Lee, J. J.
1986-01-01
NASA publication second in series on data-coding techniques for noiseless channels. Techniques usable even in noisy channels, provided data further processed with Reed-Solomon or other error-correcting code. Techniques discussed in context of transmission of monochrome imagery from Voyager II spacecraft but applicable to other streams of data. Objective of this type of coding to "compress" data; that is, to transmit using as few bits as possible by omitting as much as possible of portion of information repeated in subsequent samples (or picture elements).
1990-09-01
decisionmaking. (Adapted from: R. J. Boland, Jr., "Sense-Making of Accounting Data as a Technique of Organizational Diagnosis," Management Science, Vol. 30, No. 7 (July 1984), pp. 868-882.)
Stochastic Feedforward Control Technique
NASA Technical Reports Server (NTRS)
Halyo, Nesim
1990-01-01
Class of commanded trajectories modeled as stochastic process. Advanced Transport Operating Systems (ATOPS) research and development program conducted by NASA Langley Research Center aimed at developing capabilities for increases in capacities of airports, safe and accurate flight in adverse weather conditions including shear, winds, avoidance of wake vortexes, and reduced consumption of fuel. Advances in techniques for design of modern controls and increased capabilities of digital flight computers coupled with accurate guidance information from Microwave Landing System (MLS). Stochastic feedforward control technique developed within context of ATOPS program.
Dual-Mode Electro-Optical Techniques for Biosensing Applications: A Review
Juan-Colás, José; Johnson, Steven; Krauss, Thomas F
2017-09-07
The monitoring of biomolecular interactions is a key requirement for the study of complex biological processes and the diagnosis of disease. Technologies that are capable of providing label-free, real-time insight into these interactions are of great value for the scientific and clinical communities. Greater understanding of biomolecular interactions alongside increased detection accuracy can be achieved using technology that can provide parallel information about multiple parameters of a single biomolecular process. For example, electro-optical techniques combine optical and electrochemical information to provide more accurate and detailed measurements that provide unique insights into molecular structure and function. Here, we present a comparison of the main methods for electro-optical biosensing, namely, electrochemical surface plasmon resonance (EC-SPR), electrochemical optical waveguide lightmode spectroscopy (EC-OWLS), and the recently reported silicon-based electrophotonic approach. The comparison considers different application spaces, such as the detection of low concentrations of biomolecules, integration, the tailoring of light-matter interaction for the understanding of biomolecular processes, and 2D imaging of biointeractions on a surface. PMID:28880211
NASA Technical Reports Server (NTRS)
Messmore, J. A.
1976-01-01
The feasibility of using digital satellite imagery and automatic data processing techniques as a means of mapping swamp forest vegetation was considered, using multispectral scanner data acquired by the LANDSAT-1 satellite. The site for this investigation was the Dismal Swamp, a 210,000 acre swamp forest located south of Suffolk, Va. on the Virginia-North Carolina border. Two basic classification strategies were employed. The initial classification utilized unsupervised techniques which produced a map of the swamp indicating the distribution of thirteen forest spectral classes. These classes were later combined into three informational categories: Atlantic white cedar (Chamaecyparis thyoides), Loblolly pine (Pinus taeda), and deciduous forest. The subsequent classification employed supervised techniques which mapped Atlantic white cedar, Loblolly pine, deciduous forest, water and agriculture within the study site. A classification accuracy of 82.5% was produced by unsupervised techniques compared with 89% accuracy using supervised techniques.
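The unsupervised strategy can be illustrated with a plain k-means clustering of pixel spectra into spectral classes (the study's actual clustering algorithm is not specified; `kmeans_classify` and its farthest-first initialisation are assumptions for this sketch):

```python
import numpy as np

def kmeans_classify(pixels, k, iters=20):
    """Unsupervised spectral classification: cluster pixel vectors
    (n_pixels, n_bands) into k spectral classes with plain k-means."""
    # Farthest-first initialisation keeps the sketch deterministic.
    centres = [pixels[0]]
    for _ in range(k - 1):
        d = np.min([((pixels - c) ** 2).sum(-1) for c in centres], axis=0)
        centres.append(pixels[np.argmax(d)])
    centres = np.array(centres, dtype=float)
    for _ in range(iters):
        # Assign each pixel to its nearest spectral-class centre.
        labels = np.argmin(((pixels[:, None] - centres) ** 2).sum(-1), axis=1)
        # Recompute each centre as the mean spectrum of its members.
        for j in range(k):
            if np.any(labels == j):
                centres[j] = pixels[labels == j].mean(axis=0)
    return labels, centres
```

The resulting spectral classes would then be combined by an analyst into informational categories, as in the study; the supervised strategy instead starts from analyst-labelled training pixels.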
Wood transportation systems-a spin-off of a computerized information and mapping technique
William W. Phillips; Thomas J. Corcoran
1978-01-01
A computerized mapping system originally developed for planning the control of the spruce budworm in Maine has been extended into a tool for planning road net-work development and optimizing transportation costs. A budgetary process and a mathematical linear programming routine are used interactively with the mapping and information retrieval capabilities of the system...
Linking Reading and Writing: Concept Mapping as an Organizing Tactic.
ERIC Educational Resources Information Center
Osman-Jouchoux, Rionda
Writers often must summarize others' texts as part of their own work. To succeed at this, they must first read and understand new information and then transform that information to fulfill a specific purpose. Concept mapping, used as a visual organizing technique, can be an effective link between the two processes. In a preliminary study, students…
Image processing and recognition for biological images
Uchida, Seiichi
2013-01-01
This paper reviews image processing and pattern recognition techniques, which will be useful to analyze bioimages. Although this paper does not provide their technical details, it will be possible to grasp their main tasks and typical tools to handle the tasks. Image processing is a large research area to improve the visibility of an input image and acquire some valuable information from it. As the main tasks of image processing, this paper introduces gray-level transformation, binarization, image filtering, image segmentation, visual object tracking, optical flow and image registration. Image pattern recognition is the technique to classify an input image into one of the predefined classes and also has a large research area. This paper overviews its two main modules, that is, feature extraction module and classification module. Throughout the paper, it will be emphasized that bioimage is a very difficult target for even state-of-the-art image processing and pattern recognition techniques due to noises, deformations, etc. This paper is expected to be one tutorial guide to bridge biology and image processing researchers for their further collaboration to tackle such a difficult target. PMID:23560739
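As one concrete instance of the binarization task the review lists, Otsu's classic threshold selection can be sketched as follows (the review itself prescribes no single method):

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: choose the grey level that maximises the
    between-class variance of the resulting two pixel populations."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, -1.0
    w0 = 0.0      # pixels at or below the candidate threshold
    sum0 = 0.0    # their grey-level sum
    for t in range(256):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0                     # mean of the lower class
        m1 = (sum_all - sum0) / w1         # mean of the upper class
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```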
An Adaptive Kalman Filter Using a Simple Residual Tuning Method
NASA Technical Reports Server (NTRS)
Harman, Richard R.
1999-01-01
One difficulty in using Kalman filters in real world situations is the selection of the correct process noise, measurement noise, and initial state estimate and covariance. These parameters are commonly referred to as tuning parameters. Multiple methods have been developed to estimate these parameters. Most of those methods such as maximum likelihood, subspace, and observer Kalman Identification require extensive offline processing and are not suitable for real time processing. One technique, which is suitable for real time processing, is the residual tuning method. Any mismodeling of the filter tuning parameters will result in a non-white sequence for the filter measurement residuals. The residual tuning technique uses this information to estimate corrections to those tuning parameters. The actual implementation results in a set of sequential equations that run in parallel with the Kalman filter. A. H. Jazwinski developed a specialized version of this technique for estimation of process noise. Equations for the estimation of the measurement noise have also been developed. These algorithms are used to estimate the process noise and measurement noise for the Wide Field Infrared Explorer star tracker and gyro.
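A minimal scalar sketch of residual tuning, assuming a random-walk state model: the filter inflates its process-noise parameter when the recent residuals are larger than its own innovation variance predicts. This is a simplification in the spirit of Jazwinski's estimator, not the flight implementation, and the constants (`r`, `window`) are illustrative.

```python
import numpy as np

def adaptive_kalman(zs, q0=1e-3, r=0.5, window=10):
    """Scalar random-walk Kalman filter that retunes its process noise q
    from the filter measurement residuals."""
    x, p, q = zs[0], 1.0, q0
    residuals, estimates = [], []
    for z in zs[1:]:
        p_pred = p + q                      # predict (random-walk state)
        nu = z - x                          # innovation / measurement residual
        s = p_pred + r                      # predicted innovation variance
        k = p_pred / s
        x = x + k * nu                      # update state estimate
        p = (1 - k) * p_pred
        residuals.append(nu)
        estimates.append(x)
        # Residual tuning: a non-white / oversized residual sequence means
        # the tuning is mismodeled, so re-estimate q from recent residuals
        # (floored at q0, echoing the max(., 0) in Jazwinski's estimator).
        if len(residuals) >= window:
            c_hat = np.mean(np.square(residuals[-window:]))
            q = max(c_hat - p - r, q0)
    return np.array(estimates), q
```

As in the article, these tuning equations run sequentially in parallel with the filter itself.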
Visual Modelling of Data Warehousing Flows with UML Profiles
NASA Astrophysics Data System (ADS)
Pardillo, Jesús; Golfarelli, Matteo; Rizzi, Stefano; Trujillo, Juan
Data warehousing involves complex processes that transform source data through several stages to deliver suitable information ready to be analysed. Though many techniques for visual modelling of data warehouses from the static point of view have been devised, only a few attempts have been made to model the data flows involved in a data warehousing process. Besides, each attempt was mainly aimed at a specific application, such as ETL, OLAP, what-if analysis, or data mining. Data flows are typically very complex in this domain; for this reason, we argue, designers would greatly benefit from a technique for uniformly modelling data warehousing flows for all applications. In this paper, we propose an integrated visual modelling technique for data cubes and data flows. This technique is based on UML profiling; its feasibility is evaluated by means of a prototype implementation.
Development of a fusion approach selection tool
NASA Astrophysics Data System (ADS)
Pohl, C.; Zeng, Y.
2015-06-01
During the last decades, the number and quality of remote sensing satellite sensors available for Earth observation have grown significantly. The amount of available multi-sensor images, along with their increased spatial and spectral resolution, provides new challenges to Earth scientists. With a Fusion Approach Selection Tool (FAST), the remote sensing community would obtain access to an optimized and improved image processing technology. Remote sensing image fusion is a means of producing images containing information that is not inherent in any single image alone. In the meantime, the user has access to sophisticated commercial image fusion techniques, plus the option to tune the parameters of each individual technique to match the anticipated application. This leaves the operator with an uncountable number of options for combining remote sensing images, to say nothing of the selection of the appropriate images, resolution, and bands. Image fusion can be a machine- and time-consuming endeavour. In addition, it requires knowledge about remote sensing, image fusion, digital image processing, and the application. FAST shall provide the user with a quick overview of processing flows to choose from to reach the target. FAST will ask for available images, application parameters, and desired information, and will process this input to produce a workflow that quickly obtains the best results. It will optimize data and image fusion techniques, and it provides an overview of the possible results from which the user can choose the best. FAST will enable even inexperienced users to use advanced processing methods to maximize the benefit of multi-sensor image exploitation.
Computer-assisted techniques to evaluate fringe patterns
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Bhat, Gopalakrishna K.
1992-01-01
Strain measurement using interferometry requires an efficient way to extract the desired information from interferometric fringes. Availability of digital image processing systems makes it possible to use digital techniques for the analysis of fringes. In the past, there have been several developments in the area of one dimensional and two dimensional fringe analysis techniques, including the carrier fringe method (spatial heterodyning) and the phase stepping (quasi-heterodyning) technique. This paper presents some new developments in the area of two dimensional fringe analysis, including a phase stepping technique supplemented by the carrier fringe method and a two dimensional Fourier transform method to obtain the strain directly from the discontinuous phase contour map.
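The phase-stepping (quasi-heterodyne) step can be written compactly for the common four-step case, where the four fringe images differ by pi/2 phase shifts; the paper's hybrid method adds a carrier before this stage.

```python
import numpy as np

def four_step_phase(i0, i1, i2, i3):
    """Recover the wrapped phase from four fringe images acquired with
    pi/2 phase steps: I_k = A + B*cos(phi + k*pi/2).

    Since I3 - I1 = 2B*sin(phi) and I0 - I2 = 2B*cos(phi),
    atan2 of the two differences returns phi directly.
    """
    return np.arctan2(i3 - i1, i0 - i2)
```

The wrapped result still needs phase unwrapping before strains can be read off the continuous phase map.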
Application of Ensemble Detection and Analysis to Modeling Uncertainty in Non Stationary Process
NASA Technical Reports Server (NTRS)
Racette, Paul
2010-01-01
Characterization of non-stationary and nonlinear processes is a challenge in many engineering and scientific disciplines. Climate change modeling and projection, retrieving information from Doppler measurements of hydrometeors, and modeling calibration architectures and algorithms in microwave radiometers are example applications that can benefit from improvements in the modeling and analysis of non-stationary processes. Analyses of measured signals have traditionally been limited to a single measurement series. Ensemble Detection is a technique whereby mixing calibrated noise produces an ensemble measurement set. The collection of ensemble data sets enables new methods for analyzing random signals and offers powerful new approaches to studying and analyzing non-stationary processes. Derived information contained in the dynamic stochastic moments of a process will enable many novel applications.
Free Surface Downgoing VSP Multiple Imaging
NASA Astrophysics Data System (ADS)
Maula, Fahdi; Dac, Nguyen
2018-03-01
The common use of a vertical seismic profile (VSP) is to capture the reflection (upgoing) wavefield for well ties and other interpretation. Borehole seismic receivers capture reflections from below the well trajectory, so traditionally no seismic image information is obtained above the trajectory. Non-traditional processing of VSP multiples can extend imaging above the well trajectory, and this paper presents a case study of using downgoing VSP multiples for such imaging. In general VSP processing, upgoing and downgoing arrivals are separated: the upgoing wavefield is used for subsurface illumination, whereas the downgoing wavefield and multiples are normally excluded. Where the downgoing wavefield passes the reflectors several times (as a multiple), however, it carries reflection information, which can be used for a seismic tie up to the seabed and for identification of shallow hazards. One concept of downgoing imaging is widely known as the mirror-imaging technique. A case study from deep water offshore Vietnam is presented to demonstrate the robustness of the technique and the limitations encountered during its processing.
Semantics-driven modelling of user preferences for information retrieval in the biomedical domain.
Gladun, Anatoly; Rogushina, Julia; Valencia-García, Rafael; Béjar, Rodrigo Martínez
2013-03-01
A large amount of biomedical and genomic data are currently available on the Internet. However, data are distributed into heterogeneous biological information sources, with little or even no organization. Semantic technologies provide a consistent and reliable basis with which to confront the challenges involved in the organization, manipulation and visualization of data and knowledge. One of the knowledge representation techniques used in semantic processing is the ontology, which is commonly defined as a formal and explicit specification of a shared conceptualization of a domain of interest. The work presented here introduces a set of interoperable algorithms that can use domain and ontological information to improve information-retrieval processes. This work presents an ontology-based information-retrieval system for the biomedical domain. This system, with which some experiments have been carried out that are described in this paper, is based on the use of domain ontologies for the creation and normalization of lightweight ontologies that represent user preferences in a determined domain in order to improve information-retrieval processes.
Physics-based interactive volume manipulation for sharing surgical process.
Nakao, Megumi; Minato, Kotaro
2010-05-01
This paper presents a new set of techniques by which surgeons can interactively manipulate patient-specific volumetric models for sharing surgical process. To handle physical interaction between the surgical tools and organs, we propose a simple surface-constraint-based manipulation algorithm to consistently simulate common surgical manipulations such as grasping, holding and retraction. Our computation model is capable of simulating soft-tissue deformation and incision in real time. We also present visualization techniques in order to rapidly visualize time-varying, volumetric information on the deformed image. This paper demonstrates the success of the proposed methods in enabling the simulation of surgical processes, and the ways in which this simulation facilitates preoperative planning and rehearsal.
Holmes, Robert R.; Singh, Vijay P.
2016-01-01
The importance of streamflow data to the world’s economy, environmental health, and public safety continues to grow as the population increases. The collection of streamflow data is often an involved and complicated process. The quality of streamflow data hinges on such things as site selection, instrumentation selection, streamgage maintenance and quality assurance, proper discharge measurement techniques, and the development and continued verification of the streamflow rating. This chapter serves only as an overview of the streamflow data collection process as proper treatment of considerations, techniques, and quality assurance cannot be addressed adequately in the space limitations of this chapter. Readers with the need for the detailed information on the streamflow data collection process are referred to the many references noted in this chapter.
Defeating Adversary Network Intelligence Efforts with Active Cyber Defense Techniques
2008-06-01
Hide Things from Hackers: Processes, Principles, and Techniques," Journal of Information Warfare, 5(3): 26-40 (2006). Rosenau, William. Additional sources: Apel, Thomas. Generating Fingerprints of Network Servers and their Use in Honeypots. Thesis, Aachen University, Aachen.
ERIC Educational Resources Information Center
Molfese, Dennis L.; Molfese, Victoria J.; Kelly, Spencer
2001-01-01
This article provides an introduction to the use of event-related potential (ERP) approaches to study language processes. First, a brief history of the emergence of this technology is presented, followed by definitions, a theoretical overview, and a practical guide to conducting ERP studies. Examples of language studies that use this technique are…
Cockpit System Situational Awareness Modeling Tool
NASA Technical Reports Server (NTRS)
Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara
2004-01-01
This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.
MO-D-PinS Room/Hall E-00: MR Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2016-06-15
MRI, with its excellent soft tissue contrast and its ability to provide physiological as well as anatomical information, is becoming increasingly used in radiation therapy for treatment planning, image-guided radiation therapy, and treatment evaluation. This session will explore solutions to integrating MRI into the simulation process. Obstacles to using MRI for simulation include distortions and artifacts, image acquisition speed, complexity of imaging techniques, and lack of electron density information. In Partners in Solutions, vendor representatives will present their approaches to meeting these and other challenges. An increased awareness of how MRI simulation works will allow physicists to better understand and use this powerful technique. The speakers are all employees presenting information about their companies' products.
A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Rinehart, Aidan W.
2014-01-01
This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
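The residual-thresholding step of such an architecture can be sketched as follows, assuming the model predictions are already available; the piecewise linear engine model itself is not reproduced, and the 3-sigma rule is an illustrative choice.

```python
import numpy as np

def residual_monitor(sensed, predicted, nominal_sigma, k=3.0):
    """Flag samples whose residual (sensed minus model-predicted output)
    exceeds k standard deviations of the nominal residual spread."""
    residuals = np.asarray(sensed, float) - np.asarray(predicted, float)
    flags = np.abs(residuals) > k * nominal_sigma
    return residuals, flags
```

A better-tuned model (e.g. after the trim-point update the paper describes) shrinks the nominal residual spread, which is exactly why improving the trim points improves detection.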
A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Rinehart, Aidan Walker
2015-01-01
This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
ERIC Educational Resources Information Center
Bergeron, Pierrette; Hiller, Christine A.
2002-01-01
Reviews the evolution of competitive intelligence since 1994, including terminology and definitions and analytical techniques. Addresses the issue of ethics; explores how information technology supports the competitive intelligence process; and discusses education and training opportunities for competitive intelligence, including core competencies…
LIFE CYCLE ENGINEERING GUIDELINES
This document provides guidelines for the implementation of LCE concepts, information, and techniques in engineering products, systems, processes, and facilities. To make this document as practical and useable as possible, a unifying LCE framework is presented. Subsequent topics ...
Empirical study on neural network based predictive techniques for automatic number plate recognition
NASA Astrophysics Data System (ADS)
Shashidhara, M. S.; Indrakumar, S. S.
2011-10-01
The objective of this study is to provide an easy, accurate and effective technology for traffic control in Bangalore, based on image processing and laser beam technology. The core concept is automatic number plate recognition (ANPR): when a vehicle breaks the traffic rules at a signal, its number plate is recognized and the registration number is fetched automatically from the RTO office database; a notice with the penalty information is then sent to the vehicle owner's e-mail address, along with an SMS. In this paper, cameras with zooming options and laser beams are used to obtain accurate pictures, to which image processing techniques such as edge detection are applied to locate the vehicle and the number plate, and to read plates of several kinds: plain plates, plates with additional information, and plates in different fonts. The vehicle registration office database is accessed to identify the name, address and other information of the vehicle owner, and is updated to record the violation and penalty. A feed-forward artificial neural network is used for OCR. This is particularly important for glyphs that are visually similar, such as '8' and '9', and results in training sets of between 25,000 and 40,000 samples. Overtraining of the neural network is prevented by Bayesian regularization; the network output target is set to 0.05 when the input is not the desired glyph, and 0.95 for a correct input.
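The edge-detection stage can be illustrated with a plain Sobel gradient-magnitude operator, commonly used to find the high-edge-density regions where plates sit (a generic sketch, not the study's implementation):

```python
import numpy as np

def sobel_edges(img):
    """Sobel gradient magnitude of a 2D greyscale image."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)  # d/dx
    ky = kx.T                                                   # d/dy
    h, w = img.shape
    out = np.zeros((h, w))
    padded = np.pad(img.astype(float), 1, mode="edge")
    for y in range(h):
        for x in range(w):
            win = padded[y:y + 3, x:x + 3]
            gx, gy = (win * kx).sum(), (win * ky).sum()
            out[y, x] = np.hypot(gx, gy)
    return out
```

In a full pipeline the edge map would be thresholded and scanned for plate-shaped regions before character segmentation and OCR.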
Statistical process control based chart for information systems security
NASA Astrophysics Data System (ADS)
Khan, Mansoor S.; Cui, Lirong
2015-07-01
Intrusion detection systems have a highly significant role in securing computer networks and information systems. To assure the reliability and quality of computer networks and information systems, it is highly desirable to develop techniques that detect intrusions into information systems. We apply the concept of statistical process control (SPC) to intrusions in computer networks and information systems. In this article we propose an exponentially weighted moving average (EWMA) type quality monitoring scheme. Our proposed scheme has only one parameter, which differentiates it from past versions. We construct the control limits for the proposed scheme and investigate their effectiveness. We provide an industrial example for the sake of clarity for practitioners. We give a comparison of the proposed scheme with EWMA schemes and the p chart; finally, we provide some recommendations for future work.
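A textbook EWMA chart with time-varying control limits illustrates the general scheme; the paper's one-parameter variant differs in detail, and `mu0`/`sigma` here are assumed to be known from in-control data.

```python
import numpy as np

def ewma_chart(x, mu0, sigma, lam=0.2, L=3.0):
    """EWMA control chart: z_i = lam*x_i + (1-lam)*z_{i-1}, signalling
    when z_i leaves mu0 +/- L*sigma*sqrt(lam/(2-lam)*(1-(1-lam)^(2i)))."""
    x = np.asarray(x, float)
    z = np.empty_like(x)
    prev = mu0                       # the chart starts at the in-control mean
    out_of_control = []
    for i, xi in enumerate(x):
        prev = lam * xi + (1 - lam) * prev
        z[i] = prev
        half = L * sigma * np.sqrt(
            lam / (2 - lam) * (1 - (1 - lam) ** (2 * (i + 1))))
        if abs(prev - mu0) > half:
            out_of_control.append(i)
    return z, out_of_control
```

For intrusion monitoring, `x` would be a per-interval network statistic (e.g. connection counts); small `lam` makes the chart sensitive to the sustained small shifts typical of stealthy activity.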
Subic-Wrana, Claudia; Greenberg, Leslie S; Lane, Richard D; Michal, Matthias; Wiltink, Jörg; Beutel, Manfred E
2016-09-01
Affective change has been considered the hallmark of therapeutic change in psychoanalysis. Psychoanalytic writers have begun to incorporate theoretically the advanced understanding of emotional processing and transformation of the affective neurosciences. We ask if this theoretical advancement is reflected in treatment techniques addressing the processing of emotion. We review psychoanalytic models and treatment recommendations of maladaptive affect processing in the light of a neuroscientifically informed model of achieving psychotherapeutic change by activation and reconsolidation of emotional memory. Emotions tend to be treated as other mental contents, resulting in a lack of specific psychodynamic techniques to work with emotions. Manualized technical modifications addressing affect regulation have been successfully tested in patients with personality pathology, but not for psychodynamic treatments of axis I disorders. Emotional memories need to be activated in order to be modified, therefore, we propose to include techniques into psychodynamic therapy that stimulate emotional experience.
NASA Technical Reports Server (NTRS)
Kim, Young-Joon; Pak, Kyung S.; Dunbar, R. Scott; Hsiao, S. Vincent; Callahan, Philip S.
2000-01-01
Planetary boundary layer (PBL) models are utilized to enhance directional ambiguity removal skill in scatterometer data processing. The ambiguity in wind direction retrieved from scatterometer measurements is removed with the aid of physical directional information obtained from PBL models. This technique is based on the observation that sea level pressure is scalar and its field is more coherent than the corresponding wind. An initial wind field obtained from the scatterometer measurements is used to derive a pressure field with a PBL model. After filtering small-scale noise in the derived pressure field, a wind field is generated with an inverted PBL model. This derived wind information is then used to remove wind vector ambiguities in the scatterometer data. It is found that the ambiguity removal skill can be improved when the new technique is used properly in conjunction with the median filter being used for scatterometer wind dealiasing at JPL. The new technique is applied to regions of cyclone systems which are important for accurate weather prediction but where the errors of ambiguity removal are often large.
Process mining techniques: an application to time management
NASA Astrophysics Data System (ADS)
Khowaja, Ali Raza
2018-04-01
In any environment, people must ensure that all of their work is completed on time and to the required quality. To realize the promise of process mining, these processes need to be understood in detail. Personal information and communication have always been prominent issues on the internet, but information and communication tools in daily life now capture schedules, location analysis, and environmental context; social media applications support these systems and make data available through event logs, enabling both data analysis and process analysis that combines environmental and location information. Process mining can exploit these real-life processes using the event logs already available in such datasets, whether from user-censored or user-labeled data. These processes can be used to redesign a user's workflow and to understand the processes in more detail. Improving the quality of the processes we go through in our daily lives requires a closer look at each process, followed by analysis and changes to obtain better results. Accordingly, we applied process mining techniques to a single dataset, collected in Korea, combining seven different subjects. Above all, the paper comments on the efficiency of the processes in the event logs with respect to time management.
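The mining technique applied to the Korean dataset is not specified; one elementary building block of most process-discovery methods is the directly-follows graph extracted from an event log. A minimal sketch (the `(case_id, activity, timestamp)` log layout is an assumption):

```python
from collections import Counter

def directly_follows(event_log):
    """Count how often activity a is directly followed by activity b
    within the same case -- the basic relation behind many process-
    discovery algorithms (e.g. the alpha miner)."""
    by_case = {}
    for case_id, activity, ts in sorted(event_log, key=lambda e: (e[0], e[2])):
        by_case.setdefault(case_id, []).append(activity)
    dfg = Counter()
    for trace in by_case.values():
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

# hypothetical daily-schedule log: (case, activity, timestamp)
log = [
    ("day1", "wake", 1), ("day1", "commute", 2), ("day1", "work", 3),
    ("day2", "wake", 1), ("day2", "work", 2), ("day2", "commute", 3),
]
print(directly_follows(log))  # ('wake','commute') once, ('wake','work') once
```

Edges with high counts reveal the dominant ordering of a person's activities; rare edges flag deviations, which is where time-management analysis of the kind described above would start.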
Processing of multispectral thermal IR data for geologic applications
NASA Technical Reports Server (NTRS)
Kahle, A. B.; Madura, D. P.; Soha, J. M.
1979-01-01
Multispectral thermal IR data were acquired with a 24-channel scanner flown in an aircraft over the E. Tintic Utah mining district. These digital image data required extensive computer processing in order to put the information into a format useful for a geologic photointerpreter. Simple enhancement procedures were not sufficient to reveal the total information content because the data were highly correlated in all channels. The data were shown to be dominated by temperature variations across the scene, while the much more subtle spectral variations between the different rock types were of interest. The image processing techniques employed to analyze these data are described.
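When all channels are highly correlated, a standard decorrelation step (one plausible ingredient of such processing, not necessarily the authors' exact pipeline) is a principal-component transform:

```python
import numpy as np

def principal_components(bands):
    """Decorrelate a (n_bands, n_pixels) stack of co-registered channels.

    Returns the PC images ordered by decreasing variance: the leading
    component typically captures the dominant shared signal (here,
    scene temperature), later ones the subtler inter-band (spectral)
    differences of interest.
    """
    x = bands - bands.mean(axis=1, keepdims=True)
    cov = np.cov(x)
    eigvals, eigvecs = np.linalg.eigh(cov)        # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]             # sort descending
    return eigvecs[:, order].T @ x, eigvals[order]

# two synthetic, strongly correlated "bands"
rng = np.random.default_rng(1)
temperature = rng.normal(0, 10, 1000)   # dominant shared signal
spectral = rng.normal(0, 1, 1000)       # subtle per-band difference
stack = np.vstack([temperature + spectral, temperature - spectral])
pcs, variances = principal_components(stack)
print(variances)  # first PC variance far exceeds the second
```

Stretching the low-variance components after such a transform is what reveals the subtle spectral variations that plain per-channel enhancement misses.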
Materials characterization of propellants using ultrasonics
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Jones, David
1993-01-01
Propellant characteristics for solid rocket motors have not been completely determined for use as processing variables in today's production facilities. This task comprised a major effort to determine which propellant characteristics are obtainable through ultrasonic measurement techniques. The information obtained was then used to assess the uniformity of manufacturing methods and/or the ability to detect non-uniformity in processes.
Advanced Natural Language Processing and Temporal Mining for Clinical Discovery
ERIC Educational Resources Information Center
Mehrabi, Saeed
2016-01-01
There has been vast and growing amount of healthcare data especially with the rapid adoption of electronic health records (EHRs) as a result of the HITECH act of 2009. It is estimated that around 80% of the clinical information resides in the unstructured narrative of an EHR. Recently, natural language processing (NLP) techniques have offered…
ERIC Educational Resources Information Center
Myers, Trina; Monypenny, Richard; Trevathan, Jarrod
2012-01-01
Two significant problems faced by universities are to ensure sustainability and to produce quality graduates. Four aspects of these problems are to improve engagement, to foster interaction, develop required skills and to effectively gauge the level of attention and comprehension within lectures and large tutorials. Process-Oriented Guided Inquiry…
An Exploratory Study of User Searching of the World Wide Web: A Holistic Approach.
ERIC Educational Resources Information Center
Wang, Peiling; Tenopir, Carol; Laymman, Elizabeth; Penniman, David; Collins, Shawn
1998-01-01
Examines Web users' behaviors and needs and tests a methodology for studying users' interaction with the Web. A process-tracing technique, together with tests of cognitive style, anxiety levels, and self-report computer experience, provided data on how users interact with the Web in the process of finding factual information. (Author/AEF)
Instantaneous Wavenumber Estimation for Damage Quantification in Layered Plate Structures
NASA Technical Reports Server (NTRS)
Mesnil, Olivier; Leckey, Cara A. C.; Ruzzene, Massimo
2014-01-01
This paper illustrates the application of instantaneous and local wavenumber damage quantification techniques for high frequency guided wave interrogation. The proposed methodologies can be considered as first steps towards a hybrid structural health monitoring/nondestructive evaluation (SHM/NDE) approach for damage assessment in composites. The challenges and opportunities related to the considered type of interrogation and signal processing are explored through the analysis of numerical data obtained via EFIT simulations of damage in CFRP plates. Realistic damage configurations are modeled from x-ray CT scan data of plates subjected to actual impacts, in order to accurately predict wave-damage interactions in terms of scattering and mode conversions. Simulation data is utilized to enhance the information provided by instantaneous and local wavenumbers and mitigate the complexity related to the multi-modal content of the plate response. Signal processing strategies considered for this purpose include modal decoupling through filtering in the frequency/wavenumber domain, the combination of displacement components, and the exploitation of polarization information for the various modes as evaluated through the dispersion analysis of the considered laminate lay-up sequence. The results presented assess the effectiveness of the proposed wavefield processing techniques as a hybrid SHM/NDE technique for damage detection and quantification in composite, plate-like structures.
Process-driven selection of information systems for healthcare
NASA Astrophysics Data System (ADS)
Mills, Stephen F.; Yeh, Raymond T.; Giroir, Brett P.; Tanik, Murat M.
1995-05-01
Integration of networking and data management technologies such as PACS, RIS and HIS into a healthcare enterprise in a clinically acceptable manner is a difficult problem. Data within such a facility are generally managed via a combination of manual hardcopy systems and proprietary, special-purpose data processing systems. Process modeling techniques have been successfully applied to engineering and manufacturing enterprises, but have not generally been applied to service-based enterprises such as healthcare facilities. The use of process modeling techniques can provide guidance for the placement, configuration and usage of PACS and other informatics technologies within the healthcare enterprise, and thus improve the quality of healthcare. Initial process modeling activities conducted within the Pediatric ICU at Children's Medical Center in Dallas, Texas are described. The ongoing development of a full enterprise- level model for the Pediatric ICU is also described.
NASA Astrophysics Data System (ADS)
Mirapeix, J.; García-Allende, P. B.; Cobo, A.; Conde, O.; López-Higuera, J. M.
2007-07-01
A new spectral processing technique designed for application to the on-line detection and classification of arc-welding defects is presented in this paper. A non-invasive fiber sensor embedded within a TIG torch collects the plasma radiation originated during the welding process. The spectral information is then processed in two consecutive stages. A compression algorithm is first applied to the data, allowing real-time analysis. The selected spectral bands are then used to feed a classification algorithm, which is demonstrated to provide efficient weld defect detection and classification. The results obtained with the proposed technique are compared to a similar processing scheme presented in a previous paper, showing an improvement in the performance of the monitoring system.
A fault-tolerant information processing concept for space vehicles.
NASA Technical Reports Server (NTRS)
Hopkins, A. L., Jr.
1971-01-01
A distributed fault-tolerant information processing system is proposed, comprising a central multiprocessor, dedicated local processors, and multiplexed input-output buses connecting them together. The processors in the multiprocessor are duplicated for error detection, which is felt to be less expensive than using coded redundancy of comparable effectiveness. Error recovery is made possible by a triplicated scratchpad memory in each processor. The main multiprocessor memory uses replicated memory for error detection and correction. Local processors use any of three conventional redundancy techniques: voting, duplex pairs with backup, and duplex pairs in independent subsystems.
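Of the three redundancy techniques named, voting is the simplest to illustrate. A bitwise 2-of-3 majority voter (a generic sketch, not the flight hardware's design) can be written as:

```python
def majority_vote(a, b, c):
    """Bitwise 2-of-3 majority -- the classical TMR voter: each output
    bit takes the value held by at least two of the three replicas,
    so any single faulty replica is masked."""
    return (a & b) | (a & c) | (b & c)

# one replica returns a corrupted word; the voter masks the fault
good = 0b10110101
assert majority_vote(good, good, 0b00001111) == good
print(bin(majority_vote(good, good, 0b00001111)))  # 0b10110101
```

Duplex pairs, by contrast, can only detect a disagreement, not resolve it, which is why the proposal above pairs duplication with a triplicated scratchpad memory for recovery.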
Selecting practice management information systems.
Worley, R; Ciotti, V
1997-01-01
Despite enormous advances in information systems, the process by which most medical practices select them has remained virtually unchanged for decades: the request for proposal (RFP). Unfortunately, vendors have learned ways to minimize the value of RFP checklists, to the point where purchasers now learn little about system functionality. The authors describe a selection methodology that replaces the RFP with scored demos, reviews of vendor user manuals and mathematically structured reference checking. In a recent selection process at a major medical center, these techniques yielded greater user buy-in and favorable contract terms as well.
Spatial Statistical Data Fusion for Remote Sensing Applications
NASA Technical Reports Server (NTRS)
Nguyen, Hai
2010-01-01
Data fusion is the process of combining information from heterogeneous sources into a single composite picture of the relevant process, such that the composite picture is generally more accurate and complete than that derived from any single source alone. Data collection is often incomplete, sparse, and yields incompatible information. Fusion techniques can make optimal use of such data. When investment in data collection is high, fusion gives the best return. Our study uses data from two satellites: (1) Multiangle Imaging SpectroRadiometer (MISR), (2) Moderate Resolution Imaging Spectroradiometer (MODIS).
Automated image processing of Landsat II digital data for watershed runoff prediction
NASA Technical Reports Server (NTRS)
Sasso, R. R.; Jensen, J. R.; Estes, J. E.
1977-01-01
Digital image processing of Landsat data from a 230 sq km area was examined as a possible means of generating soil cover information for use in the watershed runoff prediction of Kern County, California. The soil cover information included data on brush, grass, pasture lands and forests. A classification accuracy of 94% for the Landsat-based soil cover survey suggested that the technique could be applied to the watershed runoff estimate. However, problems involving the survey of complex mountainous environments may require further attention.
Staccini, Pascal M.; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius
2002-01-01
Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate the conceptual applicability and practical understandability by clinical staff and members of users teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of clinical context while providing elements for analysis. One of the solutions for filling this gap is to consider the process model itself in the role of a hub as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed notably regarding to its ability to generate data dictionaries and to be used as a navigation tool through the medium of hospital-wide documentation. PMID:12463921
Multi-Sensor Documentation of Metric and Qualitative Information of Historic Stone Structures
NASA Astrophysics Data System (ADS)
Adamopoulos, E.; Tsilimantou, E.; Keramidas, V.; Apostolopoulou, M.; Karoglou, M.; Tapinaki, S.; Ioannidis, C.; Georgopoulos, A.; Moropoulou, A.
2017-08-01
This paper focuses on the integration of multi-sensor techniques for the acquisition, processing, visualisation and management of data on historic stone structures. The interdisciplinary methodology carried out here comprises two parts. The first part concerns the acquisition of qualitative and quantitative data on the geometry, the materials and the degradation of the tangible heritage asset. The second part refers to the analysis, management and visualisation of the interrelated data using spatial information technologies. Through the example of the surveying of the ancient temple of Pythian Apollo at the Acropolis of Rhodes, Rhodes Island, Greece, we aim to highlight the issues arising from the separate application of documentation procedures and how the fusion of these methods can contribute effectively to ensuring the completeness of the measurements for complex structures. The surveying results are further processed to be compatible with and integrated into GIS. The geometric documentation derivatives are also combined with environmental data and with the results of non-destructive testing and evaluation techniques applied in situ and analytical techniques applied in the laboratory after sampling. GIS operations are utilized to document the building materials and to model and analyse the extent and patterns of decay. Detailed surface measurements and geo-processing analyses are executed. This integrated approach helps in assessing past interventions on the monument, identifying the main causes of damage and decay, and finally assists decision making on the most compatible materials and techniques for protection and restoration works.
AOD furnace splash soft-sensor in the smelting process based on improved BP neural network
NASA Astrophysics Data System (ADS)
Ma, Haitao; Wang, Shanshan; Wu, Libin; Yu, Ying
2017-11-01
For the argon oxygen refining process used in low-carbon ferrochrome production, splash during smelting is taken as the research object. Based on an analysis of the splash mechanism in the smelting process, this paper proposes an approach that combines multi-sensor information fusion with BP neural network modeling techniques. The vibration signal, the audio signal and the flame image signal from the furnace are used as the characteristic signals of splash; these signals are fused and modeled to reconstruct the splash signal, realizing soft measurement of splash in the smelting process. Simulation results show that the method can accurately forecast the splash type during smelting, providing a new measurement method for splash forecasting and more accurate information for splash control.
Invariance algorithms for processing NDE signals
NASA Astrophysics Data System (ADS)
Mandayam, Shreekanth; Udpa, Lalita; Udpa, Satish S.; Lord, William
1996-11-01
Signals that are obtained in a variety of nondestructive evaluation (NDE) processes capture information not only about the characteristics of the flaw, but also reflect variations in the specimen's material properties. Such signal changes may be viewed as anomalies that could obscure defect related information. An example of this situation occurs during in-line inspection of gas transmission pipelines. The magnetic flux leakage (MFL) method is used to conduct noninvasive measurements of the integrity of the pipe-wall. The MFL signals contain information both about the permeability of the pipe-wall and the dimensions of the flaw. Similar operational effects can be found in other NDE processes. This paper presents algorithms to render NDE signals invariant to selected test parameters, while retaining defect related information. Wavelet transform based neural network techniques are employed to develop the invariance algorithms. The invariance transformation is shown to be a necessary pre-processing step for subsequent defect characterization and visualization schemes. Results demonstrating the successful application of the method are presented.
Diffusion processes in tumors: A nuclear medicine approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amaya, Helman, E-mail: haamayae@unal.edu.co
The number of counts used in nuclear medicine imaging techniques provides only physical information about the disintegration of the nuclei present in the radiotracer molecules taken up in a particular anatomical region; that information is not true metabolic information. For this reason, a mathematical method was used to find a correlation between the number of counts and 18F-FDG mass concentration. This correlation allows a better interpretation of the results obtained in the study of diffusive processes in an agar phantom, and based on it, an image from the PETCETIX DICOM sample image set from the OsiriX-viewer software was processed. PET-CT gradient-magnitude and Laplacian images can show direct information on diffusive processes for radiopharmaceuticals that enter cells by simple diffusion. In the case of the radiopharmaceutical 18F-FDG, it is necessary to include pharmacokinetic models to make a correct interpretation of the gradient-magnitude and Laplacian count images.
Total quality management - It works for aerospace information services
NASA Technical Reports Server (NTRS)
Erwin, James; Eberline, Carl; Colquitt, Wanda
1993-01-01
Today we are in the midst of information and 'total quality' revolutions. At the NASA STI Program's Center for AeroSpace Information (CASI), we are focused on using continuous improvement techniques to enrich today's services and products and to ensure that tomorrow's technology supports the TQM-based improvement of future STI program products and services. The Continuous Improvements Program at CASI is the foundation for Total Quality Management in products and services. The focus is customer-driven; its goal is to identify processes and procedures that can be improved and new technologies that can be integrated with the processes to gain efficiencies, provide effectiveness, and promote customer satisfaction. This Program seeks to establish quality through an iterative defect-prevention approach based on the incorporation of standards and measurements into the processing cycle.
LANDSAT information for state planning
NASA Technical Reports Server (NTRS)
Faust, N. L.; Spann, G. W.
1977-01-01
The transfer of remote sensing technology for the digital processing of LANDSAT data to state and local agencies in Georgia and other southeastern states is discussed. The project consists of a series of workshops, seminars, and demonstration efforts, and the transfer of NASA-developed hardware concepts and computer software to state agencies. Throughout the multi-year effort, digital processing techniques, particularly classification algorithms, have been emphasized. Software for LANDSAT data rectification and processing has been developed and/or transferred. A hardware system is available at EES (engineering experiment station) to allow user-interactive processing of LANDSAT data. Seminars and workshops emphasize the digital approach to LANDSAT data utilization and the system improvements scheduled for LANDSATs C and D. Results of the project indicate a substantially increased awareness of the utility of digital LANDSAT processing techniques among the agencies contacted throughout the southeast. In Georgia, several agencies have jointly funded a program to map the entire state using digitally processed LANDSAT data.
Estimating the decomposition of predictive information in multivariate systems
NASA Astrophysics Data System (ADS)
Faes, Luca; Kugiumtzis, Dimitris; Nollo, Giandomenico; Jurysta, Fabrice; Marinazzo, Daniele
2015-03-01
In the study of complex systems from observed multivariate time series, the evolution of one system under investigation can be explained in terms of the information storage of the system itself and the information transfer from other interacting systems. We present a framework for the model-free estimation of information storage and information transfer computed as the terms composing the predictive information about the target of a multivariate dynamical process. The approach tackles the curse of dimensionality employing a nonuniform embedding scheme that selects progressively, among the past components of the multivariate process, only those that contribute most, in terms of conditional mutual information, to the present target process. Moreover, it computes all information-theoretic quantities using a nearest-neighbor technique designed to compensate the bias due to the different dimensionality of individual entropy terms. The resulting estimators of prediction entropy, storage entropy, transfer entropy, and partial transfer entropy are tested on simulations of coupled linear stochastic and nonlinear deterministic dynamic processes, demonstrating the superiority of the proposed approach over the traditional estimators based on uniform embedding. The framework is then applied to multivariate physiologic time series, resulting in physiologically well-interpretable information decompositions of cardiovascular and cardiorespiratory interactions during head-up tilt and of joint brain-heart dynamics during sleep.
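The paper's estimators are model-free and nearest-neighbor based; as a much simpler illustration of the same quantity, transfer entropy has a closed form in the linear-Gaussian case, as half the log-ratio of the restricted and full prediction-error variances (a sketch, not the authors' method):

```python
import numpy as np

def gaussian_transfer_entropy(x, y, lag=1):
    """Linear-Gaussian transfer entropy from x to y, in nats.

    TE = 0.5 * ln( var(e_restricted) / var(e_full) ), where the
    restricted model predicts y[t] from y[t-lag] only and the full
    model also uses x[t-lag]. This is the Gaussian special case of
    the model-free transfer entropy discussed in the abstract.
    """
    yt, ypast, xpast = y[lag:], y[:-lag], x[:-lag]

    def resid_var(target, *regressors):
        A = np.column_stack([np.ones_like(target), *regressors])
        coef, *_ = np.linalg.lstsq(A, target, rcond=None)
        return (target - A @ coef).var()

    return 0.5 * np.log(resid_var(yt, ypast) / resid_var(yt, ypast, xpast))

# coupled pair: y is driven by the past of x, with no feedback
rng = np.random.default_rng(2)
n = 5000
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * x[t - 1] + 0.3 * y[t - 1] + 0.5 * rng.normal()
print(gaussian_transfer_entropy(x, y))  # clearly positive
print(gaussian_transfer_entropy(y, x))  # near zero
```

The nonuniform-embedding and nearest-neighbor machinery in the paper generalizes exactly this variance-ratio idea to nonlinear, higher-dimensional dynamics.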
Applications integration in a hybrid cloud computing environment: modelling and platform
NASA Astrophysics Data System (ADS)
Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang
2013-08-01
With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services, and even infrastructure services, provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications such as data storage, computing processes, document sharing and even management information system services as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds together with their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.
NASA Astrophysics Data System (ADS)
Dostal, P.; Krasula, L.; Klima, M.
2012-06-01
Various image processing techniques in multimedia technology are optimized using the visual attention feature of the human visual system. Spatial non-uniformity means that different locations in an image are of different importance in terms of perception; in other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest (ROI). The performance of such techniques is measured by subjective evaluation or by objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM are based on image structural information, VIF on the information that the human brain can ideally gain from the reference image, and FSIM utilizes low-level features to assign different importance to each location in the image. Still, none of these objective metrics incorporates an analysis of regions of interest. We address the question of whether these objective metrics can be used for effective evaluation of images reconstructed by processing techniques based on ROI analysis utilizing high-level features. In this paper the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROI were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed, reconstructing the ROI in fine quality while the rest of the image is reconstructed in low quality. The color image reconstructed by this ROI approach was compared with selected demosaicing techniques by objective criteria and subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis, and new criteria are needed.
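For context on the metrics named, SSIM compares luminance, contrast and structure between a reference and a distorted image. A simplified single-window version (real SSIM averages the index over local sliding windows; this global form is only an illustration) might look like:

```python
import numpy as np

def global_ssim(x, y, data_range=255.0):
    """Single-window SSIM index for two equally sized grayscale images.

    Combines a luminance term (means), and contrast/structure terms
    (variances and covariance). Real SSIM computes this per local
    window and averages; here one window covers the whole image.
    """
    x, y = x.astype(float), y.astype(float)
    c1 = (0.01 * data_range) ** 2   # standard stabilising constants
    c2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / (
        (mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

rng = np.random.default_rng(4)
img = rng.integers(0, 256, (64, 64))
noisy = np.clip(img + rng.normal(0, 10, (64, 64)), 0, 255)
print(global_ssim(img, img))    # 1.0 for identical images
print(global_ssim(img, noisy))  # below 1.0 under noise
```

Because every pixel contributes equally here, this index is exactly the kind of ROI-blind measure the abstract argues against: distortion inside a region of interest and distortion in the background are weighted identically.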
ERIC Educational Resources Information Center
Mitri, Michel
2012-01-01
XML has become the most ubiquitous format for exchange of data between applications running on the Internet. Most Web Services provide their information to clients in the form of XML. The ability to process complex XML documents in order to extract relevant information is becoming as important a skill for IS students to master as querying…
Assessing Mission Impact of Cyberattacks: Report of the NATO IST-128 Workshop
2015-12-01
simulation) perspective. This would be natural, considering that the cybersecurity problem is highly adversarial in nature. Because it involves intelligent ...be formulated as a partial information game; artificial intelligence techniques might help here. Yet another style of problem formulation that...computational information processing for weapons, intelligence , communication, and logistics systems continues to increase the vulnerability of
Human Dimensions in Future Battle Command Systems: A Workshop Report
2008-04-01
information processing). These dimensions can best be described anecdotally and metaphorically as: • Battle command is a human-centric...enhance information visualization techniques in the decision tools, including multimodal platforms: video, graphics, symbols, etc. This should be...organization members. Each dimension can metaphorically represent the spatial location of individuals and group thinking in a trajectory of social norms
ERIC Educational Resources Information Center
Herrera-Viedma, Enrique; Peis, Eduardo
2003-01-01
Presents a fuzzy evaluation method of SGML documents based on computing with words. Topics include filtering the amount of information available on the Web to assist users in their search processes; document type definitions; linguistic modeling; user-system interaction; and use with XML and other markup languages. (Author/LRW)
Application of the Near Miss Strategy and Edit Distance to Handle Dirty Data
NASA Astrophysics Data System (ADS)
Varol, Cihan; Bayrak, Coskun; Wagner, Rick; Goff, Dana
In today’s information age, processing customer information in a standardized and accurate manner is known to be a difficult task. Data collection methods vary from source to source by format, volume, and media type. Therefore, it is advantageous to deploy customized data hygiene techniques to standardize the data for meaningfulness and usefulness based on the organization.
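The edit distance underlying the near-miss strategy is the standard Levenshtein distance; a compact dynamic-programming sketch (the paper's exact variant and matching thresholds are not given):

```python
def edit_distance(a, b):
    """Levenshtein distance: minimum number of single-character
    insertions, deletions and substitutions turning a into b.
    Uses two rolling rows of the DP table, O(len(b)) memory."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(
                prev[j] + 1,                  # deletion from a
                curr[j - 1] + 1,              # insertion into a
                prev[j - 1] + (ca != cb),     # substitution (free if equal)
            ))
        prev = curr
    return prev[-1]

# a dirty-data near miss: a misspelled street name
print(edit_distance("Pennsylvania Ave", "Pensylvania Ave"))  # 1
```

A small distance between a dirty field and a reference value is what flags a "near miss" candidate for standardisation, as opposed to a genuinely different value.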
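The edit-distance core of such a near-miss correction strategy can be sketched in Python. This is only an illustrative sketch of the general technique; the helper names (`edit_distance`, `near_miss`) and the distance threshold are assumptions, not the authors' implementation:

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance: minimum number of single-character
    insertions, deletions, and substitutions turning a into b."""
    prev = list(range(len(b) + 1))          # distances for the empty prefix of a
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                  # deletion
                            curr[j - 1] + 1,              # insertion
                            prev[j - 1] + (ca != cb)))    # substitution or match
        prev = curr
    return prev[-1]


def near_miss(dirty: str, dictionary: list, max_dist: int = 2):
    """Return the closest dictionary entry within max_dist edits, else None."""
    best = min(dictionary, key=lambda w: edit_distance(dirty, w))
    return best if edit_distance(dirty, best) <= max_dist else None


print(near_miss("Jonh", ["John", "Mary", "Peter"]))
```

A dirty field value is matched against a reference dictionary and replaced only when a candidate is close enough, which is the essence of a near-miss cleaning pass.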
Cloud-based adaptive exon prediction for DNA analysis
Putluri, Srinivasareddy; Fathima, Shaik Yasmeen
2018-01-01
Cloud computing offers significant research and economic benefits to healthcare organisations. Cloud services provide a safe place for storing and managing large amounts of such sensitive data. Under the conventional flow of gene information, gene sequence laboratories send out raw and inferred information via the Internet to several sequence libraries. DNA sequencing storage costs will be minimised by the use of a cloud service. In this study, the authors put forward a novel genomic informatics system using Amazon Cloud Services, where genomic sequence information is stored and accessed for processing. True identification of exon regions in a DNA sequence is a key task in bioinformatics, which helps in disease identification and drug design. The three-base periodicity property of exons forms the basis of all exon identification techniques. Adaptive signal processing techniques were found to be promising in comparison with several other methods. Several adaptive exon predictors (AEPs) are developed using the variable normalised least mean square algorithm and its maximum normalised variants to reduce computational complexity. Finally, performance evaluation of the various AEPs is done based on measures such as sensitivity, specificity and precision, using various standard genomic datasets taken from the National Center for Biotechnology Information genomic sequence database. PMID:29515813
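As a rough illustration of the adaptive-prediction idea, the sketch below implements a plain normalised LMS (NLMS) predictor with NumPy. It is not the authors' variable- or maximum-normalised variants, and the filter order and step size are arbitrary choices:

```python
import numpy as np

def nlms_predict(x, order=4, mu=0.5, eps=1e-8):
    """Predict each sample of x from its `order` predecessors, updating
    the filter weights with the normalised LMS rule after every sample."""
    w = np.zeros(order)
    errors = np.empty(len(x) - order)
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]             # most recent sample first
        y = w @ u                            # linear prediction
        e = x[n] - y                         # prediction error
        w += mu * e * u / (u @ u + eps)      # NLMS weight update
        errors[n - order] = e
    return w, errors

# On a predictable signal the prediction error shrinks as the filter adapts.
x = np.sin(0.1 * np.arange(2000))
_, err = nlms_predict(x)
print(np.mean(np.abs(err[:100])), np.mean(np.abs(err[-100:])))
```

In exon prediction the filter would run over a numerically mapped DNA sequence, where the three-base periodicity shows up in the adapted predictor; that mapping is beyond this sketch.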
Saporito, Salvatore; Van Riper, David; Wakchaure, Ashwini
2017-01-01
The School Attendance Boundary Information System is a social science data infrastructure project that assembles, processes, and distributes spatial data delineating K through 12th grade school attendance boundaries for thousands of school districts in the U.S. Although geography is a fundamental organizing feature of K to 12 education, until now school attendance boundary data have not been made readily available on a massive basis and in an easy-to-use format. The School Attendance Boundary Information System removes these barriers by linking spatial data delineating school attendance boundaries with tabular data describing the demographic characteristics of populations living within those boundaries. This paper explains why a comprehensive GIS database of K through 12 school attendance boundaries is valuable, how original spatial information delineating school attendance boundaries is collected from local agencies, and techniques for modeling and storing the data so they provide maximum flexibility to the user community. An important goal of this paper is to share the techniques used to assemble the SABINS database so that local and state agencies can apply a standard set of procedures and models as they gather data for their regions. PMID:29151773
Natural Language Processing in Radiology: A Systematic Review.
Pons, Ewoud; Braun, Loes M M; Hunink, M G Myriam; Kors, Jan A
2016-05-01
Radiological reporting has generated large quantities of digital content within the electronic health record, which is potentially a valuable source of information for improving clinical care and supporting research. Although radiology reports are stored for communication and documentation of diagnostic imaging, harnessing their potential requires efficient and automated information extraction: they exist mainly as free-text clinical narrative, from which it is a major challenge to obtain structured data. Natural language processing (NLP) provides techniques that aid the conversion of text into a structured representation, and thus enables computers to derive meaning from human (ie, natural language) input. Used on radiology reports, NLP techniques enable automatic identification and extraction of information. By exploring the various purposes for their use, this review examines how radiology benefits from NLP. A systematic literature search identified 67 relevant publications describing NLP methods that support practical applications in radiology. This review takes a close look at the individual studies in terms of tasks (ie, the extracted information), the NLP methodology and tools used, and their application purpose and performance results. Additionally, limitations, future challenges, and requirements for advancing NLP in radiology will be discussed. © RSNA, 2016. Online supplemental material is available for this article.
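A minimal rule-based sketch of the kind of structured extraction NLP performs on free-text reports; the sample report, patterns, and negation cues here are hypothetical toys, far simpler than the systems the review covers:

```python
import re

def extract_findings(report: str):
    """Pull size measurements and negated sentences out of free text."""
    # measurements: a number followed by a length unit
    measurements = re.findall(r"(\d+(?:\.\d+)?)\s*(mm|cm)\b", report)
    # crude negation: any sentence containing a negation cue
    negated = [s.strip() for s in report.split(".")
               if s.strip() and re.search(r"\b(no|without|absence of)\b",
                                          s, re.IGNORECASE)]
    return measurements, negated

report = "No evidence of pneumothorax. A 12 mm nodule in the right upper lobe."
print(extract_findings(report))
```

Real systems handle sentence splitting, context, and terminology far more carefully; the point is only that structured fields can be derived from narrative text.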
Saporito, Salvatore; Van Riper, David; Wakchaure, Ashwini
2013-01-01
The School Attendance Boundary Information System is a social science data infrastructure project that assembles, processes, and distributes spatial data delineating K through 12th grade school attendance boundaries for thousands of school districts in the U.S. Although geography is a fundamental organizing feature of K to 12 education, until now school attendance boundary data have not been made readily available on a massive basis and in an easy-to-use format. The School Attendance Boundary Information System removes these barriers by linking spatial data delineating school attendance boundaries with tabular data describing the demographic characteristics of populations living within those boundaries. This paper explains why a comprehensive GIS database of K through 12 school attendance boundaries is valuable, how original spatial information delineating school attendance boundaries is collected from local agencies, and techniques for modeling and storing the data so they provide maximum flexibility to the user community. An important goal of this paper is to share the techniques used to assemble the SABINS database so that local and state agencies can apply a standard set of procedures and models as they gather data for their regions.
Study on key techniques for camera-based hydrological record image digitization
NASA Astrophysics Data System (ADS)
Li, Shijin; Zhan, Di; Hu, Jinlong; Gao, Xiangtao; Bo, Ping
2015-10-01
With the development of information technology, the digitization of scientific and engineering drawings has received more and more attention. In hydrology, meteorology, medicine and the mining industry, grid drawing sheets are commonly used to record observations from sensors. However, these paper drawings may be destroyed or contaminated by improper preservation or overuse. Further, manually transcribing these data into the computer is a heavy workload and prone to error. Hence, digitizing these drawings and establishing a corresponding database will ensure the integrity of the data and provide invaluable information for further research. This paper presents an automatic system for hydrological record image digitization, which consists of three key techniques, i.e., image segmentation, intersection point localization and distortion rectification. First, a novel approach to the binarization of the curves and grids in the water level sheet image is proposed, based on the adaptive fusion of gradient and color information. Second, a fast search strategy for cross point location is introduced, so that point-by-point processing is avoided with the help of grid distribution information. Finally, we put forward a local rectification method that analyzes the central portions of the image and utilizes domain knowledge of hydrology. The processing speed is accelerated, while the accuracy remains satisfactory. Experiments on several real water level records show that the proposed techniques are effective and capable of recovering the hydrological observations accurately.
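The gradient-plus-colour fusion idea can be illustrated with a toy pixel score; the equal weights and mean threshold below are assumptions for illustration, not the paper's adaptive fusion:

```python
import numpy as np

def fuse_binarize(img, ref_color, w_grad=0.5, w_color=0.5):
    """Illustrative fusion sketch: score each pixel by gradient magnitude
    plus closeness to a reference curve colour, then threshold at the
    mean score to separate curve/grid pixels from background."""
    img = img.astype(float)
    # gradient magnitude of the intensity image, normalised to 0..1
    gy, gx = np.gradient(img.mean(axis=2))
    grad = np.hypot(gx, gy)
    grad /= grad.max() + 1e-8
    # colour similarity: inverse normalised distance to the reference colour
    dist = np.linalg.norm(img - np.asarray(ref_color, float), axis=2)
    sim = 1.0 - dist / (dist.max() + 1e-8)
    score = w_grad * grad + w_color * sim
    return score > score.mean()

# toy image: a dark red stroke on a white sheet
img = np.full((8, 8, 3), 255.0)
img[4, :] = [200.0, 0.0, 0.0]
print(fuse_binarize(img, [200, 0, 0])[4])
```

The stroke row scores high on both cues and is kept; the blank background falls below the threshold.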
NASA Astrophysics Data System (ADS)
Rogatkin, Dmitrii A.; Tchernyi, Vladimir V.
2003-07-01
Optical noninvasive diagnostic systems are now widely applied and investigated in different areas of medicine. One such technique is noninvasive spectrophotometry, a complex diagnostic technique comprising elastic scattering spectroscopy, absorption spectroscopy, fluorescence diagnostics, photoplethysmography, etc. Today, many real optical diagnostic systems report only technical parameters and physical data as the result of the diagnostic procedure. But it is clear that medical staff need the information in a more convenient, medical form. This presentation outlines a general way to develop diagnostic-system software that can carry the diagnostic data through full processing from a physical to a medical level. It is shown that this processing is a multilevel (3-level) procedure, and that the main diagnostic result for noninvasive spectrophotometry methods, the biochemical and morphological composition of the tested tissues, arises at the second level of calculations.
Exploring patterns of epigenetic information with data mining techniques.
Aguiar-Pulido, Vanessa; Seoane, José A; Gestal, Marcos; Dorado, Julián
2013-01-01
Data mining, a part of the Knowledge Discovery in Databases (KDD) process, is the process of extracting patterns from large data sets by combining methods from statistics and artificial intelligence with database management. Analyses of epigenetic data have evolved towards genome-wide and high-throughput approaches, thus generating great amounts of data for which data mining is essential. Part of these data may contain patterns of epigenetic information which are mitotically and/or meiotically heritable, determining gene expression and cellular differentiation, as well as cellular fate. Epigenetic lesions and genetic mutations are acquired by individuals during their life and accumulate with ageing. Both defects, either together or individually, can result in losing control over cell growth and, thus, cause cancer development. Data mining techniques could then be used to extract such patterns. This work reviews some of the most important applications of data mining to epigenetics.
Drowning in Data: Going Beyond Traditional Data Archival to Educate Data Users
NASA Astrophysics Data System (ADS)
Weigel, A. M.; Smith, T.; Smith, D. K.; Bugbee, K.; Sinclair, L.
2017-12-01
Increasing quantities of Earth science data and information can prove overwhelming to new and unfamiliar users. The data discovery and use challenges faced by these users are compounded with atmospheric science field campaign data collected by a variety of instruments and stored, visualized, processed and analyzed in different ways. To address data and user needs assessed through annual surveys and user questions, the NASA Global Hydrology Resource Center Distributed Active Archive Center (GHRC DAAC), in collaboration with a graphic designer, has developed a series of resources to help users learn about GHRC science focus areas, field campaigns, instruments, data, and data processing techniques. In this talk, GHRC data recipes, micro articles, interactive data visualization techniques, and artistic science outreach and education efforts, such as ESRI story maps and research as art, will be overviewed. The objective of this talk is to stress the importance that artistic information visualization has in communicating with and educating Earth science data users.
Novelli, M D; Barreto, E; Matos, D; Saad, S S; Borra, R C
1997-01-01
The authors present experimental results of the computerized quantification of tissue structures involved in the reparative process of colonic anastomoses performed by manual suture and by biofragmentable ring. The quantified variables in this study were: oedema fluid, myofiber tissue, blood vessels and cellular nuclei. Image processing software developed at the Laboratório de Informática Dedicado à Odontologia (LIDO) was utilized to quantify the pathognomonic alterations of the inflammatory process in colonic anastomoses performed in 14 dogs. The results were compared with those obtained through traditional diagnosis by two pathologists, as a counterproof measure. The criteria for these diagnoses were defined in levels (absent, light, moderate and intense), which were compared with the analysis performed by the computer. There was a statistically significant difference between the two techniques: the biofragmentable ring technique exhibited less oedema fluid, more organized myofiber tissue and a higher number of elongated cellular nuclei in relation to the manual suture technique. The analysis of histometric variables through computational image processing was considered efficient and powerful for quantifying the main tissue inflammatory and reparative changes.
Informed use of patients' records on trusted health care services.
Sahama, Tony; Miller, Evonne
2011-01-01
Health care is an information-intensive business. Sharing information in health care processes is a smart use of data, enabling informed decision-making whilst ensuring the privacy and security of patient information. To achieve this, we propose an Information Accountability Framework (IAF) with embedded data encryption techniques that establishes the transition of the technological concept, thus enabling understanding of shared responsibility, accessibility, and efficient, cost-effective informed decisions between health care professionals and patients. The IAF results reveal possibilities for efficient informed medical decision-making and the minimisation of medical errors. Achieving this will require significant cultural changes and research synergies to ensure the sustainability, acceptability and durability of the IAF.
ERIC Educational Resources Information Center
Burrows, Tracy; Findlay, Naomi; Killen, Chloe; Dempsey, Shane E.; Hunter, Sharyn; Chiarelli, Pauline; Snodgrass, Suzanne
2011-01-01
This paper describes the development of a peer review of teaching model for the Faculty of Health at the University of Newcastle, Australia. The process involved using the nominal group technique to engage Faculty academic staff to consider seven key decision points that informed the development of the peer review of teaching model. Use of the…
ERIC Educational Resources Information Center
Viegas, Aldino; Manso, Joao; Nobrega, Franklin L.; Cabrita, Eurico J.
2011-01-01
Saturation transfer difference (STD) NMR has emerged as one of the most popular ligand-based NMR techniques for the study of protein-ligand interactions. The success of this technique is a consequence of its robustness and the fact that it is focused on the signals of the ligand, without any need of processing NMR information about the receptor…
Performance-Based Logistics, Contractor Logistics Support, and Stryker
2007-06-15
automotive, armament, missile, communications, special devices, and ground equipment repair. The essential maintenance task for the FMC is to maintain...technologies and welding techniques into their production processes. Finally, GDLS's use of progressive management techniques and supply chain information...C4ISR, MEP) per the NMC criteria in the -10 manual, the contractor's system only focuses on the platform or automotive status. Thus a vehicle "up" for
Photoacoustic spectroscopy of condensed matter
NASA Technical Reports Server (NTRS)
Somoano, R. B.
1978-01-01
Photoacoustic spectroscopy is a new analytical tool that provides a simple nondestructive technique for obtaining information about the electronic absorption spectrum of samples such as powders, semisolids, gels, and liquids. It can also be applied to samples which cannot be examined by conventional optical methods. Numerous applications of this technique in the field of inorganic and organic semiconductors, biology, and catalysis have been described. Among the advantages of photoacoustic spectroscopy, the signal is almost insensitive to light scattering by the sample and information can be obtained about nonradiative deactivation processes. Signal saturation, which can modify the intensity of individual absorption bands in special cases, is a drawback of the method.
Femtosecond pulse laser-oriented recording on dental prostheses: a trial introduction.
Ichikawa, Tetsuo; Hayasaki, Yoshio; Fujita, Keiji; Nagao, Kan; Murata, Masayo; Kawano, Takanori; Chen, JianRong
2006-12-01
The purpose of this study was to evaluate the feasibility of using a femtosecond pulse laser processing technique to store information on a dental prosthesis. Commercially pure titanium plates were processed by a femtosecond pulse laser system. The processed surface structure was observed with a reflective illumination microscope, scanning electron microscope, and atomic force microscope. The processed area was an almost conical pit with a clear boundary. When the laser pulse energy was 2 µJ, the diameter and depth were approximately 10 µm and 0.2 µm, respectively; both increased with laser pulse energy. Further, the depth of the pit increased with laser pulse number without any thermal effect. This study showed that the femtosecond pulse processing system was capable of recording personal identification and optional additional information on a dental prosthesis.
NASA Astrophysics Data System (ADS)
Ehmann, Andreas F.; Downie, J. Stephen
2005-09-01
The objective of the International Music Information Retrieval Systems Evaluation Laboratory (IMIRSEL) project is the creation of a large, secure corpus of audio and symbolic music data accessible to the music information retrieval (MIR) community for the testing and evaluation of various MIR techniques. As part of the IMIRSEL project, a cross-platform Java-based visual programming environment called Music to Knowledge (M2K) is being developed for a variety of music information retrieval related tasks. The primary objective of M2K is to supply the MIR community with a toolset that provides the ability to rapidly prototype algorithms, as well as to foster the sharing of techniques within the MIR community through the use of a standardized set of tools. Due to the relatively large size of audio data and the computational costs associated with some digital signal processing and machine learning techniques, M2K is also designed to support distributed computing across computing clusters. In addition, facilities to allow the integration of non-Java (e.g., C/C++, MATLAB, etc.) algorithms and programs are provided within M2K. [Work supported by the Andrew W. Mellon Foundation and NSF Grants No. IIS-0340597 and No. IIS-0327371.]
Photogrammetry for Archaeology: Collecting Pieces Together
NASA Astrophysics Data System (ADS)
Chibunichev, A. G.; Knyaz, V. A.; Zhuravlev, D. V.; Kurkov, V. M.
2018-05-01
The complexity of retrieving and understanding archaeological data requires applying different techniques, tools and sensors for information gathering, processing and documenting. Archaeological research now has an interdisciplinary nature, involving technologies based on different physical principles for retrieving information about archaeological findings. An important part of archaeological data is visual and spatial information, which allows reconstructing the appearance of the findings and the relations between them. Photogrammetry has great potential for accurate acquisition of spatial and visual data at different scales and resolutions, allowing the creation of archaeological documents of a new type and quality. The aim of the presented study is to develop an approach for creating new forms of archaeological documents, along with a pipeline for producing them and collecting them in one holistic model describing an archaeological site. A set of techniques is developed for the acquisition and integration of spatial and visual data at different levels of detail. The application of the developed techniques is demonstrated by documenting the Bosporus archaeological expedition of the Russian State Historical Museum.
NASA Technical Reports Server (NTRS)
Collis, R. T. H.
1969-01-01
Lidar is an optical radar technique employing laser energy. Variations in signal intensity as a function of range provide information on atmospheric constituents, even when these are too tenuous to be normally visible. The theoretical and technical basis of the technique is described and typical values of the atmospheric optical parameters given. The significance of these parameters to atmospheric and meteorological problems is discussed. While the basic technique can provide valuable information about clouds and other material in the atmosphere, it is not possible to determine particle size and number concentrations precisely. There are also inherent difficulties in evaluating lidar observations. Nevertheless, lidar can provide much useful information as is shown by illustrations. These include lidar observations of: cirrus cloud, showing mountain wave motions; stratification in clear air due to the thermal profile near the ground; determinations of low cloud and visibility along an air-field approach path; and finally the motion and internal structure of clouds of tracer materials (insecticide spray and explosion-caused dust) which demonstrate the use of lidar for studying transport and diffusion processes.
Collaborative care: Using six thinking hats for decision making.
Cioffi, Jane Marie
2017-12-01
To apply the six thinking hats technique to decision making in collaborative care. In collaborative partnerships, effective communication needs to occur in meetings of patients, families, and health care professionals. The effectiveness of these meetings depends on the engagement of participants and the quality of the meeting process. The use of the six thinking hats technique to engage all participants in effective dialogue is proposed. Discussion paper. Electronic databases CINAHL, PubMed, and Science Direct were searched for the years 1990 to 2017. Using the six thinking hats technique in patient and family meetings, nurses can guide a process of dialogue that focuses decision making to build equal care partnerships inclusive of all participants. Nurses will need to develop the skills for using the six thinking hats technique and provide support to all participants during the meeting process. Collaborative decision making can be augmented by the six thinking hats technique to provide patients, families, and health professionals with opportunities to make informed decisions about care that consider key issues for all involved. Nurses, who are most often advocates for patients and their families, are in a unique position to lead this initiative in meetings as they network with all health professionals. © 2017 John Wiley & Sons Australia, Ltd.
Linking temporal medical records using non-protected health information data.
Bonomi, Luca; Jiang, Xiaoqian
2017-01-01
Modern medical research relies on multi-institutional collaborations which enhance the knowledge discovery and data reuse. While these collaborations allow researchers to perform analytics otherwise impossible on individual datasets, they often pose significant challenges in the data integration process. Due to the lack of a unique identifier, data integration solutions often have to rely on patient's protected health information (PHI). In many situations, such information cannot leave the institutions or must be strictly protected. Furthermore, the presence of noisy values for these attributes may result in poor overall utility. While much research has been done to address these challenges, most of the current solutions are designed for a static setting without considering the temporal information of the data (e.g. EHR). In this work, we propose a novel approach that uses non-PHI for linking patient longitudinal data. Specifically, our technique captures the diagnosis dependencies using patterns which are shown to provide important indications for linking patient records. Our solution can be used as a standalone technique to perform temporal record linkage using non-protected health information data or it can be combined with Privacy Preserving Record Linkage solutions (PPRL) when protected health information is available. In this case, our approach can solve ambiguities in results. Experimental evaluations on real datasets demonstrate the effectiveness of our technique.
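A toy version of pattern-based temporal linkage: each record is an ordered sequence of diagnosis codes, and records are linked when their sequences are sufficiently similar. The greedy matching and 0.7 threshold are illustrative assumptions, not the paper's model:

```python
from difflib import SequenceMatcher

def sequence_similarity(a, b):
    """Similarity (0..1) of two ordered diagnosis-code sequences."""
    return SequenceMatcher(None, a, b).ratio()

def link_records(records_a, records_b, threshold=0.7):
    """Greedily pair each record in A with its most similar record in B,
    keeping only pairs whose similarity clears the threshold."""
    links = []
    for i, seq_a in enumerate(records_a):
        j, score = max(((j, sequence_similarity(seq_a, seq_b))
                        for j, seq_b in enumerate(records_b)),
                       key=lambda t: t[1])
        if score >= threshold:
            links.append((i, j, score))
    return links

# hypothetical ICD-style code sequences from two institutions
site_a = [["E11", "I10", "N18"], ["J45"]]
site_b = [["J45", "J30"], ["E11", "I10", "N18", "Z79"]]
print(link_records(site_a, site_b))
```

No protected health information is consulted; only the temporal pattern of diagnoses drives the match, which is the point of the approach.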
Information Management for a Large Multidisciplinary Project
NASA Technical Reports Server (NTRS)
Jones, Kennie H.; Randall, Donald P.; Cronin, Catherine K.
1992-01-01
In 1989, NASA's Langley Research Center (LaRC) initiated the High-Speed Airframe Integration Research (HiSAIR) Program to develop and demonstrate an integrated environment for high-speed aircraft design using advanced multidisciplinary analysis and optimization procedures. The major goals of this program were to evolve the interactions among disciplines and promote sharing of information, to provide a timely exchange of information among aeronautical disciplines, and to increase the awareness of the effects each discipline has upon other disciplines. LaRC historically has emphasized the advancement of analysis techniques. HiSAIR was founded to synthesize these advanced methods into a multidisciplinary design process emphasizing information feedback among disciplines and optimization. Crucial to the development of such an environment are the definition of the required data exchanges and the methodology for both recording the information and providing the exchanges in a timely manner. These requirements demand extensive use of data management techniques, graphic visualization, and interactive computing. HiSAIR represents the first attempt at LaRC to promote interdisciplinary information exchange on a large scale using advanced data management methodologies combined with state-of-the-art, scientific visualization techniques on graphics workstations in a distributed computing environment. The subject of this paper is the development of the data management system for HiSAIR.
Forming electrical interconnections through semiconductor wafers
NASA Technical Reports Server (NTRS)
Anthony, T. R.
1981-01-01
An information processing system based on CMOS/SOS technology is being developed by NASA to process digital image data collected by satellites. An array of holes is laser drilled in a semiconductor wafer, and a conductor is formed in the holes to fabricate electrical interconnections through the wafers. Six techniques are used to form conductors in the silicon-on-sapphire (SOS) wafers, including capillary wetting, wedge extrusion, wire intersection, electroless plating, electroforming, double-sided sputtering and through-hole electroplating. The respective strengths and weaknesses of these techniques are discussed and compared, with double-sided sputtering and the through-hole plating method achieving best results. In addition, hollow conductors provided by the technique are available for solder refill, providing a natural way of forming an electrically connected stack of SOS wafers.
An Algebra-Based Introductory Computational Neuroscience Course with Lab.
Fink, Christian G
2017-01-01
A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.
Digital techniques for processing Landsat imagery
NASA Technical Reports Server (NTRS)
Green, W. B.
1978-01-01
An overview is presented of the basic techniques used to process Landsat images with a digital computer, and of the VICAR image processing software developed at JPL and available to users through the NASA-sponsored COSMIC computer program distribution center. Examples are given of subjective processing performed to improve the information display for the human observer, such as contrast enhancement, pseudocolor display and band ratioing, and of quantitative processing using mathematical models, such as classification based on multispectral signatures of different areas within a given scene and geometric transformation of imagery into standard mapping projections. Examples are illustrated by Landsat scenes of the Andes mountains and the Altyn-Tagh fault zone in China before and after contrast enhancement, and by classification of land use in Portland, Oregon. The VICAR image processing software system is described; it consists of a language translator that simplifies execution of image processing programs and provides a general purpose format so that imagery from a variety of sources can be processed by the same basic set of general applications programs.
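In their simplest form, two of the subjective operations mentioned (contrast enhancement and band ratioing) reduce to a few lines of array arithmetic; this is a generic sketch, not the VICAR implementation:

```python
import numpy as np

def contrast_stretch(band, lo_pct=2, hi_pct=98):
    """Linear contrast stretch: map the lo..hi percentile range of a
    band onto 0..255, clipping the tails."""
    lo, hi = np.percentile(band, [lo_pct, hi_pct])
    out = (band.astype(float) - lo) / (hi - lo + 1e-8) * 255.0
    return np.clip(out, 0, 255).astype(np.uint8)

def band_ratio(band_a, band_b):
    """Band ratioing: per-pixel ratio of two spectral bands, commonly
    used to suppress illumination differences between scenes."""
    return band_a.astype(float) / (band_b.astype(float) + 1e-8)

# a low-contrast synthetic band spread across the full display range
band = np.linspace(100, 120, 100).reshape(10, 10)
print(contrast_stretch(band).min(), contrast_stretch(band).max())
```

Classification and geometric transformation require real multispectral data and are not sketched here.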
Coal liquefaction process streams characterization and evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, G.; Davis, A.; Burke, F.P.
1991-12-01
This study demonstrated the use of the gold tube carbonization technique and reflectance microscopy analysis for the examination of process-derived materials from direct coal liquefaction. The carbonization technique, which was applied to coal liquefaction distillation resids, yields information on the amounts of gas plus distillate, pyridine-soluble resid, and pyridine-insoluble material formed when a coal liquid sample is heated to 450°C for one hour at 5000 psi in an inert atmosphere. The pyridine-insolubles then are examined by reflectance microscopy to determine the type, amount, and optical texture of isotropic and anisotropic carbon formed upon carbonization. Further development of these analytical methods as process development tools may be justified on the basis of these results.
NASA Technical Reports Server (NTRS)
Casas, Joseph C.; Saylor, Mary S.; Kindle, Earl C.
1987-01-01
The major emphasis is on the advancement of remote sensing technology. In particular, the gas filter correlation radiometer (GFCR) technique was applied to the measurement of trace gas species, such as carbon monoxide (CO), from airborne and Earth orbiting platforms. Through a series of low altitude aircraft flights, high altitude aircraft flights, and orbiting space platform flights, data were collected and analyzed, culminating in the first global map of carbon monoxide concentration in the middle troposphere and stratosphere. The four major areas of this remote sensing program, known as the Measurement of Air Pollution from Satellites (MAPS) experiment, are: (1) data acquisition, (2) data processing, analysis, and interpretation algorithms, (3) data display techniques, and (4) information processing.
High-resolution electron microscope
NASA Technical Reports Server (NTRS)
Nathan, R.
1977-01-01
Employing a scanning transmission electron microscope as an interferometer, the relative phases of diffraction maxima can be determined by analysis of dark-field images. A synthetic aperture technique and Fourier-transform computer processing of amplitude and phase information provide high-resolution images at approximately one angstrom.
Controlling user access to electronic resources without password
Smith, Fred Hewitt
2015-06-16
Described herein are devices and techniques for remotely controlling user access to a restricted computer resource. The process includes pre-determining an association of the restricted computer resource and computer-resource-proximal environmental information. Indicia of user-proximal environmental information are received from a user requesting access to the restricted computer resource. Received indicia of user-proximal environmental information are compared to associated computer-resource-proximal environmental information. User access to the restricted computer resource is selectively granted responsive to a favorable comparison in which the user-proximal environmental information is sufficiently similar to the computer-resource proximal environmental information. In at least some embodiments, the process further includes comparing user-supplied biometric measure and comparing it with a predetermined association of at least one biometric measure of an authorized user. Access to the restricted computer resource is granted in response to a favorable comparison.
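One way such a comparison could be realised, sketched as set overlap of observed environmental indicia (e.g. visible network identifiers); the Jaccard measure and the 0.6 threshold are assumptions for illustration, not the patented method:

```python
def jaccard(a: set, b: set) -> float:
    """Overlap of two indicia sets (e.g. visible Wi-Fi network names)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def grant_access(user_indicia: set, resource_indicia: set,
                 threshold: float = 0.6) -> bool:
    """Grant access only when the user's observed environment is
    sufficiently similar to the environment recorded for the resource."""
    return jaccard(user_indicia, resource_indicia) >= threshold

# a user physically near the resource sees mostly the same networks
print(grant_access({"net-a", "net-b", "net-c"},
                   {"net-a", "net-b", "net-c", "net-d"}))
```

A remote attacker cannot easily reproduce the resource-proximal indicia, which is what makes the comparison a useful access gate.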
Laser-Induced Fluorescence Helps Diagnose Plasma Processes
NASA Technical Reports Server (NTRS)
Beattie, J. R.; Mattosian, J. N.; Gaeta, C. J.; Turley, R. S.; Williams, J. D.; Williamson, W. S.
1994-01-01
Technique developed to provide in situ monitoring of rates of ion sputter erosion of accelerator electrodes in ion thrusters also used for ground-based applications to monitor, calibrate, and otherwise diagnose plasma processes in fabrication of electronic and optical devices. Involves use of laser-induced-fluorescence measurements, which provide information on rates of ion etching, inferred rates of sputter deposition, and concentrations of contaminants.
Pulse-echo probe of rock permeability near oil wells
NASA Technical Reports Server (NTRS)
Narasimhan, K. Y.; Parthasarathy, S. P.
1978-01-01
Processing method involves sequential insonifications of borehole wall at number of different frequencies. Return signals are normalized in amplitude, and root-mean-square (rms) value of each signal is determined. Values can be processed to yield information on size and number density of microfractures at various depths in rock matrix by using averaging methods developed for pulse-echo technique.
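The processing chain this abstract describes (amplitude normalization of each return, then an RMS measure per insonification frequency) can be sketched as follows. The signal values and frequencies below are hypothetical stand-ins, not data from the original work:

```python
import math

def rms(signal):
    """Root-mean-square value of a sampled return signal."""
    return math.sqrt(sum(s * s for s in signal) / len(signal))

def normalize(signal):
    """Scale a return signal so its peak absolute amplitude is 1."""
    peak = max(abs(s) for s in signal) or 1.0
    return [s / peak for s in signal]

# One simulated return per insonification frequency (values are invented).
returns = {
    20e3: [0.0, 0.4, -0.8, 0.3, -0.1],
    40e3: [0.1, 0.9, -0.5, 0.2, 0.0],
}

# Normalize in amplitude, then take the RMS of each normalized return;
# the RMS-versus-frequency profile is what gets averaged across pulses.
rms_by_freq = {f: rms(normalize(sig)) for f, sig in returns.items()}
```

Averaging such profiles over many pulses, as the abstract notes, is what lets microfracture size and number density be inferred despite noisy individual echoes.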
White-Light Optical Information Processing and Holography.
1984-06-22
Keywords: optical information processing, image deblurring, source encoding, signal sampling, coherence measurement, noise performance, pseudocolor encoding. Topics covered include broad spectral band color image deblurring, noise performance, and pseudocolor encoding with three primary colors; the technique is particularly suitable for linearly smeared color image deblurring.
ERIC Educational Resources Information Center
Rehmat, Abeera P.; Owens, Marissa C.
2016-01-01
This column presents ideas and techniques to enhance your science teaching. This month's issue shares information about a unit promoting scientific literacy and the engineering design process. The integration of engineering with scientific practices in K-12 education can promote creativity, hands-on learning, and an improvement in students'…
Modern Radar Techniques for Geophysical Applications: Two Examples
NASA Technical Reports Server (NTRS)
Arokiasamy, B. J.; Bianchi, C.; Sciacca, U.; Tutone, G.; Zirizzotti, A.; Zuccheretti, E.
2005-01-01
The last decade of the evolution of radar was heavily influenced by the rapid increase in information processing capabilities. Advances in solid-state HF radio devices, digital technology, computing architectures, and software allowed designers to develop very efficient radars. In designing modern radars the emphasis goes toward simplifying the system hardware and reducing overall power, which is compensated for by coding and real-time signal processing techniques. Radars are commonly employed in geophysical radio soundings such as ionospheric probing, stratosphere-mesosphere measurement, weather forecasting, ground-penetrating radar (GPR), and radio-glaciology. In the Laboratorio di Geofisica Ambientale of the Istituto Nazionale di Geofisica e Vulcanologia (INGV), Rome, Italy, we developed two pulse compression radars. The first is an HF radar called AIS-INGV, an Advanced Ionospheric Sounder designed both for research and for the routine service of HF radio wave propagation forecasting. The second is a VHF radar called GLACIORADAR, which will replace the high-power envelope radar used by the Italian glaciological group. It will be employed in studying the subglacial structures of Antarctica, giving information about layering, the bedrock, and subglacial lakes if present. These are low-power radars, which rely heavily on advanced hardware and powerful real-time signal processing. Additional information is included in the original extended abstract.
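The pulse compression both radars rely on amounts to correlating the received signal with the transmitted code, so a long coded pulse collapses into a sharp peak at the echo delay. A minimal sketch using the well-known Barker-13 code; the delay and the noise-free received signal are illustrative assumptions, not data from AIS-INGV or GLACIORADAR:

```python
def matched_filter(received, code):
    """Pulse compression: slide the transmitted code over the received
    signal and correlate; the correlation peak marks the echo delay
    with an SNR gain equal to the code length."""
    n, m = len(received), len(code)
    return [
        sum(received[i + j] * code[j] for j in range(m))
        for i in range(n - m + 1)
    ]

# Barker-13: the classic binary phase code with unit sidelobes.
BARKER13 = [1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1]

# A noise-free echo of the coded pulse buried at delay 5.
received = [0] * 5 + BARKER13 + [0] * 5
out = matched_filter(received, BARKER13)
delay = out.index(max(out))  # peak position recovers the echo delay
```

The peak value equals the code length (13) while sidelobes stay at magnitude 1, which is exactly the SNR improvement that lets a low-power coded radar match a high-power envelope radar.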
Effect of Temperature and Deformation Rate on the Tensile Mechanical Properties of Polyimide Films
NASA Technical Reports Server (NTRS)
Moghazy, Samir F.; McNair, Kevin C.
1996-01-01
In order to study the structure-property relationships of differently processed oriented polyimide films, the mechanical properties will be identified using an Instron 4505 tensile tester, and structural information, such as the three-dimensional birefringence molecular symmetry axis and three-dimensional refractive indices, will be determined using waveguide coupling techniques. The monoaxial drawing techniques utilized in this research are very useful for improving the tensile mechanical properties of aromatic polyimide films. To obtain high-modulus/high-strength polyimide films, the following two techniques have been employed: cold drawing, in which polyimide films are drawn at room temperature at different crosshead speeds, and hot drawing, in which polyimide films are drawn at different temperatures and crosshead speeds. In the hot drawing process the polyimide films are drawn at different temperatures up to the glass transition temperature (Tg) using an environmental chamber. All of the mechanical and optical property parameters will be identified for each sample processed by both the cold and hot drawing techniques.
NASA Astrophysics Data System (ADS)
Su, Zhongqing; Ye, Lin
2004-08-01
The practical utilization of elastic waves, e.g. Rayleigh-Lamb waves, in high-performance structural health monitoring techniques is somewhat impeded due to the complicated wave dispersion phenomena, the existence of multiple wave modes, the high susceptibility to diverse interferences, the bulky sampled data and the difficulty in signal interpretation. An intelligent signal processing and pattern recognition (ISPPR) approach using the wavelet transform and artificial neural network algorithms was developed; this was actualized in a signal processing package (SPP). The ISPPR technique comprehensively functions as signal filtration, data compression, characteristic extraction, information mapping and pattern recognition, capable of extracting essential yet concise features from acquired raw wave signals and further assisting in structural health evaluation. For validation, the SPP was applied to the prediction of crack growth in an alloy structural beam and construction of a damage parameter database for defect identification in CF/EP composite structures. It was clearly apparent that the elastic wave propagation-based damage assessment could be dramatically streamlined by introduction of the ISPPR technique.
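The data-compression stage of an ISPPR-style pipeline can be illustrated with the simplest wavelet, the Haar transform: repeatedly keeping only the approximation coefficients turns a raw wave signal into a short feature vector. This is a generic sketch, not the SPP's actual implementation; the signal values are invented:

```python
def haar_step(signal):
    """One level of the Haar wavelet transform on an even-length signal:
    returns (approximation, detail) coefficient lists of half length."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return approx, detail

def wavelet_features(signal, levels=2):
    """Compress a raw signal into a concise feature vector by keeping
    only the approximation coefficients at each level (details, which
    carry fine-scale noise, are discarded)."""
    for _ in range(levels):
        signal, _ = haar_step(signal)
    return signal

raw = [0.0, 0.2, 0.9, 1.1, 0.4, 0.2, -0.3, -0.5]  # toy acquired waveform
features = wavelet_features(raw)  # 8 samples compressed to 2 features
```

In a full ISPPR pipeline such features, rather than the bulky raw samples, would be fed to the neural network for pattern recognition.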
Authentication techniques for smart cards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, R.A.
1994-02-01
Smart card systems are most cost efficient when implemented as a distributed system, which is a system without central host interaction or a local database of card numbers for verifying transaction approval. A distributed system, as such, presents special card and user authentication problems. Fortunately, smart cards offer processing capabilities that provide solutions to authentication problems, provided the system is designed with proper data integrity measures. Smart card systems maintain data integrity through a security design that controls data sources and limits data changes. A good security design is usually a result of a system analysis that provides a thorough understanding of the application needs. Once designers understand the application, they may specify authentication techniques that mitigate the risk of system compromise or failure. Current authentication techniques include cryptography, passwords, challenge/response protocols, and biometrics. The security design includes these techniques to help prevent counterfeit cards, unauthorized use, or information compromise. This paper discusses card authentication and user identity techniques that enhance security for microprocessor card systems. It also describes the analysis process used for determining proper authentication techniques for a system.
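As a concrete illustration of one technique on the list, a symmetric-key challenge/response exchange can be sketched as below. This is a generic sketch, not a specification of any particular smart card system; the key material and function names are hypothetical:

```python
import hashlib
import hmac
import os

# Hypothetical symmetric key provisioned into the card at personalization.
SHARED_KEY = b"card-issuer-secret"

def card_respond(challenge: bytes, key: bytes = SHARED_KEY) -> bytes:
    """The card proves possession of the key by MACing the terminal's
    challenge; the key itself never leaves the card."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def terminal_verify(challenge: bytes, response: bytes,
                    key: bytes = SHARED_KEY) -> bool:
    """The terminal recomputes the expected MAC and compares it in
    constant time to resist timing attacks."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)          # a fresh nonce defeats replayed responses
response = card_respond(challenge)
ok = terminal_verify(challenge, response)
```

Because each challenge is a fresh random nonce, a recorded response is useless for a later transaction, which is one way a distributed system can reject counterfeit cards without consulting a central host.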
Quantum-classical boundary for precision optical phase estimation
NASA Astrophysics Data System (ADS)
Birchall, Patrick M.; O'Brien, Jeremy L.; Matthews, Jonathan C. F.; Cable, Hugo
2017-12-01
Understanding the fundamental limits on the precision to which an optical phase can be estimated is of key interest for many investigative techniques utilized across science and technology. We study the estimation of a fixed optical phase shift due to a sample which has an associated optical loss, and compare phase estimation strategies using classical and nonclassical probe states. These comparisons are based on the attainable (quantum) Fisher information calculated per number of photons absorbed or scattered by the sample throughout the sensing process. We find that for a given number of incident photons upon the unknown phase, nonclassical techniques in principle provide less than a 20% reduction in root-mean-square error (RMSE) in comparison with ideal classical techniques in multipass optical setups. Using classical techniques in a different optical setup that we analyze, which incorporates additional stages of interference during the sensing process, the achievable reduction in RMSE afforded by nonclassical techniques falls to only ≃4%. We explain how these conclusions change when nonclassical techniques are compared to classical probe states in nonideal multipass optical setups, with additional photon losses due to the measurement apparatus.
Image processing and recognition for biological images.
Uchida, Seiichi
2013-05-01
This paper reviews image processing and pattern recognition techniques, which will be useful for analyzing bioimages. Although this paper does not provide their technical details, it will be possible to grasp their main tasks and typical tools to handle the tasks. Image processing is a large research area aimed at improving the visibility of an input image and acquiring valuable information from it. As the main tasks of image processing, this paper introduces gray-level transformation, binarization, image filtering, image segmentation, visual object tracking, optical flow, and image registration. Image pattern recognition is the technique of classifying an input image into one of a set of predefined classes and also has a large research area. This paper overviews its two main modules: the feature extraction module and the classification module. Throughout the paper, it is emphasized that the bioimage is a very difficult target for even state-of-the-art image processing and pattern recognition techniques due to noise, deformation, etc. This paper is expected to be a tutorial guide that bridges biology and image processing researchers for further collaboration on such a difficult target.
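Of the tasks the review lists, binarization has the most compact classical solution: Otsu's method picks the threshold that maximizes the between-class variance of the gray-level histogram. A pure-Python sketch (the toy "image" is a flattened list of invented pixel values):

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's method: return the gray level t that maximizes the
    between-class variance w0*w1*(mu0 - mu1)^2 of the histogram,
    where class 0 is pixels <= t and class 1 is pixels > t."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0, sum0 = 0, 0.0
    for t in range(levels):
        w0 += hist[t]
        if w0 == 0:
            continue            # class 0 still empty
        if w0 == total:
            break               # class 1 empty; no split possible
        sum0 += t * hist[t]
        w1 = total - w0
        mu0 = sum0 / w0
        mu1 = (total_sum - sum0) / w1
        var = w0 * w1 * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# A toy bimodal "image": dark background pixels and bright foreground.
image = [10, 12, 11, 13, 10] * 10 + [200, 210, 205, 198] * 10
t = otsu_threshold(image)
binary = [1 if p > t else 0 for p in image]
```

On well-stained microscopy images this kind of global threshold often suffices; the noise and deformation the review warns about are what push real bioimage pipelines toward the more elaborate segmentation methods it surveys.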
Barker, Fiona; Mackenzie, Emma; de Lusignan, Simon
2016-11-01
To observe and analyse the range and nature of behaviour change techniques (BCTs) employed by audiologists during hearing-aid fitting consultations to encourage and enable hearing-aid use. Non-participant observation and qualitative thematic analysis using the behaviour change technique taxonomy (version 1) (BCTTv1). Ten consultations across five English NHS audiology departments. Audiologists engage in behaviours to ensure the hearing-aid is fitted to prescription and is comfortable to wear. They provide information, equipment, and training in how to use a hearing-aid including changing batteries, cleaning, and maintenance. There is scope for audiologists to use additional BCTs: collaborating with patients to develop a behavioural plan for hearing-aid use that includes goal-setting, action-planning and problem-solving; involving significant others; providing information on the benefits of hearing-aid use or the consequences of non-use and giving advice about using prompts/cues for hearing-aid use. This observational study of audiologist behaviour in hearing-aid fitting consultations has identified opportunities to use additional behaviour change techniques that might encourage hearing-aid use. This information defines potential intervention targets for further research with the aim of improving hearing-aid use amongst adults with acquired hearing loss.
Central localization of plasticity involved in appetitive conditioning in Lymnaea
Straub, Volko A.; Styles, Benjamin J.; Ireland, Julie S.; O'Shea, Michael; Benjamin, Paul R.
2004-01-01
Learning to associate a conditioned (CS) and unconditioned stimulus (US) results in changes in the processing of CS information. Here, we address directly the question whether chemical appetitive conditioning of Lymnaea feeding behavior involves changes in the peripheral and/or central processing of the CS by using extracellular recording techniques to monitor neuronal activity at two stages of the sensory processing pathway. Our data show that appetitive conditioning does not affect significantly the overall CS response of afferent nerves connecting chemosensory structures in the lips and tentacles to the central nervous system (CNS). In contrast, neuronal output from the cerebral ganglia, which represent the first central processing stage for chemosensory information, is enhanced significantly in response to the CS after appetitive conditioning. This demonstrates that chemical appetitive conditioning in Lymnaea affects the central, but not the peripheral processing of chemosensory information. It also identifies the cerebral ganglia of Lymnaea as an important site for neuronal plasticity and forms the basis for detailed cellular studies of neuronal plasticity. PMID:15537733
Analysis of a document/reporting system
NASA Technical Reports Server (NTRS)
Narrow, B.
1971-01-01
An in-depth analysis of the information system within the Data Processing Branch is presented. Quantitative measures are used to evaluate the efficiency and effectiveness of the information system. It is believed that this is the first documented study that utilizes quantitative measures for full-scale system analysis. The quantitative measures and techniques for collecting and qualifying the basic data, as described, are applicable to any information system. This report is therefore considered to be of interest to anyone concerned with the management, design, analysis, or evaluation of information systems.
Usability engineering: domain analysis activities for augmented-reality systems
NASA Astrophysics Data System (ADS)
Gabbard, Joseph; Swan, J. E., II; Hix, Deborah; Lanzagorta, Marco O.; Livingston, Mark; Brown, Dennis B.; Julier, Simon J.
2002-05-01
This paper discusses our usability engineering process for the Battlefield Augmented Reality System (BARS). Usability engineering is a structured, iterative, stepwise development process. Like the related disciplines of software and systems engineering, usability engineering is a combination of management principles and techniques, formal and semi-formal evaluation techniques, and computerized tools. BARS is an outdoor augmented reality system that displays heads-up battlefield intelligence information to a dismounted warrior. The paper discusses our general usability engineering process. We originally developed the process in the context of virtual reality applications, but in this work we are adapting the procedures to an augmented reality system. The focus of this paper is our work on domain analysis, the first activity of the usability engineering process. We describe our plans for and our progress to date on our domain analysis for BARS. We give results in terms of a specific urban battlefield use case we have designed.
Ethical considerations regarding the implementation of new technologies and techniques in surgery.
Strong, Vivian E; Forde, Kenneth A; MacFadyen, Bruce V; Mellinger, John D; Crookes, Peter F; Sillin, Lelan F; Shadduck, Phillip P
2014-08-01
Ethical considerations relevant to the implementation of new surgical technologies and techniques are explored and discussed in practical terms in this statement, including (1) How is the safety of a new technology or technique ensured?; (2) What are the timing and process by which a new technology or technique is implemented at a hospital?; (3) How are patients informed before undergoing a new technology or technique?; (4) How are surgeons trained and credentialed in a new technology or technique?; (5) How are the outcomes of a new technology or technique tracked and evaluated?; and (6) How are the responsibilities to individual patients and society at large balanced? The following discussion is presented with the intent to encourage thought and dialogue about ethical considerations relevant to the implementation of new technologies and new techniques in surgery.
ERIC Educational Resources Information Center
Rimoldi, Horacio J. A.; And Others
A technique using information and decision-making theories to evaluate problem-solving tactics is presented. In problem solving, the process of solution is evaluated by investigating the questions that the problem solver asks. The sequence of questions asked is called a tactic. It is assumed that: (1) tactics are the observable…
NASA Technical Reports Server (NTRS)
Souther, J. W.
1981-01-01
The need to teach informational writing as a decision-making process is discussed. Situational analysis, its relationship to decisions in writing, and the need for relevant assignments are considered. Teaching students to ask the right questions is covered. The need to teach writing responsiveness is described. Three steps to get started and four teaching techniques are described. The information needs of the 'expert' and the 'manager' are contrasted.
Thomas C. Edwards; Gretchen G. Moisen; Tracey S. Frescino; Joshua L. Lawler
2002-01-01
We describe our collective efforts to develop and apply methods for using FIA data to model forest resources and wildlife habitat. Our work demonstrates how flexible regression techniques, such as generalized additive models, can be linked with spatially explicit environmental information for the mapping of forest type and structure. We illustrate how these maps of...
Advanced Image Processing Techniques for Maximum Information Recovery
2006-11-01
…available information from an image. Some radio frequency and optical sensors collect large-scale sets of spatial imagery data whose content is often obscured by fog, clouds, and foliage.
NASA Astrophysics Data System (ADS)
Meyer, F. J.; Webley, P.; Dehn, J.; Arko, S. A.; McAlpin, D. B.
2013-12-01
Volcanic eruptions are among the most significant hazards to human society, capable of triggering natural disasters on regional to global scales. In the last decade, remote sensing techniques have become established in operational forecasting, monitoring, and managing of volcanic hazards. Monitoring organizations, like the Alaska Volcano Observatory (AVO), are nowadays heavily relying on remote sensing data from a variety of optical and thermal sensors to provide time-critical hazard information. Despite the high utilization of these remote sensing data to detect and monitor volcanic eruptions, the presence of clouds and a dependence on solar illumination often limit their impact on decision making processes. Synthetic Aperture Radar (SAR) systems are widely believed to be superior to optical sensors in operational monitoring situations, due to the weather and illumination independence of their observations and the sensitivity of SAR to surface changes and deformation. Despite these benefits, the contributions of SAR to operational volcano monitoring have been limited in the past due to (1) high SAR data costs, (2) traditionally long data processing times, and (3) the low temporal sampling frequencies inherent to most SAR systems. In this study, we present improved data access, data processing, and data integration techniques that mitigate some of the above mentioned limitations and allow, for the first time, a meaningful integration of SAR into operational volcano monitoring systems. We will introduce a new database interface that was developed in cooperation with the Alaska Satellite Facility (ASF) and allows for rapid and seamless data access to all of ASF's SAR data holdings. We will also present processing techniques that improve the temporal frequency with which hazard-related products can be produced. 
These techniques take advantage of modern signal processing technology as well as new radiometric normalization schemes, both enabling the combination of multiple observation geometries in change detection procedures. Additionally, it will be shown how SAR-based hazard information can be integrated with data from optical satellites, thermal sensors, webcams, and models to create near-real-time volcano hazard information. We will introduce a prototype monitoring system that integrates SAR-based hazard information into the near-real-time volcano hazard monitoring system of the Alaska Volcano Observatory. This prototype system was applied to historic eruptions of the volcanoes Okmok and Augustine, both located in the North Pacific. We will show that for these historic eruptions, the addition of SAR data led to a significant improvement in activity detection and eruption monitoring, and improved the accuracy and timeliness of eruption alerts.
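The change-detection step at the core of such SAR monitoring is commonly implemented with the log-ratio operator: speckle is multiplicative, so it becomes additive in the log domain and a simple threshold on the log-ratio flags changed pixels. A minimal sketch; the backscatter values and threshold are illustrative assumptions, not AVO's operational parameters:

```python
import math

def log_ratio_change(before, after, threshold=0.5):
    """SAR change detection via the log-ratio operator: a pixel is
    flagged as changed when |log(after/before)| exceeds the threshold,
    which is robust to multiplicative speckle noise."""
    return [
        abs(math.log(a / b)) > threshold
        for b, a in zip(before, after)
    ]

before = [1.0, 1.1, 0.9, 1.0]   # backscatter intensities before the eruption
after  = [1.0, 1.0, 3.0, 0.3]   # fresh deposits raise or lower backscatter
changed = log_ratio_change(before, after)
```

Because the ratio is symmetric in the log domain, both brightening (new rough lava) and darkening (smooth ash mantling) exceed the same threshold, which suits eruption monitoring where either signature may appear.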
Image Alignment for Multiple Camera High Dynamic Range Microscopy.
Eastwood, Brian S; Childs, Elisabeth C
2012-01-09
This paper investigates the problem of image alignment for multiple camera high dynamic range (HDR) imaging. HDR imaging combines information from images taken with different exposure settings. Combining information from multiple cameras requires an alignment process that is robust to the intensity differences in the images. HDR applications that use a limited number of component images require an alignment technique that is robust to large exposure differences. We evaluate the suitability for HDR alignment of three exposure-robust techniques. We conclude that image alignment based on matching feature descriptors extracted from radiant power images from calibrated cameras yields the most accurate and robust solution. We demonstrate the use of this alignment technique in a high dynamic range video microscope that enables live specimen imaging with a greater level of detail than can be captured with a single camera.
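The "radiant power images from calibrated cameras" step can be sketched simply: inverting the camera response and dividing by exposure time puts differently exposed images on a common scale, so unsaturated pixels from the two cameras agree and feature matching becomes meaningful. The sketch below assumes a linear response for simplicity; the pixel values and exposure times are invented:

```python
def to_radiance(pixels, exposure_s, response=lambda v: v / 255.0):
    """Map raw 8-bit pixel values to relative radiant power: invert the
    (here assumed linear) camera response, then divide by exposure time,
    making images taken at different exposures directly comparable."""
    return [response(p) / exposure_s for p in pixels]

# The same scene row captured at two different exposure settings.
short_exp = [10, 51, 128, 255]    # 1/100 s exposure
long_exp  = [40, 204, 255, 255]   # 1/25 s exposure (bright pixels saturate)

r_short = to_radiance(short_exp, 1 / 100)
r_long = to_radiance(long_exp, 1 / 25)
```

After this normalization, unsaturated pixels report the same radiance regardless of exposure, while saturated pixels disagree and should be excluded; extracting feature descriptors from these normalized images is what makes the alignment robust to the large exposure differences the paper discusses.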
Temporal trade-offs in psychophysics.
Barack, David L; Gold, Joshua I
2016-04-01
Psychophysical techniques typically assume straightforward relationships between manipulations of real-world events, their effects on the brain, and behavioral reports of those effects. However, these relationships can be influenced by many complex, strategic factors that contribute to task performance. Here we discuss several of these factors that share two key features. First, they involve subjects making flexible use of time to process information. Second, this flexibility can reflect the rational regulation of information-processing trade-offs that can play prominent roles in particular temporal epochs: sensitivity to stability versus change for past information, speed versus accuracy for current information, and exploitation versus exploration for future goals. Understanding how subjects manage these trade-offs can be used to help design and interpret psychophysical studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
Combining Static Analysis and Model Checking for Software Analysis
NASA Technical Reports Server (NTRS)
Brat, Guillaume; Visser, Willem; Clancy, Daniel (Technical Monitor)
2003-01-01
We present an iterative technique in which model checking and static analysis are combined to verify large software systems. The role of the static analysis is to compute partial order information which the model checker uses to reduce the state space. During exploration, the model checker also computes aliasing information that it gives to the static analyzer which can then refine its analysis. The result of this refined analysis is then fed back to the model checker which updates its partial order reduction. At each step of this iterative process, the static analysis computes optimistic information which results in an unsafe reduction of the state space. However we show that the process converges to a fixed point at which time the partial order information is safe and the whole state space is explored.
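The iteration the abstract describes is a fixed-point computation over exchanged analysis facts. A schematic sketch, with the two analyses collapsed into one hypothetical `refine` function over a fact set (the fact names are invented placeholders, not the paper's actual abstractions):

```python
def iterate_to_fixed_point(initial_facts, refine, max_rounds=50):
    """Alternate analyses until the exchanged information stops
    changing (a fixed point). `refine` maps the current fact set to a
    new one; facts are modeled here as a frozenset of strings."""
    facts = initial_facts
    for _ in range(max_rounds):
        new_facts = refine(facts)
        if new_facts == facts:      # fixed point: information is stable
            return facts
        facts = new_facts
    raise RuntimeError("no fixed point within round limit")

# Toy stand-in for the exchange: each round derives one more level of
# facts from the current set, until derivation saturates.
def refine(facts):
    derived = {f + "'" for f in facts if not f.endswith("''")}
    return frozenset(facts | derived)

result = iterate_to_fixed_point(frozenset({"a"}), refine)
```

The termination argument mirrors the paper's: as long as each round only adds facts from a finite universe, the sequence is monotone and must stabilize, at which point the partial order information is safe to use.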
Optical information processing at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Reid, Max B.; Bualat, Maria G.; Cho, Young C.; Downie, John D.; Gary, Charles K.; Ma, Paul W.; Ozcan, Meric; Pryor, Anna H.; Spirkovska, Lilly
1993-01-01
The combination of analog optical processors with digital electronic systems offers the potential of tera-OPS computational performance, while often requiring less power and weight relative to all-digital systems. NASA is working to develop and demonstrate optical processing techniques for on-board, real time science and mission applications. Current research areas and applications under investigation include optical matrix processing for space structure vibration control and the analysis of Space Shuttle Main Engine plume spectra, optical correlation-based autonomous vision for robotic vehicles, analog computation for robotic path planning, free-space optical interconnections for information transfer within digital electronic computers, and multiplexed arrays of fiber optic interferometric sensors for acoustic and vibration measurements.
NASA Technical Reports Server (NTRS)
Effinger, Michael; Beshears, Ron; Hufnagle, David; Walker, James; Russell, Sam; Stowell, Bob; Myers, David
2002-01-01
Nondestructive characterization techniques have been used to steer development and testing of CMCs. Computed tomography is used to determine the volumetric integrity of the CMC plates and components. Thermography is used to determine the near-surface integrity of the CMC plates and components. For process and material development, information such as density uniformity, part delamination, and dimensional tolerance conformity is generated. The information from the thermography and computed tomography is correlated, and specimen cutting maps are then superimposed on the thermography images. This enables tighter data correlation and potential explanation of off-nominal test data. Examples of using nondestructive characterization to make decisions in process and material development and testing are presented.
How the blind "see" Braille: lessons from functional magnetic resonance imaging.
Sadato, Norihiro
2005-12-01
What does the visual cortex of the blind do during Braille reading? This process involves converting simple tactile information into meaningful patterns that have lexical and semantic properties. The perceptual processing of Braille might be mediated by the somatosensory system, whereas visual letter identity is accomplished within the visual system in sighted people. Recent advances in functional neuroimaging techniques, such as functional magnetic resonance imaging, have enabled exploration of the neural substrates of Braille reading. The primary visual cortex of early-onset blind subjects is functionally relevant to Braille reading, suggesting that the brain shows remarkable plasticity that potentially permits the additional processing of tactile information in the visual cortical areas.
Multisource information fusion applied to ship identification for the recognized maritime picture
NASA Astrophysics Data System (ADS)
Simard, Marc-Alain; Lefebvre, Eric; Helleur, Christopher
2000-04-01
The Recognized Maritime Picture (RMP) is defined as a composite picture of activity over a maritime area of interest. In simple terms, building an RMP comes down to finding whether an object of interest, a ship in our case, is there or not, determining what it is, determining what it is doing, and determining whether some type of follow-on action is required. The Canadian Department of National Defence currently has access to, or may in the near future have access to, a number of civilian, military, and allied information or sensor systems for these purposes. These systems include automatic self-reporting positional systems, air patrol surveillance systems, high frequency surface radars, electronic intelligence systems, radar space systems, and high frequency direction finding sensors. The ability to make full use of these systems is limited by the existing capability to fuse data from all sources in a timely, accurate, and complete manner. This paper presents an information fusion system under development that correlates and fuses these information and sensor data sources. This fusion system, named the Adaptive Fuzzy Logic Correlator, correlates the information in batch but fuses and constructs ship tracks sequentially. It applies standard Kalman filter techniques and fuzzy logic correlation techniques. We propose a set of recommendations that should improve the ship identification process. In particular, it is proposed to utilize as many non-redundant sources of information as possible that address specific vessel attributes. Another important recommendation states that the information fusion and data association techniques should be capable of dealing with incomplete and imprecise information. Some fuzzy logic techniques capable of tolerating imprecise and dissimilar data are proposed.
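The standard Kalman filter techniques mentioned can be illustrated in one dimension: each position report nudges the track estimate by an amount set by the Kalman gain, and the track's variance shrinks as reports accumulate. The process/measurement variances and position reports below are invented for illustration, not parameters from the fusion system described:

```python
def kalman_predict(x, p, q):
    """Constant-position prediction step: the estimate is carried
    forward unchanged while its variance grows by process noise q."""
    return x, p + q

def kalman_update(x, p, z, r):
    """Scalar measurement update: fuse state estimate x (variance p)
    with measurement z (variance r) via the Kalman gain."""
    k = p / (p + r)               # gain: how much to trust the report
    x_new = x + k * (z - x)       # blend prediction and measurement
    p_new = (1 - k) * p           # fused estimate is more certain
    return x_new, p_new

# Fuse a ship's noisy position reports (e.g. radar fixes) into one track.
x, p = 0.0, 1e6                   # vague prior before any report
for z in [10.2, 9.8, 10.1, 10.0]:  # reports with measurement variance 1.0
    x, p = kalman_predict(x, p, q=0.01)
    x, p = kalman_update(x, p, z, r=1.0)
```

In a multi-source RMP system, the fuzzy logic correlator decides which reports belong to which track; a filter like this then does the sequential fusion within each track.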
An introduction to chaotic and random time series analysis
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey D.
1989-01-01
The origin of chaotic behavior and the relation of chaos to randomness are explained. Two mathematical results are described: (1) a representation theorem guarantees the existence of a specific time-domain model for chaos and addresses the relation between chaotic, random, and strictly deterministic processes; (2) a theorem assures that information on the behavior of a physical system in its complete state space can be extracted from time-series data on a single observable. Focus is placed on an important connection between the dynamical state space and an observable time series. These two results lead to a practical deconvolution technique combining standard random process modeling methods with new embedding techniques.
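The second result (state-space information recoverable from a single observable) is typically exploited via time-delay embedding. A minimal sketch using a logistic-map observable as the toy dynamical system; the dimension and lag are illustrative choices, not values from the article:

```python
def delay_embed(series, dim=3, lag=1):
    """Time-delay embedding: map a scalar time series into vectors
    (x[t], x[t-lag], ..., x[t-(dim-1)*lag]), which, for suitable dim
    and lag, reconstruct the dynamics' state space (Takens-style)."""
    start = (dim - 1) * lag
    return [
        tuple(series[t - k * lag] for k in range(dim))
        for t in range(start, len(series))
    ]

# Observable from a deterministic toy system: the logistic map in its
# chaotic regime, x -> 3.9 * x * (1 - x).
x, series = 0.4, []
for _ in range(20):
    series.append(x)
    x = 3.9 * x * (1 - x)

vectors = delay_embed(series, dim=3, lag=1)
```

Trajectories of these embedded vectors fill out a reconstructed attractor, so geometric and dynamical properties of the full state space can be estimated even though only one scalar quantity was ever measured.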
The Commercial Challenges Of Pacs
NASA Astrophysics Data System (ADS)
Vanden Brink, John A.
1984-08-01
The increasing use of digital imaging techniques creates a need for improved methods of digital processing, communication, and archiving. However, the commercial opportunity is dependent on the resolution of a number of issues. These issues include proof that digital processes are more cost effective than present techniques, implementation of information system support in the imaging activity, implementation of industry standards, conversion of analog images to digital formats, definition of clinical needs, the implications of the purchase decision, and technology requirements. In spite of these obstacles, a market is emerging, served by new and existing companies, that may reach $500 million (U.S.) by 1990 for equipment and supplies.
NASA Technical Reports Server (NTRS)
Casas, J. C.; Koziana, J. V.; Saylor, M. S.; Kindle, E. C.
1982-01-01
Problems associated with the development of the measurement of air pollution from satellites (MAPS) experiment program are addressed. The primary thrust of this research was the utilization of the MAPS experiment data in three application areas: low altitude aircraft flights (one to six km); mid altitude aircraft flights (eight to 12 km); and orbiting space platforms. Extensive research work in four major areas of data management was the framework for implementation of the MAPS experiment technique. These areas are: (1) data acquisition; (2) data processing, analysis and interpretation algorithms; (3) data display techniques; and (4) information production.
Text Mining in Biomedical Domain with Emphasis on Document Clustering.
Renganathan, Vinaitheerthan
2017-07-01
With the exponential increase in the number of articles published every year in the biomedical domain, there is a need to build automated systems to extract unknown information from the articles published. Text mining techniques enable the extraction of unknown knowledge from unstructured documents. This paper reviews text mining processes in detail and the software tools available to carry out text mining. It also reviews the roles and applications of text mining in the biomedical domain. Text mining processes, such as search and retrieval of documents, pre-processing of documents, natural language processing, methods for text clustering, and methods for text classification are described in detail. Text mining techniques can facilitate the mining of vast amounts of knowledge on a given topic from published biomedical research articles and draw meaningful conclusions that are not possible otherwise.
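A common baseline for the document-clustering methods reviewed above is to represent each document as a TF-IDF vector and compare documents by cosine similarity. A stdlib-only sketch; the whitespace tokenization and weighting scheme are simplified assumptions, not a specific tool from the review:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Return one sparse TF-IDF vector (dict term -> weight) per document."""
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter()                          # document frequency per term
    for toks in tokenized:
        df.update(set(toks))
    n = len(docs)
    vecs = []
    for toks in tokenized:
        tf = Counter(toks)
        # idf = log(n/df): terms present in every document get weight 0
        vecs.append({t: (tf[t] / len(toks)) * math.log(n / df[t]) for t in tf})
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    num = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return num / (na * nb) if na and nb else 0.0

docs = ["gene expression cancer", "gene expression tumor", "stock market prices"]
vecs = tfidf_vectors(docs)
```

A clustering algorithm such as k-means or hierarchical clustering would then group documents by these similarities; here the two biomedical abstracts score higher with each other than with the finance one.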
Integrated analysis of remote sensing products from basic geological surveys. [Brazil
NASA Technical Reports Server (NTRS)
Dasilvafagundesfilho, E. (Principal Investigator)
1984-01-01
Recent advances in remote sensing have led to the development of several techniques to obtain image information. These techniques are analyzed as effective tools in geological mapping. A strategy for optimizing the use of images in basic geological surveying is presented. It embraces an integrated analysis of spatial, spectral, and temporal data through photoptic (color additive viewer) and computer processing at different scales, allowing surveys of large areas in a fast, precise, and low-cost manner.
The design of aircraft using the decision support problem technique
NASA Technical Reports Server (NTRS)
Mistree, Farrokh; Marinopoulos, Stergios; Jackson, David M.; Shupe, Jon A.
1988-01-01
The Decision Support Problem Technique for unified design, manufacturing and maintenance is being developed at the Systems Design Laboratory at the University of Houston. This involves the development of a domain-independent method (and the associated software) that can be used to process domain-dependent information and thereby provide support for human judgment. In a computer assisted environment, this support is provided in the form of optimal solutions to Decision Support Problems.
De Paúl, Joaquín; Asla, Nagore; Pérez-Albéniz, Alicia; de Cádiz, Bárbara Torres-Gómez
2006-08-01
The objective is to determine whether high-risk mothers for child physical abuse differ in their evaluations, attributions, negative affect, disciplinary choices for children's behavior, and expectations of compliance. The effects of a stressor and of the introduction of mitigating information are analyzed. Forty-seven high-risk and 48 matched low-risk mothers participated in the study. Mothers' information processing and disciplinary choices were examined using six vignettes depicting a child engaging in different transgressions. A four-factor design with repeated measures on the last two factors was used. High-risk mothers reported more hostile intent, more global and internal attributions, more use of power-assertion discipline, and less induction. A risk group by child transgression interaction and a risk group by mitigating information interaction were found. Results support the social information-processing model of child physical abuse, which suggests that high-risk mothers process child-related information differently and use more power-assertive and less inductive disciplinary techniques.
Deriving Earth Science Data Analytics Tools/Techniques Requirements
NASA Astrophysics Data System (ADS)
Kempler, S. J.
2015-12-01
Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals. These goals are very different from those in business, requiring different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
2014-07-01
technology work seeks to address gaps in the management, processing, and fusion of heterogeneous (i.e., soft and hard ) information to aid human decision...and bandwidth) to exploit the vast and growing amounts of data [16], [17]. There is also a broad research program on techniques for soft and hard ...Mott, G. de Mel, and T. Pham, “Integrating hard and soft information sources for D2D using controlled natural language,” in Proc. Information Fusion
2009-02-01
management, available at <http://www.iso.org/iso/en/CatalogueDetailPage.CatalogueDetail?CSNUMBER=39612&ICS1=35&ICS2=40&ICS3=>. ISO/IEC 27001. Information...Management of the Systems Engineering Process. [ISO/IEC 27001] ISO/IEC 27001:2005. Information technology -- Security techniques -- Information security...software life cycles [ISO/IEC 15026]. Software assurance is a key element of national security and homeland security. It is critical because dramatic
Estimation of submarine mass failure probability from a sequence of deposits with age dates
Geist, Eric L.; Chaytor, Jason D.; Parsons, Thomas E.; ten Brink, Uri S.
2013-01-01
The empirical probability of submarine mass failure is quantified from a sequence of dated mass-transport deposits. Several different techniques are described to estimate the parameters for a suite of candidate probability models. The techniques, previously developed for analyzing paleoseismic data, include maximum likelihood and Type II (Bayesian) maximum likelihood methods derived from renewal process theory and Monte Carlo methods. The estimated mean return time from these methods, unlike estimates from a simple arithmetic mean of the center age dates and standard likelihood methods, includes the effects of age-dating uncertainty and of open time intervals before the first and after the last event. The likelihood techniques are evaluated using Akaike’s Information Criterion (AIC) and Akaike’s Bayesian Information Criterion (ABIC) to select the optimal model. The techniques are applied to mass transport deposits recorded in two Integrated Ocean Drilling Program (IODP) drill sites located in the Ursa Basin, northern Gulf of Mexico. Dates of the deposits were constrained by regional bio- and magnetostratigraphy from a previous study. Results of the analysis indicate that submarine mass failures in this location occur primarily according to a Poisson process in which failures are independent and return times follow an exponential distribution. However, some of the model results suggest that submarine mass failures may occur quasiperiodically at one of the sites (U1324). The suite of techniques described in this study provides quantitative probability estimates of submarine mass failure occurrence, for any number of deposits and age uncertainty distributions.
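For the Poisson process the analysis above favors, the maximum-likelihood fit and Akaike's Information Criterion reduce to a few lines. The sketch below ignores the age-dating uncertainty and open time intervals that the full likelihood methods account for, and the interval values are invented for illustration:

```python
import math

def exp_mle(intervals):
    """MLE failure rate for exponentially distributed inter-event times: n / total time."""
    return len(intervals) / sum(intervals)

def exp_aic(intervals):
    """AIC = 2k - 2*log-likelihood, with k = 1 parameter for the exponential model."""
    lam = exp_mle(intervals)
    loglik = sum(math.log(lam) - lam * t for t in intervals)
    return 2 * 1 - 2 * loglik

# Four deposits separated by roughly 2 kyr each
rate = exp_mle([2.1, 1.9, 2.0, 2.0])   # events per kyr
mean_return = 1.0 / rate               # mean return time, kyr
aic = exp_aic([2.1, 1.9, 2.0, 2.0])
```

Competing candidate models (e.g., lognormal or Weibull return times, which can capture quasiperiodic behavior) would be fit the same way and ranked by their AIC values, with the lowest AIC preferred.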
Environmental mapping and monitoring of Iceland by remote sensing (EMMIRS)
NASA Astrophysics Data System (ADS)
Pedersen, Gro B. M.; Vilmundardóttir, Olga K.; Falco, Nicola; Sigurmundsson, Friðþór S.; Rustowicz, Rose; Belart, Joaquin M.-C.; Gísladóttir, Gudrun; Benediktsson, Jón A.
2016-04-01
Iceland is exposed to rapid and dynamic landscape changes caused by natural processes and man-made activities, which impact and challenge the country. Fast and reliable mapping and monitoring techniques are needed on a big spatial scale. However, there is currently a lack of the operational advanced information processing techniques that end-users need to incorporate remote sensing (RS) data from multiple data sources. Hence, the full potential of the recent RS data explosion is not being fully exploited. The project Environmental Mapping and Monitoring of Iceland by Remote Sensing (EMMIRS) bridges the gap between advanced information processing capabilities and end-user mapping of the Icelandic environment. This is done by a multidisciplinary assessment of two selected remote sensing super sites, Hekla and Öræfajökull, which encompass many of the rapid natural and man-made landscape changes that Iceland is exposed to. An open-access benchmark repository of the two remote sensing supersites is under construction, providing high-resolution LIDAR topography and hyperspectral data for land-cover and landform classification. Furthermore, a multi-temporal and multi-source archive stretching back to 1945 allows a decadal evaluation of landscape and ecological changes for the two remote sensing super sites by the development of automated change detection techniques. The development of innovative pattern recognition and machine learning-based approaches to image classification and change detection is one of the main tasks of the EMMIRS project, aiming to extract and compute earth observation variables as automatically as possible. Ground reference data collected through a field campaign will be used to validate the implemented methods, the outputs of which are then combined with geological and vegetation models. Here, preliminary results of an automatic land-cover classification based on hyperspectral image analysis are reported.
Furthermore, the EMMIRS project investigates the complex landscape dynamics between geological and ecological processes. This is done through cross-correlation of mapping results and implementation of modelling techniques that simulate geological and ecological processes in order to extrapolate the landscape evolution
Krishnamurthy, Krish; Hari, Natarajan
2017-09-15
The recently published CRAFT (complete reduction to amplitude frequency table) technique converts the raw FID data (i.e., time-domain data) into a table of frequencies, amplitudes, decay rate constants, and phases. It offers an alternate approach to decimate time-domain data, with a minimal preprocessing step. It has been shown that application of the CRAFT technique to process the t1 dimension of 2D data significantly improved the detectable resolution, through its ability to analyze without the ubiquitous apodization of extensively zero-filled data. It was noted earlier that CRAFT did not resolve sinusoids that were not already resolvable in the time domain (i.e., t1max-dependent resolution). We present a combined NUS-IST-CRAFT approach wherein the NUS acquisition technique (sparse sampling) increases the intrinsic resolution in the time domain (by increasing t1max), IST fills the gaps in the sparse sampling, and CRAFT processing extracts the information without loss due to severe apodization. NUS and CRAFT are thus complementary techniques to improve intrinsic and usable resolution. We show that significant improvement can be achieved with this combination over conventional NUS-IST processing. With reasonable sensitivity, the models can be extended to significantly higher t1max to generate an indirect-DEPT spectrum that rivals the direct observe counterpart. Copyright © 2017 John Wiley & Sons, Ltd.
A new image enhancement algorithm with applications to forestry stand mapping
NASA Technical Reports Server (NTRS)
Kan, E. P. F. (Principal Investigator); Lo, J. K.
1975-01-01
The author has identified the following significant results. Results show that the new algorithm produced cleaner classification maps in which holes of small predesignated sizes were eliminated and significant boundary information was preserved. These cleaner post-processed maps better resemble real-life timber stand maps and are thus more usable products than the unprocessed ones. Compared to an accepted neighbor-checking post-processing technique, the new algorithm is more appropriate for timber stand mapping.
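The hole-elimination idea above can be sketched as a connected-component pass that relabels any region smaller than a designated size with its majority neighboring class. This is a simplification of the enhancement algorithm, not a reconstruction of it; single-pass processing, 4-connectivity, and majority-vote fill are assumptions:

```python
from collections import Counter, deque

def clean_map(grid, min_size):
    """Relabel 4-connected regions smaller than min_size with the most common
    neighboring class (one pass; newly merged regions are not re-examined)."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    out = [row[:] for row in grid]
    for r0 in range(rows):
        for c0 in range(cols):
            if seen[r0][c0]:
                continue
            cls = grid[r0][c0]
            comp, border, queue = [], Counter(), deque([(r0, c0)])
            seen[r0][c0] = True
            while queue:                      # flood fill one component
                r, c = queue.popleft()
                comp.append((r, c))
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols:
                        if grid[nr][nc] == cls:
                            if not seen[nr][nc]:
                                seen[nr][nc] = True
                                queue.append((nr, nc))
                        else:
                            border[grid[nr][nc]] += 1
            if len(comp) < min_size and border:
                fill = border.most_common(1)[0][0]
                for r, c in comp:
                    out[r][c] = fill
    return out
```

On a map where class 2 appears as a single-pixel hole inside a class-1 stand, the hole is absorbed while the stand boundary is untouched.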
Multichannel Doppler Processing for an Experimental Low-Angle Tracking System
1990-05-01
estimation techniques at sea. Because of clutter and noise, it is necessary to use a number of different processing algorithms to extract the required information. Consequently, the ELAT radar system is composed of multiple...corresponding to RF frequencies, f1 and f2. For mode 3, the ambiguities occur at vb1 = 15.186 knots and vb2 = 16.96 knots. The sea clutter, with a spectrum
Lamb wave detection of limpet mines on ship hulls.
Bingham, Jill; Hinders, Mark; Friedman, Adam
2009-12-01
This paper describes the use of ultrasonic guided waves for identifying the mass loading due to underwater limpet mines on ship hulls. The Dynamic Wavelet Fingerprint Technique (DWFT) is used to render the guided wave mode information in two-dimensional binary images, because the waveform features of interest are too subtle to identify in the time domain. The use of wavelets allows both time and scale features from the original signals to be retained, and image processing can be used to automatically extract features that correspond to the arrival times of the guided wave modes. For further understanding of how the guided wave modes propagate through real structures, a parallel-processing, 3D elastic wave simulation is developed using the elastodynamic finite integration technique (EFIT). This full-field technique models situations that are too complex for analytical solutions, such as built-up 3D structures. The simulations have produced informative visualizations of the guided wave modes in the structures, as well as mimicking directly the output from sensors placed in the simulation space for direct comparison to experiments. Results from both drydock and in-water experiments with dummy mines are also shown.
Hyperspectral Imaging Using Flexible Endoscopy for Laryngeal Cancer Detection
Regeling, Bianca; Thies, Boris; Gerstner, Andreas O. H.; Westermann, Stephan; Müller, Nina A.; Bendix, Jörg; Laffers, Wiebke
2016-01-01
Hyperspectral imaging (HSI) is increasingly gaining acceptance in the medical field. Up until now, HSI has been used in conjunction with rigid endoscopy to detect cancer in vivo. The logical next step is to pair HSI with flexible endoscopy, since it improves access to hard-to-reach areas. While the flexible endoscope’s fiber optic cables provide the advantage of flexibility, they also introduce an interfering honeycomb-like pattern onto images. Due to the substantial impact this pattern has on locating cancerous tissue, it must be removed before the HS data can be further processed. In doing so, the loss of information must be minimized to avoid suppressing small-area variations of pixel values. We have developed a system that uses flexible endoscopy to record HS cubes of the larynx and designed a special filtering technique to remove the honeycomb-like pattern with minimal loss of information. We have confirmed its feasibility by comparing it to conventional filtering techniques using an objective metric and by applying unsupervised and supervised classifications to raw and pre-processed HS cubes. Compared to conventional techniques, our method successfully removes the honeycomb-like pattern and considerably improves classification performance, while preserving image details. PMID:27529255
Edinçliler, Ayşe; Baykal, Gökhan; Saygili, Altug
2010-06-01
Use of processed used tires in embankment construction is becoming an accepted way of beneficially recycling scrap tires, due to shortages of natural mineral resources and increasing waste disposal costs. Using these used tires in construction requires an awareness of the properties and the limitations associated with their use. The main objective of this paper is to assess the effect of different processing techniques on the mechanical properties of used tire-sand mixtures to improve the engineering properties of the available soil. In the first part, a literature study on the mechanical properties of processed used tires such as tire shreds, tire chips, tire buffings and their mixtures with sand is summarized. In the second part, large-scale direct shear tests are performed to evaluate the shear strength of tire crumb-sand mixtures, for which information is not readily available in the literature. The test results with tire crumb were compared with the other processed used tire-sand mixtures. Sand-used tire mixtures have higher shear strength than sand alone, and the shear strength parameters depend on the processing conditions of the used tires. Three factors are found to significantly affect the mechanical properties: normal stress, processing technique, and used tire content. Copyright 2009. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Rimland, Jeffrey; McNeese, Michael; Hall, David
2013-05-01
Although the capability of computer-based artificial intelligence techniques for decision-making and situational awareness has seen notable improvement over the last several decades, the current state-of-the-art still falls short of creating computer systems capable of autonomously making complex decisions and judgments in many domains where data is nuanced and accountability is high. However, there is a great deal of potential for hybrid systems in which software applications augment human capabilities by focusing the analyst's attention on relevant information elements, based on both a priori knowledge of the analyst's goals and the processing/correlation of a series of data streams too numerous and heterogeneous for the analyst to digest without assistance. Researchers at Penn State University are exploring ways in which an information framework influenced by Klein's Recognition-Primed Decision (RPD) model, Endsley's model of situational awareness, and the Joint Directors of Laboratories (JDL) data fusion process model can be implemented through a novel combination of Complex Event Processing (CEP) and Multi-Agent Software (MAS). Though originally designed for stock market and financial applications, the high-performance, data-driven nature of CEP techniques provides a natural complement to the proven capabilities of MAS systems for modeling naturalistic decision-making, performing process adjudication, and optimizing networked processing and cognition via the use of "mobile agents." This paper addresses the challenges and opportunities of such a framework for augmenting human observational capability as well as enabling collaborative context-aware reasoning in both human teams and hybrid human/software agent teams.
2009-09-01
this information supports the decision-making process as it is applied to the management of risk. 2. Operational Risk Operational risk is the threat... reasonability. However, to make a software system fault tolerant, the system needs to recognize and fix a system state condition. To detect a fault, a fault...Tracking ..........................................51 C. DECISION-MAKING PROCESS................................................................51 1. Risk
Scheduling Operational-Level Courses of Action
2003-10-01
Process modelling and analysis – process synchronisation techniques Information and knowledge management – Collaborative planning systems – Workflow...logistics – Some tasks may consume resources The military user may wish to impose synchronisation constraints among tasks A military end state can be...effects, – constrained with resource and synchronisation considerations, and – lead to the achievement of conditions set in the end state. The COA is
Nonlinear optical memory for manipulation of orbital angular momentum of light.
de Oliveira, R A; Borba, G C; Martins, W S; Barreiro, S; Felinto, D; Tabosa, J W R
2015-11-01
We report on the demonstration of a nonlinear optical memory (NOM) for storage and on-demand manipulation of orbital angular momentum (OAM) of light via higher-order nonlinear processes in cold cesium atoms. A spatially resolved phase-matching technique is used to select each order of the nonlinear susceptibility associated, respectively, with time-delayed four-, six-, and eight-wave mixing processes. For a specific configuration of the stored OAM of the incident beams, we demonstrated that the OAM of the retrieved beam can be manipulated according to the order of the nonlinear process chosen by the operator for reading out the NOM. This demonstration indicates new pathways for applications in classical and quantum information processing where OAM of light is used to encode optical information.
NASA Astrophysics Data System (ADS)
Gaikwad, Akshay; Rehal, Diksha; Singh, Amandeep; Arvind; Dorai, Kavita
2018-02-01
We present the NMR implementation of a scheme for selective and efficient quantum process tomography without ancilla. We generalize this scheme such that it can be implemented efficiently using only a set of measurements involving product operators. The method allows us to estimate any element of the quantum process matrix to a desired precision, provided a set of quantum states can be prepared efficiently. Our modified technique requires fewer experimental resources as compared to the standard implementation of selective and efficient quantum process tomography, as it exploits the special nature of NMR measurements to allow us to compute specific elements of the process matrix by a restrictive set of subsystem measurements. To demonstrate the efficacy of our scheme, we experimentally tomograph the processes corresponding to "no operation," a controlled-NOT (CNOT), and a controlled-Hadamard gate on a two-qubit NMR quantum information processor, with high fidelities.
Advanced applications of scatterometry based optical metrology
NASA Astrophysics Data System (ADS)
Dixit, Dhairya; Keller, Nick; Kagalwala, Taher; Recchia, Fiona; Lifshitz, Yevgeny; Elia, Alexander; Todi, Vinit; Fronheiser, Jody; Vaid, Alok
2017-03-01
The semiconductor industry continues to drive patterning solutions that enable devices with higher memory storage capacity, faster computing performance, and lower cost per transistor. These developments in the field of semiconductor manufacturing, along with the overall minimization of the size of transistors, require continuous development of the metrology tools used for characterization of these complex 3D device architectures. Optical scatterometry, or optical critical dimension (OCD) metrology, is one of the most prevalent inline metrology techniques in semiconductor manufacturing because it is a quick, precise and non-destructive metrology technique. However, at present OCD is predominantly used to measure feature dimensions such as line-width, height, side-wall angle, etc. of patterned nano structures. Use of optical scatterometry for characterizing defects such as pitch-walking, overlay, line edge roughness, etc. is fairly limited. Inspection of process-induced abnormalities is a fundamental part of process yield improvement. It provides process engineers with important information about process errors, and consequently helps optimize materials and process parameters. Scatterometry is an averaging technique, and extending it to measure the position of local process-induced defectivity and feature-to-feature variation is extremely challenging. This report is an overview of the applications and benefits of using optical scatterometry for characterizing defects such as pitch-walking, overlay and fin bending for advanced technology nodes beyond 7 nm. Currently, optical scatterometry is based on conventional spectroscopic ellipsometry and spectroscopic reflectometry measurements, but generalized ellipsometry or Mueller matrix spectroscopic ellipsometry data provides important, additional information about complex structures that exhibit anisotropy and depolarization effects.
In addition, the symmetry-antisymmetry properties associated with Mueller matrix (MM) elements provide an excellent means of measuring asymmetry present in the structure. This additional information, together with the symmetry-antisymmetry properties of the MM elements, is used to characterize fin bending and overlay defects, and design improvements in the OCD test structures are used to boost OCD sensitivity to pitch-walking. In addition, the validity of the OCD-based results is established by comparing them to top-down critical dimension scanning electron microscope (CD-SEM) and cross-sectional transmission electron microscope (TEM) images.
NASA Astrophysics Data System (ADS)
Olafsen, L. J.; Olafsen, J. S.; Eaves, I. K.
2018-06-01
We report on an experimental investigation of the time-dependent spatial intensity distribution of near-infrared idler pulses from an optical parametric oscillator, measured using an infrared (IR) camera, in contrast to beam profiles obtained using traditional knife-edge techniques. Comparisons show that the information gained by utilizing the thermal camera provides more detail than the spatially- or time-averaged measurements from a knife-edge profile. Synchronization, averaging, and thresholding techniques are applied to enhance the acquired images. The additional information obtained can improve the process by which semiconductor devices and other IR lasers are characterized for their beam quality and output response, and thereby result in IR devices with higher performance.
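One standard way to quantify beam quality from camera frames like those described above is the second-moment (D4σ) width defined in ISO 11146. A stdlib sketch on a row-major intensity array; the pixel pitch and the test pattern are arbitrary, and this is a generic metric rather than the paper's specific analysis:

```python
import math

def d4sigma(image, pixel_pitch=1.0):
    """Second-moment (D4-sigma) beam widths of a 2-D intensity array image[y][x].
    Width = 4 * sqrt(intensity-weighted variance) along each axis."""
    total = sum(sum(row) for row in image)
    cx = sum(x * v for row in image for x, v in enumerate(row)) / total
    cy = sum(y * v for y, row in enumerate(image) for v in row) / total
    var_x = sum((x - cx) ** 2 * v for row in image for x, v in enumerate(row)) / total
    var_y = sum((y - cy) ** 2 * v for y, row in enumerate(image) for v in row) / total
    return 4 * pixel_pitch * math.sqrt(var_x), 4 * pixel_pitch * math.sqrt(var_y)

# A symmetric 3x3 spot: both widths come out equal
wx, wy = d4sigma([[0, 1, 0], [1, 4, 1], [0, 1, 0]])
```

In practice background subtraction and thresholding (as the abstract mentions) are applied first, since the second moment is highly sensitive to noise in the image wings.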
Insight, working through, and practice: the role of procedural knowledge.
Rosenblatt, Allan
2004-01-01
A conception of insight is proposed, based on a systems and information-processing framework and using current neuroscience concepts: insight is an integration of information that results in a new symbolization of experience, with a significant change in self-image and a transformation of non-declarative procedural knowledge into declarative knowledge. Since procedural memory and knowledge, seen to include emotional and relationship issues, are slow to change, durable emotional and behavioral change often requires repeated practice, a need not explicitly addressed in standard psychoanalytic technique. Working through is thus seen as also encompassing nondynamic factors. The application of these ideas to therapeutic technique suggests possible therapeutic interventions beyond interpretation. An illustrative clinical vignette is presented.
NASA Technical Reports Server (NTRS)
Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.
1987-01-01
This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for their satisfaction. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.
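The paper's emphasis on efficient storage and retrieval of spatially-indexed data can be illustrated with the simplest such structure, a bucket-grid index. The cell size and query API below are illustrative choices, not the data structures the paper recommends:

```python
from collections import defaultdict

class GridIndex:
    """Bucket-grid spatial index: point features hashed into fixed-size cells,
    so a window query only inspects the cells the window overlaps."""
    def __init__(self, cell):
        self.cell = cell
        self.buckets = defaultdict(list)

    def _key(self, x, y):
        return (int(x // self.cell), int(y // self.cell))

    def insert(self, x, y, feature):
        self.buckets[self._key(x, y)].append((x, y, feature))

    def query(self, xmin, ymin, xmax, ymax):
        hits = []
        for i in range(int(xmin // self.cell), int(xmax // self.cell) + 1):
            for j in range(int(ymin // self.cell), int(ymax // self.cell) + 1):
                for x, y, f in self.buckets[(i, j)]:
                    if xmin <= x <= xmax and ymin <= y <= ymax:
                        hits.append(f)
        return hits

index = GridIndex(cell=10.0)
index.insert(5.0, 5.0, "well")
index.insert(25.0, 25.0, "road")
```

Hierarchical structures such as quadtrees and R-trees refine this idea for skewed data distributions, which is why they became the standard GIS indexing machinery.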
NCTM of liquids at high temperatures using polarization techniques
NASA Technical Reports Server (NTRS)
Krishnan, Shankar; Weber, J. K. Richard; Nordine, Paul C.; Schiffman, Robert A.
1990-01-01
Temperature measurement and control is extremely important in any materials processing application. However, conventional techniques for non-contact temperature measurement (mainly optical pyrometry) are very uncertain because of unknown or varying surface emittance. Optical properties like other properties change during processing. A dynamic, in-situ measurement of optical properties including the emittance is required. Intersonics is developing new technologies using polarized laser light scattering to determine surface emittance of freely radiating bodies concurrent with conventional optical pyrometry. These are sufficient to determine the true surface temperature of the target. Intersonics is currently developing a system called DAPP, the Division of Amplitude Polarimetric Pyrometer, that uses polarization information to measure the true thermodynamic temperature of freely radiating objects. This instrument has potential use in materials processing applications in ground and space based equipment. Results of thermophysical and thermodynamic measurements using laser reflection as a temperature measuring tool are presented. The impact of these techniques on thermophysical property measurements at high temperature is discussed.
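The emittance correction alluded to above follows, in the Wien approximation, from 1/T = 1/T_b + (λ/c2)·ln ε, where T_b is the pyrometer brightness temperature, ε the measured spectral emittance, and c2 the second radiation constant. A sketch of that relation only; the wavelength and emittance values are illustrative, and this is not a reconstruction of Intersonics' DAPP instrument:

```python
import math

C2 = 1.4388e-2  # second radiation constant c2, m*K

def true_temperature(t_brightness, emittance, wavelength):
    """Correct a single-wavelength brightness temperature (K) for spectral
    emittance using the Wien approximation: 1/T = 1/Tb + (lambda/c2)*ln(emittance)."""
    inv_t = 1.0 / t_brightness + (wavelength / C2) * math.log(emittance)
    return 1.0 / inv_t

# A gray body (emittance 0.3) observed at 650 nm reads 2000 K on the pyrometer;
# its true temperature is higher, since emittance < 1 depresses the reading.
t_true = true_temperature(2000.0, 0.3, 650e-9)
```

This is exactly why an in-situ emittance measurement matters: with ε unknown, the same 2000 K reading could correspond to a wide range of true temperatures.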
Trust metrics in information fusion
NASA Astrophysics Data System (ADS)
Blasch, Erik
2014-05-01
Trust is an important concept for machine intelligence, but it is not treated consistently across applications. In this paper, we seek to understand trust in terms of a variety of factors: humans, sensors, communications, intelligence processing algorithms, and human-machine displays of information. In modeling the various aspects of trust, we provide an example from machine intelligence that draws on the attributes supporting user acceptance of machine intelligence results, such as sensor accuracy, communication timeliness, machine processing confidence, and display throughput. The example fuses video and text, where an analyst needs trust information about an identified imagery track. We use the proportional conflict redistribution rule as an information fusion technique that handles conflicting data from trusted and mistrusted sources. The discussion of the many forms of trust explored in the paper seeks to provide a systems-level design perspective for information fusion trust quantification.
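The proportional conflict redistribution rule (PCR5) referenced above can be sketched for two sources as follows; the source names and mass values are invented for illustration, and real DSmT implementations handle richer frames than this two-hypothesis case:

```python
from itertools import product

def pcr5_fuse(m1, m2):
    """Fuse two basic belief assignments (dicts: frozenset -> mass) with
    the proportional conflict redistribution rule no. 5 (PCR5).
    Non-conflicting pairs combine conjunctively; the mass of each
    conflicting pair is split back to its two sources in proportion
    to their individual masses."""
    fused = {}
    for (x1, w1), (x2, w2) in product(m1.items(), m2.items()):
        inter = x1 & x2
        if inter:   # consensus: conjunctive combination
            fused[inter] = fused.get(inter, 0.0) + w1 * w2
        else:       # conflict: proportional redistribution
            total = w1 + w2
            fused[x1] = fused.get(x1, 0.0) + w1 * w1 * w2 / total
            fused[x2] = fused.get(x2, 0.0) + w2 * w2 * w1 / total
    return fused

# Two sources reporting on an imagery track: hostile (H) vs. friendly (F)
H, F = frozenset("H"), frozenset("F")
video = {H: 0.7, F: 0.3}   # video classifier
text  = {H: 0.2, F: 0.8}   # conflicting text report
result = pcr5_fuse(video, text)
```

Unlike Dempster's rule, which normalizes conflict away, PCR5 keeps the total mass at 1 while letting strongly conflicting sources retain influence, which is the property that matters when some sources are mistrusted.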
Functional Magnetic Resonance Imaging
ERIC Educational Resources Information Center
Voos, Avery; Pelphrey, Kevin
2013-01-01
Functional magnetic resonance imaging (fMRI), with its excellent spatial resolution and ability to visualize networks of neuroanatomical structures involved in complex information processing, has become the dominant technique for the study of brain function and its development. The accessibility of in-vivo pediatric brain-imaging techniques…
Investigating Your Environment.
ERIC Educational Resources Information Center
Forest Service (USDA), Washington, DC.
The goal of this interdisciplinary curriculum is to enable students to make informed and responsible decisions about natural resources management by promoting an understanding of natural, social, and economic environments and the student's role in affecting all three. The included investigations utilize processes and techniques that help people…
Land Use Management for Solid Waste Programs
ERIC Educational Resources Information Center
Brown, Sanford M., Jr.
1974-01-01
The author discusses the problems of solid waste disposal and examines various land use management techniques. These include the land use plan, zoning, regionalization, land utilities, and interim use. Information concerning solid waste processing site zoning and analysis is given. Bibliography included. (MA)
Industry in the 80s: saving with solar
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1979-11-01
This brochure is designed to acquaint industries that will be building new plants in the 1980s with the techniques of constructing energy conservative buildings and plants, various methods of using solar energy to supply heat for industrial processes, and the potential for building plants that draw all of their energy from the sun. Some organizations and information centers to contact for solar energy information are listed. (WHR)
Shaeri, Mohammad Ali; Sodagar, Amir M
2015-05-01
This paper proposes an efficient data compression technique dedicated to implantable intra-cortical neural recording devices. The proposed technique benefits from processing neural signals in the Discrete Haar Wavelet Transform space, a new spike extraction approach, and a novel data framing scheme to telemeter the recorded neural information to the outside world. Based on the proposed technique, a 64-channel neural signal processor was designed and prototyped as part of a wireless implantable extra-cellular neural recording microsystem. Designed in a 0.13-μm standard CMOS process, the 64-channel neural signal processor reported in this paper occupies ~0.206 mm² of silicon area and consumes 94.18 μW when operating under a 1.2-V supply voltage at a master clock frequency of 1.28 MHz.
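The Discrete Haar Wavelet Transform underlying the compression can be sketched as below; the spike-extraction and data-framing schemes are the authors' own and are not reproduced here, and the thresholding shown is only a generic illustration of how wavelet-domain compression discards small coefficients:

```python
def haar_dwt(signal):
    """One level of the discrete Haar wavelet transform on an
    even-length sequence: pairwise scaled sums (approximation)
    and differences (detail)."""
    s = 2 ** -0.5
    evens, odds = signal[0::2], signal[1::2]
    approx = [(a + b) * s for a, b in zip(evens, odds)]
    detail = [(a - b) * s for a, b in zip(evens, odds)]
    return approx, detail

def haar_idwt(approx, detail):
    """Exact inverse of haar_dwt."""
    s = 2 ** -0.5
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) * s, (a - d) * s]
    return out

def compress(signal, threshold):
    """Crude compression sketch: keep the approximation, zero out
    detail coefficients below the threshold."""
    approx, detail = haar_dwt(signal)
    return approx, [d if abs(d) >= threshold else 0.0 for d in detail]
```

Because most neural-signal energy concentrates in a few coefficients, zeroing small details yields high compression with modest distortion, which is what makes the transform attractive for a power-constrained implant.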
Music score watermarking by clef modifications
NASA Astrophysics Data System (ADS)
Schmucker, Martin; Yan, Hongning
2003-06-01
In this paper we present a new method for hiding data in music scores. In contrast to previously published algorithms, we investigate the possibilities of embedding information in clefs. Using the clef as the information carrier has two advantages: first, a clef is present in each staff line, which guarantees a fixed capacity; second, the clef defines the reference system for musical symbols, so the symbols carrying the musical content, e.g. the notes and the rests, are not degraded by the manipulations. Music scores must be robust against greyscale-to-binary conversion; as a consequence, the information is embedded by modifying the distribution of black and white pixels in certain areas. We evaluate simple image processing mechanisms based on erosion and dilation for embedding the information. To retrieve the watermark, the b/w distribution is extracted from the given clef. To solve the synchronization problem, the watermarked clef is normalized in a pre-processing step; the normalization is based on moments. The areas used for watermarking are calculated by image segmentation techniques that consider the features of a clef. We analyze the capacity and robustness of the proposed method for different parameter choices. The method can be combined with other music score watermarking methods to increase the capacity of existing watermarking techniques.
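Embedding a bit by shifting the black/white pixel distribution with erosion and dilation can be illustrated with a minimal sketch; this is a simplified stand-in for the authors' segmentation- and moment-based method, and the region, threshold, and iteration bound are invented for illustration:

```python
import numpy as np

def dilate(img):
    """Binary dilation with a 3x3 cross: a pixel becomes black (1)
    if it or any 4-neighbour is black."""
    p = np.pad(img, 1)
    return (p[1:-1, 1:-1] | p[:-2, 1:-1] | p[2:, 1:-1]
            | p[1:-1, :-2] | p[1:-1, 2:]).astype(img.dtype)

def erode(img):
    """Binary erosion with a 3x3 cross: a pixel stays black only if
    it and all 4-neighbours are black."""
    p = np.pad(img, 1, constant_values=1)
    return (p[1:-1, 1:-1] & p[:-2, 1:-1] & p[2:, 1:-1]
            & p[1:-1, :-2] & p[1:-1, 2:]).astype(img.dtype)

def embed_bit(region, bit, target=0.5, max_steps=10):
    """Push the black-pixel ratio of a clef sub-region above (bit=1)
    or below (bit=0) the target ratio by repeated dilation/erosion."""
    out = region.copy()
    for _ in range(max_steps):
        ratio = out.mean()
        if (bit == 1 and ratio > target) or (bit == 0 and ratio < target):
            break
        out = dilate(out) if bit == 1 else erode(out)
    return out

def read_bit(region, target=0.5):
    """Recover the bit from the black-pixel ratio of the region."""
    return 1 if region.mean() > target else 0

# Example: a small synthetic binary region standing in for a clef area
region = np.zeros((8, 8), dtype=np.uint8)
region[3:5, 3:5] = 1
marked = embed_bit(region, 1)
```

The detector only needs the pixel ratio, which survives greyscale-to-binary conversion far better than individual pixel values, mirroring the robustness argument in the abstract.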
Milner, Rafał; Rusiniak, Mateusz; Lewandowska, Monika; Wolak, Tomasz; Ganc, Małgorzata; Piątkowska-Janko, Ewa; Bogorodzki, Piotr; Skarżyński, Henryk
2014-01-01
Background The neural underpinnings of auditory information processing have often been investigated using the odd-ball paradigm, in which infrequent sounds (deviants) are presented within a regular train of frequent stimuli (standards). Traditionally, this paradigm has been applied using either high temporal resolution (EEG) or high spatial resolution (fMRI, PET). However, used separately, these techniques cannot provide information on both the location and time course of particular neural processes. The goal of this study was to investigate the neural correlates of auditory processes with a fine spatio-temporal resolution. A simultaneous auditory evoked potentials (AEP) and functional magnetic resonance imaging (fMRI) technique (AEP-fMRI), together with an odd-ball paradigm, were used. Material/Methods Six healthy volunteers, aged 20–35 years, participated in an odd-ball simultaneous AEP-fMRI experiment. AEP in response to acoustic stimuli were used to model bioelectric intracerebral generators, and electrophysiological results were integrated with fMRI data. Results fMRI activation evoked by standard stimuli was found to occur mainly in the primary auditory cortex. Activity in these regions overlapped with intracerebral bioelectric sources (dipoles) of the N1 component. Dipoles of the N1/P2 complex in response to standard stimuli were also found in the auditory pathway between the thalamus and the auditory cortex. Deviant stimuli induced fMRI activity in the anterior cingulate gyrus, insula, and parietal lobes. Conclusions The present study showed that neural processes evoked by standard stimuli occur predominantly in subcortical and cortical structures of the auditory pathway. Deviants activate areas non-specific for auditory information processing. PMID:24413019
Analysis of Information Content in High-Spectral Resolution Sounders using Subset Selection Analysis
NASA Technical Reports Server (NTRS)
Velez-Reyes, Miguel; Joiner, Joanna
1998-01-01
In this paper, we summarize the results of the sensitivity analysis and data reduction carried out to determine the information content of AIRS and IASI channels. The analysis and data reduction were based on subset selection techniques developed in the linear algebra and statistics communities to study linear dependencies in high-dimensional data sets. We applied the subset selection method to study dependency among channels by studying the dependency among their weighting functions. We also applied the technique to study the information provided by the different levels into which the atmosphere is discretized for retrievals and analysis. Results from the method correlate well with intuition in many respects and point to possible modifications of band selection in sensor design and of the number and location of levels in the analysis process.
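The subset selection idea can be sketched as greedy column selection, which is equivalent in spirit to QR factorization with column pivoting; this is a generic illustration of picking the most linearly independent channels, not the authors' exact procedure, and the matrix below is a toy stand-in for a weighting-function matrix:

```python
import numpy as np

def subset_select(W, k):
    """Pick k columns of W (e.g. channel weighting functions sampled on
    atmospheric levels) that are most linearly independent: repeatedly
    choose the column with the largest norm after projecting out the
    directions of the columns already chosen."""
    R = W.astype(float).copy()
    chosen = []
    for _ in range(k):
        norms = np.linalg.norm(R, axis=0)
        norms[chosen] = -1.0                 # never re-pick a column
        j = int(np.argmax(norms))
        chosen.append(j)
        q = R[:, j] / np.linalg.norm(R[:, j])
        R -= np.outer(q, q @ R)              # deflate: remove that direction
    return chosen

# Toy example: column 2 is the sum of columns 0 and 1 (dependent set),
# column 3 is independent of all others.
W = np.array([[1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
chosen = subset_select(W, 3)
```

Columns left unchosen are well approximated by those selected, which is exactly the sense in which redundant channels add little information.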
Metallographic techniques for evaluation of thermal barrier coatings
NASA Technical Reports Server (NTRS)
Brindley, William J.; Leonhardt, Todd A.
1990-01-01
The performance of ceramic thermal barrier coatings is strongly dependent on the amount and shape of the porosity in the coating. Current metallographic techniques do not provide polished surfaces that are adequate for a repeatable interpretation of the coating structures. A technique recently developed at NASA-Lewis for preparation of thermal barrier coating sections combines epoxy impregnation, careful sectioning and polishing, and interference layering to provide previously unobtainable information on processing-induced porosity. In fact, increased contrast and less ambiguous structure developed by the method make automatic quantitative metallography a viable option for characterizing thermal barrier coating structures.
2011-01-01
Novel molecular imaging techniques are at the forefront of both preclinical and clinical imaging strategies. They have significant potential to offer visualisation and quantification of molecular and cellular changes in health and disease. This will help to shed light on pathobiology and underlying disease processes and provide further information about the mechanisms of action of novel therapeutic strategies. This review explores currently available molecular imaging techniques that are available for preclinical studies with a focus on optical imaging techniques and discusses how current and future advances will enable translation into the clinic for patients with arthritis. PMID:21345267
Optical smart packaging to reduce transmitted information.
Cabezas, Luisa; Tebaldi, Myrian; Barrera, John Fredy; Bolognini, Néstor; Torroba, Roberto
2012-01-02
We demonstrate a smart image-packaging optical technique that uses what we believe is a new concept to save byte space when transmitting data. The technique supports a large set of images mapped into modulated speckle patterns, which are then multiplexed into a single package. This operation results in a substantial decrease in the final number of bytes of the package with respect to the amount resulting from adding the images without the method. Moreover, there are no requirements on the type of images to be processed. We present results that prove the potential of the technique.
Diffraction Contrast Tomography: A Novel 3D Polycrystalline Grain Imaging Technique
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuettner, Lindsey Ann
2017-06-06
Diffraction contrast tomography (DCT) is a non-destructive way of imaging the microstructures of polycrystalline materials such as metals or crystalline organics. It is a useful technique for mapping 3D grain structures and provides crystallographic information such as crystal orientation, grain shape, and strain. Understanding the internal microstructure of a material is important in understanding the bulk material properties. This report gives a general overview of related techniques, DCT data acquisition, and the analysis process. Following the short literature review, potential work and research at Los Alamos National Laboratory (LANL) is discussed.
NASA Technical Reports Server (NTRS)
Peuquet, Donna J.
1987-01-01
A new approach to building geographic data models that is based on the fundamental characteristics of the data is presented. An overall theoretical framework for representing geographic data is proposed. An example of utilizing this framework in a Geographic Information System (GIS) context by combining artificial intelligence techniques with recent developments in spatial data processing techniques is given. Elements of data representation discussed include hierarchical structure, separation of locational and conceptual views, and the ability to store knowledge at variable levels of completeness and precision.
The influence of multispectral scanner spatial resolution on forest feature classification
NASA Technical Reports Server (NTRS)
Sadowski, F. G.; Malila, W. A.; Sarno, J. E.; Nalepka, R. F.
1977-01-01
Inappropriate spatial resolution and the corresponding data processing techniques may be major causes of the non-optimal forest classification results frequently achieved from multispectral scanner (MSS) data. Procedures and results of empirical investigations are studied to determine the influence of MSS spatial resolution on the classification of forest features into levels of detail, or hierarchies of information, that might be appropriate for nationwide forest surveys and detailed in-place inventories. Two somewhat different, but related, studies are presented. The first consisted of establishing classification accuracies for several hierarchies of features as spatial resolution was progressively coarsened from (2 m)² to (64 m)². The second investigated the capability of specialized processing techniques to improve upon the results of conventional processing procedures for both coarse- and fine-resolution data.
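Progressively coarsening spatial resolution, as in the first study, is commonly simulated by averaging blocks of fine-resolution pixels; the sketch below shows that assumed degradation model (block averaging), which may differ in detail from the sensor simulation actually used:

```python
import numpy as np

def coarsen(band, factor):
    """Simulate a coarser spatial resolution by averaging
    non-overlapping factor x factor pixel blocks
    (e.g. 2 m -> 64 m pixels corresponds to factor 32)."""
    h, w = band.shape
    h, w = h - h % factor, w - w % factor        # trim to a multiple of factor
    blocks = band[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# Toy band: a 4x4 image coarsened by a factor of 2 becomes 2x2
img = np.arange(16, dtype=float).reshape(4, 4)
out = coarsen(img, 2)
```

Running a classifier on each successively coarsened version of the same scene isolates the effect of resolution from every other variable, which is the core of the experimental design described above.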
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, G.; Davis, A.; Burke, F.P.
1991-12-01
This study demonstrated the use of the gold tube carbonization technique and reflectance microscopy analysis for the examination of process-derived materials from direct coal liquefaction. The carbonization technique, which was applied to coal liquefaction distillation resids, yields information on the amounts of gas plus distillate, pyridine-soluble resid, and pyridine-insoluble material formed when a coal liquid sample is heated to 450°C for one hour at 5000 psi in an inert atmosphere. The pyridine-insolubles then are examined by reflectance microscopy to determine the type, amount, and optical texture of isotropic and anisotropic carbon formed upon carbonization. Further development of these analytical methods as process development tools may be justified on the basis of these results.
Liu, Fei; Zhang, Xi; Jia, Yan
2015-01-01
In this paper, we propose a computer information processing algorithm that can be used for biomedical image processing and disease prediction. A biomedical image is considered a data object in a multi-dimensional space, where each dimension is a feature that can be used for disease diagnosis. We introduce a new concept, the top (k1,k2) outlier, which can be used to detect abnormal data objects in the multi-dimensional space. This technique focuses on uncertain space, where each data object has several possible instances with distinct probabilities. We design an efficient sampling algorithm for the top (k1,k2) outlier in uncertain space, with some improvement techniques used for acceleration. Experiments show that our method achieves high accuracy and high efficiency.
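The general idea of sampling-based outlier detection in uncertain space can be sketched with a Monte-Carlo possible-worlds approach; this is a generic illustration only, not the authors' top (k1,k2) algorithm, and the one-dimensional data and k-NN score are invented for simplicity:

```python
import random

def expected_knn_outlier_scores(objects, k=2, n_samples=200, seed=1):
    """Each uncertain object is a list of (instance, probability) pairs.
    In every sampled 'possible world' one instance per object is drawn
    according to its probability; an object's outlier score in that world
    is its distance to its k-th nearest neighbour. Scores are averaged
    over all sampled worlds."""
    rng = random.Random(seed)
    totals = [0.0] * len(objects)
    for _ in range(n_samples):
        world = []
        for instances in objects:
            r, acc = rng.random(), 0.0
            for point, prob in instances:
                acc += prob
                if r <= acc:
                    world.append(point)
                    break
            else:                       # guard against rounding in probs
                world.append(instances[-1][0])
        for i, a in enumerate(world):
            dists = sorted(abs(a - b) for j, b in enumerate(world) if j != i)
            totals[i] += dists[k - 1]   # k-th nearest neighbour distance
    return [t / n_samples for t in totals]
```

Averaging over sampled worlds is the standard way to turn a deterministic outlier score into one defined over probabilistic instances; the acceleration techniques mentioned in the abstract would prune worlds or objects that cannot change the top-ranked results.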
Estimation of the Scatterer Distribution of the Cirrhotic Liver using Ultrasonic Image
NASA Astrophysics Data System (ADS)
Yamaguchi, Tadashi; Hachiya, Hiroyuki
1998-05-01
In the B-mode image of the liver obtained by an ultrasonic imaging system, the speckle pattern changes with the progression of diseases such as liver cirrhosis. In this paper we present the statistical characteristics of the echo envelope of the liver, and a technique to extract information on the scatterer distribution from normal and cirrhotic liver images using constant false alarm rate (CFAR) processing. We analyze the relationship between the extracted scatterer distribution and the stage of liver cirrhosis. The ratio of the area in which the amplitude of the processed signal exceeds the threshold to the entire processed image area is related quantitatively to the stage of liver cirrhosis. It is found that the proposed technique is valid for the quantitative diagnosis of liver cirrhosis.
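CFAR processing of this kind can be sketched with a one-dimensional cell-averaging CFAR detector, the textbook variant: a cell is flagged when its amplitude exceeds a multiple of the local noise estimate taken from surrounding training cells. The window sizes and scale factor below are illustrative, not the paper's parameters:

```python
import numpy as np

def ca_cfar(signal, n_train=8, n_guard=2, scale=3.0):
    """1-D cell-averaging CFAR. For each cell, estimate the background
    from n_train cells split evenly on both sides, skipping n_guard
    guard cells next to the cell under test; flag a detection when the
    cell exceeds scale times that local average."""
    hits = np.zeros(len(signal), dtype=bool)
    half = n_train // 2 + n_guard
    for i in range(half, len(signal) - half):
        left = signal[i - half : i - n_guard]
        right = signal[i + n_guard + 1 : i + half + 1]
        noise = np.mean(np.concatenate([left, right]))
        hits[i] = signal[i] > scale * noise
    return hits

# Toy envelope: uniform speckle background with one strong scatterer
sig = np.ones(40)
sig[20] = 10.0
hits = ca_cfar(sig)
```

Because the threshold adapts to the local background, the false-alarm rate stays constant even when the mean speckle level varies across the image, which is what makes the extracted above-threshold area a usable quantitative measure.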
Reducing the Requirements and Cost of Astronomical Telescopes
NASA Technical Reports Server (NTRS)
Smith, W. Scott; Whitaker, Ann F. (Technical Monitor)
2002-01-01
Limits on astronomical telescope apertures are being rapidly approached. These limits result from logistics, increasing complexity, and, finally, budgetary constraints. In historical perspective, great strides have been made in the areas of aperture, adaptive optics, wavefront sensors, detectors, stellar interferometers, and image reconstruction. What will be the next advances? Emerging data analysis techniques based on communication theory hold the promise of yielding more information from observational data through significant computer post-processing. This paper explores some current telescope limitations and ponders the possibilities of increasing the yield of scientific data by migrating computer post-processing techniques to higher dimensions. Some of these processes hold the promise of reducing the requirements on the basic telescope hardware, making the next generation of instruments more affordable.
NASA Technical Reports Server (NTRS)
Campbell, W. J.; Goldberg, M.
1982-01-01
NASA's Eastern Regional Remote Sensing Applications Center (ERRSAC) has recognized the need to accommodate spatial analysis techniques in its remote sensing technology transfer program. A computerized Geographic Information System to incorporate remotely sensed data, specifically Landsat, with other relevant data was considered a realistic approach to address a given resource problem. Questions arose concerning the selection of a suitable available software system to demonstrate, train, and undertake demonstration projects with ERRSAC's user community. The very specific requirements for such a system are discussed. The solution found involved the addition of geographic information processing functions to the Interactive Digital Image Manipulation System (IDIMS). Details regarding the functions of the new integrated system are examined along with the characteristics of the software.
Enhancement of TIMS images for photointerpretation
NASA Technical Reports Server (NTRS)
Gillespie, A. R.
1986-01-01
The Thermal Infrared Multispectral Scanner (TIMS) images consist of six channels of data acquired in bands between 8 and 12 microns; thus they contain information about both temperature and emittance. Scene temperatures are controlled by the reflectivity of the surface, but also by its geometry with respect to the Sun, the time of day, and other factors unrelated to composition. Emittance depends on composition alone, so the photointerpreter may wish to enhance emittance information selectively. Because thermal emittances in real scenes vary only slightly, the image data tend to be highly correlated between channels. Special image processing is required to make this information available to the photointerpreter. Processing includes noise removal, construction of model emittance images, and construction of false-color pictures enhanced by decorrelation techniques.
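The decorrelation stretch mentioned at the end can be sketched as a whitening of the channel covariance; this is the standard formulation of the technique, not necessarily the exact processing chain used for TIMS, and the two-band random data are synthetic:

```python
import numpy as np

def decorrelation_stretch(bands):
    """Decorrelation stretch: rotate the (pixels x channels) data into
    its principal components, equalise the variance of every component,
    and rotate back. Weakly varying, highly correlated channel
    differences (e.g. emittance) are thereby exaggerated."""
    X = bands.reshape(-1, bands.shape[-1]).astype(float)
    mu = X.mean(axis=0)
    Xc = X - mu
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    # Whiten along the principal axes, then restore the original orientation
    stretch = evecs @ np.diag(1.0 / np.sqrt(np.maximum(evals, 1e-12))) @ evecs.T
    return (Xc @ stretch + mu).reshape(bands.shape)

# Synthetic two-band image with strongly correlated channels
rng = np.random.default_rng(0)
a = rng.standard_normal(1000)
b = 0.9 * a + 0.1 * rng.standard_normal(1000)
bands = np.stack([a, b], axis=-1).reshape(10, 100, 2)
out = decorrelation_stretch(bands)
```

After the stretch the channel covariance is the identity, so a false-color composite of the result spans the full color space instead of clustering along the grey diagonal.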
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Paradella, W. R.; Vitorello, I.
1982-01-01
Several aspects of computer-assisted analysis techniques for image enhancement and thematic classification, by which LANDSAT MSS imagery may be treated quantitatively, are explained. In geological applications, computer processing of digital data arguably allows the fullest use of LANDSAT data, by displaying enhanced and corrected data for visual analysis and by evaluating each spectral pixel and assigning it to a given class.